Oct. 12, 2022, 1:12 a.m. | Kaitlin Maile, Dennis G. Wilson, Patrick Forré

cs.LG updates on arXiv.org

Incorporating equivariance to symmetry groups as a constraint during neural
network training can improve performance and generalization for tasks
exhibiting those symmetries, but such symmetries are often neither perfectly nor
explicitly present. This motivates algorithmically optimizing the architectural
constraints imposed by equivariance. We propose the equivariance relaxation
morphism, which preserves functionality while reparameterizing a group
equivariant layer to operate with equivariance constraints on a subgroup, as
well as the $[G]$-mixed equivariant layer, which mixes layers constrained to
different groups to …
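The truncated abstract names two constructions: an equivariance relaxation morphism and a $[G]$-mixed equivariant layer. As a rough illustration only, the sketch below is a toy PyTorch rendering of those ideas, not the paper's implementation; every name in it (C4Conv, relax_c4_to_c2, MixedEquivariantConv, alpha) is hypothetical. It shows (1) a lifting convolution constrained to C4 (planar 90-degree rotations), (2) a function-preserving reparameterization of its weight sharing onto the subgroup C2, and (3) a learned mixture of a constrained branch and an unconstrained branch.

    # Hypothetical sketch of the concepts named in the abstract; not the paper's code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def rot(w: torch.Tensor, k: int) -> torch.Tensor:
        """Rotate a conv filter by k * 90 degrees in its spatial dims."""
        return torch.rot90(w, k, dims=(-2, -1))

    class C4Conv(nn.Module):
        """Lifting convolution constrained to C4: one base filter is shared
        across the four rotations, so rotating the input permutes the group
        axis of the output."""

        def __init__(self, in_ch: int, out_ch: int, k: int = 3):
            super().__init__()
            self.out_ch = out_ch
            self.weight = nn.Parameter(0.1 * torch.randn(out_ch, in_ch, k, k))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Stack the four rotated copies of the shared filter.
            w = torch.cat([rot(self.weight, g) for g in range(4)], dim=0)
            y = F.conv2d(x, w, padding="same")
            b, _, h, wd = y.shape
            return y.view(b, 4, self.out_ch, h, wd)  # (batch, |C4|, C, H, W)

    def relax_c4_to_c2(layer: C4Conv) -> nn.ParameterList:
        """Relaxation sketch: re-express the C4 weight sharing as two
        independent filters, one per coset of C2 = {0, 180} in C4.
        At initialization the reparameterized layer computes the same
        function; training can then break the tie, leaving only the
        C2 constraint."""
        w = layer.weight.detach()
        return nn.ParameterList(
            [nn.Parameter(w.clone()), nn.Parameter(rot(w, 1).clone())]
        )

    class MixedEquivariantConv(nn.Module):
        """Toy analogue of a mixed layer: a learned convex combination of a
        C4-constrained branch and an unconstrained branch, so the degree of
        equivariance becomes a differentiable architectural choice."""

        def __init__(self, in_ch: int, out_ch: int, k: int = 3):
            super().__init__()
            self.equiv = C4Conv(in_ch, out_ch, k)
            self.free = nn.Conv2d(in_ch, 4 * out_ch, k, padding="same")
            self.alpha = nn.Parameter(torch.zeros(1))  # mixing logit

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            a = torch.sigmoid(self.alpha)
            y_free = self.free(x)
            b, _, h, w = y_free.shape
            return a * self.equiv(x) + (1 - a) * y_free.view(b, 4, -1, h, w)

The intuition this toy follows: the relaxation expands one shared filter into one filter per coset of the subgroup, initialized so the layer's output is unchanged, after which training may diverge the copies; the mixing logit alpha makes the strength of the constraint itself trainable.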

Tags: arxiv, networks, neural networks, optimization, subgroups
