Harmformer: Harmonic Networks Meet Transformers for Continuous Roto-Translation Equivariance
Tomáš Karella, Adam Harmanec, Jan Kotera, Jan Blažek, and Filip Šroubek
Appears in the NeurIPS 2024 Workshop on Symmetry and Geometry in Neural Representations
CNNs exhibit inherent equivariance to image translation, leading to efficient parameter and data usage, faster learning, and improved robustness. Translation-equivariant networks have been successfully extended to rotation: group convolutions handle discrete rotation groups, while harmonic functions handle the continuous rotation group covering the full 360° range. We explore the compatibility of the self-attention mechanism with full rotation equivariance, in contrast to previous studies that focused on discrete rotations. We introduce the Harmformer, a harmonic transformer with a convolutional stem that achieves equivariance to both translation and continuous rotation. Accompanied by an end-to-end equivariance proof, the Harmformer not only outperforms previous equivariant transformers but also remains inherently stable under any continuous rotation, even without seeing rotated samples during training.
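To illustrate the harmonic-function idea the abstract refers to, here is a minimal NumPy sketch (not the paper's implementation; filter order `m` and width `sigma` are illustrative choices). A circular harmonic filter W(r, θ) = R(r)·exp(imθ) responds to a rotated input with a pure phase shift exp(imα), so the response magnitude is rotation-invariant. A 90° rotation via `np.rot90` is an exact grid permutation, which makes the property verifiable up to floating-point error.

```python
import numpy as np

# Illustrative sketch: rotation equivariance of a circular harmonic filter.
# Parameters (m, sigma, grid size) are assumptions, not from the paper.
rng = np.random.default_rng(0)
N, m, sigma = 15, 1, 3.0
c = N // 2
ys, xs = np.mgrid[-c:c + 1, -c:c + 1]
r = np.hypot(ys, xs)
theta = np.arctan2(ys, xs)
# A radial profile that vanishes at the origin keeps the m != 0 filter
# well defined at r = 0.
W = r * np.exp(-r**2 / (2 * sigma**2)) * np.exp(1j * m * theta)

img = rng.standard_normal((N, N))
f = np.sum(img * np.conj(W))                 # response to the image
f_rot = np.sum(np.rot90(img) * np.conj(W))   # response to the rotated image

# Rotating the input only multiplies the response by exp(i*m*pi/2):
print(abs(f_rot) - abs(f))                          # ~0: magnitude invariant
print(abs(f_rot - np.exp(1j * m * np.pi / 2) * f))  # ~0: pure phase shift
```

Stacking such filters (with the phases tracked through every layer, including attention) is what lets a network remain equivariant under arbitrary continuous rotations rather than a fixed discrete set.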