TY - JOUR
AU - Lin, Feng
AU - Xu, Haohang
AU - Li, Houqiang
AU - Xiong, Hongkai
AU - Qi, Guo-Jun
PY - 2021/05/18
Y2 - 2024/03/28
TI - Auto-Encoding Transformations in Reparameterized Lie Groups for Unsupervised Learning
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 35
IS - 10
SE - AAAI Technical Track on Machine Learning III
DO - 10.1609/aaai.v35i10.17044
UR - https://ojs.aaai.org/index.php/AAAI/article/view/17044
SP - 8610-8617
AB - Unsupervised training of deep representations has recently demonstrated remarkable potential for mitigating the prohibitive expense of annotating labeled data. Among such methods is predicting transformations as a pretext task for self-training representations, which has shown great promise for unsupervised learning. However, existing approaches in this category learn representations by either treating a discrete set of transformations as separate classes or using the Euclidean distance as the metric to minimize the errors between transformations. None of them has been dedicated to revealing the vital role of the geometry of transformation groups in learning representations. Indeed, an image must transform continuously along the curved manifold of a transformation group rather than through a straight line in the forbidden ambient Euclidean space. This suggests using the geodesic distance to minimize the errors between the estimated and ground-truth transformations. In particular, we focus on homographies, a general group of planar transformations containing the Euclidean, similarity, and affine transformations as special cases. To avoid explicitly computing the intractable Riemannian logarithm, we project homographies onto an alternative group of rotation transformations SR(3) with a tractable form of geodesic distance. Experiments demonstrate that the proposed approach to Auto-Encoding Transformations exhibits superior performance on a variety of recognition problems.
ER -