1. Latent Space Translation via Inverse Relative Projection
- Authors
Maiorca, Valentino; Moschella, Luca; Fumero, Marco; Locatello, Francesco; Rodolà, Emanuele
- Subjects
Computer Science - Machine Learning
- Abstract
The emergence of similar representations between independently trained neural models has sparked significant interest in the representation learning community, leading to the development of various methods to enable communication between latent spaces. "Latent space communication" can be achieved in two ways: i) by independently mapping the original spaces to a shared or relative one; ii) by directly estimating a transformation from a source latent space to a target one. In this work, we combine the two into a novel method that obtains latent space translation through the relative space. By formalizing the invertibility of angle-preserving relative representations and assuming the scale invariance of decoder modules in neural models, we can effectively use the relative space as an intermediary, independently projecting onto and from other semantically similar spaces. Extensive experiments over various architectures and datasets validate our scale invariance assumption and demonstrate the high accuracy of our method in latent space translation. We also apply our method to zero-shot stitching between arbitrary pre-trained text and image encoders and their classifiers, even across modalities. Our method has significant potential for enabling practical model reuse via compositionality.
- Comment
arXiv admin note: text overlap with arXiv:2311.00664, arXiv:2406.11014
- Published
2024
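
The translation pipeline the abstract outlines can be illustrated compactly. The following is a minimal sketch, not the authors' implementation: it assumes the relative representation is the vector of cosine similarities (angle-preserving) against a set of parallel anchors (the same data points encoded by both models), and that inversion via the pseudoinverse of the normalized anchor matrix recovers the target latent only up to scale, which is where the decoder scale-invariance assumption comes in. All function names are illustrative.

```python
import numpy as np

def relative_projection(x, anchors):
    """Angle-preserving relative representation: cosine similarity of x
    against each anchor. x: (d,), anchors: (k, d), ideally with k >= d."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return a @ (x / np.linalg.norm(x))

def inverse_relative_projection(r, anchors):
    """Map a relative representation back to absolute coordinates via the
    pseudoinverse of the normalized anchor matrix. The norm of the original
    latent is not recoverable, so the result is a direction only; the
    assumed scale invariance of downstream decoders makes this sufficient."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return np.linalg.pinv(a) @ r

def translate(x_src, anchors_src, anchors_tgt):
    """Translate a source latent into the target space, using the relative
    space as the shared intermediary."""
    r = relative_projection(x_src, anchors_src)
    return inverse_relative_projection(r, anchors_tgt)
```

In a zero-shot stitching setup like the one the abstract mentions, the output of `translate` would then be fed to the target model's decoder or classifier directly, without any retraining.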