1. Brain2GAN: Reconstructing perceived faces from the primate brain via StyleGAN3
- Author
- Dado, T.M., Papale, P., Lozano, A., Le, L., Wang, F., Gerven, M.A.J. van, Roelfsema, P.R., Güçlütürk, Y., and Güçlü, U.
- Abstract
ICLR 2023: The Eleventh International Conference on Learning Representations (Kigali, Rwanda, May 1-5, 2023). Neural coding characterizes the relationship between stimuli and their corresponding neural responses. The use of synthesized yet photorealistic stimuli produced by generative adversarial networks (GANs) allows for superior control over these data: the underlying feature representations that account for the semantics of the synthesized data are known a priori, and their relationship to the stimuli is exact rather than approximated post hoc by feature-extraction models. We exploit this property in neural decoding of multi-unit activity responses that we recorded from the primate brain upon presentation of synthesized face images in a passive fixation experiment. The face reconstructions we acquired from brain activity were strikingly similar to the originally perceived face stimuli. This provides strong evidence that the neural face manifold and the disentangled w-latent space of StyleGAN3 (rather than the z-latent space of arbitrary GANs or the other feature representations we considered) share how they represent the high-level semantics of the high-dimensional space of faces.
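The decoding approach the abstract describes, mapping recorded multi-unit activity to w-latents that a generator then turns back into face images, can be sketched as a simple linear regression. This is a minimal illustration only: the array shapes, random data, the ridge-regression choice, and the placeholder generator call are all assumptions, not the authors' actual pipeline.

```python
import numpy as np

# Hypothetical sketch: decode w-latents from multi-unit activity (MUA) with
# ridge regression; a pretrained StyleGAN3 generator G would then map the
# predicted latents to reconstructed face images. All shapes and the
# regression choice are illustrative assumptions.

rng = np.random.default_rng(0)
n_train, n_units, latent_dim = 200, 128, 512

X = rng.standard_normal((n_train, n_units))     # MUA responses (stimuli x recording sites)
W = rng.standard_normal((n_train, latent_dim))  # w-latents of the presented synthesized faces

# Closed-form ridge regression: B = (X^T X + lam * I)^{-1} X^T W
lam = 1.0
B = np.linalg.solve(X.T @ X + lam * np.eye(n_units), X.T @ W)

# Decode latents for held-out responses.
X_test = rng.standard_normal((5, n_units))
W_pred = X_test @ B  # decoded latents, shape (5, latent_dim)
# images = G(W_pred)  # a StyleGAN3 generator would synthesize the reconstructions
print(W_pred.shape)
```

Because the w-latents of the synthesized stimuli are known exactly by construction, the regression targets are noise-free on the stimulus side, which is the control advantage the abstract highlights.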
- Published
- 2023