Causal Deep Learning: Causal Capsules and Tensor Transformers
- Source: Proceedings of the 26th International Conference on Pattern Recognition (ICPR 2022), Montreal, Canada, Aug. 21-25, 2022
- Publication Year: 2022
Abstract
- We derive a set of causal deep neural networks whose architectures are a consequence of tensor (multilinear) factor analysis. Forward causal questions are addressed with a neural network architecture composed of causal capsules and a tensor transformer. The former estimate a set of latent variables that represent the causal factors, and the latter governs their interaction. Causal capsules and tensor transformers may be implemented using shallow autoencoders, but for a scalable architecture we employ block algebra and derive a deep neural network composed of a hierarchy of autoencoders. An interleaved kernel hierarchy preprocesses the data, resulting in a hierarchy of kernel tensor factor models. Inverse causal questions are addressed with a neural network that implements multilinear projection and estimates the causes of effects. As an alternative to aggressive bottleneck dimension reduction or regularized regression that may camouflage an inherently underdetermined inverse problem, we prescribe modeling different aspects of the mechanism of data formation with piecewise tensor models whose multilinear projections are well-defined and produce multiple candidate solutions. Our forward and inverse neural network architectures are suitable for asynchronous parallel computation.
- Comment: The document contains both the article and the supplemental material.
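To make the abstract's forward/inverse setup concrete, below is a minimal NumPy sketch, not the authors' implementation, of a Tucker-style multilinear factor model: per-mode factor matrices stand in for the latent variables that "causal capsules" would estimate, a core tensor plays the role of the "tensor transformer" governing their interaction, and the inverse question is answered by a multilinear projection with mode-wise pseudoinverses. All sizes, variable names, and the random placeholder factors are illustrative assumptions.

```python
# Illustrative sketch of a multilinear (Tucker-style) factor model and its
# multilinear projection. Shapes and random factors are assumptions, not the
# paper's trained capsules/transformer.
import numpy as np

rng = np.random.default_rng(0)

# Observation tensor of shape I = (I1, I2, I3); per-mode latent dimensions R.
I, R = (8, 6, 4), (3, 2, 2)

# Core tensor ("tensor transformer" analogue): governs factor interaction.
core = rng.standard_normal(R)

# Per-mode factor matrices; in the paper these would be estimated by
# autoencoder-style causal capsules. Here they are random placeholders.
U = [rng.standard_normal((i, r)) for i, r in zip(I, R)]

def mode_n_product(T, M, n):
    """Multiply tensor T by matrix M along mode n."""
    T = np.moveaxis(T, n, 0)
    shape = T.shape
    out = M @ T.reshape(shape[0], -1)
    return np.moveaxis(out.reshape((M.shape[0],) + shape[1:]), 0, n)

def forward(core, factors):
    """Forward question: synthesize an observation from latent causal factors."""
    X = core
    for n, M in enumerate(factors):
        X = mode_n_product(X, M, n)
    return X

def multilinear_projection(X, factors):
    """Inverse question: project the observation back onto the core by applying
    the mode-wise pseudoinverses of the factor matrices."""
    Z = X
    for n, M in enumerate(factors):
        Z = mode_n_product(Z, np.linalg.pinv(M), n)
    return Z

X = forward(core, U)                    # data "caused" by the latent factors
Z = multilinear_projection(X, U)        # recovered interaction tensor
print(np.allclose(Z, core))             # True (up to numerical error)
```

In this sketch the projection recovers the core exactly only because every factor matrix has full column rank; when that fails, the inverse problem is underdetermined, which is the situation the abstract addresses with piecewise tensor models that yield multiple well-defined candidate solutions.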
Details
- Database: arXiv
- Journal: Proceedings of the 26th International Conference on Pattern Recognition (ICPR 2022), Montreal, Canada, Aug. 21-25, 2022
- Publication Type: Report
- Accession number: edsarx.2301.00314
- Document Type: Working Paper