
All-In-One: Artificial Association Neural Networks

Authors :
Kim, Seokjun
Jang, Jaeeun
Kim, Hyeoncheol
Publication Year :
2021

Abstract

Most deep learning models are limited to specific datasets or tasks because of network structures that use fixed layers. In this paper, we discuss the differences between existing neural networks and real human neurons, propose association networks to connect existing models, and describe multiple types of deep learning tasks performed using a single structure. Further, we propose a new neural data structure that can express all the basic models of existing neural networks in a tree structure. We also propose an approach in which information propagates from the leaf nodes to the root node using the proposed recursive convolution approach (i.e., depth-first convolution), after which feed-forward propagation is performed. Thus, we design a "data-based," as opposed to a "model-based," neural network. In the experiments conducted, we compared the learning performance of models specializing in specific domains with that of a model simultaneously learning various domains using an association network. The association network learned well, without significant performance degradation relative to the individually trained models. In addition, its performance results were similar to those of the specialized models, and the output of the tree contained all information from the tree. Finally, we developed a theory for using arbitrary input data and learning all data simultaneously.

Comment: Model agnostic, structurally free, graph neural networks, neural data structure, recursive neural networks
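The abstract's central mechanism, depth-first (leaf-to-root) recursive propagation over a tree of sub-networks, can be sketched as follows. This is a minimal illustration and not the authors' implementation: the AssociationNode class, the layer sizes, and the concatenation-based merge of child outputs are assumptions made for the example, since the abstract does not specify them.

# Minimal sketch (not the paper's code): a tree-structured neural data
# structure in which information propagates from the leaves to the root
# via depth-first recursion. Node names, dimensions, and the merge rule
# (concatenation) are illustrative assumptions.
import torch
import torch.nn as nn


class AssociationNode(nn.Module):
    """A tree node holding a small sub-network; its children feed into it."""

    def __init__(self, in_dim: int, out_dim: int, children=None):
        super().__init__()
        self.children_nodes = nn.ModuleList(children or [])
        self.fc = nn.Linear(in_dim, out_dim)

    def forward(self, leaf_inputs: dict) -> torch.Tensor:
        if not self.children_nodes:
            # Leaf: consume its own raw input (keyed by object id in this sketch).
            x = leaf_inputs[id(self)]
        else:
            # Internal node: recurse depth-first into the children, then
            # concatenate their outputs before applying this node's layer.
            x = torch.cat([c(leaf_inputs) for c in self.children_nodes], dim=-1)
        return torch.relu(self.fc(x))


# Two leaves (e.g., two input modalities) feeding a single root node.
leaf_a = AssociationNode(in_dim=8, out_dim=4)
leaf_b = AssociationNode(in_dim=6, out_dim=4)
root = AssociationNode(in_dim=8, out_dim=3, children=[leaf_a, leaf_b])

inputs = {id(leaf_a): torch.randn(1, 8), id(leaf_b): torch.randn(1, 6)}
print(root(inputs).shape)  # torch.Size([1, 3])

Because any leaf can accept a different input type, a single tree of this kind could, in principle, combine heterogeneous sub-models, which is the "all-in-one" idea the abstract describes.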

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2111.00424
Document Type :
Working Paper