Block Neural Network Avoids Catastrophic Forgetting When Learning Multiple Task
- Publication Year:
- 2017
Abstract
- In the present work we propose a deep feed-forward network architecture that can be trained according to a sequential learning paradigm, in which tasks of increasing difficulty are learned one after another while avoiding catastrophic forgetting. The proposed architecture can re-use the features learned on previous tasks when learning a new, related task. As a result, it needs fewer computational resources (neurons and connections) and less data to learn the new task than a network trained from scratch.
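The core idea described in the abstract, growing a new block of neurons per task while freezing the blocks trained on earlier tasks so their features can be re-used but not overwritten, can be sketched as follows. This is a minimal illustration of that general block-growing scheme, not the paper's exact method; the class name `BlockNet`, the layer sizes, and the single-hidden-layer-per-block design are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class BlockNet:
    """Hypothetical sketch: one hidden block per task.

    Old blocks are frozen when a new task arrives, so features
    learned on earlier tasks are preserved (no catastrophic
    forgetting) and can feed into the new block (feature re-use).
    """

    def __init__(self, in_dim, hidden):
        self.in_dim = in_dim
        self.hidden = hidden
        self.blocks = []  # list of (weights, frozen) per task
        self.heads = []   # one output head per task

    def add_task(self):
        # Freeze every existing block before learning the new task.
        self.blocks = [(W, True) for W, _ in self.blocks]
        # The new block sees the raw input plus all frozen blocks'
        # features, so related tasks can re-use old representations.
        fan_in = self.in_dim + self.hidden * len(self.blocks)
        W = rng.normal(scale=0.1, size=(fan_in, self.hidden))
        self.blocks.append((W, False))
        self.heads.append(rng.normal(scale=0.1, size=(self.hidden, 1)))

    def features(self, x):
        feats = []
        for W, _ in self.blocks:
            inp = np.concatenate([x] + feats, axis=1)
            feats.append(relu(inp @ W))
        return feats

    def predict(self, x, task):
        return self.features(x)[task] @ self.heads[task]

# Adding a second task leaves task-0 weights untouched, so
# task-0 predictions are identical before and after.
net = BlockNet(in_dim=4, hidden=8)
net.add_task()
x = rng.normal(size=(3, 4))
before = net.predict(x, task=0)
net.add_task()
after = net.predict(x, task=0)
print(np.allclose(before, after))  # True: old task is not forgotten
```

Only the newest block and head would be trained on each task; training code is omitted here since the abstract does not specify the optimization procedure.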
Details
- Database:
- arXiv
- Publication Type:
- Report
- Accession number:
- edsarx.1711.10204
- Document Type:
- Working Paper