Learn to Predict Sets Using Feed-Forward Neural Networks.

Authors :
Rezatofighi H
Zhu T
Kaskman R
Motlagh FT
Shi JQ
Milan A
Cremers D
Leal-Taixe L
Reid I
Source :
IEEE transactions on pattern analysis and machine intelligence [IEEE Trans Pattern Anal Mach Intell] 2022 Dec; Vol. 44 (12), pp. 9011-9025. Date of Electronic Publication: 2022 Nov 07.
Publication Year :
2022

Abstract

This paper addresses the task of set prediction using deep feed-forward neural networks. A set is a collection of elements which is invariant under permutation, and the size of a set is not fixed in advance. Many real-world problems, such as image tagging and object detection, have outputs that are naturally expressed as sets of entities. This creates a challenge for traditional deep neural networks, which naturally deal with structured outputs such as vectors, matrices or tensors. We present a novel approach for learning to predict sets with unknown permutation and cardinality using deep neural networks. In our formulation we define a likelihood for a set distribution represented by a) two discrete distributions defining the set cardinality and permutation variables, and b) a joint distribution over set elements with a fixed cardinality. Depending on the problem under consideration, we define different training models for set prediction using deep neural networks. We demonstrate the validity of our set formulations on relevant vision problems such as: 1) multi-label image classification, where we outperform the other competing methods on the PASCAL VOC and MS COCO datasets, 2) object detection, for which our formulation outperforms popular state-of-the-art detectors, and 3) a complex CAPTCHA test, where we observe that, surprisingly, our set-based network acquired the ability to mimic arithmetic without any rules being explicitly coded.
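To make the ingredients of such a set likelihood concrete, the following is a minimal, stdlib-only sketch (not the paper's exact training objective): a permutation-invariant set loss combining (a) a negative log-likelihood over a predicted cardinality distribution with (b) the best assignment cost between predicted and ground-truth elements, found here by brute force over permutations. The function names, squared-error element cost, and brute-force matching are illustrative assumptions, not details taken from the paper.

```python
import itertools
import math

def set_loss(pred_elements, card_probs, target):
    """Hypothetical permutation-invariant set loss.

    pred_elements: list of predicted element vectors (fixed max size).
    card_probs: predicted probability for each cardinality 0..max_size.
    target: list of ground-truth element vectors (len(target) <= max_size).
    """
    k = len(target)
    # (a) cardinality term: NLL of the true set size under the predicted
    # discrete cardinality distribution.
    card_nll = -math.log(card_probs[k] + 1e-12)

    # (b) permutation term: cost of the cheapest matching of k predictions
    # to the k targets (squared error per element, an assumed cost).
    def cost(p, t):
        return sum((a - b) ** 2 for a, b in zip(p, t))

    if k == 0:
        best = 0.0
    else:
        best = min(
            sum(cost(pred_elements[i], t) for i, t in zip(perm, target))
            for perm in itertools.permutations(range(len(pred_elements)), k)
        )
    return card_nll + best

# Example: two predicted elements that match the targets under the
# swapped permutation, and a cardinality head confident in size 2.
loss = set_loss([[1.0], [0.0]], [0.1, 0.1, 0.8], [[0.0], [1.0]])
```

In practice the exhaustive search over permutations would be replaced by a polynomial-time assignment solver (e.g. the Hungarian algorithm), and the element cost by a problem-specific likelihood; the sketch only illustrates why the minimum over permutations makes the loss invariant to the ordering of the ground-truth set.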

Details

Language :
English
ISSN :
1939-3539
Volume :
44
Issue :
12
Database :
MEDLINE
Journal :
IEEE transactions on pattern analysis and machine intelligence
Publication Type :
Academic Journal
Accession number :
34705634
Full Text :
https://doi.org/10.1109/TPAMI.2021.3122970