Differentiable learning of matricized DNFs and its application to Boolean networks.
- Source :
- Machine Learning; Aug2023, Vol. 112 Issue 8, p2821-2843, 23p
- Publication Year :
- 2023
-
Abstract
- Boolean networks (BNs) are well-studied models of genomic regulation in biology in which nodes are genes and their state transitions are controlled by Boolean functions. We propose to learn Boolean functions as Boolean formulas in disjunctive normal form (DNF) with an explainable neural network, Mat_DNF, and apply it to learning BNs. By directly expressing DNFs as a pair of binary matrices, we learn them with a single-layer neural network by minimizing a logically inspired non-negative cost function to zero. As a result, every parameter in the network has a clear meaning, representing a conjunction or a literal in the learned DNF. We also prove that learning DNFs by the proposed approach is equivalent to inferring interpolants in logic between the positive and negative data. We applied our approach to learning three literature-curated BNs and confirmed its effectiveness. We also examine how generalization occurs when learning data is scarce. In doing so, we introduce two new operations that can improve accuracy, and hence generalizability, for scarce data. The first is to append a noise vector to the input learning vector. The second is to continue learning even after the learning error becomes zero. The first operation can be explained in terms of the second. These two operations help us choose a learnable DNF, i.e., a root of the cost function, that achieves high generalizability. [ABSTRACT FROM AUTHOR]
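The abstract's core idea, representing a DNF directly as a pair of binary matrices and evaluating it with matrix operations, can be illustrated with a minimal sketch. This is an assumed encoding for illustration only (the names `P`, `N`, and `eval_dnf` are hypothetical, not taken from the paper): one matrix selects the positive literals and one the negated literals of each conjunction, and the DNF is the disjunction over conjunction rows.

```python
import numpy as np

def eval_dnf(P, N, x):
    """Evaluate a matricized DNF on a binary input vector x.

    P[j, i] = 1 if conjunction j contains the literal x_i,
    N[j, i] = 1 if conjunction j contains the literal NOT x_i.
    (Hypothetical encoding; the paper's exact matricization may differ.)
    """
    x = np.asarray(x, dtype=int)
    # Conjunction j is satisfied iff every selected positive literal is 1
    # and every selected negative literal is 0.
    pos_ok = (P * (1 - x)).sum(axis=1) == 0
    neg_ok = (N * x).sum(axis=1) == 0
    conj = pos_ok & neg_ok
    # The DNF is the OR over all conjunction rows.
    return int(conj.any())

# Example: f(x1, x2) = (x1 AND NOT x2) OR (NOT x1 AND x2), i.e., XOR.
P = np.array([[1, 0], [0, 1]])
N = np.array([[0, 1], [1, 0]])
print([eval_dnf(P, N, [a, b]) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

Because every matrix entry marks the presence of a specific literal in a specific conjunction, each learned parameter is directly readable as part of the final formula, which is the explainability property the abstract highlights.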
- Subjects :
- COST functions
- BOOLEAN functions
Details
- Language :
- English
- ISSN :
- 08856125
- Volume :
- 112
- Issue :
- 8
- Database :
- Complementary Index
- Journal :
- Machine Learning
- Publication Type :
- Academic Journal
- Accession number :
- 169749391
- Full Text :
- https://doi.org/10.1007/s10994-023-06346-5