75 results for "Hopfield network"
Search Results
2. Solving HornSAT Fuzzy Logic Neuro-symbolic Integration
- Author
- Azizan, Farah Liyana, Sathasivam, Saratha, Ali, Majid Khan Majahar, and Alzaeemi, Shehab Abdulhabib Saeed
- Published
- 2022
- Full Text
- View/download PDF
3. Adapted Model Neural-Like Hopfield Network and the Algorithm of Its Training for Finding the Roots Systems of Linear Algebraic Equations
- Author
- Gluhov, Alexander, Baranovskiy, Anatoly, Fomenko, Yulia, and Bochkov, Alexander
- Published
- 2021
- Full Text
- View/download PDF
4. M-ary Hopfield Neural Network Based Associative Memory Formulation: Limit-Cycle Based Sequence Storage and Retrieval
- Author
- Ladwani, Vandana M. and Ramasubramanian, V.
- Published
- 2021
- Full Text
- View/download PDF
5. Challenges for the Optimization of Drug Therapy in the Treatment of Cancer
- Author
- Carels, Nicolas, Conforte, Alessandra Jordano, Lima, Carlyle Ribeiro, and da Silva, Fabricio Alves Barbosa
- Published
- 2020
- Full Text
- View/download PDF
6. Interval Methods for Seeking Fixed Points of Recurrent Neural Networks
- Author
- Kubica, Bartłomiej Jacek, Hoser, Paweł, and Wiliński, Artur
- Published
- 2020
- Full Text
- View/download PDF
7. A New Approximation Algorithm for the d-dimensional Knapsack Problem Based on Hopfield Networks
- Author
- Wu, Hsin-Lung, Chang, Jui-Sheng, and Chang, Jen-Chun
- Published
- 2019
- Full Text
- View/download PDF
8. Hopfield Associative Memory with Quantized Weights
- Author
- Tarkov, Mikhail S.
- Published
- 2019
- Full Text
- View/download PDF
9. Multi-modal Associative Storage and Retrieval Using Hopfield Auto-associative Memory Network
- Author
- Shriwas, Rachna, Joshi, Prasun, Ladwani, Vandana M., and Ramasubramanian, V.
- Published
- 2019
- Full Text
- View/download PDF
10. Writing to the Hopfield Memory via Training a Recurrent Network
- Author
- Bao, Han, Zhang, Richong, Mao, Yongyi, and Huai, Jinpeng
- Published
- 2019
- Full Text
- View/download PDF
11. Fingerprint Retrieval Using a Specialized Ensemble of Attractor Networks
- Author
- González, Mario, Dávila, Carlos, Dominguez, David, Sánchez, Ángel, and Rodriguez, Francisco B.
- Published
- 2019
- Full Text
- View/download PDF
12. Bistable Sigmoid Networks
- Author
- Uschakow, Stanislav, Fischer, Jörn, and Ihme, Thomas
- Published
- 2019
- Full Text
- View/download PDF
13. Ensemble of Attractor Networks for 2D Gesture Retrieval
- Author
- Dávila, Carlos, González, Mario, Pérez-Medina, Jorge-Luis, Dominguez, David, Sánchez, Ángel, and Rodriguez, Francisco B.
- Published
- 2019
- Full Text
- View/download PDF
14. An Overview of Different Neural Network Architectures
- Author
- Skansi, Sandro
- Published
- 2018
- Full Text
- View/download PDF
15. Guaranteed Training Set for Associative Networks
- Author
- Volna, Eva and Kotyrba, Martin
- Published
- 2017
- Full Text
- View/download PDF
16. Capacity and Retrieval of a Modular Set of Diluted Attractor Networks with Respect to the Global Number of Neurons
- Author
- González, Mario, Dominguez, David, Sánchez, Ángel, and Rodríguez, Francisco B.
- Published
- 2017
- Full Text
- View/download PDF
17. Attractor Basin Analysis of the Hopfield Model: The Generalized Quadratic Knapsack Problem
- Author
- García, Lucas, Talaván, Pedro M., and Yáñez, Javier
- Published
- 2017
- Full Text
- View/download PDF
18. Application of Hopfield Neural Network to the N-Queens Problem
- Author
- Lapushkin, Andrei A.
- Published
- 2016
- Full Text
- View/download PDF
19. Hopfield Network with Interneuronal Connections Based on Memristor Bridges
- Author
- Tarkov, Mikhail S.
- Published
- 2016
- Full Text
- View/download PDF
20. Artificial Neural Network
- Author
- Rathore, Heena
- Published
- 2016
- Full Text
- View/download PDF
21. Discovery of Salient Low-Dimensional Dynamical Structure in Neuronal Population Activity Using Hopfield Networks
- Author
- Effenberger, Felix and Hillar, Christopher
- Published
- 2015
- Full Text
- View/download PDF
22. The Optical Mouse: Early Biomimetic Embedded Vision
- Author
- Lyon, Richard F.
- Published
- 2014
- Full Text
- View/download PDF
23. Neural Networks
- Author
- Lynch, Stephen
- Published
- 2014
- Full Text
- View/download PDF
24. Framework of Optimization Methodology with Use of an Intelligent Hybrid Transport Management System Based on Hopfield Network and Travelling Salesman Problem
- Author
- Kubiak, Natalia and Stachowiak, Agnieszka
- Published
- 2013
- Full Text
- View/download PDF
25. On Efficiency and Predictability of the Dynamics of Discrete-Time Boolean Networks
- Author
- Predrag T. Tosic
- Subjects
- Hopfield network, Theoretical computer science, Computational complexity theory, Discrete time and continuous time, Computer science, Context (language use), Predictability, Network dynamics, Cellular automaton, Decidability
- Abstract
We discuss possible interpretations of “(in)efficient” and “(un)predictable” systems when those concepts are applied to discrete networks and their dynamics. We are interested in computational notions of the predictability and efficiency of network dynamics. A deterministic discrete network's dynamics is efficient if it settles fairly quickly into an appropriate stationary pattern; that is, efficiency is essentially synonymous with a short transient chain before a stationary system state is reached. An inefficient dynamical system, then, is one that takes a long time to converge to a stationary behavior. The issue of (in)efficiency is related to, but different from, that of the (un)predictability of the network dynamics. We call dynamics predictable if the “ultimate destiny” (i.e., convergence to a stationary behavior) can be efficiently determined or predicted without performing a step-by-step simulation of the network's dynamics, unless such simulation itself can be done efficiently. For discrete systems with finite configuration spaces, all properties of their dynamics are decidable; however, for non-trivial network sizes, various properties may or may not be predictable within reasonable computational resources. In that context, we discuss how the computational complexity of reachability and enumeration problems for several classes of cellular and network automata relates to whether those networks' dynamics are predictable. We then discuss some examples of cellular automata and other Boolean networks whose dynamics are (i) both predictable and efficient, (ii) efficient but unpredictable, (iii) predictable yet inefficient, and finally (iv) both unpredictable and inefficient. In particular, we briefly overview several classes of Boolean networks that fall into categories (i), (ii) and (iv).
- Published
- 2021
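The transient-versus-attractor distinction in this abstract can be made concrete with a short simulation. The 3-node network and its Boolean rules below are illustrative assumptions, not taken from the paper:

```python
import itertools

def step(state):
    """Synchronous update: every node applies its Boolean rule at once."""
    a, b, c = state
    return (b and c, a or c, not a)

def transient_and_cycle(state):
    """Iterate until a state repeats; return (transient length, cycle period)."""
    seen = {}                    # state -> time step of first visit
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    return seen[state], t - seen[state]

# The configuration space is finite, so every such property is decidable by
# exhaustive enumeration (feasible only for small n, as the abstract notes):
profile = {s: transient_and_cycle(s)
           for s in itertools.product([False, True], repeat=3)}
```

In the abstract's terms, a small first component of each `profile` value means efficient dynamics; predicting that component without running `step` repeatedly is what may be computationally hard.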
26. Quantum Based Deep Learning Models for Pattern Recognition
- Author
- Kapil Kumar Soni, Prakhar Shrivastava, and Akhtar Rasool
- Subjects
- Hopfield network, Hebbian theory, Artificial neural network, Computer science, Deep learning, Pattern recognition (psychology), Boltzmann machine, Pattern recognition, Data preprocessing, Artificial intelligence, Quantum computer
- Abstract
Machine learning models perform pattern recognition by extracting relevant patterns, but they cannot efficiently process data sets that require layered interaction. Deep learning models therefore exploit artificial neural networks to process data through layered abstraction with massive parallelism, although classical implementations of such models may be impractical because of the processing and storage demands of large neural networks. Current research explores the potential of quantum computation and exploits its inherent parallelism, since a quantum machine can perform exponentially many operations in a single execution step. Possible classical designs for pattern recognition with deep learning are the Hopfield network and the Boltzmann machine; their quantum equivalents can remain effective and overcome the processing limitations of the classical models by incorporating Grover's search method and quantum Hebbian learning. This article motivates deep learning models, introduces the emergence of quantum computation, discusses the requisite data preprocessing techniques, and then proposes quantum equivalents of the classical deep learning models, together with their algorithms, a complexity comparison, and a speedup analysis, followed by concluding remarks on the effectiveness of the quantum deep learning model and future work.
- Published
- 2021
27. Adapted Model Neural-Like Hopfield Network and the Algorithm of Its Training for Finding the Roots Systems of Linear Algebraic Equations
- Author
- Anatoly Baranovskiy, Yulia Fomenko, Alexander Bochkov, and Alexander Gluhov
- Subjects
- Artificial neural network, Computer science, Iterative method, Structure (category theory), Jacobi method, Hopfield network, Algebraic equation, Matrix (mathematics), MATLAB, Algorithm
- Abstract
An approach is proposed for automatically forming the structure of a Hopfield neural network model and training it (setting its parameters) to solve systems of linear algebraic equations (SLAEs) of arbitrary order. The network model, adapted for solving SLAEs, is configured automatically in the Simulink modeling environment, which interacts with the MATLAB computing system and allows the user to vary the input data (the order, the constant vector, and the coefficient matrix of the system of linear algebraic equations). Results of research on the quality of the solutions found for SLAEs are presented.
- Published
- 2021
28. Mutually Connected Neural Networks
- Author
- Atsuya Oishi and Genki Yagawa
- Subjects
- Hopfield network, Artificial neural network, Computer science, Section (archaeology), Boltzmann machine, Topology
- Abstract
This chapter focuses on mutually connected neural networks. Sect. 4.1 describes the Hopfield network, and Sect. 4.2 the Boltzmann machine.
- Published
- 2021
29. Deep Learning Techniques and COVID-19 Drug Discovery: Fundamentals, State-of-the-Art and Future Directions
- Author
- Hamed Hashemi-Dezaki, Saeedeh Lotfi, Luigi La Spada, Mojgan Dehghani, Morteza Jamshidi, Asal Sabet, Mohammad Mahdi Honari, Pedram Lalbakhsh, Mirhamed Mirmozafari, Saeed Roshani, Jakub Talla, Zahra Malek, Mohammad Jamshidi, Alireza Jamshidi, Ali Lalbakhsh, Sobhan Roshani, Zdeněk Peroutka, Farimah Hadjilooei, Václav Matoušek, and Sahar Ahmadi
- Subjects
- Hopfield network, Deep belief network, Restricted Boltzmann machine, Drug development, Artificial neural network, Process (engineering), Computer science, Deep learning, State (computer science), Artificial intelligence, Data science
- Abstract
The world is in a frustrating situation, exacerbated by the time-consuming process of COVID-19 vaccine design and production. This chapter provides a comprehensive investigation of the fundamentals, the state of the art, and some perspectives for speeding up the design, optimization and production of medicine for COVID-19 based on Deep Learning (DL) methods. The proposed platforms can be used as predictors to forecast antigens during infection, regardless of their abundance and immunogenicity, with no requirement to grow the pathogen in vitro. First, we briefly survey the latest achievements and fundamentals of several DL methodologies, including Deep Boltzmann Machines (DBM), Restricted Boltzmann Machines (RBM), Deep Belief Networks (DBN), the Hopfield network, and Long Short-Term Memory (LSTM). These techniques help us reach an integrated approach to drug development based on non-conventional antigens. We then propose several DL-based platforms for future applications, drawing on the latest publications and medical reports. Considering the evolving data on COVID-19 and its ever-changing nature, we believe this survey can give readers useful ideas and directions for understanding how Artificial Intelligence (AI) can accelerate vaccine design, not only for COVID-19 but also for many other diseases and viruses.
- Published
- 2021
30. M-ary Hopfield Neural Network Based Associative Memory Formulation: Limit-Cycle Based Sequence Storage and Retrieval
- Author
- Vandana M. Ladwani and V. Ramasubramanian
- Subjects
- Hopfield network, Sequence, Artificial neural network, Computer science, Limit cycle, Computer data storage, Limit (mathematics), Content-addressable memory, DUAL (cognitive architecture), Algorithm
- Abstract
In this paper, we examine a Hopfield network composed of multi-state neurons for storing sequence data as limit cycles of the network. Earlier, we presented uni-modal data storage and retrieval, particularly for text, speech, and audio, in a bipolar Hopfield-based associative memory architecture. We extended this to multi-modal data and demonstrated that the Hopfield network can indeed work as a content-addressable memory for multi-modal data. This paper is a step towards realising a wider definition of multi-modality. We present an M-ary Hopfield associative memory model for storing limit-cycle data. The proposed system uses a dual weight-learning mechanism to exhibit limit-cycle behavior in which sequence data can be stored and retrieved. We deal in particular with (a) sequences of images and (b) movie-clip data as instances of limit-cycle data. We also propose and use a two-stage firing mechanism to retrieve the stored sequence data from the limit cycles. We present the trade-off between the number of cycles and the length of the cycles the network can store, and we demonstrate that the network capacity remains of the order of the network size, i.e., O(N), for limit-cycle data. This represents a first-of-its-kind attempt at sequence storage and retrieval in a Hopfield network as limit cycles, particularly with image-sequence and movie-content data at real-world scales.
- Published
- 2021
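The idea of storing a sequence as a limit cycle can be sketched in a few lines. This is an assumption-laden toy, not the paper's M-ary model: it uses bipolar neurons and the classical asymmetric Hebbian rule W = Σ_t x_{t+1} x_tᵀ, so that one synchronous update advances the state one step along the stored cycle. Hadamard-matrix rows are used as patterns so the recall is exactly correct:

```python
import numpy as np

H = np.array([[1]])
for _ in range(6):                      # Sylvester construction of a
    H = np.block([[H, H], [H, -H]])     # 64x64 Hadamard matrix
seq = H[1:5]                            # L=4 mutually orthogonal bipolar patterns
L, N = seq.shape

# Asymmetric heteroassociative weights: map pattern t to pattern (t+1) mod L
W = sum(np.outer(seq[(t + 1) % L], seq[t]) for t in range(L)) / N

x = seq[0].copy()
for t in range(1, 2 * L + 1):           # run around the stored cycle twice
    x = np.sign(W @ x).astype(int)
    assert np.array_equal(x, seq[t % L])  # the sequence is retrieved in order
```

Because the patterns are orthogonal, W @ seq[t] equals seq[t+1] exactly; with random patterns the same rule works only up to a capacity of order N, matching the O(N) capacity the abstract reports.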
31. Hopfield Network Based Approximation Engine for NP Complete Problems
- Author
- T. D. Manjunath, Jyothi S. Nayak, Nesar Prafulla, and S. Samarth
- Subjects
- Hopfield network, Reduction (complexity), Class (set theory), Mathematical optimization, Property (programming), Computer science, Field (mathematics), Graph theory, NP-complete, Time complexity
- Abstract
NP-complete problems belong to a computational class of problems with no known polynomial-time solutions. Many popular and practically useful problems in optimization and graph theory with real-life applications are known to be NP-complete, and solving them exactly is intractable: the most notable characteristic of NP-complete problems is that no efficient exact algorithm for them is known. However, approximate solutions can be obtained in polynomial time, and Hopfield networks are one way to obtain them. Exploiting the mutual reducibility of NP-complete problems and the capability of Hopfield networks to provide approximate solutions in polynomial time, we propose a Hopfield network based approximation engine for these problems.
- Published
- 2020
32. Challenges for the Optimization of Drug Therapy in the Treatment of Cancer
- Author
- Fabricio Alves Barbosa da Silva, Alessandra Jordano Conforte, Carlyle Ribeiro Lima, and Nicolas Carels
- Subjects
- Computer science, Cancer, Inference, Computational biology, Network dynamics, Metastasis, Hopfield network, Drug repositioning, Identification (biology), Personalized medicine
- Abstract
Personalized medicine aims at identifying specific targets for treatment by considering the gene expression profile of each patient individually. We discuss the challenges for personalized oncology to take off and present an approach based on hub inhibition that we are developing. The subtraction of RNA-seq data of tumoral and non-tumoral surrounding tissues in biopsies allows the identification of up-regulated genes in the tumors of patients. Targeting connection hubs in the subnetworks formed by the interactions between the proteins of up-regulated genes is a suitable strategy for the inhibition of tumor growth and metastasis in vitro. The most relevant protein targets may be further analyzed for drug repurposing by computational biology. These subnetworks also allow the inference, by Shannon entropy, of the number of targets to be inhibited according to tumor aggressiveness. Some targets are common across tumoral tissues, but many others are personalized at the molecular level. We also consider additional measures and more sophisticated modeling, which are necessary to improve the rational choice of therapeutic targets and the description of network dynamics. The modeling of attractors through Hopfield networks and ordinary differential equations is given here as an example.
- Published
- 2020
33. Hopfield Networks for Vector Quantization
- Author
- Rafet Sifa, Rajkumar Ramamurthy, and Christian Bauckhage
- Subjects
- Computer science, Kernel density estimation, Vector quantization, Energy minimization, Adiabatic quantum computation, Hopfield network, Set (abstract data type), Quadratic unconstrained binary optimization, Algorithm
- Abstract
We consider the problem of finding representative prototypes within a set of data and solve it using Hopfield networks. Our key idea is to minimize the mean discrepancy between kernel density estimates of the distributions of data points and prototypes. We show that this objective can be cast as a quadratic unconstrained binary optimization problem which is equivalent to a Hopfield energy minimization problem. This result is of current interest as it suggests that vector quantization can be accomplished via adiabatic quantum computing.
- Published
- 2020
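The QUBO-as-Hopfield-energy idea in this abstract can be sketched concretely. Everything below is an assumption for illustration (the kernel, the penalty weight `lam`, and the discrepancy objective are simplified stand-ins, not the authors' formulation): select k of n points as prototypes by minimizing a kernel discrepancy written as E(z) = zᵀQz + qᵀz over binary z, using asynchronous Hopfield-style single-bit updates:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))                      # toy data
n, k, gamma = len(X), 3, 0.5

D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * D2)                           # Gaussian kernel matrix

lam = K.max() * n                                 # penalty enforcing sum(z) = k
Q = K / k**2 + lam                                # quadratic QUBO coefficients
q = -2 * K.sum(axis=0) / (n * k) - 2 * lam * k    # linear QUBO coefficients
# E(z) = z^T Q z + q^T z + const couples the discrepancy term with the
# cardinality penalty lam * (sum(z) - k)^2

def energy(z):
    return z @ Q @ z + q @ z

z = np.zeros(n)
z[rng.choice(n, size=k, replace=False)] = 1       # random feasible start
e0 = energy(z)
for _ in range(20):                               # sweeps of asynchronous updates
    for i in rng.permutation(n):
        for v in (0.0, 1.0):                      # keep the lower-energy bit value
            z_try = z.copy(); z_try[i] = v
            if energy(z_try) < energy(z):
                z = z_try
prototypes = X[z == 1]
```

Each accepted flip strictly lowers E, which is exactly a (deterministic) Hopfield energy-minimization run; the abstract's point is that the same QUBO could instead be handed to an adiabatic quantum annealer.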
34. Interval Methods for Seeking Fixed Points of Recurrent Neural Networks
- Author
- Artur Wiliński, Bartłomiej Jacek Kubica, and Paweł Hoser
- Subjects
- Automatic differentiation, Computer science, Fixed point, Solver, Stationary point, Hopfield network, Nonlinear system, Recurrent neural network, Interval (graph theory), Algorithm
- Abstract
The paper describes an application of interval methods to train recurrent neural networks and investigate their behavior. The HIBA_USNE multithreaded interval solver for nonlinear systems and algorithmic differentiation using ADHC are used. Using interval methods, we can not only train the network, but precisely localize all stationary points of the network. Preliminary numerical results for continuous Hopfield-like networks are presented.
- Published
- 2020
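A much smaller cousin of the interval approach in this entry can be shown for a single continuous Hopfield neuron. This is an illustrative assumption, far simpler than the paper's HIBA_USNE solver: because tanh is increasing, the exact image of an interval [lo, hi] under x ↦ tanh(w·x + b) with w > 0 is [tanh(w·lo+b), tanh(w·hi+b)], so iterating the map and intersecting contracts an initial box onto an enclosure of all fixed points:

```python
import math

def contract(lo, hi, w=0.8, b=0.1, iters=100):
    """Shrink [lo, hi] toward an enclosure of the fixed points of tanh(w*x+b)."""
    assert w > 0                              # monotone case only
    for _ in range(iters):
        lo2, hi2 = math.tanh(w * lo + b), math.tanh(w * hi + b)
        lo, hi = max(lo, lo2), min(hi, hi2)   # intersect box with its image
        if lo > hi:                           # empty box: no fixed point inside
            return None
    return lo, hi

box = contract(-1.0, 1.0)   # w < 1: a unique fixed point, tightly enclosed
```

Unlike plain iteration, the surviving box is a rigorous enclosure: any fixed point in the initial interval stays inside it at every step, which is the guarantee interval methods trade speed for.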
35. RBF, SOM, Hopfield, and Deep Neural Networks
- Author
- Cosimo Distante and Arcangelo Distante
- Subjects
- Hopfield network, Artificial neural network, Computer science, Deep learning, Convergence (routing), Feature (machine learning), Unsupervised learning, Radial basis function, Artificial intelligence, Convolutional neural network
- Abstract
In this chapter, four different types of neural networks are described: Radial Basis Function (RBF) networks, Self-Organizing Maps (SOM), the Hopfield network, and deep neural networks. RBF networks take a different approach to network design, based on a hidden layer (unique in the network) composed of neurons in which radial basis functions are defined, hence the name, and which performs a nonlinear transformation of the input data supplied to the network. The SOM network, on the other hand, follows an unsupervised learning model and has the originality of autonomously grouping input data on the basis of their similarity, evaluating not a convergence error against external information but the quantization error on the network map. With the Hopfield network, the learning model is supervised, and the network can store information and retrieve it even from partial content of the original information. The network is associated with an energy function to be minimized during its evolution through a succession of states until a final state corresponding to a minimum of the energy function is reached. This feature allows it to be used to set up and solve an optimization problem, by expressing the objective function as an energy function. The chapter concludes with a description of the foundations of the now-fashionable methods based on convolutional neural networks (CNNs), the most widespread since 2012, built on the deep learning architecture.
- Published
- 2020
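The Hopfield behavior this abstract describes, retrieval from partial content by descending an energy function, is the textbook construction and fits in a few lines. A minimal sketch (the sizes and corruption level are arbitrary choices): Hebbian storage of bipolar patterns, then asynchronous updates that never increase E(s) = -1/2 sᵀWs:

```python
import numpy as np

N = 100
rng = np.random.default_rng(2)
patterns = rng.choice([-1, 1], size=(3, N))      # three stored memories
W = sum(np.outer(p, p) for p in patterns) / N    # Hebbian outer-product rule
np.fill_diagonal(W, 0)                           # no self-connections

def energy(s):
    return -0.5 * s @ W @ s

probe = patterns[0].copy()
probe[:30] = 1                                   # corrupt part of the memory
s = probe.copy()
for _ in range(5):                               # sweeps of asynchronous updates
    for i in rng.permutation(N):
        s[i] = 1 if W[i] @ s >= 0 else -1        # each flip lowers the energy

overlap = abs(s @ patterns[0]) / N               # close to 1.0 on successful recall
```

With zero self-connections, every asynchronous flip changes the energy by -2|h_i| ≤ 0, so the dynamics can only descend, which is why the final state is a stored (or spurious) minimum rather than an arbitrary configuration.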
36. Multitask Hopfield Networks
- Author
- Marco Frasca, Giuliano Grossi, and Giorgio Valentini
- Subjects
- Computer science, Node (networking), Multi-task learning, Network dynamics, Machine learning, Hopfield network, Task (computing), Benchmark (computing), Graph (abstract data type), Artificial intelligence, Parametric statistics
- Abstract
Multitask algorithms typically use task similarity information as a bias to speed up and improve the performance of learning processes. Tasks are learned jointly, sharing information across them, in order to construct models more accurate than those learned separately over single tasks. In this contribution, we present the first multitask model, to our knowledge, based on Hopfield Networks (HNs), named HoMTask. We show that by appropriately building a unique HN embedding all tasks, a more robust and effective classification model can be learned. HoMTask is a transductive semi-supervised parametric HN that minimizes an energy function extended to all nodes and all tasks under study. We provide theoretical evidence that the optimal parameters automatically estimated by HoMTask make the model coherent with the prior knowledge (connection weights and node labels). The convergence properties of HNs are preserved, and the fixed point reached by the network dynamics gives rise to the prediction of unlabeled nodes. The proposed model improves the classification abilities of single-task HNs in a preliminary benchmark comparison and achieves competitive performance with state-of-the-art semi-supervised graph-based algorithms.
- Published
- 2020
37. Delayed Hybrid Impulsive Neural Networks
- Author
- Xuemin Shen, Bin Hu, and Zhi-Hong Guan
- Subjects
- Hopfield network, Transmission (telecommunications), Exponential stability, Artificial neural network, Computer science, Stability (learning theory), Uniqueness, Topology, Instability, Interpretation (model theory)
- Abstract
This chapter first introduces continuous-time Hopfield neural networks. The existence and uniqueness of the equilibrium, as well as its stability and instability, are analyzed for continuous-time Hopfield networks, and less conservative yet more general results are established. Then, in light of the continuous-time architecture of Hopfield networks, impulsive Hopfield neural networks with transmission delays are formulated and explained. Many evolutionary processes that exhibit impulsive dynamical behaviors, particularly biological systems, can be described by impulsive Hopfield neural networks. Fundamental issues such as the global exponential stability and the existence and uniqueness of the equilibrium of such impulsive Hopfield networks are established. A numerical example is given to illustrate and interpret the theoretical results.
- Published
- 2019
38. Writing to the Hopfield Memory via Training a Recurrent Network
- Author
-
Yongyi Mao, Jinpeng Huai, Han Bao, and Richong Zhang
- Subjects
Quantitative Biology::Neurons and Cognition, SIMPLE (military communications protocol), Computer science, Supervised learning, Hopfield network, Recurrent neural network, Artificial intelligence & image processing, Noise (video), Artificial intelligence, Protocol (object-oriented programming) - Abstract
We consider the problem of writing on a Hopfield network. We cast it as a supervised learning problem by observing a simple link between the update equations of Hopfield networks and recurrent neural networks. We compare the new writing protocol to existing ones and experimentally verify its effectiveness. Our method not only recovers better from noise, but also has a larger capacity than the other existing writing protocols.
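The link between the Hopfield update s ← sign(Ws) and a one-step recurrent network can be sketched with a perceptron-style training loop that adjusts W until every stored pattern is a fixed point. This is an illustrative stand-in under that general idea, not the paper's actual writing protocol:

```python
import numpy as np

def train_fixed_points(X, epochs=100, lr=0.1):
    """Treat the Hopfield update s <- sign(W s) as a one-step recurrent
    network and train W, row by row, with a perceptron-style rule until
    each pattern in X is a fixed point (illustrative sketch only)."""
    P, N = X.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for x in X:
            h = W @ x
            wrong = np.sign(h) != x                  # units whose update flips
            if wrong.any():
                stable = False
                W[wrong] += lr * np.outer(x[wrong], x)  # push field toward x
        np.fill_diagonal(W, 0.0)                     # no self-connections
        if stable:
            break
    return W

rng = np.random.default_rng(2)
X = rng.choice([-1.0, 1.0], size=(10, 24))           # 10 patterns, 24 neurons
W = train_fixed_points(X)
fixed = all(np.array_equal(np.sign(W @ x), x) for x in X)
```

Because each row of W is an independent perceptron, this iterative "writing" can store well beyond the Hebbian capacity, which is one motivation for training-based writing protocols.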
- Published
- 2019
39. Bistable Sigmoid Networks
- Author
-
Thomas Ihme, Jörn Fischer, and Stanislav Uschakow
- Subjects
Hopfield network, Hebbian theory, Recurrent neural network, Quantitative Biology::Neurons and Cognition, Bistability, Artificial neural network, Computer science, Learning rule, Sigmoid function, Topology, MNIST database - Abstract
It is commonly known that Hopfield Networks suffer from spurious states and from low storage capacity. To eliminate the spurious states, Bistable Gradient Networks (BGN) introduce neurons with bistable behavior. The weights in BGN are calculated in analogy to those of Hopfield Networks, using Hebbian learning. Unfortunately, those networks still suffer from small storage capacity, resulting in high reconstruction errors when used to reconstruct noisy patterns. This paper proposes a new type of neural network consisting of neurons with a sigmoid (hyperbolic tangent) transfer function and a direct feedback. The feedback renders the neuron bistable. Furthermore, instead of using Hebbian learning, which has some drawbacks when applied to overlapping patterns, we use the first-order Contrastive Divergence (CD1) learning rule. We call these networks Bistable Sigmoid Networks (BSN). When recalling patterns from the MNIST database, the reconstruction error is zero even at high load, provided no noise is applied. For an increasing noise level or an increasing number of patterns, the error rises only moderately.
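The bistability induced by direct feedback can be seen in a one-neuron sketch: iterating x ← tanh(g·x) with gain g > 1 yields two symmetric stable fixed points, so the unit latches onto one of two states. The gain value below is an arbitrary illustrative choice:

```python
import numpy as np

def settle(x0, gain=3.0, steps=200):
    """Iterate x <- tanh(gain * x), i.e. a tanh unit with direct
    self-feedback. For gain > 1 the origin is unstable and the map has
    two stable fixed points, one positive and one negative (bistability)."""
    x = x0
    for _ in range(steps):
        x = np.tanh(gain * x)
    return x

up = settle(0.1)     # a small positive nudge settles into the high state
down = settle(-0.1)  # a small negative nudge settles into the low state
```

Any small perturbation away from zero is amplified toward one of the two attractors, which is exactly the latching behavior BSN neurons exploit.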
- Published
- 2019
40. Ensemble of Attractor Networks for 2D Gesture Retrieval
- Author
-
Jorge Luis Pérez-Medina, Carlos Dávila, Mario González, Ángel Sánchez, David Dominguez, and Francisco B. Rodriguez
- Subjects
Artificial neural network, Computer science, Pattern recognition, Computer Science::Human-Computer Interaction, Hopfield network, Gesture recognition, Attractor, Artificial intelligence & image processing, Artificial intelligence, Noise (video), Gesture - Abstract
This work presents an Ensemble of Attractor Neural Networks (EANN) model for gesture retrieval. 2D single-stroke gestures were captured and tested offline by the ensemble. The ensemble was compared to a single attractor with the same complexity, i.e. with equal connectivity. We show that the ensemble of neural networks improves gesture retrieval, in terms of both capacity and retrieval quality, relative to the single network. The ensemble was able to improve the retrieval of correlated patterns with a random assignment of pattern subsets to the ensemble modules; optimizing the assignment of inputs to modules is thus a possible way to further maximize pattern retrieval. The proposed EANN proved to be robust for gesture recognition with large initial noise, and shows promise of robustness to gesture invariances.
- Published
- 2019
41. Multi-modal Associative Storage and Retrieval Using Hopfield Auto-associative Memory Network
- Author
-
Prasun Joshi, Vandana M. Ladwani, V. Ramasubramanian, and Rachna Shriwas
- Subjects
Computer science, Pattern recognition, Content-addressable memory, Autoassociative memory, Hopfield network, Modal, Robustness (computer science), Learning rule, Artificial intelligence & image processing, Artificial intelligence - Abstract
Recently we presented text storage and retrieval in an auto-associative memory framework using the Hopfield neural network. This realized the ideal functionality of the Hopfield network as a content-addressable information-retrieval system. In this paper, we extend this result to multi-modal patterns, namely images with text captions, and show that the Hopfield network can indeed store and retrieve such multi-modal patterns even in an auto-associative setting. Within this framework, we examine two central issues: (i) performance characterization, showing that the O(N) capacity of a Hopfield network of N neurons under the pseudo-inverse learning rule is retained in the multi-modal case, and (ii) the retrieval dynamics of the multi-modal pattern (i.e., image and caption together) under various types of queries, such as image+caption, image only, and caption only, in line with a typical multi-modal retrieval system where the entire multi-modal pattern is expected to be retrieved even from a partial query in any of the modalities. We present results on these two issues on a large database of 7000+ captioned images and establish the practical scalability of both the storage capacity and the retrieval robustness of the Hopfield network for content-addressable retrieval of multi-modal patterns. We point to the potential of this work to extend to a wider definition of multi-modality, as in multimedia content with modalities such as video (image sequences) synchronized with subtitle text, speech, music and non-speech audio.
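The pseudo-inverse (projection) learning rule mentioned above can be sketched as follows; the network size, number of patterns and corruption level are arbitrary illustrative choices, and real multi-modal patterns would simply be longer bipolar vectors concatenating both modalities:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 12                        # neurons and stored bipolar patterns
X = rng.choice([-1.0, 1.0], size=(P, N))

# Pseudo-inverse (projection) rule: W projects states onto the span of the
# stored patterns, so every stored pattern is an exact fixed point
W = X.T @ np.linalg.pinv(X.T)

def recall(W, probe, iters=20):
    """Synchronous sign updates from a (possibly corrupted) probe."""
    s = probe.copy()
    for _ in range(iters):
        s = np.sign(W @ s)
        s[s == 0] = 1.0              # break ties deterministically
    return s

probe = X[0].copy()
probe[:4] *= -1.0                    # corrupt 4 of the 64 bits (partial query)
out = recall(W, probe)
overlap = float(out @ X[0]) / N      # 1.0 means perfect retrieval
```

Retrieval from a partial query works because the corrupted probe still lies closest to the stored pattern's basin of attraction.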
- Published
- 2019
42. Fingerprint Retrieval Using a Specialized Ensemble of Attractor Networks
- Author
-
Francisco B. Rodriguez, Carlos Dávila, David Dominguez, Ángel Sánchez, and Mario González
- Subjects
Artificial neural network, Computer science, Fingerprint (computing), Fingerprint retrieval, Low activity, Pattern recognition, Hopfield network, Attractor, Code (cryptography), High activity, Artificial intelligence & image processing, Artificial intelligence - Abstract
We tested the performance of the Ensemble of Attractor Neural Networks (EANN) model for fingerprint learning and retrieval. The EANN model has been shown to increase the storage capacity for random patterns, compared to a single attractor of equal connectivity. In this work, we tested the EANN with real patterns, i.e. a fingerprints dataset. The EANN improved retrieval performance for real patterns, more than tripling the capacity of the single attractor with the same number of connections. The EANN modules can also be specialized for different pattern sets according to their characteristics, i.e. pattern/network sparseness (activity). Three EANN modules were assigned skeletonized fingerprints (low activity), binarized (original) fingerprints (medium activity), and dilated/thickened fingerprints (high activity), and their retrieval was checked. The sparser the code, the larger the storage capacity of the module. The EANN was shown to improve the retrieval capacity of the single network, and module specialization can be very helpful for different types of real patterns.
- Published
- 2019
43. A New Approximation Algorithm for the d-dimensional Knapsack Problem Based on Hopfield Networks
- Author
-
Jui-Sheng Chang, Jen-Chun Chang, and Hsin-Lung Wu
- Subjects
Hopfield network, Scheme (programming language), Computer science, Knapsack problem, Approximation algorithm, Greedy algorithm, NP-complete, Algorithm - Abstract
In this paper, we study the d-dimensional knapsack problem (d-KP). The problem d-KP is a generalized version of the well-known knapsack problem (1-KP), which is known to be NP-complete. It is also known that there is no fully polynomial-time approximation scheme for d-KP for d > 1 unless P = NP. In this work, we design an approximation algorithm for d-KP based on Hopfield networks. Experimental results show that our proposed algorithm outperforms a well-known greedy algorithm in many cases.
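The kind of greedy baseline such comparisons use can be sketched as a value-per-normalized-weight heuristic: sort items by value divided by their aggregate load across the d constraints, then add while feasible. This is one common greedy variant, not necessarily the paper's exact baseline:

```python
def greedy_dkp(values, weights, capacities):
    """Greedy heuristic for the d-dimensional knapsack problem:
    rank items by value per aggregate normalized weight across the
    d capacity constraints, then add items while all constraints hold."""
    d = len(capacities)

    def score(i):
        load = sum(weights[i][k] / capacities[k] for k in range(d))
        return values[i] / load if load > 0 else float("inf")

    order = sorted(range(len(values)), key=score, reverse=True)
    used = [0.0] * d
    chosen, total = [], 0
    for i in order:
        if all(used[k] + weights[i][k] <= capacities[k] for k in range(d)):
            for k in range(d):
                used[k] += weights[i][k]
            chosen.append(i)
            total += values[i]
    return chosen, total

# Tiny d = 2 instance: each item has a weight in both dimensions
values = [6, 10, 12, 7]
weights = [[1, 3], [2, 2], [3, 1], [2, 2]]
capacities = (5, 5)
chosen, total = greedy_dkp(values, weights, capacities)
```

Greedy is fast but can miss better item combinations, which is the gap an energy-minimizing Hopfield formulation tries to exploit.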
- Published
- 2018
44. Hopfield Associative Memory with Quantized Weights
- Author
-
Mikhail S. Tarkov
- Subjects
Quantitative Biology::Neurons and Cognition, Artificial neural network, Computer Science::Neural and Evolutionary Computation, Binary number, Memristor, Content-addressable memory, Topology, Capacitance, Moduli, Hopfield network, Quantization (physics), Mathematics - Abstract
The use of binary and multilevel memristors in hardware neural-network implementations necessitates quantization of the weight coefficients. In this paper we investigate the influence of weight quantization on the information capacity of the Hopfield network and on its resistance to input-data distortions. It is shown that, for a number of weight levels on the order of tens, the capacity of the quantized-weight Hopfield-Hebb network approaches that of its continuous-weight version. For a Hopfield projection network, a similar result can be achieved only with a number of weight levels on the order of hundreds. Experiments have shown that: (1) binary memristors should be used in Hopfield-Hebb networks reduced by zeroing, in each row, all weights whose moduli are strictly less than the maximum weight in that row; (2) Hopfield projection networks with quantized weights should use multilevel memristors with significantly more than two weight levels, the specific number depending on the dimension of the stored reference vectors, their particular set, and the permissible input-data noise level.
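A generic uniform quantizer makes the trade-off concrete: weights are rounded to a small number of levels and recall is re-checked. This is an illustrative scheme, not necessarily the exact memristor-oriented quantization used in the chapter:

```python
import numpy as np

def quantize(W, levels):
    """Round each weight to the nearest of `levels` uniformly spaced values
    spanning [-max|W|, +max|W|] (a generic uniform quantizer)."""
    wmax = np.max(np.abs(W))
    if wmax == 0.0 or levels < 2:
        return np.zeros_like(W)
    step = 2.0 * wmax / (levels - 1)
    return np.round(W / step) * step

# Hebbian weights for a few bipolar patterns, then recall with quantized weights
rng = np.random.default_rng(1)
X = rng.choice([-1.0, 1.0], size=(3, 32))     # 3 patterns, 32 neurons
W = (X.T @ X) / 32.0                          # Hebbian rule
np.fill_diagonal(W, 0.0)
Wq = quantize(W, levels=16)

s = np.sign(Wq @ X[0])                        # one-step recall of pattern 0
s[s == 0] = 1.0
overlap = float(s @ X[0]) / 32.0              # 1.0 means perfect recall
```

With enough levels the quantized weights track the continuous ones closely and recall survives, matching the chapter's observation that tens of levels suffice for the Hebbian rule.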
- Published
- 2018
45. An Overview of Different Neural Network Architectures
- Author
-
Sandro Skansi
- Subjects
Hopfield network, Artificial neural network, Computer science, Artificial intelligence - Abstract
Energy-based models are a specific class of neural networks. The simplest energy model is the Hopfield Network, dating back to the 1980s (Hopfield Proc Nat Acad Sci USA 79(8):2554–2558, 1982, [1]). Hopfield networks are often thought to be very simple, but they are quite different from what we have seen before.
- Published
- 2018
46. New Starting Point of the Continuous Hopfield Network
- Author
-
Karim El Moutaouakil and Khalid Haddouch
- Subjects
Hopfield network, Mathematical optimization, Operations research, Quadratic equation, Computer science, Artificial intelligence & image processing, Point (geometry), Task (project management) - Abstract
In recent years, the continuous Hopfield network has become a widely used tool for solving quadratic problems (QP). However, it suffers from some drawbacks, notably its sensitivity to the initial state: the starting point affects convergence to the optimal solution, and if a bad starting point is arbitrarily specified, an infeasible solution is generated. In this paper, we examine this issue and provide a new technique for choosing a good starting point in order to reach a good optimal solution for any quadratic problem (QP). Numerical simulations demonstrate the performance of this new technique applied to task assignment problems.
- Published
- 2018
47. Biologically Inspired Augmented Memory Recall Model for Pattern Recognition
- Author
-
Goutam Mylavarapu, Johnson P. Thomas, and K. Ashwin Viswanathan
- Subjects
Hopfield network, Recall, Artificial neural network, Computer science, Cognitive computing, Pattern recognition (psychology), Process (computing), Artificial intelligence, Field (computer science), Domain (software engineering) - Abstract
The concept of modeling a machine that can adapt to dynamic changes in its environment has long fascinated the field of Artificial Intelligence. Machine Learning has made inroads into every possible domain, and new techniques are being developed that can mimic human-like responses and thoughts. Cognitive computing has seen renewed interest in the community with the advent of Artificial Neural Networks (ANN). In this paper, we present a biologically inspired approach to building an augmented memory-recall model that can learn usage access patterns and reconstruct them when presented with noisy or incomplete concepts. We use Hopfield Networks in a distributed parallel architecture such as Hadoop, and also present a mechanism for augmenting the memory capacity of Hopfield nets. Our model is tested on a real-world dataset, parallelizing the learning process to increase the computing power available for pattern recognition.
- Published
- 2018
48. Neural Network Model of Unconscious
- Author
-
Alexandr A. Ezhov
- Subjects
Hopfield network, Class (computer programming), Unconscious mind, Artificial neural network, Computer science, Artificial intelligence - Abstract
We describe a neural network model of unconscious processing as defined by the symmetric logic of Matte Blanco. The model system consists of a hierarchy of ensembles of Hopfield networks, each representing a definite class of objects. The patterns in each network are considered, in some sense, identical representatives of the given class. These networks generate self-reproducible descendants which can exchange patterns with each other and generate self-reproducible networks of a higher level, representing wider classes of objects. We also give some examples of applications of this model.
- Published
- 2018
49. Multi-Layer Solution of Heat Equation
- Author
-
Alexander Vasilyev, Dmitry Tarkhov, and Tatiana V. Lazovskaya
- Subjects
FTCS scheme, Hopfield network, Recurrence relation, Partial differential equation, Artificial neural network, Ordinary differential equation, Computer Science::Neural and Evolutionary Computation, Applied mathematics, Heat equation, Parabolic partial differential equation, Mathematics - Abstract
A new approach to the construction of multilayer neural-network approximate solutions for evolutionary partial differential equations is considered. The approach is based on applying the recurrence relations of the Euler, Runge-Kutta, and similar methods to variable-length intervals. The resulting neural-like structure can be considered a generalization of a feedforward multilayer network or a recurrent Hopfield network. This analogy makes it possible to apply known methods, such as the backpropagation algorithm, to refine the obtained solution. Earlier, a similar approach was successfully used by the authors in the case of ordinary differential equations. Computational experiments are performed on one test problem for the one-dimensional (in the spatial variable) heat equation. Explicit formulas are derived for the dependence of the resulting neural network output on the number of layers. It was found that the error tends to zero with an increasing number of layers, even without network training.
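The idea of reading one explicit time step as one network layer can be sketched with the FTCS (explicit Euler) scheme for the 1-D heat equation u_t = u_xx; each step is an affine map of the previous state, i.e. a layer with fixed weights. Grid sizes below are illustrative:

```python
import numpy as np

def heat_layer(u, r):
    """One explicit Euler (FTCS) step for u_t = u_xx on a uniform grid:
    u_i <- u_i + r * (u_{i-1} - 2 u_i + u_{i+1}), with r = dt / dx^2.
    Each step is a linear map of the grid values, so stacking steps
    reads as a feedforward network whose weights are fixed by the scheme."""
    v = u.copy()                      # Dirichlet endpoints held fixed
    v[1:-1] = u[1:-1] + r * (u[:-2] - 2.0 * u[1:-1] + u[2:])
    return v

# Decay of a sine mode: exact solution u(x, t) = exp(-pi^2 t) * sin(pi x)
nx, r, steps = 51, 0.25, 200          # r <= 0.5 keeps the scheme stable
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = r * dx * dx
u = np.sin(np.pi * x)
for _ in range(steps):                # "layers" stacked in time
    u = heat_layer(u, r)
exact = np.exp(-np.pi**2 * steps * dt) * np.sin(np.pi * x)
err = np.max(np.abs(u - exact))
```

Stacking more layers (with smaller steps) drives the error down, mirroring the abstract's observation that accuracy improves with depth even before any training.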
- Published
- 2017
50. Intelligent Decision System for Stock Exchange Data Processing and Presentation
- Author
-
Lidia Jackowska-Strumiłło and Michał Paluch
- Subjects
Hopfield network, Data processing, Artificial neural network, Stock exchange, Order (exchange), Computer science, Technical analysis, Information system, Data mining, Expert system - Abstract
The paper describes the architecture of an expert information system designed for stock exchange data processing, analysis and presentation. The system uses Artificial Neural Networks (ANN) combined with technical analysis and fractal analysis to predict the future prices of stock exchange assets. It also enables the selection of companies whose assets are expected to increase; this selection is based on comparison algorithms implemented with a Hopfield network. The system displays a list of the companies with the highest expected profit, sorted in descending order.
- Published
- 2017