190 results for "Information transfer"
Search Results
2. Information Transfer in Financial Markets
- Author
-
Bossomaier, Terry, Barnett, Lionel, Harré, Michael, and Lizier, Joseph T.
- Published
- 2016
- Full Text
- View/download PDF
3. Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
- Author
-
Lizier, Joseph T., Wibral, Michael, editor, Vicente, Raul, editor, and Lizier, Joseph T., editor
- Published
- 2014
- Full Text
- View/download PDF
4. A Framework for the Local Information Dynamics of Distributed Computation in Complex Systems
- Author
-
Lizier, Joseph T., Prokopenko, Mikhail, Zomaya, Albert Y., and Prokopenko, Mikhail, editor
- Published
- 2014
- Full Text
- View/download PDF
5. Towards Quantifying Interaction Networks in a Football Match
- Author
-
Cliff, Oliver M., Lizier, Joseph T., Wang, X. Rosalind, Wang, Peter, Obst, Oliver, Prokopenko, Mikhail, Behnke, Sven, editor, Veloso, Manuela, editor, Visser, Arnoud, editor, and Xiong, Rong, editor
- Published
- 2014
- Full Text
- View/download PDF
6. Information Dynamics in the Interaction between a Prey and a Predator Fish
- Author
-
Feng Hu, Li-Juan Nie, and Shi-Jian Fu
- Subjects
information transfer ,predation dynamics ,transfer entropy ,vigilance zone ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Accessing information efficiently is vital for animals to make optimal decisions, and it is particularly important when they are facing predators. Yet until now, very few quantitative conclusions have been drawn about the information dynamics in the interaction between animals, due to the lack of appropriate theoretical measures. Here, we employ transfer entropy (TE), an information-theoretic and model-free measure, to explore the information dynamics in the interaction between a predator and a prey fish. We conduct experiments in which a predator and a prey fish are confined in separate parts of an arena but can communicate with each other visually and tactilely. TE is calculated on coarse-grained states of the pair's trajectories. We find that the prey's TE is generally significantly greater than the predator's during trials, which indicates that the dominant information is transmitted from predator to prey. We then demonstrate that the direction of information flow is insensitive to the parameters used in the coarse-graining procedure. We further calculate the prey's TE at different distances between it and the predator. The resulting curve shows a high plateau in the mid-range of distances that drops quickly at both the near and the far ends. This result indicates that there is a sensitive spatial zone in which the prey is highly vigilant of the predator's position.
- Published
- 2015
- Full Text
- View/download PDF
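Several abstracts in these results (the predator–prey study above, for instance) compute transfer entropy on coarse-grained, symbolic states. As an editorial illustration only, here is a minimal plug-in TE estimator for symbol sequences with history length 1; this is a sketch under stated assumptions, not any of these papers' actual pipelines, and the toy data are invented.

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target, base=2):
    """Plug-in estimate of TE(source -> target) for symbolic sequences,
    with history length 1: I(target_next ; source_past | target_past)."""
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    pairs = Counter(zip(target[1:], target[:-1]))        # (x_{t+1}, x_t)
    past_pairs = Counter(zip(target[:-1], source[:-1]))  # (x_t, y_t)
    past = Counter(target[:-1])                          # x_t
    n = len(target) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        # ratio = p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t)
        ratio = (c / past_pairs[(x0, y0)]) / (pairs[(x1, x0)] / past[x0])
        te += p_joint * math.log(ratio, base)
    return te

# Toy check: the target copies the source with a one-step lag, so
# information should flow almost entirely from source to target.
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                       # x_{t+1} = y_t
print(transfer_entropy(y, x))          # close to 1 bit
print(transfer_entropy(x, y))          # close to 0 bits
```

In practice, longer histories and bias corrections (such as the effective transfer entropy used in several entries above) matter; the plug-in estimate is biased upward for small samples.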
7. Information Transfer between Stock Market Sectors: A Comparison between the USA and China
- Author
-
Peng Yue, Yaodong Fan, Jonathan A. Batten, and Wei-Xing Zhou
- Subjects
information transfer ,transfer entropy ,stock markets ,econophysics ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Information diffusion within financial markets plays a crucial role in the process of price formation and the propagation of sentiment and risk. We perform a comparative analysis of information transfer between industry sectors of the Chinese and the USA stock markets, using daily sector indices for the period from 2000 to 2017. The information flow from one sector to another is measured by the transfer entropy of the daily returns of the two sector indices. We find that the most active sector in information exchange (i.e., the one with the largest total information inflow and outflow) is the non-bank financial sector in the Chinese market and the technology sector in the USA market. This is consistent with the role of the non-bank sector in corporate financing in China and the impact of technological innovation in the USA. In each market, the most active sector is also the largest information sink, with the largest net information inflow (i.e., inflow minus outflow). In contrast, the main information source is the bank sector in the Chinese market and the energy sector in the USA market. In the case of China, this is due to the importance of net bank lending as a signal of corporate activity; in the USA, it reflects the role of energy pricing in corporate profitability. Some sectors, such as real estate, can be an information sink in one market but an information source in the other, illustrating the complex behavior of different markets. Overall, these findings show that stock markets are more synchronized, or ordered, during periods of turmoil than during periods of stability.
- Published
- 2020
- Full Text
- View/download PDF
8. Information Dynamics in Networks and Phase Transitions
- Author
-
Lizier, Joseph T.
- Published
- 2013
- Full Text
- View/download PDF
9. Conclusion
- Author
-
Lizier, Joseph T.
- Published
- 2013
- Full Text
- View/download PDF
10. Information Transfer
- Author
-
Lizier, Joseph T.
- Published
- 2013
- Full Text
- View/download PDF
11. Information Transfer in Biological and Bio-Inspired Systems
- Author
-
Lizier, Joseph T.
- Published
- 2013
- Full Text
- View/download PDF
12. Causality Analysis for COVID-19 among Countries Using Effective Transfer Entropy
- Author
-
Ünal, Baki
- Subjects
COVID-19 ,causality analysis ,causality network ,transfer entropy ,network analysis ,Connectivity ,Granger Causality ,Physics ,Electrical Engineering, Electronics & Computer Science - Security Systems - Blockchain ,Information Transfer ,General Physics and Astronomy - Abstract
In this study, causalities of COVID-19 across a group of seventy countries are analyzed with effective transfer entropy. To reveal the causalities, a weighted directed network is constructed, in which the weight of each link gives the strength of the causality, obtained by calculating effective transfer entropies. Transfer entropy has advantages over other causality evaluation methods: first, it can quantify the strength of the causality, and second, it can detect nonlinear causal relationships. After construction, the causality network is analyzed with well-known network analysis methods such as eigenvector centrality, PageRank, and community detection. The eigenvector centrality and PageRank metrics reveal the importance and centrality of each node country in the network. In community detection, node countries are divided into groups such that countries within each group are much more densely connected to each other than to the rest of the network.
- Published
- 2022
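Entry 12 feeds the effective-transfer-entropy weights into standard network metrics such as eigenvector centrality and PageRank. As a hedged sketch (the three-node weight matrix below is invented, not the paper's seventy-country network), PageRank can be computed by power iteration:

```python
def pagerank(weights, damping=0.85, iters=100):
    """Power-iteration PageRank on a weighted directed graph.
    weights[i][j] is the weight of the link i -> j (e.g. effective TE)."""
    n = len(weights)
    out = [sum(row) for row in weights]      # total outgoing weight per node
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n
        for i in range(n):
            if out[i] == 0:                  # dangling node: spread evenly
                for j in range(n):
                    new[j] += damping * rank[i] / n
            else:                            # distribute rank along weights
                for j in range(n):
                    new[j] += damping * rank[i] * weights[i][j] / out[i]
        rank = new
    return rank

# Toy causality network: node 2 receives strong inflows from nodes 0 and 1.
W = [[0.0, 0.1, 0.8],
     [0.2, 0.0, 0.7],
     [0.1, 0.1, 0.0]]
ranks = pagerank(W)
print(max(range(3), key=lambda i: ranks[i]))   # node 2 ranks highest
```

Here `weights[i][j]` is read as "i influences j", and damping 0.85 is the conventional choice.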
13. Complexity in Economic and Social Systems.
- Author
-
Drożdż, Stanisław, Kwapień, Jarosław, and Oświęcimka, Paweł
- Subjects
Information technology industries ,BDS ,Baidu Index ,EMD ,Ethiopia ,Euler characteristic ,GARCH model ,IPO timing ,Kondratieff waves ,Lyapunov ,Nash equilibrium ,Polish Green Island effect ,Red Queen effect ,Shannon-entropy ,Tsallis entropy ,Zipf law ,agent-based computational economics ,agent-based modelling ,bargaining ,central-banking ,chaos ,cluster-entropy ,complex adaptive systems ,complex network ,complex networks ,complex systems ,complexity economics ,complexity in stock market ,complexity of IPOs ,complexity science ,conjunctural movements ,copula functions ,correlation coefficient ,correlation dimension ,correspondence analysis ,cross-shareholding network ,cryptocurrencies ,cybernetics ,detrended cross-correlations ,development ,discrete-time models ,dual graph ,dynamic game model ,dynamical complexity ,dynamics ,economic complexity ,econophysics ,edge of chaos ,entropic susceptibilities ,entropies ,entropy economics ,entropy weight TOPSIS ,evolutionarily stable strategies ,evolutionary dynamics ,evolutionary information search dynamics ,extreme returns ,fake news ,feedback loops ,finance ,financial institution ,financial markets ,forecasting market risk ,four-colour theorem ,gain function ,gender productivity gap ,general system theory ,generalized Pareto distribution ,generalized autoregressive conditional heteroscedasticity model (GARCH) ,homo oeconomicus ,inequality ,information demand ,information theory ,information transfer ,innovative activity ,irreversible processes ,jump volatility ,land acquisition ,leveraged trading ,liquidity benchmark ,liquidity proxy ,location quotient ,macroeconomics ,macroprudential policy ,manufacturing industry ,measure of economic development ,minimal spanning tree ,mixture of distribution hypothesis ,motivation ,multifractal analysis ,multivariate transfer entropy ,municipality ,mutual information ,network theory ,non-ergodic ill-behaved inverse problems ,non-extensive cross-entropy econometrics ,non-linear dynamics ,nonlinear dynamics ,partial determination ,peaks over threshold ,platforms for participation ,power law ,pricing constraint ,prosumption ,public administration sector ,real estate ,real option ,recurrence plots ,rumor spreading ,self-exciting point process ,speculation ,stock exchange market ,stock market ,stock markets ,stock price crash risk ,structural entropy ,systemic risk ,threshold effect ,time series ,time series analysis ,transfer entropy ,universal complexity measure ,value at risk ,volatility clustering ,volatility estimate ,wealth condensation ,websites - Abstract
Summary: There is no term that better describes the essential features of human society than complexity. On various levels, from the decision-making processes of individuals, through the interactions between individuals leading to the spontaneous formation of groups and social hierarchies, up to the collective herding processes that reshape whole societies, all these features share the property of irreducibility, i.e., they require a holistic, multi-level approach formed by researchers from different disciplines. This Special Issue aims to collect research studies that, by exploiting the latest advances in physics, economics, complex networks, and data science, take a step towards understanding these economic and social systems. The majority of submissions are devoted to financial market analysis and modeling, including the stock and cryptocurrency markets in the COVID-19 pandemic, systemic risk quantification and control, wealth condensation, the innovation-related performance of companies, and more. Looking more at societies, there are papers that deal with regional development, land speculation, and fake-news-fighting strategies, issues that are of central interest in contemporary society. On top of this, one of the contributions proposes a new, improved complexity measure.
14. Transfer Information Energy: A Quantitative Indicator of Information Transfer between Time Series.
- Author
-
Caţaron, Angel and Andonie, Răzvan
- Subjects
*KNOWLEDGE transfer , *ENTROPY (Information theory) , *TIME series analysis , *INTERNET of things , *INFORMATION theory - Abstract
We introduce an information-theoretical approach for analyzing information transfer between time series. Rather than using the Transfer Entropy (TE), we define and apply the Transfer Information Energy (TIE), which is based on Onicescu's Information Energy. Whereas the TE can be used as a measure of the reduction in uncertainty about one time series given another, the TIE may be viewed as a measure of the increase in certainty about one time series given another. We compare the TIE and the TE in two known time series prediction applications. First, we analyze stock market indexes from the Americas, Asia/Pacific and Europe, with the goal of inferring the information transfer between them (i.e., how they influence each other). In the second application, we take a bivariate time series of the breath rate and instantaneous heart rate of a sleeping human suffering from sleep apnea, with the goal of determining the information transfer heart → breath vs. breath → heart. In both applications, the computed TE and TIE values are strongly correlated, meaning that the TIE can substitute for the TE in such applications, even though they measure symmetric phenomena. The advantage of using the TIE is computational: we can obtain similar results, but faster.
- Published
- 2018
- Full Text
- View/download PDF
15. Nonlinearity matters: The stock price – trading volume relation revisited
- Author
-
Simon Behrendt and Alexander Schmidt
- Subjects
Economics and Econometrics ,Information transfer ,050208 finance ,Relation (database) ,05 social sciences ,Linear model ,Theoretical models ,Sample (statistics) ,Bivariate analysis ,Stock price ,Nonlinear system ,Granger causality ,0502 economics and business ,Econometrics ,Economics ,Transfer entropy ,050207 economics ,Empirical evidence ,Stock (geology) - Abstract
The purpose of this paper is to investigate the information transfer in the relation between stock prices and trading volume. While several theoretical models establish this relation, determining its direction remains an empirical question. Conventional linear approaches, such as Granger causality, provide only limited insights. Importantly, they do not take into account the nonlinear nature of this relation, which is advocated by theoretical models of noninformational trading. Moreover, they cannot deduce the dominant direction of the information transfer. Both shortcomings can be addressed by relying upon the concept of Shannon transfer entropy. In an empirical application to a large sample of stocks, we employ this model-free measure and find (i) a substantial amount of nonlinear information transfer across stocks, and (ii) that this information predominantly flows from returns to trading volume growth. Thus, we present empirical evidence that the relation between these financial variables is in fact likely to be nonlinear.
- Published
- 2021
16. Calculating Transfer Entropy from Variance–Covariance Matrices Provides Insight into Allosteric Communication in ERK2
- Author
-
Luisa Garcia Michel, Benjamin C. Ahlbrecht, Daniel Barr, and Clara Keirns
- Subjects
Mitogen-Activated Protein Kinase 1 ,Information transfer ,Protein Conformation ,Computer science ,Entropy ,Allosteric regulation ,Context (language use) ,Molecular Dynamics Simulation ,Covariance ,Computer Science Applications ,Molecular dynamics ,Matrix (mathematics) ,Allosteric Regulation ,Amino Acid Substitution ,Joint probability distribution ,Cascade ,Graph (abstract data type) ,Transfer entropy ,Sensitivity (control systems) ,Physical and Theoretical Chemistry ,Biological system - Abstract
Transfer entropy methods provide an approach to understanding asymmetric information flow in coupled systems, with particular application to understanding allosteric interactions in biomolecular systems. Transfer entropy analysis holds the potential to reveal pathways or networks of residues that are coupled in their information flow and thus give new insights into folding and binding dynamics. Most current methods for calculating transfer entropy require very long simulations and almost equally long calculations of joint probability histograms to compute the information transfer, making these methods either functionally intractable or statistically unreliable. Available approximate methods based on graph and network theory approaches are rapid but lose sensitivity to the chemical nature of the biomolecules and thus are not applicable in mutation studies. We show that reliable estimates of the transfer entropy can be obtained from the variance-covariance matrix of atomic fluctuations, which converges quickly and retains sensitivity to the full chemical profile of the biomolecular system. We validate our method on ERK2, a well-studied kinase involved in the MAPK signaling cascade for which considerable computational, experimental, and mutation data are available. We present the results of transfer entropy analysis on data obtained from molecular dynamics simulations of wild-type active and inactive ERK2, along with mutants Q103A, I84A, L73P, and G83A. We show that our method is consistent with the results of computational and experimental studies on ERK2, and we provide a method for interpreting networks of interconnected residues in the protein from the perspective of allosteric coupling. We introduce new insights about possible allosteric activity of the extreme N-terminal region of the kinase, which to date has been under-explored in the literature and may provide an important new direction for kinase studies. We also describe evidence that suggests activation may occur by different paths or routes in different mutants. Our results highlight systematic advantages and disadvantages of each method for calculating transfer entropy and show the important role of transfer entropy analysis for understanding allosteric behavior in biomolecular systems.
- Published
- 2021
17. Financial time series analysis based on effective phase transfer entropy.
- Author
-
Yang, Pengbo, Shang, Pengjian, and Lin, Aijing
- Subjects
*TIME series analysis , *PHASE-transfer catalysis , *DYNAMICAL systems , *KNOWLEDGE transfer , *MATHEMATICAL variables , *BUSINESS cycles - Abstract
Transfer entropy is a powerful technique that can quantify the impact of one dynamic system on another. In this paper, we propose the effective phase transfer entropy method, based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between systems. We also explore the relationship between effective phase transfer entropy and variables such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is able to estimate the information flow between systems more accurately than phase transfer entropy. To reflect the application of this method in practice, we apply it to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool for estimating the information flow between systems.
- Published
- 2017
- Full Text
- View/download PDF
18. Transfer Entropy for Coupled Autoregressive Processes
- Author
-
Shawn D. Pethel and Daniel W. Hahs
- Subjects
transfer entropy ,autoregressive process ,Gaussian process ,information transfer ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
A method is shown for computing transfer entropy over multiple time lags for coupled autoregressive processes, using formulas for the differential entropy of multivariate Gaussian processes. Two examples are provided: (1) a first-order filtered noise process whose state is measured with additive noise, and (2) two first-order coupled processes, each of which is driven by white process noise. For the first example, we found that increasing the first-order AR coefficient, while keeping the correlation coefficient between the filtered and measured processes fixed, increased the transfer entropy, since the entropy of the measured process was itself increased. For the second example, the minimum correlation coefficient occurs when the process noise variances match, and matching these variances results in minimum information flow, expressed as the sum of the transfer entropies in both directions. Without a match, the transfer entropy is larger in the direction away from the process with the larger process noise. With the process noise variances fixed, the transfer entropies in both directions increase with the coupling strength. Finally, we note that the method can be employed more generally to compute other information-theoretic quantities as well.
- Published
- 2013
- Full Text
- View/download PDF
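Entries 16 and 18 both rely on the fact that for jointly Gaussian variables the differential entropies, and hence the transfer entropy, reduce to determinants of covariance matrices. A pure-Python sketch of that route for the scalar, lag-1 linear case follows; the coupled AR(1) pair is invented toy data and the papers treat richer settings.

```python
import math
import random

def cov_matrix(cols):
    """Sample covariance matrix of equal-length columns (lists of floats)."""
    n = len(cols[0])
    means = [sum(c) / n for c in cols]
    return [[sum((ai - mi) * (bi - mj) for ai, bi in zip(a, b)) / (n - 1)
             for b, mj in zip(cols, means)]
            for a, mi in zip(cols, means)]

def det(m):
    """Determinant for 1x1, 2x2 or 3x3 matrices (all we need here)."""
    if len(m) == 1:
        return m[0][0]
    if len(m) == 2:
        return m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def gaussian_te(source, target):
    """TE(source -> target) in nats for jointly Gaussian series, lag 1:
    0.5 * ln( var(X1|X0) / var(X1|X0,Y0) ), via Schur complements."""
    x1, x0, y0 = target[1:], target[:-1], source[:-1]
    v_given_x0 = det(cov_matrix([x1, x0])) / det(cov_matrix([x0]))
    v_given_x0y0 = det(cov_matrix([x1, x0, y0])) / det(cov_matrix([x0, y0]))
    return 0.5 * math.log(v_given_x0 / v_given_x0y0)

# Coupled AR(1) pair: y drives x, but not vice versa.
random.seed(1)
x, y = [0.0], [0.0]
for _ in range(20000):
    x_new = 0.5 * x[-1] + 0.4 * y[-1] + random.gauss(0, 1)
    y_new = 0.5 * y[-1] + random.gauss(0, 1)
    x.append(x_new)
    y.append(y_new)
print(gaussian_te(y, x))   # clearly positive: y drives x
print(gaussian_te(x, y))   # near zero: x does not drive y
```

With real data one would also correct for estimation bias, for example by comparing against shuffled surrogates.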
19. On Thermodynamic Interpretation of Transfer Entropy
- Author
-
Don C. Price, Joseph T. Lizier, and Mikhail Prokopenko
- Subjects
transfer entropy ,information transfer ,entropy production ,irreversibility ,Kullback–Leibler divergence ,thermodynamic equilibrium ,Boltzmann’s principle ,causal effect ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
We propose a thermodynamic interpretation of transfer entropy near equilibrium, using a specialised Boltzmann’s principle. The approach relates conditional probabilities to the probabilities of the corresponding state transitions. This in turn characterises transfer entropy as a difference of two entropy rates: the rate for a resultant transition and another rate for a possibly irreversible transition within the system affected by an additional source. We then show that this difference, the local transfer entropy, is proportional to the external entropy production, possibly due to irreversibility. Near equilibrium, transfer entropy is also interpreted as the difference in equilibrium stabilities with respect to two scenarios: a default case and the case with an additional source. Finally, we demonstrate that such a thermodynamic treatment is not applicable to information flow, a measure of causal effect.
- Published
- 2013
- Full Text
- View/download PDF
20. Moving Frames of Reference, Relativity and Invariance in Transfer Entropy and Information Dynamics
- Author
-
Joseph T. Lizier and John R. Mahoney
- Subjects
information theory ,information transfer ,information storage ,transfer entropy ,information dynamics ,cellular automata ,complex systems ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
We present a new interpretation of a local framework for information dynamics, including the transfer entropy, by defining a moving frame of reference for the observer of dynamics in lattice systems. This formulation is inspired by the idea of investigating “relativistic” effects on observing the dynamics of information - in particular, we investigate a Galilean transformation of the lattice system data. In applying this interpretation to elementary cellular automata, we demonstrate that using a moving frame of reference certainly alters the observed spatiotemporal measurements of information dynamics, yet still returns meaningful results in this context. We find that, as expected, an observer will report coherent spatiotemporal structures that are moving in their frame as information transfer, and structures that are stationary in their frame as information storage. Crucially, the extent to which the shifted frame of reference alters the results depends on whether the shift of frame retains, adds or removes relevant information regarding the source-destination interaction.
- Published
- 2013
- Full Text
- View/download PDF
21. Quantifying the Information Flow between Ghana Stock Market Index and Its Constituents Using Transfer Entropy
- Author
-
Anokye M. Adam and Prince Mensah Osei
- Subjects
Information transfer ,Article Subject ,business.industry ,General Mathematics ,Fossil fuel ,General Engineering ,Unidirectional flow ,Engineering (General). Civil engineering (General) ,01 natural sciences ,Stock market index ,010305 fluids & plasmas ,0103 physical sciences ,QA1-939 ,Econometrics ,Transfer entropy ,Stock market ,TA1-2040 ,010306 general physics ,business ,Mathematics ,Information exchange ,Stock (geology) - Abstract
We quantify the strength and directionality of information transfer between the Ghana stock market index and its component stocks, and likewise among the individual stocks on the market, using transfer entropy. The information flow between the market index and its components and among individual stocks is measured by the effective transfer entropy of daily logarithmic returns generated from the daily market index and the stock prices of 32 stocks from 2nd January 2009 to 16th February 2018. We find both bidirectional and unidirectional flows of information between the GSE index and its component stocks, with the stocks dominating the information exchange. Among the individual stocks, SCB is the most active stock in the information exchange, receiving the highest amount of information; the most informative source is EGL (an insurance company), which has the highest net information outflow, while the largest information sink is PBC, which has the highest net information inflow. We further categorize the stocks into 9 stock market sectors and find the insurance sector to be the largest source of information, which confirms our earlier findings. Surprisingly, the oil and gas sector is the information sink. Our results confirm that other sectors, including oil and gas, mitigate their risk exposures through insurance companies and are always expectant of information originating from the insurance sector in relation to regulatory compliance issues. It is our firm conviction that this study will allow stakeholders of the market to make informed buy, sell, or hold decisions.
- Published
- 2020
22. Anatomy of information transfer for Ibis using transfer entropy and active information storage
- Author
-
YanMing Fan, Lin Chen, and Haibin Duan
- Subjects
Ibis ,Collective behavior ,Information transfer ,biology ,Computer Networks and Communications ,Information storage ,Computer science ,Mutual information ,biology.organism_classification ,computer.software_genre ,Information theory ,Control and Systems Engineering ,Entropy (information theory) ,Transfer entropy ,Data mining ,computer - Abstract
As birds migrate in a linear formation, the frequent exchange of leader and follower roles among Ibises can extend the formation flight time. In this paper, we utilize information theory to analyze the underlying mechanism of coordination and cooperation within the Ibises. To this end, we examine and quantify the dynamic characteristics of information transfer in a group of Ibises at the individual level. First, the local transfer entropy is employed to quantify the local dynamic information transfer during the migration of the Ibises, providing a lucid explanation of how an individual Ibis influences the group decision through information transfer to generate consistent collective behavior. The local active information storage is used to represent the memory of the Ibis, and the relationship between the consistency of the collective behavior and the memory of the Ibis is obtained by calculating the mutual information between the historical state sequences and the future state of the Ibis. Finally, based on the dynamic information transfer and the relationship between group memory and consistent collective behavior, we capture the relationship between the excitation and convergence of the Ibises' collective behavior and obtain the variance of entropy.
- Published
- 2020
23. On Data-Driven Computation of Information Transfer for Causal Inference in Discrete-Time Dynamical Systems
- Author
-
Subhrajit Sinha and Umesh Vaidya
- Subjects
Information transfer ,Theoretical computer science ,Computer science ,Applied Mathematics ,General Engineering ,Operator theory ,Dynamical system ,01 natural sciences ,010305 fluids & plasmas ,System dynamics ,Data-driven ,010101 applied mathematics ,Transfer operator ,Modeling and Simulation ,Causal inference ,0103 physical sciences ,Transfer entropy ,0101 mathematics - Abstract
In this paper, we provide a novel approach to capture causal interaction in a dynamical system from time series data. In Sinha and Vaidya (in: IEEE conference on decision and control, pp 7329–7334, 2016), we have shown that the existing measures of information transfer, namely directed information, Granger causality and transfer entropy, fail to capture the causal interaction in a dynamical system, and we proposed a new definition of information transfer that captures direct causal interactions. The novelty of the information transfer definition used in this paper is that it can differentiate between direct and indirect influences (Sinha and Vaidya 2016). The main contribution of this paper is to show that the definitions of information transfer proposed in Sinha and Vaidya (2016) and Sinha and Vaidya (in: Indian control conference, pp 303–308, 2017) can be computed from time series data, and thus the direct influences in a dynamical system can be identified from time series data. We use a transfer operator theoretic framework, involving the Perron–Frobenius and Koopman operators, for the data-driven approximation of the system dynamics and computation of information transfer. Several examples, involving linear and nonlinear system dynamics, are presented to verify the efficiency of the developed algorithm.
- Published
- 2020
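The paper's Perron–Frobenius/Koopman machinery is substantially more involved, but the simplest data-driven approximation of a transfer operator, an Ulam-type count-based transition matrix on a finite partition, illustrates the general idea. This is a toy sketch, not the authors' algorithm:

```python
from collections import Counter

def transfer_operator(symbols, n_states):
    """Count-based (Ulam-type) estimate of the Perron-Frobenius transfer
    operator on a finite partition: P[i][j] ~ Pr(next state j | state i)."""
    counts = Counter(zip(symbols[:-1], symbols[1:]))
    P = [[0.0] * n_states for _ in range(n_states)]
    for (i, j), c in counts.items():
        P[i][j] += c
    for row in P:                # normalize each row to a distribution
        total = sum(row)
        if total:
            for j in range(n_states):
                row[j] /= total
    return P

# A short symbolic trajectory that mostly alternates between states 0 and 1.
seq = [0, 1, 0, 1, 1, 0, 1, 0, 0, 1]
P = transfer_operator(seq, 2)
print(P[0], P[1])   # rows are conditional next-state distributions
```

Spectral properties of such a matrix (its leading eigenvectors) then approximate invariant densities and coherent structures of the underlying dynamics.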
24. Transfer Information Energy: A Quantitative Indicator of Information Transfer between Time Series
- Author
-
Angel Caţaron and Răzvan Andonie
- Subjects
Transfer Entropy ,time series prediction ,information transfer ,information energy ,IoT data analysis ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
We introduce an information-theoretical approach for analyzing information transfer between time series. Rather than using the Transfer Entropy (TE), we define and apply the Transfer Information Energy (TIE), which is based on Onicescu’s Information Energy. Whereas the TE can be used as a measure of the reduction in uncertainty about one time series given another, the TIE may be viewed as a measure of the increase in certainty about one time series given another. We compare the TIE and the TE in two known time series prediction applications. First, we analyze stock market indexes from the Americas, Asia/Pacific and Europe, with the goal of inferring the information transfer between them (i.e., how they influence each other). In the second application, we take a bivariate time series of the breath rate and instantaneous heart rate of a sleeping human suffering from sleep apnea, with the goal of determining the information transfer heart → breath vs. breath → heart. In both applications, the computed TE and TIE values are strongly correlated, meaning that the TIE can substitute for the TE in such applications, even though they measure symmetric phenomena. The advantage of using the TIE is computational: we can obtain similar results, but faster.
- Published
- 2018
- Full Text
- View/download PDF
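The TE/TIE contrast described in the abstract above can be sketched with a simple plug-in estimator for discrete series. This is a minimal sketch with history length 1; `te_and_tie` is a hypothetical helper, and the TIE formula here is one plausible reading of the definition (certainty measured by Onicescu's information energy, the sum of squared probabilities), not the paper's exact estimator:

```python
from collections import Counter, defaultdict
from math import log2

def te_and_tie(x, y):
    """Plug-in estimates of TE(X -> Y) and a TIE analogue, history length 1.

    TE  = H(Y_next | Y_past) - H(Y_next | Y_past, X_past)   (uncertainty reduced)
    TIE = E(Y_next | Y_past, X_past) - E(Y_next | Y_past)   (certainty gained),
    where E is Onicescu's information energy: the sum of squared probabilities.
    """
    n = len(x) - 1
    cond_full = defaultdict(Counter)   # (y_past, x_past) -> counts of y_next
    cond_hist = defaultdict(Counter)   # y_past -> counts of y_next
    for y_next, y_past, x_past in zip(y[1:], y[:-1], x[:-1]):
        cond_full[(y_past, x_past)][y_next] += 1
        cond_hist[y_past][y_next] += 1

    def cond_entropy(cond):
        # H(Y_next | context) = -sum p(y_next, ctx) * log2 p(y_next | ctx)
        h = 0.0
        for counts in cond.values():
            total = sum(counts.values())
            for c in counts.values():
                h -= (c / n) * log2(c / total)
        return h

    def cond_energy(cond):
        # E(Y_next | context) = sum p(ctx) * sum_y p(y_next | ctx)^2
        e = 0.0
        for counts in cond.values():
            total = sum(counts.values())
            e += (total / n) * sum((c / total) ** 2 for c in counts.values())
        return e

    te = cond_entropy(cond_hist) - cond_entropy(cond_full)
    tie = cond_energy(cond_full) - cond_energy(cond_hist)
    return te, tie
```

On a series that simply copies its source with one step of lag, both quantities come out clearly positive, and both stay near zero for an unrelated source, which is the strong correlation between TE and TIE that the abstract reports.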
25. Learning in Convolutional Neural Networks Accelerated by Transfer Entropy
- Author
-
Adrian Moldovan, Razvan Andonie, and Angel Caţaron
- Subjects
Information transfer ,causality ,business.industry ,Computer science ,Deep learning ,Science ,Physics ,QC1-999 ,Feed forward ,Stability (learning theory) ,Process (computing) ,transfer entropy ,General Physics and Astronomy ,deep learning ,Convolutional Neural Network ,Astrophysics ,Convolutional neural network ,Article ,QB460-466 ,Transfer entropy ,Artificial intelligence ,business ,Algorithm ,Smoothing - Abstract
Recently, there is a growing interest in applying Transfer Entropy (TE) in quantifying the effective connectivity between artificial neurons. In a feedforward network, the TE can be used to quantify the relationships between neuron output pairs located in different layers. Our focus is on how to include the TE in the learning mechanisms of a Convolutional Neural Network (CNN) architecture. We introduce a novel training mechanism for CNN architectures which integrates the TE feedback connections. Adding the TE feedback parameter accelerates the training process, as fewer epochs are needed. On the flip side, it adds computational overhead to each epoch. According to our experiments on CNN classifiers, to achieve a reasonable computational overhead–accuracy trade-off, it is efficient to consider only the inter-neural information transfer of the neuron pairs between the last two fully connected layers. The TE acts as a smoothing factor, generating stability and becoming active only periodically, not after processing each input sample. Therefore, we can consider the TE in our model to be a slowly changing meta-parameter.
- Published
- 2021
26. Rich-Club Organization in Effective Connectivity among Cortical Neurons.
- Author
-
Nigam, Sunny, Shimono, Masanori, Ito, Shinya, Fang-Chin Yeh, Timme, Nicholas, Myroshnychenko, Maxym, Lapish, Christopher C., Tosi, Zachary, Hottowy, Pawel, Smith, Wesley C., Masmanidis, Sotiris C., Litke, Alan M., Sporns, Olaf, and Beggs, John M.
- Subjects
- *
NEURAL physiology , *PATCH-clamp techniques (Electrophysiology) , *SOMATOSENSORY cortex , *KNOWLEDGE transfer , *NEURAL circuitry - Abstract
The performance of complex networks, like the brain, depends on how effectively their elements communicate. Despite the importance of communication, it is virtually unknown how information is transferred in local cortical networks, consisting of hundreds of closely spaced neurons. To address this, it is important to record simultaneously from hundreds of neurons at a spacing that matches typical axonal connection distances, and at a temporal resolution that matches synaptic delays. We used a 512-electrode array (60 μm spacing) to record spontaneous activity at 20 kHz from up to 500 neurons simultaneously in slice cultures of mouse somatosensory cortex for 1 h at a time. We applied a previously validated version of transfer entropy to quantify information transfer. Similar to in vivo reports, we found an approximately lognormal distribution of firing rates. Pairwise information transfer strengths also were nearly lognormally distributed, similar to reports of synaptic strengths. Some neurons transferred and received much more information than others, which is consistent with previous predictions. Neurons with the highest outgoing and incoming information transfer were more strongly connected to each other than chance, thus forming a "rich club." We found similar results in networks recorded in vivo from rodent cortex, suggesting the generality of these findings. A rich-club structure has been found previously in large-scale human brain networks and is thought to facilitate communication between cortical regions. The discovery of a small, but information-rich, subset of neurons within cortical regions suggests that this population will play a vital role in communication, learning, and memory. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
27. Modelling a multiplex brain network by local transfer entropy
- Author
-
Fabrizio Parente and Alfredo Colosimo
- Subjects
Information transfer ,Computer science ,Science ,Entropy ,Neurophysiology ,Article ,03 medical and health sciences ,0302 clinical medicine ,Humans ,Set (psychology) ,030304 developmental biology ,0303 health sciences ,Multidisciplinary ,Network topology ,Series (mathematics) ,business.industry ,Neurosciences ,Brain ,Pattern recognition ,Cognition ,Extension (predicate logic) ,State (functional analysis) ,Computational neuroscience ,Connectome ,Medicine ,Transfer entropy ,Artificial intelligence ,business ,030217 neurology & neurosurgery - Abstract
In this work we report on a systematic study of the causal relations in information transfer mechanisms between brain regions under resting conditions. The 1000 Functional Connectomes Beijing Zang dataset was used, which includes brain functional images of 180 healthy individuals. We first characterize the information transfer mechanisms by means of Transfer Entropy concepts and, on this basis, propose a set of indexes concerning the whole functional brain network in the frame of a multilayer description. By exploring the influence of the states of two given regions at time t (At, Bt) on the state of one of them at the following time step (Bt+1), a series of time-dependent events can be observed, pointing to four kinds of significant interactions, namely: (de)activation in the same state (ActS); (de)activation in the opposite state (ActO); turn-off in the same state (TfS); turn-off in the opposite state (TfO). This leads to four specific rules and to a directional multilayer network based upon four interaction matrices, one for each rule. By hierarchical clustering methods the four rules can be reduced to two, sharing some similarities with positive and negative functional connectivity. The global architecture of the four interactions and the features of single nodes were initially explored under stationary conditions. The information transfer mechanisms on the ensuing functional network were studied by specific indexes describing, in a multilayer frame, the effects of the network structure on several dynamical processes. The healthy subjects database was used to carefully calibrate and validate the proposed approach, whose final aim remains the detection of clinical differences among individuals, as well as among different cognitive states.
- Published
- 2021
28. Structural network inference from time-series data using a generative model and transfer entropy
- Author
-
Genzhou Zhang, Yangbin Zeng, Zhihong Zhang, Zhonghao Zhang, Guo Chen, Beizhan Wang, and Edwin R. Hancock
- Subjects
Information transfer ,Computer science ,Inference ,02 engineering and technology ,Machine learning ,computer.software_genre ,01 natural sciences ,Set (abstract data type) ,Artificial Intelligence ,0103 physical sciences ,Expectation–maximization algorithm ,0202 electrical engineering, electronic engineering, information engineering ,Time series ,010306 general physics ,Training set ,business.industry ,Directed graph ,Graph ,Generative model ,Signal Processing ,Graph (abstract data type) ,020201 artificial intelligence & image processing ,Transfer entropy ,Computer Vision and Pattern Recognition ,Artificial intelligence ,business ,computer ,Software - Abstract
In this paper we develop a novel framework for inferring a generative model of network structure representing the causal relations between data for a set of objects characterized in terms of time series. To do this we make use of transfer entropy as a means of inferring directed information transfer between the time-series data. Transfer entropy allows us to infer directed edges representing the causal relations between pairs of time series, and has thus been used to infer directed graph representations of causal networks for time-series data. We use the expectation maximization algorithm to learn a generative model which captures variations in the causal network over time. We conduct experiments on fMRI brain connectivity data for subjects in different stages of the development of Alzheimer's disease (AD). Here we use the technique to learn class exemplars for different stages in the development of the disease, together with a normal control class, and demonstrate its utility in both multi-class and binary graph classification. These experiments show the effectiveness of our proposed framework when the amount of training data is relatively small.
- Published
- 2019
29. Information flow between Ibovespa and constituent companies
- Author
-
Borko Stosic, Jader da Silva Jale, Sílvio Fernando Alves Xavier Júnior, Tiago A. E. Ferreira, and Tatijana Stosic
- Subjects
Statistics and Probability ,Information transfer ,Index (economics) ,Condensed Matter Physics ,01 natural sciences ,Stock market index ,Maturity (finance) ,010305 fluids & plasmas ,0103 physical sciences ,Econometrics ,Transfer entropy ,Information flow (information theory) ,Composite index ,Developed market ,010306 general physics ,Mathematics - Abstract
We study the direction of information flow between the Ibovespa index and its constituent companies using the transfer entropy method. We find stronger information transfer from individual stocks towards the composite index than in the opposite direction. Our results differ from those found for developed markets, where the market index was identified as the driving force, indicating that the role of the index increases with the maturity of the market.
- Published
- 2019
30. Group transfer entropy with an application to cryptocurrencies
- Author
-
Franziska J. Peter and Thomas Dimpfl
- Subjects
Statistics and Probability ,Cryptocurrency ,Information transfer ,Econophysics ,Stochastic process ,Computer science ,Financial market ,Context (language use) ,Condensed Matter Physics ,01 natural sciences ,010305 fluids & plasmas ,Granger causality ,0103 physical sciences ,Econometrics ,Transfer entropy ,Predictability ,010306 general physics - Abstract
The detection of informational leadership is a core issue in financial market microstructure. We use effective group transfer entropy (EGTE) as a measure for the predictability of a stochastic process using lagged observations on multiple related processes within the same system. We propose an appropriate bootstrap to derive confidence bounds and show by means of a simulation study that standard linear approaches in economics and finance, such as vector autoregressions and Granger-causality tests, are not well suited to detect information transfer. We empirically examine the markets for cryptocurrencies using intraday data and reveal that the dependencies are mostly of nonlinear nature, highlighting the applicability of EGTE in the context of this new financial product.
- Published
- 2019
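The "effective" in effective (group) transfer entropy refers to a shuffle-based bias correction: raw TE minus the TE that survives after destroying the source's temporal structure. A minimal univariate sketch under that reading (not the paper's group-level EGTE or its bootstrap; `discrete_te` is a hypothetical plug-in helper with history length 1):

```python
import random
from collections import Counter, defaultdict
from math import log2

def discrete_te(x, y):
    """Plug-in transfer entropy TE(X -> Y) for discrete series, history 1."""
    n = len(x) - 1
    full = defaultdict(Counter)    # (y_past, x_past) -> counts of y_next
    hist = defaultdict(Counter)    # y_past -> counts of y_next
    for y_next, y_past, x_past in zip(y[1:], y[:-1], x[:-1]):
        full[(y_past, x_past)][y_next] += 1
        hist[y_past][y_next] += 1

    def cond_h(cond):
        # conditional entropy H(Y_next | context) from plug-in counts
        return -sum((c / n) * log2(c / sum(cnt.values()))
                    for cnt in cond.values() for c in cnt.values())

    return cond_h(hist) - cond_h(full)

def effective_te(x, y, n_shuffles=20, seed=0):
    """Effective TE: raw TE minus the mean TE over source-shuffled surrogates.

    Shuffling x preserves its distribution but destroys its temporal
    structure, so the surrogate TE estimates the finite-sample bias.
    """
    rng = random.Random(seed)
    surrogate = 0.0
    for _ in range(n_shuffles):
        xs = list(x)
        rng.shuffle(xs)
        surrogate += discrete_te(xs, y)
    return discrete_te(x, y) - surrogate / n_shuffles
```

For an uncoupled source the corrected value fluctuates around zero rather than around the positive plug-in bias, which is what makes shuffle correction useful before comparing estimated transfers across pairs.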
31. Cross-Frequency Transfer Entropy Characterize Coupling of Interacting Nonlinear Oscillators in Complex Systems
- Author
-
Yang Hong, Wenbin Shi, and Chien-Hung Yeh
- Subjects
Adult ,Male ,Coupling ,Physics ,Information transfer ,Signal processing ,Databases, Factual ,Entropy ,Polysomnography ,Models, Neurological ,Biomedical Engineering ,Complex system ,Brain ,Electroencephalography ,Signal Processing, Computer-Assisted ,Nonlinear system ,Amplitude ,Humans ,Entropy (information theory) ,Transfer entropy ,Sleep Stages ,Statistical physics ,Algorithms - Abstract
The purpose of this study is to introduce a method for quantifying cross-frequency information transfer in order to characterize directional couplings between irregular oscillations in complex systems. Importantly, the method should faithfully reflect the intrinsic mechanism of the interacting oscillations. Six types of interaction, comprising phase-amplitude, amplitude-amplitude, and component-amplitude cross-frequency transfer entropy as well as their inverse transfer entropies, are within our scope for untangling brain connectivity. Challenges with nonlinear and nonstationary patterns are designed to validate the robustness of the proposed method. We suggest this approach could be effective in identifying driving and responding elements of interacting oscillators across different time scales. Meanwhile, an atlas of interacting oscillators in sleep is constructed. High-frequency amplitude can inversely drive low-frequency phase more strongly than the standard phase-amplitude coupling, and the low-frequency amplitude can be the driving force of the high-frequency amplitude in addition to the low-frequency phase. Unlike standard phase-amplitude coupling, the proposed cross-frequency transfer entropy is applicable to quantifying the interactions across phases, amplitudes, or even components without methodological adjustments. Meanwhile, the exploration of causal relationships enables the identification of the driving force of information flow.
- Published
- 2019
32. Information Flow in a Boolean Network Model of Collective Behavior
- Author
-
Maurizio Porfiri
- Subjects
Information transfer ,Collective behavior ,Control and Optimization ,Theoretical computer science ,Markov chain ,Computer Networks and Communications ,Topology ,Information theory ,01 natural sciences ,010305 fluids & plasmas ,Noise ,Boolean network ,Control and Systems Engineering ,0103 physical sciences ,Signal Processing ,Transfer entropy ,Information flow (information theory) ,010306 general physics ,Mathematics - Abstract
In animal groups, leaders have often been proposed to be those individuals who possess additional knowledge about their surroundings, such as the location of a food source or a potential predator. Understanding how this information propagates through the group to shape collective response is an important step to elucidate the evolutionary basis of leadership. In this paper, we study a Boolean model of collective behavior, in which a single leader interacts with a group of followers in a binary decision-making process. Through an analytical treatment of the associated Markov chain, we establish closed-form solutions for the transition probability matrix and the stationary distribution, as functions of the noise in the decision-making process and the size of the group. We leverage these expressions to quantify information transfer within the group, measured through the information-theoretic construct of transfer entropy. We find that information transfer depends nonlinearly on the group size and noise. For low noise intensities, the system is nearly deterministic, such that no information is shared within the group; an equivalent effect is observed for large noise intensities, which mask the information transfer. We determine the existence of critical noise intensities at which the leader maximizes information transfer to a follower or followers maximize information sharing between each other for a given group size. These analytical findings suggest that noise might have a positive role in collective behavior, facilitating the transfer of knowledge within the group, from leaders to followers.
- Published
- 2018
33. Spectral Ranking of Causal Influence in Complex Systems
- Author
-
Tom Heskes, Tom Claassen, and Errol Zalmijn
- Subjects
FOS: Computer and information sciences ,Physics - Physics and Society ,Information transfer ,Computer science ,Computer Science - Information Theory ,Complex system ,FOS: Physical sciences ,General Physics and Astronomy ,lcsh:Astrophysics ,Physics and Society (physics.soc-ph) ,010103 numerical & computational mathematics ,Bivariate analysis ,computer.software_genre ,01 natural sciences ,Measure (mathematics) ,Article ,010305 fluids & plasmas ,lcsh:QB460-466 ,0103 physical sciences ,node importance ,0101 mathematics ,lcsh:Science ,complex systems ,Information Theory (cs.IT) ,Data Science ,transfer entropy ,eigenvector centrality ,original information ,Directed graph ,lcsh:QC1-999 ,Nonlinear Sciences - Adaptation and Self-Organizing Systems ,coupled Lorenz systems ,Ranking ,Graph (abstract data type) ,lcsh:Q ,Transfer entropy ,Data mining ,time series ,Adaptation and Self-Organizing Systems (nlin.AO) ,computer ,lcsh:Physics - Abstract
Like natural complex systems such as the Earth's climate or a living cell, semiconductor lithography systems are characterized by nonlinear dynamics across more than a dozen orders of magnitude in space and time. Thousands of sensors measure relevant process variables at appropriate sampling rates, to provide time series as primary sources for system diagnostics. However, high-dimensionality, non-linearity and non-stationarity of data remain a major challenge to effectively diagnose rare or new system issues by merely using model-based approaches. To reduce the causal search space, we validate an algorithm that applies transfer entropy to obtain a weighted directed graph from a system's multivariate time series and graph eigenvector centrality to identify the system's most influential parameters. The results suggest that this approach robustly identifies the true influential sources in a complex system, even when its information transfer network includes redundant edges.
- Published
- 2021
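The ranking step described above can be sketched as power iteration on a TE-weighted adjacency matrix. A generic eigenvector-centrality sketch; the convention `W[i, j]` = transfer from node i to node j, and the scoring of sources through the scores of their targets, are illustrative assumptions rather than the authors' exact algorithm:

```python
import numpy as np

def influence_centrality(W, iters=500, tol=1e-12):
    """Eigenvector centrality on a weighted directed graph.

    W[i, j] holds the information transfer from node i to node j.
    A node scores highly when it drives nodes that themselves score
    highly: c = W @ c up to normalization, solved by power iteration.
    """
    m = W.shape[0]
    c = np.ones(m) / m
    for _ in range(iters):
        c_next = W @ c
        s = c_next.sum()
        if s == 0:                      # no outgoing influence anywhere
            return c
        c_next /= s                     # keep scores on the unit simplex
        if np.abs(c_next - c).max() < tol:
            break
        c = c_next
    return c_next
```

On a small strongly connected example where node 0 sends the heaviest transfers into a feedback loop, node 0 receives the top centrality score, matching the intuition of "most influential parameter".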
34. Local Granger causality
- Author
-
Luca Faes, Yuri Antonacci, Tomas Scagliarini, Sebastiano Stramaglia, Stramaglia, Sebastiano, Scagliarini, Toma, Antonacci, Yuri, and Faes, Luca
- Subjects
FOS: Computer and information sciences ,Information transfer ,Gaussian ,FOS: Physical sciences ,techniques ,information theory ,granger causality ,Machine Learning (stat.ML) ,Quantitative Biology - Quantitative Methods ,01 natural sciences ,010305 fluids & plasmas ,Vector autoregression ,symbols.namesake ,Granger causality ,Statistics - Machine Learning ,0103 physical sciences ,Applied mathematics ,time serie ,010306 general physics ,Quantitative Methods (q-bio.QM) ,Mathematics ,Stochastic process ,Disordered Systems and Neural Networks (cond-mat.dis-nn) ,Condensed Matter - Disordered Systems and Neural Networks ,Computational Physics (physics.comp-ph) ,Discrete time and continuous time ,Autoregressive model ,FOS: Biological sciences ,Settore ING-INF/06 - Bioingegneria Elettronica E Informatica ,symbols ,Transfer entropy ,Physics - Computational Physics - Abstract
Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. For Gaussian variables it is equivalent to transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes. We exploit such equivalence and calculate exactly the 'local Granger causality', i.e. the profile of the information transfer at each discrete time point in Gaussian processes; in this frame Granger causality is the average of its local version. Our approach offers a robust and computationally fast method to follow the information transfer along the time history of linear stochastic processes, as well as of nonlinear complex systems studied in the Gaussian approximation.
- Published
- 2021
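The Gaussian equivalence invoked above (Granger causality equals twice the transfer entropy in nats) can be illustrated with a minimal order-1 bivariate estimator: GC is the log ratio of the residual variances of the restricted and full autoregressions. A sketch only, with illustrative VAR coefficients; the paper's exact local (per-time-point) decomposition is not reproduced here:

```python
import numpy as np

def granger_causality(x, y):
    """Order-1 bivariate Granger causality GC(x -> y).

    GC = ln( var(residuals of y_t ~ y_{t-1})
           / var(residuals of y_t ~ y_{t-1} + x_{t-1}) );
    for jointly Gaussian processes this equals twice the transfer
    entropy measured in nats.
    """
    y_next, y_past, x_past = y[1:], y[:-1], x[:-1]
    ones = np.ones_like(y_past)

    def resid_var(design):
        beta, *_ = np.linalg.lstsq(design, y_next, rcond=None)
        return (y_next - design @ beta).var()

    restricted = resid_var(np.column_stack([y_past, ones]))
    full = resid_var(np.column_stack([y_past, x_past, ones]))
    return np.log(restricted / full)

# Illustrative VAR(1): y is driven by x's past, but not vice versa.
rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

gc_xy = granger_causality(x, y)   # clearly positive: x helps predict y
gc_yx = granger_causality(y, x)   # near zero: y does not help predict x
```

Averaging the pointwise log-likelihood ratios over time recovers this global value, which is the sense in which the paper's local Granger causality averages to the standard one.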
35. Bidirected Information Flow in the High-Level Visual Cortex
- Author
-
Qiang Li
- Subjects
Information transfer ,Artificial neural network ,Computer science ,business.industry ,Information flow ,Pattern recognition ,Information theory ,Field (geography) ,Visual cortex ,medicine.anatomical_structure ,Flow (mathematics) ,medicine ,Transfer entropy ,Artificial intelligence ,business - Abstract
Understanding brain function requires investigating information transfer across brain regions. Shannon founded the remarkable field of information theory in 1948; information-theoretical approaches can broadly be divided into two categories, directed and undirected. Neural signals are typically nonlinear, and information flows directionally between brain regions. We can use directed information to quantify feed-forward information flow, feedback information flow, and instantaneous influence in the high-level visual cortex. Moreover, neural signals have bidirectional information flow properties that are not captured by the transfer entropy approach. Therefore, we used directed information to quantify bidirectional information flow in this study. We found that there is information flow between the scene-selective areas (e.g., OPA, PPA, RSC) and object-selective areas (e.g., LOC). Specifically, strong information flow exists between RSC and LOC. This suggests that functional coupling between RSC and LOC plays a vital role in the categorization and recognition of visual scenes and objects in our daily lives. Meanwhile, we also found weak reverse-directed information flow in the visual scene and object neural networks.
- Published
- 2021
36. On the impact of publicly available news and information transfer to financial markets
- Author
-
Petter N. Kolm, Metod Jazbec, Felix Faltings, Barna Pasztor, and Nino Antulov-Fantulin
- Subjects
FOS: Computer and information sciences ,Computer Science and Artificial Intelligence ,Computer Science - Machine Learning ,Physics - Physics and Society ,Information transfer ,Index (economics) ,InformationSystems_INFORMATIONSTORAGEANDRETRIEVAL ,FOS: Physical sciences ,Physics and Society (physics.soc-ph) ,Machine Learning (cs.LG) ,FOS: Economics and business ,0502 economics and business ,financial markets ,Trading strategy ,050207 economics ,complex systems ,Research Articles ,Statistical Finance (q-fin.ST) ,Quantitative Finance - Trading and Market Microstructure ,050208 finance ,Multidisciplinary ,05 social sciences ,Financial market ,Sentiment analysis ,transfer entropy ,1. No poverty ,Equity (finance) ,Quantitative Finance - Statistical Finance ,Data science ,Stock market index ,Trading and Market Microstructure (q-fin.TR) ,machine learning ,sentiment analysis ,Stock market ,Business ,InformationSystems_MISCELLANEOUS - Abstract
We quantify the propagation and absorption of large-scale publicly available news articles from the World Wide Web to financial markets. To extract publicly available information, we use the news archives from the Common Crawl, a non-profit organization that crawls a large part of the web. We develop a processing pipeline to identify news articles associated with the constituent companies in the S&P 500 index, an equity market index that measures the stock performance of US companies. Using machine learning techniques, we extract sentiment scores from the Common Crawl News data and employ tools from information theory to quantify the information transfer from public news articles to the US stock market. Furthermore, we analyse and quantify the economic significance of the news-based information with a simple sentiment-based portfolio trading strategy. Our findings provide support for the hypothesis that information in publicly available news on the World Wide Web has a statistically and economically significant impact on events in financial markets.
- Published
- 2021
37. Information rate in humans during visuomotor tracking
- Author
-
Sze-Ying Lam, Alexandre Zénon, Institut de Neurosciences cognitives et intégratives d'Aquitaine (INCIA), and Université Bordeaux Segalen - Bordeaux 2-Université Sciences et Technologies - Bordeaux 1-SFR Bordeaux Neurosciences-Centre National de la Recherche Scientifique (CNRS)
- Subjects
Information transfer ,information-processing rate ,Computer science ,[SDV]Life Sciences [q-bio] ,visuo-motor tracking ,Context (language use) ,lcsh:Astrophysics ,050105 experimental psychology ,Article ,[SHS]Humanities and Social Sciences ,03 medical and health sciences ,[SCCO]Cognitive science ,0302 clinical medicine ,Component (UML) ,lcsh:QB460-466 ,0501 psychology and cognitive sciences ,Predictability ,lcsh:Science ,05 social sciences ,Information processing ,transfer entropy ,[SDV.NEU.SC]Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]/Cognitive Sciences ,Code rate ,lcsh:QC1-999 ,lcsh:Q ,Algorithm ,030217 neurology & neurosurgery ,lcsh:Physics - Abstract
While previous studies of human information rate focused primarily on discrete forced-choice tasks, we extend the scope of the investigation to the framework of sensorimotor tracking of continuous signals. We show how considering information transfer in this context sheds new light on the problem; crucially, such an analysis requires one to consider and carefully disentangle the effects due to real-time information processing of surprising inputs (feedback component) from the contribution to performance due to prediction (feedforward component). We argue that only the former constitutes a faithful representation of the true information processing rate. We provide information-theoretic measures which separately quantify these components and show that they correspond to a decomposition of the total information shared between target and tracking signals. We employ a linear quadratic regulator model to provide evidence for the validity of the measures, as well as of the estimator of visual-motor delay (VMD) from experimental data, instrumental to compute them in practice. On experimental tracking data, we show that the contribution of prediction as computed by the feedforward measure increases with the predictability of the signal, confirming previous findings. Importantly, we further find the feedback component to be modulated by task difficulty, with higher information transmission rates observed with noisier signals. Such opposite trends between feedback and feedforward point to a tradeoff between cognitive resources/effort and performance gain.
Author summary: Previous investigations concluded that the human brain's information processing rate remains fundamentally constant, irrespective of task demands. However, their conclusion rested on analyses of simple discrete-choice tasks. The present contribution recasts the question of human information rate within the context of visuomotor tasks, which provides a more ecologically relevant arena, albeit a more complex one.
We argue that, while predictable aspects of inputs can be encoded virtually free of charge, real-time information transfer should be identified with the processing of surprises. We formalise this intuition by deriving from first principles a decomposition of the total information shared by inputs and outputs into a feedforward, predictive component and a feedback, error-correcting component. We find that the information measured by the feedback component, a proxy for the brain’s information processing rate, scales with the difficulty of the task at hand, in agreement with cost-benefit models of cognitive effort.
- Published
- 2020
38. Cryptic Information Transfer in Differently-Trained Recurrent Neural Networks
- Author
-
Arend Hintze and Christoph Adami
- Subjects
0209 industrial biotechnology ,Information transfer ,Artificial neural network ,business.industry ,Node (networking) ,Computer Science::Neural and Evolutionary Computation ,02 engineering and technology ,Backpropagation ,020901 industrial engineering & automation ,Recurrent neural network ,Genetic algorithm ,0202 electrical engineering, electronic engineering, information engineering ,Entropy (information theory) ,020201 artificial intelligence & image processing ,Transfer entropy ,Artificial intelligence ,business - Abstract
Artificial neural networks (ANNs) are one of the most promising tools in the quest to develop general artificial intelligence. Their design was inspired by how neurons connect and process information in natural brains, the only other substrate known to harbor intelligence. Compared to natural brains, which are sparsely connected and form sparsely distributed representations, ANNs instead process information by connecting all nodes of one layer to all nodes of the next. In addition, modern ANNs are trained with backpropagation, while their natural counterparts have been optimized by natural evolution over eons. Here we measure the transfer entropy, that is, the information that is transferred from one node to another, to determine how information propagates through recurrent neural networks optimized either by backpropagation or by a genetic algorithm. Surprisingly, we find no difference in how they relay information, suggesting that it is not the optimization method, but instead their topological structure, that causes these ANNs to process information differently compared to the natural brains they seek to emulate.
- Published
- 2020
39. Entropy
- Author
-
Nicole Abaid, Irena Shaffer, Biomedical Engineering and Mechanics, and Mathematics
- Subjects
Information transfer ,Collective behavior ,animal group interaction ,microphone arrays ,Computer science ,General Physics and Astronomy ,lcsh:Astrophysics ,Human echolocation ,01 natural sciences ,Sonar ,Article ,03 medical and health sciences ,0302 clinical medicine ,3D tracking ,Position (vector) ,lcsh:QB460-466 ,0103 physical sciences ,lcsh:Science ,010306 general physics ,Myotis grisescens ,bat swarms ,biology ,business.industry ,transfer entropy ,Active sensing ,Pattern recognition ,biology.organism_classification ,lcsh:QC1-999 ,lcsh:Q ,Transfer entropy ,Artificial intelligence ,business ,lcsh:Physics ,030217 neurology & neurosurgery - Abstract
Many animal species, including many species of bats, exhibit collective behavior where groups of individuals coordinate their motion. Bats are unique among these animals in that they use the active sensing mechanism of echolocation as their primary means of navigation. Due to their use of echolocation in large groups, bats run the risk of signal interference from sonar jamming. However, several species of bats have developed strategies to prevent interference, which may lead to different behavior when flying with conspecifics than when flying alone. This study seeks to explore the role of this acoustic sensing on the behavior of bat pairs flying together. Field data from a maternity colony of gray bats (Myotis grisescens) were collected using an array of cameras and microphones. These data were analyzed using the information theoretic measure of transfer entropy in order to quantify the interaction between pairs of bats and to determine the effect echolocation calls have on this interaction. This study expands on previous work that only computed information theoretic measures on the 3D position of bats without echolocation calls or that looked at the echolocation calls without using information theoretic analyses. Results show that there is evidence of information transfer between bats flying in pairs when time series for the speed of the bats and their turning behavior are used in the analysis. Unidirectional information transfer was found in some subsets of the data, which could be evidence of a leader-follower interaction.
- Published
- 2020
40. Susceptibility of Stock Market Returns to International Economic Policy: Evidence from Effective Transfer Entropy of Africa with the Implication for Open Innovation
- Author
-
Anokye M. Adam
- Subjects
Emerging stock markets ,Information transfer ,lcsh:Management. Industrial management ,Sociology and Political Science ,economic policy uncertainty ,Economic policy ,Development ,lcsh:Business ,ddc:650 ,0502 economics and business ,Economics ,050207 economics ,Stock (geology) ,Open innovation ,050208 finance ,05 social sciences ,effective transfer entropy ,Rényi transfer entropy ,African stock markets ,Stock market index ,lcsh:HD28-70 ,Transfer entropy ,Stock market ,lcsh:HF5001-6182 ,General Economics, Econometrics and Finance - Abstract
This study contributes to the scant finance literature on information flow from international economic policy uncertainty to emerging stock markets in Africa, using daily US economic policy uncertainty as a proxy and the daily stock market index for Botswana, Egypt, Ghana, Kenya, Morocco, Nigeria, Namibia, South Africa, and Zambia from 31 December 2010 to 27 May 2020, using the Rényi effective transfer entropy. International economic policy uncertainty transmits significant information to Egypt, Ghana, Morocco, Namibia, and South Africa, and insignificant information to Botswana, Kenya, Nigeria, and Zambia. The asymmetry in the information transfer tends to make the African market an alternative for the diversification of international portfolios when the uncertainty of the global economic policy is on the rise. The findings also have implications for the adoption of open innovation in African stock markets.
- Published
- 2020
41. Synergistic Information Transfer in the Global System of Financial Markets
- Author
-
Tomas Scagliarini, Daniele Marinazzo, Rosario N. Mantegna, Luca Faes, and Sebastiano Stramaglia
- Subjects
Information transfer ,FLOW ,General Physics and Astronomy ,synergy ,lcsh:Astrophysics ,GRANGER CAUSALITY ,Article ,econometrics ,stock market ,Business and Economics ,Granger causality ,Financial markets,Higher order dependencies, Synergy ,Order (exchange) ,lcsh:QB460-466 ,Economics ,Econometrics ,financial markets ,Information flow (information theory) ,NETWORK ,lcsh:Science ,information theory ,higher order dependencies ,CROSS-CORRELATIONS ,Financial market ,Stock market index ,lcsh:QC1-999 ,Mathematics and Statistics ,time series analysis ,lcsh:Q ,Transfer entropy ,Stock market ,lcsh:Physics - Abstract
Uncovering dynamic information flow between stock market indices has been the topic of several studies which exploited the notion of transfer entropy or Granger causality, its linear version. The output of the transfer entropy approach is a directed weighted graph measuring the information about the future state of each target provided by the knowledge of the state of each driving stock market index. In order to go beyond the pairwise description of the information flow, thus looking at higher order informational circuits, here we apply the partial information decomposition to triplets consisting of a pair of driving markets (belonging to America or Europe) and a target market in Asia. Our analysis, on daily data recorded during the years 2000 to 2019, allows the identification of the synergistic information that a pair of drivers carries about the target. By studying the influence of the closing returns of drivers on the subsequent overnight changes of target indexes, we find that (i) Korea, Tokyo, Hong Kong, and Singapore are, in order, the most influenced Asian markets, (ii) US indices SP500 and Russell are the strongest drivers with respect to the bivariate Granger causality, and (iii) concerning higher order effects, pairs of European and American stock market indices play a major role as the most synergistic three-variable circuits. Our results show that synergy, a proxy of higher order predictive information flow rooted in information theory, provides details that are complementary to those obtained from bivariate and global Granger causality, and can thus be used to get a better characterization of the global financial system.
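Partial information decomposition admits several definitions of redundancy; the sketch below uses the Williams-Beer I_min measure (an assumption, not necessarily the variant used in the study) and obtains synergy by inclusion-exclusion, S = I(X1,X2;Y) - I(X1;Y) - I(X2;Y) + R, on toy logic-gate distributions rather than market returns:

```python
from collections import defaultdict
from math import log2

def marginal(p, idx):
    """Marginal distribution over the variables at the given indices."""
    m = defaultdict(float)
    for k, v in p.items():
        m[tuple(k[i] for i in idx)] += v
    return m

def mutual_info(p, idx_a, idx_b):
    pa, pb = marginal(p, idx_a), marginal(p, idx_b)
    pab = marginal(p, idx_a + idx_b)
    na = len(idx_a)
    return sum(v * log2(v / (pa[k[:na]] * pb[k[na:]]))
               for k, v in pab.items() if v > 0)

def redundancy_imin(p):
    """Williams-Beer I_min: for each outcome y, the minimum over single
    sources of the specific information I(X_i ; Y=y), averaged over p(y)."""
    py = marginal(p, (2,))
    total = 0.0
    for (y,), pyv in py.items():
        specs = []
        for i in (0, 1):
            pxi = marginal(p, (i,))
            pxiy = marginal(p, (i, 2))
            s = sum((v / pyv) * log2((v / pxi[(xv,)]) / pyv)
                    for (xv, yv), v in pxiy.items() if yv == y and v > 0)
            specs.append(s)
        total += pyv * min(specs)
    return total

def synergy(p):
    joint = mutual_info(p, (0, 1), (2,))
    i1, i2 = mutual_info(p, (0,), (2,)), mutual_info(p, (1,), (2,))
    return joint - i1 - i2 + redundancy_imin(p)

# Toy distributions over (x1, x2, y), inputs uniform and independent.
xor = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
and_gate = {(a, b, a & b): 0.25 for a in (0, 1) for b in (0, 1)}
syn_xor = synergy(xor)        # 1 bit: XOR is purely synergistic
syn_and = synergy(and_gate)   # 0.5 bit under I_min
```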
- Published
- 2020
42. Collective Pulsing in Xeniid Corals: Part I—Using Computer Vision and Information Theory to Search for Coordination
- Author
-
Simon Garnier, Julia E. Samson, Dylan D. Ray, Maurizio Porfiri, and Laura Miller
- Subjects
0301 basic medicine ,Cnidaria ,Collective behavior ,Information transfer ,Computer science ,General Mathematics ,Immunology ,Information Theory ,Video Recording ,Information theory ,Models, Biological ,General Biochemistry, Genetics and Molecular Biology ,03 medical and health sciences ,0302 clinical medicine ,Alcyonacea ,Artificial Intelligence ,Animals ,Computer Simulation ,Symbiosis ,General Environmental Science ,Pharmacology ,Behavior, Animal ,biology ,General Neuroscience ,Mathematical Concepts ,Anthozoa ,Fluid transport ,biology.organism_classification ,030104 developmental biology ,Computational Theory and Mathematics ,030220 oncology & carcinogenesis ,Hydrodynamics ,Transfer entropy ,General Agricultural and Biological Sciences ,Isomap ,Biological system ,Algorithms - Abstract
Xeniid corals (Cnidaria: Alcyonacea), a family of soft corals, include species displaying a characteristic pulsing behavior. This behavior has been shown to increase oxygen diffusion away from the coral tissue, resulting in higher photosynthetic rates from mutualistic symbionts. Maintaining such a pulsing behavior comes at a high energetic cost, and it has been proposed that coordinating the pulse of individual polyps within a colony might enhance the efficiency of fluid transport. In this paper, we test whether patterns of collective pulsing emerge in coral colonies and investigate possible interactions between polyps within a colony. We video-recorded different colonies of Heteroxenia sp. in a laboratory environment. Our methodology is based on the systematic integration of a computer vision algorithm (ISOMAP) and an information-theoretic approach (transfer entropy), offering a vantage point to assess coordination in collective pulsing. Perhaps surprisingly, we did not detect any form of collective pulsing behavior in the colonies. Using artificial data sets, however, we do demonstrate that our methodology is capable of detecting even weak information transfer. The lack of coordination is consistent with previous work on many cnidarians in which coordination between actively pulsing polyps and medusae has not been observed. In our companion paper, we show that there is no fluid dynamic benefit of coordinated pulsing, supporting this result. The lack of coordination, coupled with no obvious fluid dynamic benefit to grouping, suggests that there may be non-fluid mechanical advantages to forming colonies, such as predator avoidance and defense.
- Published
- 2020
43. Effective Transfer Entropy Approach to Information Flow Among EPU, Investor Sentiment and Stock Market
- Author
-
Hong-Yu Li and Can-Zhong Yao
- Subjects
Information transfer ,Stationary process ,information flow ,Materials Science (miscellaneous) ,Lag ,EPU ,Biophysics ,investor sentiment ,transfer entropy ,General Physics and Astronomy ,stock market ,lcsh:QC1-999 ,Exchange rate ,Econometrics ,Economics ,Transfer entropy ,Stock market ,Physical and Theoretical Chemistry ,Mathematical Physics ,Bandwagon effect ,Stock (geology) ,lcsh:Physics - Abstract
Although transfer entropy can test the nonlinear causal relationship between sequences, it is mainly used for stationary data. For nonstationary sequences with large fluctuations, the traditional transfer entropy method has obvious defects. Based on traditional transfer entropy, this paper proposes a transfer entropy method with rolling windows. We verify that this new method can capture the dynamic order between sequences and better reveal the nonlinear causality between nonstationary time series. Furthermore, we construct an investor sentiment index based on principal component analysis, and based on the proposed dynamic transfer entropy model, we analyze the information transfer relationship among economic policy uncertainty (EPU), investor sentiment and stock markets. The results of the information flow analysis of EPU and investor sentiment show that EPU influenced investor sentiment mainly from August 2015 to June 2016. Among different policies, China’s exchange rate reform policy and ‘circuit-breaker’ policy in the stock market have played an important role. The analysis of the information flow between sentiment and stock price returns shows that investor sentiment is more a reflection of changes in stock price returns with a 1-month lag order and that the stock market has a significant bargainer effect and a weaker bandwagon effect. There is no significant information flow transmission relationship between EPU and stock market volatility, which indicates that stock market fluctuations are basically not affected by national policy fluctuations. Although investor sentiment is affected by changes such as exchange rate reform and stock market policies, many investors do not form consensus expectations.
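The rolling-window idea can be sketched by recomputing a plug-in transfer entropy over successive windows; the estimator and the switched-coupling toy series below are illustrative stand-ins for the paper's EPU and sentiment data:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(source, target):
    """Plug-in TE(source -> target) in bits, history length 1."""
    triples = list(zip(target[1:], target[:-1], source[:-1]))
    n = len(triples)
    p_all = Counter(triples)
    p_yx = Counter((y, x) for _, y, x in triples)
    p_ny = Counter((yn, y) for yn, y, _ in triples)
    p_y = Counter(y for _, y, _ in triples)
    return sum((c / n) * log2(c * p_y[y] / (p_yx[(y, x)] * p_ny[(yn, y)]))
               for (yn, y, x), c in p_all.items())

def rolling_te(source, target, window, step):
    """Recompute TE over sliding windows to track time-varying coupling
    in nonstationary series."""
    return [(start, transfer_entropy(source[start:start + window],
                                     target[start:start + window]))
            for start in range(0, len(source) - window + 1, step)]

# Toy nonstationary pair: independent in the first half,
# strongly coupled (y copies x with lag 1) in the second half.
random.seed(3)
n = 2000
x = [random.randint(0, 1) for _ in range(n)]
y = [random.randint(0, 1) for _ in range(n)]
for t in range(n // 2, n - 1):
    y[t + 1] = x[t]
windows = rolling_te(x, y, window=500, step=500)
```

The per-window TE values jump once the window enters the coupled regime, which is the dynamic ordering the rolling approach is designed to expose.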
- Published
- 2020
44. Measuring spectrally-resolved information transfer
- Author
-
Edoardo Pinzuti, Patricia Wollstadt, Aaron J. Gutknecht, Michael Wibral, and Oliver Tüscher
- Subjects
0301 basic medicine ,Discrete wavelet transform ,Information transfer ,Computer science ,Entropy ,Information Theory ,0302 clinical medicine ,Wavelet ,Mathematical and Statistical Techniques ,Medicine and Health Sciences ,Biology (General) ,Wavelet Transforms ,Temporal cortex ,Mammals ,Ecology ,Systems Biology ,Applied Mathematics ,Simulation and Modeling ,Physics ,Wavelet transform ,Magnetoencephalography ,Eukaryota ,Brain ,Signal Filtering ,Computational Theory and Mathematics ,Modeling and Simulation ,Physical Sciences ,Vertebrates ,Thermodynamics ,Engineering and Technology ,Wavelet transforms ,Algorithms ,Information entropy ,Signal filtering ,Ferrets ,Permutation ,Anatomy ,Algorithm ,Information Entropy ,Research Article ,Computer and Information Sciences ,QH301-705.5 ,Wavelet Analysis ,Prefrontal Cortex ,Research and Analysis Methods ,03 medical and health sciences ,Cellular and Molecular Neuroscience ,Genetics ,Entropy (information theory) ,Animals ,Humans ,Information flow (information theory) ,Molecular Biology ,Ecology, Evolution, Behavior and Systematics ,Discrete Mathematics ,Organisms ,Biology and Life Sciences ,030104 developmental biology ,Combinatorics ,Signal Processing ,Amniotes ,Transfer entropy ,Zoology ,Mathematical Functions ,030217 neurology & neurosurgery ,Mathematics - Abstract
Information transfer, measured by transfer entropy, is a key component of distributed computation. It is therefore important to understand the pattern of information transfer in order to unravel the distributed computational algorithms of a system. Since in many natural systems distributed computation is thought to rely on rhythmic processes, a frequency-resolved measure of information transfer is highly desirable. Here, we present a novel algorithm, and its efficient implementation, to identify separately frequencies sending and receiving information in a network. Our approach relies on the invertible maximum overlap discrete wavelet transform (MODWT) for the creation of surrogate data in the computation of transfer entropy and entirely avoids filtering of the original signals. The approach thereby avoids well-known problems due to phase shifts or the ineffectiveness of filtering in the information theoretic setting. We also show that measuring frequency-resolved information transfer is a partial information decomposition problem that cannot be fully resolved to date and discuss the implications of this issue. Last, we evaluate the performance of our algorithm on simulated data and apply it to human magnetoencephalography (MEG) recordings and to local field potential recordings in the ferret. In human MEG we demonstrate top-down information flow in temporal cortex from very high frequencies (above 100 Hz) to both similarly high frequencies and to frequencies around 20 Hz, i.e., a complex spectral configuration of cortical information transmission that has not been described before. In the ferret we show that the prefrontal cortex sends information at low frequencies (4-8 Hz) to early visual cortex (V1), while V1 receives the information at high frequencies (> 125 Hz). Author summary: Systems in nature that perform computations typically consist of a large number of relatively simple but interacting parts. 
In human brains, for example, billions of neurons work together to enable our cognitive abilities. This well-orchestrated teamwork requires information to be exchanged very frequently. In many cases this exchange happens rhythmically and, therefore, it seems beneficial for our understanding of physical systems if we could link the information exchange to specific rhythms. We here present a method to determine which rhythms send, and which rhythms receive information. Since many rhythms can interact at both sender and receiver side, we show that the above problem is tightly linked to partial information decomposition—an intriguing problem from information theory only solved recently, and only partly. We applied our novel method to information transfer in the human inferior temporal cortex, a brain region relevant for object perception, and unexpectedly found information transfer originating at very high frequencies at 100Hz and then forking to be received at both similarly high but also much lower frequencies around 20Hz. These results overturn the current standard assumption that low frequencies send information to high frequencies.
- Published
- 2020
45. Improving on transfer entropy-based network reconstruction using time-delays: Approach and validation
- Author
-
Rifat Sipahi and Maurizio Porfiri
- Subjects
Information transfer ,Theoretical computer science ,Dynamical systems theory ,Computer science ,Applied Mathematics ,Computation ,Chaotic ,General Physics and Astronomy ,Statistical and Nonlinear Physics ,Dynamical system ,01 natural sciences ,010305 fluids & plasmas ,0103 physical sciences ,Premise ,Metric (mathematics) ,Transfer entropy ,010306 general physics ,Mathematical Physics - Abstract
Transfer entropy constitutes a viable model-free tool to infer causal relationships between two dynamical systems from their time-series. In an information-theoretic sense, transfer entropy associates a cause-and-effect relationship with directed information transfer, such that one may improve the prediction of the future of a dynamical system from the history of another system. Recent studies have proposed the use of transfer entropy to reconstruct networks, but the inherent dyadic nature of this metric challenges the development of a robust approach that can discriminate direct from indirect interactions between nodes. In this paper, we seek to fill this methodological gap through the cogent integration of time-delays in the transfer entropy computation. By recognizing that information transfer in the network is bound by a finite speed, we relate the value of the time-delayed transfer entropy between two nodes to the number of walks between them. Upon this premise, we lay out the foundation of an alternative framework for network reconstruction, which we illustrate through closed-form results on three-node networks and numerically validate on larger networks, using examples of Boolean models and chaotic maps.
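The core ingredient, a lag parameter in the transfer entropy computation, can be sketched as follows; scanning the lag and taking the maximum recovers the interaction delay in this toy delayed-copy example (an illustrative assumption, not the paper's Boolean or chaotic-map benchmarks):

```python
from collections import Counter
from math import log2
import random

def delayed_te(source, target, u):
    """Plug-in time-delayed transfer entropy with source lag u >= 1:
    how much source[t-u] adds to predicting target[t] beyond target[t-1]."""
    triples = list(zip(target[u:], target[u - 1:-1], source[:-u]))
    n = len(triples)
    p_all = Counter(triples)
    p_yx = Counter((y, x) for _, y, x in triples)
    p_ny = Counter((yn, y) for yn, y, _ in triples)
    p_y = Counter(y for _, y, _ in triples)
    return sum((c / n) * log2(c * p_y[y] / (p_yx[(y, x)] * p_ny[(yn, y)]))
               for (yn, y, x), c in p_all.items())

# Toy network edge: y is a copy of x delayed by exactly 3 steps,
# so the delay scan should peak sharply at u = 3.
random.seed(4)
x = [random.randint(0, 1) for _ in range(4000)]
y = [0, 0, 0] + x[:-3]
scan = {u: delayed_te(x, y, u) for u in range(1, 6)}
best = max(scan, key=scan.get)
```

Reading off the maximizing delay per pair of nodes is what lets the framework separate direct interactions (short walks) from indirect ones (longer walks) in the reconstruction.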
- Published
- 2020
46. Measuring spectrally-resolved information transfer for sender- and receiver-specific frequencies
- Author
-
P. Wollstadt, Oliver Tüscher, Edoardo Pinzuti, Michael Wibral, and Aaron J. Gutknecht
- Subjects
Temporal cortex ,Information transfer ,Computer science ,Information theory ,01 natural sciences ,Surrogate data ,03 medical and health sciences ,0302 clinical medicine ,0103 physical sciences ,Transfer entropy ,Communication source ,Information flow (information theory) ,010306 general physics ,Algorithm ,030217 neurology & neurosurgery ,Information exchange - Abstract
Information transfer, measured by transfer entropy, is a key component of distributed computation. It is therefore important to understand the pattern of information transfer in order to unravel the distributed computational algorithms of a system. Since in many natural systems distributed computation is thought to rely on rhythmic processes, a frequency-resolved measure of information transfer is highly desirable. Here, we present a novel algorithm, and its efficient implementation, to identify separately frequencies sending and receiving information in a network. Our approach relies on the invertible maximum overlap discrete wavelet transform (MODWT) for the creation of surrogate data in the computation of transfer entropy and entirely avoids filtering of the original signals. The approach thereby avoids well-known problems due to phase shifts or the ineffectiveness of filtering in the information theoretic setting. We also show that measuring frequency-resolved information transfer is a partial information decomposition problem that cannot be fully resolved to date and discuss the implications of this issue. Last, we evaluate the performance of our algorithm on simulated data and apply it to human magnetoencephalography (MEG) recordings and to local field potential recordings in the ferret. In human MEG we demonstrate top-down information flow in temporal cortex from very high frequencies (above 100 Hz) to both similarly high frequencies and to frequencies around 20 Hz, i.e., a complex spectral configuration of cortical information transmission that has not been described before. In the ferret we show that the prefrontal cortex sends information at low frequencies (4-8 Hz) to early visual cortex (V1), while V1 receives the information at high frequencies (> 125 Hz). Author Summary: Systems in nature that perform computations typically consist of a large number of relatively simple but interacting parts. 
In human brains, for example, billions of neurons work together to enable our cognitive abilities. This well-orchestrated teamwork requires information to be exchanged very frequently. In many cases this exchange happens rhythmically and, therefore, it seems beneficial for our understanding of physical systems if we could link the information exchange to specific rhythms. We here present a method to determine which rhythms send, and which rhythms receive information. Since many rhythms can interact at both sender and receiver side, we show that the interpretation of results always needs to consider that the above problem is tightly linked to partial information decomposition - an intriguing problem from information theory only solved recently, and only partly. We applied our novel method to information transfer in the human inferior temporal cortex, a brain region relevant for object perception, and unexpectedly found information transfer originating at very high frequencies at 100Hz and then forking to be received at both similarly high but also much lower frequencies around 20Hz. These results overturn the current standard assumption that low frequencies send information to high frequencies.
- Published
- 2020
- Full Text
- View/download PDF
47. Information Transfer between Stock Market Sectors: A Comparison between the USA and China
- Author
-
Wei-Xing Zhou, Peng Yue, Jonathan A. Batten, and Yaodong Fan
- Subjects
Information transfer ,Statistical Finance (q-fin.ST) ,Financial market ,transfer entropy ,General Physics and Astronomy ,Quantitative Finance - Statistical Finance ,Financial system ,Real estate ,lcsh:Astrophysics ,stock markets ,lcsh:QC1-999 ,Article ,Corporate finance ,FOS: Economics and business ,econophysics ,lcsh:QB460-466 ,information transfer ,Stock market ,Profitability index ,lcsh:Q ,Business ,lcsh:Science ,Information exchange ,Stock (geology) ,lcsh:Physics - Abstract
Information diffusion within financial markets plays a crucial role in the process of price formation and the propagation of sentiment and risk. We perform a comparative analysis of information transfer between industry sectors of the Chinese and the USA stock markets, using daily sector indices for the period from 2000 to 2017. The information flow from one sector to another is measured by the transfer entropy of the daily returns of the two sector indices. We find that the most active sector in information exchange (i.e., the largest total information inflow and outflow) is the non-bank financial sector in the Chinese market and the technology sector in the USA market. This is consistent with the role of the non-bank sector in corporate financing in China and the impact of technological innovation in the USA. In each market, the most active sector is also the largest information sink, with the largest net information inflow (i.e., inflow minus outflow). In contrast, we identify that the main information source is the bank sector in the Chinese market and the energy sector in the USA market. In the case of China, this is due to the importance of net bank lending as a signal of corporate activity and the role of energy pricing in affecting corporate profitability. There are sectors such as the real estate sector that could be an information sink in one market but an information source in the other, showing the complex behavior of different markets. Overall, these findings show that stock markets are more synchronized, or ordered, during periods of turmoil than during periods of stability.
- Published
- 2020
48. Information Dynamics Analysis: A new approach based on Sparse Identification of Linear Parametric Models*
- Author
-
Yuri Antonacci, Luca Faes, and Laura Astolfi
- Subjects
Multivariate statistics ,Computer science ,Entropy ,Gaussian ,0206 medical engineering ,Normal Distribution ,02 engineering and technology ,01 natural sciences ,LASSO regression ,010305 fluids & plasmas ,symbols.namesake ,information Transfer ,State Space models ,Granger causality ,Lasso (statistics) ,0103 physical sciences ,Statistics::Methodology ,State space ,Least-Squares Analysis ,Shrinkage ,Sparse matrix ,Electroencephalography ,020601 biomedical engineering ,Autoregressive model ,state space model ,Parametric model ,Ordinary least squares ,Linear Models ,symbols ,Transfer entropy ,Algorithm ,Information dynamics analysis - Abstract
The framework of information dynamics makes it possible to quantify different aspects of the statistical structure of multivariate processes reflecting the temporal dynamics of a complex network. The information transfer from one process to another can be quantified through Transfer Entropy, and under the assumption of joint Gaussian variables it is strictly related to the concept of Granger Causality (GC). According to the most recent developments in the field, the computation of GC entails representing the processes through a Vector Autoregressive (VAR) model and a state space (SS) model typically identified by means of Ordinary Least Squares (OLS). In this work, we propose a new identification approach for the VAR and SS models, based on the Least Absolute Shrinkage and Selection Operator (LASSO), that has the advantages of maintaining good accuracy even when few data samples are available and of yielding as output a sparse matrix of estimated information transfer. The performance of LASSO identification was first tested and compared to that of OLS in a simulation study and then validated on real electroencephalographic (EEG) signals recorded during a motor imagery task. Both studies indicated that LASSO, under conditions of data paucity, provides better performance in terms of network structure. Given the general nature of the model, this work opens the way to the use of LASSO regression for the computation of several measures of information dynamics currently in use in computational neuroscience.
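The stated link between transfer entropy and Granger causality for jointly Gaussian variables, TE = (1/2) ln(sigma^2_restricted / sigma^2_full), can be checked with a plain OLS VAR(1) fit; the coupled AR(1) system below is an illustrative assumption, and no LASSO step is included:

```python
import random
from math import log

def lstsq_resid_var(X, y):
    """Residual variance of an OLS fit y ~ X, via the normal equations
    solved with Gaussian elimination (rows of X include the intercept)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                 # elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for i in range(k - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    resid = [yi - sum(bi * xi for bi, xi in zip(beta, r)) for r, yi in zip(X, y)]
    return sum(e * e for e in resid) / len(resid)

def gaussian_te(source, target):
    """For jointly Gaussian processes, TE(source -> target) equals half the
    linear Granger causality: 0.5 * ln(var_restricted / var_full)."""
    ynext = target[1:]
    Xr = [[1.0, yp] for yp in target[:-1]]
    Xf = [[1.0, yp, xp] for yp, xp in zip(target[:-1], source[:-1])]
    return 0.5 * log(lstsq_resid_var(Xr, ynext) / lstsq_resid_var(Xf, ynext))

# Illustrative coupled AR(1) pair: x drives y, y does not drive x.
random.seed(5)
x, y = [0.0], [0.0]
for _ in range(4999):
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.5 * y[-1] + 0.6 * x[-2] + random.gauss(0, 1))
te_xy = gaussian_te(x, y)   # clearly positive
te_yx = gaussian_te(y, x)   # near zero
```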
- Published
- 2020
49. Transfer Entropy as a Measure of Brain Connectivity: A Critical Analysis With the Help of Neural Mass Models
- Author
-
Mauro Ursino, Giulia Ricci, and Elisa Magosso
- Subjects
0301 basic medicine ,neural mass models ,Information transfer ,Multivariate statistics ,causality ,Computer science ,Neuroscience (miscellaneous) ,Bivariate analysis ,Measure (mathematics) ,non-linear neural phenomena ,lcsh:RC321-571 ,Correlation ,03 medical and health sciences ,Cellular and Molecular Neuroscience ,bivariate transfer entropy ,0302 clinical medicine ,information transfer ,Spurious relationship ,lcsh:Neurosciences. Biological psychiatry. Neuropsychiatry ,neural mass model ,excitatory and inhibitory synapse ,Original Research ,excitatory and inhibitory synapses ,Trentool software ,030104 developmental biology ,connectivity ,A priori and a posteriori ,Transfer entropy ,Algorithm ,030217 neurology & neurosurgery ,Neuroscience - Abstract
Objective: Assessing brain connectivity from electrophysiological signals is of great relevance in neuroscience, but results are still debated and depend crucially on how connectivity is defined and on the mathematical instruments utilized. The aim of this work is to assess the capacity of bivariate Transfer Entropy (TE) to evaluate connectivity, using data generated from simple neural mass models of connected Regions of Interest (ROIs). Approach: Signals simulating mean field potentials were generated assuming two, three or four ROIs, connected via excitatory or bisynaptic inhibitory links. We investigated whether the presence of a statistically significant connection can be detected and whether connection strength can be quantified. Main Results: Results suggest that TE can reliably estimate the strength of connectivity if neural populations work in their linear regions, and if the epoch lengths are longer than 10 s. In the case of multivariate networks, some spurious connections can emerge (i.e., a statistically significant TE even in the absence of a true connection); however, quite a good correlation between TE and synaptic strength is still preserved. Moreover, TE appears more robust for distal regions (longer delays) than for proximal regions (smaller delays): approximate a priori knowledge of this delay can improve the procedure. Finally, non-linear phenomena affect the assessment of connectivity, since they may significantly reduce TE estimation: information transmission between two ROIs may be weak, due to non-linear phenomena, even if a strong causal connection is present. Significance: Changes in functional connectivity during different tasks or brain conditions might not always reflect a true change in the connecting network, but rather a change in information transmission. A limitation of the work is the use of bivariate TE. In perspective, the use of multivariate TE can improve estimation and reduce some of the problems encountered in the present study.
- Published
- 2020
50. Multiscale transfer entropy: Measuring information transfer on multiple time scales
- Author
-
Xuemei Li, Yupeng Sun, Xiaojun Zhao, and Pengjian Shang
- Subjects
Numerical Analysis ,Information transfer ,Series (mathematics) ,Computer science ,Applied Mathematics ,Estimator ,01 natural sciences ,010305 fluids & plasmas ,Nonlinear system ,Autoregressive model ,Modeling and Simulation ,0103 physical sciences ,Transfer entropy ,Statistical physics ,010306 general physics ,Spurious relationship ,Autoregressive fractionally integrated moving average - Abstract
In this paper, we propose a novel multiscale transfer entropy (MTE) approach to quantify the information transfer of time series on multiple time scales. The MTE combines the advantages of both multiscale analysis and transfer entropy, and is able to identify directional, dynamical and scale-dependent information flows. To minimize finite size effects and to avoid spurious detection of causality, we resort to a refined time-delayed multiscale transfer entropy (TMTE) estimator defined on overlapping coarse-graining. We also suggest several extensions of the TMTE, including an effective TMTE and a net TMTE. Synthetic simulations including linear vector autoregressive (VAR) models, long-range correlated ARFIMA processes, and nonlinear Rössler systems are analyzed, and an application of the TMTE to daily closing prices and trading volumes of the S&P 500 is studied.
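A simplified sketch of the multiscale idea: non-overlapping coarse-graining, median binarization, and a plug-in transfer entropy at each scale. This stand-in omits the paper's refined overlapping, time-delayed estimator, and the lag-1 Gaussian toy data are an assumption:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(source, target):
    """Plug-in TE(source -> target) in bits, history length 1."""
    triples = list(zip(target[1:], target[:-1], source[:-1]))
    n = len(triples)
    p_all = Counter(triples)
    p_yx = Counter((y, x) for _, y, x in triples)
    p_ny = Counter((yn, y) for yn, y, _ in triples)
    p_y = Counter(y for _, y, _ in triples)
    return sum((c / n) * log2(c * p_y[y] / (p_yx[(y, x)] * p_ny[(yn, y)]))
               for (yn, y, x), c in p_all.items())

def coarse_grain(series, scale):
    """Non-overlapping averages of length `scale` (Costa-style coarse-graining)."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def multiscale_te(source, target, scales):
    """TE at each time scale, after coarse-graining and median binarization."""
    out = {}
    for s in scales:
        xs, ys = coarse_grain(source, s), coarse_grain(target, s)
        mx = sorted(xs)[len(xs) // 2]
        my = sorted(ys)[len(ys) // 2]
        out[s] = transfer_entropy([v > mx for v in xs], [v > my for v in ys])
    return out

# Illustrative fast coupling: y copies x with lag 1, so the transfer
# is strongest at scale 1 and is diluted by coarse-graining.
random.seed(6)
x = [random.gauss(0, 1) for _ in range(4000)]
y = [0.0] + x[:-1]
mte = multiscale_te(x, y, scales=(1, 2, 4))
```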
- Published
- 2018