TY - JOUR
T1 - Transfer entropy in continuous time, with applications to jump and neural spiking processes
AU - Spinney, Richard E.
AU - Prokopenko, Mikhail
AU - Lizier, Joseph T.
PY - 2017/3/31
Y1 - 2017/3/31
N2 - Transfer entropy has been used to quantify the directed flow of information between source and target variables in many complex systems. While transfer entropy was originally formulated in discrete time, in this paper we provide a framework for considering transfer entropy in continuous time systems, based on Radon-Nikodym derivatives between measures of complete path realizations. To describe the information dynamics of individual path realizations, we introduce the pathwise transfer entropy, the expectation of which is the transfer entropy accumulated over a finite time interval. We demonstrate that this formalism permits an instantaneous transfer entropy rate. These properties are analogous to the behavior of physical quantities defined along paths such as work and heat. We use this approach to produce an explicit form for the transfer entropy for pure jump processes, and highlight the simplified form in the specific case of point processes (frequently used in neuroscience to model neural spike trains). Finally, we present two synthetic spiking neuron model examples to exhibit the pertinent features of our formalism, namely, that the information flow for point processes consists of discontinuous jump contributions (at spikes in the target) interrupting a continuously varying contribution (relating to waiting times between target spikes). Numerical schemes based on our formalism promise significant benefits over existing strategies based on discrete time formalisms.
AB - Transfer entropy has been used to quantify the directed flow of information between source and target variables in many complex systems. While transfer entropy was originally formulated in discrete time, in this paper we provide a framework for considering transfer entropy in continuous time systems, based on Radon-Nikodym derivatives between measures of complete path realizations. To describe the information dynamics of individual path realizations, we introduce the pathwise transfer entropy, the expectation of which is the transfer entropy accumulated over a finite time interval. We demonstrate that this formalism permits an instantaneous transfer entropy rate. These properties are analogous to the behavior of physical quantities defined along paths such as work and heat. We use this approach to produce an explicit form for the transfer entropy for pure jump processes, and highlight the simplified form in the specific case of point processes (frequently used in neuroscience to model neural spike trains). Finally, we present two synthetic spiking neuron model examples to exhibit the pertinent features of our formalism, namely, that the information flow for point processes consists of discontinuous jump contributions (at spikes in the target) interrupting a continuously varying contribution (relating to waiting times between target spikes). Numerical schemes based on our formalism promise significant benefits over existing strategies based on discrete time formalisms.
UR - http://www.scopus.com/inward/record.url?scp=85016508082&partnerID=8YFLogxK
U2 - 10.1103/PhysRevE.95.032319
DO - 10.1103/PhysRevE.95.032319
M3 - Article
C2 - 28415203
AN - SCOPUS:85016508082
SN - 2470-0045
VL - 95
SP - 1
EP - 21
JO - Physical Review E
JF - Physical Review E
IS - 3
M1 - 032319
ER -
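
As a rough sketch of the central quantities named in the abstract, under assumed notation (the path measures and symbols below are illustrative, not drawn from the paper itself): writing $\mathcal{P}_{X \| Y}$ for the target path measure causally conditioned on the source path and $\mathcal{P}_{X}$ for the marginal target path measure, the pathwise transfer entropy along a single target realization $x_{[0,t]}$ is the log Radon-Nikodym derivative between these measures, its expectation gives the transfer entropy accumulated over $[0,t]$, and its time derivative gives the instantaneous rate:

\mathcal{T}_{Y \to X}\bigl[x_{[0,t]}\bigr] \;=\; \ln \frac{\mathrm{d}\mathcal{P}_{X \| Y}}{\mathrm{d}\mathcal{P}_{X}}\bigl[x_{[0,t]}\bigr],
\qquad
T_{Y \to X}(t) \;=\; \mathbb{E}\bigl[\mathcal{T}_{Y \to X}\bigr],
\qquad
\dot{T}_{Y \to X}(t) \;=\; \frac{\mathrm{d}}{\mathrm{d}t}\, T_{Y \to X}(t).

For point processes, per the abstract, the pathwise quantity decomposes into discontinuous contributions at target spikes plus a continuously varying contribution over the waiting times between them.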