Transfer entropy and transient limits of computation

Mikhail Prokopenko*, Joseph T. Lizier

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

43 Citations (Scopus)
21 Downloads (Pure)


Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.
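The abstract builds on Schreiber's transfer entropy, which measures how much a source process improves prediction of a target process beyond the target's own past. As a minimal illustration (not the paper's method), the following sketch gives a plug-in estimate for discrete sequences with history length k = 1; the function name and toy data are purely illustrative.

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target, base=2):
    """Plug-in estimate of Schreiber's transfer entropy T_{source -> target}
    with history length k = 1:
        T = sum_{x+,x,y} p(x+, x, y) * log[ p(x+ | x, y) / p(x+ | x) ]
    where x+ = target[n+1], x = target[n], y = source[n]."""
    triples = list(zip(target[1:], target[:-1], source[:-1]))
    n = len(triples)
    c_xyz = Counter(triples)                         # counts of (x+, x, y)
    c_xy = Counter((x, y) for _, x, y in triples)    # counts of (x, y)
    c_px = Counter((xp, x) for xp, x, _ in triples)  # counts of (x+, x)
    c_x = Counter(x for _, x, _ in triples)          # counts of x
    te = 0.0
    for (xp, x, y), c in c_xyz.items():
        # p(x+,x,y) * log[ p(x+,x,y) p(x) / (p(x,y) p(x+,x)) ]; the 1/n factors cancel
        te += (c / n) * math.log(c * c_x[x] / (c_xy[(x, y)] * c_px[(xp, x)]), base)
    return te

# Toy check: the target copies the source with a one-step delay, so the source
# fully determines the target's next state and TE approaches H(source) = 1 bit,
# while TE in the reverse direction stays near zero.
random.seed(0)
y = [random.randint(0, 1) for _ in range(10000)]
x = [0] + y[:-1]
te_coupled = transfer_entropy(y, x)
te_reverse = transfer_entropy(x, y)
```

In the Landauer reading advanced by the abstract, raising this quantity by one bit must be accompanied by a heat flow of at least k_B T ln 2.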

Original language: English
Article number: 5394
Pages (from-to): 1-7
Number of pages: 7
Journal: Scientific Reports
Publication status: Published - 23 Jun 2014

Bibliographical note

Copyright the Author(s) 2014. The original publication is available at DOI: 10.1038/srep05394. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

