Computational Modelling Group

24th November 2009, Imperial College

Information erasure lurking behind measures of complexity

Dr Karoline Wiesner
University of Bristol

Web page
https://www8.imperial.ac.uk/content/dav/ad/workspaces/complexityscience/Abstract_Wiesner.pdf
Submitter
Hans Fangohr

Complex systems are found in most branches of science. It is still debated how best to quantify their complexity, and to what end. One prominent measure of complexity (the statistical complexity) has an operational meaning in terms of the amount of resources needed to forecast a system's behaviour. Another (the effective measure complexity, also known as excess entropy) is a measure of the mutual information stored in the system proper. We show that for any given system the two measures differ by the amount of information erased during forecasting. We interpret the difference as the inefficiency of a given model. We find a bound on the ratio of the two measures, defined as information-processing efficiency, in analogy to the second law of thermodynamics. This new link between two prominent measures of complexity provides a quantitative criterion for good models of complex systems, namely those with little information erasure.
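The gap between the two measures can be seen numerically on a standard toy example. The sketch below (an illustration, not from the talk) uses the Golden Mean process, a binary process in which the word "00" never occurs. Its minimal predictive model has two states, so the statistical complexity C_mu is the entropy of the stationary state distribution, while the excess entropy E is estimated from block entropies as H(L) - L*h. The state names, transition matrices, and block length L=12 are choices made here for illustration.

```python
from itertools import product
from math import log2

# Labeled transition matrices of the Golden Mean process's two-state
# predictive model (binary process in which "00" never occurs).
# States: A = 0, B = 1.
# T[x][i][j] = Pr(emit symbol x and move to state j | current state i)
T = {
    0: [[0.0, 0.5], [0.0, 0.0]],   # '0': A -> B with prob 1/2; forbidden from B
    1: [[0.5, 0.0], [1.0, 0.0]],   # '1': A -> A with prob 1/2; B -> A with prob 1
}
pi = [2/3, 1/3]                    # stationary distribution over the states

def word_prob(w):
    """Pr(w) = pi . T[w_1] ... T[w_L] . 1 (sum over final states)."""
    v = pi[:]
    for x in w:
        v = [sum(v[i] * T[x][i][j] for i in range(2)) for j in range(2)]
    return sum(v)

def block_entropy(L):
    """Shannon entropy H(L) of length-L words, in bits."""
    ps = (word_prob(w) for w in product((0, 1), repeat=L))
    return -sum(p * log2(p) for p in ps if p > 0)

L = 12
h = block_entropy(L) - block_entropy(L - 1)   # entropy-rate estimate (-> 2/3 bit)
E = block_entropy(L) - L * h                  # excess entropy estimate
C = -sum(p * log2(p) for p in pi)             # statistical complexity

print(f"C_mu = {C:.4f} bits, E = {E:.4f} bits, C_mu - E = {C - E:.4f} bits")
```

For this process C_mu is about 0.918 bits while E is about 0.252 bits; the difference, about 2/3 bit, is the information the model must discard at each forecasting step, in line with the abstract's claim that the two measures differ by the amount of information erased.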