1999, Volume 2, Number 4, pp.1--25
Entropy, information, and relative criteria of order for states of open
systems are considered. In communication theory, two definitions of the
concept of "information" are known. One of them coincides in form with the
Boltzmann entropy. The other defines information as the difference between
the unconditional and conditional entropies. In the present work the latter
is used to define the information about states of open systems at various
values of the controlling (governing) parameter.
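In standard communication-theory notation (a sketch for orientation; the symbols are generic and not taken from the paper), the second definition reads

$$ I = H(X) - H(X \mid Y), $$

where $H(X)$ is the unconditional entropy of the source and $H(X \mid Y)$ is its conditional entropy given the received signal $Y$.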
Two kinds of open systems are considered. Systems of the first kind are
those that are in the equilibrium state at zero value of the controlling
parameter. The information about the equilibrium state is equal to zero.
During self-organization, as the system moves away from the equilibrium
state, the information increases. For open systems of this kind, a
conservation law for the sum of information and entropy is proved for all
values of the controlling parameter.
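Schematically (an illustrative reading in generic notation not fixed by the abstract), if $S(a)$ and $I(a)$ denote the entropy and the information at the value $a$ of the controlling parameter, with $I(0)=0$ in equilibrium, the conservation law can be written as

$$ S(a) + I(a) = S(0) = \mathrm{const}, $$

so any gain of information during self-organization is compensated by an equal decrease of entropy.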
In open systems of the second kind, equilibrium is impossible. For them the
concept of a "norm of chaoticity" is introduced. It allows one to consider
two kinds of self-organization processes and to give the corresponding
definitions of information.
The treatment is illustrated with a number of examples of physical systems,
both classical and quantum. An example of a medico-biological system is
also considered.