Intensive entropic non-triviality measure
Saved in:
Published: 2004
Subjects:
Online access: https://bibliotecadigital.exactas.uba.ar/collection/paper/document/paper_03784371_v334_n1-2_p119_Lamberti http://hdl.handle.net/20.500.12110/paper_03784371_v334_n1-2_p119_Lamberti
Contributed by:
Summary: We discuss a way of characterizing probability distributions, complementing that provided by the celebrated notion of information measure, with reference to a measure of complexity that we call a "nontriviality measure". Our starting point is the "LMC" measure of complexity advanced by López-Ruiz et al. (Phys. Lett. A 209 (1995) 321) and its analysis by Anteneodo and Plastino (Phys. Lett. A 223 (1997) 348). An improvement on some of its troublesome characteristics is thereby achieved. Basically, we replace the Euclidean distance to equilibrium by the Jensen-Shannon divergence. The resulting measure (i) is an intensive quantity and (ii) distinguishes between different degrees of periodicity. We apply the "cured" measure to the logistic map so as to clearly exhibit its advantages. © 2004 Elsevier B.V. All rights reserved.
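The summary describes an LMC-style complexity in which the Euclidean disequilibrium is replaced by the Jensen-Shannon divergence to the uniform (equilibrium) distribution. A minimal sketch of that construction is given below; it is an illustration only, not the authors' exact normalization, and the product form `JSD × normalized entropy`, the bin count, and the choice of logistic-map parameters are assumptions for the example.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; empty bins contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence between two distributions on the same support."""
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

def nontriviality(p):
    """LMC-style complexity with the Euclidean disequilibrium replaced by the
    Jensen-Shannon divergence to the uniform distribution (illustrative form)."""
    n = len(p)
    uniform = np.full(n, 1.0 / n)
    h_norm = shannon_entropy(p) / np.log2(n)  # normalized entropy in [0, 1]
    return jensen_shannon(p, uniform) * h_norm

def logistic_histogram(r, n_bins=64, n_iter=20000, x0=0.1, burn_in=1000):
    """Histogram of a logistic-map orbit x_{t+1} = r * x_t * (1 - x_t)."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1 - x)
    xs = np.empty(n_iter)
    for i in range(n_iter):
        x = r * x * (1 - x)
        xs[i] = x
    hist, _ = np.histogram(xs, bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()

# Chaotic regime (r = 4) versus a periodic regime (r = 3.2):
p_chaotic = logistic_histogram(r=4.0)
p_periodic = logistic_histogram(r=3.2)
print(nontriviality(p_chaotic), nontriviality(p_periodic))
```

Note that the measure vanishes at both extremes, as a nontriviality measure should: a uniform distribution has zero divergence from equilibrium, and a delta-like distribution has zero entropy.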