ISSN:
1089-7682
Source:
AIP Digital Archive
Topics:
Physics
Notes:
That the topological entropy, h_Tμ, of a C^{1<r≤2} diffeomorphism, φ: M → M, of a surface, M, upon which invariant measure(s) μ are concentrated, varies as the product of its average leading Lyapunov characteristic exponent, λ̄⁺_μ, and the Hausdorff dimension of its support, d_μ, was proven by Pesin [Russ. Math. Surveys 32, 55–114 (1977)] for nonuniformly partially hyperbolic systems, and by Ledrappier and Young [Ergod. Theor. Dyn. Syst. 2, 109–123 (1982)] and Manning [Ergod. Theor. Dyn. Syst. 1, 451–459 (1981)] for uniformly hyperbolic (Axiom A) diffeomorphisms. When considered in conjunction with the post-Shannon information-encoding theorems of Adler [Trans. Am. Math. Soc. 114, 309–319 (1965); Mem. Am. Math. Soc. No. 219 (1979)] and others, this suggests a way to differentiate equal-entropy behaviors in systems with varying patterns of dynamical behavior. Here we show this relation to be useful in the quantitative discrimination among the behaviors of abstract neuronal models and two real, finite-time, partially and nonuniformly hyperbolic, brain-related dynamical systems. We observe a trade-off in finite time between two competing dynamical processes: jittery sticking (tending to increase d_μ) and convective escaping (more prominently incrementing λ̄⁺_μ). In finite-time systems, these changes in combination can statistically conserve the dynamical entropy, h_Tμ, while altering the Lévy characteristic exponent, α (describing the tail of the density distribution of observables, ρ(x) ∼ exp(−γ|x|^α), 1 ≤ α ≤ 2), and the Mandelbrot–Hurst exponent, 0 < H* < 1, such that H* > 0.5 implicates sequential correlation and H* < 0.5 sequential anticorrelation. When the relation h_Tμ = λ̄⁺_μ · d_μ fails, the way it does so provides information about the system. © 1997 American Institute of Physics.
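The entropy relation h_Tμ = λ̄⁺_μ · d_μ can be checked numerically on a textbook benchmark that is not among the paper's systems: the fully chaotic logistic map x → 4x(1−x), chosen here only because its invariant measure is known to be absolutely continuous on [0, 1] (so d_μ = 1) and its entropy and Lyapunov exponent both equal ln 2. A minimal sketch, assuming NumPy; the function name and parameters are illustrative:

```python
import numpy as np

def logistic_lyapunov(n=100_000, x0=0.3):
    """Estimate the leading Lyapunov exponent of x -> 4x(1-x)
    as the orbit average of ln|f'(x)|, with f'(x) = 4 - 8x."""
    x = x0
    total = 0.0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        total += np.log(abs(4.0 - 8.0 * x))
    return total / n

lam = logistic_lyapunov()
# With d_mu = 1, the relation predicts h = lam * 1 = ln 2 ≈ 0.693 nats.
print(lam, np.log(2))
```

Since the support has full dimension here, the estimated exponent directly gives the entropy; for the paper's finite-time, nonuniformly hyperbolic systems, d_μ must be estimated separately and the product compared against an independent entropy estimate.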
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1063/1.166241