Abstract
We propose the divergences from statistics and information theory (IT) as a set of separation indices between signal and noise in stochastic nonlinear dynamical systems (SNDS). The divergences provide a more informative alternative to the signal-to-noise ratio (SNR) and have the advantage of being applicable to virtually any kind of stochastic system. Moreover, divergences are intimately connected to various fundamental limits in IT. Using the properties of divergences, we show that the classical stochastic resonance (SR) curve can be interpreted as the performance of a nonoptimal, or mismatched, detector applied to the output of an SNDS. Indeed, for a prototype double-well system with forcing in the form of white Gaussian noise plus a possibly embedded signal, the entire information loss can be attributed to this mismatch; an optimal detection procedure (for the signal) gives the same performance when based on the output as when based on the input of the system. More generally, it follows that, when characterizing signal-noise separation (or system performance) of SNDS in terms of criteria that do not correspond to IT limits, the choice of criterion can be crucial. The indicated figure of merit will then not be universal and will be relevant only to some family of applications, such as the classical (narrow-band SNR) SR criterion, which is relevant for narrow-band post-processing. We illustrate the theory using simple SNDS excited by both wide- and narrow-band signals; however, we stress that the results are applicable to a much larger class of signals and systems.
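The prototype double-well system mentioned above can be sketched numerically. The following is a minimal illustration (not the authors' code) of Euler-Maruyama integration of the standard overdamped double-well Langevin equation, dx/dt = x - x^3 + A cos(ωt) + √(2D) ξ(t), with ξ unit-intensity white Gaussian noise; the parameter values A, ω, D, and the step size dt are arbitrary choices for demonstration only.

```python
import numpy as np

def simulate_double_well(A=0.3, omega=0.1, D=0.2, dt=0.01,
                         n_steps=200_000, seed=0):
    """Euler-Maruyama simulation of the overdamped double-well SNDS
    dx/dt = x - x**3 + A*cos(omega*t) + sqrt(2*D)*xi(t).

    A subthreshold periodic signal (A below the static tilting
    threshold 2/(3*sqrt(3)) ~ 0.385) is embedded in Gaussian noise
    of intensity D; the noise enables hopping between the wells.
    """
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = 1.0  # start in the right-hand well, near x = +1
    for k in range(1, n_steps):
        t = k * dt
        drift = x[k - 1] - x[k - 1] ** 3 + A * np.cos(omega * t)
        # Deterministic drift step plus Gaussian noise increment
        x[k] = x[k - 1] + drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
    return x

trajectory = simulate_double_well()
```

For moderate noise intensity the trajectory hops between the wells near x = -1 and x = +1; quantities such as the narrow-band output SNR at ω, or a divergence between the output distributions with and without the embedded signal, could then be estimated from such trajectories.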
Received 2 June 2000
DOI: https://doi.org/10.1103/PhysRevE.63.011107
©2000 American Physical Society