The goal of this work is to study the fluctuations the eye is subjected to, from the point of view of noise-enhanced
processing. In this paper we consider a basic model of the retina: a regular sampler subjected to spatial and temporal
fluctuations that model the random sampling and the eye tremor, respectively. We also take into account the
filtering performed by the photoreceptors, and we focus on a stochastic model of the natural scene. To quantify the
effect of the noise, we study the correlation coefficient between the signal acquired by a given photoreceptor
and the value of a given point of the scene the eye is looking at. We then show, on academic examples as well as on a more
realistic case, that the fluctuations which affect the retina can induce noise-enhanced processing effects. Finally,
we interpret why this effect is possible. In particular, we interpret the microsaccadic movement of the retina as a
stochastic control.
KEYWORDS: Neurons, Monte Carlo methods, Stochastic processes, Sensors, Interference (communication), Data processing, Error analysis, Signal processing, Quantization, Psychophysics
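The correlation enhancement described in the abstract can be illustrated with a toy model of our own (not the paper's exact setup): a stationary Gaussian scene with squared-exponential covariance, a photoreceptor nominally at the origin, Gaussian positional tremor, and a scene point at a mismatched distance `d`. The correlation coefficient then has a closed form, and a strictly positive tremor amplitude maximizes it whenever `d` exceeds the scene correlation length:

```python
import numpy as np

# Toy model (our assumption): stationary Gaussian scene X with unit variance
# and covariance rho(h) = exp(-h^2 / (2 ell^2)).  A photoreceptor nominally at
# the origin reads X(eps), where the tremor eps ~ N(0, sigma^2).  Its
# correlation with the scene value X(d) at a mismatched point d follows by
# Gaussian integration of rho(d - eps) over eps.
def corr_with_tremor(d, ell, sigma):
    u = ell**2 + sigma**2
    return (ell / np.sqrt(u)) * np.exp(-d**2 / (2.0 * u))

ell, d = 1.0, 2.0                        # correlation length, mismatch distance
sigmas = np.linspace(0.0, 4.0, 401)
corr = corr_with_tremor(d, ell, sigmas)
best = sigmas[np.argmax(corr)]
print(best)                              # near sqrt(d^2 - ell^2) = sqrt(3)
print(corr.max() > corr_with_tremor(d, ell, 0.0))  # tremor helps when d > ell
```

Maximizing the closed form over the tremor variance gives the optimum at sigma^2 = d^2 - ell^2, i.e. the noise-enhanced regime exists exactly when the scene point is farther than one correlation length from the sampler.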
Pooling networks are composed of independent noisy neurons that all process the same information in
parallel. The outputs of the neurons are summed into a single output by a fusion center. In this paper we study
such a network in a detection or discrimination task. It is shown that if the network is not properly matched to
the symmetries of the detection problem, the internal noise may at least partially restore some form of optimality.
This is shown both for (i) noisy threshold model neurons and (ii) Poisson neuron models. We also study an
optimized version of the network, mimicking the notion of excitation/inhibition. We show that, when properly
tuned, the network may reach optimality in a very robust way. Furthermore, we find in this optimization that
some neurons remain inactive. The pattern of inactivity is organized in a strange branching structure, the
meaning of which remains to be elucidated.
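A minimal Monte Carlo sketch of such a pooling network (our illustration, assuming Gaussian internal noise and a common threshold) shows the restoration effect: with a threshold mismatched so that both hypotheses are subthreshold, the noiseless network cannot separate them at all, while internal noise makes the fusion-centre output informative:

```python
import numpy as np

rng = np.random.default_rng(0)

def pooled_count(x, n_neurons, theta, noise_std, n_trials):
    """Mean fusion-centre output: number of neurons whose noisy input
    x + internal noise exceeds the common threshold theta."""
    noise = noise_std * rng.standard_normal((n_trials, n_neurons))
    return (x + noise > theta).sum(axis=1).mean()

# Threshold mismatched to the hypotheses x = -0.2 and x = +0.2:
# both signals are subthreshold, so a noiseless network outputs 0 for both.
theta, n = 1.0, 63
for noise_std in (0.0, 0.7):
    y_minus = pooled_count(-0.2, n, theta, noise_std, 20_000)
    y_plus = pooled_count(+0.2, n, theta, noise_std, 20_000)
    print(noise_std, y_minus, y_plus)  # separation appears only with noise on
```

The mean counts under the two hypotheses coincide (both zero) without internal noise and separate clearly once the noise is switched on, which is the mechanism behind the partial restoration of optimality.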
Neural circuit architecture is a fundamental characteristic of the brain, and how architecture is bound to biological
function is still an open question. Some neuronal geometries seen in the retina or the cochlea are intriguing:
information is processed in parallel by several entities, as in the "pooling" networks which have recently drawn the
attention of signal processing scientists. These systems indeed exhibit the noise-enhanced processing effect, which
is also actively discussed in the neuroscience community at the neuron scale. The aim of our project is to use
in-vitro ordered neuron networks as living paradigms to test ideas coming from the computational sciences. The
various technological hurdles that have to be overcome are enumerated, and the first results are presented. A neuron
is a polarized cell, with an emitting axon and a receiving dendritic tree. We present how soma confinement
and axon differentiation can be induced by surface functionalization techniques. The recording of large neuron
networks, ordered or not, is also detailed, and biological signals are shown. The main difficulty in accessing neural noise
in the case of weakly connected networks grown on microelectrode arrays is explained. This opens the door to
a new detection technology suitable for sub-cellular analysis and stimulation, whose development will constitute
the next step of this project.
Pooling networks of noisy threshold devices are good models for natural networks (e.g. neural networks in some
parts of sensory pathways in vertebrates, networks of mossy fibers in the hippocampus, . . . ) as well as for
artificial networks (e.g. digital beamformers for sonar arrays, flash analog-to-digital converters, rate-constrained
distributed sensor networks, . . . ). Such pooling networks exhibit the curious effect of suprathreshold stochastic
resonance, which means that an optimal stochastic control of the network exists.
Recently, some progress has been made in understanding pooling networks of identical, but independently
noisy, threshold devices. One aspect concerns the behavior of information processing in the asymptotic limit of
large networks, which is a limit of high relevance for neuroscience applications. The mutual information between
the input and the output of the network has been evaluated, and its extremization has been performed. The
aim of the present work is to extend these asymptotic results to study the more general case when the threshold
values are no longer identical. In this situation, the values of thresholds can be described by a density, rather
than by exact locations. We present a derivation of Shannon's mutual information between the input and output
of these networks. The result is an approximation that relies on a weak version of the law of large numbers, and a
version of the central limit theorem. Optimization of the mutual information is then discussed.
KEYWORDS: Probability theory, Sensors, Signal processing, Interference (communication), Sensor performance, Neurons, Signal detection, Neural networks, Quantization, Data conversion
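For a small network, the mutual information that the asymptotic theory approximates can be computed exactly. The sketch below (our construction, assuming Gaussian device noise, an equiprobable binary input, and thresholds drawn once from a Gaussian density) evaluates I(X;Y) where Y is the count of firing devices; with non-identical thresholds the count is Poisson-binomial, so its pmf is built by convolution:

```python
import math, random

def q(z):                                   # Gaussian tail probability
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def count_pmf(ps):
    """PMF of the number of firing devices (Poisson-binomial), by convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, w in enumerate(pmf):
            new[k] += w * (1.0 - p)
            new[k + 1] += w * p
        pmf = new
    return pmf

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0.0)

def mutual_info(thresholds, a, sigma):
    """I(X;Y) for equiprobable X in {-a, +a}; device i fires when
    x + N(0, sigma^2) exceeds thresholds[i]; Y is the number of firings."""
    pmf_p = count_pmf([q((t - a) / sigma) for t in thresholds])
    pmf_m = count_pmf([q((t + a) / sigma) for t in thresholds])
    pmf_y = [0.5 * (u + v) for u, v in zip(pmf_p, pmf_m)]
    return entropy(pmf_y) - 0.5 * (entropy(pmf_p) + entropy(pmf_m))

random.seed(1)
identical = [1.0] * 15                                # all thresholds equal
spread = [random.gauss(1.0, 0.5) for _ in range(15)]  # thresholds from a density
for sigma in (0.2, 0.5, 1.0):
    print(sigma, mutual_info(identical, 0.5, sigma), mutual_info(spread, 0.5, sigma))
```

With identical subthreshold levels, I(X;Y) is near zero for vanishing noise and rises to an interior maximum, the suprathreshold-stochastic-resonance signature; the threshold-density case can then be compared against this baseline.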
The goal of this paper is the study of suboptimal quantizer-based detectors. We place ourselves in the situation where internal noise is present in the hardware implementation of the thresholds. We hence focus on the study of random quantizers, showing that they exhibit the noise-enhanced detection property. The random quantizers studied are of two types: time-invariant, when sampled once for all the observations, and time-variant, when resampled at each time step. They are built by adding fluctuations to the thresholds of a uniform quantizer. If the uniform quantizer is matched to the symmetry of the detection problem, adding fluctuations deteriorates the performance. If the uniform quantizer is mismatched, adding noise can improve the performance. Furthermore, we show that the time-variant quantizer is better than the time-invariant quantizer, and that both are more robust than the optimal quantizer. Finally, we introduce the adapted random quantizer, for which the levels are chosen so as to approximate the likelihood ratio.
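Both regimes described above can be reproduced in a one-bit sketch (our own simplification: a single threshold, Gaussian observation noise, Gaussian dither resampled at each time, and the deflection of the exceedance count as a proxy for detection performance). Because independent Gaussian dither simply inflates the effective noise standard deviation, the exceedance probabilities stay in closed form:

```python
import math

def qtail(z):                                # Gaussian tail probability
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def deflection(theta, s, dither_std):
    """Deflection of the exceedance count for H0: x_t = n_t vs H1: x_t = s + n_t,
    n_t ~ N(0,1), with a one-bit quantizer whose threshold theta receives
    Gaussian dither resampled at each time (the time-variant random quantizer)."""
    eff = math.sqrt(1.0 + dither_std**2)     # noise and dither merge into one Gaussian
    p0, p1 = qtail(theta / eff), qtail((theta - s) / eff)
    pbar = 0.5 * (p0 + p1)
    return (p1 - p0) ** 2 / (pbar * (1.0 - pbar))

s = 1.0
print(deflection(3.0, s, 0.0), deflection(3.0, s, 2.0))  # mismatched: dither helps
print(deflection(0.5, s, 0.0), deflection(0.5, s, 2.0))  # matched: dither hurts
```

With the threshold badly mismatched (theta = 3 for a unit signal), dithering raises the deflection; with the threshold at the matched midpoint (theta = 0.5), the same dither degrades it, mirroring the matched/mismatched dichotomy of the abstract.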
In this paper, we revisit the problem of detecting a known signal corrupted by independent identically distributed α-stable noise. The implementation of the optimal receiver, i.e. the log-likelihood ratio, requires an explicit expression of the probability density function of the noise. In the general α-stable case, no closed form exists for this density. To avoid its numerical evaluation, we propose to study a parametric suboptimal detector based on properties of α-stable noise and on implementation considerations. We focus our attention on several optimization criteria for the parameters, showing that our choice allows the optimization to proceed without the explicit expression of the noise probability density function. The chosen detector recovers the optimal Gaussian detector (the matched filter) as well as the locally optimal detector in the Cauchy context. The performance of the detector is studied and compared to that of usual detectors and of the optimal detector. The robustness of the detector with respect to the signal amplitude and the stability index of the noise is discussed.
KEYWORDS: Probability theory, Binary data, Sensors, Detection theory, Information theory, Stochastic processes, Signal processing, Signal detection, Neurons, Measurement devices
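The gap between a Gaussian-optimal and a Cauchy-adapted detector can be sketched in the α = 1 (Cauchy) special case, where the locally optimal score nonlinearity g(x) = 2x/(1 + x²) is explicit. The Monte Carlo below (our illustration: a constant known signal, empirical thresholds set for a 10% false-alarm rate) compares the matched filter with the score-based correlator:

```python
import numpy as np

rng = np.random.default_rng(2)

def detection_prob(stat_fn, amp, n_samples, n_trials, pfa=0.1):
    """Empirical P_D at fixed P_FA for a constant known signal in standard
    Cauchy noise; the threshold is the empirical (1 - pfa) quantile under H0."""
    h0 = stat_fn(rng.standard_cauchy((n_trials, n_samples)))
    h1 = stat_fn(amp + rng.standard_cauchy((n_trials, n_samples)))
    thr = np.quantile(h0, 1.0 - pfa)
    return (h1 > thr).mean()

linear = lambda x: x.mean(axis=1)                    # matched filter (Gaussian-optimal)
score = lambda x: (2 * x / (1 + x**2)).mean(axis=1)  # locally optimal Cauchy score

pd_lin = detection_prob(linear, 0.5, 200, 4000)
pd_score = detection_prob(score, 0.5, 200, 4000)
print(pd_lin, pd_score)   # the score correlator vastly outperforms the linear one
```

The linear statistic fails because the sample mean of Cauchy noise is itself Cauchy distributed, no matter how many samples are averaged, whereas the bounded score nonlinearity tames the heavy tails and concentrates.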
In this paper we revisit the asymmetric binary channel from the double point of view of detection theory and information theory. We first evaluate the capacity of the asymmetric binary channel as a function of the probabilities of false alarm and of detection, thus allowing a noise-distribution-independent analysis. This sets the a priori probabilities of the hypotheses and couples the two points of view. We then study the simple realization of the asymmetric binary channel using a threshold device. We particularly revisit noise-enhanced processing for subthreshold signals using the aforementioned parametrization of the capacity, and we report a somewhat paradoxical effect: using the channel at its capacity in general precludes optimal detection.
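The capacity parametrized by (P_FA, P_D) is straightforward to compute numerically. A sketch (our implementation, using a simple grid search over the input prior rather than any closed form): the channel maps X ∈ {0, 1} to Y with P(Y=1|X=0) = P_FA and P(Y=1|X=1) = P_D, and the capacity-achieving prior it returns is exactly the quantity that couples the information-theoretic and detection-theoretic viewpoints:

```python
import math

def h2(p):                                  # binary entropy in bits
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(prior, pfa, pd):
    """I(X;Y) of the asymmetric binary channel P(Y=1|X=0)=pfa, P(Y=1|X=1)=pd,
    with a priori probability P(X=1)=prior."""
    py1 = (1 - prior) * pfa + prior * pd
    return h2(py1) - (1 - prior) * h2(pfa) - prior * h2(pd)

def capacity(pfa, pd, grid=10_000):
    """Capacity and capacity-achieving prior, by grid search over the prior."""
    best = max(range(grid + 1), key=lambda i: mutual_info(i / grid, pfa, pd))
    return mutual_info(best / grid, pfa, pd), best / grid

c, p_star = capacity(0.1, 0.8)
print(c, p_star)    # for an asymmetric channel the optimal prior is generally not 1/2
```

Two sanity checks: the symmetric case P_FA = 1 - P_D reduces to the binary symmetric channel, whose capacity is 1 - h2(P_FA) at the uniform prior, and P_D = P_FA gives zero capacity since the output is then independent of the input.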
One of the most common characteristics of a system exhibiting stochastic resonance is the existence of a maximum in the output signal-to-noise ratio when plotted against the power of the input noise. This property is at the root of the use of stochastic resonance in detection, since it is generally admitted that detection performance increases with the signal-to-noise ratio. We show in this paper that this statement is not always true by examining the key index of performance in detection: the probability of detection. Furthermore, when the probability of detection can be increased by an increase in the power of the noise, we address the practical problem of adding noise. We consider in particular the α-stable case, for which addition does not change the family of the probability density function of the noise.
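The stability property invoked in the last sentence can be checked numerically in the Cauchy (α = 1) case, where it is fully explicit: the sum of independent Cauchy variables with dispersions g1 and g2 is Cauchy with dispersion g1 + g2, so injecting extra noise only rescales the density instead of changing its family. A Monte Carlo sanity check (our sketch, using the interquartile range, which equals 2g for a Cauchy of dispersion g):

```python
import numpy as np

rng = np.random.default_rng(3)

# Stability under addition (alpha = 1): Cauchy(g1) + Cauchy(g2) = Cauchy(g1 + g2).
g1, g2, n = 1.0, 0.5, 400_000
total = g1 * rng.standard_cauchy(n) + g2 * rng.standard_cauchy(n)
iqr = np.quantile(total, 0.75) - np.quantile(total, 0.25)
print(iqr)    # the IQR of Cauchy(g) is 2 g, so this should be close to 3.0
```

This is what makes noise addition practical in the α-stable setting: a detector calibrated for the noise family needs only a dispersion rescaling, not a new density.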
We present a theory of stochastic processes that are finite-size scale invariant. Such processes are invariant under generalized dilations that operate on bounded ranges of scales and amplitudes. We recall here the theory of deterministic finite-size scale invariance, and introduce an operator, called the Lamperti transform, that makes generalized dilations and translations equivalent. This operator is then used to define finite-size scale invariant processes as images of stationary processes. The example of Brownian motion is presented in some detail to illustrate the definitions. We further extend the theory to the case of finite-size scale invariant processes with stationary increments.
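The Brownian example can be sketched with the classical (not the generalized) Lamperti transform, which already conveys the dilation/translation correspondence: if U is a stationary Ornstein-Uhlenbeck process with dU = -U/2 du + dW, then Y(t) = sqrt(t) U(ln t) is a standard Brownian motion, i.e. Cov(Y(s), Y(t)) = min(s, t). The simulation below (our own check, exact OU transitions on a log-time grid) verifies this covariance by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(4)

# Lamperti correspondence (H = 1/2): map the stationary OU process U, with
# dU = -U/2 du + dW and stationary variance 1, through Y(t) = sqrt(t) U(ln t).
n_paths, n_steps = 10_000, 400
u = np.linspace(np.log(0.5), np.log(4.0), n_steps)   # "warped time" grid
du = u[1] - u[0]
U = np.empty((n_paths, n_steps))
U[:, 0] = rng.standard_normal(n_paths)               # stationary start
decay, inc_std = np.exp(-du / 2), np.sqrt(1 - np.exp(-du))
for k in range(1, n_steps):                          # exact OU transition kernel
    U[:, k] = decay * U[:, k - 1] + inc_std * rng.standard_normal(n_paths)

t = np.exp(u)
Y = np.sqrt(t) * U                                   # Lamperti image of U
i, j = np.searchsorted(t, 1.0), np.searchsorted(t, 2.0)
cov = np.cov(Y[:, i], Y[:, j])
print(cov[0, 0], cov[1, 1], cov[0, 1])               # near 1, 2 and min(1, 2) = 1
```

Translations in warped time u correspond to dilations in real time t, which is the mechanism the abstract's operator generalizes to bounded ranges of scales and amplitudes.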