Paper
Pooling networks for a discrimination task: noise-enhanced detection
15 June 2007
Proceedings Volume 6602, Noise and Fluctuations in Biological, Biophysical, and Biomedical Systems; 66020S (2007) https://doi.org/10.1117/12.724641
Event: SPIE Fourth International Symposium on Fluctuations and Noise, 2007, Florence, Italy
Abstract
Pooling networks are composed of independent noisy neurons that all process the same information in parallel; the output of each neuron is summed into a single output by a fusion center. In this paper we study such a network in a detection or discrimination task. It is shown that if the network is not properly matched to the symmetries of the detection problem, the internal noise may at least partially restore some form of optimality. This is shown both for (i) noisy threshold neuron models and (ii) Poisson neuron models. We also study an optimized version of the network, mimicking the notion of excitation/inhibition. We show that, when properly tuned, the network may reach optimality in a very robust way. Furthermore, we find in this optimization that some neurons remain inactive. The pattern of inactivity is organized in a strange branching structure, the meaning of which remains to be elucidated.
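As a rough illustration of the kind of pooling network described above, the Monte Carlo sketch below simulates a small population of noisy threshold neurons whose binary outputs are summed by a fusion center performing a binary discrimination. The Gaussian internal noise, the signal levels, the common threshold deliberately placed above both signals, and the count-threshold decision rule are illustrative assumptions for this sketch, not the models or parameters analyzed in the paper; the point is only to show the noise-enhanced behavior: with zero internal noise the mismatched network cannot discriminate, while a moderate amount of noise lowers the error probability before excessive noise degrades it again.

```python
import numpy as np

rng = np.random.default_rng(0)

def discrimination_error(n_neurons, noise_std, threshold,
                         s0=0.0, s1=1.0, n_trials=20000):
    """Monte Carlo estimate of the error probability of a pooling network
    of noisy threshold neurons in a binary discrimination task.

    Each neuron fires (outputs 1) when the common input plus its own
    independent Gaussian internal noise exceeds `threshold`; the fusion
    center sums the binary outputs and compares the count to an integer
    level. For this sketch the count level is chosen to minimize the
    empirical error, standing in for an optimal decision on the fused
    output (an assumption of this illustration, not the paper's rule).
    """
    hypotheses = rng.integers(0, 2, size=n_trials)          # equiprobable H0 / H1
    inputs = np.where(hypotheses == 1, s1, s0)               # common signal per trial
    noise = rng.normal(0.0, noise_std, size=(n_trials, n_neurons))
    counts = ((inputs[:, None] + noise) > threshold).sum(axis=1)

    # Best count-threshold decision at the fusion center.
    errors = [np.mean((counts >= k) != hypotheses)
              for k in range(n_neurons + 2)]
    return min(errors)

# A deliberately mismatched network: the common threshold sits above both
# signal levels, so with zero internal noise no neuron ever fires and the
# network can do no better than guessing (error probability about 0.5).
for sigma in [0.0, 0.2, 0.5, 1.0, 2.0]:
    pe = discrimination_error(n_neurons=15, noise_std=sigma, threshold=1.5)
    print(f"internal noise std = {sigma:4.1f} -> error probability ~ {pe:.3f}")
```

Running the loop shows a non-monotonic error curve: the error probability drops from roughly 0.5 at zero noise to a minimum at an intermediate noise level, then rises again, which is the hallmark of noise-enhanced detection in such pooling networks.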
© (2007) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Pierre-Olivier Amblard, Steeve Zozor, Mark D. McDonnell, and Nigel G. Stocks "Pooling networks for a discrimination task: noise-enhanced detection", Proc. SPIE 6602, Noise and Fluctuations in Biological, Biophysical, and Biomedical Systems, 66020S (15 June 2007); https://doi.org/10.1117/12.724641
CITATIONS
Cited by 8 scholarly publications.
KEYWORDS
Neurons
Monte Carlo methods
Stochastic processes
Sensors
Interference (communication)
Data processing
Error analysis