Maximizing the entropy of histogram bar heights to explore neural activity: A simulation study on auditory and tactile fibers

Abstract

Neurophysiologists often use histograms to explore patterns of activity in neural spike trains. The bin size selected to construct a histogram is crucial: too large a bin width yields a coarse histogram, while too small a bin width emphasizes unimportant detail. Peri-stimulus time (PST) histograms of simulated nerve fibers were studied in the present article. This class of histograms conveys information about neural activity in the temporal domain and serves as a density estimate of the spike rate. Scott's rule, derived from modern statistical theory, suggests that the optimal bin width is inversely proportional to the cube root of the sample size. However, this estimate requires a priori knowledge of the density function. Moreover, there are no good algorithms for adaptive-mesh histograms, which use variable bin sizes to minimize estimation error. Therefore, an unconventional technique is proposed here to help experimenters in practice. This novel method maximizes the entropy of histogram bar heights to find the unique bin size that generates the highest disorder in the histogram (i.e., the most complex histogram), which is useful as a starting point for neural data mining. Although the proposed method is ad hoc from a density-estimation point of view, it is simple, efficient, and more helpful in experimental settings where no prior statistical information on neural activity is available. The results of simulations based on the entropy method are also discussed in relation to Ellaway's cumulative-sum technique, which can detect subtle changes in neural activity under certain conditions.
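The abstract does not specify how the bar-height entropy is computed, so the following is only a minimal illustrative sketch of one plausible reading: the Shannon entropy is taken over the empirical distribution of bar-height values, which vanishes both for very small bins (nearly all bars are 0 or 1) and for very large bins (a single bar), giving an interior maximum that can serve as a starting bin width. The function names, the candidate bin-width grid, and the simulated spike train below are hypothetical placeholders, not the authors' code or data; Scott's rule is included only for comparison.

import numpy as np
from collections import Counter

def height_entropy(spike_times, bin_width, t_max):
    """Shannon entropy (bits) of the empirical distribution of bar-height values."""
    edges = np.arange(0.0, t_max + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    freq = np.array(list(Counter(counts).values()), dtype=float)
    p = freq / freq.sum()
    return float(-np.sum(p * np.log2(p)))

def scott_bin_width(x):
    """Scott's rule: h = 3.49 * sigma * n**(-1/3) (optimal for a Gaussian density)."""
    return 3.49 * x.std(ddof=1) * x.size ** (-1.0 / 3.0)

# Hypothetical simulated PST data: spike times (s) pooled over repeated stimulus trials.
rng = np.random.default_rng(0)
t_max = 0.100
spike_times = np.sort(rng.exponential(0.020, size=800) % t_max)  # placeholder spike times

candidate_widths = np.linspace(0.5e-3, 20e-3, 60)  # 0.5-20 ms candidate bin widths
entropies = [height_entropy(spike_times, w, t_max) for w in candidate_widths]
best = candidate_widths[int(np.argmax(entropies))]

print(f"entropy-maximizing bin width: {best * 1e3:.2f} ms")
print(f"Scott's-rule bin width:       {scott_bin_width(spike_times) * 1e3:.2f} ms")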

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2005 Acta Neurobiologiae Experimentalis
