## Mutual information, Fisher information, and Efficient coding

Xue-Xin Wei and Alan A Stocker

Neural Computation, vol. 28, pp. 305-326, Feb. 2016

Fisher information is generally believed to represent a lower bound on mutual information (Brunel and Nadal, 1998), a result frequently used to assess neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small noise regime Fisher information actually provides an upper bound on mutual information. Generally, our results show that it is more appropriate to consider Fisher information as an approximation rather than a bound on mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and for the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the Efficient coding problem of maximizing mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of Efficient coding expressed as the correspondence between the population Fisher information and the distribution of the stimulus variable. The signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics. We demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of Efficient coding.
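The correspondence between the two quantities can be illustrated with a toy example (our own sketch, not taken from the paper): a stimulus drawn uniformly from [0, 1] and corrupted by additive Gaussian noise of standard deviation sigma, so that the Fisher information is J(s) = 1/sigma^2. The Brunel-Nadal expression I_FI = H(s) + (1/2) E[log(J(s)/(2*pi*e))] then converges to the true mutual information as the noise shrinks:

```python
# Toy illustration (not from the paper): stimulus s ~ Uniform(0, 1),
# measurement m = s + Gaussian noise with standard deviation sigma.
# Compares the true mutual information I(s; m), computed numerically,
# with the Fisher-information approximation of Brunel & Nadal (1998):
#   I_FI = H(s) + 1/2 * E[ log( J(s) / (2*pi*e) ) ],
# where J(s) = 1/sigma^2 and H(s) = 0 for a uniform prior on [0, 1].
import numpy as np
from scipy.stats import norm


def mi_and_fisher_approx(sigma, n_grid=20001):
    """Return (numerical mutual information, Fisher approximation) in nats."""
    # Marginal density of m: p(m) = Phi(m/sigma) - Phi((m-1)/sigma)
    m = np.linspace(-6.0 * sigma, 1.0 + 6.0 * sigma, n_grid)
    p_m = norm.cdf(m / sigma) - norm.cdf((m - 1.0) / sigma)
    # Differential entropy h(m); the clip avoids log(0), and p*log(p) -> 0 there
    integrand = -p_m * np.log(np.clip(p_m, 1e-300, None))
    h_m = np.sum(integrand) * (m[1] - m[0])
    h_noise = 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)  # h(m | s), Gaussian
    mi = h_m - h_noise        # I(s; m) = h(m) - h(m | s)
    mi_fisher = -h_noise      # H(s) + 0.5 * log(J / (2*pi*e)) with H(s) = 0
    return mi, mi_fisher


if __name__ == "__main__":
    for sigma in (0.2, 0.05, 0.005):
        mi, fi = mi_and_fisher_approx(sigma)
        print(f"sigma={sigma:>6}: MI={mi:.4f} nats, Fisher approx={fi:.4f} nats")
```

In this simple example the gap between the two quantities is exactly h(m), the marginal entropy of the measurement, which vanishes as sigma goes to 0; the approximation is good precisely in the low-noise regime, and degrades as the noise grows.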

pdf | www

Related: Nature Neuroscience 2015, where we use a very similar formulation of the Efficient coding problem to non-parametrically constrain the likelihood functions of a Bayesian observer model.