An intuition for physicists: information gain from experiments [CL]

http://arxiv.org/abs/2205.00009


How much one has learned from an experiment can be quantified by the information gain, also known as the Kullback-Leibler divergence. The narrowing of the posterior parameter distribution $P(\theta|D)$ relative to the prior parameter distribution $\pi(\theta)$ is quantified in units of bits as: $ D_{\mathrm{KL}}(P\|\pi)=\int\log_{2}\left(\frac{P(\theta|D)}{\pi(\theta)}\right)\,P(\theta|D)\,d\theta $. This research note gives an intuition for what one bit of information gain means: it corresponds to a Gaussian shrinking its standard deviation by a factor of three.
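As a quick numerical check of the factor-of-three intuition (not taken from the note itself; the function name and integration bounds are my own), here is a minimal Python sketch that evaluates the KL integral above for a Gaussian prior and a Gaussian posterior with the same mean:

```python
import numpy as np
from scipy.integrate import quad

def kl_bits(mu_p, sigma_p, mu_pi, sigma_pi):
    """Information gain D_KL(P || pi) in bits, by numerical integration,
    for a Gaussian posterior P and a Gaussian prior pi."""
    def integrand(theta):
        # work with log-densities to avoid under/overflow in the ratio P/pi
        logp = -0.5 * ((theta - mu_p) / sigma_p) ** 2 - np.log(sigma_p * np.sqrt(2 * np.pi))
        logpi = -0.5 * ((theta - mu_pi) / sigma_pi) ** 2 - np.log(sigma_pi * np.sqrt(2 * np.pi))
        return np.exp(logp) * (logp - logpi) / np.log(2)  # convert nats -> bits
    return quad(integrand, -np.inf, np.inf)[0]

# Posterior ~3x narrower than the prior (same mean) yields roughly one bit:
print(kl_bits(0.0, 1 / 3.0, 0.0, 1.0))   # ~0.94 bits
print(kl_bits(0.0, 1 / 3.13, 0.0, 1.0))  # ~1.0 bit
```

For same-mean Gaussians the integral also has the closed form $\ln(\sigma_\pi/\sigma_P) + \sigma_P^2/(2\sigma_\pi^2) - 1/2$ nats, which reproduces the same values and shows that exactly one bit corresponds to a shrink factor of about 3.1.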


J. Buchner
Tue, 3 May 22

Comments: Accepted to RNAAS