There’s a very interesting consequence of the Likelihood Principle, which you most often meet through likelihood ratio tests. The principle says that two likelihoods that are proportional by a constant contain the same information about a parameter vector.
It so happens that the binomial and negative binomial distributions yield proportional likelihoods. Voilà math, but really [code is math](https://github.com/search?p=3&q=%22prod%28dbinom%22&type=Code&utf8=%E2%9C%93).
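To see where the proportionality comes from, write out the two densities for the experiment below (seven heads, forty tails), with $p$ the probability of heads:

$$
L_{\text{NB}}(p) = \binom{46}{6}\, p^{7} (1 - p)^{40},
\qquad
L_{\text{Bin}}(p) = \binom{47}{7}\, p^{7} (1 - p)^{40}.
$$

Their ratio, $\binom{46}{6} / \binom{47}{7} = 7/47$, is a constant that does not involve $p$.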
First I create the likelihoods of each distribution. It’s not a typo: their bodies differ only by one letter and two numbers.
```{r}
# Likelihood of seeing 40 tails before the 7th head (negative binomial story)
negative_binomial_likelihood <- function(p) {
  prod(dnbinom(40, 7, p))
}

# Likelihood of seeing 7 heads in 47 flips (binomial story)
binomial_likelihood <- function(p) {
  prod(dbinom(7, 47, p))
}
```
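As a quick sanity check (my addition, not in the original code), the ratio of the two likelihoods should be the same constant at every value of p; analytically it equals 7/47, about 0.149:

```{r}
# Ratio of the two likelihoods at a few values of p; it should be
# constant: choose(46, 6) / choose(47, 7) = 7/47
sapply(c(0.1, 0.25, 0.5, 0.9), function(p) {
  negative_binomial_likelihood(p) / binomial_likelihood(p)
})
```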
Now we can create a nice span of parameter values at which to calculate the likelihood. Probability-esque, but technically plausibility.
We’re taking the product of the density evaluated at the observed data (with a single observation, `prod()` is just a formality).
```{r}
library(dplyr); library(ggplot2)
tibble(p = seq(0.01, 0.99, 0.01)) %>%
  rowwise() %>%
  mutate(nb_likelihood = negative_binomial_likelihood(p),
         binomial_likelihood = binomial_likelihood(p)) %>%
  ungroup() %>% # important!
  # calculate the relative likelihood of the negative binomial
  mutate(nb_maximum_likelihood = max(nb_likelihood),
         `Negative binomial relative likelihood` = nb_likelihood / nb_maximum_likelihood) %>%
  # calculate the relative likelihood of the binomial
  mutate(binomial_maximum_likelihood = max(binomial_likelihood),
         `Binomial relative likelihood` = binomial_likelihood / binomial_maximum_likelihood) %>%
  select(p, `Binomial relative likelihood`, `Negative binomial relative likelihood`) %>%
  reshape2::melt("p") %>% as_tibble() %>%
  ggplot(aes(x = p, y = value)) +
  geom_line() +
  ylab("Relative likelihood") +
  xlab("Probability of success") +
  theme_bw(18) + theme(axis.text = element_text(colour = "black")) +
  facet_wrap(~ variable, ncol = 1)
```
They’re the same!
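Beyond eyeballing the facets, a quick numeric cross-check (my addition, reusing the two likelihood functions from above) shows the two relative-likelihood curves match point for point:

```{r}
p_grid <- seq(0.01, 0.99, 0.01)
nb  <- sapply(p_grid, negative_binomial_likelihood)
bin <- sapply(p_grid, binomial_likelihood)
# Proportional likelihoods normalise to the same relative-likelihood curve,
# so this should return TRUE
all.equal(nb / max(nb), bin / max(bin))
```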
Why?
Two stories.
The first story goes: I flip a coin until I see the seventh head. It takes me forty-seven flips to see that seventh head.
The second story goes: I flip a coin forty-seven times and see seven heads.
Does it matter how I flipped the coin if you saw seven heads and forty tails?
- Bayesian: no.
- Frequentist: yes.
- Information theoretic: no.
- Minimum description length: Not certain yet, but almost sure no.
- Fiducial: not yet sure.
- Structural inference: not sure yet.
- MaxEnt: [worthy](https://scholar.google.com/scholar?hl=en&as_sdt=0%2C1&q=maximum+entropy+inference+process&btnG=&oq=+%22maximum+entropy+inference%22).