29.99 €
incl. VAT
Free shipping*
Ready to ship in over 4 weeks
  • Paperback book

Product Description
High Quality Content by WIKIPEDIA articles! In statistics, the observed information, or observed Fisher information, is the negative of the second derivative (in the multiparameter case, the Hessian matrix) of the log-likelihood, the logarithm of the likelihood function. It is a sample-based version of the Fisher information. In mathematical statistics and information theory, the Fisher information (sometimes simply called information) is the variance of the score. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein-von Mises theorem, which was anticipated by Laplace for exponential families). The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician R. A. Fisher (following some initial results by F. Y. Edgeworth). The Fisher information also enters the calculation of the Jeffreys prior used in Bayesian statistics.
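To make these definitions concrete, here is a minimal Python sketch (not taken from the book) that computes the observed information for a Bernoulli sample as the negative second derivative of the log-likelihood, and compares it to the expected (Fisher) information; the sample, the parameter value, and helper names such as `log_likelihood` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=1000)  # hypothetical Bernoulli(0.3) sample
n = x.size

def log_likelihood(theta):
    # log L(theta) = sum_i [x_i log(theta) + (1 - x_i) log(1 - theta)]
    return np.sum(x * np.log(theta) + (1 - x) * np.log(1 - theta))

theta_hat = x.mean()  # maximum-likelihood estimate of the Bernoulli parameter

# Observed information: negative second derivative of the log-likelihood,
# approximated here by a central finite difference at the MLE.
h = 1e-5
observed_info = -(log_likelihood(theta_hat + h)
                  - 2 * log_likelihood(theta_hat)
                  + log_likelihood(theta_hat - h)) / h**2

# Expected (Fisher) information for n Bernoulli draws: n / (theta * (1 - theta)).
fisher_info = n / (theta_hat * (1 - theta_hat))

print(theta_hat, observed_info, fisher_info)
```

For the Bernoulli model the observed information evaluated at the MLE coincides with the expected information, so the two printed values agree up to the finite-difference error; for other models they generally differ, which is the point of having a sample-based version.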