Abstract
A statistical manifold M can be defined as a Riemannian manifold each of whose points is a probability distribution on the same support. In fact, statistical manifolds possess a richer geometric structure beyond the Fisher information metric defined on the tangent bundle TM. Recognizing that the points of M are distributions, not just generic points in a manifold, TM can be extended to a Hilbert bundle HM. This extension proves fundamental when we generalize the classical notion of a point estimate (a single point in M) to a function on M that characterizes the relationship between the observed data and each distribution in M. The log-likelihood and score functions are important examples of such generalized estimators. In terms of a parameterization θ: M → Θ ⊂ R^k, θ̂ is a distribution on Θ, while its generalization g_θ̂ = θ̂ − E θ̂, viewed as an estimate, is a function over Θ that indicates the inconsistency between the model and the data. As an estimator, g_θ̂ is a distribution of functions, and geometric properties of these functions describe statistical properties of g_θ̂. In particular, the expected slopes of g_θ̂ are used to define Λ(g_θ̂), the Λ-information of g_θ̂. The Fisher information I is an upper bound for the Λ-information: Λ(g) ≤ I for all g. We demonstrate the utility of this geometric perspective using the two-sample problem.