Brascamp–Lieb inequality

In mathematics, the Brascamp–Lieb inequality can refer to two inequalities. The first is a result in geometry concerning integrable functions on n-dimensional Euclidean space $\mathbb{R}^n$. It generalizes the Loomis–Whitney inequality and Hölder's inequality. The second is a result of probability theory which gives a concentration inequality for log-concave probability distributions. Both are named after Herm Jan Brascamp and Elliott H. Lieb.

The geometric inequality

Fix natural numbers m and n. For $1 \le i \le m$, let $n_i \in \mathbb{N}$ and let $c_i > 0$ so that
\[
\sum_{i=1}^m c_i n_i = n.
\]

Choose non-negative, integrable functions
\[
f_i \in L^1\left( \mathbb{R}^{n_i}; [0, +\infty] \right)
\]

and surjective linear maps
\[
B_i : \mathbb{R}^n \to \mathbb{R}^{n_i}.
\]

Then the following inequality holds:
\[
\int_{\mathbb{R}^n} \prod_{i=1}^m f_i(B_i x)^{c_i}\, dx \le D^{-1/2} \prod_{i=1}^m \left( \int_{\mathbb{R}^{n_i}} f_i(y)\, dy \right)^{c_i},
\]

where D is given by
\[
D = \inf\left\{ \frac{\det\left( \sum_{i=1}^m c_i B_i^{*} A_i B_i \right)}{\prod_{i=1}^m \left( \det A_i \right)^{c_i}} \;:\; A_i \text{ is a positive-definite } n_i \times n_i \text{ matrix} \right\}.
\]

Another way to state this is that the constant D is what one would obtain by restricting attention to the case in which each $f_i$ is a centered Gaussian function, namely $f_i(y) = \exp\left( -\langle y, A_i y \rangle \right)$.[1]
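
For example, taking $n = 3$, $m = 3$, $n_i = 2$, $c_i = \tfrac{1}{2}$, and $B_i : \mathbb{R}^3 \to \mathbb{R}^2$ the projection that deletes the $i$-th coordinate gives $\sum_i c_i n_i = 3 = n$; one can check that $D = 1$ in this case, and the inequality specializes to the Loomis–Whitney inequality mentioned in the introduction:
\[
\int_{\mathbb{R}^3} f_1(x_2,x_3)^{1/2} f_2(x_1,x_3)^{1/2} f_3(x_1,x_2)^{1/2}\, dx \le \prod_{i=1}^3 \left( \int_{\mathbb{R}^2} f_i(y)\, dy \right)^{1/2}.
\]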

Relationships to other inequalities

The geometric Brascamp–Lieb inequality

The geometric Brascamp–Lieb inequality is a special case of the above,[2] and was used by Ball (1989) to provide upper bounds for volumes of central sections of cubes.[3]

For i = 1, ..., m, let $c_i > 0$ and let $u_i \in S^{n-1}$ be a unit vector; suppose that the $c_i$ and $u_i$ satisfy
\[
x = \sum_{i=1}^m c_i \, (x \cdot u_i)\, u_i
\]

for all x in $\mathbb{R}^n$. Let $f_i \in L^1(\mathbb{R}; [0, +\infty])$ for each i = 1, ..., m. Then
\[
\int_{\mathbb{R}^n} \prod_{i=1}^m f_i(x \cdot u_i)^{c_i}\, dx \le \prod_{i=1}^m \left( \int_{\mathbb{R}} f_i(y)\, dy \right)^{c_i}.
\]
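
The condition on the $c_i$ and $u_i$ says precisely that $\sum_{i=1}^m c_i u_i u_i^{\mathsf T}$ is the identity map on $\mathbb{R}^n$. For example (one simple admissible choice among many), in $\mathbb{R}^2$ one may take the three unit vectors $u_1 = (1,0)$, $u_2 = \left(-\tfrac{1}{2}, \tfrac{\sqrt{3}}{2}\right)$, $u_3 = \left(-\tfrac{1}{2}, -\tfrac{\sqrt{3}}{2}\right)$ with $c_1 = c_2 = c_3 = \tfrac{2}{3}$, since
\[
\sum_{i=1}^3 \tfrac{2}{3}\, u_i u_i^{\mathsf T} = I_2 .
\]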

The geometric Brascamp–Lieb inequality follows from the Brascamp–Lieb inequality as stated above by taking $n_i = 1$ and $B_i(x) = x \cdot u_i$. Then, for $z_i \in \mathbb{R}$,
\[
B_i^{*}(z_i) = z_i u_i .
\]

It follows that D = 1 in this case.
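
As a simple check, taking $m = n$, $u_i = e_i$ the standard basis vectors, and $c_i = 1$ satisfies the hypothesis, and in that case the inequality holds with equality by Tonelli's theorem:
\[
\int_{\mathbb{R}^n} \prod_{i=1}^n f_i(x_i)\, dx = \prod_{i=1}^n \int_{\mathbb{R}} f_i(y)\, dy .
\]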

Hölder's inequality

As another special case, take $n_i = n$, $B_i = \mathrm{id}$, the identity map on $\mathbb{R}^n$, replace $f_i$ by $f_i^{1/c_i}$, and let $c_i = 1/p_i$ for $1 \le i \le m$. Then
\[
\sum_{i=1}^m \frac{1}{p_i} = 1
\]

and the log-concavity of the determinant of a positive-definite matrix implies that D = 1. This yields Hölder's inequality in $\mathbb{R}^n$:
\[
\int_{\mathbb{R}^n} \prod_{i=1}^m f_i(x)\, dx \le \prod_{i=1}^m \| f_i \|_{p_i} .
\]
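
For example, with $m = 2$ and $p_1 = p_2 = 2$ this is the Cauchy–Schwarz inequality for non-negative functions on $\mathbb{R}^n$:
\[
\int_{\mathbb{R}^n} f_1(x) f_2(x)\, dx \le \| f_1 \|_2 \, \| f_2 \|_2 .
\]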

The concentration inequality

Consider a probability density function $p(x) = \exp(-\phi(x))$. $p(x)$ is said to be a log-concave measure if the function $\phi(x)$ is convex. Such probability density functions have tails which decay exponentially fast, so most of the probability mass resides in a small region around the mode of $p(x)$. The Brascamp–Lieb inequality gives another characterization of the compactness of $p(x)$ by bounding the variance of any statistic $S(x)$.

Formally, let $S(x)$ be any differentiable function. The Brascamp–Lieb inequality reads:
\[
\operatorname{Var}_p\!\left( S(x) \right) \le \mathbb{E}_p\!\left( \nabla^{\mathsf T} S(x) \left[ H \phi(x) \right]^{-1} \nabla S(x) \right),
\]

where $H$ is the Hessian and $\nabla$ is the nabla symbol.[4]
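
In one dimension, provided $\phi''(x) > 0$ so that the Hessian is invertible, this specializes to
\[
\operatorname{Var}_p\!\left( S(x) \right) \le \mathbb{E}_p\!\left( \frac{S'(x)^2}{\phi''(x)} \right).
\]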

Relationship with other inequalities

The Brascamp–Lieb inequality is an extension of the Poincaré inequality which only concerns Gaussian probability distributions.
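
Concretely, if $p$ is the density of a Gaussian distribution $\mathcal{N}(\mu, \Sigma)$, then $\phi(x) = \tfrac{1}{2}(x-\mu)^{\mathsf T} \Sigma^{-1} (x-\mu)$ up to an additive constant, so $H\phi(x) = \Sigma^{-1}$ and the Brascamp–Lieb inequality reduces to the Gaussian Poincaré inequality
\[
\operatorname{Var}_p\!\left( S(x) \right) \le \mathbb{E}_p\!\left( \nabla^{\mathsf T} S(x)\, \Sigma\, \nabla S(x) \right).
\]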

The Brascamp–Lieb inequality is also related to the Cramér–Rao bound. While Brascamp–Lieb is an upper bound, the Cramér–Rao bound lower-bounds the variance of $S(x)$. The expressions are almost identical:
\[
\operatorname{Var}_p\!\left( S(x) \right) \ge \mathbb{E}_p\!\left( \nabla^{\mathsf T} S(x) \right) \left[ \mathbb{E}_p\!\left( H \phi(x) \right) \right]^{-1} \mathbb{E}_p\!\left( \nabla S(x) \right).
\]

References

  1. This inequality is in Lieb, E. H. (1990). "Gaussian Kernels have only Gaussian Maximizers". Inventiones Mathematicae. 102: 179–208. doi:10.1007/bf01233426.
  2. This was derived first in Brascamp, H. J.; Lieb, E. H. (1976). "Best Constants in Young's Inequality, Its Converse and Its Generalization to More Than Three Functions". Adv. Math. 20: 151–172. doi:10.1016/0001-8708(76)90184-5.
  3. Ball, Keith M. (1989). "Volumes of Sections of Cubes and Related Problems". In Lindenstrauss, J.; Milman, V. D. Geometric Aspects of Functional Analysis (1987–88). Lecture Notes in Math. 1376. Berlin: Springer. pp. 251–260.
  4. This theorem was originally derived in Brascamp, H. J.; Lieb, E. H. (1976). "On Extensions of the Brunn–Minkowski and Prékopa–Leindler theorems, including inequalities for log concave functions, and with an application to the diffusion equation". Journal of Functional Analysis. 22: 366–389. doi:10.1016/0022-1236(76)90004-5. Extensions of the inequality can be found in Hargé, Gilles (2008). "Reinforcement of an Inequality due to Brascamp and Lieb". Journal of Functional Analysis. 254: 267–300. doi:10.1016/j.jfa.2007.07.019 and Carlen, Eric A.; Cordero-Erausquin, Dario; Lieb, Elliott H. (2013). "Asymmetric Covariance Estimates of Brascamp-Lieb Type and Related Inequalities for Log-concave Measures". Annales de l'Institut Henri Poincaré B. 49: 1–12. doi:10.1214/11-aihp462.