# Degenerate distribution

Cumulative distribution function: a unit step at *k*_{0} (plot omitted).

| | |
|---|---|
| Parameters | *k*_{0} ∈ **R** (location) |
| Support | {*k*_{0}} |
| pmf | δ_{*k*_{0}}: 1 for *x* = *k*_{0}; 0 elsewhere |
| CDF | 0 for *x* < *k*_{0}; 1 for *x* ≥ *k*_{0} |
| Mean | *k*_{0} |
| Median | *k*_{0} |
| Mode | *k*_{0} |
| Variance | 0 |
| Skewness | undefined |
| Ex. kurtosis | undefined |
| Entropy | 0 |
| MGF | e^{*k*_{0}*t*} |
| CF | e^{i*k*_{0}*t*} |

In mathematics, a **degenerate distribution** or **deterministic distribution** is the probability distribution of a random variable that takes only a single value. Examples include flipping a two-headed coin and rolling a die whose sides all show the same number. Although such a variable does not appear random in the everyday sense of the word, it still satisfies the definition of a random variable; the distribution is therefore considered degenerate.

In the case of a real-valued random variable, the degenerate distribution is localized at a point *k*_{0} on the real line. The probability mass function equals 1 at this point and 0 elsewhere.

The distribution can be viewed as the limiting case of a continuous distribution whose variance goes to 0 causing the probability density function to be a delta function at *k*_{0}, with infinite height there but area equal to 1.
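This limiting behavior can be checked numerically: as an illustrative sketch (not from the original text), take a normal distribution centered at *k*_{0} and shrink its standard deviation toward zero; the probability mass assigned to any small interval around *k*_{0} tends to 1.

```python
import math

def normal_cdf(x, mu, sigma):
    # Normal CDF computed via the error function (standard library only).
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

k0, eps = 2.0, 0.01
for sigma in (1.0, 0.1, 0.001):
    # Probability mass inside [k0 - eps, k0 + eps]; approaches 1 as sigma -> 0.
    mass = normal_cdf(k0 + eps, k0, sigma) - normal_cdf(k0 - eps, k0, sigma)
    print(f"sigma={sigma}: mass={mass:.6f}")
```

For sigma = 1 the interval carries almost no mass, while for sigma = 0.001 it carries essentially all of it, matching the intuition of the density collapsing to a delta function at *k*_{0}.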

The cumulative distribution function of the degenerate distribution localized at *k*_{0} is:

$$F_{k_0}(x) = \begin{cases} 1, & x \ge k_0, \\ 0, & x < k_0. \end{cases}$$
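A minimal sketch of the pmf and CDF localized at a point *k*_{0} (the function names are illustrative, not a standard API):

```python
def degenerate_pmf(x, k0):
    # All probability mass sits at the single point k0.
    return 1.0 if x == k0 else 0.0

def degenerate_cdf(x, k0):
    # Unit step: 0 below k0, jumping to 1 at and above k0.
    return 1.0 if x >= k0 else 0.0

print(degenerate_pmf(3.0, 3.0))  # mass at the point itself
print(degenerate_cdf(2.9, 3.0))  # below the jump
print(degenerate_cdf(3.0, 3.0))  # at and above the jump
```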

## Constant random variable

In probability theory, a **constant random variable** is a discrete random variable that takes a constant value, regardless of any event that occurs. This is technically different from an **almost surely constant random variable**, which may take other values, but only on events with probability zero. Constant and almost surely constant random variables provide a way to deal with constant values in a probabilistic framework.

Let *X*: Ω → **R** be a random variable defined on a probability space (Ω, *P*). Then *X* is an *almost surely constant random variable* if there exists *c* ∈ **R** such that

$$\Pr(X = c) = 1,$$

and is furthermore a *constant random variable* if

$$X(\omega) = c \quad \text{for all } \omega \in \Omega.$$

Note that a constant random variable is almost surely constant, but not necessarily *vice versa*: if *X* is almost surely constant, there may still exist γ ∈ Ω such that *X*(γ) ≠ *c* (but then necessarily Pr({γ}) = 0, and in fact Pr(*X* ≠ *c*) = 0).
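The distinction can be sketched on a hypothetical two-outcome probability space in which one outcome has probability zero (the outcome names and values below are purely illustrative):

```python
# Finite probability space: outcome "b" has probability zero.
P = {"a": 1.0, "b": 0.0}
# X takes a different value on "b", so it is not constant everywhere.
X = {"a": 5.0, "b": 7.0}

def prob(event):
    # Probability of an event, given as a set of outcomes.
    return sum(P[w] for w in event)

c = 5.0
is_constant = all(X[w] == c for w in P)                    # X("b") != c
is_almost_surely_constant = prob({w for w in P if X[w] != c}) == 0.0

print(is_constant, is_almost_surely_constant)
```

Here *X* fails to be constant because of the outcome "b", yet it is almost surely constant because that outcome carries no probability.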

For practical purposes, the distinction between *X* being constant or almost surely constant is unimportant, since the cumulative distribution function *F*(*x*) of *X* does not depend on whether *X* is constant or 'merely' almost surely constant. In this case,

$$F(x) = \begin{cases} 1, & x \ge c, \\ 0, & x < c. \end{cases}$$

The function *F*(*x*) is a step function; in particular it is a translation of the Heaviside step function.