**Definition:**

A **random variable** is a variable whose possible values are numerical outcomes of a random phenomenon. There are two types of random variables, **discrete** and **continuous**.

**Discrete Random Variables:**

A discrete random variable can take only a finite number of distinct values. For example, attendance of a class on any given day, the number of patients in a doctor's surgery, the number of defective light bulbs in a box of 10, etc.

The **probability distribution** of a discrete random variable is a list of probabilities associated with each of its possible values. It is also sometimes called the **probability function** or the **probability mass function**.

Suppose a random variable *X* may take *k* different values, with the probability that *X = xᵢ* defined to be *P(X = xᵢ) = pᵢ*. The probabilities *pᵢ* must satisfy the following:

- *0 < pᵢ < 1* for each *i*
- *p₁ + p₂ + ... + pₖ = 1*
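The two conditions above can be checked concretely. As a sketch, take the defective-light-bulb example from earlier, and assume (hypothetically) that each bulb in a box of 10 is independently defective with probability 0.05, so the count of defective bulbs follows a binomial distribution:

```python
from math import comb, isclose

# Assumed example: X = number of defective bulbs in a box of 10,
# where each bulb is independently defective with probability 0.05.
n, p = 10, 0.05

# Binomial probability mass function: P(X = i) = C(n, i) * p^i * (1-p)^(n-i)
pmf = {i: comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)}

# Condition 1: every probability lies between 0 and 1
assert all(0 <= p_i <= 1 for p_i in pmf.values())

# Condition 2: the probabilities sum to 1
assert isclose(sum(pmf.values()), 1.0)
```

Any valid list of probabilities for a discrete random variable, not just the binomial one assumed here, must pass both checks.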

**Continuous Random Variables:**

A **continuous random variable** can take an infinite number of possible values. Continuous random variables are usually measurements, numerical values of observed phenomena. Examples include height, weight, the amount of sugar in an orange, and the time required to run a mile.

Suppose a random variable *X* may take all values over an interval of real numbers. Then the probability that *X* is in the set of outcomes *A*, written *P(A)*, is defined to be the area above *A* and under a curve. The curve, which represents a function *p(x)*, must satisfy the following:

1. The curve has no negative values (*p(x) ≥ 0* for all *x*).
2. The total area under the curve is equal to 1.
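As a minimal sketch of "probability as area under a curve", consider an assumed example density *p(x) = 2x* on the interval [0, 1] (and 0 elsewhere), and approximate areas with a midpoint Riemann sum:

```python
# Assumed example density: p(x) = 2x on [0, 1], zero elsewhere.
# It is non-negative everywhere, and its total area is 1.
def p(x):
    return 2 * x if 0 <= x <= 1 else 0.0

def area(a, b, n=100_000):
    # Midpoint Riemann-sum approximation of the area under p between a and b
    h = (b - a) / n
    return sum(p(a + (i + 0.5) * h) for i in range(n)) * h

total = area(0.0, 1.0)       # total area under the curve, should be ~1
prob_half = area(0.0, 0.5)   # P(0 <= X <= 0.5) = area above [0, 0.5], ~0.25
```

Note that *P(X = x)* is 0 for any single point *x* under a continuous density; only intervals (sets of outcomes) carry positive probability, which is why the definition is stated in terms of areas.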