Mathematical operations on random variables

Two random variables are independent if the law of distribution of one of them does not depend on which possible values the other variable has taken on. Thus, if a discrete random variable X can take on the values xi (i = 1, 2, …, n), and a random variable Y the values yj (j = 1, 2, …, m), then the independence of the discrete random variables X and Y means the independence of the events X = xi and Y = yj for all i = 1, 2, …, n and j = 1, 2, …, m. Otherwise, the random variables are called dependent.

For example, if we have tickets of two different monetary lotteries, then the random variables X and Y expressing the prize on each ticket (in monetary units) are independent: whatever the prize on a ticket of one lottery, the law of distribution of the prize on the other ticket does not change. If the random variables X and Y express the prizes on tickets of one and the same monetary lottery, then X and Y are dependent, since any prize on one ticket (X = xi) changes the probabilities of a prize on the other ticket (Y = yj), i.e. changes the law of distribution of Y.

Let two random variables X and Y be given by their laws of distribution:

X: values x1, x2, …, xn with probabilities p1, p2, …, pn;
Y: values y1, y2, …, ym with probabilities q1, q2, …, qm.

The product kX of a random variable X by a constant k is the random variable which takes on the values kxi with the same probabilities pi (i = 1, 2, …, n).

The m-th power of a random variable X, i.e. X^m, is the random variable which takes on the values xi^m with the same probabilities pi (i = 1, 2, …, n).

Example. Let a random variable X be given by the law of distribution:

X: values –2, 1, 2 with probabilities 0.5, 0.3, 0.2.

Find the laws of distribution of the random variables: a) Y = 3X; b) Z = X^2.

Solution: a) The values of the random variable Y are: 3 × (–2) = –6; 3 × 1 = 3; 3 × 2 = 6 with the same probabilities 0.5, 0.3, 0.2, i.e.

Y: values –6, 3, 6 with probabilities 0.5, 0.3, 0.2.

b) The values of the random variable Z are: (–2)^2 = 4, 1^2 = 1, 2^2 = 4 with the same probabilities 0.5, 0.3, 0.2. Since the value Z = 4 is obtained both by squaring –2 (with probability 0.5) and by squaring 2 (with probability 0.2), by the addition theorem P(Z = 4) = 0.5 + 0.2 = 0.7. Thus, we have the following law of distribution of the random variable Z:

Z: values 1, 4 with probabilities 0.3, 0.7.
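The example above can be sketched in Python (a minimal illustration, not part of the original text; the helper `transform` is a hypothetical name): mapping each value of X through a function and merging the probabilities of values that coincide, exactly as was done for Z = X^2.

```python
from collections import defaultdict

def transform(law, f):
    """Law of distribution of f(X), where law is {value: probability}.
    Probabilities of values that coincide after the mapping are added,
    as in the addition theorem used for P(Z = 4)."""
    out = defaultdict(float)
    for x, p in law.items():
        out[f(x)] += p
    return dict(out)

X = {-2: 0.5, 1: 0.3, 2: 0.2}
Y = transform(X, lambda x: 3 * x)   # values -6, 3, 6 keep their probabilities
Z = transform(X, lambda x: x ** 2)  # values 4 and 1; P(Z = 4) = 0.5 + 0.2
```

Merging through a dictionary keyed by the transformed value is what makes the squaring case work without any special handling.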

The sum (the difference or the product) of random variables X and Y is the random variable which takes on all possible values of the form xi + yj (xi – yj or xi × yj), where i = 1, 2, …, n; j = 1, 2, …, m, with the probabilities pij that the random variable X takes on the value xi and Y takes on the value yj:

pij = P(X = xi; Y = yj)

If the random variables X and Y are independent, i.e. all the events X = xi, Y = yj are independent, then by the theorem of multiplication of probabilities for independent events

pij = P(X = xi) × P(Y = yj) = pi qj
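For independent X and Y this rule can be sketched in Python (a minimal illustration, not from the original text; the helper `combine` is a hypothetical name, and the sample laws for X and Y are invented for the demonstration):

```python
from collections import defaultdict
from operator import add, mul

def combine(law_x, law_y, op):
    """Law of op(X, Y) for independent X, Y given as {value: probability}.
    Each pair (xi, yj) gets probability pij = pi * qj by the
    multiplication theorem; coinciding results are merged."""
    out = defaultdict(float)
    for x, p in law_x.items():
        for y, q in law_y.items():
            out[op(x, y)] += p * q
    return dict(out)

X = {0: 0.4, 1: 0.6}   # sample law, assumed for the sketch
Y = {1: 0.5, 2: 0.5}   # sample law, assumed for the sketch
S = combine(X, Y, add)  # law of the sum X + Y
P = combine(X, Y, mul)  # law of the product X * Y
```

Note that the probabilities of the resulting law always sum to 1, since the pij exhaust all pairs (xi, yj).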

(Mathematical) expectation of a discrete random variable

One of the most important concepts in probability theory is the expectation of a random variable. If X is a discrete random variable having a probability mass function p(x), the (mathematical) expectation (the expected value or the mean) of X, denoted by M(X), is defined by

M(X) = Σ x p(x),

where the sum is taken over all values x with p(x) > 0.

In words, the expected value of X is a weighted average of the possible values that X can take on, each value being weighted by the probability that X assumes it. For instance, if the probability mass function of X is given by

p(0) = 1/2 = p(1),

then

M(X) = 0 × (1/2) + 1 × (1/2) = 1/2

is just the ordinary average of the two possible values 0 and 1 that X can assume. On the other hand, if

p(0) = 1/3, p(1) = 2/3,

then

M(X) = 0 × (1/3) + 1 × (2/3) = 2/3

is a weighted average of the two possible values 0 and 1, where the value 1 is given twice as much weight as the value 0, since p(1) = 2p(0).
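The weighted-average computation above can be written as a one-line Python function (a minimal sketch, not part of the original text):

```python
def expectation(pmf):
    """M(X) = sum of x * p(x) over the support of a discrete law,
    given as a dictionary {value: probability}."""
    return sum(x * p for x, p in pmf.items())

fair = {0: 1/2, 1: 1/2}    # ordinary average of 0 and 1
biased = {0: 1/3, 1: 2/3}  # the value 1 carries twice the weight
```

Here `expectation(fair)` reproduces the value 1/2 and `expectation(biased)` the value 2/3 computed above.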

Remark. The concept of expectation is analogous to the physical concept of the center of gravity of a distribution of mass. Consider a discrete random variable X having probability mass function p(xi), i ≥ 1. If we now imagine a weightless rod on which weights with mass p(xi), i ≥ 1, are located at the points xi, i ≥ 1, then the point at which the rod would be in balance is known as the center of gravity. For those readers acquainted with elementary statics it is now a simple matter to show that this point is at M(X).

Example. The laws of distribution of the random variables X and Y (the numbers of points scored by the first and the second shooter) are known.

It is necessary to find out which of the two shooters shoots better.

Solution: Obviously, the shooter who scores more points on average shoots better than the other, so we compare the mathematical expectations M(X) and M(Y).

Thus, on average, the second shooter shoots better.

If a discrete random variable X takes on an infinite (countable) set of values x1, x2, …, xn, …, then the mathematical expectation (the expected value) of such a discrete random variable is the sum of the following series (provided it converges absolutely):

M(X) = x1 p1 + x2 p2 + … + xn pn + … = Σ (i = 1 to ∞) xi pi
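A countable law can only be summed approximately in code, by truncating the series. Below is a minimal sketch (not from the original text; the function name and the geometric example are illustrative). For P(X = k) = (1/2)^k, k = 1, 2, …, the series Σ k (1/2)^k converges absolutely to 2, so the partial sums should approach 2.

```python
def expectation_partial(pmf_term, n_terms):
    """Partial sum of the series sum x_k * p_k approximating M(X)
    for a law with countably many values x_k = k (assumes the
    series converges absolutely, as the definition requires)."""
    return sum(k * pmf_term(k) for k in range(1, n_terms + 1))

# Geometric example: P(X = k) = (1/2)**k for k = 1, 2, ...; M(X) = 2.
approx = expectation_partial(lambda k: 0.5 ** k, 60)
```

The tail of this series beyond 60 terms is negligibly small, so the truncation error is far below any practical tolerance.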

Property 1. The mathematical expectation of a constant is equal to the constant:

M(C) = C

Property 2. A constant multiplier can be taken out of the sign of the mathematical expectation, i.e.

M(kX) = kM(X)

Property 3. The mathematical expectation of the algebraic sum of finitely many random variables is equal to the algebraic sum of their mathematical expectations, i.e.

M(X ± Y) = M(X) ± M(Y)

Property 4. The mathematical expectation of the product of finitely many mutually independent random variables is equal to the product of their mathematical expectations:

M(XY) = M(X) × M(Y)

Property 5. The mathematical expectation of deviation of a random variable from its mathematical expectation is equal to zero:

M[X – M(X)] = 0
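Properties 1, 2 and 5 can be checked numerically on the law of distribution from the earlier example (a minimal sketch, not from the original text; small floating-point tolerances are expected):

```python
def expectation(pmf):
    """M(X) = sum of x * p(x) for a discrete law {value: probability}."""
    return sum(x * p for x, p in pmf.items())

X = {-2: 0.5, 1: 0.3, 2: 0.2}
mX = expectation(X)  # -2*0.5 + 1*0.3 + 2*0.2 = -0.3

# Property 1: a constant C is the law {C: 1}, so M(C) = C.
mC = expectation({5.0: 1.0})

# Property 2: M(kX) = k M(X); build the law of 3X and compare.
m3X = expectation({3 * x: p for x, p in X.items()})

# Property 5: M[X - M(X)] = 0; build the law of the deviation.
deviation = expectation({x - mX: p for x, p in X.items()})
```

Each check transforms the values of the law while keeping the probabilities, which is exactly how kX and X – M(X) were defined above.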
