A random variable is a variable studied in probability theory: random variables are used to model the occurrence of random events. A random variable can be defined as a variable that obtains its value from a measurement on a system whose components can vary. A random variable can also be described as the numerical result of an occurring event. Its value is not fixed; it can take different values with different probabilities.
Probability Distribution
Probability is a concept that measures the degree of uncertainty (and, equally, of certainty) of an event.
Mathematically, the probability of an event A is: P(A) = m/n,
where m = number of possible outcomes favorable to the occurrence of A,
n = total number of possible outcomes,
and n − m = number of possible outcomes unfavorable to the occurrence of A.
We also notice that P(A) + P(not A) = m/n + (n − m)/n = 1,
that is, P(A) + P(not A) = 1.
A procedure based on what is actually observed is called an experiment, and a variable used to record its outcome is called an experimental variable. If an experiment (performed under identical conditions) can result in two or more outcomes, it is called a random experiment, and a variable used to describe its outcome is called a random variable. Both kinds of variables appear in probability distributions.
Now let us discuss probability distributions with the help of a few examples.
Example 1: Find the probability of getting a number less than 6 when a die is thrown once.
Solution: To find the probability of getting a number less than 6 when a die is thrown once, we use the following steps -
Step 1: When a die is thrown, there are 6 possible outcomes, represented by:
S = {1, 2, 3, 4, 5, 6}
So, the total number of outcomes is n(S) = 6.
Step 2: Let the random variable be X. The favorable event is A = a number less than 6 = {1, 2, 3, 4, 5}.
So, the total number of favorable outcomes is n(A) = 5. Therefore the probability of getting a number less than 6 when a die is thrown once is P(A) = n(A)/n(S) = 5/6.
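The counting in Example 1 can be checked with a minimal Python sketch (the variable names are illustrative only):

```python
# Sample space for one throw of a die
sample_space = {1, 2, 3, 4, 5, 6}

# Favorable event A: a number less than 6
event = {x for x in sample_space if x < 6}

# P(A) = n(A) / n(S)
probability = len(event) / len(sample_space)
print(probability)  # 5/6 ≈ 0.8333
```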
Example 2: Find the probability distribution of the number of heads when two coins are tossed at the same time.
Solution: For solving the above problem, we use following steps -
Step 1: When two coins are tossed at the same time, the total set of possible outcomes, or sample space, is:
S = {HH, HT, TH, TT}
So, n(S) = 4.
Step 2: Let the random variable X represent the number of heads. X can take the values 0, 1, 2. X is 0 when the outcome has no heads (TT). X is 1 when the outcome has one head and one tail (HT or TH). And X is 2 when the outcome has two heads (HH). Now we calculate the probability distribution.
When X is 0, the probability is P(X = 0) = 1/4 = 0.25
When X is 1, the probability is P(X = 1) = 2/4 = 0.50
When X is 2, the probability is P(X = 2) = 1/4 = 0.25
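The distribution above can be generated by enumerating the outcomes, as in this short Python sketch:

```python
from itertools import product
from collections import Counter

# All outcomes of tossing two coins
outcomes = list(product("HT", repeat=2))  # ('H','H'), ('H','T'), ('T','H'), ('T','T')

# X = number of heads in each outcome
counts = Counter(outcome.count("H") for outcome in outcomes)

# Probability distribution of X
distribution = {x: counts[x] / len(outcomes) for x in sorted(counts)}
print(distribution)  # {0: 0.25, 1: 0.5, 2: 0.25}
```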
These examples show how probability distributions are calculated.
Discrete Random Variable
Basically, random variables (which take real-number values) are used to make predictions based on data obtained from experiments. Random variables were originally developed for the analysis of games of chance. Formally, a random variable is a function defined on the outcomes of the experiment to be performed.
Discrete random variables are obtained by counting, so they take integer values. Random variables are generally denoted by upper-case (capital) letters, and they are described by their probabilities.
For example, let Y be a random variable taking six values; then the probabilities over all values must sum to one:
∑ P[Y = y] = 0.05 + 0.10 + 0.20 + 0.40 + 0.15 + 0.10 = 1.
Instead of P[Y = y], the notation P(y) can also be used. This is called the probability function, or probability mass function. The cumulative probabilities are given by F(y) = ∑i≤y P(i). Here F(y) is the probability that Y takes a value less than or equal to y. The function F(y) is called the cumulative distribution function (CDF).
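The definition F(y) = ∑i≤y P(i) can be sketched directly in Python; the pmf values here are the six probabilities from the example above:

```python
# Probability mass function for the example random variable Y
pmf = {1: 0.05, 2: 0.10, 3: 0.20, 4: 0.40, 5: 0.15, 6: 0.10}

def cdf(y, pmf):
    """F(y) = P(Y <= y): sum of the pmf over all values i <= y."""
    return sum(p for i, p in pmf.items() if i <= y)

print(cdf(3, pmf))  # ≈ 0.35, i.e. 0.05 + 0.10 + 0.20
```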
For a random variable Y to be called discrete, it must assume only a finite (or countable) number of values. For example, if two bulbs are selected from a certain lot, then the number of defective bulbs can be 0, 1, or 2. Here the range of the variable is 0 to 2, and the random variable can take only the specific values 0, 1, 2 in that range, not values such as 1.1 or 0.2.
A discrete probability distribution is characterized by a probability mass function; a random variable with such a distribution is called a discrete random variable.
A continuous probability distribution is characterized by a probability density function. If the distribution of Y is continuous, then Y is called a continuous random variable; it takes values over a continuous range.
The concepts of probability distribution and random variable underlie the mathematical discipline of probability theory. Whenever a quantity is measured from a population, such as the height of people, the durability of a material, or traffic flow, there is a chance of error, so a single number is not enough; a distribution over possible values is needed to describe the quantity. Probability distributions are classified as discrete or continuous.
Let’s have some examples to understand discrete and continuous probability distribution.
· Suppose a fire department mandates that all fire fighters must weigh between 100 and 200 pounds. The weight of a fire fighter is then an example of a continuous variable, because it can take any value between 100 and 200 pounds.
· Consider another example: flipping a coin repeatedly and counting the heads. The number of heads can only be a whole number (0, 1, 2, ...), never a fractional value such as 2.5, so this is an example of a discrete variable.
Mean of Discrete Random Variable
We may also understand the expected or mean value through the law of large numbers. The law of large numbers says that as the number of trials of an experiment increases, the mean of those trials gets closer to the true mean, that is, the expected value of the distribution, denoted by 'µ'. If the expected value exists, then the sample mean converges to it as the sample size grows to infinity. The die roll is a standard example. Note that the expected value need not be a value the variable can actually take in any single trial, just as no family actually has 5.5 children.
For heavy-tailed distributions such as the Cauchy distribution, the mean does not exist and cannot be calculated.
We can express the probability of a given event as an expected value with the help of an indicator function: the indicator function is one if the event occurs and zero otherwise, so its expectation equals the probability of the event. In this way the law of large numbers justifies estimating probabilities by observed frequencies.
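A short simulation illustrates the law of large numbers for the die example (a minimal sketch; the seed value is arbitrary):

```python
import random

random.seed(42)  # make the sketch reproducible

def sample_mean(n_trials):
    """Average of n_trials rolls of a fair die."""
    return sum(random.randint(1, 6) for _ in range(n_trials)) / n_trials

# As the number of trials grows, the sample mean approaches mu = 3.5
for n in (10, 1000, 100000):
    print(n, sample_mean(n))
```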
The mean of a discrete random variable Y is a weighted average of all the possible values the random variable can take: each value yi is weighted by its probability pi. We use 'µ' to denote the expected value of Y.
Then the expected value or the mean value can be defined as:
µ = y1p1 + y2p2 + y3p3 + . . . + ynpn,
that is, µ = ∑ yipi.
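The formula µ = ∑ yipi can be computed directly, here for a fair die as an illustration:

```python
# Mean of a discrete random variable: mu = sum of y_i * p_i
# Example: Y = outcome of one fair die roll
values = [1, 2, 3, 4, 5, 6]
probs = [1/6] * 6

mu = sum(y * p for y, p in zip(values, probs))
print(mu)  # 3.5
```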
The mean of a random variable is simply a way to summarize the average of its values over many trials or observations.
A useful property of a random variable Y is that if it is transformed by multiplying by a constant c and adding a constant e, then its mean transforms the same way:
µ(e + cY) = e + cµ(Y), where µ(Y) is the mean of Y.
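This linearity property can be verified numerically; the shift e and scale c below are arbitrary illustrative constants:

```python
values = [1, 2, 3, 4, 5, 6]  # fair die
probs = [1/6] * 6
e, c = 2.0, 3.0              # arbitrary shift and scale

mu = sum(y * p for y, p in zip(values, probs))                        # E[Y] = 3.5
mu_transformed = sum((e + c * y) * p for y, p in zip(values, probs))  # E[e + cY]

print(mu_transformed, e + c * mu)  # both 12.5
```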
The main objective of a mean is to find a single representative value for a set of measurements in a given data set or sample. Related measures of central tendency include the mode, arithmetic mean, geometric mean, harmonic mean, root mean square, etc.
So this is all about the mean of discrete random variables.
Variance of a Discrete Random Variable
As another example of a random variable: when two dice are thrown, the sample space is S = {(i, j) : 1 ≤ i, j ≤ 6}, and F = i + j defines a random variable on it.
A random variable that can take only a finite number of values is called a discrete random variable. For example, if four bulbs are selected from a given set, the number of defective bulbs may be 0, 1, 2, 3, or 4. Since we know the full range of possible values, this is a discrete random variable. Now we discuss how to calculate the variance of a discrete random variable:
We will use following steps to find out the variance of discrete random variable -
Step 1: First we calculate the expected value r = E(X) of the random variable, where X takes the values 1, 2, 3, ..., n.
Step 2: Using the expected value r, we calculate the squared deviation (x − r)² for each value, together with its probability m(x), using a table of the following form -
x    m(x)    (x − r)²
1    -       -
2    -       -
3    -       -
n    -       -
Step 3: After the above two steps, we calculate the variance of the discrete random variable using the formula -
V(X) = ∑ m(x)(x − r)².
Suppose we throw a die once and let X be the number that comes up; then we can calculate the variance V(X) as follows -
Step 1: First we find the expected value of rolling a die.
The possible values are 1, 2, 3, 4, 5, 6.
So, the expected value of X is
r = E(X) = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6),
= 21/6 = 7/2
Step 2: Now we calculate the squared deviation (x − 7/2)² for each value, using the following table -
x m(x) (x – 7/2)2
1 1/6 25/4
2 1/6 9/4
3 1/6 1/4
4 1/6 1/4
5 1/6 9/4
6 1/6 25/4
Step 3: Using the above table, we can calculate the variance of the discrete random variable -
V(X) = ∑ m(x)(x − 7/2)²,
= 1/6[(25/4) + (9/4) + (1/4) + (1/4) + (9/4) + (25/4)] = 70/24.
So, the variance of this discrete random variable equals 35/12.
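The three steps above can be sketched in a few lines of Python:

```python
# Variance of a discrete random variable:
# V(X) = sum of m(x) * (x - r)^2, here for one roll of a fair die
values = [1, 2, 3, 4, 5, 6]
probs = [1/6] * 6

r = sum(x * p for x, p in zip(values, probs))                  # expected value, 7/2
variance = sum(p * (x - r) ** 2 for x, p in zip(values, probs))
print(variance)  # 35/12 ≈ 2.9167
```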
Geometric Probability Distribution
Statistically, geometric probability refers to the geometric probability distribution. For example, suppose you flip a coin repeatedly: what is the probability that the first head appears on the third trial? This kind of probability is called a geometric probability and is written symbolically as G(x; p), where x is the number of the trial on which the first success occurs and p is the probability of success on a single trial. The standard formula for this distribution is as follows: if an experiment is repeated until the first successful outcome, the probability of success on each trial is p, and q = 1 − p, then G(x; p) = p · q^(x − 1). For the coin example, G(3; 1/2) = (1/2)(1/2)² = 1/8.
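The formula G(x; p) = p · q^(x − 1) translates directly into code; the function name is illustrative:

```python
def geometric_pmf(x, p):
    """P(first success occurs on trial x): p * (1 - p)**(x - 1)."""
    return p * (1 - p) ** (x - 1)

# Probability that the first head appears on the third coin flip
print(geometric_pmf(3, 0.5))  # 0.125
```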