(probability theory:)
Probability-speak for a measurable function on a probability space (i.e. a measure space of total measure 1).

Random variables formalise the idea of the result of an experiment. You have a system with a random element (your probability space); you perform a measurement. The result is a deterministic function of the state of your system, but the state itself is random. So the result is a random variable.

Examples:

  1. The number of heads in a sequence of 163 coin tosses.
  2. The average value of the results from rolling 17 dice (you'd expect this to be roughly 3.5, but since you're dividing by the odd number 17 you'll never get exactly 3.5: the total is a whole number, and 17 × 3.5 = 59.5 isn't). Both examples are simulated in the sketch below.
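
For the concretely minded, here is a minimal Python sketch of both examples (my own illustration, not part of the original writeup; the counts 163 and 17 are just the numbers above):

    import random

    random.seed(0)  # fixed seed so the run is reproducible

    # Example 1: number of heads in 163 coin tosses
    heads = sum(random.randint(0, 1) for _ in range(163))
    print("heads out of 163 tosses:", heads)

    # Example 2: average of 17 die rolls -- always a multiple of 1/17, so never exactly 3.5
    rolls = [random.randint(1, 6) for _ in range(17)]
    print("average of 17 dice:", sum(rolls) / 17)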

In the language of statistics, a random variable is a quantity which can take any of a particular range of values, each with some probability (not necessarily the same probability for each). For example, the mark of a first-year statistics student could be anything from 0 to 100%, but it is most likely to lie within two standard deviations of the mean mark for the class.

The roll of a single die is also a random variable, just one in which every side has an equal probability of coming up (a uniform distribution). Random variables are often displayed on a histogram.
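
A quick way to see such a histogram, sketched in Python (my own illustration; the class size of 200 and the mean of 62% with standard deviation 12% are invented numbers, not from the writeup):

    import random
    from collections import Counter

    random.seed(1)  # reproducible fake data
    # 200 hypothetical exam marks, roughly normal with mean 62% and standard deviation 12%
    marks = [random.gauss(62, 12) for _ in range(200)]

    # Crude text histogram in 10% bins
    bins = Counter(min(max(int(m) // 10 * 10, 0), 90) for m in marks)
    for lo in range(0, 100, 10):
        print(f"{lo:2d}-{lo + 9:2d}%: " + "#" * bins.get(lo, 0))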

A random variable X is a function from Ω (the sample space) to the set of real numbers. If Ω is finite or countable, then X is a discrete random variable; if not, X is (usually) a continuous random variable. This is a really wonderful definition, as it shows that random variables aren't variables at all, and they aren't random either.

A discrete random variable is determined by its probability mass function f (often loosely called its distribution). This is a function such that f(k) is the probability that X takes the value k (usually written P(X=k)).
f must have the obvious properties: f is non-negative, and Σ f(k) = 1, with k ranging over the range of X.
The expectation of X (also called the mean) is E(X) = Σ k·P(X=k). Note that the mean might be infinite. We also define the variance to be σ² = E((X - E(X))²).
Before evaluating this, you may wish to convince yourself that expectation is linear: if a, b are constants and X, Y are random variables, then E(aX + bY) = a·E(X) + b·E(Y) (provided all these expectations exist, of course). Also E(a) = a.

 σ² = E(X² - 2X·E(X) + (E(X))²)
    = E(X²) - 2(E(X))² + (E(X))²   (by linearity of expectation)
    = E(X²) - (E(X))², which is a useful formula for calculating variances.
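
As a numerical sanity check (my own sketch, not part of the original), here are both forms of the variance computed for a fair six-sided die:

    # Fair six-sided die: P(X=k) = 1/6 for k = 1..6
    values = range(1, 7)
    p = 1 / 6

    EX = sum(k * p for k in values)        # E(X) = 3.5
    EX2 = sum(k * k * p for k in values)   # E(X^2)

    var_direct = sum((k - EX) ** 2 * p for k in values)  # E((X - E(X))^2)
    var_formula = EX2 - EX ** 2                          # E(X^2) - (E(X))^2

    print(EX, var_direct, var_formula)     # 3.5, then 2.9166... twice
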
Finally, an example: the Poisson distribution. Here we have P(X=k) = e^(-λ)·λ^k/k!, where λ > 0 is the parameter of the distribution and k ranges over the non-negative integers (k = 0, 1, 2, ...).
Then E(X) = Σ_{k≥0} k·e^(-λ)·λ^k/k!
          = e^(-λ) Σ_{k≥1} λ^k/(k-1)!
          = λ·e^(-λ) Σ_{k≥1} λ^(k-1)/(k-1)!
          = λ·e^(-λ) Σ_{k≥0} λ^k/k!
          = λ·e^(-λ)·e^λ
          = λ

Thus the Poisson distribution has mean λ. You may wish to check for yourselves that its variance is also λ.
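
If you would rather see this numerically than redo the sums, here is a small Python sketch of my own (λ = 4 is an arbitrary choice) checking that the mean and variance both come out as λ:

    import math

    lam = 4.0  # the parameter λ (arbitrary choice for the check)

    # Build P(X=k) = e^(-λ) λ^k / k! iteratively; the tail beyond k = 60 is negligible for λ = 4
    pmf = [math.exp(-lam)]
    for k in range(1, 60):
        pmf.append(pmf[-1] * lam / k)

    mean = sum(k * p for k, p in enumerate(pmf))
    var = sum((k - mean) ** 2 * p for k, p in enumerate(pmf))
    print(mean, var)  # both very close to 4.0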

As stated previously, continuous random variables are what happens when we have an uncountable sample space. It seems natural to replace the sum by an integral. The distribution function F(x) is defined to be P(X ≤ x) = ∫_{-∞}^{x} f(t) dt, where f is the probability density function (usually abbreviated to pdf). For all this to work we need the obvious properties:

  • f is non-negative
  • ∫_{-∞}^{∞} f(x) dx = 1

It is natural enough to define the mean and variance in a similar way: E(X) = ∫_{-∞}^{∞} x·f(x) dx, and the definition of the variance is the same as before. Linearity of expectation follows from linearity of the integral. As before, the expectation may not exist. Note that discrete random variables fit into the same picture: their distribution function F(x) = P(X ≤ x) is simply a step function (although then there is no density in the usual sense).
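
As a concrete continuous example (my own illustration; the exponential density is not discussed in the writeup), the density f(x) = e^(-x) for x ≥ 0 integrates to 1 and has mean 1 and variance 1, which a crude numerical integration confirms:

    import math

    # Exponential density with rate 1: f(x) = e^(-x) for x >= 0, and 0 otherwise
    def f(x):
        return math.exp(-x)

    # Crude Riemann sum on [0, 50] with a small step; the tail beyond 50 is negligible
    dx = 0.001
    xs = [i * dx for i in range(int(50 / dx))]

    total = sum(f(x) * dx for x in xs)                  # integral of f(x) dx     ≈ 1
    mean = sum(x * f(x) * dx for x in xs)               # integral of x f(x) dx   ≈ 1
    var = sum((x - mean) ** 2 * f(x) * dx for x in xs)  # E((X - E(X))^2)         ≈ 1

    print(total, mean, var)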

To round things off, another example: the Cauchy distribution. (Yes, this is the same Cauchy as the Cauchy-Argand diagram, Cauchy's Inequality, the Cauchy integral formula etc... Nosy little bugger couldn't keep his nose out of any area of maths.) The density function of the Cauchy distribution is f(x) = a/(π(x² + a²)), where a is a positive real parameter.

∫_{-∞}^{∞} f(x) dx
= (1/(a²π)) ∫_{-∞}^{∞} a·dx/(1 + (x/a)²)
= (1/π) ∫_{-∞}^{∞} dy/(1 + y²)   by the change of variables y = x/a
= (1/π) [arctan y]_{-∞}^{∞}
= (1/π)(π/2 - (-π/2))
= 1
So f satisfies the rules to be a pdf. The fun starts if you try to calculate the mean:

∫_{-∞}^{∞} x·f(x) dx = (1/(2π)) ∫_{-∞}^{∞} 2a·x dx/(x² + a²) = (a/(2π)) [ln(x² + a²)]_{-∞}^{∞}

which, as you can see, does not have a limit (both tails diverge, so the integral is undefined). Hooray!! A distribution with an undefined mean!
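
To watch this misbehave in practice, here is a sketch of my own (a = 1, seed and sample sizes arbitrary) that tracks the running mean of Cauchy samples; unlike a well-behaved distribution, it never settles down:

    import math
    import random

    random.seed(2)
    a = 1.0  # the scale parameter (arbitrary choice)

    def cauchy():
        # Inverse-CDF sampling: a Cauchy(0, a) draw is a * tan(pi * (U - 1/2)) for U uniform on (0, 1)
        return a * math.tan(math.pi * (random.random() - 0.5))

    total = 0.0
    for n in range(1, 1_000_001):
        total += cauchy()
        if n in (10, 100, 1_000, 10_000, 100_000, 1_000_000):
            print(n, total / n)  # the running mean keeps jumping around instead of converging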


Source: What I can remember of my Part IA Probability course, which I have an exam on in 2 days.
