Appendix A

Probability Review

A.1 Standard Probability Theory

A.1.1 Probability Space

A probability space (Ω, ℱ, ℙ) consists of:

  • A set of all possible outcomes ω ∈ Ω, sometimes called states of Nature
  • A σ-algebra ℱ, that is, a set of measurable events A ∈ ℱ, which (1) contains ∅, (2) is closed under complementation (A ∈ ℱ ⇒ Ω∖A ∈ ℱ), and (3) is closed under countable unions (A_1, A_2, … ∈ ℱ ⇒ ⋃_n A_n ∈ ℱ)
  • A probability measure ℙ: ℱ → [0, 1], which (1) satisfies ℙ(Ω) = 1 and (2) is countably additive (ℙ(⨆_n A_n) = Σ_n ℙ(A_n), where ⨆ denotes disjoint union)
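
As a concrete illustration of these three ingredients, the sketch below (a hypothetical example, not from the text) builds the finite probability space of one fair die in Python, taking the power set as σ-algebra and the uniform measure, and checks the axioms:

```python
from itertools import chain, combinations
from fractions import Fraction

# Hypothetical example: the probability space of one fair die.
omega = frozenset(range(1, 7))

def power_set(s):
    """All subsets of s: the largest sigma-algebra on a finite Omega."""
    items = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

def P(event):
    """Uniform probability measure: |A| / |Omega|."""
    return Fraction(len(event), len(omega))

sigma_algebra = power_set(omega)

# (1) contains the empty set; (2) closed under complementation;
# (3) closed under unions (finite, hence here countable).
assert frozenset() in sigma_algebra
assert all(omega - A in sigma_algebra for A in sigma_algebra)
even, low = frozenset({2, 4, 6}), frozenset({1, 2})
assert even | low in sigma_algebra

# P(Omega) = 1 and additivity on disjoint events.
assert P(omega) == 1
assert P(even | frozenset({1})) == P(even) + P(frozenset({1}))
```

On a finite Ω the power set is always a valid σ-algebra; coarser σ-algebras model partial information, as in Section A.1.2.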

A.1.2 Filtered Probability Space

A filtered probability space (Ω, ℱ, (ℱ_t), ℙ) is a probability space equipped with a filtration (ℱ_t), which is an increasing sequence of σ-algebras (for any t ≤ t′: ℱ_t ⊆ ℱ_t′). Informally the filtration represents "information" garnered through time.

A.1.3 Independence

Two events A, B ∈ ℱ are said to be independent whenever their joint probability is the product of their individual probabilities:

  ℙ(A ∩ B) = ℙ(A) ℙ(B)
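
The product rule can be verified exhaustively on a small example. The sketch below (hypothetical, assuming two fair dice) checks independence of an event about each die:

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: two fair dice; Omega is all 36 ordered pairs.
omega = list(product(range(1, 7), repeat=2))

def P(event):
    """Probability of an event given as a predicate on outcomes."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0          # first die shows an even number
B = lambda w: w[1] <= 2              # second die shows 1 or 2
A_and_B = lambda w: A(w) and B(w)

# The joint probability factorizes, so A and B are independent.
assert P(A_and_B) == P(A) * P(B)
print(P(A), P(B), P(A_and_B))        # 1/2 1/3 1/6
```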

A.2 Random Variables, Distribution, and Independence

A.2.1 Random Variables

A random variable is a function X: Ω → ℝ mapping every outcome to a real number, such that the event {X ≤ x} ∈ ℱ for all x ∈ ℝ. The notation X ∈ ℱ is often used to indicate that X satisfies the requirements for a random variable with respect to the σ-algebra ℱ.

The cumulative distribution function of X is then F_X(x) = ℙ(X ≤ x), which is always defined. In most practical applications the probability mass function p_X(x) = ℙ(X = x) or density function f_X = F_X′ contains all the useful information about X.

The mathematical expectation of X, if it exists, is then:

  • In general: E(X) = ∫_Ω X(ω) dℙ(ω);
  • For discrete random variables: E(X) = Σ_x x ℙ(X = x);
  • For random variables with density f_X: E(X) = ∫_ℝ x f_X(x) dx.
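
The discrete and density cases can be illustrated numerically; the sketch below (a hypothetical example) computes the mean of a fair die exactly and approximates the mean of a standard normal by a Riemann sum:

```python
import math
from fractions import Fraction

# Hypothetical examples. Discrete case: E(X) for a fair die.
die = {x: Fraction(1, 6) for x in range(1, 7)}
e_discrete = sum(x * p for x, p in die.items())

# Density case: E(X) for a standard normal, via a Riemann sum of
# x * f_X(x) over [-8, 8] (true mean 0; the tails are negligible).
f = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
n = 200_000
grid = (-8 + 16 * k / n for k in range(n + 1))
e_density = sum(x * f(x) * (16 / n) for x in grid)

print(e_discrete)              # 7/2
print(abs(e_density) < 1e-6)   # True
```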

The law of the unconscious statistician states that if X has density f_X, the expectation of an arbitrary function g(X) is given by the inner product of f_X and g:

  E(g(X)) = ∫_ℝ g(x) f_X(x) dx

if it exists.

The variance of X, if it exists, is defined as V(X) = E[(X − E(X))²] = E(X²) − E(X)², and its standard deviation as σ_X = √V(X).
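
Both formulas can be checked together in a short numerical sketch (hypothetical example): a midpoint Riemann sum approximates E(g(X)) for g(x) = x² under the standard normal density, which equals V(X) = 1 here since E(X) = 0:

```python
import math

# Hypothetical sketch: midpoint Riemann sum for E[g(X)] with g(x) = x^2
# and X standard normal. True value: E(X^2) = 1 = V(X), since E(X) = 0.
f = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
n, lo, hi = 400_000, -10.0, 10.0
dx = (hi - lo) / n
mid = lambda k: lo + (k + 0.5) * dx          # midpoint of the k-th cell
e_x2 = sum(mid(k) ** 2 * f(mid(k)) * dx for k in range(n))

print(abs(e_x2 - 1.0) < 1e-6)   # True
```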

A.2.2 Joint Distribution and Independence

Given n random variables X1,…, Xn, their joint cumulative distribution function is:

  F(x_1, …, x_n) = ℙ(X_1 ≤ x_1, …, X_n ≤ x_n)

and each individual cumulative distribution function F_{X_i}(x_i) = ℙ(X_i ≤ x_i) is then called "marginal."

The n random variables are said to be independent whenever the joint cumulative distribution function of any subset is equal to the product of the marginal cumulative distribution functions:

  ℙ(X_{i_1} ≤ x_{i_1}, …, X_{i_k} ≤ x_{i_k}) = F_{X_{i_1}}(x_{i_1}) ⋯ F_{X_{i_k}}(x_{i_k}) for any subset {i_1, …, i_k} ⊆ {1, …, n}

The covariance between two random variables X, Y is given as:

  cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X) E(Y)

and their correlation coefficient is defined as ρ = cov(X, Y)/(σ_X σ_Y). If X, Y are independent, then their covariance and correlation are zero, but the converse is not true. If ρ = ±1 then Y = aX + b with probability 1, for some constants a ≠ 0 and b.
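
The "converse is not true" claim admits a tiny exact counterexample, sketched below (hypothetical): X uniform on {−1, 0, 1} and Y = X² have zero covariance although Y is a deterministic function of X:

```python
from fractions import Fraction

# Hypothetical counterexample: X uniform on {-1, 0, 1}, Y = X^2.
support = [-1, 0, 1]
p = Fraction(1, 3)
E = lambda g: sum(p * g(x) for x in support)     # expectation operator
EX, EY = E(lambda x: x), E(lambda x: x * x)      # 0 and 2/3
cov = E(lambda x: (x - EX) * (x * x - EY))

print(cov)   # 0
# Yet X, Y are dependent: P(X=0, Y=0) = 1/3 while P(X=0) P(Y=0) = 1/9.
```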

The variance of the sum of n random variables X1,…, Xn is:

  V(X_1 + ⋯ + X_n) = Σ_{i=1}^n V(X_i) + 2 Σ_{i&lt;j} cov(X_i, X_j)

If X, Y are independent with densities fX, fY, the density of their sum X + Y is given by the convolution of marginal densities:

  f_{X+Y}(z) = ∫_ℝ f_X(x) f_Y(z − x) dx
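
A quick numerical sketch (hypothetical example): for X, Y independent Uniform(0, 1), the convolution yields the triangular density of the sum on [0, 2], with peak value 1 at z = 1:

```python
# Hypothetical example: X, Y independent Uniform(0, 1); the density of
# X + Y is the convolution of the marginals (a triangle on [0, 2]).
def f_uniform(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(z, n=100_000):
    """Riemann-sum approximation of the convolution (f_X * f_Y)(z)."""
    dx = 1.0 / n
    return sum(f_uniform(x) * f_uniform(z - x) * dx
               for x in (k * dx for k in range(n)))

print(round(f_sum(0.5), 3))   # 0.5
print(round(f_sum(1.0), 3))   # 1.0
```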

A.3 Conditioning

Conditioning is a method to recalculate probabilities using known information. For example, in French roulette the initial Ω is {0, 1, …, 36}, but once the ball falls into a colored pocket we can eliminate several possibilities even as the wheel is still spinning.

The conditional probability of an event A given B is defined as:

  ℙ(A|B) = ℙ(A ∩ B) / ℙ(B), provided ℙ(B) &gt; 0

Note that ℙ(A|B) = ℙ(A) if A, B are independent.

This straightforwardly leads to the conditional expectation of a random variable X given an event B:

  E(X|B) = E(X · 1_B) / ℙ(B)

where 1_B denotes the indicator of B.
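
A minimal sketch of this ratio (hypothetical example: a fair die X and the event B = {X is even}):

```python
from fractions import Fraction

# Hypothetical example: E(X | B) = E(X * 1_B) / P(B) for a fair die X
# and the event B = {X is even}.
omega = range(1, 7)
p = Fraction(1, 6)                                # each outcome's weight
P_B = sum(p for x in omega if x % 2 == 0)         # P(B) = 1/2
E_X_1B = sum(p * x for x in omega if x % 2 == 0)  # E(X * 1_B) = 2
print(E_X_1B / P_B)                               # 4
```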

Generally, the conditional expectation of X given a σ-algebra 𝒢 ⊆ ℱ can be defined as the 𝒢-measurable random variable Y such that:

  E(Y · 1_A) = E(X · 1_A) for all A ∈ 𝒢

or equivalently: E(YZ) = E(XZ) for every bounded 𝒢-measurable Z. The random variable Y = E(X|𝒢) can be shown to exist and to be unique with probability 1.

The conditional expectation operator shares the usual properties of unconditional expectation (linearity; if X ≤ Y then E(X|𝒢) ≤ E(Y|𝒢); Jensen's inequality; etc.) and also has the following specific properties:

  • If X ∈ 𝒢 then E(X|𝒢) = X
  • If X ∈ 𝒢 and Y is arbitrary then E(XY|𝒢) = X E(Y|𝒢)
  • Iterated expectations: if 𝒢_1 ⊆ 𝒢_2 are σ-algebras then E(E(X|𝒢_2)|𝒢_1) = E(X|𝒢_1). In particular E(E(X|𝒢)) = E(X)
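
The iterated-expectations property can be verified exactly on two fair dice; the sketch below (hypothetical example) conditions the sum on the first die and then averages again:

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: two fair dice, S = X1 + X2, conditioning on X1.
omega = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)

E_S = sum(p * (w[0] + w[1]) for w in omega)      # unconditional: 7

# E(S | X1 = x1) = x1 + 7/2 -- a random variable determined by X1.
cond = {x1: Fraction(sum(x1 + x2 for x2 in range(1, 7)), 6)
        for x1 in range(1, 7)}
# Averaging the conditional expectation recovers E(S).
E_cond = sum(Fraction(1, 6) * cond[x1] for x1 in range(1, 7))

print(E_S, E_cond)   # 7 7
```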

A.4 Random Processes and Stochastic Calculus

A random process, or stochastic process, is a sequence (X_t) of random variables. When X_t ∈ ℱ_t for all t, the process is said to be (ℱ_t)-adapted.

The process (X_t) is called a martingale whenever for all t &lt; t′: E(X_t′ | ℱ_t) = X_t.

The process (X_t) is said to be predictable whenever for all t: X_t ∈ ℱ_{t−} (X_t is knowable prior to t).

The path of a process (X_t) for a given outcome ω is the function t ↦ X_t(ω).

A standard Brownian motion or Wiener process (W_t) is a stochastic process with continuous paths that satisfies:

  • W0 = 0
  • For all t &lt; t′ the increment W_t′ − W_t follows a normal distribution with zero mean and standard deviation √(t′ − t)
  • Any finite set of increments over nonoverlapping time intervals is independent.
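
These three properties suggest a direct simulation scheme: accumulate independent normal increments with standard deviation √dt. The sketch below (hypothetical, with loose statistical tolerances on seeded pseudo-random draws) checks the mean and variance of W_1:

```python
import random

# Hypothetical simulation; tolerances are loose to absorb sampling error.
random.seed(42)

def brownian_path(n_steps, dt):
    """One path of W on a grid: independent N(0, sqrt(dt)) increments."""
    w, path = 0.0, [0.0]
    for _ in range(n_steps):
        w += random.gauss(0.0, dt ** 0.5)
        path.append(w)
    return path

# Distribution of W_1 across many paths: mean ~ 0, variance ~ 1.
terminal = [brownian_path(20, 0.05)[-1] for _ in range(20_000)]
mean = sum(terminal) / len(terminal)
var = sum((w - mean) ** 2 for w in terminal) / len(terminal)
print(abs(mean) < 0.05, abs(var - 1.0) < 0.1)   # True True
```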

An Ito process (X_t) is defined by the stochastic differential equation:

  dX_t = a_t dt + b_t dW_t

where W is a standard Brownian motion, (at) is a predictable and integrable process, and (bt) is a predictable and square-integrable process.

The Ito-Doeblin theorem states that for a C² function f, the process (f(X_t)) is also an Ito process, with stochastic differential equation:

  df(X_t) = f′(X_t) dX_t + ½ f″(X_t) b_t² dt = (a_t f′(X_t) + ½ b_t² f″(X_t)) dt + b_t f′(X_t) dW_t
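
The formula can be checked pathwise by simulation for f(x) = x² applied to X = W (so a_t = 0, b_t = 1), where it reads d(W_t²) = 2 W_t dW_t + dt. The sketch below (hypothetical, with a loose tolerance for discretization and sampling error) accumulates the right-hand side along one simulated path:

```python
import random

# Hypothetical pathwise check on one simulated path; the tolerance is
# loose to absorb discretization and sampling error.
random.seed(1)
n, T = 1_000, 1.0
dt = T / n
w, rhs = 0.0, 0.0
for _ in range(n):
    dw = random.gauss(0.0, dt ** 0.5)
    rhs += 2.0 * w * dw + dt      # f'(W) dW + (1/2) f''(W) dt, f(x) = x^2
    w += dw

# Left-hand side f(W_T) - f(W_0) = W_T^2 should nearly match.
print(abs(rhs - w * w) < 0.3)     # True
```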