Chapter 8

Preliminaries

Abstract

In this chapter we review some basic concepts from stochastic analysis. We begin with the properties of stochastic processes before defining the stochastic integral in the sense of Itô. The third section is concerned with the chain rule of stochastic calculus, known as Itô's formula. Finally, we present classical as well as more recent results on stochastic ordinary differential equations. These will be used in the finite dimensional approximation of stochastic PDEs in Chapter 10.

Keywords

Stochastic analysis; Stochastic processes; Stochastic integration; Stochastic differential equations; Itô's formula

8.1 Stochastic processes

We consider random variables on a probability space $(\Omega,\mathcal{F},\mathbb{P})$. Let $(\mathcal{F}_t)_{t\geq 0}$ be a filtration such that $\mathcal{F}_s\subset\mathcal{F}_t\subset\mathcal{F}$ for $0\leq s\leq t<\infty$. A real-valued stochastic process is a family of random variables $X=(X_t)_{t\geq 0}$ on $(\Omega,\mathcal{F})$ with values in $(\mathbb{R},\mathcal{B}(\mathbb{R}))$. A stochastic process can be viewed as a function of $t$ and $\omega$, where $t$ is interpreted as time. For fixed $\omega\in\Omega$ the mapping $t\mapsto X_t(\omega)$ is called the path or trajectory of $X$. We follow the presentation in [100], where the interested reader may also find details of the proofs.

Definition 8.1.1

A stochastic process is called measurable, if the mapping

$$(t,\omega)\mapsto X_t(\omega):\ \big([0,\infty)\times\Omega,\,\mathcal{B}([0,\infty))\otimes\mathcal{F}\big)\to\big(\mathbb{R},\mathcal{B}(\mathbb{R})\big)$$

is measurable.

Definition 8.1.2

A stochastic process is called adapted to the filtration $(\mathcal{F}_t)_{t\geq 0}$, if the mapping

$$\omega\mapsto X_t(\omega):\ (\Omega,\mathcal{F}_t)\to\big(\mathbb{R},\mathcal{B}(\mathbb{R})\big)$$

is measurable for all $t\geq 0$.

Definition 8.1.3

A stochastic process is called progressively measurable, if the mapping

$$(s,\omega)\mapsto X_s(\omega):\ \big([0,t]\times\Omega,\,\mathcal{B}([0,t])\otimes\mathcal{F}_t\big)\to\big(\mathbb{R},\mathcal{B}(\mathbb{R})\big)$$

is measurable for all $t\geq 0$.

Remark 8.1.15

Progressive measurability implies adaptedness.

Theorem 8.1.29

If a stochastic process $X$ is adapted to the filtration $(\mathcal{F}_t)_{t\geq 0}$ and a.e. path is left-continuous or right-continuous, then $X$ is progressively measurable.

The most important example is the Wiener process.

Definition 8.1.4

Wiener process

A Wiener process is a real-valued stochastic process $W=(W_t)_{t\geq 0}$ with the following properties.

i) The increments of $W$ are independent, i.e. for arbitrary $0\leq t_0<t_1<\cdots<t_n$ the random variables $W_{t_1}-W_{t_0},\,W_{t_2}-W_{t_1},\ldots,W_{t_n}-W_{t_{n-1}}$ are independent.

ii) For all $t>s\geq 0$ we have $W_t-W_s\sim N(0,t-s)$.

iii) There holds $W_0=0$ almost surely.

iv) The mapping $t\mapsto W_t(\omega)$ is continuous for a.e. $\omega\in\Omega$.
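Properties i)–iv) translate directly into a simulation recipe: sample independent $N(0,\Delta t)$ increments over a time grid and accumulate them, starting from zero. The following sketch is our own illustration in Python (NumPy and the uniform grid are our choices and are not part of the text).

```python
import numpy as np

def wiener_paths(T=1.0, n_steps=1000, n_paths=5, seed=0):
    """Simulate approximate Wiener paths on [0, T] from i.i.d. Gaussian increments."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    # i) + ii): independent increments W_{t_{k+1}} - W_{t_k} ~ N(0, dt)
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    # iii): W_0 = 0; iv): linear interpolation of the cumulative sums gives continuous paths
    W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])
    t = np.linspace(0.0, T, n_steps + 1)
    return t, W

t, W = wiener_paths()
print(W[:, -1])  # samples of W_T, approximately N(0, T)-distributed
```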

Definition 8.1.5

A filtration $(\mathcal{F}_t)_{t\geq 0}$ is called right-continuous, if

$$\mathcal{F}_t=\bigcap_{\varepsilon>0}\mathcal{F}_{t+\varepsilon}\quad\text{for all }t\geq 0,$$

and left-continuous, if

$$\mathcal{F}_t=\sigma\Big(\bigcup_{s<t}\mathcal{F}_s\Big)\quad\text{for all }t>0.$$

Definition 8.1.6

A filtration $(\mathcal{F}_t)_{t\geq 0}$ satisfies the usual conditions, if it is right-continuous and $\mathcal{F}_0$ contains all the $\mathbb{P}$-nullsets of $\mathcal{F}$.

Definition 8.1.7

Let $X$ be a stochastic process and $T$ an $\mathcal{F}$-measurable random variable with values in $[0,\infty]$. We define $X_T$ on $\{T<\infty\}$ by

$$X_T(\omega)=X_{T(\omega)}(\omega).$$

If $X_\infty(\omega)=\lim_{t\to\infty}X_t(\omega)$ exists for a.e. $\omega\in\Omega$, we set $X_T(\omega)=X_\infty(\omega)$ on $\{T=\infty\}$.

Definition 8.1.8

Let $X$ be a stochastic process on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ with filtration $(\mathcal{F}_t)_{t\geq 0}$. A random variable $T$ is called a stopping time if the set $\{T\leq t\}$ belongs to $\mathcal{F}_t$ for all $t\geq 0$.

The $\sigma$-algebra $\mathcal{F}_T$ induced by $T$ is given by

$$\mathcal{F}_T:=\big\{A\in\mathcal{F}:\ A\cap\{T\leq t\}\in\mathcal{F}_t\ \text{for all }t\geq 0\big\}.$$

Lemma 8.1.1

Let $X$ be a progressively $(\mathcal{F}_t)_{t\geq 0}$-measurable stochastic process and let $f:[0,\infty)\times\mathbb{R}\to\mathbb{R}$ be $\mathcal{B}([0,\infty))\otimes\mathcal{B}(\mathbb{R})$-measurable. The process

$$Y_t:=\int_0^t f(s,X_s)\,\mathrm{d}s,\quad t\geq 0,$$

is progressively $(\mathcal{F}_t)_{t\geq 0}$-measurable. If $T$ is a stopping time, then $Y_T$ is an $\mathcal{F}_T$-measurable random variable.

Definition 8.1.9

Let $M=(M_t)_{t\geq 0}$ be an $(\mathcal{F}_t)_{t\geq 0}$-adapted stochastic process with $\mathbb{E}[|M_t|]<\infty$ for all $t\geq 0$. $M$ is called a sub-martingale (super-martingale, respectively) if for all $0\leq s\leq t<\infty$ we have $\mathbb{P}$-a.s. $\mathbb{E}[M_t\,|\,\mathcal{F}_s]\geq M_s$ ($\mathbb{E}[M_t\,|\,\mathcal{F}_s]\leq M_s$, respectively). $M$ is called a martingale if it is both a sub-martingale and a super-martingale.

Definition 8.1.10

Let $A$ be an adapted stochastic process. $A$ is called increasing, if we have for $\mathbb{P}$-a.e. $\omega\in\Omega$

i) $A_0=0$;

ii) $t\mapsto A_t(\omega)$ is increasing and right-continuous;

iii) $\mathbb{E}[A_t]<\infty$ for all $t\in[0,\infty)$.

An increasing process is called integrable if $\mathbb{E}[A_\infty]<\infty$, where $A_\infty(\omega)=\lim_{t\to\infty}A_t(\omega)$ for $\omega\in\Omega$.

Theorem 8.1.30

Doob–Meyer-decomposition

Let $X$ be a positive submartingale with a.s. continuous trajectories. Then there is a continuous martingale $M$ and an a.s. increasing, continuous and adapted process $A$, such that

$$X_t^2=M_t+A_t.$$

The decomposition is unique.

Definition 8.1.11

Let $X$ be a right-continuous martingale. We call $X$ quadratically integrable, if $\mathbb{E}[X_t^2]<\infty$ for all $t\geq 0$. If we have in addition that $X_0=0$ $\mathbb{P}$-a.s., we write $X\in\mathcal{M}_2$, or $X\in\mathcal{M}_2^c$ if $X$ is continuous.

Definition 8.1.12

For $X\in\mathcal{M}_2$ and $0\leq t<\infty$ we define

$$\|X\|_t:=\sqrt{\mathbb{E}[X_t^2]}.$$

We set

$$\|X\|:=\sum_{n=1}^\infty\frac{\|X\|_n\wedge 1}{2^n},$$

which induces a metric $d$ on $\mathcal{M}_2$ given by

$$d(X,Y)=\|X-Y\|.$$

Definition 8.1.13

Total variation

A function $f:[0,t]\to\mathbb{R}$ is called of bounded variation if there is $M>0$ such that $\sum_{i=1}^n|f(x_i)-f(x_{i-1})|\leq M$ for all finite partitions $\Pi=\{x_0,x_1,\ldots,x_n\}\subset[0,t]$ ($n\in\mathbb{N}$) with $0=x_0<x_1<\cdots<x_n=t$. The quantity

$$V(f):=\sup\Big\{\sum_{i=1}^n|f(x_i)-f(x_{i-1})|:\ 0=x_0<x_1<\cdots<x_n=t,\ n\in\mathbb{N}\Big\}$$

is called the total variation of $f$ over $[0,t]$.

Definition 8.1.14

Quadratic variation

For $X\in\mathcal{M}_2$ we define the quadratic variation of $X$ as the process $\langle X\rangle_t:=A_t$, where $A$ is the increasing process from the Doob–Meyer decomposition of $X$.

Definition 8.1.15

Covariation

For $X,Y\in\mathcal{M}_2$ we define the covariation $\langle X,Y\rangle$ by

$$\langle X,Y\rangle_t:=\tfrac14\big[\langle X+Y\rangle_t-\langle X-Y\rangle_t\big],$$

for $t\geq 0$. The process $XY-\langle X,Y\rangle$ is a martingale. In particular, we have $\langle X,X\rangle=\langle X\rangle$.

Definition 8.1.16

p-th Variation

Let $X$ be a stochastic process, $p\geq 1$, $t>0$ fixed and $\Pi=\{t_0,t_1,\ldots,t_n\}$, with $0=t_0<t_1<\cdots<t_n=t$, $n\in\mathbb{N}$, a partition of $[0,t]$. The $p$-th variation of $X$ over $\Pi$ is defined by

$$V_t^{(p)}(\Pi)=\sum_{k=1}^n|X_{t_k}-X_{t_{k-1}}|^p.$$

Theorem 8.1.31

Let $X\in\mathcal{M}_2^c$, let $\Pi$ be a partition of $[0,t]$ and let $\|\Pi\|:=\max_{1\leq k\leq n}|t_k-t_{k-1}|$ denote the mesh size of $\Pi$. Then we have $\lim_{\|\Pi\|\to 0}V_t^{(2)}(\Pi)=\langle X\rangle_t$ in probability, i.e. for all $\varepsilon>0$, $\eta>0$ there is $\delta>0$, such that

$$\mathbb{P}\big(|V_t^{(2)}(\Pi)-\langle X\rangle_t|>\varepsilon\big)<\eta,$$

for $\|\Pi\|<\delta$.
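For $X=W$ one has $\langle W\rangle_t=t$, so Theorem 8.1.31 predicts that $V_t^{(2)}(\Pi)\to t$ as the mesh shrinks. The following sketch is our own illustration (not part of the text); it evaluates $V_T^{(2)}(\Pi)$ for simulated Wiener increments over uniform partitions of $[0,1]$.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1.0

for n in (10, 100, 1000, 10000):
    # increments of a Wiener path over a uniform partition with mesh T/n
    dW = rng.normal(0.0, np.sqrt(T / n), size=n)
    V2 = np.sum(dW**2)          # V_T^(2)(Pi) = sum_k |W_{t_k} - W_{t_{k-1}}|^2
    print(f"mesh {T/n:.0e}:  V_T^(2) = {V2:.4f}   (target <W>_T = {T})")
```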

8.2 Stochastic integration

The aim of this section is to define stochastic integrals of the form

$$I_T(X)=\int_0^T X_t(\omega)\,\mathrm{d}M_t(\omega). \tag{8.2.1}$$

Here $M$ is a square-integrable martingale, $X$ a stochastic process and $T>0$. Throughout the section we assume that $M_0=0$ $\mathbb{P}$-a.s. Moreover, we suppose that $M$ is a quadratically integrable $(\mathcal{F}_t)_{t\geq 0}$-adapted martingale, where $(\mathcal{F}_t)_{t\geq 0}$ is a filtration which satisfies the usual conditions (see Definition 8.1.6). A process $M\in\mathcal{M}_2^c$ can be of unbounded variation on every finite subinterval of $[0,T]$. Hence integrals of the form (8.2.1) cannot be defined pointwise in $\omega\in\Omega$. However, $M$ has finite quadratic variation, given by the continuous and increasing process $\langle M\rangle$ (see Theorem 8.1.31). Due to this fact, the stochastic integral can be defined with respect to continuous square-integrable martingales $M$ for an appropriate class of integrands $X$.

The definition of the stochastic integral goes back to Itô. He studied the case where $M$ is a Wiener process. His students Kunita and Watanabe considered the general case $M\in\mathcal{M}_2^c$. In the following we have a look at the class of integrands which are allowed in (8.2.1). We define a measure $\mu_M$ on $([0,\infty)\times\Omega,\ \mathcal{B}([0,\infty))\otimes\mathcal{F})$ by

$$\mu_M(A)=\mathbb{E}\Big[\int_0^\infty I_A(t,\omega)\,\mathrm{d}\langle M\rangle_t(\omega)\Big]\quad\text{for }A\in\mathcal{B}([0,\infty))\otimes\mathcal{F}. \tag{8.2.2}$$

We call two $(\mathcal{F}_t)_{t\geq 0}$-adapted stochastic processes $X=(X_t)_{t\geq 0}$ and $Y=(Y_t)_{t\geq 0}$ equivalent with respect to $M$, if $X_t(\omega)=Y_t(\omega)$ $\mu_M$-a.e. This leads to the following equivalence relation: for an $(\mathcal{F}_t)_{t\geq 0}$-adapted process $X$ we define

$$[X]_T^2:=\mathbb{E}\Big[\int_0^T X_t^2(\omega)\,\mathrm{d}\langle M\rangle_t(\omega)\Big], \tag{8.2.3}$$

provided the right-hand side exists. So $[X]_T$ is the $L^2$-norm of $X$ as a function of $(t,\omega)$ with respect to the measure $\mu_M$. We define the equivalence relation

$$X\sim Y\quad\Longleftrightarrow\quad [X-Y]_T=0\ \text{ for all }T>0. \tag{8.2.4}$$

Our definition of the stochastic integral will imply that $I(X)$ and $I(Y)$ coincide provided $X$ and $Y$ are equivalent.

Definition 8.2.1

We define $\mathcal{L}$ as the space of equivalence classes of progressively $(\mathcal{F}_t)_{t\geq 0}$-measurable processes $X$ with $[X]_T<\infty$ for all $T>0$.

Remark 8.2.16

By setting $[X-Y]:=\sum_{n=0}^\infty 2^{-n}\big(1\wedge[X-Y]_n\big)$ we can define a metric on $\mathcal{L}$.

Remark 8.2.17

In the following we do not distinguish between $X$ and the equivalence class of $X$.

For $0<T<\infty$ we define $\mathcal{L}_T$ as the space of processes $X\in\mathcal{L}$ with $X_t(\omega)=0$ for all $t\geq T$ and a.e. $\omega\in\Omega$, and set

$$\mathcal{L}_\infty:=\Big\{X\in\bigcup_{T>0}\mathcal{L}_T:\ \mathbb{E}\Big[\int_0^\infty X_t^2\,\mathrm{d}\langle M\rangle_t\Big]<\infty\Big\}.$$

A process $X\in\mathcal{L}_T$ can be identified with a process only defined on $[0,T]\times\Omega$. In particular, $\mathcal{L}_T$ is a closed subspace of the Hilbert space

$$\mathcal{H}_T:=L^2\big(\Omega\times(0,T),\ \mathcal{F}_T\otimes\mathcal{B}([0,T]),\ \mu_M\big). \tag{8.2.5}$$

Definition 8.2.2

A process $X$ is called a step process if there are a strictly increasing sequence $(t_n)_{n\in\mathbb{N}_0}\subset\mathbb{R}$ with $t_0=0$ and $\lim_{n\to\infty}t_n=\infty$, a sequence of random variables $(\xi_n)_{n\in\mathbb{N}_0}$ and a constant $C<\infty$ with $\sup_{n\in\mathbb{N}_0}|\xi_n(\omega)|\leq C$ for every $\omega\in\Omega$, such that $\xi_n$ is $\mathcal{F}_{t_n}$-measurable for every $n\in\mathbb{N}_0$ and we have the representation

$$X_t(\omega)=\xi_0(\omega)I_{\{0\}}(t)+\sum_{i=0}^\infty\xi_i(\omega)I_{(t_i,t_{i+1}]}(t), \tag{8.2.6}$$

for all $0\leq t<\infty$. The space of step processes is denoted by $\mathcal{L}_0$.

Remark 8.2.18

(i) Step processes are progressively measurable and bounded. (ii) There holds $\mathcal{L}_0\subset\mathcal{L}$.

Definition 8.2.3

Let $X\in\mathcal{L}_0$ and $M\in\mathcal{M}_2^c$. The stochastic integral of $X$ with respect to $M$ is the martingale transformation

$$I_t(X):=\sum_{i=0}^{n-1}\xi_i\big(M_{t_{i+1}}-M_{t_i}\big)+\xi_n\big(M_t-M_{t_n}\big)=\sum_{i=0}^{\infty}\xi_i\big(M_{t\wedge t_{i+1}}-M_{t\wedge t_i}\big), \tag{8.2.7}$$

for $0\leq t<\infty$. Here $n\in\mathbb{N}_0$ is the unique natural number such that $t_n\leq t<t_{n+1}$.
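For a step process the integral (8.2.7) reduces to a finite sum, which can be evaluated directly along a simulated path of $M$. The sketch below is our own illustration (with $M$ a simulated Wiener path on a uniform grid and $\xi_i=\sin(W_{t_i})$, which is bounded and $\mathcal{F}_{t_i}$-measurable); it implements the right-hand side of (8.2.7).

```python
import numpy as np

rng = np.random.default_rng(2)
T, n = 1.0, 1000
dt = T / n
time = np.linspace(0.0, T, n + 1)
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), size=n))])  # path of M = W

# step process: jump times t_i on the grid, values xi_i = sin(W_{t_i}) (bounded, F_{t_i}-measurable)
jump_idx = np.array([0, 250, 500, 750, 1000])   # grid indices of t_0 < t_1 < ... < t_4 = T
xi = np.sin(W[jump_idx[:-1]])                   # xi_i depends only on the path up to t_i

def stoch_integral(t):
    """Martingale transform I_t(X) = sum_i xi_i (M_{min(t,t_{i+1})} - M_{min(t,t_i)}), cf. (8.2.7)."""
    k = int(round(t / dt))                      # grid index of t
    total = 0.0
    for i in range(len(xi)):
        a = min(k, jump_idx[i])
        b = min(k, jump_idx[i + 1])
        total += xi[i] * (W[b] - W[a])
    return total

print(stoch_integral(0.6), stoch_integral(T))
```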

In order to define the stochastic integral for $X\in\mathcal{L}$ we have to approximate the elements of $\mathcal{L}$ in an appropriate way by step processes, i.e. by processes in $\mathcal{L}_0$. This can be done thanks to the following theorem.

Theorem 8.2.32

The space of step processes $\mathcal{L}_0$ is dense in $\mathcal{L}$ with respect to the metric defined in Remark 8.2.16.

Definition 8.2.4

Let $X\in\mathcal{L}$ and $M\in\mathcal{M}_2^c$. The stochastic integral of $X$ with respect to $M$ is the unique quadratically integrable martingale

$$I(X)=\{I_t(X),\ \mathcal{F}_t,\ 0\leq t<\infty\},$$

which satisfies $\lim_{n\to\infty}\|I(X^{(n)})-I(X)\|=0$ for every sequence $(X^{(n)})_{n\in\mathbb{N}}\subset\mathcal{L}_0$ with $\lim_{n\to\infty}[X^{(n)}-X]=0$. We write

$$I_t(X)=\int_0^t X_s\,\mathrm{d}M_s,\quad 0\leq t<\infty.$$

Theorem 8.2.33

Let $X,Y\in\mathcal{L}$ and $0\leq s<t<\infty$. For the stochastic integrals $I(X)$, $I(Y)$ we have

a) $I_0(X)=0$ $\mathbb{P}$-a.s.,

b) $\mathbb{E}[I_t(X)\,|\,\mathcal{F}_s]=I_s(X)$ $\mathbb{P}$-a.s. (martingale property),

c) $\mathbb{E}[(I_t(X))^2]=\mathbb{E}\big[\int_0^t X_u^2\,\mathrm{d}\langle M\rangle_u\big]$ (Itô isometry; see the numerical check after this list),

d) $\langle I(X)\rangle_t=\int_0^t X_u^2\,\mathrm{d}\langle M\rangle_u$ $\mathbb{P}$-a.s.,

e) $\mathbb{E}\big[(I_t(X)-I_s(X))^2\,\big|\,\mathcal{F}_s\big]=\mathbb{E}\big[\int_s^t X_u^2\,\mathrm{d}\langle M\rangle_u\,\big|\,\mathcal{F}_s\big]$ $\mathbb{P}$-a.s.,

f) $I(\alpha X+\beta Y)=\alpha I(X)+\beta I(Y)$, for $\alpha,\beta\in\mathbb{R}$.
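Property c), the Itô isometry, lends itself to a quick Monte Carlo sanity check in the special case $M=W$ (so $\langle M\rangle_t=t$) with integrand $X_t=W_t$; in this case both sides equal $T^2/2$. The following sketch is our own illustration under these assumptions and is not part of the original text.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n, n_mc = 1.0, 1000, 20000
dt = T / n

dW = rng.normal(0.0, np.sqrt(dt), size=(n_mc, n))
W = np.cumsum(dW, axis=1) - dW                 # left endpoints W_{t_0}, ..., W_{t_{n-1}} (W_{t_0} = 0)

ito_integral = np.sum(W * dW, axis=1)          # Riemann-Ito sums approximating int_0^T W dW
lhs = np.mean(ito_integral**2)                 # E[(I_T(W))^2]
rhs = np.mean(np.sum(W**2, axis=1) * dt)       # E[int_0^T W_t^2 dt]  (both approx T^2/2 = 0.5)
print(lhs, rhs)
```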

8.3 Itô's Lemma

One of the most important tools in stochastic analysis is Itô's Lemma. It is a chain rule for paths of stochastic processes. In contrast to the deterministic case it can only be interpreted as an integral equation, because the stochastic processes we are interested in (for instance the Wiener process) are in general not differentiable.

Definition 8.3.1

Continuous local martingale

Let $(X_t)_{t\geq 0}$ be a continuous process adapted to $(\mathcal{F}_t)_{t\geq 0}$. Assume there is a sequence of stopping times $(T_n)_{n\in\mathbb{N}}$ of the filtration $(\mathcal{F}_t)_{t\geq 0}$, such that $\big(X_t^{(n)}:=X_{t\wedge T_n}\big)_{t\geq 0}$ is an $(\mathcal{F}_t)_{t\geq 0}$-martingale for all $n\in\mathbb{N}$ and $\mathbb{P}\big(\lim_{n\to\infty}T_n=\infty\big)=1$. In this case we call $X$ a continuous local martingale. If, in addition, $X_0=0$ $\mathbb{P}$-a.s., we write $X\in\mathcal{M}_2^{c,\mathrm{loc}}$.

Definition 8.3.2

Continuous semi-martingale

A continuous semi-martingale $(X_t)_{t\geq 0}$ is an $(\mathcal{F}_t)_{t\geq 0}$-adapted process such that the following (unique) decomposition holds:

$$X_t=X_0+M_t+B_t,\quad 0\leq t<\infty. \tag{8.3.8}$$

In the above $M=(M_t)_{t\geq 0}\in\mathcal{M}_2^{c,\mathrm{loc}}$ and $B=(B_t)_{t\geq 0}$ is the difference of two continuous, increasing and $(\mathcal{F}_t)_{t\geq 0}$-adapted processes $A^\pm=(A_t^\pm)_{t\geq 0}$, i.e. there holds

$$B_t=A_t^+-A_t^-,\quad 0\leq t<\infty, \tag{8.3.9}$$

with $A_0^\pm=0$ $\mathbb{P}$-a.s.

Theorem 8.3.34

Itô's Lemma

Let $f:\mathbb{R}\to\mathbb{R}$ be a $C^2$-function and $(X_t)_{t\geq 0}$ be a continuous $(\mathcal{F}_t)_{t\geq 0}$ semi-martingale with the decomposition (8.3.8). The following holds $\mathbb{P}$-a.s. for $0\leq t<\infty$:

$$f(X_t)=f(X_0)+\int_0^t f'(X_s)\,\mathrm{d}X_s+\frac12\int_0^t f''(X_s)\,\mathrm{d}\langle M\rangle_s=f(X_0)+\int_0^t f'(X_s)\,\mathrm{d}M_s+\int_0^t f'(X_s)\,\mathrm{d}B_s+\frac12\int_0^t f''(X_s)\,\mathrm{d}\langle M\rangle_s. \tag{8.3.10}$$
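As a standard illustration (not contained in the text), take $X=W$, so that $M=W$, $B\equiv 0$ and $\langle M\rangle_t=t$, together with $f(x)=x^2$, i.e. $f'(x)=2x$ and $f''(x)=2$. Then (8.3.10) yields

$$W_t^2=2\int_0^t W_s\,\mathrm{d}W_s+t,\qquad 0\leq t<\infty,$$

so $\int_0^t W_s\,\mathrm{d}W_s=\tfrac12\,(W_t^2-t)$. The additional term $t=\langle W\rangle_t$ is precisely what distinguishes the stochastic chain rule from its deterministic counterpart.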

Remark 8.3.19

The stochastic integral $\int_0^t f'(X_s)\,\mathrm{d}M_s$ in (8.3.10) is a continuous local martingale. The other two integrals in (8.3.10) are Lebesgue–Stieltjes integrals. They are of bounded variation as functions of $t$. Due to this, $(f(X_t))_{t\geq 0}$ is a continuous $(\mathcal{F}_t)_{t\geq 0}$ semi-martingale.

Remark 8.3.20

Equation (8.3.10) is often written in differential form

$$\mathrm{d}f(X_t)=f'(X_t)\,\mathrm{d}X_t+\frac12 f''(X_t)\,\mathrm{d}\langle M\rangle_t=f'(X_t)\,\mathrm{d}M_t+f'(X_t)\,\mathrm{d}B_t+\frac12 f''(X_t)\,\mathrm{d}\langle M\rangle_t.$$

Note that this does not have a rigorous meaning. It only serves as an abbreviation of (8.3.10).

8.4 Stochastic ODEs

In this section we are concerned with stochastic differential equations. We seek a real-valued process $(X_t)_{t\in[0,T]}$ on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ with filtration $(\mathcal{F}_t)_{t\geq 0}$ such that

$$\begin{cases}\mathrm{d}X_t=\mu(t,X)\,\mathrm{d}t+\Sigma(t,X)\,\mathrm{d}W_t,\\ X(0)=X_0,\end{cases} \tag{8.4.11}$$

holds $\mathbb{P}$-a.s. and for all $t\in[0,T]$. Here $W$ is a Wiener process with respect to $(\mathcal{F}_t)_{t\geq 0}$. The functions $\mu,\Sigma:[0,T]\times\mathbb{R}\to\mathbb{R}$ are assumed to be continuous. As in Remark 8.3.20, equation (8.4.11)$_1$ is only an abbreviation for the integral equation

$$X(t)=X(0)+\int_0^t\mu(s,X(s))\,\mathrm{d}s+\int_0^t\Sigma(s,X(s))\,\mathrm{d}W_s. \tag{8.4.12}$$

There are two different concepts of solutions to (8.4.12).

i) We talk about a strong solution (in the probabilistic sense) if the solution exists on a given probability space $(\Omega,\mathcal{F},\mathbb{P})$ with a given Wiener process $W$. A strong solution exists for a given initial datum $X_0\in L^2(\Omega,\mathcal{F}_0,\mathbb{P})$ and there holds $X(0)=X_0$ a.s.

ii) We talk about a weak solution (in the probabilistic sense) or martingale solution if there exist a probability space and a Wiener process such that (8.4.12) holds true. The solution is usually written as

$$\big((\Omega,\mathcal{F},(\mathcal{F}_t)_{t\geq 0},\mathbb{P}),W,X\big).$$

This means that when seeking a weak solution, constructing the probability space (and the Wiener process on it) is part of the problem. A solution typically exists for a given initial law $\Lambda_0$ and we have $\mathbb{P}\circ X(0)^{-1}=\Lambda_0$. Even if an initial datum $X_0$ is given, it might live on a different probability space; hence $X(0)$ and $X_0$ can only coincide in law.

Theorem 8.4.35

Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space with filtration $(\mathcal{F}_t)_{t\geq 0}$ and $X_0\in L^2(\Omega,\mathcal{F}_0,\mathbb{P})$. Assume that $\mu$ and $\Sigma$ are continuous on $[0,T]\times\mathbb{R}$ and globally Lipschitz continuous with respect to the second variable. Then there is a unique $(\mathcal{F}_t)_{t\geq 0}$-adapted process $X$ such that (8.4.12) holds $\mathbb{P}$-a.s. for every $0\leq t\leq T$ and we have $X(0)=X_0$ a.s. The trajectories of $X$ are a.s. continuous and we have

$$\mathbb{E}\Big[\sup_{t\in(0,T)}|X_t|^2\Big]<\infty.$$

The existence of a strong solution in the sense of Theorem 8.4.35 is classical, see e.g. [12] and [82,83]. If the assumptions on the coefficients are weakened, strong solutions might not exist, see [17]. In this case we can only hope for a weak solution. We refer to [95] for a nice proof and further references.
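The integral equation (8.4.12) can be discretized in time by the classical Euler–Maruyama scheme, $X_{k+1}=X_k+\mu(t_k,X_k)\,\Delta t+\Sigma(t_k,X_k)\,\Delta W_k$. The following sketch is our own illustration (not part of the text) for the globally Lipschitz example $\mu(t,x)=-x$, $\Sigma(t,x)=\tfrac12$, an Ornstein–Uhlenbeck process covered by Theorem 8.4.35.

```python
import numpy as np

def euler_maruyama(mu, sigma, x0, T=1.0, n=1000, seed=4):
    """Euler-Maruyama discretization of dX_t = mu(t, X_t) dt + sigma(t, X_t) dW_t."""
    rng = np.random.default_rng(seed)
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    X = np.empty(n + 1)
    X[0] = x0
    for k in range(n):
        dW = rng.normal(0.0, np.sqrt(dt))       # increment of the driving Wiener process
        X[k + 1] = X[k] + mu(t[k], X[k]) * dt + sigma(t[k], X[k]) * dW
    return t, X

# Ornstein-Uhlenbeck example: globally Lipschitz drift and diffusion
t, X = euler_maruyama(mu=lambda t, x: -x, sigma=lambda t, x: 0.5, x0=1.0)
print(X[-1])
```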

Theorem 8.4.36

Let $\Lambda_0$ be a Borel probability measure on $\mathbb{R}$. Assume that $\mu$ and $\Sigma$ are continuous on $[0,T]\times\mathbb{R}$ and have linear growth, i.e. there is $K\geq 0$ such that

$$|\mu(t,X)|+|\Sigma(t,X)|\leq K\,(1+|X|)\quad\text{for all }(t,X)\in[0,T]\times\mathbb{R}.$$

Then there exists $\big((\Omega,\mathcal{F},(\mathcal{F}_t)_{t\geq 0},\mathbb{P}),W,X\big)$ with the following properties.

i) $X$ is an $(\mathcal{F}_t)_{t\geq 0}$-adapted stochastic process with a.s. continuous trajectories such that

$$\mathbb{E}\Big[\sup_{t\in(0,T)}|X_t|^2\Big]<\infty.$$

ii) Equation (8.4.12) holds $\mathbb{P}$-a.s. for every $0\leq t\leq T$.

iii) We have $\mathbb{P}\circ X(0)^{-1}=\Lambda_0$.

The stochastic ODEs which appear later all have strong solutions. However, the concept of martingale solutions will be important for the SPDEs.

The stochastic ODEs we have considered so far have two drawbacks. First, we need vector-valued processes and, secondly, we have to weaken the assumptions on the drift $\mu$ (Lipschitz continuity in $X$ and linear growth are too strong). Everything in this chapter can obviously be extended to the multi-dimensional setting. Here a standard Wiener process in $\mathbb{R}^M$ is a vector-valued stochastic process each of whose components is a real-valued Wiener process (recall Definition 8.1.4); moreover, the components are independent. Getting rid of the assumed Lipschitz continuity is more difficult. We now seek an $\mathbb{R}^N$-valued process $(X_t)_{t\in[0,T]}$ on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ with filtration $(\mathcal{F}_t)_{t\geq 0}$ such that

$$\begin{cases}\mathrm{d}X_t=\mu(t,X)\,\mathrm{d}t+\Sigma(t,X)\,\mathrm{d}W_t,\\ X(0)=X_0.\end{cases} \tag{8.4.13}$$

Here $W$ is a standard $\mathbb{R}^M$-valued Wiener process with respect to $(\mathcal{F}_t)_{t\geq 0}$ and $X_0\in L^2(\Omega,\mathcal{F}_0,\mathbb{P})$ is some initial datum. The functions

$$\mu:\Omega\times[0,T]\times\mathbb{R}^N\to\mathbb{R}^N,\qquad\Sigma:\Omega\times[0,T]\times\mathbb{R}^N\to\mathbb{R}^{N\times M},$$

are continuous in $X\in\mathbb{R}^N$ for each fixed $t\in[0,T]$ and $\omega\in\Omega$. Moreover, they are assumed to be progressively measurable. The application in Chapter 10 requires weaker assumptions than those in the classical existence theorems mentioned above: there we only have local Lipschitz continuity of $\mu$. Fortunately, some more recent results apply. In the following we state the assumptions, which are in fact a special case of the assumptions in [124, Thm. 3.1.1]; an illustrative example is sketched after the list.

(A1) We assume the following integrability condition on $\mu$: for all $R<\infty$

$$\int_0^T\sup_{|X|\leq R}|\mu(t,X)|^2\,\mathrm{d}t<\infty\quad\text{in }\Omega.$$

(A2) $\mu$ is weakly coercive, i.e. for all $(t,X)\in[0,T]\times\mathbb{R}^N$ we have

$$\mu(t,X)\cdot X\leq c\,\big(1+\tilde\mu_t\,|X|\big),$$

where $\tilde\mu\in L^2(\Omega,\mathcal{F},\mathbb{P};L^2(0,T))$ is $(\mathcal{F}_t)_{t\geq 0}$-adapted.

(A3) $\mu$ is locally weakly monotone, i.e. for all $t\in[0,T]$ and all $X,Y\in\mathbb{R}^N$ the following holds:

$$\big(\mu(t,X)-\mu(t,Y)\big)\cdot(X-Y)\leq 0.$$

(A4) $\Sigma$ is Lipschitz continuous, i.e. for all $t\in[0,T]$ and all $X,Y\in\mathbb{R}^N$ the following holds:

$$|\Sigma(t,X)-\Sigma(t,Y)|^2\leq c\,|X-Y|^2.$$
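As an illustration of the scope of (A1)–(A4), consider (our own example, not taken from the text) the autonomous drift

$$\mu(t,X)=-|X|^2X,\qquad X\in\mathbb{R}^N,$$

combined with any Lipschitz continuous $\Sigma$. This drift is only locally Lipschitz continuous, so Theorem 8.4.35 does not apply. However, $\sup_{|X|\leq R}|\mu(t,X)|=R^3<\infty$ gives (A1), $\mu(t,X)\cdot X=-|X|^4\leq 0$ gives (A2) (with $\tilde\mu\equiv 0$), and since $X\mapsto|X|^2X$ is the gradient of the convex function $\tfrac14|X|^4$ it is monotone, so $(\mu(t,X)-\mu(t,Y))\cdot(X-Y)\leq 0$ and (A3) holds. Hence Theorem 8.4.37 below is applicable.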

Theorem 8.4.37

Let $\mu$ and $\Sigma$ satisfy (A1)–(A4). Assume we have a given probability space $(\Omega,\mathcal{F},\mathbb{P})$ with filtration $(\mathcal{F}_t)_{t\geq 0}$, an initial datum $X_0\in L^2(\Omega,\mathcal{F}_0,\mathbb{P})$ and a Brownian motion $W$ with respect to $(\mathcal{F}_t)_{t\geq 0}$. Then there is a unique $(\mathcal{F}_t)_{t\geq 0}$-adapted process $X$ satisfying

$$X(t)=X_0+\int_0^t\mu(\sigma,X(\sigma))\,\mathrm{d}\sigma+\int_0^t\Sigma(\sigma,X(\sigma))\,\mathrm{d}W_\sigma,\quad\mathbb{P}\text{-a.s.},$$

for every $t\in[0,T]$. The trajectories of $X$ are $\mathbb{P}$-a.s. continuous and we have

$$\mathbb{E}\Big[\sup_{t\in(0,T)}|X_t|^2\Big]<\infty.$$

Theorem 8.4.38

Let the assumptions of Theorem 8.4.37 hold. Assume that $X_0\in L^\beta(\Omega,\mathcal{F}_0,\mathbb{P})$ and $\tilde\mu\in L^\beta(\Omega,\mathcal{F},\mathbb{P};L^\beta(0,T))$ for some $\beta>2$. Then we have

$$\mathbb{E}\Big[\sup_{t\in(0,T)}|X_t|^\beta\Big]<\infty.$$

References

[12] L. Arnold, Stochastic Differential Equations: Theory and Applications. New York: J. Wiley & Sons; 1973.

[17] M.T. Barlow, One dimensional stochastic differential equation with no strong solution, J. Lond. Math. Soc. (2) 1982;26:335–347.

[82] A. Friedman, Stochastic Differential Equations and Applications I. New York: Academic Press; 1975.

[83] A. Friedman, Stochastic Differential Equations and Applications II. New York: Academic Press; 1976.

[95] M. Hofmanová, J. Seidler, On weak solutions of stochastic differential equations, Stoch. Anal. Appl. 2012;30(1):100–121.

[100] I. Karatzas, S.E. Shreve, Brownian Motion and Stochastic Calculus. Springer; 1998.

[124] C. Prévôt, M. Röckner, A Concise Course on Stochastic Partial Differential Equations. Lect. Notes Math., vol. 1905. Berlin: Springer; 2007.
