7
Stochastic differential equations

7.1 Existence and uniqueness theorem and main properties of the solution

Let images be a complete probability space and images a standard Wiener process defined on it. Let images be a non‐anticipative filtration. Let images be a images‐measurable r.v. (therefore independent of the Wiener process); in particular, it can be a deterministic constant. Having defined the Itô stochastic integrals in Chapter 6, the stochastic integral equation

(7.1)equation

does have a meaning for images if, as in Definition 6.4 of an Itô process,

equation

Of course we need to impose appropriate conditions on images and images to ensure that images and images, so that the integral form 7.1 has a meaning and so that the corresponding (Itô) stochastic differential equation (SDE)

(7.2)equation

also has a meaning for images.

Besides having a meaning, we also want the SDE 7.2 to have a unique solution, and this may require appropriate conditions on images and images. As with ordinary differential equations (ODEs), a restriction on the growth of these functions will avoid explosions of the solution (i.e. prevent the solution from diverging to images in finite time) and a Lipschitz condition will ensure uniqueness of the solution.

We will denote by images the set of real‐valued functions images with domain images that are Borel‐measurable jointly in images and images1 and, for some images and all images and images, satisfy the two properties:

  • images (Lipschitz condition)
  • images (restriction on growth).
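In a commonly used generic notation (a sketch only: the function is denoted h and the constant K, symbols that may differ from the ones used in this book), the two conditions read

    |h(t,x) - h(t,y)| \le K\,|x - y|        (Lipschitz condition)
    |h(t,x)|^2 \le K^2\,(1 + |x|^2)         (restriction on growth)

for all t in [0, T] and all real x, y.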

We will follow the usual convention of not making explicit the dependence of random variables and stochastic processes on chance images, but one should keep always in mind that such dependence exists and so the Wiener process and the solutions of SDE do depend on images.

We now state an existence and uniqueness theorem for SDEs which is, with small nuances, the one commonly shown in the literature. Note that the conditions stated in the theorem are sufficient (not necessary), so there may be some improvements (i.e. the class images of functions for which existence and uniqueness are ensured may be enlarged), particularly in special cases. We will talk about that later.

Of course the starting time was labelled 0 and we will work on a time interval images just for convenience, but nothing prevents one from working on intervals like images.

Although further complications arise in the stochastic case, the theorem and the proof are inspired by an analogous existence and uniqueness theorem for ODEs. Both use the technique of proving uniqueness by assuming two solutions and showing they must be identical. Both use Picard's constructive iterative method of approximating the solution, feeding one approximation into the integral equation to get the next approximation:

equation

This method, which is very convenient for the proof, can be used in practice to approximate the solution, although it is quite slow compared to other available methods.
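As an illustration, writing the SDE as dX_t = f(t, X_t) dt + g(t, X_t) dW_t with initial condition X_0 (generic notation assumed here for the sketch, possibly differing from the symbols in the text), Picard's iteration reads

    X^{(0)}(t) \equiv X_0,
    X^{(n+1)}(t) = X_0 + \int_0^t f\big(s, X^{(n)}(s)\big)\,ds + \int_0^t g\big(s, X^{(n)}(s)\big)\,dW(s),    n = 0, 1, 2, \ldots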

Monte Carlo simulation of SDEs

Although a more complete treatment will be presented in Section 12.2, we mention here some ideas on how to simulate trajectories of the solution of an SDE.

If an explicit expression for the solution of the SDE can be obtained, it can be used to simulate trajectories of images. If the explicit expression only involves the Wiener process directly at the present time, we can simulate the Wiener process by the method of Section 4.1 and plug the values into the expression, as we will do to simulate the solution of the Black–Scholes model in Section 8.1. Monte Carlo simulation for more complicated explicit expressions will be addressed in Section 12.2.
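As a sketch of the first situation, the following uses the explicit solution of a geometric Brownian motion (Black–Scholes type) model, X_t = X_0 exp((r - sigma^2/2) t + sigma W_t); the model, parameter names, and function name are assumptions made here for illustration, not necessarily the formulation of Section 8.1.

    import numpy as np

    def simulate_gbm_explicit(x0, r, sigma, T, n, rng=None):
        """Simulate one trajectory of geometric Brownian motion on [0, T] by
        simulating the Wiener process at the grid points and plugging its values
        into the explicit solution X_t = x0 * exp((r - sigma^2/2) t + sigma W_t)."""
        rng = np.random.default_rng() if rng is None else rng
        dt = T / n
        t = np.linspace(0.0, T, n + 1)
        dW = rng.normal(0.0, np.sqrt(dt), size=n)      # Wiener increments ~ N(0, dt)
        W = np.concatenate(([0.0], np.cumsum(dW)))     # Wiener process at the grid points
        return t, x0 * np.exp((r - 0.5 * sigma**2) * t + sigma * W)

    # Example usage
    t, X = simulate_gbm_explicit(x0=1.0, r=0.05, sigma=0.2, T=1.0, n=1000)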

Unfortunately, in many cases we are unable to obtain an explicit expression, in which case the trajectories can be simulated using an approximation of images obtained by discretizing time. One uses a partition images of images with a sufficiently large images to reduce the numerical error. Typically, but not necessarily, the partition points are equidistant, with images. A trajectory is simulated iteratively at the partition points. One starts (step 0) by simulating images using the distribution of images; if images is a deterministic value, then images and no simulation is required. At each iteration (step images with images), one uses the previously simulated value images to simulate the next value images. The simplest method to do that is the Euler method, the same as is used for ODEs. In the context of SDEs, it is also known as the Euler–Maruyama method. In this method, images and images are approximated in the interval images by constants, namely by the values they take at the beginning of the interval (to ensure the non‐anticipative property of the Itô calculus). This leads to the first‐order approximation scheme

equation

Since the approximate value of images is known from the previous iteration, the only thing random in the above scheme is the increment images. The increments images (images) are easily simulated (we have done this in Section 4.1 when simulating trajectories of the Wiener process), since they are independent random variables having a normal distribution with mean zero and variance images. At the end of the iterations, we have the approximate values at the time partition points of a simulated trajectory. Of course, this iterative procedure can be repeated to produce other trajectories.
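A minimal sketch of the Euler–Maruyama scheme just described, writing the SDE as dX_t = f(t, X_t) dt + g(t, X_t) dW_t (the notation and the example coefficients below are assumptions made for illustration):

    import numpy as np

    def euler_maruyama(f, g, x0, T, n, rng=None):
        """Simulate one trajectory of dX_t = f(t, X_t) dt + g(t, X_t) dW_t on [0, T]
        with the Euler-Maruyama scheme on an equidistant partition with n steps."""
        rng = np.random.default_rng() if rng is None else rng
        dt = T / n
        t = np.linspace(0.0, T, n + 1)
        X = np.empty(n + 1)
        X[0] = x0                                    # step 0: initial condition
        for k in range(n):
            dW = rng.normal(0.0, np.sqrt(dt))        # Wiener increment ~ N(0, dt)
            X[k + 1] = X[k] + f(t[k], X[k]) * dt + g(t[k], X[k]) * dW
        return t, X

    # Example usage: dX_t = 0.05 X_t dt + 0.2 X_t dW_t
    t, X = euler_maruyama(lambda t, x: 0.05 * x, lambda t, x: 0.2 * x,
                          x0=1.0, T=1.0, n=1000)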

There are faster methods (i.e. methods that do not require such large values of images), like the Milstein method. For the numerical solution and simulation of SDEs, we refer the reader to Kloeden and Platen (1992), Kloeden et al. (1994), Bouleau and Lépingle (1994), and Iacus (2008).
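For reference, in the one‐dimensional case and in the illustrative notation used above (again an assumption about the symbols), the Milstein scheme adds to the Euler–Maruyama step a correction term involving the spatial derivative of the diffusion coefficient:

    X_{k+1} = X_k + f(t_k, X_k)\,\Delta + g(t_k, X_k)\,\Delta W_k + \tfrac{1}{2}\, g(t_k, X_k)\,\frac{\partial g}{\partial x}(t_k, X_k)\,\big((\Delta W_k)^2 - \Delta\big),

where \Delta = t_{k+1} - t_k and \Delta W_k = W(t_{k+1}) - W(t_k).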

SDE existence and uniqueness theorem

7.2 Proof of the existence and uniqueness theorem

We now present a proof of the existence and uniqueness Theorem 7.1, which may be skipped by readers not interested in the details. Certain parts of the proof are presented in a sketchy way; for more details, the reader can consult, for instance, Arnold (1974), Øksendal (2003), Gard (1988), Schuss (1980), or Wong and Hajek (1985).

(a) Proof of uniqueness

Let images and images be two a.s. continuous solutions. They satisfy 7.4 and so are non‐anticipative (they only depend on past values of themselves and of the Wiener process). Since images and images are Borel measurable functions, images, images, images and images are non‐anticipative. We could work with images, but we have not yet proved that such a second‐order moment indeed exists. So we replace it by the truncated moment images, with images if images and images for all images and images otherwise. Note that images is non‐anticipative, that images, and that images for images. Since images and images satisfy 7.4, using the inequality images, one gets

equation

Using the Schwarz inequality (which, in particular, implies images images), the fact that images, and the norm preservation property for Itô integrals of images functions, one gets

equation

Putting images and using the Lipschitz condition, one has

(7.6)equation

Using the Bellman–Gronwall lemma,4 also known as Gronwall's inequality, one gets images, which implies images a.s. But

equation

and since images and images are a.s. bounded in images (due to their a.s. continuity), both probabilities on the right‐hand side become arbitrarily small for every sufficiently large images. So, images a.s. for images, and so also for all images (since the set of rational numbers images is countable). Due to the continuity, we have a.s. images for all images and therefore images a.s.
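For reference, the form of Gronwall's inequality used above can be stated as follows (generic symbols): if v is a non‐negative integrable function on [0, T] and there are constants a \ge 0 and b \ge 0 such that

    v(t) \le a + b \int_0^t v(s)\,ds    for all t in [0, T],

then v(t) \le a\,e^{bt} for all t in [0, T]. In the uniqueness argument one has a = 0, so the truncated second‐order moment of the difference of the two solutions is identically zero.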

(b) Proof of existence

The proof of existence is based on the same Picard method that is used in the analogous proof for ODEs. It is an iterative method of successive approximations, starting with

(7.7)equation

and using the iteration

(7.8)equation

We just need to prove that images (images) are a.s. continuous functions and that this sequence converges uniformly a.s. The limit will then be a.s. a continuous function, which, as we will show, is the solution of the SDE.

Since images, we have images and will see by induction that images and images for all images. In fact, assuming this is true for images, we show it is true for images because (due to the restriction on growth, the norm preservation of the Itô integral, the inequality images, and Schwarz's inequality), with images,

(7.9)equation

Using reasoning similar to that used in the proof of 7.6, but now with no need to use truncated moments (since the non‐truncated moments exist), one obtains

(7.10)equation

Iterating 7.10, one obtains by induction

equation

Using the restriction on growth, one obtains

equation

and so

equation

Since

equation

using (6.27) and the Lipschitz condition, one gets

equation

By Chebyshev's inequality,

equation

By the Borel–Cantelli lemma,

equation

and so, for every sufficiently large images, images a.s. Since the series images is convergent, the series images converges uniformly a.s. on images (notice that the terms are bounded by images). Therefore

equation

converges uniformly a.s. on images.

This shows the a.s. uniform convergence of images on images as images. Denote the limit by images. Since images are obviously non‐anticipative, the same happens to images. The a.s. continuity of the images and the uniform convergence imply the a.s. continuity of images. Obviously, from the restriction on growth and a.s. continuity of images, we have images a.s. and images a.s. Consequently, images is an Itô process and the integrals in 7.4 make sense.

The only thing missing is to show that images is indeed a solution, i.e. that images satisfies 7.4 for images. Apply limits in probability to both sides of 7.8. The left side images converges in probability to images (the convergence is in fact stronger, namely a.s. uniform convergence). On the right‐hand side, we have:

  • The integrals images converge a.s., and so converge in probability to images. In fact, images a.s. (due to the Lipschitz condition).
  • The integrals images converge to images in probability. In fact, the Lipschitz condition implies images a.s. and we can use the result of Exercise 6.9 in Section 6.4.
  • So, the right‐hand side images images converges in probability to images

Since the limits in probability are a.s. unique, we have

equation

i.e. images satisfies 7.4 .

Proof that images and images

From 7.9, putting images gives images. Iterating, we have

equation

and therefore images. Letting images, by the dominated convergence theorem, we get

(7.11)equation

Consequently, images and images.

Proof that images and images are in images

From the previous result, images and images, and so images and images are in images.

As a consequence, the stochastic integral images is a martingale.

Proof of 7.5

By the Itô formula (6.37),

(7.12)equation

Applying mathematical expectations, one gets

equation

We have assumed that the stochastic integral has a null expected value, although we have not shown that the integrand function was in images. Since this may fail, we should, to be rigorous, have truncated images by images (to ensure that the stochastic integral has zero expectation) and then taken the limit as images.

Using the restriction on growth and the inequality images, one gets

equation

Put images and images, note that images, and apply Gronwall's inequality to obtain images, which gives the first inequality of 7.5.

From 7.4 , squaring and applying mathematical expectations, one gets

equation

Applying the first expression of 7.5 and bounding the images that shows up in the integral by images, one gets the second expression in 7.5 .

Proof that the solution is m.s. continuous

For images, images is also a solution of images images. Since now the initial condition is the value images at time images and the length of the time interval is images, the second inequality in 7.5 now reads

(7.13)equation

and we can now use the first inequality of 7.5 to bound images by images. We conclude that images as images, which proves the m.s. continuity.

Proof of the semigroup property: For images, images

This is an easy consequence of the elementary additivity property of integrals (if one splits the integration interval into two subintervals, the integral over the whole interval is the sum of the integrals over the subintervals), which gives images.

Proof that the solution is a Markov process

Let images. The intuitive justification that images is a Markov process comes from the semigroup property images. This means that images can be obtained in the interval images as solution of images. So, given images, images is defined in terms of images and images (images), and so is measurable with respect to images. Since images is images‐measurable, and so is independent of images, we conclude that, given images, images only depends on images. Since images is independent of images, also images (future value) given images (present value) is independent of images (past values). Thus, it is a Markov process.

A formal proof can be seen, for instance, in Gihman and Skorohod (1972), Arnold (1974), and Gard (1988).

Proof that, if images and images are also continuous in images, then images is a diffusion process

Let images and images. Conditioning on images (deterministic initial condition), from 7.13 we obtain

(7.14)equation

First of all, let us note that expressions similar to 7.5 can be obtained for higher even images‐order moments if images. Therefore, since now the initial condition images is deterministic and so has moments of any order images, we can use such expressions to obtain expressions of higher order similar to 7.14. We get

(7.15)equation

with images and images appropriate positive constants. Given images, since images, we get

equation

Due to 7.15, these moments exist for all images (even for odd images, the moment exists since the moment of order images exists). Since images is a Markov process with a.s. continuous trajectories, we just need to show that (5.1), (5.2), and (5.3) hold.

Due to the Lipschitz condition, notice that images and images are also continuous functions of images.

From 7.15 with images we get images, with images constant, so that images as images. Therefore

equation

So, (5.1) holds.

Starting from 7.4 , since the Itô integral has zero expectation, we get

(7.16)equation

We also have, using the Lipschitz condition, Schwarz's inequality, and 7.14 ,

(7.17)equation

where images is a positive constant that does not depend on images. The continuity of images in images holds on the closed interval images and is therefore uniform. Hence, given an arbitrary images, there is a images independent of images such that

(7.18)equation

From 7.16, 7.17, and 7.18, we obtain (5.2) with images.

To obtain (5.3) with images, one starts from 7.12 instead of 7.4, also using images as the initial condition and applying similar techniques.

This concludes the proof of Theorem 7.1. images

7.3 Observations and extensions to the existence and uniqueness theorem

The condition images (i.e. images has finite variance) is really not required and we could prove Theorem 7.1, with the exception of parts (c) and (d), without making that assumption. Of course, for parts (c) and (d) we would need the assumption, since otherwise the required second‐order moments of images might not exist.

In fact, except for parts (c) and (d), we could easily adapt the proof presented in Section 7.2 in order to waive that assumption. We would just replace images by its truncation to an interval images. Since the truncated r.v. is in images, the proof would hold for the truncated images, and then we would take the limit as images.

The restriction on growth and the Lipschitz condition for images and images do not always need to hold for all points images, or images and images. If the solution images of the SDE has values that always belong to a set images, it is sufficient that the restriction on growth and the Lipschitz condition are valid on images. In fact, in that case nothing changes if we replace images and images by other functions that coincide with them on images and are zero outside images, and these other functions satisfy the restriction on growth and the Lipschitz condition.

For existence and uniqueness, we could use a weaker local Lipschitz condition for images and images instead of the global Lipschitz condition we have assumed in Theorem 7.1.

A function images with domain images satisfies a local Lipschitz condition if, for any images, there is a images such that, for all images and images, images, we have

(7.19)equation

The proof of existence uses a truncation of images to the interval images and ends by taking limits as images.
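In one common formulation (with generic symbols h, N, and K_N assumed here for illustration), the local Lipschitz condition (7.19) reads

    |h(t, x) - h(t, y)| \le K_N\,|x - y|    for all t in [0, T] and all x, y with |x| \le N, |y| \le N,

where the constant K_N may grow with N.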

Consider images and images fixed. We can say that the SDE images, images, or the corresponding stochastic integral equation images, is a map (or transformation) that, given a r.v. images and a Wiener process images (on a certain given probability space endowed with a non‐anticipative filtration images), transforms them into the a.s. unique solution images of the SDE, which is adapted to the filtration. If we choose a different Wiener process, the solution changes. This type of solution, which is the one we have studied so far, is the one that is understood by default (i.e. if nothing is said to the contrary). It is called a strong solution. Of course, once the Wiener process is chosen, the solution is unique a.s., and to each images there corresponds the value of images and the whole trajectory images of the Wiener process, which the SDE (or the corresponding stochastic integral equation) transforms into the trajectory images of the SDE solution. In summary, images and the Wiener process are both given and we seek the associated unique solution of the SDE.

There are also weak solutions. Again, consider images and images fixed. Now, however, only images is given, not the Wiener process. What we seek now is to find a probability space and a pair of processes images and images on that space such that images; of course, these are not unique. The difference is that, in the strong solution, the Wiener process is given a priori and chosen freely, the solution being dependent on the chosen Wiener process, while in the weak solution, the Wiener process is obtained a posteriori and is part of the solution.5 Of course, a strong solution is also a weak solution, but the reverse may fail. One can see a counter‐example in Øksendal (2003), in which the SDE has no strong solutions but does have weak solutions.

The uniqueness considered in Theorem 7.1 is the so‐called strong uniqueness, i.e. given two solutions, their sample paths coincide for all images with probability one. We also have weak uniqueness, which means that, given two solutions (no matter whether they are weak or strong solutions), they have the same finite‐dimensional distributions. Of course, strong uniqueness implies weak uniqueness, but the reverse may fail. Under the conditions assumed in the existence and uniqueness Theorem 7.1, two weak or strong solutions are weakly unique. In fact, given two solutions, either strong or weak, they are also weak solutions. Let them be images and images (remember that a weak solution is a pair formed by the 'solution itself' and a Wiener process). Then, since by the theorem there are strong solutions, let images and images be the strong solutions corresponding to the Wiener process choices images and images. By Picard's method of successive approximations given by 7.7 and 7.8, the approximating sequences have the same finite‐dimensional distributions, so the same happens to their a.s. limits images and images.

It is important to stress that, under the conditions assumed in the existence and uniqueness Theorem 7.1, from the probabilistic point of view, i.e. from the point of view of finite‐dimensional distributions, there is no difference between weak and strong solutions nor between the different possible weak solutions. This may be convenient since, to determine the probabilistic properties of the strong solution, we may work with weak solutions and get the same results.

Sometimes the Lipschitz condition or the restriction on growth are not valid and so the existence of a strong solution is not guaranteed. In that case, one can check whether there are weak solutions. Conditions for the existence of weak solutions can be seen in Stroock and Varadhan (2006) and Karatzas and Shreve (1991). Another interesting result (see Karatzas and Shreve (1991)) is that, if there is a weak solution and strong uniqueness holds, then there is a strong solution.

We have seen that, when images and images, besides satisfying the other assumptions of the existence and uniqueness Theorem 7.1, were also continuous functions of time, the solution of the SDE was a diffusion process with drift coefficient images and diffusion coefficient images.

The converse problem is also interesting. Given a diffusion process images (images) in a complete probability space images with drift coefficient images and diffusion coefficient images, is there an SDE which has such a process as a weak solution? Under appropriate regularity conditions, the answer is positive. Some results on this issue can be seen in Stroock and Varadhan (2006), Karatzas and Shreve (1991), and Gihman and Skorohod (1972). So, in a way, there is a correspondence between solutions of SDEs and diffusion processes.

In the particular case where images and images do not depend on time (and satisfy the assumptions of the existence and uniqueness Theorem 7.1), they are automatically continuous functions of time and therefore the solution of the SDE will be a homogeneous diffusion process with drift coefficient images and diffusion coefficient images. The SDE is then an autonomous stochastic differential equation and its solution is also called an Itô diffusion. In this case, one does not need to verify the restriction on growth assumption since it is a direct consequence of the Lipschitz condition (the latter, of course, still needs to be checked), as sketched below. In the autonomous case, one can work in the interval images since images and images do not depend on time.
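To see why the restriction on growth is automatic in the autonomous case (a one‐line sketch in generic notation), note that a Lipschitz condition |f(x) - f(y)| \le K |x - y| gives, taking y = 0,

    |f(x)| \le |f(0)| + K|x|,    so    |f(x)|^2 \le 2|f(0)|^2 + 2K^2|x|^2 \le C\,(1 + |x|^2)

with C = 2\max(|f(0)|^2, K^2), and similarly for the diffusion coefficient.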

In the autonomous case, under the assumptions of the existence and uniqueness Theorem 7.1, one can even conclude that the solution of the SDE is a strong Markov process (see, for instance, Gihman and Skorohod (1972) or Øksendal (2003)).

In the autonomous case, if we are working in one dimension (so this is not generalizable to multidimensional SDEs), we can get results even when the Lipschitz condition fails but images and images are continuously differentiable (i.e. are of class images). Note that, if they are of class images, they may or may not satisfy a Lipschitz condition; if they have bounded derivatives, they satisfy a Lipschitz condition, but this is not the case if the derivatives are unbounded.

The result by McKean (1969) for autonomous one‐dimensional SDEs is that, if images and images are of class images, then there is a unique (strong) solution up to a possible explosion time images. By explosion, we mean the solution becoming images. If images and images also satisfy a Lipschitz condition, that is sufficient to prevent explosions (i.e. one has images a.s.) and the solution exists for all times and is unique. If images and images are of class images but fail to satisfy a Lipschitz condition, one cannot exclude the possibility of an explosion, but there are cases in which one can show that an explosion is not possible (or, more precisely, has a zero probability of occurring) and therefore the solution exists for all times and is unique. We will see later some examples of such cases and of the simple methods used to show that in such cases there is no explosion. There is also a test to determine whether an explosion will or will not occur, the Feller test (see McKean (1969)).

Let us now consider multidimensional stochastic differential equations. The setting is similar to that of the multidimensional Itô processes of Section 6.7. The difference is that now one uses images and images. The Lipschitz condition and the restriction on growth of images and images in the existence and uniqueness Theorem 7.1 are identical, with images now representing the Euclidean distance. The results of Theorem 7.1 remain valid. When the solution is a diffusion process, the drift coefficient is the vector images and the diffusion coefficient is the matrix images.
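In a commonly used vector notation (the symbols here are an assumption for illustration), the multidimensional SDE can be written

    dX_t = f(t, X_t)\,dt + G(t, X_t)\,dW_t,

where X_t takes values in \mathbb{R}^n, W is an m‐dimensional Wiener process, f(t, x) \in \mathbb{R}^n is the drift vector, and G(t, x) is an n \times m matrix; the diffusion coefficient of the associated diffusion process is then the n \times n matrix b(t, x) = G(t, x)\,G(t, x)^{\mathsf{T}}.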

Notice, however, that, given a multidimensional diffusion process with drift coefficient images and diffusion coefficient images, there are several matrices images for which images, but all of them result in the same probabilistic properties for the solution of the SDE. So, from the point of view of probabilistic properties or of weak solutions, it is irrelevant which images one chooses.

The treatment in this chapter allows the inclusion of SDEs of the type

equation

by adding the equation images and working in two dimensions with the vector images and the SDE

equation

The initial condition takes the form images. Note that images and images.

We can also have higher order stochastic differential equations of the form (here the superscript images represents the derivative of order images)

equation

with initial conditions images (images). For that, we use the same technique that is used for higher order ODEs, working with the vector

equation

and the SDE

equation
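As a sketch of this reduction for a second‐order equation (generic notation assumed for illustration), an equation written informally as dX'_t = f(t, X_t, X'_t)\,dt + g(t, X_t, X'_t)\,dW_t becomes, with Y_t = (Y^{(1)}_t, Y^{(2)}_t) := (X_t, X'_t), the two‐dimensional first‐order system

    dY^{(1)}_t = Y^{(2)}_t\,dt
    dY^{(2)}_t = f(t, Y^{(1)}_t, Y^{(2)}_t)\,dt + g(t, Y^{(1)}_t, Y^{(2)}_t)\,dW_t,

with initial condition Y_0 = (X_0, X'_0).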

We can also have functions images and images that depend directly on chance images in a more general way (thus far they did not depend directly on chance, but only indirectly through images). This poses no problem as long as images and images are kept non‐anticipative.
