Gaussian processes

Now we are ready to understand what Gaussian processes (GPs) are and how they are used in practice.

A somewhat formal definition of GPs, taken from Wikipedia, is as follows:

"The collection of random variables indexed by time or space, such that every finite collection of those random variables has a multivariate normal distribution, i.e. every finite linear combination of them is normally distributed."

The trick to understanding Gaussian processes is to realize that the concept of a GP is a mental (and mathematical) scaffold: in practice, we never need to work with this infinite mathematical object directly. Instead, we only evaluate the GP at the points where we have data. By doing this, we collapse the infinite-dimensional GP into a finite multivariate Gaussian distribution with as many dimensions as data points. Mathematically, this collapse is effected by marginalizing over the infinitely many unobserved dimensions. The theory guarantees that it is OK to omit (actually, marginalize over) all points except the ones we are observing, and that the result will always be a multivariate Gaussian distribution. Thus, we can rigorously interpret Figure 7.3 as actual samples from a Gaussian process!
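To make this collapse concrete, here is a minimal NumPy sketch (not the book's code): it evaluates an exponentiated quadratic kernel on a handful of input points and draws samples from the resulting finite-dimensional multivariate Gaussian. The length-scale, the grid of points, and the number of samples are arbitrary choices for illustration.

```python
import numpy as np

def exp_quad_kernel(x, x_prime, length_scale=1.0):
    """Exponentiated quadratic (RBF) covariance between two sets of 1D points."""
    sq_dist = (x[:, None] - x_prime[None, :]) ** 2
    return np.exp(-0.5 * sq_dist / length_scale**2)

# A finite set of input points: the only places where we evaluate the GP
X = np.linspace(0, 10, 50)

# Covariance matrix of the resulting 50-dimensional Gaussian
# (a small jitter on the diagonal keeps it numerically positive definite)
K = exp_quad_kernel(X, X) + 1e-6 * np.eye(len(X))

# Zero mean plus this covariance: each sample is a function evaluated at X,
# analogous to the draws shown in Figure 7.3
rng = np.random.default_rng(42)
samples = rng.multivariate_normal(mean=np.zeros(len(X)), cov=K, size=5)
```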

Notice that we set the mean of the multivariate Gaussian to zero and model the functions using just the covariance matrix, through the exponentiated quadratic kernel. Setting the mean of a multivariate Gaussian to zero is common practice when working with Gaussian processes.

Gaussian processes are useful for building Bayesian non-parametric models, as we can use them as prior distributions over functions.
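As an illustration of that idea, the following sketch places a GP prior over a latent function and conditions it on noisy observations. It assumes PyMC's gp module; the priors on the length-scale and the noise, and the toy data, are placeholder choices for illustration, not the book's model.

```python
import numpy as np
import pymc3 as pm  # in newer versions: import pymc as pm

# Toy data: noisy observations of an unknown function (illustration only)
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 30)[:, None]           # inputs as a column matrix
y = np.sin(X).ravel() + rng.normal(0, 0.2, 30)

with pm.Model() as gp_model:
    # Hyperparameters of the exponentiated quadratic kernel
    ell = pm.Gamma("ell", alpha=2, beta=0.5)   # length-scale
    eta = pm.HalfNormal("eta", sigma=1)        # output scale

    cov = eta**2 * pm.gp.cov.ExpQuad(1, ls=ell)
    gp = pm.gp.Latent(cov_func=cov)

    # The GP prior over the unknown function, evaluated only at the data points
    f = gp.prior("f", X=X)

    # Gaussian likelihood linking the latent function to the observations
    sigma = pm.HalfNormal("sigma", sigma=1)
    y_obs = pm.Normal("y_obs", mu=f, sigma=sigma, observed=y)

    trace = pm.sample(1000, tune=1000)
```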