Differentiation

There are three ways to approach the computation of derivatives:

  • Numerical differentiation is the process of approximating the derivative of a given function at a point. In SciPy, we have the following procedures, which will be covered in detail:
    • For generic univariate functions, the central difference formula with fixed spacing.
    • It is always possible to perform numerical differentiation via Cauchy's integral formula, which transforms the derivative into a definite integral. This integral is then treated with the techniques of numerical integration explained in the upcoming section.
  • Symbolic differentiation refers to the computation of functional expressions of derivatives of functions, pretty much in the same way that we would do it manually. It is termed symbolic because, unlike its numerical counterpart, symbols take the role of variables rather than numbers or vectors of numbers. To perform symbolic differentiation, we require a computer algebra system (CAS), and in the SciPy stack, this is achieved mainly through the SymPy library (see http://docs.sympy.org/latest/index.html). Symbolic differentiation followed by evaluation is a good substitute for numerical differentiation for very basic functions. In general, however, this method leads to overcomplicated and inefficient code, and the speed of purely numerical differentiation is preferred, in spite of the possible occurrence of errors.
  • Automatic differentiation is another set of techniques to numerically evaluate the derivative of a function. It is not based upon any approximation schema. This is without a doubt the most powerful option in the context of high derivatives of multivariate functions.

    Note

    In the SciPy stack, this is performed through different unrelated libraries. Some of the most reliable are Theano (http://deeplearning.net/software/theano/) or FuncDesigner (http://www.openopt.org/FuncDesigner). For a comprehensive description and analysis of these techniques, a very good resource can be found at http://alexey.radul.name/ideas/2013/introduction-to-automatic-differentiation/.
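The integral approach mentioned above can be sketched in a few lines. Assuming the target function accepts complex arguments, Cauchy's integral formula turns the n-th derivative at a point into an integral over a small circle, which the trapezoidal rule approximates very accurately for smooth functions (the function name and parameter defaults below are illustrative choices, not part of SciPy):

```python
import math
import numpy as np

def cauchy_derivative(f, a, n=1, r=0.01, m=64):
    """Approximate the n-th derivative of f at a via Cauchy's integral
    formula, f^(n)(a) = n!/(2*pi*i) * integral of f(z)/(z-a)^(n+1) dz,
    discretized with the trapezoidal rule on a circle of radius r
    centered at a. The function f must accept complex arguments."""
    theta = 2 * np.pi * np.arange(m) / m           # equispaced angles
    samples = f(a + r * np.exp(1j * theta))        # f evaluated on the circle
    return math.factorial(n) / (m * r**n) * \
        np.real(np.sum(samples * np.exp(-1j * n * theta)))
```

For instance, cauchy_derivative(lambda z: z**5, 1.0) recovers the first derivative of x**5 at x=1 essentially to machine precision.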

Numerical differentiation

The most basic scheme for numerical differentiation is performed with the central difference formula with uniformly spaced nodes. To maintain symmetry, an odd number of nodes is required to guarantee smaller roundoff errors. An implementation of this simple algorithm is available in the module scipy.misc.

Tip

For information about the module scipy.misc, and enumeration of its basic routines, refer to the online documentation at http://docs.scipy.org/doc/scipy-0.13.0/reference/misc.html.

To approximate the first and second derivatives of, for example, the polynomial function f(x) = x^5 at x=1, with 15 equally spaced nodes (centered at x=1) at distance dx=1e-6, we could issue the following commands:

In [1]: import numpy as np
In [2]: from scipy.misc import derivative
In [3]: def f(x): return x**5
In [4]: derivative(f, 1.0, dx=1e-6, order=15)
Out[4]: 4.9999999997262723
In [5]: derivative(f, 1.0, dx=1e-6, order=15, n=2)
Out[5]: 19.998683310705456

These values are somewhat accurate, yet still disappointing, since the exact values are 5 and 20, respectively.
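For reference, the simplest three-point instance of the central difference scheme that derivative generalizes can be written directly (a plain sketch, not SciPy's implementation):

```python
def central_difference(f, x0, dx=1e-6):
    """Three-point central difference for the first derivative:
    f'(x0) ~ (f(x0 + dx) - f(x0 - dx)) / (2 * dx),
    with truncation error of order dx**2."""
    return (f(x0 + dx) - f(x0 - dx)) / (2 * dx)
```

Calling central_difference(lambda x: x**5, 1.0) already gives a value very close to the exact derivative 5.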

Tip

Another flaw of this method (at least with respect to the implementation coded in SciPy) is the fact that the result relies on potentially large sums, whose floating-point evaluation is not stable. As users, we could improve matters by modifying the loop in the source of scipy.misc.derivative with the Shewchuk algorithm, for instance.

Symbolic differentiation

Exact differentiation for polynomials can be achieved through the module numpy.polynomial:

In [6]: p = np.poly1d([1,0,0,0,0,0]);
   ...: print(p)
   5
1 x
In [7]: np.polyder(p,1)(1.0)     # equivalently: p.deriv()(1.0)
Out[7]: 5.0
In [8]: np.polyder(p,2)(1.0)     # equivalently: p.deriv(2)(1.0)
Out[8]: 20.0

Symbolic differentiation is another way to achieve exact results:

In [9]: from sympy import diff, symbols
In [10]: x = symbols('x', real=True)
In [11]: diff(x**5, x)
Out[11]: 5*x**4
In [12]: diff(x**5, x, x)
Out[12]: 20*x**3
In [13]: diff(x**5, x).subs(x, 1.0)      
Out[13]: 5.00000000000000
In [14]: diff(x**5, x, x).subs(x, 1.0)
Out[14]: 20.0000000000000
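When a symbolic derivative must be evaluated many times, repeatedly calling subs is slow. SymPy's lambdify can compile the expression into a fast NumPy function, combining the exactness of the symbolic approach with near-numerical speed:

```python
import numpy as np
from sympy import diff, lambdify, symbols

x = symbols('x', real=True)

# Compile the exact derivative 5*x**4 into a NumPy-backed callable
dfdx = lambdify(x, diff(x**5, x), 'numpy')

dfdx(1.0)                        # scalar evaluation: 5.0
dfdx(np.linspace(0.0, 1.0, 5))   # vectorized evaluation over an array
```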

Note the slight improvement (both in notation and simplicity of coding) when we differentiate functions more involved than simple polynomials. For example, for g(x) = e^(-x)sin(x) at x=1:

In [15]: def g(x): return np.exp(-x) * np.sin(x)
In [16]: derivative(g, 1.0, dx=1e-6, order=101)
Out[16]: -0.11079376536871781
In [17]: from sympy import sin as Sin, exp as Exp
In [18]: diff(Exp(-x) * Sin(x), x).subs(x, 1.0)
Out[18]: -0.110793765306699

A great advantage of symbolic differentiation over its numerical or automatic counterparts is the ease with which it computes partial derivatives. Let's illustrate this point by calculating a fourth-order mixed partial derivative of the multivariate function h(x,y,z) = e^(xyz) at x=1, y=1, and z=2:

In [19]: y, z = symbols('y z', real=True)
In [20]: diff(Exp(x * y * z), z, z, y, x).subs({x:1.0, y:1.0, z:2.0})
Out[20]: 133.003009780752
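In reasonably recent versions of SymPy, repeated variables can also be passed to diff as (variable, count) pairs, which reads better as the order of differentiation grows; the following is equivalent to the call above:

```python
from sympy import diff, exp, simplify, symbols

x, y, z = symbols('x y z', real=True)
d1 = diff(exp(x * y * z), z, z, y, x)    # repeat the variable explicitly
d2 = diff(exp(x * y * z), (z, 2), y, x)  # same derivative, (variable, count) form

# Both expressions agree symbolically, and evaluate to 133.003009780752
# (which is 18*exp(2)) at x=1, y=1, z=2.
value = float(d1.subs({x: 1.0, y: 1.0, z: 2.0}))
```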

Automatic differentiation

The third method employs automatic differentiation. For this example, we will use the library FuncDesigner:

In [21]: from FuncDesigner import oovar, exp as EXP, sin as SIN
In [22]: X = oovar('X'); 
   ....: G = EXP(-X) * SIN(X)
In [23]: G.D({X: 1.0}, X)
Out[23]: -0.11079376530669924

The result is noticeably more accurate than the one obtained with numerical differentiation, and there was no need to provide any extra parameters.
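To see why no step size is needed, here is a minimal forward-mode sketch with dual numbers (purely illustrative, and unrelated to FuncDesigner's internals): each quantity carries its value together with its derivative, and every operation propagates both through the exact chain rule, so no approximation ever enters.

```python
import math

class Dual:
    """A value paired with its derivative; arithmetic applies the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __neg__(self):
        return Dual(-self.val, -self.der)
    def __mul__(self, other):
        # product rule: (u*v)' = u'*v + u*v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

def exp(d):
    return Dual(math.exp(d.val), math.exp(d.val) * d.der)  # (e^u)' = e^u * u'

def sin(d):
    return Dual(math.sin(d.val), math.cos(d.val) * d.der)  # (sin u)' = cos(u) * u'

X = Dual(1.0, 1.0)      # seed: dX/dX = 1
G = exp(-X) * sin(X)    # G.der holds exp(-1)*(cos(1) - sin(1)), up to rounding
```

After these two lines, G.der matches the exact derivative of e^(-x)sin(x) at x=1 to machine precision, reproducing the FuncDesigner result above.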
