Projection theory plays an important role in subspace identification primarily because subspaces are created by transforming or “projecting” a vector into a lower dimensional space – the subspace [1–3]. We are primarily interested in two projection operators: (i) orthogonal and (ii) oblique or parallel. The orthogonal projection operator “projects” $x$ onto $\mathcal{Y}$ as $x/\mathcal{Y}$ or onto its complement as $x/\mathcal{Y}^{\perp}$, while the oblique projection operator “projects” $x$ onto $\mathcal{Y}$ “along” $\mathcal{Z}$ as $x/_{\mathcal{Z}}\,\mathcal{Y}$. For instance, orthogonally projecting the vector $x$, we have $x/\mathcal{Y}$ or $x/\mathcal{Y}^{\perp}$, while obliquely projecting $x$ onto $\mathcal{Y}$ along $\mathcal{Z}$ is $x/_{\mathcal{Z}}\,\mathcal{Y}$.
Mathematically, the $n$-dimensional vector space $\mathbb{R}^n$ can be decomposed into a direct sum of subspaces $\mathcal{Y}$ and $\mathcal{Z}$ such that
$$\mathbb{R}^n = \mathcal{Y} \oplus \mathcal{Z}$$
then for the vector $x \in \mathbb{R}^n$, we have
$$x = y + z; \qquad y \in \mathcal{Y},\; z \in \mathcal{Z}$$
with $y$ defined as the projection of $x$ onto $\mathcal{Y}$, while $z$ is defined as the projection of $x$ onto $\mathcal{Z}$.
When a particular projection is termed parallel ($\parallel$) or “oblique,” the corresponding projection operator that transforms $x$ onto $\mathcal{Y}$ along $\mathcal{Z}$ is defined as $E_{\mathcal{Y}\cdot\mathcal{Z}}$, then
$$x = E_{\mathcal{Y}\cdot\mathcal{Z}}\,x + E_{\mathcal{Z}\cdot\mathcal{Y}}\,x$$
with $E_{\mathcal{Y}\cdot\mathcal{Z}}\,x$ defined as the oblique projection of $x$ onto $\mathcal{Y}$ along $\mathcal{Z}$, while $E_{\mathcal{Z}\cdot\mathcal{Y}}\,x$ is defined as the oblique projection of $x$ onto $\mathcal{Z}$ along $\mathcal{Y}$.
When the intersection $\mathcal{Y} \cap \mathcal{Z} = \{0\}$, then the direct sum follows as
$$\mathbb{R}^n = \mathcal{Y} \oplus \mathcal{Z}$$
The matrix projection operation ($P$) is a linear operator in $\mathbb{R}^n$ satisfying the properties of linearity and homogeneity. It can be expressed as an idempotent matrix such that $P^2 = P$ with the null space of $P$ given by $\mathcal{N}(P) = \mathcal{R}(I - P)$. Here $P$ is the projection matrix onto $\mathcal{Y}$ along $\mathcal{Z}$ iff $P$ is idempotent. That is, the matrix $P$ is an orthogonal projection if and only if it is idempotent ($P^2 = P$) and symmetric ($P^{\top} = P$).
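As a minimal numerical sketch of these two properties (assuming `numpy`; the matrix `B` below is an arbitrary illustrative example, not taken from the text), one can build the orthogonal projection matrix onto the row space of `B` and check idempotency and symmetry:

```python
import numpy as np

# Build the orthogonal projection matrix Pi_B = B'(BB')^† B onto the
# row space of an example matrix B, and verify the two properties above.
rng = np.random.default_rng(0)
B = rng.standard_normal((2, 5))          # 2 rows spanning a subspace of R^5

P = B.T @ np.linalg.pinv(B @ B.T) @ B    # orthogonal projection matrix

print(np.allclose(P @ P, P))             # idempotent: P^2 = P
print(np.allclose(P, P.T))               # symmetric: P' = P
```

Both checks print `True`; dropping the symmetry condition while keeping idempotency would yield an oblique (non-orthogonal) projection instead.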
Suppose that $y^{\top} z = 0$; then the vectors $y$ and $z$ are defined as “mutually orthogonal.” If $y^{\top} z = 0$ for every $y \in \mathcal{Y}$ and every $z \in \mathcal{Z}$, then $\mathcal{Z}$ is orthogonal to $\mathcal{Y}$, expressed by $\mathcal{Z} = \mathcal{Y}^{\perp}$ and called the orthogonal complement of $\mathcal{Y}$. If the subspaces $\mathcal{Y}$ and $\mathcal{Z}$ are orthogonal ($\mathcal{Y} \perp \mathcal{Z}$), then their sum is called the direct sum such that
$$\mathbb{R}^n = \mathcal{Y} \oplus \mathcal{Y}^{\perp}$$
When $\mathcal{Y}$ and $\mathcal{Z}$ are orthogonal, satisfying the direct sum decomposition above, then $y$ is the orthogonal projection of $x$ onto $\mathcal{Y}$.
If we define the projection operator over a random space, then the random vector $y(\xi)$ resides in a Hilbert space $\mathcal{H}$ with finite second-order moments, that is, $E\{\|y\|^2\} < \infty$.
The projection operator is now defined in terms of the expectation such that the projection of $y$ onto $x$ is defined by
$$\hat{E}\{y\,|\,x\} = E\{y x^{\top}\}\,\big(E\{x x^{\top}\}\big)^{\dagger}\,x$$
for the pseudo-inverse operator given by $(\cdot)^{\dagger}$.
The corresponding projection onto the orthogonal complement ($x^{\perp}$) is defined by
$$\hat{E}\{y\,|\,x^{\perp}\} = y - \hat{E}\{y\,|\,x\}$$
Therefore, the space can be decomposed into the projections as
$$y = \hat{E}\{y\,|\,x\} + \hat{E}\{y\,|\,x^{\perp}\}$$
The oblique (parallel) projection operators follow as well for a random space; that is, the oblique projection operator of $y$ onto $x$ along $z$ is written $\hat{E}_z\{y\,|\,x\}$, and therefore, we have the operator decomposition for $y \in \mathcal{H}$
$$y = \hat{E}_z\{y\,|\,x\} + \hat{E}_x\{y\,|\,z\} + \hat{E}\left\{y\,\Big|\,\begin{bmatrix} x \\ z \end{bmatrix}^{\perp}\right\}$$
for $\hat{E}_z\{y\,|\,x\}$, the oblique (parallel) projection of $y$ onto $x$ along $z$, and $\hat{E}_x\{y\,|\,z\}$, the oblique projection of $y$ onto $z$ along $x$.
Let $A \in \mathbb{R}^{m \times n}$; then the row space of $A$ is spanned by the row vectors $\{\mathbf{a}_1^{\top}, \ldots, \mathbf{a}_m^{\top}\}$, while the column space of $A$ is spanned by the column vectors $\{\mathbf{a}_1, \ldots, \mathbf{a}_n\}$. That is,
$$\text{row space}(A) = \text{span}\{\mathbf{a}_i^{\top}\}; \quad i = 1, \ldots, m$$
and
$$\text{column space}(A) = \text{span}\{\mathbf{a}_j\}; \quad j = 1, \ldots, n$$
Any vector $\mathbf{y}^{\top}$ in the row space of $A$ is defined by
$$\mathbf{y}^{\top} = \sum_{i=1}^{m} \alpha_i\,\mathbf{a}_i^{\top} = \boldsymbol{\alpha}^{\top} A$$
and similarly for the column space
$$\mathbf{x} = \sum_{j=1}^{n} \beta_j\,\mathbf{a}_j = A\,\boldsymbol{\beta}$$
Therefore, the sets of vectors $\{\mathbf{a}_i^{\top}\}$ and $\{\mathbf{a}_j\}$ provide sets of basis vectors spanning the respective row or column spaces of $A$.
Projections or, more precisely, projection operators are well known from operations on vectors (e.g. the Gram–Schmidt orthogonalization procedure). These operations, evolving from solutions to the least-squares error minimization problem, can be interpreted in terms of matrix operations on the row and column spaces defined above. The orthogonal projection operator ($\Pi_B$) is defined in terms of the row space of a matrix $B$ by
$$\Pi_B = B^{\top}\,(B B^{\top})^{\dagger}\,B$$
with its corresponding orthogonal complement operator as
$$\Pi_B^{\perp} = I - \Pi_B = I - B^{\top}\,(B B^{\top})^{\dagger}\,B$$
These operators, when applied to a matrix $A$, “project” the row space of $A$ onto ($/$) the row space of $B$ such that $A/B$ is given by
$$A/B = A\,\Pi_B = A\,B^{\top}\,(B B^{\top})^{\dagger}\,B$$
and similarly onto the orthogonal complement of $B$ such that
$$A/B^{\perp} = A\,\Pi_B^{\perp} = A\,\big(I - B^{\top}\,(B B^{\top})^{\dagger}\,B\big)$$
Thus, a matrix can be decomposed in terms of its inherent orthogonal projections as a direct sum, as shown in Figure B.1a, that is,
$$A = A\,\Pi_B + A\,\Pi_B^{\perp} = A/B + A/B^{\perp}$$
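The direct-sum decomposition can be checked numerically (a sketch assuming `numpy`; `A` and `B` are arbitrary example matrices):

```python
import numpy as np

# Decompose the rows of A into the part lying in the row space of B
# and the part lying in its orthogonal complement.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 6))
B = rng.standard_normal((2, 6))

Pi_B = B.T @ np.linalg.pinv(B @ B.T) @ B      # row-space projector
A_on_B = A @ Pi_B                             # A/B
A_on_Bperp = A @ (np.eye(6) - Pi_B)           # A/B^perp

print(np.allclose(A, A_on_B + A_on_Bperp))    # direct-sum decomposition
print(np.allclose(A_on_B @ A_on_Bperp.T, 0))  # the two parts are orthogonal
```

Both checks print `True`: the two projections recover $A$ exactly, and their row spaces are mutually orthogonal.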
Similar relations exist if the projection is onto the column space of $B$; then it follows that the projection operator in this case is defined by
$$\Pi_B^{c} = B\,(B^{\top} B)^{\dagger}\,B^{\top}$$
along with the corresponding projection given by
$$\Pi_B^{c}\,A = B\,(B^{\top} B)^{\dagger}\,B^{\top} A$$
Numerically, these matrix operations are facilitated by the LQ-decomposition (see Appendix C) as
$$\begin{bmatrix} B \\ A \end{bmatrix} = \begin{bmatrix} L_{11} & 0 \\ L_{21} & L_{22} \end{bmatrix} \begin{bmatrix} Q_1 \\ Q_2 \end{bmatrix}$$
where the projections are given by
$$A/B = L_{21}\,Q_1, \qquad A/B^{\perp} = L_{22}\,Q_2$$
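The LQ route can be sketched numerically (assuming `numpy`, which has no direct LQ routine, so the LQ factors are obtained from the QR factorization of the transpose; `A` and `B` are arbitrary examples):

```python
import numpy as np

# Compute A/B from the LQ blocks of the stacked matrix [B; A] and
# compare with the explicit projector formula A Pi_B.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 8))
B = rng.standard_normal((2, 8))

M = np.vstack([B, A])
Qr, R = np.linalg.qr(M.T)        # M' = Qr R  =>  M = R' Qr' = L Q
L, Q = R.T, Qr.T                 # lower-triangular L, orthonormal rows Q

L21 = L[2:, :2]                  # coordinates of A's rows along the B-basis
Q1 = Q[:2, :]                    # orthonormal basis for the row space of B

A_on_B = L21 @ Q1                # A/B from the LQ blocks
Pi_B = B.T @ np.linalg.pinv(B @ B.T) @ B
print(np.allclose(A_on_B, A @ Pi_B))   # matches the projector formula
```

The check prints `True`; the LQ form avoids forming the $n \times n$ projector $\Pi_B$ explicitly, which matters when the column dimension is large (as it is for data matrices in subspace identification).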
The oblique projection () of the onto along () the is defined by . This projection operation can be expressed as the rows of two nonorthogonal matrices ( and ) and their corresponding orthogonal complement. Symbolically, the projection of the rows of onto the joint row space of and () can be expressed in terms of two oblique projections and one orthogonal complement. This projection is illustrated in Figure B.1 b, where we see that the is first orthogonally projected onto the joint row space of and (orthogonal complement) and then decomposed along and individually enabling the extraction of the oblique projection of the onto along () the .
Pragmatically, this relation can be expressed in a more “operational” form that is applied in Section 6.5 to derive the N4SID algorithm [1, 4], that is,
$$A/_B\,C = \big(A/B^{\perp}\big)\big(C/B^{\perp}\big)^{\dagger}\,C$$
for the pseudo-inverse operation $(\cdot)^{\dagger}$ (see Appendix C). Oblique projections also have the following properties that are useful in derivations:
$$B/_B\,C = 0, \qquad C/_B\,C = C$$
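The operational form and these two properties can be verified numerically (a sketch assuming `numpy`; the helper names `perp` and `oblique` are introduced here for illustration only):

```python
import numpy as np

# The "operational" oblique projection A/_B C = (A/B^perp)(C/B^perp)^† C,
# with checks of the properties B/_B C = 0 and C/_B C = C.
rng = np.random.default_rng(3)
A = rng.standard_normal((3, 10))
B = rng.standard_normal((2, 10))
C = rng.standard_normal((2, 10))

def perp(X, Bmat):
    """Project the rows of X onto the orthogonal complement of row space(Bmat)."""
    Pi = Bmat.T @ np.linalg.pinv(Bmat @ Bmat.T) @ Bmat
    return X @ (np.eye(X.shape[1]) - Pi)

def oblique(X, Bmat, Cmat):
    """Oblique projection X/_B C: onto row space(Cmat) along row space(Bmat)."""
    return perp(X, Bmat) @ np.linalg.pinv(perp(Cmat, Bmat)) @ Cmat

print(np.allclose(oblique(B, B, C), 0))  # B/_B C = 0
print(np.allclose(oblique(C, B, C), C))  # C/_B C = C
```

Both properties hold: projecting $B$ along its own row space annihilates it, while $C$ is reproduced exactly since it already lies in the target row space.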
Analogously, the orthogonal complement projection (onto the complement of the joint row space of $B$ and $C$) can be expressed in terms of the oblique projection operators as
$$A\Big/\begin{bmatrix} B \\ C \end{bmatrix}^{\perp} = A - A/_B\,C - A/_C\,B$$
Numerically, the oblique projection can again be computed using the LQ-decomposition (see Appendix C)
$$\begin{bmatrix} B \\ C \\ A \end{bmatrix} = \begin{bmatrix} L_{11} & 0 & 0 \\ L_{21} & L_{22} & 0 \\ L_{31} & L_{32} & L_{33} \end{bmatrix} \begin{bmatrix} Q_1 \\ Q_2 \\ Q_3 \end{bmatrix}$$
where the oblique projection of the row space of $A$ onto the row space of $C$ along the row space of $B$ is given by
$$A/_B\,C = L_{32}\,L_{22}^{\dagger}\,C$$
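The equivalence of the LQ-block form and the operational pseudo-inverse form can be checked numerically (a sketch assuming `numpy`, obtaining the LQ factors from the QR factorization of the transpose; the matrices are arbitrary examples):

```python
import numpy as np

# Compute the oblique projection A/_B C from the LQ blocks of the stacked
# matrix [B; C; A] and compare with the operational formula.
rng = np.random.default_rng(4)
A = rng.standard_normal((3, 12))
B = rng.standard_normal((2, 12))
C = rng.standard_normal((2, 12))

M = np.vstack([B, C, A])                 # 7 x 12 stack
Qr, R = np.linalg.qr(M.T)                # LQ via QR of the transpose
L = R.T                                  # 7 x 7 lower-triangular factor

L22 = L[2:4, 2:4]                        # C-block
L32 = L[4:, 2:4]                         # A-onto-C block
obl_lq = L32 @ np.linalg.pinv(L22) @ C   # A/_B C = L32 L22^† C

# Operational form: (A/B^perp)(C/B^perp)^† C
Pi_Bp = np.eye(12) - B.T @ np.linalg.pinv(B @ B.T) @ B
obl_op = (A @ Pi_Bp) @ np.linalg.pinv(C @ Pi_Bp) @ C
print(np.allclose(obl_lq, obl_op))
```

The check prints `True`; in the N4SID setting the stacked matrix is built from the input/output data block-Hankel matrices, and the LQ route is the numerically preferred computation.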
Some handy relationships between oblique and orthogonal projection matrices are given in Table B.1 (see [1–3, 5] for more details).
Table B.1 Matrix Projections.
| Operation | Operator | Projection | Numerical (LQ-decomposition) |
| --- | --- | --- | --- |
| Orthogonal | $\Pi_B = B^{\top}(B B^{\top})^{\dagger} B$ | $A/B = A\,\Pi_B$ | $A/B = L_{21} Q_1$ |
| Orthogonal complement | $\Pi_B^{\perp} = I - \Pi_B$ | $A/B^{\perp} = A\,\Pi_B^{\perp}$ | $A/B^{\perp} = L_{22} Q_2$ |
| Oblique | $(\cdot/B^{\perp})(C/B^{\perp})^{\dagger} C$ | $A/_B\,C = (A/B^{\perp})(C/B^{\perp})^{\dagger} C$ | $A/_B\,C = L_{32} L_{22}^{\dagger} C$ |
| Relations | — | $A = A/B + A/B^{\perp}$ | — |
| | — | $B/_B\,C = 0$ | — |
| | — | $C/_B\,C = C$ | — |