12

Realization of behavior

Abstract

This chapter presents a new notion of realization of behavior. It has been shown that realization of behavior generalizes the classical concept of realization of transfer function matrix. The basic idea in this approach is to find an autoregressive moving-average (ARMA) representation for a given frequency behavior description such that the known frequency behavior is completely recovered from the corresponding dynamical behavior. From this point of view, realization of behavior is seen to be a converse procedure to the latent variable eliminating process. Such a realization approach is believed to be highly significant in modeling a dynamical system in some real cases where the system behavior is conveniently described in the frequency domain. Since no numerical computation is needed, the realization of behavior is believed to be particularly suitable for situations in which the coefficients are symbolic rather than numerical.

Based on this idea, the behavior structures of the generalized chain-scattering representation (GCSR) and the dual generalized chain-scattering representation (DGCSR) have been clarified. It has been shown that any GCSR or any DGCSR develops the same (frequency) behavior. Subsequently the corresponding ARMA representations are proposed and are proved to be realizations of behavior for any GCSR and any DGCSR. More specifically, two Rosenbrock PMDs are found to be the realizations of behavior for any GCSR.

Keywords

Realization of behavior; Transfer function matrix; Autoregressive moving-average (ARMA); Latent variable eliminating process; Chain-scattering representation (CSR); Generalized chain-scattering representation (GCSR); Dual generalized chain-scattering representation (DGCSR); Polynomial matrix description (PMD); Infinite-dimensional systems

12.1 Introduction

In classical network theory, a circuit representation called the chain matrix [21] was widely used to deal with the cascade connection of circuits arising in analysis and synthesis problems. Based on this, Kimura [22] developed the chain-scattering representation (CSR), which was subsequently used to provide a unified framework for $H_\infty$ control theory. Kimura's approach is, however, only applicable to the special cases where the matrices P21 and P12 (refer to Eq. 11.1) satisfy certain full rank assumptions. Recently, in [31] this approach was extended to the general case in which such conditions are essentially relaxed. From an input-output consistency point of view, the generalized chain-scattering representation (GCSR) and the dual generalized chain-scattering representation (DGCSR) emerge and are there successfully used to characterize the cascade structure property and the symmetry of general plants in a general setting.

Recently, behavioral theory (see, e.g., [26, 27]) has received broad acceptance as an approach to modeling dynamical systems. One of the main features of the behavioral approach is that it does not use the conventional input-output structure in describing systems. Instead, a mathematical model is used in which the collection of time trajectories of the relevant variables is viewed as the behavior of the dynamical system. This approach has been shown [26, 28] to be powerful in system modeling and analysis. In contrast to this, the classical theories such as Kalman's state-space description and Rosenbrock's polynomial matrix description (PMD) take the input-output representation as their starting point. In many control contexts it has proven very convenient to adopt the classical input/state/output framework. In many situations the system models can easily be formulated as input/state/output models such as state space descriptions and PMDs. Based on such input/state/output representations, the action of the controller can usually be explained in a very natural manner and the control aims can usually be attained very effectively.

As far as the issue of system modeling is concerned, the computer-aided procedure, i.e., automated modeling technology, has been well developed as a practical approach over recent years. If the physical description of a system is known, the automated modeling approach can be applied to find a set of equations that describe the dynamical behavior of the given system. In many cases, such as in an electrical circuit or, more generally, in an interconnection of blocks, such a physical description is more conveniently specified through the frequency behavior of the system. Consequently, in these cases a more general theoretic problem arises: if the frequency behavior description of a system is known, what is the corresponding dynamical behavior? In other words, what is the input/output or input/state/output structure in the time domain that generates the given frequency domain description? It turns out that this question can be interpreted through the notion of realization of behavior, which we shall introduce in this chapter.

In fact, as we shall see later, realization of behavior, in many cases, amounts to the introduction of latent variables in the time domain. From this point of view, realization of behavior can be understood, in a sense, as a converse procedure to the latent variable elimination theorem [27]. It should be emphasized that realization of behavior also generalizes the notion of transfer function matrix realization in the classical control theory framework, as the behavior equation is a more general description than the transfer function matrix. As a special case, the behavior equations determine a transfer matrix of the system if they represent an input-output system [27, 29], i.e., when the matrices describing the system satisfy a full rank condition.

Recently in [30], a realization approach was suggested that reduces high-order linear differential equations to first-order system representations by the method of “linearization.” From the point of view of realization in a physical sense, however, one is forced to start from the system frequency behavior description, in which the system behavior is generally given, rather than from high-order linear differential equations in the time domain. A constructive procedure for autoregressive (AR) equation realization has also been introduced in [114].

One of the main aims of this chapter is to present a new notion of realization of behavior. Further to the results of [31], the input-output structure of the GCSRs and the DGCSRs is thus clarified by using this approach. Subsequently the corresponding autoregressive moving-average (ARMA) representations are proposed and are proved to be realizations of behavior for any GCSR and for any DGCSR. Once these ARMA representations are proposed, one can further find the corresponding first-order system representations by using the method of [30] or other well-developed realization approaches such as that of Kailath [32].

These results are interesting in that they provide a good insight into the natural relationship between the (frequency) behavior of any GCSR and any DGCSR and the (dynamical) behavior of the corresponding ARMA representations. Since no numerical computation is involved in this approach, realization of behavior is particularly accessible for situations in which the coefficients are symbolic rather than numerical. Based on the realization of behavior, one can further find the corresponding first-order system representations by using the classical realization approaches [30, 32]. Also, a constructive procedure for AR equation realization suggested by Vafiadis and Karcanias [114] can be used to obtain the required first-order system representations.

The following result can easily be proposed and proved by using the definition of the {1}-inverse of a polynomial matrix. It is presented here for later use.

Proposition 12.1.1

Let the ring of polynomial matrices be denoted by $\mathbb{R}[s]$, and let $A(s)\in\mathbb{R}[s]^{n\times m}$ with $\operatorname{rank}_{\mathbb{R}[s]}A(s)=r\le\min\{m,n\}$. Then there exist nonsingular polynomial matrices $Q(s)\in\mathbb{R}[s]^{n\times n}$, $P(s)\in\mathbb{R}[s]^{m\times m}$ such that

$$Q(s)A(s)P(s)=\begin{bmatrix}B(s) & 0\\ 0 & 0\end{bmatrix},$$

where the polynomial matrix $B(s)\in\mathbb{R}[s]^{r\times r}$ is invertible. Then for any polynomial matrix $L(s)\in\mathbb{R}[s]^{(m-r)\times(n-r)}$, the following matrix X(s) is a {1}-inverse of A(s), where X(s) is given by

$$X(s)=P(s)\begin{bmatrix}B^{-1}(s) & 0\\ 0 & L(s)\end{bmatrix}Q(s).$$
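As an illustrative check of Proposition 12.1.1, the following sketch (assuming the SymPy library; the matrices A(s), Q(s), P(s), and L(s) are toy choices, not taken from the text) verifies that the construction above yields a {1}-inverse, i.e., A(s)X(s)A(s) = A(s).

```python
# A minimal sketch, assuming SymPy: verify Proposition 12.1.1 on a toy
# rank-deficient polynomial matrix A(s); any polynomial block L(s) works.
import sympy as sp

s = sp.symbols('s')

A = sp.Matrix([[s, s], [s, s]])          # rank 1 over R[s]
Q = sp.Matrix([[1, 0], [-1, 1]])         # nonsingular row transformation
P = sp.Matrix([[1, -1], [0, 1]])         # nonsingular column transformation
assert sp.simplify(Q * A * P - sp.Matrix([[s, 0], [0, 0]])) == sp.zeros(2, 2)

B = sp.Matrix([[s]])                     # invertible r x r block (r = 1)
L = sp.Matrix([[s + 1]])                 # arbitrary polynomial block
X = P * sp.diag(B.inv(), L) * Q          # candidate {1}-inverse

# The defining {1}-inverse property: A X A = A
assert sp.simplify(A * X * A - A) == sp.zeros(2, 2)
print(X)
```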

12.2 Behavior realization

This section introduces the concept of realization of behavior. Recall that in the behavioral framework the (dynamical) behavioral equations of an ARMA representation [27] are

$$R_1(\rho)u_1(t)+R_2(\rho)y_1(t)=S(\rho)\xi(t), \qquad (12.1)$$

where $w:=[(y_1(t))^T,(u_1(t))^T]^T$ stands for the external variables representing the dynamical behavior of the underlying dynamical system, and $\xi(t)$ are the latent variables, corresponding to auxiliary variables resulting from the modeling procedure. $R_1(\rho)$, $R_2(\rho)$, and $S(\rho)$ are polynomial matrices in the differential operator $\rho = d/dt$. In order to distinguish them from the existing notation y, u of Eq. (11.1), the external variables are denoted by $y_1$ and $u_1$. When $S(\rho) = 0$, Eq. (12.1) is termed [27] an AR representation.

In the following approach we are interested in the external behavior of the system (12.1), where we choose the underlying function space to be $\mathcal{C}:=C^{\infty}(\mathbb{R}_+,\mathbb{R})$, the space of all infinitely differentiable functions defined for all time $t\in\mathbb{R}_+:=[0,+\infty)$ and taking values in the real number field $\mathbb{R}$. For brevity, we write $\mathcal{C}^k:=C^{\infty}(\mathbb{R}_+,\mathbb{R}^k)$. Then the dynamical external behavior of Eq. (12.1) is given by

$$\mathcal{B}_d(R_1,R_2;S)=\left\{\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}\in\mathcal{C}^{m+p}\;\middle|\;\exists\,\xi(t)\in\mathcal{C}^{n}\ \text{such that (12.1) is valid}\right\}=\left\{\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}\in\mathcal{C}^{m+p}\;\middle|\;[R_2(\rho),R_1(\rho)]\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}\in\operatorname{Im}S(\rho)\right\}. \qquad (12.2)$$

From the above, to avoid the trivial case in which the external behavior is empty, i.e., to ensure that the latent variables exist, every pair (y1(t), u1(t)) in the external behavior must be consistent, that is,

$$\bigl(I-S(\rho)S^{-}(\rho)\bigr)\bigl(R_1(\rho)u_1(t)+R_2(\rho)y_1(t)\bigr)=0,\quad\forall\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}\in\mathcal{B}_d(R_1,R_2;S),$$

where the {1}-inverse $S^{-}(\rho)$ is arbitrary. It is immediately noted that, when $S(\rho)$ is invertible or, more specifically, when $S(\rho)=I$, the above condition is automatically satisfied.

In many real cases (for example, electrical circuits), however, system behavior is usually described (see Example 12.2.1) in the frequency domain as

$$A(s)u^{*}(s)+B(s)y^{*}(s)=C(s)\eta(s), \qquad (12.3)$$

where $A(s)\in\mathbb{R}(s)^{q\times p}$, $B(s)\in\mathbb{R}(s)^{q\times m}$, and $C(s)\in\mathbb{R}(s)^{q\times n}$, as the following example suggests, are in general rational rather than polynomial matrices. The vector-valued signals $u^{*}(s)$, $y^{*}(s)$, and $\eta(s)$ live in the square (Lebesgue) integrable function spaces $\mathcal{L}_2^{p}$, $\mathcal{L}_2^{m}$, and $\mathcal{L}_2^{n}$, respectively.

Example 12.2.1

Consider the RC circuit shown in Fig. 12.1. The loop equations are written as

Fig. 12.1 An RC circuit.

$$\Bigl(R+\frac{2}{sC}\Bigr)I_1(s)-\frac{1}{sC}I_2(s)=E_{in}(s), \qquad (12.4)$$

$$-\frac{1}{sC}I_1(s)+\Bigl(R+\frac{1}{sC}\Bigr)I_2(s)=0. \qquad (12.5)$$

In order to obtain the transfer function $Z_t(s)=E_0(s)/E_{in}(s)$, one needs to solve for $I_2(s)$ to obtain

$$I_2(s)=\frac{E_{in}(s)\,Cs}{s^2R^2C^2+3sRC+1}$$

and then note that the output voltage is

$$E_0(s)=R\,I_2(s).$$

This suggests that $I_1(s)$ can be regarded as a latent variable; the original loop equations can thus be written as

$$\begin{bmatrix}1\\0\end{bmatrix}E_{in}(s)+\begin{bmatrix}\frac{1}{sC}\\[2pt] R+\frac{1}{sC}\end{bmatrix}I_2(s)=\begin{bmatrix}R+\frac{2}{sC}\\[2pt] \frac{1}{sC}\end{bmatrix}I_1(s), \qquad (12.6)$$

which is just in the form of Eq. (12.3). The coefficient matrices of the external variables and the latent variables are seen to be rational but not necessarily polynomial.
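To make the elimination concrete, the following sketch (assuming the SymPy library; the variable names are illustrative) solves the loop equations (12.4)-(12.5) for the latent loop current I1(s) and recovers the transfer function Zt(s) quoted above.

```python
# A minimal sketch, assuming SymPy: eliminate the latent loop current I1 from
# the loop equations (12.4)-(12.5) and recover Zt(s) = E0(s)/Ein(s).
import sympy as sp

s, R, C = sp.symbols('s R C', positive=True)
I1, I2, Ein = sp.symbols('I1 I2 Ein')

eq1 = sp.Eq((R + 2/(s*C))*I1 - 1/(s*C)*I2, Ein)   # Eq. (12.4)
eq2 = sp.Eq(-1/(s*C)*I1 + (R + 1/(s*C))*I2, 0)    # Eq. (12.5)

sol = sp.solve([eq1, eq2], [I1, I2], dict=True)[0]
E0 = R * sol[I2]                                   # output voltage E0 = R*I2
Zt = sp.simplify(E0 / Ein)
print(Zt)   # should simplify to C*R*s/(C**2*R**2*s**2 + 3*C*R*s + 1)
```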

As a special case, when C(s) = 0 and B(s) is invertible, Eq. (12.3) determines a transfer function $G(s)=-B^{-1}(s)A(s)$.

The frequency behavior of Eq. (12.3) is given by

$$\mathcal{B}_f(A,B;C)=\left\{\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}\in\mathcal{L}_2^{m+p}\;\middle|\;\exists\,\eta(s)\in\mathcal{L}_2^{n}\ \text{such that (12.3) is valid}\right\}=\left\{\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}\in\mathcal{L}_2^{m+p}\;\middle|\;[B(s),A(s)]\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}\in\operatorname{Im}C(s)\right\}. \qquad (12.7)$$

Here the matrix B(s) is not necessarily invertible. It will be seen later that this is the reason why realization of behavior turns out to be a generalization of realization of transfer function. Instead of this condition, we only make the relaxed assumption that every $u^{*}$ in the behavior is consistent, i.e.,

$$\bigl(I-B(s)B^{-}(s)\bigr)A(s)u^{*}(s)=0,\quad\forall\,u^{*}(s)\in\mathcal{B}_f(A,B;C),$$

where $B^{-}(s)$ is any {1}-inverse of B(s). Again it should be noted that, when B(s) is invertible or, more specifically, when B(s) = I, the above condition is automatically satisfied. By denoting

$$\mathcal{L}(\mathcal{B}_d)(R_1,R_2;S)=\left\{\begin{bmatrix}\hat{y}_1(s)\\\hat{u}_1(s)\end{bmatrix}\;\middle|\;\begin{bmatrix}\hat{y}_1(s)\\\hat{u}_1(s)\end{bmatrix}=\mathcal{L}\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix},\ \begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}\in\mathcal{B}_d(R_1,R_2;S),\ \rho^{i}y_1(0)=0,\ \rho^{i}u_1(0)=0,\ \rho^{i}\xi(0)=0,\ i=0,1,\ldots\right\},$$

where $\mathcal{L}f(t)$ denotes the Laplace transform of f(t), the definition of realization of behavior follows.

Definition 12.2.1

Given a frequency behavior description (Eq. 12.3), if there exists an ARMA representation (Eq. 12.1), i.e., there exist polynomial matrices R1(ρ), R2(ρ), and S(ρ) such that

$$\mathcal{L}(\mathcal{B}_d)(R_1,R_2;S)=\mathcal{B}_f(A,B;C),$$

then the ARMA representation (Eq. 12.1) is said to be a realization of behavior for Eq. (12.3).

Remark 12.2.1

It should be noted that, in the frequency behavior description (Eq. 12.3), the matrices A(s) and B(s) are assumed to be rational, not necessarily polynomial. If they are polynomial, it is obvious that the following AR representation [27]

$$A(\rho)u_1(t)+B(\rho)y_1(t)=0$$

is a realization of behavior for Eq. (12.3). In this case, one does not need to introduce any latent variable.

Remark 12.2.2

The above concept is a generalization of the classical notion of realization of transfer function matrix.

To see this, let us consider the special case, when B(s) is invertible and C(s) = 0. Then Eq. (12.3) determines a transfer matrix and can be written as

$$y^{*}(s)=-B^{-1}(s)A(s)u^{*}(s). \qquad (12.8)$$

If there exist polynomial matrices T(ρ), U(ρ), V (ρ), and W(ρ) of appropriate dimensions such that T(ρ) is invertible and

$$-B^{-1}(s)A(s)=V(s)T^{-1}(s)U(s)+W(s), \qquad (12.9)$$

then by Definition 12.2.1, it is easy to verify that the following ARMA representation

$$\begin{bmatrix}I\\0\end{bmatrix}y_1(t)+\begin{bmatrix}-W(\rho)\\U(\rho)\end{bmatrix}u_1(t)=\begin{bmatrix}V(\rho)\\T(\rho)\end{bmatrix}x(t) \qquad (12.10)$$

is a realization of behavior for the frequency behavior description (Eq. 12.3). It is noted that Eq. (12.10) is nothing but the Rosenbrock PMD

$$\begin{aligned}T(\rho)x(t)&=U(\rho)u_1(t)\\ y_1(t)&=V(\rho)x(t)+W(\rho)u_1(t).\end{aligned}$$

The condition of consistency is seen to be satisfied because of the invertibility of T(ρ).

When $T(\rho)=\rho E-A$ with E singular, the above PMD is termed a singular system, while when $T(\rho)=\rho I-A$, the above description is known as the conventional state space system. It is clearly seen that in the above special cases, realization of behavior is equivalent to realization of transfer function in the classical sense.
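To illustrate this special case, the following sketch (assuming the SymPy library; the scalar behavior description A(s), B(s) and the companion-form matrices are illustrative choices, not taken from the text) realizes a frequency behavior with invertible B(s) and C(s) = 0 by a PMD with $T(\rho)=\rho I-A$ and checks that its transfer $V(s)T^{-1}(s)U(s)+W(s)$ recovers $G(s)=-B^{-1}(s)A(s)$.

```python
# A minimal sketch, assuming SymPy: a scalar frequency behavior description
# A(s)u* + B(s)y* = 0 with invertible B(s) (so C(s) = 0) determines the transfer
# G(s) = -B^{-1}(s)A(s); the companion-form PMD below, with T(rho) = rho*I - A,
# is one realization whose transfer V*T^{-1}*U + W recovers G(s).
import sympy as sp

s = sp.symbols('s')
Acoef = sp.Matrix([[-1]])                     # A(s) in the behavior description
Bcoef = sp.Matrix([[s**2 + 3*s + 2]])         # B(s), invertible as a rational matrix
G = sp.simplify(-Bcoef.inv() * Acoef)         # G(s) = -B^{-1}(s)A(s) = 1/(s**2 + 3*s + 2)

# Companion-form state-space data (an illustrative choice of T, U, V, W)
A = sp.Matrix([[0, 1], [-2, -3]])
B = sp.Matrix([[0], [1]])
C = sp.Matrix([[1, 0]])
D = sp.Matrix([[0]])
T, U, V, W = s*sp.eye(2) - A, B, C, D

G_pmd = sp.simplify(V * T.inv() * U + W)
assert sp.simplify(G_pmd - G) == sp.zeros(1, 1)
print(G_pmd)
```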

12.3 Realization of behavior for GCSRs and DGCSRs

Before developing the realization of behavior for GCSRs, we will establish a realization of behavior for the general plant. Given the general plant described by

$$\begin{bmatrix}z(s)\\y(s)\end{bmatrix}=P(s)\begin{bmatrix}w(s)\\u(s)\end{bmatrix}=\begin{bmatrix}P_{11}(s)&P_{12}(s)\\P_{21}(s)&P_{22}(s)\end{bmatrix}\begin{bmatrix}w(s)\\u(s)\end{bmatrix}, \qquad (12.11)$$

where $P_{ij}(s)$ (i = 1, 2; j = 1, 2) are all rational matrices with dimensions $m_i\times k_j$, so that the rational matrix $P(s)\in\mathbb{R}(s)^{(m_1+m_2)\times(k_1+k_2)}$. It is well-known [4] that there always exist nonunique pairs of polynomial matrices $P_1(s)\in\mathbb{R}[s]^{(m_1+m_2)\times(m_1+m_2)}$, with $P_1(s)$ nonsingular, and $P_2(s)\in\mathbb{R}[s]^{(m_1+m_2)\times(k_1+k_2)}$ such that

$$P(s)=P_1^{-1}(s)P_2(s). \qquad (12.12)$$

It should be noted that here $P_1(s)$ and $P_2(s)$ need not be coprime. In this way, the following result is obtained.

Theorem 12.3.1

The following AR representation

$$-P_2(\rho)u_1(t)+P_1(\rho)y_1(t)=0 \qquad (12.13)$$

is a realization of behavior for the general plant (Eq. 12.11), where the external variables are denoted by

$$\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}:=\begin{bmatrix}z(t)\\y(t)\\w(t)\\u(t)\end{bmatrix},$$

and the polynomial matrices P1(ρ), P2(ρ) satisfy Eq. (12.12).

Proof

Under the decomposition (Eq. 12.12), Eq. (12.11) can be written into

$$P_1(s)\begin{bmatrix}z(s)\\y(s)\end{bmatrix}=P_2(s)\begin{bmatrix}w(s)\\u(s)\end{bmatrix},$$

the above frequency behavior is seen to be $\mathcal{B}_f=\operatorname{Ker}\,[P_1(s),-P_2(s)]$, while the dynamical external behavior of the AR representation (Eq. 12.13) is

$$\mathcal{B}_d=\left\{\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}\in\mathcal{C}^{m_1+m_2+k_1+k_2}\;\middle|\;[P_1(\rho),-P_2(\rho)]\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}=0\right\}.$$

Now let

$$\begin{aligned}P_1(\rho)&=P_{10}+P_{11}\rho+\cdots+P_{1q_1}\rho^{q_1},\quad P_{1j}\in\mathbb{R}^{(m_1+m_2)\times(m_1+m_2)},\ j=0,1,\ldots,q_1,\\ P_2(\rho)&=P_{20}+P_{21}\rho+\cdots+P_{2q_2}\rho^{q_2},\quad P_{2j}\in\mathbb{R}^{(m_1+m_2)\times(k_1+k_2)},\ j=0,1,\ldots,q_2.\end{aligned}$$

The Laplace transformation of Eq. (12.13) with zero initial condition yields

$$-P_2(s)\hat{u}_1(s)+P_1(s)\hat{y}_1(s)=0, \qquad (12.14)$$

where $\hat{u}_1(s):=\int_0^{+\infty}u_1(t)e^{-st}\,dt$, $\hat{y}_1(s):=\int_0^{+\infty}y_1(t)e^{-st}\,dt$. Thus Eq. (12.14) gives

$$\mathcal{L}(\mathcal{B}_d)=\mathcal{B}_f.$$

Hence the theorem follows from Definition 12.2.1.

Remark 12.3.1

The above realizations are not unique, due to the fact that the decompositions (Eq. 12.12) are not unique. Of course there exists a minimal realization among all the realizations. The main difference between a minimal realization and a nonminimal one is whether the realization description contains redundant information. In some situations it is important to have a description of the behavior structure in terms of the original data, and the system may depend on parameters; redundant descriptions are then preferable to minimal descriptions because they allow more freedom to incorporate the dependence on the parameters in a convenient way. Usually one is forced to deal with the issue of redundancy in actual system analysis. In the context of state space systems and generalized state space systems, this importance has already been stressed by Rosenbrock [5] and Luenberger [115]. For properties of first-order realizations of AR equations, one can refer to [114].

Recall now Theorem 11.2.2. If the input-output pair (u(s), y(s)) is consistent about w to the plant P, then the GCSR is represented by

$$\begin{bmatrix}z(s)\\w(s)\end{bmatrix}=\mathrm{GCHAIN}(P;P_{21}^{-})\begin{bmatrix}u(s)\\y(s)\\h(s)\end{bmatrix}=\mathrm{GCHAIN}^{*}(P;P_{21}^{-})\begin{bmatrix}u(s)\\y(s)\end{bmatrix}+\Delta\mathrm{GCHAIN}(P;P_{21}^{-})h(s), \qquad (12.15)$$

where we denote the GCSR matrix

$$\mathrm{GCHAIN}(P;P_{21}^{-})=[\,\mathrm{GCHAIN}^{*}(P;P_{21}^{-})\ \ \Delta\mathrm{GCHAIN}(P;P_{21}^{-})\,]=\begin{bmatrix}P_{12}-P_{11}P_{21}^{-}P_{22} & P_{11}P_{21}^{-} & P_{11}(I-P_{21}^{-}P_{21})\\ -P_{21}^{-}P_{22} & P_{21}^{-} & I-P_{21}^{-}P_{21}\end{bmatrix}.$$

The above GCSR gives rise to the frequency behavior

$$\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{-})):=\left\{\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}:=\begin{bmatrix}z(s)\\w(s)\\u(s)\\y(s)\end{bmatrix}\;\middle|\;y^{*}(s)=\mathrm{GCHAIN}^{*}(P;P_{21}^{-})u^{*}(s)+\Delta\mathrm{GCHAIN}(P;P_{21}^{-})h(s),\ h(s)\ \text{is arbitrary},\ (u(s),y(s))\ \text{is consistent to}\ P\right\}.$$

It can be proved that every GCSR gives rise to the same frequency behavior, in other words, the frequency behavior of GCSRs is independent of the particular {1}-inverse. The following theorem establishes this observation.

Theorem 12.3.2

Given any two GCSRs $\mathrm{GCHAIN}(P;P_{21}^{-})$, $\mathrm{GCHAIN}(P;P_{21}^{g-})$, which are formulated in terms of two {1}-inverses of $P_{21}$ respectively, one has

$$\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{-}))=\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{g-})).$$

Proof

One only needs to prove that

$$\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{-}))\subseteq\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{g-})).$$

The converse statement

$$\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{g-}))\subseteq\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{-}))$$

can be proved similarly.

From Lemma 11.1.2, there exists a matrix K(s) such that

$$P_{21}^{-}(s)=P_{21}^{g-}(s)+K(s)-P_{21}^{g-}(s)P_{21}(s)K(s)P_{21}(s)P_{21}^{g-}(s). \qquad (12.16)$$

Also, the input-output pair (u(s), y(s)), if it is consistent to the plant P, must satisfy (Theorem 11.2.1)

$$\bigl(I-P_{21}(s)P_{21}^{g-}(s)\bigr)[-P_{22}(s),\ I]\begin{bmatrix}u(s)\\y(s)\end{bmatrix}=0.$$

It follows that

$$y(s)-P_{22}(s)u(s)=P_{21}(s)P_{21}^{g-}(s)\bigl(y(s)-P_{22}(s)u(s)\bigr). \qquad (12.17)$$

Given any

$$\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}\in\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{-})),$$

there exists a rational vector $h_1(s)$ such that

$$\begin{bmatrix}z(s)\\w(s)\end{bmatrix}=y^{*}(s)=\mathrm{GCHAIN}^{*}(P;P_{21}^{-})u^{*}(s)+\Delta\mathrm{GCHAIN}(P;P_{21}^{-})h_1(s)=\begin{bmatrix}P_{12}(s)-P_{11}(s)P_{21}^{-}(s)P_{22}(s) & P_{11}(s)P_{21}^{-}(s)\\ -P_{21}^{-}(s)P_{22}(s) & P_{21}^{-}(s)\end{bmatrix}\begin{bmatrix}u(s)\\y(s)\end{bmatrix}+\begin{bmatrix}P_{11}(s)\bigl(I-P_{21}^{-}(s)P_{21}(s)\bigr)\\ I-P_{21}^{-}(s)P_{21}(s)\end{bmatrix}h_1(s). \qquad (12.18)$$

By substituting Eqs. (12.16), (12.17) into Eq. (12.18), one obtains

$$\begin{aligned}z(s)&=\bigl(P_{12}(s)-P_{11}(s)P_{21}^{-}(s)P_{22}(s)\bigr)u(s)+P_{11}(s)P_{21}^{-}(s)y(s)+P_{11}(s)\bigl(I-P_{21}^{-}(s)P_{21}(s)\bigr)h_1(s)\\ &=\bigl[P_{12}(s)-P_{11}(s)\bigl(P_{21}^{g-}(s)+K(s)-P_{21}^{g-}(s)P_{21}(s)K(s)P_{21}(s)P_{21}^{g-}(s)\bigr)P_{22}(s)\bigr]u(s)\\ &\quad+P_{11}(s)\bigl(P_{21}^{g-}(s)+K(s)-P_{21}^{g-}(s)P_{21}(s)K(s)P_{21}(s)P_{21}^{g-}(s)\bigr)y(s)\\ &\quad+P_{11}(s)\bigl[I-\bigl(P_{21}^{g-}(s)+K(s)-P_{21}^{g-}(s)P_{21}(s)K(s)P_{21}(s)P_{21}^{g-}(s)\bigr)P_{21}(s)\bigr]h_1(s)\\ &=\bigl(P_{12}(s)-P_{11}(s)P_{21}^{g-}(s)P_{22}(s)\bigr)u(s)+P_{11}(s)P_{21}^{g-}(s)y(s)\\ &\quad+P_{11}(s)\bigl(I-P_{21}^{g-}(s)P_{21}(s)\bigr)\bigl[K(s)\bigl(y(s)-P_{22}(s)u(s)\bigr)+\bigl(I-K(s)P_{21}(s)\bigr)h_1(s)\bigr];\end{aligned}$$

$$\begin{aligned}w(s)&=-P_{21}^{-}(s)P_{22}(s)u(s)+P_{21}^{-}(s)y(s)+\bigl(I-P_{21}^{-}(s)P_{21}(s)\bigr)h_1(s)\\ &=-\bigl(P_{21}^{g-}(s)+K(s)-P_{21}^{g-}(s)P_{21}(s)K(s)P_{21}(s)P_{21}^{g-}(s)\bigr)P_{22}(s)u(s)\\ &\quad+\bigl(P_{21}^{g-}(s)+K(s)-P_{21}^{g-}(s)P_{21}(s)K(s)P_{21}(s)P_{21}^{g-}(s)\bigr)y(s)\\ &\quad+\bigl[I-\bigl(P_{21}^{g-}(s)+K(s)-P_{21}^{g-}(s)P_{21}(s)K(s)P_{21}(s)P_{21}^{g-}(s)\bigr)P_{21}(s)\bigr]h_1(s)\\ &=-P_{21}^{g-}(s)P_{22}(s)u(s)+P_{21}^{g-}(s)y(s)+\bigl(I-P_{21}^{g-}(s)P_{21}(s)\bigr)\bigl[K(s)\bigl(y(s)-P_{22}(s)u(s)\bigr)+\bigl(I-K(s)P_{21}(s)\bigr)h_1(s)\bigr].\end{aligned}$$

By letting

$$h_2(s):=K(s)\bigl(y(s)-P_{22}(s)u(s)\bigr)+\bigl(I-K(s)P_{21}(s)\bigr)h_1(s),$$

the above expressions for z(s) and w(s) can be written in the following matrix form

$$\begin{bmatrix}z(s)\\w(s)\end{bmatrix}=\mathrm{GCHAIN}^{*}(P;P_{21}^{g-})u^{*}(s)+\Delta\mathrm{GCHAIN}(P;P_{21}^{g-})h_2(s),$$

which displays the fact that

$$\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}\in\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{g-})).$$

Consequently,

$$\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{-}))\subseteq\mathcal{B}_f(I,\mathrm{GCHAIN}(P;P_{21}^{g-})).$$

This finishes the proof.

By virtue of the above theorem, the frequency behavior of any GCSR can thus simply be denoted by $\mathcal{B}_f(I,\mathrm{GCHAIN}(P))$.
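The key algebraic fact behind Theorem 12.3.2 is the {1}-inverse parametrization (12.16). As a small illustration (assuming the SymPy library; the matrices below are toy choices, not taken from the text), the following sketch checks that perturbing one {1}-inverse by an arbitrary K in the manner of Eq. (12.16) again yields a {1}-inverse.

```python
# A minimal sketch, assuming SymPy: if G is a {1}-inverse of A, then
# G + K - G*A*K*A*G (the structure of Eq. (12.16)) is again a {1}-inverse.
import sympy as sp

s = sp.symbols('s')
A = sp.Matrix([[s, s], [s, s]])           # toy rank-1 polynomial matrix
G = sp.Matrix([[1/s, 0], [0, 0]])         # one {1}-inverse: A*G*A = A
assert sp.simplify(A * G * A - A) == sp.zeros(2, 2)

K = sp.Matrix(2, 2, sp.symbols('k11 k12 k21 k22'))   # arbitrary parameter matrix
Gp = G + K - G * A * K * A * G            # parametrized candidate {1}-inverse
assert sp.simplify(A * Gp * A - A) == sp.zeros(2, 2)
```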

One of the remaining aims of this section is to show how the frequency behavior of GCSRs can be realized as the dynamical behavior of an ARMA representation through the approach of realization of behavior. To this end, the general plant (Eq. 12.11) is rewritten into

$$\begin{bmatrix}z(s)\\y(s)\end{bmatrix}=\begin{bmatrix}\dfrac{P_{11}^{*}(s)}{g(s)} & \dfrac{P_{12}^{*}(s)}{g(s)}\\[6pt] \dfrac{P_{21}^{*}(s)}{g(s)} & \dfrac{P_{22}^{*}(s)}{g(s)}\end{bmatrix}\begin{bmatrix}w(s)\\u(s)\end{bmatrix}, \qquad (12.19)$$

where g(s) is the least common (monic) multiple of the denominator polynomials of all the entries in P(s), and Pij*(s)/g(s) = Pij(s), i = 1, 2; j = 1, 2. It is immediately noted that the above decomposition of P is a special case of Eq. (12.12). By letting

$$\begin{bmatrix}z_c(s)\\y_c(s)\end{bmatrix}=g(s)I_{m_1+m_2}\begin{bmatrix}z(s)\\y(s)\end{bmatrix}, \qquad (12.20)$$

$$P^{*}(s)=\begin{bmatrix}P_{11}^{*}(s) & P_{12}^{*}(s)\\P_{21}^{*}(s) & P_{22}^{*}(s)\end{bmatrix}, \qquad (12.21)$$

where $I_{m_1+m_2}$ is the identity matrix of dimension $m_1+m_2$. Eq. (12.19) thus takes the form

$$\begin{bmatrix}z_c(s)\\y_c(s)\end{bmatrix}=P^{*}(s)\begin{bmatrix}w(s)\\u(s)\end{bmatrix}. \qquad (12.22)$$

It should be noted that here $P^{*}(s)$ is a polynomial matrix and g(s) is a polynomial.
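The construction of g(s) and P*(s) is purely symbolic. A minimal sketch of it (assuming the SymPy library; the rational plant P(s) below is illustrative, not taken from the text) is as follows.

```python
# A minimal sketch, assuming SymPy: form g(s), the least common (monic) multiple
# of the entry denominators of a rational P(s), and the polynomial matrix
# P*(s) = g(s)*P(s) used in (12.19)-(12.22).
import functools
import sympy as sp

s = sp.symbols('s')
P = sp.Matrix([[1/(s + 1), s/(s + 2)],
               [1,         1/((s + 1)*(s + 2))]])      # illustrative rational plant

dens = [sp.fraction(sp.cancel(e))[1] for e in P]        # entry denominators
g = functools.reduce(sp.lcm, dens)                      # least common multiple
g = sp.Poly(g, s).monic().as_expr()                     # normalize to a monic polynomial
P_star = (g * P).applyfunc(sp.cancel)                   # polynomial matrix P*(s)

assert all(sp.fraction(e)[1] == 1 for e in P_star)      # every entry is polynomial
print(g)        # expected: s**2 + 3*s + 2, i.e., (s + 1)*(s + 2)
print(P_star)
```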

As shown before, the realization of behavior for a general plant is rather straightforward, while the realization of behavior for a GCSR is more involved, since in this case the introduction of latent variables is necessary. To propose a realization of behavior for GCSRs, consider the following ARMA representation

$$\begin{bmatrix}-P_{22}^{*}(\rho) & g(\rho)I\\ -P_{12}^{*}(\rho) & 0\\ 0 & 0\end{bmatrix}\begin{bmatrix}u(t)\\y(t)\end{bmatrix}+\begin{bmatrix}0 & 0\\ g(\rho)I & 0\\ 0 & I\end{bmatrix}\begin{bmatrix}z(t)\\w(t)\end{bmatrix}=\begin{bmatrix}P_{21}^{*}(\rho)\\P_{11}^{*}(\rho)\\I\end{bmatrix}x(t),\quad t\ge 0. \qquad (12.23)$$

The above ARMA representation is in fact

$$\begin{aligned}P_{21}^{*}(\rho)x(t)&=[-P_{22}^{*}(\rho),\ g(\rho)I]\begin{bmatrix}u(t)\\y(t)\end{bmatrix}\\ \begin{bmatrix}g(\rho)I & 0\\ 0 & I\end{bmatrix}\begin{bmatrix}z(t)\\w(t)\end{bmatrix}&=\begin{bmatrix}P_{11}^{*}(\rho)\\I\end{bmatrix}x(t)+\begin{bmatrix}P_{12}^{*}(\rho) & 0\\ 0 & 0\end{bmatrix}\begin{bmatrix}u(t)\\y(t)\end{bmatrix},\end{aligned} \qquad (12.24)$$

where all the identity matrices and all the zero block matrices are of appropriate dimensions, and $x(t)\in\mathcal{C}^{k_1}$ are the latent variables. It is noted that the condition that every pair

$$\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}:=\begin{bmatrix}z(t)\\w(t)\\u(t)\\y(t)\end{bmatrix}\in\mathcal{B}_d$$

is consistent about the latent variables x(t) is equivalent to the condition that every input-output pair (u(s), y(s)) is consistent about w.

The following technical result will be used in the proof of Theorem 12.3.3. It is stated as a lemma.

Lemma 12.3.1

Let all the initial values of u(⋅), y(⋅), and their derivatives be zero. The following condition of consistency in the time domain

$$\bigl[I-P_{21}^{*}(\rho)(P_{21}^{*})^{-}(\rho)\bigr][-P_{22}^{*}(\rho),\ g(\rho)I]\begin{bmatrix}u(t)\\y(t)\end{bmatrix}=0$$

is equivalent to the following consistency condition in the frequency domain

$$\bigl[I-P_{21}^{*}(s)(P_{21}^{*})^{-}(s)\bigr][-P_{22}^{*}(s),\ g(s)I]\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}=0,$$

i.e.,

$$\bigl[I-P_{21}(s)P_{21}^{-}(s)\bigr][-P_{22}(s),\ I]\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}=0,$$

where $\hat{u}(s)$, $\hat{y}(s)$ denote the Laplace transforms of u(t) and y(t), respectively.

Proof

In view of the properties of the Laplace transform and the inverse Laplace transform, one only needs to note that the matrix $I-P_{21}^{*}(\cdot)(P_{21}^{*})^{-}(\cdot)$ is a polynomial matrix for any {1}-inverse of $P_{21}^{*}$. This can be verified easily by using Proposition 12.1.1.

Now we are ready to state and prove the following result.

Theorem 12.3.3

The ARMA representation (Eq. 12.23) is a realization of behavior for any GCSR $\mathrm{GCHAIN}(P;P_{21}^{-})$.

Proof

Given any GCSR, its frequency behavior is $\mathcal{B}_f(I,\mathrm{GCHAIN}(P))$. The dynamical external behavior of the ARMA representation (Eq. 12.23) is

$$\mathcal{B}_d=\left\{\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}=\begin{bmatrix}z(t)\\w(t)\\u(t)\\y(t)\end{bmatrix}\in\mathcal{C}^{m_1+m_2+k_1+k_2}\;\middle|\;\exists\,x(t)\in\mathcal{C}^{k_1}\ \text{such that (12.23) is valid}\right\}. \qquad (12.25)$$

For any $\begin{bmatrix}y_1(t)\\u_1(t)\end{bmatrix}\in\mathcal{B}_d$, to ensure that there exists x(t) such that Eq. (12.23), i.e., Eq. (12.24), is valid, $u_1(t)$ must be consistent to Eq. (12.24). This means that

$$\bigl[I-P_{21}^{*}(\rho)(P_{21}^{*})^{-}(\rho)\bigr][-P_{22}^{*}(\rho),\ g(\rho)I]\begin{bmatrix}u(t)\\y(t)\end{bmatrix}=0.$$

It is easily seen to be equivalent to

$$\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}\in\operatorname{Ker}\bigl\{[I-P_{21}(s)P_{21}^{-}(s)][-P_{22}(s),\ I]\bigr\}.$$

The Laplace transformation of Eq. (12.24) (with zero initial conditions) yields

$$P_{21}^{*}(s)\hat{x}(s)=[-P_{22}^{*}(s),\ g(s)I]\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}, \qquad (12.26)$$

and

$$\begin{bmatrix}g(s)I & 0\\ 0 & I\end{bmatrix}\begin{bmatrix}\hat{z}(s)\\\hat{w}(s)\end{bmatrix}=\begin{bmatrix}P_{11}^{*}(s)\\I\end{bmatrix}\hat{x}(s)+\begin{bmatrix}P_{12}^{*}(s) & 0\\ 0 & 0\end{bmatrix}\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}, \qquad (12.27)$$

where $\hat{f}(s):=\int_0^{+\infty}f(t)e^{-st}\,dt$. For the reason stated in the proof of Theorem 12.3.1, all the Laplace-transformed initial vectors that are associated with the initial values of the variables and their derivatives are zero.

Due to the consistency of $\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}$, Eq. (12.26) determines the latent variables $\hat{x}(s)$. By solving for the latent variables $\hat{x}$ in Eq. (12.26) and then substituting into Eq. (12.27), one obtains

$$\begin{bmatrix}g(s)I & 0\\ 0 & I\end{bmatrix}\begin{bmatrix}\hat{z}(s)\\\hat{w}(s)\end{bmatrix}=\begin{bmatrix}P_{12}^{*}(s)-P_{11}^{*}(s)(P_{21}^{*})^{-}(s)P_{22}^{*}(s) & g(s)P_{11}^{*}(s)(P_{21}^{*})^{-}(s)\\ -(P_{21}^{*})^{-}(s)P_{22}^{*}(s) & g(s)(P_{21}^{*})^{-}(s)\end{bmatrix}\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}+\begin{bmatrix}P_{11}^{*}(s)\bigl[I-(P_{21}^{*})^{-}(s)P_{21}^{*}(s)\bigr]\\ I-(P_{21}^{*})^{-}(s)P_{21}^{*}(s)\end{bmatrix}h(s), \qquad (12.28)$$

where h(s) is any rational vector. By noting that $P_{ij}^{*}(s)=g(s)P_{ij}(s)$, i = 1, 2; j = 1, 2, and that $P_{21}^{-}(s)=g(s)(P_{21}^{*})^{-}(s)$ is a {1}-inverse of $P_{21}(s)$, Eq. (12.28) can also be written as

$$\begin{bmatrix}\hat{z}(s)\\\hat{w}(s)\end{bmatrix}=\begin{bmatrix}P_{12}(s)-P_{11}(s)P_{21}^{-}(s)P_{22}(s) & P_{11}(s)P_{21}^{-}(s)\\ -P_{21}^{-}(s)P_{22}(s) & P_{21}^{-}(s)\end{bmatrix}\begin{bmatrix}\hat{u}(s)\\\hat{y}(s)\end{bmatrix}+\begin{bmatrix}P_{11}(s)\bigl[I-P_{21}^{-}(s)P_{21}(s)\bigr]\\ I-P_{21}^{-}(s)P_{21}(s)\end{bmatrix}h(s). \qquad (12.29)$$

It is thus seen that

$$\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}=\begin{bmatrix}\hat{z}(s)\\\hat{w}(s)\\\hat{u}(s)\\\hat{y}(s)\end{bmatrix}\in\mathcal{B}_f(I,\mathrm{GCHAIN}(P)).$$

So far it has been proved that

$$\mathcal{L}(\mathcal{B}_d)\subseteq\mathcal{B}_f(I,\mathrm{GCHAIN}(P)).$$

To prove the statement

$$\mathcal{B}_f(I,\mathrm{GCHAIN}(P))\subseteq\mathcal{L}(\mathcal{B}_d),$$

let

$$\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}\in\mathcal{B}_f(I,\mathrm{GCHAIN}(P)),$$

then there exists a rational vector $h_1(s)$ such that

$$y^{*}(s)=\mathrm{GCHAIN}^{*}(P;P_{21}^{-})u^{*}(s)+\Delta\mathrm{GCHAIN}(P;P_{21}^{-})h_1(s), \qquad (12.30)$$

for any {1}-inverse of P21. Furthermore, the input-output pair (u(s), y(s)) must be consistent to the plant P. Now let

$$\hat{x}(s)=(P_{21}^{*})^{-}(s)[-P_{22}^{*}(s),\ g(s)I]u^{*}(s)+\bigl[I-(P_{21}^{*})^{-}(s)P_{21}^{*}(s)\bigr]h_1(s),$$

Using the consistency of (u(s), y(s)), it is easy to verify that the variables $y^{*}$, $u^{*}$, and $\hat{x}$ satisfy Eqs. (12.26), (12.27); that is to say,

$$\begin{bmatrix}y^{*}(s)\\u^{*}(s)\end{bmatrix}\in\mathcal{L}(\mathcal{B}_d).$$

On noticing Eq. (12.20), one can write the ARMA representation (Eq. 12.23) into the following Rosenbrock PMD

$$\begin{aligned}P_{21}^{*}(\rho)x(t)&=[-P_{22}^{*}(\rho),\ I]\begin{bmatrix}u(t)\\y_c(t)\end{bmatrix}\\ \begin{bmatrix}z_c(t)\\w(t)\end{bmatrix}&=\begin{bmatrix}P_{11}^{*}(\rho)\\I\end{bmatrix}x(t)+\begin{bmatrix}P_{12}^{*}(\rho) & 0\\ 0 & 0\end{bmatrix}\begin{bmatrix}u(t)\\y_c(t)\end{bmatrix}.\end{aligned} \qquad (12.31)$$

It is easily seen that

$$P_{21}^{-}(s)=g(s)(P_{21}^{*})^{-}(s).$$

By substituting the above and

$$P_{ij}(s)=P_{ij}^{*}(s)/g(s),\quad i=1,2;\ j=1,2,$$

into the GCSR

$$\begin{bmatrix}z(s)\\w(s)\end{bmatrix}=\begin{bmatrix}P_{12}(s)-P_{11}(s)P_{21}^{-}(s)P_{22}(s) & P_{11}(s)P_{21}^{-}(s) & P_{11}(s)\bigl[I-P_{21}^{-}(s)P_{21}(s)\bigr]\\ -P_{21}^{-}(s)P_{22}(s) & P_{21}^{-}(s) & I-P_{21}^{-}(s)P_{21}(s)\end{bmatrix}\begin{bmatrix}u(s)\\y(s)\\h(s)\end{bmatrix},$$

using the notation $z_c(s)=g(s)z(s)$ and $y_c(s)=g(s)y(s)$, one finds that

$$\begin{bmatrix}z_c(s)\\w(s)\end{bmatrix}=\mathrm{GCHAIN}(P^{*};(P_{21}^{*})^{-})\begin{bmatrix}u(s)\\y_c(s)\\h(s)\end{bmatrix},$$

where we denote the matrix

$$\mathrm{GCHAIN}(P^{*};(P_{21}^{*})^{-})=\begin{bmatrix}P_{12}^{*}-P_{11}^{*}(P_{21}^{*})^{-}P_{22}^{*} & P_{11}^{*}(P_{21}^{*})^{-} & P_{11}^{*}\bigl[I-(P_{21}^{*})^{-}P_{21}^{*}\bigr]\\ -(P_{21}^{*})^{-}P_{22}^{*} & (P_{21}^{*})^{-} & I-(P_{21}^{*})^{-}P_{21}^{*}\end{bmatrix}.$$

A further result concerning the realization of behavior for any GCSR $\mathrm{GCHAIN}(P^{*};(P_{21}^{*})^{-})$ is the following theorem.

Theorem 12.3.4

The Rosenbrock PMD (Eq. 12.31) is a realization of behavior for any GCSR $\mathrm{GCHAIN}(P^{*};(P_{21}^{*})^{-})$.

Proof

This follows readily from Theorem 12.3.3 on noting that

$$\begin{bmatrix}z_c(s)\\w(s)\end{bmatrix}=\mathrm{GCHAIN}(P^{*};(P_{21}^{*})^{-})\begin{bmatrix}u(s)\\y_c(s)\\h(s)\end{bmatrix},$$

where h(s) is an arbitrary rational vector, and that for the matrix $P^{*}$ the least common (monic) multiple of the denominator polynomials of all its entries is 1.
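In the special case where $P_{21}^{*}$ is square and invertible (so that the {1}-inverse is the ordinary inverse and the ΔGCHAIN term vanishes), the PMD (12.31) can be checked against the GCSR matrix directly. The following sketch (assuming the SymPy library; the scalar plant data are illustrative, not taken from the text) performs this check.

```python
# A minimal sketch, assuming SymPy: for square invertible P21*, the Rosenbrock
# PMD (12.31) with T = P21*, U = [-P22*, I], V = [P11*; I], W = [[P12*, 0],[0, 0]]
# has transfer V*T^{-1}*U + W equal to the first two block columns of
# GCHAIN(P*; (P21*)^{-1}) shown above.
import sympy as sp

s = sp.symbols('s')
P11 = sp.Matrix([[s + 1]])
P12 = sp.Matrix([[1]])
P21 = sp.Matrix([[s]])        # square and invertible: {1}-inverse = inverse
P22 = sp.Matrix([[s - 1]])

T = P21
U = sp.Matrix.hstack(-P22, sp.eye(1))
V = sp.Matrix.vstack(P11, sp.eye(1))
W = sp.Matrix([[P12[0, 0], 0], [0, 0]])
G_pmd = sp.simplify(V * T.inv() * U + W)

P21i = P21.inv()
G_chain = sp.Matrix.vstack(
    sp.Matrix.hstack(P12 - P11 * P21i * P22, P11 * P21i),
    sp.Matrix.hstack(-P21i * P22, P21i))

assert sp.simplify(G_pmd - G_chain) == sp.zeros(2, 2)
```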

The realization of behavior for DGCSRs of the plant P can be proposed in a completely analogous manner. To this end, consider the following ARMA representation

$$\begin{bmatrix}g(\rho)I & -P_{11}^{*}(\rho)\\ 0 & 0\\ 0 & -P_{21}^{*}(\rho)\end{bmatrix}\begin{bmatrix}z(t)\\w(t)\end{bmatrix}+\begin{bmatrix}0 & 0\\ I & 0\\ 0 & g(\rho)I\end{bmatrix}\begin{bmatrix}u(t)\\y(t)\end{bmatrix}=\begin{bmatrix}P_{12}^{*}(\rho)\\I\\P_{22}^{*}(\rho)\end{bmatrix}x(t),\quad t\ge 0. \qquad (12.32)$$

The above ARMA representation is in fact

$$\begin{aligned}P_{12}^{*}(\rho)x(t)&=[g(\rho)I,\ -P_{11}^{*}(\rho)]\begin{bmatrix}z(t)\\w(t)\end{bmatrix}\\ \begin{bmatrix}I & 0\\ 0 & g(\rho)I\end{bmatrix}\begin{bmatrix}u(t)\\y(t)\end{bmatrix}&=\begin{bmatrix}I\\P_{22}^{*}(\rho)\end{bmatrix}x(t)+\begin{bmatrix}0 & 0\\ 0 & P_{21}^{*}(\rho)\end{bmatrix}\begin{bmatrix}z(t)\\w(t)\end{bmatrix}.\end{aligned} \qquad (12.33)$$

Theorem 12.3.5

The ARMA representation (Eq. 12.32) is a realization of behavior for any DGCSR $\mathrm{DGCHAIN}(P;P_{12}^{-})$.

Proof

The proof is similar to that of Theorem 12.3.3.

Also, if Eq. (12.32) is written as the following Rosenbrock PMD

$$\begin{aligned}P_{12}^{*}(\rho)x(t)&=[I,\ -P_{11}^{*}(\rho)]\begin{bmatrix}z_c(t)\\w(t)\end{bmatrix}\\ \begin{bmatrix}u(t)\\y_c(t)\end{bmatrix}&=\begin{bmatrix}I\\P_{22}^{*}(\rho)\end{bmatrix}x(t)+\begin{bmatrix}0 & 0\\ 0 & P_{21}^{*}(\rho)\end{bmatrix}\begin{bmatrix}z_c(t)\\w(t)\end{bmatrix},\end{aligned} \qquad (12.34)$$

one obtains a realization of behavior for any DGCSR $\mathrm{DGCHAIN}(P^{*};(P_{12}^{*})^{-})$, where any $\mathrm{DGCHAIN}(P^{*};(P_{12}^{*})^{-})$ is given by

$$\mathrm{DGCHAIN}(P^{*};(P_{12}^{*})^{-})=\begin{bmatrix}(P_{12}^{*})^{-} & -(P_{12}^{*})^{-}P_{11}^{*} & I-(P_{12}^{*})^{-}P_{12}^{*}\\ P_{22}^{*}(P_{12}^{*})^{-} & P_{21}^{*}-P_{22}^{*}(P_{12}^{*})^{-}P_{11}^{*} & P_{22}^{*}\bigl[I-(P_{12}^{*})^{-}P_{12}^{*}\bigr]\end{bmatrix}$$

and satisfies

$$\begin{bmatrix}u(s)\\y_c(s)\end{bmatrix}=\mathrm{DGCHAIN}(P^{*};(P_{12}^{*})^{-})\begin{bmatrix}z_c(s)\\w(s)\\q(s)\end{bmatrix},$$

where q(s) is an arbitrary rational vector and $(P_{12}^{*})^{-}(s)$ is any {1}-inverse of $P_{12}^{*}(s)$. This result is stated in the following theorem.

Theorem 12.3.6

The Rosenbrock PMD (Eq. 12.34) is a realization of behavior for any DGCSR $\mathrm{DGCHAIN}(P^{*};(P_{12}^{*})^{-})$.

Proof

The proof follows from a direct application of Theorem 12.3.5 on noticing the special formulation of the DGCSR $\mathrm{DGCHAIN}(P^{*};(P_{12}^{*})^{-})$.

Remark 12.3.2

The above theorems are interesting, not least for the way in which they clarify the input-output structure of GCSRs and that of DGCSRs. More important than this, however, is the observation that the frequency behavior of any GCSR, or any DGCSR, can be completely recovered in a precise way, by introducing latent variables, from the dynamical behavior of the corresponding ARMA representations via the approach of realization of behavior.

12.4 Conclusions

This chapter has presented a new notion of realization of behavior. It has been shown that realization of behavior generalizes the classical concept of realization of transfer function matrix, since the input consistency assumption essentially relaxes the full rank condition that is placed on the relevant matrix to ensure the existence of the transfer function. The basic idea in this approach is to find an ARMA representation for a given frequency behavior description such that the known frequency behavior is completely recovered from the corresponding dynamical behavior. From this point of view, realization of behavior is seen to be a converse procedure to the latent variable elimination process that was studied by Willems [27]. Such a realization approach is believed to be highly significant in modeling dynamical systems in some real cases where the system behavior is conveniently described in the frequency domain. Since no numerical computation is needed, the realization of behavior is believed to be particularly suitable for situations in which the coefficients are symbolic rather than numerical.

From an input/output viewpoint, a CSR is in fact an alternative form of system description. It is well-known and widely used in classical circuit theory. This approach, when combined with the theory of J-spectral factorization, provides the most compact theory for $H_\infty$ control. The CSR is undoubtedly a promising research area due to the fact that it finds many applications in circuit theory, $H_\infty$ control, behavior control, and infinite-dimensional system theory. This chapter has investigated the input/output structure of the GCSR and has made contributions regarding this fundamental issue. Based on the approach of realization of behavior, the behavior structures of the GCSRs and the DGCSRs have been clarified. It has been shown that any GCSR or any DGCSR develops the same (frequency) behavior. Subsequently the corresponding ARMA representations are proposed and are proved to be realizations of behavior for any GCSR and for any DGCSR. More specifically, two Rosenbrock PMDs are found to be the realizations of behavior for any GCSR $\mathrm{GCHAIN}(P^{*};(P_{21}^{*})^{-})$ and any DGCSR $\mathrm{DGCHAIN}(P^{*};(P_{12}^{*})^{-})$. Once these ARMA representations are proposed, one can further find the corresponding first-order system representations by using the method of [30] or other well-developed realization approaches such as that of Kailath [32]. Also, a constructive procedure for AR equation realization suggested by Vafiadis and Karcanias [114] can be used to obtain the required first-order system representations. The results are thus interesting in that they provide a natural linkage between the new chain-scattering approach and the well-developed Rosenbrock PMD theory as well as the developing behavioral theory.

References

[21] Belevitch V. Classical Network Theory. San Francisco: Holden-Day; 1968.

[22] Kimura H. Chain-scattering representation, J-lossless factorization and $H_\infty$ control. J. Math. Syst. Estimation Control. 1995;5:203–255.

[26] Willems J.C. From time series to linear system: Part 1. Finite dimensional linear time invariant systems; Part 2. Exact modelling; Part 3. Approximate modelling. Automatica. 1986;22:561–580.

[27] Willems J.C. Paradigms and puzzles in the theory of dynamical systems. IEEE Trans. Autom. Control. 1991;36(3):259–294.

[28] Antoulas A.C., Willems J.C. A behavioural approach to linear exact modeling. IEEE Trans. Autom. Control. 1993;38(12):1776–1800.

[29] Kuijper M., Schumacher J.M. Input-output structure of linear differential/algebraic systems. IEEE Trans. Autom. Control. 1993;38(3):404–414.

[30] Rosenthal J., Schumacher J.M. Realization by inspection. IEEE Trans. Autom. Control. 1997;42(9):1257–1263.

[31] Pugh A.C., Tan L. A generalized chain-scattering representation and its algebraic system properties. UK: Department of Mathematical Sciences, Loughborough University; 1998 A316.

[32] Kailath T. Linear Systems. Englewood Cliffs, NJ: Prentice-Hall; 1980.

[114] Vafiadis D., Karcanias N. First-order realizations of autoregressive equations. Int. J. Control. 1999;72(12):1043–1053.

[115] Luenberger D.G. Time invariant descriptor systems. Automatica. 1978;14(5):473–480.

