Doctoral Thesis Defense, Optimization and Systems Theory
Wednesday, June 6, 2001, 10.00, Kollegiesalen, Administration building, Valhallavägen 79, KTH


Anders Dahlén

Identification of stochastic systems: Subspace methods and covariance extension

Academic dissertation which, with the permission of Kungliga Tekniska Högskolan (KTH), will be presented for public examination for the degree of Doctor of Philosophy on Wednesday, June 6, 2001, at 10.00 in Kollegiesalen, Administration building, Kungliga Tekniska Högskolan, Valhallavägen 79.

This thesis consists of four papers on the identification of linear stochastic systems.

In the first paper it is briefly explained why certain subspace methods for the identification of time series may fail for theoretical reasons. Reproducible experiments are described that make it possible to test algorithms for such failures. Massive failures of some popular subspace methods are verified through simulations.
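
A hypothetical illustration of how such a failure test could be set up (an assumption of mine, not the experiments of the paper): given an estimated model in covariance form (A, C, Cbar, Lambda_0), one can check two necessary conditions for it to be a valid stochastic realization, namely that A is stable and that the block Toeplitz matrix of the model covariances is positive semidefinite. A minimal numpy sketch, with all names and the lag horizon chosen arbitrarily:

    import numpy as np

    def model_covariances(A, C, Cbar, lam0, max_lag):
        # Lambda_0 = lam0 (d x d) and Lambda_k = C A^(k-1) Cbar^T for k >= 1.
        lams = [np.atleast_2d(lam0)]
        Ak = np.eye(A.shape[0])
        for _ in range(max_lag):
            lams.append(C @ Ak @ Cbar.T)
            Ak = A @ Ak
        return lams

    def realization_fails(A, C, Cbar, lam0, max_lag=50, tol=1e-8):
        # True if the identified model cannot correspond to a stationary process.
        if np.max(np.abs(np.linalg.eigvals(A))) >= 1.0:
            return True                                   # A is not stable
        lams = model_covariances(A, C, Cbar, lam0, max_lag)
        d = lams[0].shape[0]
        T = np.empty((d * (max_lag + 1), d * (max_lag + 1)))
        for i in range(max_lag + 1):
            for j in range(max_lag + 1):
                L = lams[abs(i - j)]
                T[i*d:(i+1)*d, j*d:(j+1)*d] = L if i >= j else L.T
        # A genuine covariance sequence makes the block Toeplitz matrix PSD.
        return np.min(np.linalg.eigvalsh((T + T.T) / 2)) < -tol

    # Example: a stable first-order model that passes both checks (prints False).
    print(realization_fails(np.array([[0.5]]), np.array([[1.0]]),
                            np.array([[0.4]]), np.array([[1.0]])))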

In the second paper an alternative identification procedure for scalar time series, which overcomes these difficulties, is presented. It is based on identification of a high-order maximum entropy (AR) model followed by stochastically balanced truncation. The procedure is described using only linear-algebraic operations and therefore inherits the attractive properties of subspace methods. A complete analysis of the statistical convergence properties of the method is presented. In particular, it is shown that the transfer function of the estimated system tends to the true transfer function in a "worst case" measure. Simulations show that the variances of the estimates converge to the Cramér-Rao bounds as the data length increases.
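
A rough sketch of such a pipeline, in the spirit of the procedure described above but not taken from the thesis, might look as follows in numpy; the function names, the orders n, m, r, and the Yule-Walker details are assumptions on my part:

    import numpy as np

    def sample_covariances(y, max_lag):
        # Biased sample covariances c_0, ..., c_max_lag of a scalar series.
        y = np.asarray(y, float) - np.mean(y)
        N = len(y)
        return np.array([y[: N - k] @ y[k:] / N for k in range(max_lag + 1)])

    def max_entropy_covariances(y, n, total_lags):
        # Fit an AR(n) model via Yule-Walker and extend the covariances beyond lag n.
        c = list(sample_covariances(y, n))
        Tn = np.array([[c[abs(i - j)] for j in range(n)] for i in range(n)])
        a = np.linalg.solve(Tn, -np.array(c[1 : n + 1]))   # AR coefficients a_1, ..., a_n
        for k in range(n + 1, total_lags + 1):             # c_k = -sum_i a_i c_{k-i}
            c.append(-np.dot(a, [c[k - 1 - i] for i in range(n)]))
        return np.array(c)

    def balanced_truncation(c, m, r):
        # Weighted SVD of the Hankel matrix of covariances, truncated to order r.
        H = np.array([[c[i + j + 1] for j in range(m)] for i in range(m)])
        L = np.linalg.cholesky(np.array([[c[abs(i - j)] for j in range(m)] for i in range(m)]))
        U, s, Vt = np.linalg.svd(np.linalg.solve(L, np.linalg.solve(L, H.T).T))
        sqrt_S = np.diag(np.sqrt(s[:r]))
        Obs = L @ U[:, :r] @ sqrt_S                        # observability-type factor
        Con = sqrt_S @ Vt[:r, :] @ L.T                     # constructibility-type factor
        A = np.linalg.pinv(Obs[:-1]) @ Obs[1:]             # shift invariance gives A
        return A, Obs[:1], Con[:, :1].T, c[0]              # A, C, Cbar, Lambda_0

    # Example run on simulated AR(1) data; the orders are chosen arbitrarily.
    rng = np.random.default_rng(0)
    y = np.zeros(5000)
    for t in range(1, len(y)):
        y[t] = 0.8 * y[t - 1] + rng.standard_normal()
    cov = max_entropy_covariances(y, n=30, total_lags=2 * 20)
    A, C, Cbar, lam0 = balanced_truncation(cov, m=20, r=1)

An innovation (Kalman filter) representation of the truncated model would then follow from a Riccati equation; that step is omitted from the sketch.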

In the third paper the identification procedure of the second paper is generalized to multivariate time series. As in the scalar case, the procedure is described using only linear-algebraic operations. The essential differences between the CCA subspace method and the proposed method are described: CCA estimates all covariances in the block Hankel matrix directly from data, whereas the proposed procedure uses covariance extension when constructing the Hankel matrix. A proof of consistency and asymptotic normality of the identification procedure is given.
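
The difference in how the block Hankel matrix is filled can be made concrete with a small hypothetical sketch (again not taken from the thesis; the names are mine, and a least-squares VAR fit stands in for the maximum entropy estimate):

    import numpy as np

    def sample_block_covariances(Y, max_lag):
        # Lambda_k = E[y_{t+k} y_t^T] estimated directly from the data (CCA-style).
        Y = Y - Y.mean(axis=0)
        T = len(Y)
        return [Y[k:].T @ Y[: T - k] / T for k in range(max_lag + 1)]

    def var_extended_covariances(Y, n, max_lag):
        # Fit a VAR(n) by least squares and extend Lambda_k beyond lag n (MEST-style).
        Y = Y - Y.mean(axis=0)
        T, d = Y.shape
        X = np.hstack([Y[n - i - 1 : T - i - 1] for i in range(n)])  # y_{t-1}, ..., y_{t-n}
        B = np.linalg.lstsq(X, Y[n:], rcond=None)[0]
        Ai = [B[i * d:(i + 1) * d].T for i in range(n)]              # A_1, ..., A_n
        Lam = sample_block_covariances(Y, n)
        for k in range(n + 1, max_lag + 1):                          # Lambda_k = sum_i A_i Lambda_{k-i}
            Lam.append(sum(Ai[i] @ Lam[k - 1 - i] for i in range(n)))
        return Lam

    def block_hankel(Lam, m):
        # Block (i, j) of the Hankel matrix is Lambda_{i+j+1}.
        return np.block([[Lam[i + j + 1] for j in range(m)] for i in range(m)])

    # Both procedures then apply the same kind of weighted truncation to H;
    # only the covariances used to build H differ.
    Y = np.random.default_rng(0).standard_normal((2000, 2))
    H_cca = block_hankel(sample_block_covariances(Y, 2 * 5 - 1), 5)
    H_mest = block_hankel(var_extended_covariances(Y, 4, 2 * 5 - 1), 5)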

The fourth paper studies the relation between CCA and the proposed method (MEST) in more detail. For the sake of comparison, the two identification procedures are formulated in a uniform framework using the same truncation scheme, and from these expressions the essential difference becomes apparent. It is shown that MEST and CCA are asymptotically equivalent, which implies that they have the same asymptotic normal distribution. However, simulations indicate that AR modeling followed by stochastically balanced truncation performs better than CCA in practice.

