Researchers: Anders Lindquist and Per Enqvist, in cooperation with C. I. Byrnes (Washington University, St Louis).
Sponsors: The Swedish Research Council for Engineering Sciences (TFR) and the Göran Gustafsson Foundation.
One of the most widely used methods of spectral estimation in signal and speech processing is linear predictive coding (LPC). LPC has several attractive features, which account for its popularity, including the properties that the resulting modeling filter (i) matches a finite window of n+1 covariance lags, (ii) is rational of degree at most n, and (iii) has stable zeros and poles. The principal limitation of this methodology is that the modeling filter is ``all-pole'', i.e., an autoregressive (AR) model.
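To fix ideas, the LPC construction can be sketched as follows (in one common normalization; the symbols $\rho$, $a_1, \ldots, a_n$ and $e_t$ are introduced here only for illustration and need not match the notation of the cited papers). Given the covariance lags $c_0, c_1, \ldots, c_n$ of a stationary process $y_t$, LPC fits the all-pole modeling filter
\[
 w(z) = \frac{\rho}{1 + a_1 z^{-1} + \cdots + a_n z^{-n}},
\]
corresponding to the AR model $y_t + a_1 y_{t-1} + \cdots + a_n y_{t-n} = \rho e_t$ driven by unit-variance white noise $e_t$. The coefficients are obtained from the normal (Yule-Walker) equations
\[
 c_k + a_1 c_{k-1} + \cdots + a_n c_{k-n} = 0, \quad k = 1, \ldots, n, \qquad c_0 + a_1 c_1 + \cdots + a_n c_n = \rho^2,
\]
with $c_{-k} = c_k$. The resulting filter satisfies (i)-(iii), but all of its zeros sit at the origin, which is precisely the ``all-pole'' restriction noted above.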
In [A3], we present a systematic description of all autoregressive moving-average (ARMA) models of processes which have properties (i)-(iii), in the context of cepstral analysis and homomorphic filtering. Indeed, we show that each such ARMA model determines, and is completely determined by, its finite windows of cepstral coefficients and covariance lags. This characterization has an intuitively appealing interpretation in terms of measures of the transient and the steady-state behaviors of the signal, respectively. More precisely, we show that these nth order windows form local coordinates for all ARMA models of degree n and that the pole-zero model can be determined from the windows as the unique minimum of a convex objective function. We refine this optimization method by first noting that the maximum entropy design of an LPC filter is obtained by maximizing the zeroth cepstral coefficient subject to the constraint (i). More generally, we modify this scheme into a well-posed optimization problem in which the covariance data enter as constraints and the linear weights of the cepstral coefficients are ``positive'' - in the sense that a certain pseudo-polynomial is positive - thereby generalizing the maximum entropy method. This problem is a homomorphic-filtering generalization of the maximum entropy method, leading to the design of all stable, minimum-phase modeling filters of degree n which interpolate the given covariance window. It is the dual, in the sense of mathematical programming, of an optimization problem which we previously obtained for the rational covariance extension problem and which we revisit in [C7].
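In spectral terms (modulo normalization conventions, which may differ from those of [A3]), let $\Phi(e^{i\theta}) = |w(e^{i\theta})|^2$ denote the spectral density of the modeling filter $w$. The covariance lags and cepstral coefficients are then the Fourier coefficients
\[
 c_k = \int_{-\pi}^{\pi} e^{ik\theta}\, \Phi(e^{i\theta})\, \frac{d\theta}{2\pi}, \qquad
 \gamma_k = \int_{-\pi}^{\pi} e^{ik\theta}\, \log \Phi(e^{i\theta})\, \frac{d\theta}{2\pi},
\]
and the generalized problem sketched above takes the form
\[
 \max \; \sum_{k=0}^{n} p_k \gamma_k \quad \text{subject to} \quad
 \int_{-\pi}^{\pi} e^{ik\theta}\, \Phi(e^{i\theta})\, \frac{d\theta}{2\pi} = c_k, \quad k = 0, 1, \ldots, n,
\]
where the weights $p_0, p_1, \ldots, p_n$ are ``positive'' in the sense that the pseudo-polynomial $P(e^{i\theta}) = p_0 + 2\sum_{k=1}^{n} p_k \cos k\theta$ is positive on the unit circle. The choice $P \equiv 1$, i.e., maximizing only the zeroth cepstral coefficient $\gamma_0$, recovers the maximum entropy (LPC) solution.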
However, this optimization problem can become badly conditioned for some parameter values. Therefore, in [C19], a modification of the optimization problem that avoids ill-conditioning is proposed. This procedure avoids spectral factorization, which is computationally expensive, as well as numerical problems that may occur close to the boundary. However, the new optimization problem is in general not globally convex, but only locally convex, so the optimization procedure has to be initialized close to the optimum to ensure convergence. To this end, a homotopy continuation method is proposed. Since the geometry of the solutions to the optimization problem for varying parameter values is well known from our previous work, it follows that there is a smooth trajectory from the LPC solution to any particular solution matching the same first n+1 covariance lags. Using a predictor-corrector path-following algorithm, the solution to the optimization problem can thus be found by tracing this trajectory.
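The path-following step can be illustrated schematically. The sketch below is a generic predictor-corrector continuation for a parametrized equation F(x, t) = 0 (for instance, the stationarity condition of a deformed objective); the toy problem, the variable names and the fixed step size are purely illustrative assumptions and are not the formulation used in [C19].

    # Generic predictor-corrector continuation (illustration only; the actual
    # problem, parametrization and step-size control in [C19] differ).
    import numpy as np

    def newton_correct(F, J, x, t, tol=1e-10, max_iter=20):
        # Correct a predicted point back onto the solution path F(x, t) = 0.
        for _ in range(max_iter):
            r = F(x, t)
            if np.linalg.norm(r) < tol:
                break
            x = x - np.linalg.solve(J(x, t), r)
        return x

    def follow_path(F, J, Ft, x0, steps=20):
        # Trace the solution x(t) of F(x, t) = 0 from t = 0 to t = 1.
        x, t = x0.copy(), 0.0
        dt = 1.0 / steps
        for _ in range(steps):
            # Predictor: Euler step along the tangent dx/dt = -J^{-1} Ft.
            dx = -np.linalg.solve(J(x, t), Ft(x, t))
            x, t = x + dt * dx, t + dt
            # Corrector: Newton iterations at the new parameter value.
            x = newton_correct(F, J, x, t)
        return x

    # Toy example: deform the stationarity condition of a well-conditioned
    # quadratic objective (t = 0) into that of a nearly singular one (t = 1).
    A0 = np.array([[2.0, 0.0], [0.0, 2.0]])
    A1 = np.array([[2.0, 1.9], [1.9, 2.0]])
    b = np.array([1.0, -1.0])

    F  = lambda x, t: ((1 - t) * A0 + t * A1) @ x - b   # gradient of the deformed objective
    J  = lambda x, t: (1 - t) * A0 + t * A1             # its Hessian
    Ft = lambda x, t: (A1 - A0) @ x                     # derivative with respect to t

    x_start = np.linalg.solve(A0, b)        # exact solution at t = 0
    x_end = follow_path(F, J, Ft, x_start)
    print(x_end, np.linalg.solve(A1, b))    # the two should agree closely

In the setting described above, the role of the starting point t = 0 is played by the LPC (maximum entropy) solution, which is cheap to compute, and the end point t = 1 by the desired solution with the same covariance window; local convexity along the trajectory is what keeps the Newton corrector within its region of convergence.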
In [R1], we study the well-posedness of the problems of determining shaping filters from combinations of finite windows of cepstral coefficients, covariance lags, or Markov parameters. For example, we determine whether there exists a shaping filter with a prescribed window of Markov parameters and a prescribed window of covariance lags. We show that several such problems are well-posed in the sense of Hadamard; that is, one can prove existence, uniqueness (identifiability) and continuous dependence of the model on the measurements. Our starting point is the global analysis of linear systems, where one studies an entire class of systems or models as a whole, and where one views measurements from data, such as covariance lags and cepstral coefficients or Markov parameters, as functions on the entire class. This enables one to pose such problems in a way that tools from calculus, optimization, geometry and modern nonlinear analysis can be used to give a rigorous answer in an algorithm-independent fashion. In this language, we prove that a window of cepstral coefficients together with a window of covariance coefficients yields a bona fide coordinate system on the space of shaping filters, thereby establishing existence, uniqueness and smooth dependence of the model parameters on the measurements from data.
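Concretely (and modulo normalization), in addition to the covariance and cepstral windows defined earlier, the Markov parameters are the coefficients $w_k$ in the expansion of the shaping filter at infinity,
\[
 w(z) = w_0 + w_1 z^{-1} + w_2 z^{-2} + \cdots,
\]
so each kind of window defines a smooth map on the class of shaping filters of degree n. Well-posedness in the sense of Hadamard then amounts to such a combined map, e.g. $w \mapsto (c_0, \ldots, c_n, \gamma_1, \ldots, \gamma_n)$, being a diffeomorphism onto its image, which is exactly the coordinate-system statement above.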