Eigendecomposition

23.08.2021

Eigendecompositions are used in various algorithms throughout FlexPro. They are the principal methodology in the:

Eigen (MUSIC, EV) Spectral Estimator

Eigendecompositions are also an intrinsic part of all FlexPro procedures that include SVD (singular value decomposition) methods:

AR (AutoRegressive) Spectral Estimator

ARMA (AutoRegressive Moving Average) Spectral Estimator

Processing the eigenmodes (eigenvectors and eigenvalues/singular values) of a data series is an important tool in signal analysis. Unlike Fourier decompositions, which partition signals based on harmonic frequency using parametric sines and cosines, an eigendecomposition partitions by signal strength using adaptive non-parametric basis functions. Signal components can thus be separated by differences in power.
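
As a minimal illustration of this power-based partitioning, the following Python sketch (not FlexPro code, with hypothetical signal values) shows the singular values of a lagged data matrix ordering two sinusoids by power rather than by frequency:

```python
import numpy as np

# Two sinusoids of unequal power: the singular values order the
# components by power, not by frequency (all values hypothetical).
t = np.arange(400)
x = 3.0 * np.sin(2 * np.pi * 0.04 * t) + 0.5 * np.sin(2 * np.pi * 0.21 * t)

X = np.lib.stride_tricks.sliding_window_view(x, 50)  # lagged data matrix
s = np.linalg.svd(X, compute_uv=False)
print(s[:6])  # a large pair (strong tone), then a smaller pair (weak tone)
```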

Nomenclature

The identification, isolation, and reconstruction of signal components via eigendecomposition is known by a variety of names. "Singular Spectrum Analysis", "Principal Component Analysis", and "Eigenfiltering" are common. FlexPro uses the "Eigendecomposition" designation exclusively, since it describes the numerical method more precisely.

Eigenvalues and Singular Values

Although the terms eigenvector and singular vector are interchangeable, eigenvalues and singular values are not: each eigenvalue is the square of the corresponding singular value divided by the order of the decomposition.
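
This relationship can be verified numerically. The sketch below uses hypothetical data and normalizes the covariance matrix by the order, following the convention stated above; other normalizations simply rescale the eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 0.05 * np.arange(200)) + 0.1 * rng.standard_normal(200)

order = 40                                              # embedding dimension
X = np.lib.stride_tricks.sliding_window_view(x, order)  # lagged data matrix

s = np.linalg.svd(X, compute_uv=False)                  # singular values of X

C = X.T @ X / order                                     # lagged covariance matrix
lam = np.sort(np.linalg.eigvalsh(C))[::-1]              # its eigenvalues, descending

print(np.allclose(lam, s**2 / order))                   # True
```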

Lagged Covariance or Data Matrix

An eigendecomposition can be achieved in any number of ways. The first step is always the creation of a matrix that uses lagged copies of subsets of the data series. This can be a straightforward data or trajectory matrix, such as the forward prediction (Fwd), backward prediction (Bwd), or forward-backward (FB) prediction matrices used in autoregressive modeling. These data matrices are usually rectangular, and SVD is used to extract the eigenvectors and singular values. Unless the eigendecomposition involves the least-squares computation of parametric model coefficients, there is no difference between using the Fwd and the Bwd data matrix. An FB prediction matrix contains twice the number of rows of a Fwd or Bwd matrix and requires correspondingly more processing time.
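
A minimal Python sketch of how such prediction matrices can be built from lagged copies of a real-valued series; the function name and row layout are illustrative assumptions, not FlexPro's internal implementation:

```python
import numpy as np

def prediction_matrix(x, order, kind="fwd"):
    """Lagged data matrix for a real-valued series x.

    kind: "fwd" (forward), "bwd" (backward), or "fb" (forward-backward).
    Layout conventions vary in the literature; this is one common choice.
    """
    x = np.asarray(x, dtype=float)
    rows = len(x) - order
    # Each forward row holds 'order' consecutive samples.
    fwd = np.array([x[i:i + order] for i in range(rows)])
    # Backward rows use the same samples, shifted by one and time-reversed.
    bwd = np.array([x[i + 1:i + order + 1][::-1] for i in range(rows)])
    if kind == "fwd":
        return fwd
    if kind == "bwd":
        return bwd
    return np.vstack([fwd, bwd])  # FB: twice the rows of Fwd or Bwd alone
```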

Another option is to use one of several methods to construct a covariance matrix from lagged copies of the data. This is a square matrix whose eigenvectors and singular values can be computed using either SVD or the EISPACK eigendecomposition procedures. Although the computation time is somewhat greater, FlexPro exclusively uses SVD for all eigendecompositions: the EISPACK routines can fail to find all eigenmodes, whereas the SVD procedure should never fail.

A covariance-based procedure that enforces Toeplitz symmetry (all of the elements along each diagonal are equal) is reported to be of value for short data records. Typically, however, non-Toeplitz matrices will better map the variance in a data series.
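
One way to build such a Toeplitz covariance matrix is from the series' autocovariance sequence, as in this sketch; the biased lag estimator used here is an assumption, and other estimators are possible:

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_covariance(x, order):
    """Toeplitz lagged covariance: element (i, j) depends only on |i - j|."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Biased autocovariance estimates at lags 0 .. order-1.
    r = np.array([x[:n - k] @ x[k:] / n for k in range(order)])
    return toeplitz(r)
```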

Yet another option is to construct a square matrix based on the normal equations. This is similar to the covariance matrix approach in that a square matrix can be evaluated far more quickly than a full data matrix when the data set is large. On the other hand, the construction of the normal equations matrix can introduce precision losses that adversely impact the computation of model coefficients.
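
The precision loss arises because forming the normal-equations matrix squares the condition number of the underlying data matrix, as this small demonstration with hypothetical test data illustrates:

```python
import numpy as np

rng = np.random.default_rng(1)
# A data matrix with a wide spread of column scales (hypothetical).
X = rng.standard_normal((500, 30)) @ np.diag(np.logspace(0, -6, 30))

print(np.linalg.cond(X))        # on the order of 1e6
print(np.linalg.cond(X.T @ X))  # on the order of 1e12: the square
```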

Eigendecomposition Order

The order of the eigendecomposition is the number of data elements that are extracted from the data set for each segment or subset of the overall series, not the number of segments. For rectangular data matrices, the order or "embedding dimension" specifies the number of columns in the matrix, and the number of segments specifies the number of rows. For a covariance matrix, both the column and row counts are equal to the order.
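
As a concrete illustration of these dimensions, assuming maximally overlapping segments and hypothetical sizes:

```python
# Series length n, order (embedding dimension) p: hypothetical values.
n, p = 1000, 40
print("data matrix:      ", (n - p + 1, p))  # (961, 40): segments x order
print("covariance matrix:", (p, p))          # (40, 40): order x order
```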

It is essential that the order of the eigendecomposition be high enough to offer good signal-noise separation. The order must be sufficient to fully partition the noise components and prevent them from corrupting the signal-bearing eigenmodes. In general, the higher the order (data length permitting), the more complete the partitioning, since more eigenmodes are available to capture the random noise.

Signal-Noise Separation in Spectra

The most common use of eigendecomposition is to separate signal and noise. In the spectral procedures that employ SVD, the signal components are thresholded so that the noise elements do not factor into the matrix computations of AR and ARMA coefficients. The EigenAnalysis Spectral Estimator procedure uses the eigendecomposition directly: the noise-only eigenvectors are used to generate frequency estimates. These options do not reconstruct a time-domain data stream.
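
A minimal sketch of such SVD thresholding; the cutoff n_signal (the assumed signal-subspace dimension) is a hypothetical parameter chosen by the user or by an eigenvalue criterion:

```python
import numpy as np

def truncate_svd(X, n_signal):
    """Keep only the n_signal strongest singular components of X.

    Zeroing the noise singular values keeps them out of the subsequent
    AR/ARMA coefficient computations.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = s.copy()
    s[n_signal:] = 0.0
    return (U * s) @ Vt  # rank-reduced (signal-only) data matrix
```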

Signal and Noise Components

The first eigenmode will capture the predominant data trend in the signal, the second the next most predominant, and so forth. It does not matter whether the component captured is sinusoidal, a square wave, a sawtooth, or an anharmonic pattern. Further, the signal may be a slowly varying low-frequency anharmonic oscillation or a high-frequency sinusoid. The eigenmodes are said to be adaptive because they capture, in an eigenvalue-ordered sequence, the variance of the data in a non-parametric manner.

Two eigenmodes are needed to capture an oscillatory trend. A pair of eigenmodes with nearly identical eigenvalues usually signifies a specific harmonic or anharmonic oscillation in the signal. One strength of eigendecomposition is the isolation of secondary signal components containing very little power. Such secondary components may not be visible in Fourier spectra because of spectral leakage and resolution considerations. They may likewise be invisible in AR and ARMA models, since an autoregressive model will emphasize the primary components. The same is true for parametric sinusoid models, since the total variance associated with the components of greater power can often overwhelm a least-squares fit that seeks to also include secondary components of far lesser power. Eigendecomposition offers the means to isolate the high and low power components for separate spectral analysis and fitting.
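
A minimal Python sketch of isolating one such component: select a pair of eigenmodes and map them back to a time series by diagonal averaging, in the style of singular spectrum analysis. The mode indices and embedding dimension are hypothetical, and FlexPro's own procedures may differ in detail:

```python
import numpy as np

def reconstruct_component(x, order, modes):
    """Reconstruct the part of series x carried by the selected eigenmodes."""
    x = np.asarray(x, dtype=float)
    X = np.lib.stride_tricks.sliding_window_view(x, order)  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Keep only the chosen singular triples.
    Xc = sum(s[k] * np.outer(U[:, k], Vt[k]) for k in modes)

    # Diagonal averaging (Hankelization) back onto the original length.
    out = np.zeros(len(x))
    cnt = np.zeros(len(x))
    for i in range(Xc.shape[0]):
        out[i:i + order] += Xc[i]
        cnt[i:i + order] += 1
    return out / cnt

# A weak oscillation typically occupies a pair of eigenmodes, e.g.
# modes (2, 3) after a dominant pair (0, 1):
# weak = reconstruct_component(x, 40, (2, 3))
```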

References

An excellent reference for eigendecomposition can be found in:

J. B. Elsner and A. A. Tsonis, "Singular Spectrum Analysis: A New Tool in Time Series Analysis", Plenum Press, New York, 1996.

See Also

Spectral Analysis Option

Spectral Estimators Analysis Object - EigenAnalysis Spectral Estimator

EigenAnalysis Algorithms

Spectral Estimators Tutorial
