In this paper, an output error (OE) model with a random time delay is identified using the expectation-maximization (EM) algorithm. Since the regression model of the OE system contains colored noise, a finite impulse response (FIR) approximation is used to transform the OE model into an FIR model whose regression model contains white noise. An EM algorithm is proposed to iteratively estimate the time delays and the model parameters. The parameters of the OE model are then recovered from the FIR parameter estimates through a matrix transformation. Convergence analysis and simulation results are given to illustrate the effectiveness of the proposed algorithm.
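The EM scheme described above — alternating between a posterior over the unknown time delay (E-step) and weighted least-squares updates of the FIR parameters, noise variance, and delay distribution (M-step) — can be sketched as follows. This is a minimal illustrative sketch under our own assumptions (a per-sample delay drawn from a discrete distribution, Gaussian white noise, and arbitrarily chosen coefficients and sample sizes), not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate an FIR model whose per-sample time delay is random ---
# (illustrative stand-in for the paper's setting; all values are assumptions)
order = 3                                     # FIR order
delays = np.array([0, 1, 2])                  # candidate delay values
b_true = np.array([0.8, -0.5, 0.3])           # FIR coefficients
pi_true = np.array([0.5, 0.3, 0.2])           # delay probabilities
sigma_true = 0.1                              # white-noise standard deviation
n, off = 400, delays.max() + order            # samples, index offset

u = rng.standard_normal(n + off)              # white input signal
d = rng.choice(delays, size=n, p=pi_true)     # latent per-sample delays
# Regressor phi(t, dd) = [u[t-dd], u[t-dd-1], ..., u[t-dd-order+1]]
Phi = np.stack([[u[t + off - dd - np.arange(order)] for dd in delays]
                for t in range(n)])           # shape (n, n_delays, order)
y = Phi[np.arange(n), d] @ b_true + sigma_true * rng.standard_normal(n)

# --- EM: E-step = delay posteriors, M-step = weighted least squares ---
b = np.zeros(order)
pi = np.full(len(delays), 1.0 / len(delays))
s2 = 1.0
for _ in range(100):
    # E-step: posterior weight of each delay for each sample
    resid = y[:, None] - Phi @ b              # (n, n_delays)
    logw = np.log(pi) - 0.5 * resid**2 / s2
    logw -= logw.max(axis=1, keepdims=True)   # stabilize before exp
    w = np.exp(logw)
    w /= w.sum(axis=1, keepdims=True)
    # M-step: parameters via posterior-weighted normal equations
    A = np.einsum('td,tdi,tdj->ij', w, Phi, Phi)
    c = np.einsum('td,tdi,t->i', w, Phi, y)
    b = np.linalg.solve(A, c)
    resid = y[:, None] - Phi @ b
    s2 = float((w * resid**2).sum() / n)      # noise variance update
    pi = w.mean(axis=0)                       # delay-distribution update

print(np.round(b, 3), np.round(pi, 3), round(s2, 4))
```

Because each mixture component is tied to a fixed delay-specific regressor, there is no label-switching ambiguity, and with a persistently exciting white input the iterates typically recover the coefficients and the delay distribution closely.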