High-dimensional matrix-variate time series are increasingly common in economics and finance, stimulating the development of matrix factor models that achieve substantial dimension reduction. This paper proposes an approximate dynamic matrix factor model that accounts for the time series nature of the data and develops an EM algorithm for quasi-maximum likelihood estimation of the model parameters. The algorithm is further extended to estimate the dynamic matrix factor model from datasets with an arbitrary pattern of missing data. The finite-sample properties of the proposed estimation strategies are assessed through a large simulation study and an application to a financial dataset.
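As context, matrix factor models in this literature typically decompose each observed matrix as X_t = R F_t C' + E_t, with row and column loading matrices R and C and a small latent factor matrix F_t; a dynamic version lets F_t evolve over time. The paper's exact specification is not reproduced here, so the sketch below is only a minimal simulation of this standard form, with illustrative dimensions and a simple autoregressive factor dynamic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) dimensions: T observed p1 x p2 matrices,
# reduced to k1 x k2 latent factor matrices.
p1, p2, k1, k2, T = 20, 15, 3, 2, 100

R = rng.standard_normal((p1, k1))  # row loading matrix
C = rng.standard_normal((p2, k2))  # column loading matrix

# Latent factor matrices with a simple AR(1)-type dynamic (illustrative).
F = np.zeros((T, k1, k2))
for t in range(1, T):
    F[t] = 0.5 * F[t - 1] + rng.standard_normal((k1, k2))

# Observed matrices: X_t = R F_t C' + E_t, with idiosyncratic noise E_t.
E = rng.standard_normal((T, p1, p2))
X = np.einsum('ij,tjk,lk->til', R, F, C) + E

print(X.shape)  # (100, 20, 15)
```

Note how 20 x 15 = 300 observed entries per period are driven by only 3 x 2 = 6 latent factors, which is the dimension reduction the abstract refers to.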
Zoom link: https://univr.zoom.us/j/87229792672