
Studies in Nonlinear Dynamics & Econometrics

Edited by Bruce Mizrach



An Information Theoretic Approach for Estimating Nonlinear Dynamic Models

Amos Golan, American University

Citation Information: Studies in Nonlinear Dynamics & Econometrics. Volume 7, Issue 4, ISSN (Online) 1558-3708, DOI: 10.2202/1558-3708.1174, December 2003

This article offers supplementary material which is provided at the end of the article.

Given the objective of estimating the unknown parameters of a possibly nonlinear dynamic model from a finite (and relatively small) data set, it is common to use a Kalman-filter Maximum Likelihood (ML) approach, ML-type estimators, or, more recently, GMM (Imbens, Spady, and Johnson, 1998), BMOM (Zellner, 1997), or other information-theoretic estimators (e.g., Golan, Judge, and Miller, 1996). Except for the BMOM, the ML-type methods above require distributional assumptions, while the moment-type estimators require assumptions on the moments of the underlying distribution that generated the data. In the BMOM approach, however, the sampling assumptions underlying most ML and other approaches are not imposed on the given data; the error terms are viewed as parameters with unknown values.

Building on a generalization of the Maximum Entropy (ME) principle, a semi-parametric, Information-Theoretic (IT) framework for estimating dynamic models with minimal distributional assumptions is formulated here. As in the BMOM approach, the errors are treated as another set of unknown parameters to be estimated. Thus, for any data set, the estimation problem is ill-posed (underdetermined): the number of unknowns always exceeds the number of data points. The Information-Theoretic approach is one way to estimate these unknowns.

After the basic IT (entropy) model is developed, a computationally efficient concentrated model is derived in which the optimization is carried out with respect to the Lagrange multipliers associated with each observation. This dual concentrated model is used to contrast the IT approach with the more traditional ML-type estimators. Statistics and inference procedures are developed as well, and Monte Carlo results for estimating the parameters of noisy, chaotic systems are presented.
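The generalized-maximum-entropy idea described in the abstract can be illustrated concretely: each unknown parameter and each error term is written as a convex combination of fixed support points with unknown probability weights, and the joint entropy of those weights is maximized subject to the data (model) constraints. The Python sketch below applies this to a noisy logistic map. The particular model, support points, sample size, and use of SciPy's SLSQP solver are illustrative assumptions, not the paper's implementation, which works with the computationally more efficient concentrated (dual) form.

# A minimal sketch of a generalized-maximum-entropy (GME) style estimator
# for a noisy first-order dynamic model.  Assumed example, not the paper's code.
import numpy as np
from scipy.optimize import minimize

# Simulated data: noisy logistic map y_{t+1} = a * y_t * (1 - y_t) + e_t
rng = np.random.default_rng(0)
a_true, T = 3.8, 50
y = np.empty(T + 1)
y[0] = 0.3
for t in range(T):
    y[t + 1] = a_true * y[t] * (1.0 - y[t]) + rng.normal(0.0, 0.02)

# Support points: the parameter and every error term are expressed as convex
# combinations of fixed support values with unknown probability weights.
z_a = np.array([2.0, 3.0, 4.0])      # support for the parameter a
z_e = np.array([-0.1, 0.0, 0.1])     # support for each error e_t
Ka, Ke = len(z_a), len(z_e)

def unpack(x):
    """Split the flat probability vector into parameter and error weights."""
    return x[:Ka], x[Ka:].reshape(T, Ke)

def neg_entropy(x):
    """Negative Shannon entropy of all probability weights (minimized)."""
    x = np.clip(x, 1e-12, None)
    return np.sum(x * np.log(x))

def model_constraints(x):
    """Data constraints: y_{t+1} = a * y_t * (1 - y_t) + e_t for every t."""
    p, w = unpack(x)
    a_hat = p @ z_a
    e_hat = w @ z_e
    return y[1:] - (a_hat * y[:-1] * (1.0 - y[:-1]) + e_hat)

constraints = [
    {"type": "eq", "fun": model_constraints},
    {"type": "eq", "fun": lambda x: unpack(x)[0].sum() - 1.0},        # p sums to 1
    {"type": "eq", "fun": lambda x: unpack(x)[1].sum(axis=1) - 1.0},  # each w_t sums to 1
]

x0 = np.concatenate([np.full(Ka, 1.0 / Ka), np.full(T * Ke, 1.0 / Ke)])
res = minimize(neg_entropy, x0, constraints=constraints,
               bounds=[(0.0, 1.0)] * x0.size, method="SLSQP")

p_hat, _ = unpack(res.x)
print("estimated a:", p_hat @ z_a)

This is the primal statement of the estimator, shown because it is the most transparent one; the paper instead optimizes over the Lagrange multipliers attached to the data constraints (one per observation), which yields the concentrated, lower-dimensional dual problem referred to in the abstract.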

Supplementary Article Materials
