
Approximate Inference for Observation-Driven Time Series Models with Intractable Likelihoods

Published: 01 May 2014

Abstract

In this article, we consider approximate Bayesian parameter inference for observation-driven time series models. Such statistical models appear in a wide variety of applications, including econometrics and applied mathematics. This article considers the scenario where the likelihood function cannot be evaluated pointwise; in such cases, one cannot perform exact statistical inference, including parameter estimation, which often requires advanced computational algorithms, such as Markov Chain Monte Carlo (MCMC). We introduce a new approximation based upon Approximate Bayesian Computation (ABC). Under some conditions, we show that as n → ∞, with n the length of the time series, the ABC posterior has, almost surely, a Maximum A Posteriori (MAP) estimator of the parameters that is often different from the true parameter. However, a noisy ABC MAP, which perturbs the original data, asymptotically converges to the true parameter, almost surely. In order to draw statistical inference, for the ABC approximation adopted, standard MCMC algorithms can have acceptance probabilities that fall at an exponential rate in n and slightly more advanced algorithms can mix poorly. We develop a new and improved MCMC kernel, which is based upon an exact approximation of a marginal algorithm, whose cost per iteration is random, but the expected cost, for good performance, is shown to be O(n²) per iteration. We implement our new MCMC kernel for parameter inference from models in econometrics.
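
As a concrete illustration of the kind of ABC approximation and MCMC kernel the abstract refers to, the sketch below implements a generic hit-or-miss pseudo-marginal ABC-MCMC sampler in Python for a toy observation-driven model. The Poisson autoregression, the summary statistics, the uniform prior, the tolerance eps, and all function names are assumptions made for this example; the sketch is not the authors' improved kernel, whose expected cost and acceptance behaviour are analysed in the article.

```python
import numpy as np

rng = np.random.default_rng(1)


def simulate_poisson_ar(theta, n, y0=1.0):
    """Simulate a toy observation-driven model (a Poisson autoregression):
    lambda_t = omega + alpha * y_{t-1},  y_t ~ Poisson(lambda_t).
    The model is an illustrative assumption, not the paper's example."""
    omega, alpha = theta
    y = np.empty(n)
    prev = y0
    for t in range(n):
        y[t] = rng.poisson(omega + alpha * prev)
        prev = y[t]
    return y


def summaries(y):
    """Low-dimensional summary statistics used for the ABC comparison
    (an assumption; the raw data or other summaries could be used instead)."""
    return np.array([y.mean(), y.std()])


def abc_likelihood_estimate(theta, s_obs, n, eps, N):
    """Hit-or-miss estimate of the ABC likelihood: the fraction of N
    pseudo-datasets whose summaries fall within an epsilon-ball of s_obs."""
    hits = 0
    for _ in range(N):
        if np.linalg.norm(summaries(simulate_poisson_ar(theta, n)) - s_obs) <= eps:
            hits += 1
    return hits / N


def in_prior_support(theta):
    """Uniform prior on a box (an assumption), so only the support matters."""
    omega, alpha = theta
    return 0.0 < omega < 10.0 and 0.0 < alpha < 1.0


def abc_mcmc(y_obs, theta_init, eps, N, iters, step=0.1):
    """Pseudo-marginal ABC-MCMC: a random-walk Metropolis chain on theta in
    which the intractable likelihood is replaced by its ABC estimate. With a
    plain hit-or-miss estimator the acceptance rate can collapse as n grows,
    which is the failure mode the paper's improved kernel addresses."""
    s_obs, n = summaries(y_obs), len(y_obs)
    theta = np.asarray(theta_init, dtype=float)
    like = abc_likelihood_estimate(theta, s_obs, n, eps, N)
    chain = []
    for _ in range(iters):
        prop = theta + step * rng.standard_normal(theta.shape)
        if in_prior_support(prop):
            like_prop = abc_likelihood_estimate(prop, s_obs, n, eps, N)
            # Flat prior and symmetric proposal: accept with probability
            # min(1, like_prop / like); treat 0/0 as a rejection.
            if (like > 0 and rng.uniform() < like_prop / like) or (like == 0 and like_prop > 0):
                theta, like = prop, like_prop
        chain.append(theta.copy())
    return np.array(chain)


if __name__ == "__main__":
    y = simulate_poisson_ar((2.0, 0.5), n=100)          # synthetic "observed" data
    out = abc_mcmc(y, theta_init=(2.5, 0.4), eps=1.0, N=20, iters=500)
    print("posterior mean of (omega, alpha):", out.mean(axis=0))
```

In this sketch the number of pseudo-datasets N is fixed, so the cost per iteration is deterministic; in the article's improved kernel the per-iteration cost is random, with the expected cost for good performance shown to be O(n²).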



• Published in

  ACM Transactions on Modeling and Computer Simulation, Volume 24, Issue 3 (May 2014), 142 pages
  ISSN: 1049-3301
  EISSN: 1558-1195
  DOI: 10.1145/2616590

      Copyright © 2014 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 1 May 2014
      • Accepted: 1 November 2013
      • Revised: 1 September 2013
      • Received: 1 June 2013
Published in TOMACS Volume 24, Issue 3


      Qualifiers

      • research-article
      • Research
      • Refereed
