Abstract
In this article, we consider approximate Bayesian parameter inference for observation-driven time series models. Such statistical models appear in a wide variety of applications, including econometrics and applied mathematics. We consider the scenario where the likelihood function cannot be evaluated pointwise; in such cases, exact statistical inference, including parameter estimation, is not possible, and one must resort to advanced computational algorithms such as Markov chain Monte Carlo (MCMC). We introduce a new approximation based upon approximate Bayesian computation (ABC). Under some conditions, we show that as n → ∞, with n the length of the time series, the ABC posterior has, almost surely, a maximum a posteriori (MAP) estimator of the parameters that is often different from the true parameter. However, a noisy ABC MAP, which perturbs the original data, converges almost surely to the true parameter as n → ∞. For the ABC approximation adopted, standard MCMC algorithms can have acceptance probabilities that decay at an exponential rate in n, and slightly more advanced algorithms can mix poorly. We develop a new and improved MCMC kernel, based upon an exact approximation of a marginal algorithm, whose cost per iteration is random but whose expected cost, for good performance, is shown to be O(n²) per iteration. We implement our new MCMC kernel for parameter inference in models from econometrics.
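To make the ABC approximation concrete, the sketch below implements a basic ABC-MCMC kernel of the kind whose acceptance behaviour the abstract discusses. Everything here is an illustrative assumption rather than the paper's actual algorithm: the toy observation-driven model (`simulate`, with recursion λ_t = θ·y_{t-1}), the mean/standard-deviation summary statistics, the flat prior on [−1, 1], and the tolerance `eps` are all hypothetical choices made for the example.

```python
import numpy as np

def simulate(theta, n, rng):
    """Toy observation-driven model (illustration only):
    y_t ~ Normal(lambda_t, 1), with lambda_t = theta * y_{t-1}."""
    y = np.empty(n)
    lam = 0.0
    for t in range(n):
        y[t] = rng.normal(lam, 1.0)
        lam = theta * y[t]
    return y

def abc_mcmc(y_obs, n_iter=500, eps=1.0, step=0.2, rng=None):
    """Basic ABC-MCMC: with a flat prior on [-1, 1] and a symmetric
    random-walk proposal, the MH ratio reduces to an indicator that a
    freshly simulated series lies within eps of the data in
    summary-statistic distance."""
    rng = rng or np.random.default_rng(0)
    n = len(y_obs)
    s_obs = np.array([y_obs.mean(), y_obs.std()])
    theta = 0.0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        y_sim = simulate(prop, n, rng)
        s_sim = np.array([y_sim.mean(), y_sim.std()])
        # Accept iff the proposal is in the prior support and the
        # simulated summaries fall inside the ABC tolerance ball.
        if abs(prop) <= 1.0 and np.linalg.norm(s_sim - s_obs) < eps:
            theta = prop
        chain[i] = theta
    return chain
```

Because each iteration must simulate an entire length-n series and hit a shrinking tolerance ball, the acceptance probability of such kernels can fall rapidly with n, which is the behaviour motivating the improved kernel described above.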
Index Terms
- Approximate Inference for Observation-Driven Time Series Models with Intractable Likelihoods