Working Papers, Technical Reports and Publications

Garland Durham and John Geweke, 2014. Adaptive Sequential Posterior Simulators for Massively Parallel Computing Environments (PDF, 352kB)

Publication: Bayesian Model Comparison (Advances in Econometrics, Volume 34), Ivan Jeliazkov and Dale J. Poirier, eds. Emerald Group Publishing Limited. Chapter 1, 1-44, 2014.

Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to exploit these benefits fully, algorithms that conform to parallel computing environments are needed. Sequential Monte Carlo comes very close to this ideal, whereas other approaches such as Markov chain Monte Carlo do not. This paper presents a sequential posterior simulator well suited to this computing environment. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and that works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental byproduct, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.

John Geweke, Garland Durham and Huaxin Xu, 2014. Bayesian Inference for Logistic Regression Models using Sequential Posterior Simulation (PDF, 211kB)

Publication: Current Trends in Bayesian Methodology with Applications (S.K. Upadhyay, U. Singh, D. K. Dey and A. Loganathan, eds.), CRC Press. Chapter 14, 289-312, 2015.

The logistic specification has been used extensively in non-Bayesian statistics to model the dependence of discrete outcomes on the values of specified covariates. Because the likelihood function is globally weakly concave, estimation by maximum likelihood is generally straightforward, even in commonly arising applications with scores or hundreds of parameters. In contrast, Bayesian inference has proven awkward, requiring normal approximations to the likelihood or specialised adaptations of existing Markov chain Monte Carlo and data augmentation methods. This paper approaches Bayesian inference in logistic models using recently developed generic sequential posterior simulation (SPS) methods that require little more than the ability to evaluate the likelihood function. Compared with existing alternatives, SPS is much simpler, and provides numerical standard errors and accurate approximations of marginal likelihoods as by-products. The SPS algorithm for Bayesian inference is amenable to massively parallel implementation, and when implemented using graphical processing units it compares well with the best existing alternative. The paper demonstrates these points by means of several examples.
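As the abstract notes, a generic SPS run needs little more than the ability to evaluate the likelihood. A minimal, numerically stable log-likelihood for the logistic model, essentially the only model-specific code such a run would require, might look like the sketch below; the function name and interface are illustrative, not from the paper.

```python
import numpy as np

def logistic_log_likelihood(beta, X, y):
    """Log likelihood of a logistic regression, evaluated stably.

    beta: (k,) coefficients; X: (n, k) covariates; y: (n,) outcomes in {0, 1}.
    log p(y_i | x_i) = y_i * eta_i - log(1 + exp(eta_i)) with eta_i = x_i' beta;
    np.logaddexp avoids overflow for large |eta_i|.
    """
    eta = X @ beta
    return np.sum(y * eta - np.logaddexp(0.0, eta))
```

Because this function is a sum of concave terms in beta, the global weak concavity the abstract mentions follows directly, which is what makes maximum likelihood easy while generic posterior simulation still required specialised machinery before SPS.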

John Geweke and Bart Frischknecht, 2014. Exact Optimisation by Means of Sequentially Adaptive Bayesian Learning (PDF, 934kB)

Simulated annealing is a well-established approach to optimisation that is robust for irregular objective functions. Recently it has been improved using sequential Monte Carlo. This paper presents further improvements that yield the global optimum with accuracy constrained only by the limitations of floating point arithmetic. Performance is illustrated using a standard set of six test problems in which simulated annealing has had mixed success. Our approach reliably finds the exact global optimum in all six cases, and with fewer function evaluations than competing simulated annealing algorithms. This approach is a specific case of the sequentially adaptive Bayesian learning algorithm, which uses feedback from the particles to adapt the design of the algorithm. The feature of this algorithm most critical to exact optimisation is targeted tempering, a new technique developed in this paper.
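The general idea of improving simulated annealing with sequential Monte Carlo can be sketched as follows: run a population of particles, reweight them by increasing powers of exp(f), and let resampling plus local moves concentrate the population on the global optimum. This is a generic tempered-particle optimiser on a toy bimodal objective, not the paper's targeted-tempering algorithm; the schedule, step-size rule, and objective are all assumptions made for illustration.

```python
import numpy as np

def f(x):
    # Toy bimodal objective (illustrative): global max 0 at x = 2,
    # local max -0.5 at x = -2.
    return -np.minimum((x - 2.0) ** 2, (x + 2.0) ** 2 + 0.5)

def smc_optimise(obj, n_particles=2000, lam_max=50.0, n_steps=50, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 4.0, n_particles)          # diffuse initial population
    lambdas = np.linspace(0.0, lam_max, n_steps + 1)
    for lam_prev, lam in zip(lambdas[:-1], lambdas[1:]):
        # Reweight by the incremental power of exp(obj): mass shifts to high obj.
        fx = obj(x)
        w = np.exp((lam - lam_prev) * (fx - fx.max()))
        w /= w.sum()
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
        # Local Metropolis moves keep the population diverse; shrink the
        # step size as tempering sharpens the target.
        step = 2.0 / np.sqrt(1.0 + lam)
        prop = x + step * rng.standard_normal(n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < lam * (obj(prop) - obj(x))
        x = np.where(accept, prop, x)
    return x[np.argmax(obj(x))]
```

Because the weighting factor between the two modes compounds at each step, the inferior mode is eliminated exponentially fast, which is the mechanism that makes population-based annealing robust on irregular objectives.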

John Geweke, 2015. Sequentially Adaptive Bayesian Learning for a Nonlinear Model of the Secular and Cyclical Behavior of US Real GDP (PDF, 511kB)

There is a one-to-one mapping between the conventional time series parameters of a third-order autoregression and the more interpretable parameters of secular half-life, cyclical half-life and cycle period. The latter parameterization is better suited to interpretation of results using both Bayesian and maximum likelihood methods, and to expression of a substantive prior distribution using Bayesian methods. The paper demonstrates how to approach both problems using the sequentially adaptive Bayesian learning algorithm and SABL software, which eliminates virtually all of the substantial technical overhead required in conventional approaches and produces results quickly and reliably. The work utilises methodological innovations in SABL, including optimization of irregular and multimodal functions and production of the conventional maximum likelihood asymptotic variance matrix as a by-product.
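The one-to-one mapping the abstract describes can be illustrated under a natural reading of the parameters: factor the AR(3) lag polynomial into one real root (the secular component) and a complex-conjugate pair (the cyclical component), with a half-life h corresponding to a root modulus of 2^(-1/h) and a cycle period p to an angle of 2*pi/p. The sketch below is a plausible version of such a mapping, not necessarily the paper's exact parameterization.

```python
import numpy as np

def ar3_from_halflives(secular_hl, cyclical_hl, period):
    """Map (secular half-life, cyclical half-life, cycle period) to AR(3)
    coefficients (phi1, phi2, phi3) in y_t = phi1 y_{t-1} + phi2 y_{t-2}
    + phi3 y_{t-3} + eps_t.

    Assumes the lag polynomial factors as
        (1 - a L)(1 - 2 r cos(th) L + r^2 L^2),
    with a = 2**(-1/secular_hl), r = 2**(-1/cyclical_hl), th = 2*pi/period.
    """
    a = 2.0 ** (-1.0 / secular_hl)     # real root modulus: a**secular_hl = 1/2
    r = 2.0 ** (-1.0 / cyclical_hl)    # complex-pair modulus
    th = 2.0 * np.pi / period          # complex-pair angle
    b, c = 2.0 * r * np.cos(th), r ** 2
    # Expanding the factored lag polynomial and matching coefficients:
    phi1 = a + b
    phi2 = -(c + a * b)
    phi3 = a * c
    return phi1, phi2, phi3
```

Since the map from the three interpretable quantities to the three roots (and hence to the coefficients) is invertible on the stationary region with complex cyclical roots, a prior placed on half-lives and period translates directly into a prior on the autoregressive coefficients.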

Garland Durham and John Geweke, 2015. Sequentially Adaptive Bayesian Learning Algorithms for Inference and Optimisation (PDF, 5.1MB)

The sequentially adaptive Bayesian learning (SABL) algorithm is an extension and combination of sequential particle filters for a static target and simulated annealing. A key distinction between SABL and these approaches is that the introduction of information in SABL is adaptive and controlled, with control guaranteeing that the algorithm performs reliably and efficiently in a wide variety of settings without any further problem-specific attention. This avoids the need for tedious tuning, tinkering, and trial and error. The algorithm is pleasingly parallel, and when executed using one or more graphics processing units it is much faster than competing algorithms, many of which are not pleasingly parallel and cannot exploit the massively parallel architecture of GPUs. This paper describes the algorithm, provides theoretical foundations more self-contained than those in the existing literature, presents applications to Bayesian inference and optimization problems illustrating many advantages of the algorithm, and briefly describes the nonproprietary SABL software.
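The adaptive and controlled introduction of information can be illustrated with a standard device from the adaptive sequential Monte Carlo literature: choose each tempering increment so that the effective sample size (ESS) of the incremental weights hits a preset target. The bisection rule below is a generic sketch of that idea, not necessarily SABL's exact control rule.

```python
import numpy as np

def next_temperature(log_like_vals, t_prev, ess_target):
    """Largest admissible tempering step given per-particle log likelihoods.

    Finds the increment dt such that the ESS of the incremental weights
    exp(dt * log_like) equals ess_target, by bisection; returns the new
    temperature t_prev + dt (capped at 1). ESS is monotone decreasing in dt
    whenever the log likelihoods are not all equal.
    """
    def ess(dt):
        w = np.exp(dt * (log_like_vals - log_like_vals.max()))
        return w.sum() ** 2 / (w ** 2).sum()

    lo, hi = 0.0, 1.0 - t_prev
    if ess(hi) >= ess_target:       # can jump straight to the posterior
        return 1.0
    for _ in range(50):             # bisect on the increment
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if ess(mid) >= ess_target else (lo, mid)
    return t_prev + lo
```

The appeal of this kind of rule is exactly what the abstract emphasises: the data, through the particles, determine how much information each cycle absorbs, so no tuning schedule has to be supplied in advance.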