Professor John Geweke

Professional

Internationally renowned econometrician John Geweke came to UTS as Distinguished Research Professor in the School of Business in 2009. Professor Geweke is distinguished for his contributions to econometric theory in time series analysis and Bayesian modelling, and for applications in macroeconomics, finance, and microeconomics. He is a Fellow of the Econometric Society and the American Statistical Association. He has served as co-editor of the Journal of Econometrics and the Journal of Applied Econometrics, and as editor of the Journal of Business and Economic Statistics. His most recent book is Complete and Incomplete Econometric Models, published by Princeton University Press in January 2010. He currently directs the six-investigator ARC-sponsored project, “Massively Parallel Algorithms for Bayesian Inference and Decision Making.”

Awards and Recognition

Fellow of the Econometric Society, since 1982
Fellow of the American Statistical Association, since 1990
Alfred P. Sloan Research Fellow, 1982-1984
H.I. Romnes Faculty Fellow, University of Wisconsin, 1982-1983
Dayton-Hudson Fellowship, 1970-1974
National Merit Scholar, 1966-1970
Member, Phi Beta Kappa and Phi Kappa Phi
Listed in Marquis' Who's Who in America and similar publications

Previous Academic Positions

Harlan McGregor Chair in Economic Theory and Professor of Economics and Statistics, University of Iowa, 1999-2009
Professor of Economics, University of Minnesota, 1990-2001
Director, Institute of Statistics and Decision Sciences, Duke University, 1987-1990
Professor of Statistics and Decision Sciences, Duke University, 1987-1990
William R. Kenan, Jr., Professor of Economics, Duke University, 1986-1990
Professor of Economics, Duke University, 1983-1986
Visiting Professor of Economics, Carnegie-Mellon University, 1982-1983
Visiting Professor of Statistics, Carnegie-Mellon University, 1982-1983
Professor of Economics, University of Wisconsin-Madison, 1982-1983
Associate Professor of Economics, University of Wisconsin-Madison, 1979-1982
Visiting Fellow, Warwick University, 1979
Assistant Professor of Economics, University of Wisconsin-Madison, 1975-1979

Casual Academic, University Casual Academics
Associate Member, AAI - Advanced Analytics Institute
Doctor of Philosophy

Phone
+61 2 9514 9797

Books

Geweke, J., Koop, G. & Van Dijk, H. 2012, The Oxford Handbook of Bayesian Econometrics, Oxford University Press, Oxford.
View/Download from: Publisher's site
Bayesian econometric methods have enjoyed an increase in popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. The Oxford Handbook of Bayesian Econometrics is a single source about Bayesian methods in specialized fields. It contains articles by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It reviews the state of the art in Bayesian econometric methodology, with articles on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians such as state space models and particle filtering. It also includes articles on Bayesian principles and methodology.
Geweke, J. 2010, Complete and Incomplete Econometric Models, 1, Princeton University Press, Princeton, USA.
View/Download from: UTS OPUS
Econometric models are widely used in the creation and evaluation of economic policy in the public and private sectors. But these models are useful only if they adequately account for the phenomena in question, and they can be quite misleading if they do not. In response, econometricians have developed tests and other checks for model adequacy. All of these methods, however, take as given the specification of the model to be tested. In this book, John Geweke addresses the critical earlier stage of model development, the point at which potential models are inherently incomplete. Summarizing and extending recent advances in Bayesian econometrics, Geweke shows how simple modern simulation methods can complement the creative process of model formulation. These methods, which are accessible to economics PhD students as well as to practicing applied econometricians, streamline the processes of model development and specification checking. Complete with illustrations from a wide variety of applications, this is an important contribution to econometrics that will interest economists and PhD students alike.
Geweke, J. 2005, Contemporary Bayesian Econometrics and Statistics, 1, John Wiley & Sons Inc, USA.
View/Download from: Publisher's site
From the back cover: This publication provides readers with a thorough understanding of Bayesian analysis that is grounded in the theory of inference and optimal decision-making. Contemporary Bayesian Econometrics and Statistics provides readers with state-of-the-art simulation methods and models that are used to solve complex real-world problems. Armed with a strong foundation in both theory and practical problem-solving tools, readers discover how to optimize decision-making when faced with problems that involve limited or imperfect data. The book begins by examining the theoretical and mathematical foundations of Bayesian statistics to help readers understand how and why it is used in problem solving. The author then describes how modern simulation methods make Bayesian approaches practical using widely available mathematical applications software. In addition, the author details how models can be applied to specific problems, including linear models and policy choices, modeling with latent variables and missing data, time series models and prediction, and comparison and evaluation of models. The publication has been developed and fine-tuned through a decade of classroom experiences, and readers will find the author's approach very engaging and accessible. There are nearly 200 examples and exercises to help readers see how effective use of Bayesian statistics enables them to make optimal decisions. Matlab and S-PLUS computer programs are integrated throughout the book. An accompanying web site provides readers with datasets and computer code for many examples. This publication is tailored for research professionals who use econometrics and similar statistical methods in their work. With its emphasis on practical problem solving and extensive use of examples and exercises, this is also an excellent textbook for graduate-level students in a broad range of fields, including economics, statistics, the social sciences, business, and public policy.
Geweke, J. & Keane, M. 2001, Chapter 56 Computationally intensive methods for integration in econometrics.
View/Download from: Publisher's site
Until recently, inference in many interesting models was precluded by the requirement of high dimensional integration. But dramatic increases in computer speed, and the recent development of new algorithms that permit accurate Monte Carlo evaluation of high dimensional integrals, have greatly expanded the range of models that can be considered. This chapter presents the methodology for several of the most important Monte Carlo methods, supplemented by a set of concrete examples that show how the methods are used. Some of the examples are new to the econometrics literature. They include inference in multinomial discrete choice models and selection models in which the standard normality assumption is relaxed in favor of a multivariate mixture of normals assumption. Several Monte Carlo experiments indicate that these methods are successful at identifying departures from normality when they are present. Throughout the chapter the focus is on inference in parametric models that permit rich variation in the distribution of disturbances. The chapter first discusses Monte Carlo methods for the evaluation of high dimensional integrals, including integral simulators like the GHK method, and Markov Chain Monte Carlo methods like Gibbs sampling and the Metropolis-Hastings algorithm. It then turns to methods for approximating solutions to discrete choice dynamic optimization problems, including the methods developed by Keane and Wolpin, and Rust, as well as methods for circumventing the integration problem entirely, such as the approach of Geweke and Keane. The rest of the chapter deals with specific examples: classical simulation estimation for multinomial probit models, both in the cross sectional and panel data contexts; univariate and multivariate latent linear models; and Bayesian inference in dynamic discrete choice models in which the future component of the value function is replaced by a flexible polynomial.
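The GHK integral simulator mentioned in this abstract can be sketched compactly. The following is a minimal illustrative Python version, not the chapter's own code; the function name and interface (taking the lower-triangular Cholesky factor of the covariance directly) are my own choices:

```python
import random
from statistics import NormalDist

_std = NormalDist()

def ghk_probability(chol, lower, upper, n_draws=5000, seed=1):
    """Estimate P(lower < x < upper) for x ~ N(0, LL') by the GHK
    recursion, where `chol` is the lower-triangular Cholesky factor L.
    Each draw samples the latent errors one coordinate at a time from
    truncated standard normals and accumulates the product of the
    conditional interval masses."""
    rng = random.Random(seed)
    m = len(lower)
    total = 0.0
    for _ in range(n_draws):
        e = [0.0] * m
        weight = 1.0
        for i in range(m):
            partial = sum(chol[i][j] * e[j] for j in range(i))
            lo = _std.cdf((lower[i] - partial) / chol[i][i])
            hi = _std.cdf((upper[i] - partial) / chol[i][i])
            if hi <= lo:             # region unreachable given earlier draws
                weight = 0.0
                break
            weight *= hi - lo        # mass of the conditional interval
            u = rng.uniform(lo, hi)  # inverse-CDF draw from the truncation
            e[i] = _std.inv_cdf(min(max(u, 1e-12), 1.0 - 1e-12))
        total += weight
    return total / n_draws
```

For a diagonal factor the weights are deterministic and the estimate is exact; correlated coordinates are where the simulator earns its keep.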
Geweke, J., Bonnen, J., Koshel, J. & White, A. 1999, Sowing the Seeds: Informing Public Policy in the Economic Research Service of USDA, National Academy Press, Washington.
Geweke, J., Berry, D. & Chaloner, K. 1996, Bayesian Statistics and Econometrics: Essays in Honor of Arnold Zellner, Wiley, New York.
Geweke, J. 1996, Chapter 15 Monte Carlo simulation and numerical integration.
View/Download from: Publisher's site
Geweke, J., Caines, P., Parzen, M. & Taqqu, M. 1993, New Directions in Time Series Analysis, Parts I and II., Springer-Verlag, New York.
Geweke, J. 1992, Decision Making under Risk and Uncertainty: New Models and Empirical Findings., Kluwer Academic Publishers, Dordrecht.
Geweke, J., Barnett, W. & Shell, K. 1989, Economic Complexity: Chaos, Sunspots, Bubbles and Nonlinearity, Cambridge University Press, Cambridge.
Geweke, J. 1985, Inferring Household Demand for Durable Goods, with Heterogeneous Preferences: A Case Study.
Geweke, J. 1984, Chapter 19 Inference and causality in economic time series models.
View/Download from: Publisher's site

Chapters

Geweke, J., Durham, G. & Xu, H. 2015, 'Bayesian Inference for Logistic Regression Models Using Sequential Posterior Simulation' in Upadhyay, S., Singh, U., Dey, D. & Loganathan, A. (eds), Current Trends in Bayesian Methodology with Applications, CRC Press, USA, pp. 290-310.
View/Download from: UTS OPUS
Durham, G. & Geweke, J. 2014, 'Adaptive Sequential Posterior Simulators for Massively Parallel Computing Environments' in Jeliazkov, I. & Poirier, D. (eds), Bayesian Model Comparison (Advances in Econometrics), Emerald Group Publishing Limited, USA, pp. 1-44.
View/Download from: UTS OPUS or Publisher's site
Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
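The core sequential mechanism the abstract describes can be caricatured in a few lines. This is a deliberately stripped-down sketch, not the paper's algorithm: particles start at the prior, absorb one observation at a time by importance reweighting, and are resampled; the Metropolis "move" phase that the paper uses to restore particle diversity is omitted here:

```python
import math
import random

def sequential_posterior(data, prior_draw, loglik, n_particles=2000, seed=0):
    """Data-tempered sequential posterior simulation in its simplest form:
    draw particles from the prior, absorb one observation at a time by
    importance reweighting, then resample.  A full sequential simulator
    would follow each resampling with MCMC moves; that phase is omitted."""
    rng = random.Random(seed)
    particles = [prior_draw(rng) for _ in range(n_particles)]
    for y in data:
        logw = [loglik(theta, y) for theta in particles]
        mx = max(logw)  # stabilise the exponentials
        weights = [math.exp(lw - mx) for lw in logw]
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return particles
```

Note that, as the abstract says, the user supplies only a prior simulator and density evaluations; everything else is generic, which is what makes the approach friendly to parallel hardware.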
Geweke, J., Koop, G. & van Dijk, H. 2011, 'Introduction' in Geweke, J., Koop, G. & van Dijk, H. (eds), The Oxford Handbook of Bayesian Econometrics, Oxford University Press, Oxford, pp. 1-8.
View/Download from: UTS OPUS
Bayesian econometric methods have enjoyed an increase in popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. This handbook is a single source for researchers and policymakers wanting to learn about Bayesian methods in specialized fields, and for graduate students seeking to make the final step from textbook learning to the research frontier. It contains contributions by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It reviews the state of the art in Bayesian econometric methodology, with chapters on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians such as state space models and particle filtering. It also includes chapters on Bayesian principles and methodology.
Geweke, J. 2009, 'The SETAR Model of Tong and Lim and Advances in Computation' in Chan, K.S. (ed), Exploration of a Nonlinear World: An Appreciation of Howell Tong's Contributions to Statistics., World Scientific, Singapore, pp. 85-94.
View/Download from: UTS OPUS
This discussion revisits Tong and Lim's seminal 1980 paper on the SETAR model in the context of advances in computation since that time. Using the Canadian lynx data set from that paper, it compares exact maximum likelihood estimates with those in the original paper. It illustrates the application of Bayesian MCMC methods, developed in the intervening years, to this model and data set. It shows that SETAR is a limiting case of mixture of experts models and studies the application of one variant of those models to the lynx data set. The application is successful, despite the small size of the data set and the complexity of the model. Predictive likelihood ratios favor Tong and Lim's original model.
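For readers unfamiliar with the SETAR model discussed here, a two-regime SETAR(1) path is easy to simulate. The coefficients below are illustrative placeholders, not Tong and Lim's lynx estimates:

```python
import random

def simulate_setar(n, r=0.0, phi_low=0.6, phi_high=-0.4, sd=1.0, seed=5):
    """Simulate a two-regime SETAR(1) series: the autoregressive
    coefficient switches according to whether the previous value lies
    below or above the threshold r (self-exciting, delay 1)."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        phi = phi_low if x[-1] <= r else phi_high
        x.append(phi * x[-1] + rng.gauss(0.0, sd))
    return x
```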
Geweke, J., Horowitz, J.L. & Pesaran, H. 2008, 'Econometrics' in Durlauf, S.N. & Blume, L.E. (eds), The New Palgrave Dictionary of Economics online, Palgrave Macmillan, Online, pp. 1-32.
View/Download from: UTS OPUS or Publisher's site
As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly. Major advances have taken place in the analysis of cross-sectional data by means of semiparametric and nonparametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged and attempts have been made to take it into account either by integrating out its effects or by modelling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment valuation have been given a more satisfactory foundation. New time-series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Nonlinear econometric techniques are used increasingly in the analysis of cross-section and time-series observations. Applications of Bayesian techniques to econometric problems have been promoted largely by advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided the investigators with a unifying framework where the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus providing a basis for 'real time econometrics'.
Keane, M. & Geweke, J. 2006, 'Bayesian Cross-Sectional Analysis of the Conditional Distribution of Earnings of Men in the USA (1967-1996)' in Upadhyay, S.K., Singh, U. & Dey, D.K. (eds), Bayesian Statistics and Its Applications, Anshan Ltd, New Delhi, pp. 160-197.
View/Download from: UTS OPUS
Geweke, J. & Whiteman, C. 2006, 'Bayesian forecasting' in Elliot, G., Granger, C.W.J. & Timmerman, A. (eds), Handbook of Economic Forecasting, Elsevier, The Netherlands, pp. 3-80.
View/Download from: UTS OPUS or Publisher's site
Bayesian forecasting is a natural product of a Bayesian approach to inference. The Bayesian approach in general requires explicit formulation of a model, and conditioning on known quantities, in order to draw inferences about unknown ones. In Bayesian forecasting, one simply takes a subset of the unknown quantities to be future values of some variables of interest. This chapter presents the principles of Bayesian forecasting, and describes recent advances in computational capabilities for applying them that have dramatically expanded the scope of applicability of the Bayesian approach. It describes historical developments and the analytic compromises that were necessary prior to recent developments, the application of the new procedures in a variety of examples, and reports on two long-term Bayesian forecasting exercises
Geweke, J., Houser, D.E. & Keane, M. 2001, 'Simulation Based Inference for Dynamic Multinomial Choice Models' in A Companion to Theoretical Econometrics, Blackwell Publishing, USA, pp. 466-493.
Geweke, J. & Keane, M. 2001, 'Computationally intensive methods for integration in econometrics' in Handbook of Econometrics, Elsevier Inc, North Holland, pp. 3463-3568.
Geweke, J. 2001, 'Embedding Bayesian Tools in Mathematical Software' in George, E.I. (ed), Bayesian Methods with Applications to Science, Policy and Official Statistics., Eurostat, Brussels, pp. 165-174.
The BACC software provides its users with tools for Bayesian Analysis, Computation and Communications. These tools are embedded in mathematical software applications such as Matlab and Gauss. From the user's perspective, there is a seamless integration of special-purpose BACC commands with the powerful built-in commands of the application. Several models are currently available, and BACC is designed to be extendible. We give a brief demonstration of the use of BACC for Matlab, and discuss the implementation of new models for BACC.
Geweke, J. 2000, 'Simulation-Based Bayesian Inference for Economic Time Series' in Mariano, R., Schuermann, T. & Weeks, M.J. (eds), Simulation-Based Inference in Econometrics: Methods and Applications, Cambridge University Press, Cambridge, pp. 255-299.
Geweke, J. & Keane, M. 2000, 'Bayesian Inference for Dynamic Discrete Choice Models without the Need for Dynamic Programming' in Mariano, Schuermann & Weeks (eds), Simulation-based Inference in Econometrics: Methods and Applications, Cambridge University Press, UK, pp. 100-131.
Geweke, J. 1999, 'Simulation Methods for Model Criticism and Robustness Analysis' in Berger, J.O., Bernado, J.M., Dawid, A.P. & Smith, A.F.M. (eds), Bayesian Statistics 6, Oxford University Press, Oxford, UK, pp. 275-299.
Abstract: This paper exposits and develops Bayesian methods of model criticism and robustness analysis. The objectives are to clarify the Bayesian interpretation of non-Bayesian diagnostic tests, and provide explicitly Bayesian procedures accessible to practical investigators. Specific methods for prior density criticism and robustness analysis, and data density criticism, are presented. All are based on the approximation of appropriate Bayes factors, and avoid the need for posterior simulation under alternative model specifications. A general method of data density criticism is developed, that requires neither posterior simulation nor analytical approximations under any model specification. Some of the methods presented here have been implemented in user oriented software. The paper presents a few simple illustrations of the methods.
Geweke, J. & Keane, M. 1999, 'Mixture of Normals Probit Models' in Hsiao, C., Lahiri, K., Lee, L.F. & Pesaran, M.H. (eds), Analysis of Panels and Limited Dependent Variables: A Volume in Honor of G. S. Maddala, Cambridge University Press, Cambridge, pp. 49-78.
Abstract: This paper generalizes the normal probit model of dichotomous choice by introducing mixtures of normals distributions for the disturbance term. By mixing on both the mean and variance parameters and by increasing the number of distributions in the mixture these models effectively remove the normality assumption and are much closer to semiparametric models. When a Bayesian approach is taken, there is an exact finite-sample distribution theory for the choice probability conditional on the covariates. The paper uses artificial data to show how posterior odds ratios can discriminate between normal and nonnormal distributions in probit models. The method is also applied to female labor force participation decisions in a sample with 1,555 observations from the PSID. In this application, Bayes factors strongly favor mixture of normals probit models over the conventional probit model, and the most favored models have mixtures of four normal distributions for the disturbance term.
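The choice probability in the mixture-of-normals probit described above has a closed form once the mixture components are given: a weighted sum of normal tail probabilities. A small illustrative sketch (my own naming and interface, not the paper's code):

```python
from statistics import NormalDist

_std = NormalDist()

def mixture_probit_prob(xb, weights, means, sds):
    """Choice probability P(y = 1 | x) for y = 1{x'b + e > 0} when the
    disturbance e is a finite mixture of normals,
    e ~ sum_k weights[k] * N(means[k], sds[k]**2).
    P(e > -x'b) is the weighted sum of the components' upper-tail masses."""
    return sum(w * (1.0 - _std.cdf((-xb - m) / s))
               for w, m, s in zip(weights, means, sds))
```

With a single N(0, 1) component this collapses to the conventional probit probability Phi(x'b), which is the nesting the paper exploits when comparing models by Bayes factors.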
Geweke, J. 1999, 'Some Experiments in Constructing A Hybrid Model for Macroeconomic Analysis: A Comment' in McCallum, B. (ed), Carnegie-Rochester Conference Series on Public Policy, Elsevier, pp. 143-147.
Geweke, J. 1997, 'Posterior Simulators in Econometrics' in Kreps, D.M. & Wallis, K.F. (eds), Advances in Economics and Econometrics: Theory and Applications, Cambridge University Press, Cambridge, pp. 128-165.
Abstract: The development of posterior simulators in the last decade has revised beliefs about the foregoing three propositions held by many econometricians who have followed these developments closely. The purpose of this paper is to convey these innovations and their significance for applied econometrics, to econometricians who have not followed the relevant mathematical and applied literature. There are four substantive sections. One section reviews aspects of Bayesian inference essential to understanding the implications of posterior simulators for Bayesian econometrics. Another section describes these simulators and provides the essential convergence results. Implications of these procedures for some selected econometric models are drawn in a third section. This is done to indicate the range of tasks to which posterior simulators are well suited, rather than provide a representative survey of the recent Bayesian econometric literature. Finally, the paper turns to some implications for model comparison, and for communication between those who do applied work and their audiences, that are beginning to emerge from the use of posterior simulators in Bayesian econometrics.
Geweke, J. 1996, 'Bayesian Inference for Linear Models Subject to Linear Inequality Constraints' in Modeling And Prediction - Honoring Seymour Geisser, Springer-verlag, New York, pp. 248-263.
Abstract: The normal linear model, with sign or other linear inequality constraints on its coefficients, arises very commonly in many scientific applications. Given inequality constraints, Bayesian inference is much simpler than classical inference, but standard Bayesian computational methods become impractical when the posterior probability of the inequality constraints (under a diffuse prior) is small. This paper shows how the Gibbs sampling algorithm can provide an alternative, attractive approach to inference subject to linear inequality constraints in this situation, and how the GHK probability simulator may be used to assess the posterior probability of the constraints.
Geweke, J. 1996, 'Monte Carlo Simulation and Numerical Integration' in Amman, H.M., Kendrick, D.A. & Rust, J. (eds), Handbook of Computational Economics, Elsevier, pp. 731-800.
Abstract: This is a survey of simulation methods in economics, with a specific focus on integration problems. It describes acceptance methods, importance sampling procedures, and Markov chain Monte Carlo methods for simulation from univariate and multivariate distributions and their application to the approximation of integrals. The exposition gives emphasis to combinations of different approaches and assessment of the accuracy of numerical approximations to integrals and expectations. The survey illustrates these procedures with applications to simulation and integration problems in economics.
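Of the methods the survey covers, importance sampling is the easiest to show in miniature. A self-normalized sketch (my own interface; the survey itself is method-agnostic):

```python
import math
import random

def importance_sampling_mean(h, log_f, log_g, g_draw, n=20000, seed=3):
    """Self-normalized importance sampling: approximate E_f[h(X)] using
    draws from a proposal density g.  log_f and log_g may omit their
    normalizing constants, because the weights are renormalized."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = g_draw(rng)
        w = math.exp(log_f(x) - log_g(x))  # unnormalized weight f(x)/g(x)
        num += w * h(x)
        den += w
    return num / den
```

The usual caveat from the survey applies: the proposal g must have tails at least as heavy as f, or the weights (and the accuracy assessment the survey emphasizes) blow up.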
Geweke, J. 1996, 'Variable Selection and Model Comparison in Regression' in Bernardo, J.M., Berger, J.O., Dawid, A.P. & Smith, A.F.M. (eds), Bayesian Statistics 5, Oxford University Press, Oxford, pp. 609-620.
Abstract: In the specification of linear regression models it is common to indicate a list of candidate variables from which a subset enters the model with nonzero coefficients. This paper interprets this specification as a mixed continuous-discrete prior distribution for coefficient values. It then utilizes a Gibbs sampler to construct posterior moments. It is shown how this method can incorporate sign constraints and provide posterior probabilities for all possible subsets of regressors. The methods are illustrated using some standard data sets.
Geweke, J. 1996, 'Variable selection Tests of Asset Pricing Models' in Gatsonis, C., Hodges, J.S., Kass, R.E., McCulloch, R.E., Rossi, P. & Singpurwalla, N.D. (eds), Case Studies in Bayesian Statistics (Volume 3), Springer-Verlag, New York.
Geweke, J. 1993, 'Inference and Forecasting for Chaotic Nonlinear Time Series' in Day, R.H. & Chen, P. (eds), Nonlinear Dynamics and Evolutionary Economics, Oxford University Press, Oxford.
Geweke, J. 1993, 'A Dynamic Index Model for Large Cross Sections' in Stock, J. & Watson, M. (eds), New Research on Business Cycles, Indicators and Forecasting, National Bureau of Economic Research, New York.
Geweke, J. 1992, 'Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments' in Bernardo, J.M., Berger, J.O., Dawid, A.P. & Smith, A.F.M. (eds), Bayesian Statistics 4, Oxford University Press, Oxford, pp. 169-194.
Abstract: Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper methods from spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence. These methods are illustrated in the normal linear model with informative priors, and in the Tobit-censored regression model.
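The convergence diagnostic this chapter is known for compares the mean of an early segment of the chain with the mean of a late segment, standardized by numerical standard errors. The sketch below substitutes a batch-means estimate for the chapter's spectral estimator of the numerical standard error, so it is an approximation of the idea rather than the chapter's procedure:

```python
import math

def geweke_z(chain, first=0.1, last=0.5, n_batches=20):
    """Convergence diagnostic in the spirit of the chapter: a z-score
    comparing the mean of the first `first` fraction of the chain with
    the mean of the last `last` fraction.  Numerical standard errors are
    estimated by batch means, a simple stand-in for the spectral
    estimator used in the chapter."""
    def mean(xs):
        return sum(xs) / len(xs)

    def nse(xs):
        b = len(xs) // n_batches
        bm = [mean(xs[i * b:(i + 1) * b]) for i in range(n_batches)]
        m = mean(bm)
        var_bm = sum((x - m) ** 2 for x in bm) / (n_batches - 1)
        return math.sqrt(var_bm / n_batches)

    head = chain[: int(first * len(chain))]
    tail = chain[int((1.0 - last) * len(chain)):]
    return (mean(head) - mean(tail)) / math.sqrt(nse(head) ** 2 + nse(tail) ** 2)
```

A large |z| suggests the sampler had not yet reached its stationary distribution when the early segment was recorded.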
Geweke, J., Barnett, W. & Wolfe, M. 1991, 'Seminonparametric Bayesian Estimation of Consumer Demand and Factor Demand Functions' in Barnett, W.A., Cornet, B., D'Aspremont, C., Gabszewicz, J. & Mas-Colell, A. (eds), Equilibrium Theory and Applications, Cambridge University Press, Cambridge, pp. 425-480.
Geweke, J., Barnett, W. & Yue, P. 1991, 'Semiparametric Bayesian Estimation of the Asymptotically Ideal Model: The AIM Demand System' in Barnett, W.A., Powell, J. & Tauchen, G.E. (eds), Nonparametric and Semiparametric Methods in Econometrics and Statistics, Cambridge University Press, Cambridge, pp. 127-174.
Geweke, J. 1989, 'Modeling with Normal Polynomial Expansions' in Barnett, W.A., Geweke, J. & Shell, K. (eds), Economic Complexity: Chaos, Sunspots, Bubbles, and Nonlinearity, Cambridge University Press, Cambridge, pp. 337-360.
Abstract: Polynomial expansions of the normal probability density function are proposed as a class of models for unobserved components. Operational procedures for Bayesian inference in these models are developed, as are methods for combining a sequence of such models and evaluation of the hypotheses of normality and symmetry. The contributions of this chapter are illustrated with an application to daily rates of change in stock price.
Geweke, J. 1988, 'Exact Inference in Models with Autoregressive Conditional Heteroskedasticity' in Barnett, W.A., Berndt, E.R. & White, H. (eds), Dynamic Econometric Modeling, Cambridge University Press, Cambridge, pp. 73-103.
Geweke, J. 1987, 'Endogeneity and Exogeneity' in Eatwell, J., Milgate, M. & Newman, P. (eds), The New Palgrave: A Dictionary of Economic Theory and Doctrine., The Macmillan Press, London.
Geweke, J. 1984, 'Inference and Causality in Economic Time Series Models' in Griliches, Z. & Intriligator, M.D. (eds), Handbook of Econometrics, Volume 2, North-Holland, Amsterdam, pp. 1101-1144.
Geweke, J. 1983, 'Causality, Exogeneity, and Inference' in Hildenbrand, D. (ed), Advances in Econometrics, Cambridge University Press, Cambridge, pp. 209-236.
Geweke, J. 1982, 'Feedback Between Monetary Policy, Labor Market Activity, and Wage Inflation in the United States, 1955-1978' in Workers, Jobs and Inflation, Brookings Institution Press, Washington, pp. 159-198.
Geweke, J. & Weisbrod, B. 1980, 'Some Economic Consequences of Technological Advance in Medical Care: The Case of a New Drug' in Helms, R. (ed), Drugs and Health, American Enterprise Institute, Washington.
Geweke, J. & Dent, W. 1980, 'On Specification in Simultaneous Equation Models' in Kmenta, J. & Ramsey, J.B. (eds), Evaluation of Econometric Models, Elsevier Science & Technology Books, pp. 169-196.
Geweke, J. 1978, 'The Temporal and Sectoral Aggregation of Seasonally Adjusted Time Series' in Zellner, A. (ed), Seasonal Analysis of Economic Time Series, Government Printing Office, Washington, US, pp. 411-432.
Abstract: Procedures for the optimal seasonal adjustment of economic time series and their aggregation are derived, given a criterion suitable for the adjustment of the data used in political or journalistic contexts. It is shown that data should be adjusted jointly and then temporally or sectorally aggregated, as desired, a procedure that preserves linear aggregation identities. Examination of actual economic time series indicates that the optimal seasonal adjustment and aggregation of data provide a substantial improvement in the quality of sectorally disaggregated, adjusted data and considerably reduces the required subsequent revision of current adjusted series.
Geweke, J. 1978, 'The Revision of Seasonally Adjusted Time Series' in 1978 Proceedings of the Business and Economic Statistics Section - American Statistical Association, pp. 320-325.
Geweke, J. 1977, 'The Dynamic Factor Analysis of Economic Time Series Model' in Aigner, D. & Goldberger, A. (eds), Latent Variables in Socioeconomic Models, North-Holland, Amsterdam, pp. 365-383.
Geweke, J. 1977, 'Wage and Price Dynamics in U.S. Manufacturing' in New Methods in Business Cycle Research, Federal Reserve Bank of Minneapolis, Minneapolis, MN USA, pp. 111-158.

Conferences

Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S.J. 2011, 'Economic rationality, risk presentation, and retirement portfolio choice', Financial Management Association Annual Meeting, Denver, USA.
Geweke, J. 2010, 'Complete and Incomplete Bayesian Models for Financial Time Series', JSM Proceedings, Section on Bayesian Statistical Science, American Statistical Association, Vancouver, Canada.
This paper introduces the idea of an incomplete Bayesian model, which is a (possibly incoherent) prior predictive distribution for sample moments. Conventional complete Bayesian models also provide prior distributions for sample moments, and consequently formal comparison of complete and incomplete models can be conducted by means of posterior odds ratios. This provides a logically consistent and workable Bayesian alternative to non-Bayesian significance tests and is an effective tool in the process of model development. These ideas are illustrated using three well-known alternative models for monthly S&P 500 index returns.
Geweke, J. 2000, 'Bayesian Communication: The BACC System', 2000 Proceedings of the Section on Bayesian Statistical Sciences - American Statistical Association, pp. 40-49.
Geweke, J., Keane, M. & Runkle, D. 1994, 'Recursively Simulating Multinomial Multiperiod Probit Probabilities', American Statistical Association 1994 Proceedings of the Business and Economic Statistics Section.
Geweke, J. & Terui, N. 1991, 'Threshold Autoregressive Models for Macroeconomic Time Series: A Bayesian Approach', American Statistical Association 1991 Proceedings of the Business and Economic Statistics Section, American Statistical Association, pp. 42-50.
Geweke, J. 1991, 'Efficient Simulation from the Multivariate Normal and Student-t Distributions Subject to Linear Constraints', Computing Science and Statistics: Proceedings of the Twenty-Third Symposium on the Interface, Interface Foundation of North America, Fairfax, pp. 571-578.
The following routines constitute the software for the paper, "Efficient Simulation from the Multivariate normal and Student-t Distributions Subject to Linear Constraints and the Evaluation of Constraint Probabilities," by John Geweke. This paper is to appear in the volume, "Computing Science and Statistics: Proceedings of the Twenty-Third Symposium on the Interface." This work was supported by NSF Grant SES-8908365.
Geweke, J. 1989, 'The Posterior Distribution of Roots in Multivariate Autoregressions', Erasmus University Rotterdam, Econometric Institute.
Geweke, J. 1989, 'Acceleration Methods for Monte Carlo Integration in Bayesian Inference', American Statistical Association, Alexandria, pp. 587-592.
Abstract: Methods for the acceleration of Monte Carlo integration with n replications in a sample of size T are investigated. A general procedure for combining antithetic variation and grid methods with Monte Carlo methods is proposed, and it is shown that the numerical accuracy of these hybrid methods can be evaluated routinely. The derivation indicates the characteristics of applications in which acceleration is likely to be most beneficial. This is confirmed in a worked example, in which these acceleration methods reduce the computation time required to achieve a given degree of numerical accuracy by several orders of magnitude.
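The antithetic-variation idea in this abstract can be sketched in a few lines. This is a minimal illustration, assuming a toy normal "posterior" for a single parameter; it omits the paper's grid and hybrid methods, and all values below are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 2.0, 10_000   # toy "posterior": N(mu, sigma^2)

x = rng.normal(mu, sigma, n)

plain = x.mean()                                # ordinary Monte Carlo estimate of E[x]
anti = np.concatenate([x, 2 * mu - x]).mean()   # add the antithetic mirror of each draw

# For an integrand that is linear (more generally, antisymmetric about mu),
# the errors within each antithetic pair cancel exactly.
print(plain, anti)
```

The ordinary estimate carries Monte Carlo error of order sigma/sqrt(n); the antithetic estimate here recovers mu up to floating point, which is the extreme case of the variance reduction the abstract describes.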
Geweke, J. 1986, 'Fixed Investment in the American Business Cycle', The American Business Cycle: Continuity and Change, National Bureau of Economic Research, New York.
Geweke, J. 1983, 'Semi-Nonparametric and Nonparametric Regression: Consumer Demand Applications', Proceedings of the Business and Economic Section - American Statistical Association, American Statistical Association, USA.
Geweke, J. 1983, 'Models of X-11 and 'X-11 Forecast' Procedures', Applied Time Series Analysis of Economic Data, U.S. Bureau of the Census, Washington, pp. 12-13.
Geweke, J. 1982, 'New Divisia Indices of the Money Supply', Proceedings of the Business and Economics Section, American Statistical Association.
Geweke, J. 1978, 'On the Synthesis of Time Series and Econometric Models', Directions in Time Series, Institute of Mathematical Statistics.
Geweke, J. 1978, 'Some Recent Developments in Seasonal Adjustment', Directions in Time Series, Institute of Mathematical Statistics.

Journal articles

Bateman, H., Eckert, C., Geweke, J., Louviere, J., Satchell, S. & Thorp, S. 2016, 'Risk Presentation and Portfolio Choice', Review of Finance, vol. 20, no. 1, pp. 201-229.
View/Download from: UTS OPUS or Publisher's site
Durham, G., Geweke, J. & Ghosh, P. 2015, 'A comment on Christoffersen, Jacobs, and Ornthanalai (2012), "Dynamic jump intensities and risk premiums: Evidence from S&P 500 returns and options"', JOURNAL OF FINANCIAL ECONOMICS, vol. 115, no. 1, pp. 210-214.
View/Download from: Publisher's site
Geweke, J. & Petrella, L. 2014, 'Likelihood-based Inference for Regular Functions with Fractional Polynomial Approximations', Journal of Econometrics, vol. 183, no. 1, pp. 22-30.
The paper demonstrates limitations in previous work using Müntz-Szász polynomial approximations for regular functions. It introduces an alternative set of fractional polynomial approximations not subject to these limitations. Using Weierstrass approximation theory it shows that the set of fractional polynomial approximations is dense on a Sobolev space of functions on a compact set. Imposing regularity conditions directly on the fractional polynomials produces pseudo-true approximations that converge rapidly to production functions having no exact representation as fractional polynomials. A small Monte Carlo study recovers this convergence in finite sample, and the results are promising for future development of an adequate sampling-theoretic distribution theory.
Frischknecht, B.D., Eckert, C., Geweke, J. & Louviere, J.J. 2014, 'A Simple Method to Estimate Preference Parameters for Individuals', International Journal of Research in Marketing, vol. 31, pp. 35-48.
View/Download from: Publisher's site
Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S.J. 2014, 'Financial competence, risk presentation and retirement portfolio preferences', Journal of Pension Economics and Finance, vol. 13, no. 1, pp. 27-61.
View/Download from: Publisher's site
Geweke, J. & Amisano, G. 2014, 'Analysis of variance for Bayesian inference', Econometric Reviews, vol. 33, no. 1-4, pp. 270-288.
View/Download from: Publisher's site
Durham, G. & Geweke, J. 2014, 'Adaptive sequential posterior simulators for massively parallel computing environments', Advances in Econometrics, vol. 34, pp. 1-44.
View/Download from: Publisher's site
Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
Durham, G. & Geweke, J. 2014, 'Improving Asset Price Prediction When All Models are False', Journal of Financial Econometrics, vol. 12, no. 2, pp. 278-306.
View/Download from: Publisher's site
Geweke, J. 2014, 'Review Essay on Charles F. Manski's Public Policy in an Uncertain World: Analysis and Decisions', JOURNAL OF ECONOMIC LITERATURE, vol. 52, no. 3, pp. 799-804.
View/Download from: Publisher's site
Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Thorp, S.J. & Satchell, S. 2012, 'Financial competence and expectations formation: Evidence from Australia', The Economic Record, vol. 88, no. 280, pp. 39-63.
View/Download from: UTS OPUS or Publisher's site
We study the financial competence of Australian retirement savers using self-assessed and quantified measures. Responses to financial literacy questions show large variation and compare poorly with some international surveys. Basic and sophisticated financial literacy vary significantly with most demographics, self-assessed financial competence, income, superannuation accumulation and net worth. General numeracy scores are largely constant across gender, age, higher education and income. Financial competence also significantly affects expectations of stock market performance. Using a discrete choice model, we show that individuals with a higher understanding of risk, diversification and financial assets are more likely to assign a probability to future financial crises rather than expressing uncertainty.
Geweke, J., Koop, G. & Paap, R. 2012, 'Introduction for the annals issue of the Journal of Econometrics on "Bayesian Models, Methods and Applications"', Journal of Econometrics, vol. 171, no. 2, pp. 99-100.
View/Download from: UTS OPUS or Publisher's site
This Annals issue of the Journal of Econometrics grew out of the European Seminar on Bayesian Econometrics (ESOBE), which was held at Erasmus University, Rotterdam on November 5-6, 2010. This conference was important for two reasons. First, it inaugurated ESOBE, which has become a successful annual conference that brings European and international Bayesians together. Second, it celebrated the retirement of Herman van Dijk after a long and successful career in Bayesian econometrics.
Geweke, J. & Amisano, G. 2012, 'Prediction With Misspecified Models', American Economic Review, vol. 102, no. 3, pp. 482-486.
View/Download from: UTS OPUS or Publisher's site
Many decision-makers in the public and private sectors routinely consult the implications of formal economic and statistical models in their work. Especially in large organizations and for important decisions, there are often competing models. Of course, no model under consideration is a literal representation of reality for the purposes at hand; more succinctly, no model is 'true', and different models focus on different aspects of the relevant environment. This fact can often be supported by formal econometric tests concluding that the models at hand are, indeed, misspecified in various dimensions.
Geweke, J. 2012, 'Nonparametric Bayesian modelling of monotone preferences for discrete choice experiments', Journal of Econometrics, vol. 171, no. 2, pp. 185-204.
View/Download from: UTS OPUS or Publisher's site
Discrete choice experiments are widely used to learn about the distribution of individual preferences for product attributes. Such experiments are often designed and conducted deliberately for the purpose of designing new products. There is a long-standing literature on nonparametric and Bayesian modelling of preferences for the study of consumer choice when there is a market for each product, but this work does not apply when such markets fail to exist as is the case with most product attributes. This paper takes up the common case in which attributes can be quantified and preferences over these attributes are monotone. It shows that monotonicity is the only shape constraint appropriate for a utility function in these circumstances. The paper models components of utility using a Dirichlet prior distribution and demonstrates that all monotone nondecreasing utility functions are supported by the prior. It develops a Markov chain Monte Carlo algorithm for posterior simulation that is reliable and practical given the number of attributes, choices and sample sizes characteristic of discrete choice experiments. The paper uses the algorithm to demonstrate the flexibility of the model in capturing heterogeneous preferences and applies it to a discrete choice experiment that elicits preferences for different auto insurance policies.
Geweke, J. & Amisano, G. 2011, 'Hierarchical Markov normal mixture models with applications to financial asset returns', Journal of Applied Econometrics, vol. 26, no. 1, pp. 1-29.
View/Download from: UTS OPUS or Publisher's site
Abstract: Motivated by the common problem of constructing predictive distributions for daily asset returns over horizons of one to several trading days, this article introduces a new model for time series. This model is a generalization of the Markov normal mixture model in which the mixture components are themselves normal mixtures, and it is a specific case of an artificial neural network model with two hidden layers. The article characterizes the implications of the model for time series in two ways. First, it derives the restrictions placed on the autocovariance function and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. Second, it uses the prior predictive distribution of the model to study the implications of the model for some interesting functions of asset returns. The article uses the model to construct predictive distributions of daily S&P 500 returns 1971-2005, US dollar -- UK pound returns 1972-1998, and one- and ten-year maturity bonds 1987-2006. It compares the performance of the model for these returns with ARCH and stochastic volatility models using the predictive likelihood function. The model's performance is about the same as its competitors for the bond returns, better than its competitors for the S&P 500 returns, and much better than its competitors for the dollar-pound returns. In- and out-of-sample validation exercises with predictive distributions identify some remaining deficiencies in the model and suggest potential improvements. The article concludes by using the model to form predictive distributions of one- to ten-day returns during volatile episodes for the S&P 500, dollar-pound and bond return series.
Geweke, J. & Jiang, Y. 2011, 'Inference and prediction in a multiple-structural-break model', Journal Of Econometrics, vol. 163, no. 2, pp. 172-185.
View/Download from: UTS OPUS or Publisher's site
This paper develops a new Bayesian approach to structural break modeling. The focus of the approach is the modeling of in-sample structural breaks and the forecasting of time series allowing for out-of-sample breaks. The model has several desirable features. …
Geweke, J. & Amisano, G. 2011, 'Optimal prediction pools', Journal Of Econometrics, vol. 164, no. 1, pp. 130-141.
View/Download from: UTS OPUS or Publisher's site
We consider the properties of weighted linear combinations of prediction models, or linear pools, evaluated using the log predictive scoring rule. Although exactly one model has limiting posterior probability, an optimal linear combination typically includes several models with positive weights. We derive several interesting results: for example, a model with positive weight in a pool may have zero weight if some other models are deleted from that pool. The results are illustrated using S&P 500 returns with six prediction models. In this example models that are clearly inferior by the usual scoring criteria have positive weights in optimal linear pools.
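The pooling setup in this abstract is straightforward to illustrate. Below is a minimal sketch with two hypothetical normal predictive densities and a grid search over the pool weight; the data-generating process and both candidate models are assumptions for the example, not the paper's six-model S&P 500 application:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "returns": mostly quiet, occasionally volatile, so that
# neither candidate predictive density below is correctly specified.
u = rng.random(5000)
y = np.where(u < 0.9, rng.normal(0, 1, 5000), rng.normal(0, 5, 5000))

def npdf(x, s):  # normal density with mean 0 and standard deviation s
    return np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2 * np.pi))

f_a, f_b = npdf(y, 1.0), npdf(y, 3.0)   # two candidate predictive densities

# Log predictive score of the linear pool w*f_a + (1-w)*f_b, on a weight grid.
grid = np.linspace(0.0, 1.0, 1001)
scores = [np.log(w * f_a + (1 - w) * f_b).mean() for w in grid]
w_star = grid[int(np.argmax(scores))]

# As in the abstract: the optimal pool puts positive weight on BOTH models
# and scores better than either model used alone.
print(w_star, max(scores), np.log(f_a).mean(), np.log(f_b).mean())
```

With a fat-tailed data process, the tight model is crushed on tail observations and the diffuse model is wasteful on quiet ones, so the log-score-optimal weight is interior, which is the paper's central observation.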
Geweke, J. & Amisano, G. 2010, 'Comparing And Evaluating Bayesian Predictive Distributions Of Asset Returns', International Journal of Forecasting, vol. 26, no. 2, pp. 216-230.
View/Download from: UTS OPUS or Publisher's site
Bayesian inference in a time series model provides exact out-of-sample predictive distributions that fully and coherently incorporate parameter uncertainty. This study compares and evaluates Bayesian predictive distributions from alternative models, using as an illustration five alternative models of asset returns applied to daily S&P 500 returns from the period 1976 through 2005. The comparison exercise uses predictive likelihoods and is inherently Bayesian. The evaluation exercise uses the probability integral transformation and is inherently frequentist. The illustration shows that the two approaches can be complementary, with each identifying strengths and weaknesses in models that are not evident using the other.
Geweke, J. 2010, 'Comment', International Journal of Forecasting, vol. 26, no. 2, pp. 435-438.
View/Download from: UTS OPUS or Publisher's site
The article by Zellner and Ando proposes methods for coping with the excess kurtosis that is often observed in disturbances in applications of the seemingly unrelated regressions (SUR) model. This is an important topic which is of particular relevance in
Ackerberg, D., Geweke, J. & Hahn, J. 2009, 'Comments on 'Convergence Properties of the Likelihood of Computed Dynamic Models'', Econometrica, vol. 77, no. 6, pp. 2009-2017.
View/Download from: UTS OPUS or Publisher's site
We show by counterexample that Proposition 2 in Fernández-Villaverde, Rubio-Ramírez, and Santos (Econometrica (2006), 74, 93-119) is false. We also show that even if their Proposition 2 were corrected, it would be irrelevant for parameter estimates. As a more constructive contribution, we consider the effects of approximation error on parameter estimation, and conclude that second order approximation errors in the policy function have at most second order effects on parameter estimates.
Geweke, J. & Keane, M. 2007, 'Smoothly mixing regressions', Journal of Econometrics, vol. 138, no. 1, pp. 252-290.
View/Download from: UTS OPUS or Publisher's site
This paper extends the conventional Bayesian mixture of normals model by permitting state probabilities to depend on observed covariates. The dependence is captured by a simple multinomial probit model. A conventional and rapidly mixing MCMC algorithm provides access to the posterior distribution at modest computational cost. This model is competitive with existing econometric models, as documented in the paper's illustrations. The first illustration studies quantiles of the distribution of earnings of men conditional on age and education, and shows that smoothly mixing regressions are an attractive alternative to non-Bayesian quantile regression. The second illustration models serial dependence in the S&P 500 return, and shows that the model compares favorably with ARCH models using out of sample likelihood criteria.
Geweke, J. 2007, 'Interpretation and Inference in Mixture Models: Simple MCMC Works', Computational Statistics and Data Analysis, vol. 51, no. 7, pp. 3529-3550.
View/Download from: UTS OPUS or Publisher's site
Abstract: The mixture model likelihood function is invariant with respect to permutation of the components of the mixture. If functions of interest are permutation sensitive, as in classification applications, then interpretation of the likelihood function requires valid inequality constraints and a very large sample may be required to resolve ambiguities. If functions of interest are permutation invariant, as in prediction applications, then there are no such problems of interpretation. Contrary to assessments in some recent publications, simple and widely used Markov chain Monte Carlo (MCMC) algorithms with data augmentation reliably recover the entire posterior distribution.
Geweke, J. 2007, 'Bayesian model comparison and validation', American Economic Review, vol. 97, no. 2, pp. 60-64.
View/Download from: UTS OPUS or Publisher's site
Bayesian econometrics provides a tidy theory and practical methods of comparing and combining several alternative, completely specified models for a common data set. It is always possible that none of the specified models describe important aspects of the data well. The investigation of this possibility, a process known as model validation or model specification checking, is an important part of applied econometric work. Bayesian theory and practice for model validation are less well developed. A well-established Bayesian literature argues that non-Bayesian methods are essential in model validation. This line of thought persists in Bayesian econometrics as well; the paper reviews these methods. The paper proposes an alternative, fully Bayesian method of model validation based on the concept of incomplete models, and argues that this method is also strategically advantageous in applied Bayesian econometrics.
Geweke, J. 2007, 'Comment', Econometric Reviews, vol. 26, no. 2-4, pp. 193-200.
View/Download from: Publisher's site
The article provides detailed and accurate illustrations of Bayesian analysis of DSGE models that are likely to be used increasingly in support of central bank policy making. These comments identify a dozen aspects of these methods, discussing how their application and improvement can contribute to effective support of policy.
Geweke, J., Groenen, P.J.E., Paap, R. & van Dijk, H.K. 2007, 'Computational techniques for applied econometric analysis of macroeconomic and financial processes', COMPUTATIONAL STATISTICS & DATA ANALYSIS, vol. 51, no. 7, pp. 3506-3508.
View/Download from: Publisher's site
Abrantes-Metz, R., Froeb, L., Geweke, J. & Taylor, C. 2006, 'A Variance Screen for Collusion', International Journal Of Industrial Organisation, vol. 24, no. 3, pp. 467-486.
View/Download from: UTS OPUS or Publisher's site
Abstract: In this paper, we examine price movements over time around the collapse of a bid-rigging conspiracy. While the mean decreased by sixteen percent, the standard deviations increased by over two hundred percent. We hypothesize that conspiracies in other industries would exhibit similar characteristics and search for "pockets" of low price variation as indicators of collusion in the retail gasoline industry in Louisville. We observe no such areas around Louisville in 1996-2002.
Geweke, J. 2004, 'Getting it Right: Joint Distribution Tests of Posterior Simulators', Journal of the American Statistical Association, vol. 99, no. 467, pp. 799-804.
View/Download from: UTS OPUS or Publisher's site
Abstract: Analytical or coding errors in posterior simulators can produce reasonable but incorrect approximations of posterior moments. This article develops simple tests of posterior simulators that detect both kinds of errors, and uses them to detect and correct errors in two previously published papers. The tests exploit the fact that a Bayesian model specifies the joint distribution of observables (data) and unobservables (parameters). There are two joint distribution simulators. The marginal conditional simulator draws unobservables from the prior and then observables conditional on unobservables. The successive-conditional simulator alternates between the posterior simulator and an observables simulator. Formal comparison of moment approximations of the two simulators reveals existing analytical or coding errors in the posterior simulator.
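The two joint distribution simulators described in this abstract can be sketched for a toy conjugate model where the exact posterior is known. In practice the "posterior step" below would be the posterior simulator under test; here it is exact, so the two simulators should agree within Monte Carlo error. The model and all values are assumptions for the illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
M = 100_000

# Toy model: theta ~ N(0,1) (prior), y | theta ~ N(theta, 1).
# Exact posterior: theta | y ~ N(y/2, 1/2).

# Marginal-conditional simulator: unobservables from the prior,
# then observables conditional on unobservables.
th1 = rng.normal(0, 1, M)
y1 = rng.normal(th1, 1)

# Successive-conditional simulator: alternate the posterior simulator
# with the observables simulator.
th2, y2 = np.empty(M), np.empty(M)
th = 0.0
for m in range(M):
    y = rng.normal(th, 1)                  # observables | unobservables
    th = rng.normal(y / 2, np.sqrt(0.5))   # posterior step: theta | y
    th2[m], y2[m] = th, y

# A correct posterior simulator makes both chains target the same joint
# distribution of (theta, y), so moment approximations should agree.
print(th1.mean(), th2.mean(), (th1 * y1).mean(), (th2 * y2).mean())
```

Introducing a deliberate bug in the posterior step (say, the wrong variance) would show up as a discrepancy in these moments, which is exactly how the tests in the article detect coding errors.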
Geweke, J. & Tanizaki, H. 2003, 'Note on the Sampling Distribution for the Metropolis-Hastings Algorithm', Communications in Statistics - Theory and Methods, vol. 32, pp. 775-789.
View/Download from: UTS OPUS or Publisher's site
Abstract: The Metropolis-Hastings algorithm has been important in the recent development of Bayes methods. This algorithm generates random draws from a target distribution utilizing a sampling (or proposal) distribution. This article compares the properties of three sampling distributions - the independence chain, the random walk chain, and the Taylored chain suggested by Geweke and Tanizaki (Geweke, J., Tanizaki, H. (1999). On Markov Chain Monte-Carlo methods for nonlinear and non-Gaussian state-space models. Communications in Statistics - Simulation and Computation 28(4):867-894; Geweke, J., Tanizaki, H. (2001). Bayesian estimation of state-space model using the Metropolis-Hastings algorithm within Gibbs sampling. Computational Statistics and Data Analysis 37(2):151-170).
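Of the three sampling distributions compared in this article, the random walk chain is the easiest to sketch. The target density and step size below are assumptions chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    # Standard normal target, up to an additive constant.
    return -0.5 * x * x

def rw_mh(n, step):
    """Random walk chain Metropolis-Hastings: propose x + step*N(0,1),
    accept with probability min(1, target(prop)/target(current))."""
    x, out, accepted = 0.0, np.empty(n), 0
    for i in range(n):
        prop = x + step * rng.normal()   # symmetric proposal
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x, accepted = prop, accepted + 1
        out[i] = x                       # rejected proposals repeat the current draw
    return out, accepted / n

draws, acc = rw_mh(50_000, 2.4)   # step of roughly 2.4 x target std is a common rule of thumb
print(draws.mean(), draws.var(), acc)
```

An independence chain differs only in drawing `prop` from a fixed proposal density (with the acceptance ratio adjusted accordingly); the article's comparison turns on how each choice trades off acceptance rate against the autocorrelation of the resulting draws.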
Geweke, J., Gowrisankaran, G. & Town, R. 2003, 'Bayesian Inference for Hospital Quality in a Selection Model', Econometrica, vol. 71, pp. 1215-1238.
View/Download from: UTS OPUS or Publisher's site
This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and nonrandom selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which distance between the patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of the highest quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals, whereby patients with a high unobserved severity of illness are disproportionately admitted to high quality hospitals. Consequently a conventional probit model leads to inferences about quality that are markedly different from those in this study's selection model.
Geweke, J. & Durham, G. 2003, 'Iterative and Recursive Estimation in Structured Non-Adaptive Models', Journal of Business & Economic Statistics, vol. 21, no. 4, pp. 490-492.
This was an invited paper on which the journal solicited comments; this entry is a comment on that paper, beginning on page 490 at the end of the article.
Geweke, J. & Martin, D. 2002, 'Pitfalls in Drawing Policy Conclusions from Retrospective Survey Data: The Case of Advertising and Underage Smoking', Journal of Risk and Uncertainty, vol. 83, pp. 1181-1186.
Abstract: Measuring the impact of potentially controllable factors on the willingness of youth to undertake health risks is important to informed public health policy decisions. Typically the only data linking these factors with risk-taking behavior are retrospective. This study demonstrates, by means of a recent example, that there can be serious pitfalls in using even longitudinal retrospective data to draw conclusions about causal relations between potentially controllable factors and risk-taking behavior.
Geweke, J. 2002, 'Commentary: Econometric issues in using the AHEAD Panel', Journal of Econometrics, vol. 112, no. 1, pp. 115-120.
View/Download from: UTS OPUS or Publisher's site
This study provides an illuminating perspective on the relation between health and socio-economic status. It is notable in meeting, head on, various technical but critical issues that arise in using the AHEAD panel to address issues of causation between health and socio-economic status (SES). This panel provides multiple measures of both health and SES, and there is no prior consensus reduction of these many dimensions. Household wealth is the candidate summary measure of economic status, but as users of self-reported wealth know and the authors lucidly demonstrate, severe measurement errors raise a host of methodological problems of their own. These comments focus on the way the authors have addressed these and some of the other technical issues that have to be confronted in one way or another in order to address the central issues.
Geweke, J. 2001, 'Bayesian Econometrics and Forecasting', Journal of Econometrics, vol. 100, no. 1, pp. 11-15.
View/Download from: UTS OPUS or Publisher's site
Abstract: Contemporary Bayesian forecasting methods draw on foundations in subjective probability and preferences laid down in the mid-twentieth century, and utilize numerical methods developed since that time in their implementation. These methods unify the tasks of forecasting and model evaluation. They also provide tractable solutions for problems that prove difficult when approached using non-Bayesian methods. These advantages arise from the fact that the conditioning in Bayesian probability forecasting is the same as the conditioning in the underlying decision problems.
Geweke, J. 2001, 'Bayesian inference and posterior simulators', Canadian Journal of Agricultural Economics, vol. 49, no. 3, pp. 313-325.
View/Download from: UTS OPUS or Publisher's site
Abstract: Recent advances in simulation methods have made possible the systematic application of Bayesian methods to support decision making with econometric models. This paper outlines the key elements of Bayesian investigation, and the simulation methods applied to bring them to bear in application.
Geweke, J. & McCausland, W.J. 2001, 'Bayesian Specification Analysis in Econometrics', American Journal of Agricultural Economics, vol. 83, pp. 1181-1186.
View/Download from: Publisher's site
Geweke, J. & Tanizaki, H. 2001, 'Bayesian Estimation of State-Space Models Using the Metropolis-Hastings Algorithm within Gibbs Sampling', Computational Statistics and Data Analysis, vol. 37, no. 2, pp. 151-170.
View/Download from: UTS OPUS
Abstract: In this paper, an attempt is made to show a general solution to nonlinear and/or non-Gaussian state-space modeling in a Bayesian framework, which corresponds to an extension of Carlin et al. (J. Amer. Statist. Assoc. 87(418) (1992) 493-500) and Carter and Kohn (Biometrika 81(3) (1994) 541-553; Biometrika 83(3) (1996) 589-601). Using the Gibbs sampler and the Metropolis-Hastings algorithm, an asymptotically exact estimate of the smoothing …
Geweke, J. 2001, 'A Note on Some Limitations of CRRA Utility', Economics Letters, vol. 71, no. 3, pp. 341-345.
View/Download from: UTS OPUS
Abstract: In a standard environment for choice under uncertainty with constant relative risk aversion (CRRA), the existence of expected utility is fragile with respect to changes in the distributions of random variables, changes in prior information, or the assumption of rational expectations.
Geweke, J. & Keane, M. 2000, 'An Empirical Analysis of Male Income Dynamics in the PSID: 1968-1989', Journal of Econometrics, vol. 96, no. 2, pp. 293-356.
View/Download from: Publisher's site
Geweke, J., Rust, J. & Van Dijk, H.K. 2000, 'Introduction - Inference and decision making', Journal of Applied Econometrics, vol. 15, no. 6, pp. 545-546.
View/Download from: Publisher's site
Geweke, J. & Tanizaki, H. 1999, 'On Markov Chain Monte Carlo Methods for Nonlinear and Non-Gaussian State-Space Models', Communications in Statistics - Simulation and Computation, vol. 28, pp. 867-894.
View/Download from: Publisher's site
Abstract: In this paper, a nonlinear and/or non-Gaussian smoother utilizing Markov chain Monte Carlo Methods is proposed, where the measurement and transition equations are specified in any general formulation and the error terms in the state-space model are not necessarily normal. The random draws are directly generated from the smoothing densities. For random number generation, the Metropolis-Hastings algorithm and the Gibbs sampling technique are utilized. The proposed procedure is very simple and easy for programming, compared with the existing nonlinear and non-Gaussian smoothing techniques. Moreover, taking several candidates of the proposal density function, we examine precision of the proposed estimator.
Geweke, J. 1999, 'Using Simulation Methods for Bayesian Econometric Models: Inference, Development and Communication', Econometric Reviews, vol. 18, no. 1, pp. 1-73.
View/Download from: UTS OPUS
Abstract: This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.
Geweke, J. 1999, 'Power of Tests in Binary Response Models: Comment', Econometrica, vol. 67, pp. 423-425.
View/Download from: Publisher's site
Geweke, J. 1998, 'Real and Spurious Long Memory Properties of Stock Market Data.', Journal of Business & Economic Statistics, vol. 16, pp. 269-271.
Geweke, J. & Petrella, L. 1998, 'Prior Density Ratio Class Robustness in Econometrics', Journal of Business & Economic Statistics, vol. 16, pp. 469-478.
View/Download from: Publisher's site
Abstract: This paper provides a general and efficient method for computing density ratio class bounds on posterior moments, given the output of a posterior simulator. It shows how density ratio class bounds for posterior odds ratios may be formed in many situations, also on the basis of posterior simulator output. The computational method is used to provide density ratio class bounds in two economic models. It is found that the exact bounds are approximated poorly by their asymptotic approximation, when the posterior distribution of the function of interest is skewed. It is also found that the posterior odds ratios display substantial variation within the density ratio class, in ways that cannot be anticipated by the asymptotic approximation.
Geweke, J., Keane, M. & Runkle, D. 1997, 'Statistical Inference in the Multinomial Multiperiod Probit Model', Journal of Econometrics, vol. 80, no. 1, pp. 125-165.
View/Download from: Publisher's site
Abstract: Statistical inference in multinomial multiperiod probit models has been hindered in the past by the high dimensional numerical integrations necessary to form the likelihood functions, posterior distributions, or moment conditions in these models. We describe three alternative estimators, implemented using simulation-based approaches to inference, that circumvent the integration problem: posterior means computed using Gibbs sampling and data augmentation (GIBBS), simulated maximum likelihood (SML) estimation using the GHK probability simulator, and method of simulated moment (MSM) estimation using GHK. We perform a set of Monte-Carlo experiments to compare the sampling distributions of these estimators. Although all three estimators perform reasonably well, some important differences emerge. Our most important finding is that, holding simulation size fixed, the relative and absolute performance of the classical methods, especially SML, gets worse when serial correlation in disturbances is strong. In data sets with an AR(1) parameter of 0.50, the RMSEs for SML and MSM based on GHK with 20 draws exceed those of GIBBS by 9% and 0%, respectively. But when the AR(1) parameter is 0.80, the RMSEs for SML and MSM based on 20 draws exceed those of GIBBS by 79% and 37%, respectively, and the number of draws needed to reduce the RMSEs to within 10% of GIBBS are 160 and 80 respectively. Also, the SML estimates of serial correlation parameters exhibit significant downward bias. Thus, while conventional wisdom suggests that 20 draws of GHK is `enough' to render the bias and noise induced by simulation negligible, our results suggest that much larger simulation sizes are needed when serial correlation in disturbances is strong.
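As background for the entry above: the GHK simulator it evaluates approximates a multivariate normal rectangle probability by a product of univariate truncated-normal draws. A minimal illustrative sketch, not the paper's own code; the bivariate check case is invented for the example:

```python
import random
from statistics import NormalDist

random.seed(5)
nd = NormalDist()  # standard normal cdf / inverse cdf

def ghk_probability(b, L, n_draws=20_000):
    """GHK recursive simulator for P(X_1 < b_1, ..., X_m < b_m),
    X ~ N(0, Sigma), with L the lower-triangular Cholesky factor of Sigma."""
    m = len(b)
    total = 0.0
    for _ in range(n_draws):
        e, p = [], 1.0
        for j in range(m):
            upper = (b[j] - sum(L[j][k] * e[k] for k in range(j))) / L[j][j]
            pj = nd.cdf(upper)
            p *= pj
            # draw e_j from a standard normal truncated above at `upper`
            u = max(random.random(), 1e-12)
            e.append(nd.inv_cdf(u * pj))
        total += p
    return total / n_draws

# independent bivariate check: P(X1 < 0, X2 < 0) = 0.25
L_id = [[1.0, 0.0], [0.0, 1.0]]
p_hat = ghk_probability([0.0, 0.0], L_id)
```

With an identity covariance the recursion reproduces the independent-case probability exactly; correlation enters only through the off-diagonal Cholesky terms.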
Geweke, J. & Zhou, G. 1996, 'Measuring the Pricing Error of the Arbitrage Pricing Theory', Review of Financial Studies, vol. 9, no. 2, pp. 557-587.
View/Download from: UTS OPUS
Abstract: This article provides an exact Bayesian framework for analyzing the arbitrage pricing theory (APT). Based on the Gibbs sampler, we show how to obtain the exact posterior distributions for functions of interest in the factor model. In particular, we propose a measure of the APT pricing deviations and obtain its exact posterior distribution. Using monthly portfolio returns grouped by industry and market capitalization, we find that there is little improvement in reducing the pricing errors by including more factors beyond the first one.
Geweke, J. 1996, 'Bayesian Reduced Rank Regression in Econometrics', Journal of Econometrics, vol. 75, no. 1, pp. 121-146.
View/Download from: Publisher's site
Abstract: The reduced rank regression model arises repeatedly in theoretical and applied econometrics. To date the only general treatments of this model have been frequentist. This paper develops general methods for Bayesian inference with noninformative reference priors in this model, based on a Markov chain sampling algorithm, and procedures for obtaining predictive odds ratios for regression models with different ranks. These methods are used to obtain evidence on the number of factors in a capital asset pricing model.
Geweke, J. & Runkle, D. 1995, 'A Fine Time for Monetary Policy?', Federal Reserve Bank of Minneapolis Quarterly Review, vol. 19, no. 1, pp. 18-31.
View/Download from: UTS OPUS
Abstract: Almost everyone would agree--even we in the Federal Reserve System--that monetary policy can be improved. But improving it requires accurate empirical descriptions of the current policy and the relationship between that policy and the economic variables policymakers care about. With those descriptions, we could, conceivably, predict how economic outcomes would change under alternative policies and hence find policies that lead to better economic outcomes. The first requirement of this policymaking problem is policy identification, and it is the focus of this study. Policy identification entails a specification of the instrument the Federal Reserve controls and a description of how that instrument is set based on information available when a policy decision is made. Because policy identification is a crucial step in the search for improved monetary policy, it has received much attention in the literature.
Geweke, J. 1994, 'Bayesian Analysis of Stochastic Volatility Models', Journal of Business & Economic Statistics, vol. 12, pp. 397-399.
Geweke, J. 1994, 'Priors for Macroeconomic Time Series and Their Application', Econometric Theory, vol. 10, pp. 609-632.
View/Download from: Publisher's site
Abstract: This paper takes up Bayesian inference in a general trend stationary model for macroeconomic time series with independent Student-t disturbances. The model is linear in the data, but non-linear in the parameters. An informative but nonconjugate family of prior distributions for the parameters is introduced, indexed by a single parameter which can be readily elicited. The main technical contribution is the construction of posterior moments, densities, and odds ratios using a six-step Gibbs sampler. Mappings from the index parameter of the family of prior distribution to posterior moments, densities, and odds ratios are developed for several of the Nelson-Plosser time series. These mappings show that the posterior distribution is not even approximately Gaussian, and indicate the sensitivity of the posterior odds ratio in favor of difference stationarity to the choice of prior distribution.
Geweke, J., Keane, M. & Runkle, D. 1994, 'Alternative Computational Approaches to Statistical Inference in the Multinomial Probit Model', Review of Economics and Statistics, vol. 76, no. 4, pp. 609-632.
View/Download from: UTS OPUS or Publisher's site
Abstract: This research compares several approaches to inference in the multinomial probit model, based on two Monte Carlo experiments for a seven choice model. The methods compared are the simulated maximum likelihood estimator using the GHK recursive probability simulator
Horowitz, J.L., Bolduc, D., Divakar, S., Geweke, J., Gönül, F., Hajivassiliou, V., Koppelman, F.S., Keane, M., Matzkin, R., Rossi, P. & Ruud, P. 1994, 'Advances in Random Utility Models: Report of the Workshop on Advances in Random Utility Models, Duke Invitational Symposium on Choice Modeling Behavior', Marketing Letters, vol. 5, no. 4, pp. 311-322.
View/Download from: Publisher's site
Abstract: In recent years, major advances have taken place in three areas of random utility modeling: (1) semiparametric estimation, (2) computational methods for multinomial probit models, and (3) computational methods for Bayesian estimation. This paper summarizes these developments and discusses their implications for practice.
Geweke, J. & Terui, N. 1993, 'Bayesian Threshold Autoregressive Models for Nonlinear Time Series', Journal of Time Series Analysis, vol. 14, pp. 441-445.
Abstract: This paper provides a Bayesian approach to statistical inference in the threshold autoregressive model for time series. The exact posterior distribution of the delay and threshold parameters is derived, as is the multi-step-ahead predictive density. The proposed methods are applied to the Wolf sunspot and Canadian lynx data sets.
Geweke, J. 1993, 'Forecasting Time Series with Common Seasonal Patterns', Journal of Econometrics, vol. 55, pp. 201-202.
Geweke, J. 1993, 'Discussion on the Gibbs Sampler and Other Markov Chain Monte Carlo Methods', Journal of the Royal Statistical Society, Series B (Methodological), vol. 55, p. 74.
Clifford, P., Jennison, C., Wakefield, J., Phillips, D., Frigessi, A., Gray, A., Lawson, A., Forster, J., Ramgopal, P., Arslan, O., Constable, P., Kent, J., Wolff, R., Harding, E., Middleton, R., Diggle, P., Aykroyd, R., Berzuini, C., Brewer, M. & Aitken, C. 1993, 'Discussion on the Meeting on the Gibbs Sampler and Other Markov Chain Monte Carlo Methods', Journal of the Royal Statistical Society, Series B (Methodological), vol. 55, no. 1, pp. 53-102.
Geweke, J. 1993, 'Bayesian Treatment of the Independent Student-t Linear Model', Journal of Applied Econometrics, vol. 8, pp. S19-S40.
View/Download from: Publisher's site
Geweke, J. 1993, 'Remarks on My Term at JBES', Journal of Business & Economic Statistics, vol. 11, no. 4, p. 427.
Geweke, J. 1992, 'Inference and Prediction in the Presence of Uncertainty and Determinism', Statistical Science, vol. 7, pp. 94-101.
Geweke, J. 1991, 'Generic, Algorithmic Approaches to Monte Carlo Integration in Bayesian Inference', Contemporary Mathematics, vol. 115, pp. 117-135.
Abstract: A program of research in generic, algorithmic approaches to Monte Carlo integration in Bayesian inference is summarized. The goal of this program is the development of a widely applicable family of solutions to Bayesian multiple integration problems that obviates the need for case-by-case treatment of arcane problems in numerical analysis. The essentials of the Bayesian inference problem, with some reference to econometric applications, are set forth. Fundamental results in Monte Carlo integration are derived and their current implementation in software is described. Potential directions for fruitful new research are outlined.
Geweke, J., Barnett, W. & Wolfe, M. 1991, 'Seminonparametric Bayesian Estimation of the Asymptotically Ideal Production Model', Journal of Econometrics, vol. 49, pp. 5-50.
View/Download from: Publisher's site
Abstract: Recently it has been shown that seminonparametric methods can be used to produce high-quality approximations to a firm's technology. Unlike the local approximations provided by the conventional class of `flexible functional forms', seminonparametric methods generate global spans within large classes of functions. However, that approach usually spans a much larger space than the neoclassical function space relevant to most production modeling. An exception is the asymptotically ideal model (AIM) generated from the Müntz-Szász series expansion. Since every basis function in that expansion is within the neoclassical function space, a straightforward method exists for imposing neoclassical regularity, when all factors are substitutes. Since the relevant constraints are inequality restrictions, we implement the approach using Bayesian methods to avoid the problems of sampling distribution truncation that would occur from sampling theoretic methods. We further discuss the relevant extensions that would permit complementary factors, nonconstant returns to scale, and technological change.
Geweke, J., Matchar, D., Simel, D. & Feussner, J. 1990, 'A Bayesian Method for Evaluating Medical Test Operating Characteristics When Some Patients Fail to be Diagnosed by the Reference Standard.', Medical Decision Making, vol. 10, pp. 114-115.
Matchar, D., Simel, D., Geweke, J. & Feussner, J. 1990, 'A Bayesian Method for Evaluating Medical Test Operating Characteristics When Some Patients' Conditions Fail to be Diagnosed by the Reference Standard', Medical Decision Making, vol. 10, no. 2, pp. 102-115.
View/Download from: Publisher's site
Abstract: The evaluation of a diagnostic test when the reference standard fails to establish a diagnosis in some patients is a common and difficult analytical problem. Conventional operating characteristics, derived from a 2 x 2 matrix, require that tests have only positive or negative results, and that disease status be designated definitively as present or absent. Results can be displayed in a 2 x 3 matrix, with an additional column for undiagnosed patients, when it is not possible always to ascertain the disease status definitively. The authors approach this problem using a Bayesian method for evaluating the 2 x 3 matrix in which test operating characteristics are described by a joint probability density function. They show that one can derive this joint probability density function of sensitivity and specificity empirically by applying a sampling algorithm. The three-dimensional histogram resulting from this sampling procedure approximates the true joint probability density function for sensitivity and specificity. Using a clinical example, the authors illustrate the method and demonstrate that the joint probability density function for sensitivity and specificity can be influenced by assumptions used to interpret test results in undiagnosed patients. This Bayesian method represents a flexible and practical solution to the problem of evaluating test sensitivity and specificity when the study group includes patients whose disease could not be diagnosed by the reference standard. Keywords: Bayesian analysis; test operating characteristics; probability density functions. (Med Decis Making 1990;10:102-111)
Geweke, J. 1989, 'Bayesian Inference in Econometric Models Using Monte Carlo Integration', Econometrica, vol. 57, no. 6, pp. 1317-1339.
View/Download from: Publisher's site
Abstract: Methods for the systematic application of Monte Carlo integration with importance sampling to Bayesian inference in econometric models are developed. Conditions under which the numerical approximation of a posterior moment converges almost surely to the true value as the number of Monte Carlo replications increases, and the numerical accuracy of this approximation may be assessed reliably, are set forth. Methods for the analytical verification of these conditions are discussed. Importance sampling densities are derived from multivariate normal or Student t approximations to local behavior of the posterior density at its mode. These densities are modified by automatic rescaling along each axis. The concept of relative numerical efficiency is introduced to evaluate the adequacy of a chosen importance sampling density. The practical procedures based on these innovations are illustrated in two different models.
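A toy sketch of the approach this abstract describes: self-normalised importance sampling for a posterior moment, with an effective-sample-size approximation standing in for the paper's relative numerical efficiency. The posterior kernel and importance density below are invented for illustration:

```python
import math
import random

random.seed(0)

def posterior_mean_is(log_post, q_sample, log_q, g, n=50_000):
    """Self-normalised importance sampling estimate of E[g(theta) | data],
    with an effective-sample-size approximation to the relative
    numerical efficiency of the chosen importance density."""
    draws = [q_sample() for _ in range(n)]
    logw = [log_post(t) - log_q(t) for t in draws]
    shift = max(logw)
    w = [math.exp(lw - shift) for lw in logw]       # stabilised weights
    sw = sum(w)
    est = sum(wi * g(t) for wi, t in zip(w, draws)) / sw
    ess = sw * sw / sum(wi * wi for wi in w)        # effective sample size
    rne = ess / n                                   # 1.0 = iid posterior draws
    return est, rne

# invented example: posterior kernel N(1, 1), importance density N(0, 2)
est, rne = posterior_mean_is(
    log_post=lambda t: -0.5 * (t - 1.0) ** 2,
    q_sample=lambda: random.gauss(0.0, 2.0),
    log_q=lambda t: -0.125 * t * t,
    g=lambda t: t,
)
```

The kernels need only be known up to a constant, since the weights are normalised before the moment is formed.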
Geweke, J. 1989, 'Exact Predictive Densities in Linear Models with ARCH Disturbances', Journal of Econometrics, vol. 40, pp. 63-86.
View/Download from: Publisher's site
Abstract: It is shown how exact predictive densities may be formed in the ARCH linear model by means of Monte Carlo integration with importance sampling. Several improvements in computational efficiency over earlier implementations of this procedure are developed, including use of the exact likelihood function rather than an asymptotic approximation to construct the importance sampling distribution, and antithetic acceleration of convergence. A numerical approach to the formulation of posterior odds ratios and the combination of non-nested models is also introduced. These methods are applied to daily quotations of closing stock prices. Forecasts are formulated using linear models, ARCH linear models and an integrated model constructed from the posterior probabilities of the respective models. The use of the exact predictive density in a decision-theoretic context is illustrated by deriving the optimal day-to-day portfolio adjustments of a trader with constant relative risk aversion.
Geweke, J. 1989, 'Sensitivity Analysis of Seasonal Adjustments: Empirical Case Studies', Journal of the American Statistical Association, vol. 84, pp. 28-30.
Carlin, J., Dempster, A., Pierce, D., Bell, W., Cleveland, W., Watson, M. & Geweke, J. 1989, 'Sensitivity Analysis of Seasonal Adjustments: Empirical Case Studies - Comments', Journal of the American Statistical Association, vol. 84, no. 405, pp. 6-30.
View/Download from: Publisher's site
Geweke, J. 1988, 'Antithetic Acceleration of Monte Carlo Integration in Bayesian Inference', Journal of Econometrics, vol. 38, no. 1-2, pp. 73-90.
View/Download from: UTS OPUS or Publisher's site
Abstract: It is proposed to sample antithetically rather than randomly from the posterior density in Bayesian inference using Monte Carlo integration. Conditions are established under which the number of replications required with antithetic sampling relative to the number required with random sampling is inversely proportional to sample size, as sample size increases. The result is illustrated in an experiment using a bivariate vector autoregression.
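The idea in this abstract can be illustrated in a few lines: pair each posterior draw z with its mirror image -z, so odd components of the function of interest cancel within pairs. A hypothetical example, not the paper's experiment:

```python
import random

random.seed(1)

def plain_mc(f, n):
    """Crude Monte Carlo estimate of E[f(Z)], Z ~ N(0, 1)."""
    return sum(f(random.gauss(0.0, 1.0)) for _ in range(n)) / n

def antithetic_mc(f, n_pairs):
    """Antithetic estimate: average f over the mirrored pairs (z, -z)."""
    total = 0.0
    for _ in range(n_pairs):
        z = random.gauss(0.0, 1.0)
        total += 0.5 * (f(z) + f(-z))
    return total / n_pairs

# this f has only odd nonlinearity, so the antithetic pairs cancel the
# Monte Carlo error almost entirely; the true mean is 2.0
f = lambda z: 2.0 + 3.0 * z + 0.1 * z * z * z
plain = plain_mc(f, 10_000)
anti = antithetic_mc(f, 5_000)
```

More generally antithetic pairing removes the linear part of f, which is why its advantage grows with sample size when the posterior concentrates and f becomes locally linear.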
Geweke, J. 1988, 'An Application of Operational-Subjective Statistical Methods to Rational Expectations', Journal of Business & Economic Statistics, vol. 6, pp. 465-466.
Geweke, J. 1988, 'Operational Bayesian Methods in Econometrics', Journal of Economic Perspectives, vol. 2, pp. 159-166.
Geweke, J. 1988, 'Employment Discrimination and Statistical Science', Statistical Science, vol. 3, pp. 188-189.
Geweke, J. 1988, 'Checks of Model Adequacy for Univariate Time Series Models and Their Application to Econometric Relationships.', Econometric Reviews, vol. 7, no. 1, pp. 59-62.
Geweke, J. 1988, 'The Secular and Cyclical Behavior of Real GDP in Nineteen OECD Countries, 1957-1983', Journal of Business & Economic Statistics, vol. 6, pp. 479-486.
Abstract: Log per capita real gross domestic product is modeled as a third-order autoregression with a pair of complex roots whose amplitude is smaller than the amplitude of the real root. The behavior of this time series is interpreted in terms of these two amplitudes, the periodicity of the complex roots, and the standard deviation of the disturbance. Restrictions are evaluated and inference is conducted using the likelihood principle, applying Monte Carlo integration with importance sampling. These Bayesian procedures efficiently cope with restrictions that are awkward taking a classical approach. We find very little difference in the amplitudes of real roots between countries and of complex roots relative to within-country uncertainty. There are some substantial differences in the periodicities of complex roots, and the greatest differences between countries are found in the standard deviation of the disturbance.
Geweke, J. & Froeb, L. 1987, 'Long Run Competition in the U.S. Aluminum Industry', International Journal of Industrial Organization, vol. 5, pp. 67-78.
View/Download from: Publisher's site
Abstract: A methodology for examining dynamic structure-performance relationships in a single industry is proposed and illustrated. Implications of long run competitive behavior for a simple simultaneous equations model of structure and performance are derived and tested using recently developed methods for the interpretation of economic time series. It is concluded that the structure and performance in the U.S. aluminum industry in the postwar period conform well with the hypothesis that the primary aluminum market was competitive in the long run.
Geweke, J., Marshall, R. & Zarkin, G. 1986, 'Mobility Indices in Continuous Time Markov Chains', Econometrica, vol. 54, pp. 1407-1423.
View/Download from: Publisher's site
Abstract: The axiomatic derivation of mobility indices for first-order Markov chain models in discrete time is extended to continuous-time models. Many of the logical inconsistencies among axioms noted in the literature for the discrete time models do not arise for continuous time models. It is shown how mobility indices in continuous time Markov chains may be estimated from observations at two points in time. Specific attention is given to the case in which the states are fractiles, and an empirical example is presented.
Geweke, J., Marshall, R. & Zarkin, G. 1986, 'Exact Inference for Continuous Time Markov Chains', Review of Economic Studies, vol. 53, pp. 653-669.
View/Download from: Publisher's site
Abstract: Methods for exact Bayesian inference under a uniform diffuse prior are set forth for the continuous time homogeneous Markov chain model. It is shown how the exact posterior distribution of any function of interest may be computed using Monte Carlo integration. The solution handles the problems of embeddability in a very natural way, and provides (to our knowledge) the only solution that systematically takes this problem into account. The methods are illustrated using several sets of data.
Geweke, J. 1986, 'Exact Inference in the Inequality Constrained Normal Linear Regression Model', Journal of Applied Econometrics, vol. 1, pp. 127-141.
Abstract: Inference in the inequality constrained normal linear regression model is approached as a problem in Bayesian inference, using a prior that is the product of a conventional uninformative distribution and an indicator function representing the inequality constraints. The posterior distribution is calculated using Monte Carlo numerical integration, which leads directly to the evaluation of expected values of functions of interest. This approach is compared with others that have been proposed. Three empirical examples illustrate the utility of the proposed methods using an inexpensive 32-bit microcomputer.
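A minimal sketch of the strategy this abstract describes, with an invented univariate posterior: draws from the unconstrained posterior are filtered through the indicator representing the inequality constraint, and constrained moments are computed from the surviving draws.

```python
import random

random.seed(2)

# Suppose the unconstrained posterior for a slope coefficient is
# beta | data ~ N(0, 1) (an invented conjugate setup).  The inequality
# restriction beta >= 0 enters the prior only through an indicator
# function, so constrained posterior moments are sample moments over
# the draws that satisfy the restriction.
draws = [random.gauss(0.0, 1.0) for _ in range(200_000)]
kept = [d for d in draws if d >= 0.0]              # apply the indicator

p_constraint = len(kept) / len(draws)              # posterior P(beta >= 0)
post_mean = sum(kept) / len(kept)                  # E[beta | beta >= 0, data]
# half-normal check: P = 0.5 and E = sqrt(2/pi), about 0.798
```

The acceptance rate itself is informative: it is the posterior probability that the constraint holds, and a very low rate signals that the restriction is at odds with the data.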
Geweke, J. 1986, 'The Superneutrality of Money in the United States: An Interpretation of the Evidence', Econometrica, vol. 54, pp. 1-22.
View/Download from: Publisher's site
Abstract: Structural and stochastic neutrality have refutable implications for aggregate economic time series only in conjunction with other maintained hypotheses. Simple and commonly employed maintained hypotheses lead to restrictions on measures of feedback and their decomposition by frequency. These restrictions also suggest an empirical interpretation of the notional long and short runs. It is found that a century of annual U.S. data, and postwar monthly data, consistently support structural superneutrality of money with respect to output and the real rate of return and consistently reject its superneutrality with respect to velocity. A quantitative characterization of the long run is suggested.
Geweke, J. 1986, 'Modeling Conditional Variance', Econometric Reviews, vol. 5, no. 1, pp. 57-61.
Geweke, J. 1985, 'Macroeconomic Modeling and the Theory of the Representative Agent', American Economic Review, vol. 75, pp. 206-210.
Geweke, J. & Porter-Hudak, S. 1984, 'The Estimation and Application of Long Memory Time Series Models', Journal of Time Series Analysis, vol. 4, pp. 221-238.
Abstract: The definitions of fractional Gaussian noise and integrated (or fractionally differenced) series are generalized, and it is shown that the two concepts are equivalent. A new estimator of the long memory parameter in these models is proposed, based on the simple linear regression of the log periodogram on a deterministic regressor. The estimator is the ordinary least squares estimator of the slope parameter in this regression, formed using only the lowest frequency ordinates of the log periodogram. Its asymptotic distribution is derived, from which it is evident that the conventional interpretation of these least squares statistics is justified in large samples. Using synthetic data the asymptotic theory proves to be reliable in samples of 50 observations or more. For three postwar monthly economic time series, the estimated integrated series model provides more reliable out-of-sample forecasts than do more conventional procedures.
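The estimator proposed in this abstract (often called the GPH estimator) can be sketched directly: form the periodogram at the lowest Fourier frequencies, regress its log on log(4 sin^2(w/2)), and take minus the OLS slope. An illustrative implementation with an invented white-noise check:

```python
import cmath
import math
import random

random.seed(3)

def gph_d(x, m=None):
    """Log-periodogram regression estimate of the long-memory parameter d,
    using the m lowest Fourier frequencies (here m = sqrt(n) by default)."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)
    mean = sum(x) / n
    xs = [v - mean for v in x]
    logs_i, logs_r = [], []
    for j in range(1, m + 1):
        w = 2.0 * math.pi * j / n
        dft = sum(v * cmath.exp(-1j * w * t) for t, v in enumerate(xs))
        periodogram = abs(dft) ** 2 / (2.0 * math.pi * n)
        logs_i.append(math.log(periodogram))               # regressand
        logs_r.append(math.log(4.0 * math.sin(w / 2.0) ** 2))  # regressor
    rbar = sum(logs_r) / m
    ibar = sum(logs_i) / m
    slope = (sum((r - rbar) * (i - ibar) for r, i in zip(logs_r, logs_i))
             / sum((r - rbar) ** 2 for r in logs_r))
    return -slope   # d is minus the OLS slope

# white noise has d = 0, so the estimate should sit near zero
d_hat = gph_d([random.gauss(0.0, 1.0) for _ in range(1024)])
```

Only the lowest frequencies enter the regression because the log-periodogram relationship holds asymptotically near frequency zero; the choice of m trades bias against variance.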
Geweke, J. 1984, 'Measures of Conditional Linear Dependence and Feedback', Journal of the American Statistical Association, vol. 79, pp. 907-915.
View/Download from: Publisher's site
Abstract: Measures of linear dependence and feedback for two multiple time series conditional on a third are defined. The measure of conditional linear dependence is the sum of linear feedback from the first to the second conditional on the third, linear feedback from the second to the first conditional on the third, and instantaneous linear feedback between the first and second series conditional on the third. The measures are non-negative and may be expressed in terms of measures of unconditional feedback between various combinations of the three series. The measures of conditional linear feedback can be additively decomposed by frequency. Estimates of these measures are straightforward to compute, and their distribution can be routinely approximated by bootstrap methods. An empirical example involving real output, money, and interest rates is presented.
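The conditional measures above are expressed in terms of unconditional feedback measures, and that building block is easy to sketch: the linear feedback from y to x is the log ratio of the innovation variance of x given its own past to that given the past of both series. A hypothetical simulated example, not the paper's data:

```python
import math
import random

random.seed(4)

def resid_var(y, X):
    """Residual variance from OLS of y on X plus an intercept, solved by
    Gaussian elimination on the normal equations (the Gram matrix is
    positive definite here, so no pivoting is needed)."""
    n = len(y)
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    A = [[sum(row[a] * row[b] for row in rows) for b in range(k)]
         for a in range(k)]
    c = [sum(rows[i][a] * y[i] for i in range(n)) for a in range(k)]
    for p in range(k):                      # forward elimination
        for q in range(p + 1, k):
            f = A[q][p] / A[p][p]
            for b in range(p, k):
                A[q][b] -= f * A[p][b]
            c[q] -= f * c[p]
    beta = [0.0] * k
    for p in range(k - 1, -1, -1):          # back substitution
        beta[p] = (c[p] - sum(A[p][b] * beta[b]
                              for b in range(p + 1, k))) / A[p][p]
    return sum((y[i] - sum(rows[i][b] * beta[b] for b in range(k))) ** 2
               for i in range(n)) / n

# invented data-generating process: y feeds into x through one lag
T = 3000
ys = [random.gauss(0.0, 1.0) for _ in range(T)]
xs = [0.0]
for t in range(1, T):
    xs.append(0.5 * xs[t - 1] + 0.8 * ys[t - 1] + random.gauss(0.0, 1.0))

own = resid_var(xs[1:], [[xs[t - 1]] for t in range(1, T)])
full = resid_var(xs[1:], [[xs[t - 1], ys[t - 1]] for t in range(1, T)])
F_y_to_x = math.log(own / full)   # population value is ln(1.64), about 0.49
```

The measure is zero exactly when the lags of y add no explanatory power, which is the Granger-noncausality case; one lag is used here purely to keep the sketch short.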
Geweke, J. & Weisbrod, B. 1984, 'How Does Technological Change Affect Health Care Expenditures? The Case of a New Drug', Evaluation Review, vol. 8, no. 1, pp. 75-92.
View/Download from: Publisher's site
Abstract: The expenditure consequences of the drug cimetidine for the period 1977-1979 are investigated. Using Medicaid data for the State of Michigan, it is found that expenditures for the first year of treatment of duodenal ulcers are reduced between 26% and 70%. The methodology employed can be applied to the assessment of other medical technologies.
Geweke, J. 1984, 'The Indispensable Art of Econometrics', Journal of the American Statistical Association, vol. 79, pp. 25-26.
Geweke, J. 1984, 'Forecasting and Conditional Projection Using Realistic Prior Distributions', Econometric Reviews, vol. 5, no. 1, pp. 105-112.
Geweke, J. & Meese, R. 1984, 'A Comparison of Autoregressive Univariate Forecasting Procedures for Macroeconomic Time Series', Journal Of Business & Economic Statistics, vol. 2, pp. 187-202.
Abstract: The actual performance of several automated univariate autoregressive forecasting procedures, applied to 150 macroeconomic time series, is compared. The procedures are the random walk model as a basis for comparison; long autoregressions, with three alternative rules for lag length selection; and a long autoregression estimated by minimizing the sum of absolute deviations. The sensitivity of each procedure to preliminary transformations, data periodicity, forecast horizon, loss function employed in parameter estimation, and seasonal adjustment procedures is examined. The more important conclusions are that Akaike's lag-length selection criterion works well in a wide variety of situations, the modeling of long memory components becomes important for forecast horizons of three or more periods, and linear combinations of forecasts do not improve forecast quality appreciably.
Geweke, J., Meese, R. & Dent, W. 1983, 'Comparing Alternative Tests of Causality in Temporal Systems: Analytic Results and Experimental Evidence', Journal of Econometrics, vol. 21, pp. 161-194.
View/Download from: Publisher's site
Abstract: This paper discusses eight alternative tests of the absence of causal ordering, all of which are asymptotically valid under the null hypothesis in the sense that their limiting size is known. Their behavior under alternatives is compared analytically using the concept of approximate slope, and these results are supported by the outcomes of Monte Carlo experiments. The implications of these comparisons for applied work are unambiguous: Wald variants of a test attributed to Granger, and a lagged dependent variable version of Sims's test introduced in this paper, are equivalent in all relevant respects and are preferred to the other tests discussed.
Geweke, J. & Weisbrod, B. 1982, 'Clinical Evaluation vs. Economic Evaluation: The Case of a New Drug', Medical Care, vol. 20, pp. 821-830.
View/Download from: Publisher's site
Abstract: To evaluate a new drug or other medical innovation economically, one must assess changes in both costs and benefits. Safety and efficacy matter, but so do resource costs and social benefits. This paper evaluates the effects on expenditures of the recent introduction of cimetidine, a drug used in the prevention and treatment of duodenal ulcers. This evaluation is of interest in its own right and also as a "guide" for studying similar effects of other innovations. State Medicaid records are used to test the effects of this new medical innovation on hospitalization and aggregate medical care expenditures. After controlling to the extent possible for potential selection bias, we find that usage of cimetidine is associated with a lower level of medical care expenditures and fewer days of hospitalization per patient for those duodenal ulcer patients who had zero health care expenditures and zero days of hospitalization during the presample period; an annual cost saving of some $320.00 (20 per cent) per patient is indicated. Further analysis disclosed, however, that this saving was lower for patients with somewhat higher levels of health care expenditures and hospitalization in the presample period, and to some extent was reversed for the patients whose prior year's medical care expenditures and hospitalization were highest.
Geweke, J., Parzen, E., Pierce, D., Wei, W. & Zellner, A. 1982, 'The Measurement of Linear Dependence and Feedback Between Multiple Time Series', Journal of the American Statistical Association, vol. 77, pp. 304-324.
View/Download from: Publisher's site
Abstract: Measures of linear dependence and feedback for multiple time series are defined. The measure of linear dependence is the sum of the measure of linear feedback from the first series to the second, linear feedback from the second to the first, and instantaneous linear feedback. The measures are nonnegative, and zero only when feedback (causality) of the relevant type is absent. The measures of linear feedback from one series to another can be additively decomposed by frequency.
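These feedback measures can be sketched numerically for a pair of scalar series using OLS regressions on lags. This is a hypothetical illustration of the definitions, not the article's code; the lag length `p` and the regression setup are assumptions of the sketch.

```python
import numpy as np

def resid_var(target, regressors):
    """Residual variance of an OLS regression of target on an intercept
    plus the given regressor columns."""
    X = np.column_stack([np.ones(len(target))] + regressors)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    r = target - X @ beta
    return r @ r / len(target)

def feedback_measures(x, y, p):
    """Sample analogues of the linear feedback measures for scalar series
    x and y (same length), using p lags of each."""
    T = len(y)
    ylags = [y[p - k: T - k] for k in range(1, p + 1)]
    xlags = [x[p - k: T - k] for k in range(1, p + 1)]
    yt, xt = y[p:], x[p:]
    s_y_own = resid_var(yt, ylags)             # y on own lags only
    s_y_full = resid_var(yt, ylags + xlags)    # y on both sets of lags
    s_x_own = resid_var(xt, xlags)
    s_x_full = resid_var(xt, xlags + ylags)
    F_xy = np.log(s_y_own / s_y_full)          # linear feedback x -> y
    F_yx = np.log(s_x_own / s_x_full)          # linear feedback y -> x
    s_y_inst = resid_var(yt, ylags + xlags + [xt])
    F_inst = np.log(s_y_full / s_y_inst)       # instantaneous feedback
    F_dep = F_xy + F_yx + F_inst               # measure of linear dependence
    return F_xy, F_yx, F_inst, F_dep
```

Because the regressor sets are nested, each sample measure is nonnegative up to rounding, mirroring the population property in the abstract.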
Geweke, J. & Singleton, K. 1981, 'Latent Variable Models for Time Series: A Frequency Domain Approach with an Application to the Permanent Income Hypothesis', Journal of Econometrics, vol. 17, no. 3, pp. 287-304.
View/Download from: UTS OPUS or Publisher's site
Abstract: The theory of estimation and inference in a very general class of latent variable models for time series is developed by showing that the distribution theory for the finite Fourier transform of the observable variables in latent variable models for time series is isomorphic to that for the observable variables themselves in classical latent variable models. This implies that analytic work on classical latent variable models can be adapted to latent variable models for time series, an implication which is illustrated here in the context of a general canonical form. To provide an empirical example a latent variable model for permanent income is developed, its parameters are shown to be identified, and a variety of restrictions on these parameters implied by the permanent income hypothesis are tested.
Geweke, J. 1981, 'The Approximate Slopes of Econometric Tests', Econometrica, vol. 49, no. 6, pp. 1427-1442.
View/Download from: UTS OPUS or Publisher's site
Abstract: In this paper the concept of approximate slope, introduced by R. R. Bahadur, is used to make asymptotic global power comparisons of econometric tests. The approximate slope of a test is the rate at which the logarithm of the asymptotic marginal significance level of the test decreases as sample size increases, under a given alternative. A test with greater approximate slope may therefore be expected to reject the null hypothesis more frequently under that alternative than one with smaller approximate slope. Two theorems, which facilitate the computation and interpretation of the approximate slopes of most econometric tests, are established. These results are used to undertake some illustrative comparisons. Sampling experiments and an empirical illustration suggest that the comparison of approximate slopes may provide an adequate basis for evaluating the actual performance of alternative tests of the same hypothesis.
Geweke, J. & Meese, R. 1981, 'Estimating Regression Models of Finite but Unknown Order', International Economic Review, vol. 22, no. 1, pp. 54-70.
View/Download from: UTS OPUS
Examines problems associated with estimating the normal linear regression model of finite but unknown order from a sequence of nested alternatives. Topics include estimation criteria for model selection; derivation of numerical bounds on the finite sample distribution; and the relation of the proposed estimation criterion functions to other estimation criterion functions.
Geweke, J. & Singleton, K. 1981, 'Maximum Likelihood 'Confirmatory' Factor Analysis of Economic Time Series', International Economic Review, vol. 22, no. 1, pp. 37-54.
View/Download from: UTS OPUS or Publisher's site
Explains the theory of identification, estimation, and inference in the dynamic confirmatory factor model for economic time series. Topics include derivation of the frequency domain representation of the model; the nature of the identification problem for the dynamic confirmatory model; and a dynamic confirmatory model of the business cycle motivated by Lucas's theory of aggregate activity.
Geweke, J. 1981, 'A Comparison of Tests of the Independence of Two Covariance Stationary Time Series', Journal of the American Statistical Association, vol. 76, no. 374, pp. 363-373.
View/Download from: UTS OPUS or Publisher's site
Abstract: The approximate slopes of several tests of the independence of two covariance stationary time series are derived and compared. It is shown that the approximate slopes of regression tests are at least as great as those based on the residuals of univariate ARIMA models, and that there are cases in which the former are arbitrarily great while the latter are arbitrarily small. These analytical findings are supported by a Monte Carlo study that shows that in samples of size 100 and 250 the asymptotic distribution theory under the null hypothesis is adequate for all tests, but under alternatives to the null hypothesis the rate of Type II error for the test based on ARIMA model residuals is often more than double that of the regression tests.
Geweke, J. & Singleton, K. 1980, 'Interpreting the Likelihood Ratio Statistic in Factor Models When Sample Size is Small', Journal of the American Statistical Association, vol. 75, no. 369, pp. 133-137.
View/Download from: UTS OPUS or Publisher's site
Abstract: The use of the likelihood ratio statistic in testing the goodness of fit of the exploratory factor model has no formal justification when, as is often the case in practice, the usual regularity conditions are not met. In a Monte Carlo experiment it is found that the asymptotic theory seems to be appropriate when the regularity conditions obtain and sample size is at least 30. When the regularity conditions are not satisfied, the asymptotic theory seems to be misleading in all sample sizes considered.
Geweke, J. & Feige, E. 1979, 'Some Joint Tests of the Efficiency of Markets for Forward Foreign Exchange', Review Of Economics And Statistics, vol. 61, no. 3, pp. 334-341.
Geweke, J. 1978, 'Testing the Exogeneity Specification in the Complete Dynamic Simultaneous Equation Model', Journal of Econometrics, vol. 7, no. 2, pp. 163-185.
View/Download from: UTS OPUS
Abstract: It is shown that in the complete dynamic simultaneous equation model exogenous variables cause endogenous variables in the sense of Granger (1969) and satisfy the criterion of econometric exogeneity discussed by Sims (1977a), but that the stationarity assumptions invoked by Granger and Sims are not necessary for this implication. Inference procedures for testing each implication are presented and a new joint test of both implications is derived. Detailed attention is given to estimation and testing when the error vector of the final form of the complete dynamic simultaneous equation model is both singular and serially correlated. The theoretical points of the paper are illustrated by testing the exogeneity specification in a small macroeconometric model.
Geweke, J. 1978, 'Temporal Aggregation in the Multiple Regression Model', Econometrica, vol. 46, no. 3, pp. 643-661.
View/Download from: UTS OPUS or Publisher's site
Abstract: The regression relation between regularly sampled Y(t) and X_1(t), ..., X_N(t) implied by an underlying model in which time enters more generally is studied. The underlying model includes continuous distributed lags, discrete models, and stochastic differential equations as special cases. The relation between parameters identified by regular samplings of Y and X_j and those of the underlying model is characterized. Sufficient conditions for identification of the underlying model in the limit as disaggregation over time proceeds are set forth. Empirical evidence presented suggests that important gains can be realized from temporal disaggregation in the range of conventional measurement frequencies for macroeconomic data.
Geweke, J. 1976, 'A Monetarist Model of Inflationary Expectations: John Rutledge (D.C. Heath, Lexington, Massachusetts, 1974), pp. xv+115, $12.50', vol. 2, no. 1, pp. 125-127.
Geweke, J., Durham, G. & Xu, H., 'Bayesian Inference for Logistic Regression Models Using Sequential Posterior Simulation'.
The logistic specification has been used extensively in non-Bayesian statistics to model the dependence of discrete outcomes on the values of specified covariates. Because the likelihood function is globally weakly concave, estimation by maximum likelihood is generally straightforward even in commonly arising applications with scores or hundreds of parameters. In contrast, Bayesian inference has proven awkward, requiring normal approximations to the likelihood or specialized adaptations of existing Markov chain Monte Carlo and data augmentation methods. This paper approaches Bayesian inference in logistic models using recently developed generic sequential posterior simulation (SPS) methods that require little more than the ability to evaluate the likelihood function. Compared with existing alternatives, SPS is much simpler, and provides numerical standard errors and accurate approximations of marginal likelihoods as by-products. The SPS algorithm for Bayesian inference is amenable to massively parallel implementation, and when implemented using graphics processing units it is more efficient than existing alternatives. The paper demonstrates these points by means of several examples.
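The sequential posterior simulation idea for a logistic model can be sketched with a generic tempered sequential Monte Carlo routine: particles start at the prior and are reweighted, resampled, and moved through a sequence of likelihood powers. This is an illustration, not the authors' algorithm or code; the prior, temperature schedule, and tuning constants are all assumptions of the sketch.

```python
import numpy as np

def sps_logit(X, ybin, n_particles=1500, n_temps=40, n_moves=4, seed=0):
    """Tempered-SMC sketch for a logistic regression posterior.
    Prior: beta ~ N(0, 2^2 I) (an assumption of this illustration)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    prior_var = 4.0
    theta = rng.normal(0.0, np.sqrt(prior_var), size=(n_particles, d))

    def loglik(th):
        z = X @ th.T                                   # (n, n_particles)
        # Bernoulli log-likelihood with numerically stable log(1 + exp(z))
        return (ybin[:, None] * z - np.logaddexp(0.0, z)).sum(axis=0)

    def logtarget(th, t):
        # Tempered target: prior times likelihood raised to power t
        return t * loglik(th) - (th ** 2).sum(axis=1) / (2.0 * prior_var)

    temps = np.linspace(0.0, 1.0, n_temps + 1)
    for t0, t1 in zip(temps[:-1], temps[1:]):
        logw = (t1 - t0) * loglik(theta)               # incremental weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        theta = theta[rng.choice(n_particles, size=n_particles, p=w)]
        for _ in range(n_moves):                       # random-walk MH moves
            prop = theta + 0.25 * rng.standard_normal(theta.shape)
            ok = np.log(rng.uniform(size=n_particles)) < \
                logtarget(prop, t1) - logtarget(theta, t1)
            theta[ok] = prop[ok]
    return theta                                       # posterior draws
```

Note that, as the abstract says of SPS generally, the routine needs little beyond likelihood evaluations, and the particle operations parallelize naturally.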
Geweke, J., Gowrisankaran, G. & Town, R.J., 'Inferring Hospital Quality from Patient Discharge Records Using a Bayesian Selection Model'.
This paper develops new econometric methods to estimate hospital quality and other models with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which distances between the patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 77,937 Medicare patients admitted to 117 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds higher quality in smaller hospitals than in larger ones, and in private for-profit hospitals than in hospitals in other ownership categories. Variation in unobserved severity of illness across hospitals is at least as great as variation in hospital quality. Consequently a conventional probit model leads to inferences about quality markedly different from those in this study's selection model.

Other

Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S. & Thorp, S.J. 2011, 'Financial competence, risk presentation and retirement portfolio preferences.'.
Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S.J. 2011, 'Financial Competence and Expectations Formation: Evidence from Australia'.
Geweke, J. 2010, 'Comment: Convergence Properties of the Likelihood of Computed Dynamic Models'.
There are technical errors in this article (Econometrica, January 2006) that are important, simple and correctable. The corrections substantially alter the article's conclusions.
Berg, J.E., Geweke, J. & Rietz, T.A. 2010, 'Memoirs of an Indifferent Trader: Estimating Forecast Distributions from Prediction Markets'.
View/Download from: UTS OPUS
Prediction markets for future events are increasingly common and they often trade several contracts for the same event. This paper considers the distribution of a normative risk-neutral trader who, given any portfolio of contracts traded on the event, would choose not to reallocate that portfolio of contracts even if transactions costs were zero. Because common parametric distributions can conflict with observed prediction market prices, the distribution is given a nonparametric representation together with a prior distribution favoring smooth and concentrated distributions. Posterior modal distributions are found for popular vote shares of the U.S. presidential candidates in the 100 days leading up to the elections of 1992, 1996, 2000, and 2004, using bid and ask prices on multiple contracts from the Iowa Electronic Markets. On some days, the distributions are multimodal or substantially asymmetric. The derived distributions are more concentrated than the historical distribution of popular vote shares in presidential elections, but do not tend to become more concentrated as time to elections diminishes.
Geweke, J., Ackerberg, D. & Hahn, J. 2010, 'Comments on 'Convergence Properties of the Likelihood of Computed Dynamic Models'.'.
Geweke, J. 2010, 'Bayesian and Non-Bayesian Analysis of the Seemingly Unrelated Regression Model with Student-t Errors and Its Application to Forecasting: Comment'.
Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S. & Thorp, S.J. 2010, 'Economic Rationality, Risk Presentation, and Retirement Portfolio Choice'.
Geweke, J., Horowitz, J. & Pesaran, M.H. 2006, 'Econometrics: A Bird's Eye View'.
As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross sectional data by means of semi-parametric and non-parametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Non-linear econometric techniques are used increasingly in the analysis of cross section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework in which the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus paving the way for establishing the foundation of real-time econometrics. This paper attempts to provide an overview of some of these developments.
Geweke, J., Gowrisankaran, G. & Town, R.J. 2002, 'Bayesian inference for hospital quality in a selection model'.
This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which distance between the patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of the highest quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals, whereby patients with a high unobserved severity of illness are disproportionately admitted to high quality hospitals. Consequently a conventional probit model leads to inferences about quality markedly different than those in this study's selection model.
Geweke, J.F. 1998, 'Using simulation methods for Bayesian econometric models: inference, development, and communication'.
This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.
Geweke, J.F., Keane, M.P. & Runkle, D.E. 1994, 'Alternative computational approaches to inference in the multinomial probit model'.
This research compares several approaches to inference in the multinomial probit model, based on Monte Carlo results for a seven-choice model. The experiment compares the simulated maximum likelihood estimator using the GHK recursive probability simulator, the method of simulated moments estimator using the GHK recursive simulator and kernel-smoothed frequency simulators, and posterior means using a Gibbs sampling-data augmentation algorithm. Each estimator is applied in nine different models, which have from 1 to 40 free parameters. The performance of all estimators is found to be satisfactory. However, the results indicate that the method of simulated moments estimator with the kernel-smoothed frequency simulator does not perform quite as well as the other three methods. Among those three, the Gibbs sampling-data augmentation algorithm appears to have a slight overall edge, with the relative performance of MSM and SML based on the GHK simulator difficult to determine.
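The GHK recursive probability simulator used by two of the estimators above estimates rectangle probabilities for a multivariate normal vector by drawing from univariate truncated normals along a Cholesky factorization. The following is an illustrative sketch, not the paper's implementation; the bisection-based normal quantile is a simplification chosen to keep the example self-contained.

```python
import numpy as np
from math import erf, sqrt, inf

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def Phi_inv(u):
    """Standard normal quantile by bisection (adequate for this sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if Phi(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def ghk(a, b, Sigma, n_draws=3000, seed=0):
    """GHK estimate of P(a < Z < b) for Z ~ N(0, Sigma)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(np.asarray(Sigma, dtype=float))
    d = L.shape[0]
    total = 0.0
    for _ in range(n_draws):
        e = np.zeros(d)
        prob = 1.0
        for j in range(d):
            shift = L[j, :j] @ e[:j]
            lo = Phi((a[j] - shift) / L[j, j])   # conditional lower bound
            hi = Phi((b[j] - shift) / L[j, j])   # conditional upper bound
            prob *= hi - lo
            # Draw e_j from the implied truncated standard normal
            u = lo + (hi - lo) * rng.uniform()
            e[j] = Phi_inv(u)
        total += prob
    return total / n_draws
```

With correlation 0.5 and the negative orthant, the exact orthant probability is 1/4 + arcsin(0.5)/(2*pi) = 1/3, which the simulator recovers closely.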
Geweke, J., 'Using Simulation Methods for Bayesian Econometric Models'.
This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and their implementation using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. The paper shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators.
Geweke, J. & Keane, M., 'An Empirical Analysis of Income Dynamics among Men in the PSID: 1968–1989'.
This study uses data from the Panel Study of Income Dynamics (PSID) to address a number of questions about life-cycle earnings mobility. It develops a dynamic reduced-form model of earnings and marital status that is nonstationary over the life-cycle. A Gibbs sampling-data augmentation algorithm facilitates use of the entire sample and provides numerical approximations to the exact posterior distribution of properties of earnings paths. This algorithm copes with the complex distribution of endogenous variables that are observed for short segments of an individual's work history, not including the initial period. The study reaches several firm conclusions about life cycle earnings mobility. Incorporating non-Gaussian shocks makes it possible to account for transitions between low and higher earnings states, a heretofore unresolved problem. The non-Gaussian distribution substantially increases the lifetime return to postsecondary education, and substantially reduces differences in lifetime wages attributable to race. In a given year, the majority of variance in earnings not accounted for by race, education, and age is due to transitory shocks, but over a lifetime the majority is due to unobserved individual heterogeneity. Consequently, low earnings at early ages are strong predictors of low earnings later in life, even conditioning on observed individual characteristics.
Geweke, J., 'Computational Experiments and Reality'.
A common practice in macroeconomics is to assess the validity of general equilibrium models by first deriving their implications for population moments and then comparing population moments with observed sample moments. Generally the population moments are not explicit functions of model parameters, and so computational experiments are used to establish the link between parameters and moments. In most cases the general equilibrium models are intended to describe certain population moments (for example, means) but not others (for example, variances). The comparison of population moments with observed sample moments is informal, a process that has been termed calibration by some economists and ocular econometrics by others. This paper provides a formal probability framework within which this approach to inference can be studied. There are two principal results. First, if general equilibrium models are taken as predictive for sample moments, then the formal econometrics of model evaluation and comparison are straightforward. The fact that the models describe only a subset of moments presents no obstacles, and the formal econometrics yield as a byproduct substantial insights into the workings of models. Second, if general equilibrium models are taken to establish implications for population moments but not sample moments, then there is no link to reality because population moments are unobserved. Under this assumption, atheoretical macroeconomic models that link population and sample moments can be introduced coherently into the formal econometrics of model evaluation and comparison. The result is a framework that unifies general equilibrium models (theory without measurement) and atheoretical econometrics (measurement without theory). The paper illustrates these points using some models of the equity premium.
Geweke, J. & Amisano, G., 'Hierarchical Markov normal mixture models with applications to financial asset returns'.
With the aim of constructing predictive distributions for daily returns, we introduce a new Markov normal mixture model in which the components are themselves normal mixtures. We derive the restrictions on the autocovariances and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. We use the model's prior predictive distribution to study its implications for some interesting functions of returns. We apply the model to construct predictive distributions of daily S&P500 returns, dollar-pound returns, and one- and ten-year bonds. We compare the performance of the model with ARCH and stochastic volatility models using predictive likelihoods. The model's performance is about the same as its competitors for the bond returns, better than its competitors for the S&P 500 returns, and much better for the dollar-pound returns. Validation exercises identify some potential improvements. JEL Classification: C53, G12, C11, C14
Geweke, J., Gowrisankaran, G. & Town, R.J., 'Bayesian Inference for Hospital Quality in a Selection Model'.
This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which distance between the patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of high quality and public hospitals to be of low quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals. Consequently a conventional probit model leads to inferences about quality markedly different than those in this study's selection model.