
Eleven - Choosing the Optimal Divergence under Quadratic Loss

from Part V - Optimal Convex Divergence

Published online by Cambridge University Press:  05 June 2012

George G. Judge, University of California, Berkeley
Ron C. Mittelhammer, Washington State University

Summary

Introduction

In econometric practice, the underlying data sampling process is seldom known. This lack of information affects estimator performance and thus the precision of information recovery. A basic limitation of traditional likelihood-divergence approaches is that they cannot describe estimator distributions of arbitrary form. In this chapter, we recognize this estimation and inference problem and propose a loss-function approach to choosing an optimal estimation rule from a family of possible likelihood estimators. Given a loss function, the resulting optimal likelihood function-estimator combination is a convex combination of likelihood functions from the Cressie-Read (CR) family of power divergence measures (PDMs) introduced and analyzed in Chapters 7 through 10.
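As a concrete sketch (not from the chapter itself), the CR family referred to above is conventionally written as I(p, q, γ) = [γ(γ + 1)]⁻¹ Σᵢ pᵢ[(pᵢ/qᵢ)^γ − 1], with the γ → 0 and γ → −1 cases defined by continuity as the two directed Kullback-Leibler divergences. Assuming that standard form, a minimal implementation might look like:

```python
import numpy as np

def cressie_read_divergence(p, q, gamma):
    """Cressie-Read power divergence between distributions p and q.

    I(p, q, gamma) = sum_i p_i * ((p_i/q_i)**gamma - 1) / (gamma*(gamma+1)).
    The gamma -> 0 limit is KL(p || q); the gamma -> -1 limit is KL(q || p).
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(gamma, 0.0):
        return float(np.sum(p * np.log(p / q)))
    if np.isclose(gamma, -1.0):
        return float(np.sum(q * np.log(q / p)))
    return float(np.sum(p * ((p / q) ** gamma - 1.0)) / (gamma * (gamma + 1.0)))
```

For example, γ = 1 recovers one-half of Pearson's chi-square divergence, and the divergence is zero whenever p = q, regardless of γ.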

The basis for choosing a loss-based choice rule comes from questions asked by Gorban (1984), Gorban and Karlin (2003), and Judge and Mittelhammer (2004). Gorban and Karlin asked which density distributions emerge if we take a convex combination of two entropy functionals, which, in our context, refers to two members of the CR family. Judge and Mittelhammer asked which combination of econometric models emerges if we take a convex combination of two design matrices (models), using a quadratic loss choice rule. In this chapter, we change the question and ask which estimation rule emerges from the CR family of likelihood functions if we take a convex combination of two or more members of the CR family (two or more γs), subject to a squared-error (quadratic) loss measure. In the sections that follow, we develop a framework for seeking an answer to this question. A sampling experiment is used to illustrate the performance of the resulting estimation-choice rules.
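The convex-combination idea can be sketched numerically. The snippet below (an illustration, not the chapter's procedure) forms α·I(p, q, γ₁) + (1 − α)·I(p, q, γ₂) for a weight α in [0, 1] and tabulates it over a grid; in the chapter, the weight would instead be selected by the quadratic-loss performance of the implied estimator in a sampling experiment. The distributions `p_hat` and `q_ref` are hypothetical.

```python
import numpy as np

def cr_divergence(p, q, gamma):
    # Cressie-Read power divergence for gamma not in {0, -1};
    # the excluded values are the two Kullback-Leibler limits.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * ((p / q) ** gamma - 1.0)) / (gamma * (gamma + 1.0)))

def combined_objective(p, q, alpha, g1, g2):
    # Convex combination of two CR family members, alpha in [0, 1].
    return alpha * cr_divergence(p, q, g1) + (1.0 - alpha) * cr_divergence(p, q, g2)

# Hypothetical empirical distribution versus a uniform reference.
p_hat = np.array([0.3, 0.5, 0.2])
q_ref = np.full(3, 1.0 / 3.0)

# Tabulate the combined objective over a grid of weights alpha.
grid = np.linspace(0.0, 1.0, 5)
values = [combined_objective(p_hat, q_ref, a, g1=1.0, g2=2.0) for a in grid]
```

Because the combination is linear in α, its value at any interior α is the corresponding average of the two endpoint divergences; the substantive choice problem is in how α is scored against quadratic loss, not in evaluating the objective.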

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2011


References

Cressie, N., Read, T. 1984. "Multinomial Goodness of Fit Tests." Journal of the Royal Statistical Society, Series B 46: 440.
Gorban, A., Gorban, P., Judge, G. 2010. "Entropy: The Markov Ordering Approach." Entropy 12: 1145.
Gorban, A. N. 1984. Equilibrium Encircling: Equations of Chemical Kinetics and Their Thermodynamic Analysis. Novosibirsk: Nauka.
Gorban, A. N., Karlin, I. V. 2003. "Family of Additive Entropy Functions out of Thermodynamic Limit." Physical Review E 67.
Grendar, M., Grendar, M. 2000.
Hahn, J., Hausman, J. 2002. "A New Specification Test for the Validity of Instrumental Variables." Econometrica 70: 163.
Haubold, H., Mathai, A., Saxena, R. 2004. "Boltzmann-Gibbs Entropy versus Tsallis Entropy." Astrophysics and Space Science 290: 241.
James, W., Stein, C. 1961. "Estimation with Quadratic Loss." In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability. Berkeley: University of California Press.
Jeffreys, H. 1948. Theory of Probability. Oxford: Oxford University Press.
Judge, G., Bock, M. E. 1978. The Statistical Implications of Pre-Test and Stein-Rule Estimators. Amsterdam: North-Holland.
Judge, G., Mittelhammer, R. 2004. "A Semiparametric Basis for Combining Estimation Problems under Quadratic Loss." Journal of the American Statistical Association 99: 479.
Khalil, K. 1996. Nonlinear Systems. Englewood Cliffs, NJ: Prentice Hall.
Mittelhammer, R., Judge, G., Miller, D. 2000. Econometric Foundations. New York: Cambridge University Press.
Pardo, L. 2006. Statistical Inference Based on Divergence Measures. Boca Raton, FL: Chapman and Hall.
Read, T. R., Cressie, N. A. 1988. Goodness-of-Fit Statistics for Discrete Multivariate Data. New York: Springer-Verlag.
Renyi, A. 1961. "On Measures of Entropy and Information." In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 547. Berkeley-Los Angeles: University of California Press.
Renyi, A. 1970. Probability Theory. Amsterdam: North-Holland.
Tsallis, C. 1988. "Possible Generalizations of the Boltzmann-Gibbs Statistics." Journal of Statistical Physics 52: 479.
