
5 - Field Experiments in Public Management

from Part II - Methods

Published online by Cambridge University Press:  27 July 2017

Oliver James, University of Exeter
Sebastian R. Jilke, Rutgers University, New Jersey
Gregg G. Van Ryzin, Rutgers University, New Jersey

Chapter in Experiments in Public Management Research: Challenges and Contributions, pp. 89–116
Publisher: Cambridge University Press
Print publication year: 2017


