
How can experiments play a greater role in public policy? Three notions from behavioral psychology

Published online by Cambridge University Press: 09 July 2020

SCOTT McCONNELL*
Affiliation:
Department of Educational Psychology, University of Minnesota, Minneapolis, MN, USA
*Correspondence to: Department of Educational Psychology, University of Minnesota, 250 Education Sciences Bldg, 56 E River Rd, Minneapolis, MN 55455, USA. E-mail: smcconne@umn.edu

Abstract

Al-Ubaydli et al. provide a far-reaching, insightful and directly actionable analysis of how social-behavioral research may exert more influence over the development and implementation of public policy. Their paper offers a sophisticated understanding of the ‘scale-up effect’, or factors that influence the extent to which positive experimental effects replicate as an intervention is implemented more broadly. Using economic principles, models and analyses, they offer 12 proposals for improving the process of scaling up effective and policy-relevant interventions. The current paper outlines how their proposals share a number of complementary features with behavioral psychology and applied behavior analysis. This response considers three possible points of intersection: (1) perspectives on the importance and challenges of studying and controlling our own behavior; (2) approaches to determining the social value of intervention outcomes and the procedures for achieving them; and (3) recommendations for deploying meaningful, common measures across phases of research.

Type: Articles
Copyright: © The Author(s) 2020. Published by Cambridge University Press


References

Allinder, R. M. and Oats, R. G. (1997), ‘Effects of Acceptability on Teachers' Implementation of Curriculum-Based Measurement and Student Achievement in Mathematics Computation’, Remedial and Special Education, 18, 113–120. https://doi.org/10.1177/074193259701800205
Al-Ubaydli, O., List, J. A. and Suskind, D. L. (2017), ‘What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results’, American Economic Review, 107, 282–286. https://doi.org/10.1257/aer.p20171115
Al-Ubaydli, O., List, J. A. and Suskind, D. L. (2019), ‘The science of using science: Towards an understanding of the threats to scaling experiments’, Artefactual Field Experiments.
Baer, D. M., Wolf, M. M. and Risley, T. R. (1968), ‘Some current dimensions of applied behavior analysis’, Journal of Applied Behavior Analysis, 1, 91–97.
Biglan, A. and Embry, D. D. (2013), ‘A framework for intentional cultural change’, Journal of Contextual Behavioral Science, 2, 95–104. https://doi.org/10.1016/j.jcbs.2013.06.001
Bijou, S. W., Peterson, R. F. and Ault, M. H. (1968), ‘A method to integrate descriptive and experimental field studies at the level of data and empirical concepts’, Journal of Applied Behavior Analysis, 1, 175–191.
Both, T. (2010), ‘Human-Centered, Systems-Minded Design’, Stanford Social Innovation Review, 19.
Boy, G. A. and Narkevicius, J. M. (2014), ‘Unifying Human Centered Design and Systems Engineering for Human Systems Integration’, in Aiguier, M., Boulanger, F., Krob, D. and Marchal, C. (eds), Complex Systems Design & Management, Cham: Springer International Publishing, 151–162. https://doi.org/10.1007/978-3-319-02812-5_12
Cook, B. G. and Odom, S. L. (2013), ‘Evidence-Based Practices and Implementation Science in Special Education’, Exceptional Children, 79, 135–144.
Darcy Mahoney, A., McConnell, S. R., Larson, A. L., Becklenberg, A. and Stapel-Wax, J. L. (2019), ‘Where do we go from here? Examining pediatric and population-level interventions to improve child outcomes’, Early Childhood Research Quarterly. https://doi.org/10.1016/j.ecresq.2019.01.009
Deno, S. L., Mirkin, P. K. and Chiang, B. (1982), ‘Identifying valid measures of reading’, Exceptional Children, 49, 36–45.
Duda, M. A., Fixsen, D. L. and Blase, K. A. (2013), ‘Setting the stage for sustainability: Building the infrastructure for implementation capacity’, in Peisner-Feinberg, E. and Buysse, V. (eds), Handbook of Response to Intervention in Early Childhood, Baltimore, MD: Paul H. Brookes, 397–420.
Duflo, E. and Banerjee, A. (2017), Handbook of Field Experiments, Elsevier.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. and Wallace, F. (2005), Implementation research: A synthesis of the literature, Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Network (FMHI Publication #231).
Fuchs, L. S. and Deno, S. L. (1991), ‘Paradigmatic distinctions between instructionally relevant measurement models’, Exceptional Children, 57, 488–500.
Harte, R., Glynn, L., Rodríguez-Molinero, A., Baker, P. M., Scharf, T., Quinlan, L. R. and ÓLaighin, G. (2017), ‘A human-centered design methodology to enhance the usability, human factors, and user experience of connected health systems: a three-phase methodology’, JMIR Human Factors, 4, e8.
Lee, Y.-S. and Lembke, E. (2016), ‘Developing and evaluating a kindergarten to third grade CBM mathematics assessment’, ZDM, 48, 1019–1030.
Levitt, S. D. and List, J. A. (2009), ‘Field experiments in economics: The past, the present, and the future’, European Economic Review, 53, 1–18.
McMaster, K. L., Du, X., Yeo, S., Deno, S. L., Parker, D. and Ellis, T. (2011), ‘Curriculum-Based Measures of Beginning Writing: Technical Features of the Slope’, Exceptional Children, 77, 185–206. https://doi.org/10.1177/001440291107700203
Mellard, D. F., Frey, B. B. and Woods, K. L. (2012), ‘School-wide Student Outcomes of Response to Intervention Frameworks’, Learning Disabilities: A Contemporary Journal, 10.
Metz, A. and Miller, R. (2018), ‘Using Implementation Science to create a strategic plan for using an MTSS/RtI service delivery framework’, in Carta, J. J. and Miller, R. (eds), Multi-Tiered Systems of Support for Young Children: A Guide to Response to Intervention in Early Education, Baltimore, MD: Paul H. Brookes.
Peterson, C. and McConnell, S. R. (1996), ‘Factors Related to Intervention Integrity and Child Outcome in Social Skills Interventions’, Journal of Early Intervention, 20, 146–164. https://doi.org/10.1177/105381519602000206
Schwartz, I. S. (1991), ‘The study of consumer behavior and social validity: An essential partnership for applied behavior analysis’, Journal of Applied Behavior Analysis, 24, 241–244.
Sidman, M. (1960), Tactics of scientific research, New York: Basic Books.
Sidman, M. (1993), ‘Reflections on Behavior Analysis and Coercion’, Behavior and Social Issues, 3. https://doi.org/10.5210/bsi.v3i1.200
Skinner, B. F. (1950), ‘Are theories of learning necessary?’, Psychological Review, 57, 193–216.
Skinner, B. F. (1953), Science and human behavior, New York: Macmillan.
Skinner, B. F. (1975), ‘The steep and thorny way to a science of behavior’, American Psychologist, 30(1), 42–49.
State, T. M., Harrison, J. R., Kern, L. and Lewis, T. J. (2017), ‘Feasibility and Acceptability of Classroom-Based Interventions for Students With Emotional/Behavioral Challenges at the High School Level’, Journal of Positive Behavior Interventions, 19, 26–36. https://doi.org/10.1177/1098300716648459
Thaler, R. H. (2016), ‘Behavioral Economics: Past, Present, and Future’, American Economic Review, 106, 1577–1600. https://doi.org/10.1257/aer.106.7.1577
Tusing, M. E. and Breikjern, N. A. (2017), ‘Using Curriculum-Based Measurements for Program Evaluation: Expanding Roles for School Psychologists’, Journal of Applied School Psychology, 33, 43–66.
Whitley, S. (2019), ‘Oral Reading Fluency and Maze Selection for Predicting 5th and 6th Grade Students Reading and Math Achievement on a High Stakes Summative Assessment’, Reading Improvement, 56, 24–35.
Wilson, D. S., Hayes, S. C., Biglan, A. and Embry, D. D. (2014), ‘Evolving the future: Toward a science of intentional change’, Behavioral and Brain Sciences, 37, 395–416. https://doi.org/10.1017/S0140525X13001593
Winett, R. A. and Winkler, R. C. (1972), ‘Current behavior modification in the classroom: Be still, be quiet, be docile’, Journal of Applied Behavior Analysis, 5, 499–504.
Wolf, M. M. (1978), ‘Social validity: The case for subjective assessment or how applied behavior analysis is finding its heart’, Journal of Applied Behavior Analysis, 11, 203–214.