
References

Published online by Cambridge University Press:  16 October 2018

Steven Higgins
Affiliation:
University of Durham
Chapter in Improving Learning: Meta-analysis of Intervention Research in Education, pp. 232–251
Publisher: Cambridge University Press
Print publication year: 2018


Abrami*, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., & Zhang, D. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research, 78 (4), 1102–1134. http://dx.doi.org/10.3102/0034654308326084
Aguinis, H., Pierce, C. A., Bosco, F. A., Dalton, D. R., & Dalton, C. M. (2011). Debunking myths and urban legends about meta-analysis. Organizational Research Methods, 14 (2), 306–331. https://doi.org/10.1177/1094428110375720
Ahn, S., Ames, A. J., & Myers, N. D. (2012). A review of meta-analyses in education: Methodological strengths and weaknesses. Review of Educational Research, 82 (4), 436–476. http://dx.doi.org/10.3102/0034654312458162
Anwar, E., Goldberg, E., Fraser, A., Acosta, C. J., Paul, M., & Leibovici, L. (2014). Vaccines for preventing typhoid fever. The Cochrane Library. http://dx.doi.org/10.1002/14651858.CD001261.pub3
Arnold, R. D. (1968). Four methods of teaching word recognition to disabled readers. The Elementary School Journal, 68 (5), 269–274. http://dx.doi.org/10.1086/460445
Athappilly, K., Smidchens, U., & Kofel, J. W. (1983). A computer-based meta-analysis of the effects of modern mathematics in comparison with traditional mathematics. Educational Evaluation and Policy Analysis, 5 (4), 485–493. http://www.jstor.org/stable/1164053
Baigent, C., Blackwell, L., Emberson, J., Holland, L. E., Reith, C., Bhala, N., & Collins, R. (2010). Efficacy and safety of more intensive lowering of LDL cholesterol: A meta-analysis of data from 170,000 participants in 26 randomised trials. The Lancet, 376 (9753), 1670–1681.
Bangert-Drowns*, R. L., Hurley, M. M., & Wilkinson, B. (2004). The effects of school-based writing-to-learn interventions on academic achievement: A meta-analysis. Review of Educational Research, 74 (1), 29–58. https://doi.org/10.3102/00346543074001029
Bangert-Drowns*, R. L., Kulik, C. L. C., Kulik, J. A., & Morgan, M. (1991). The instructional effect of feedback in test-like events. Review of Educational Research, 61 (2), 213–238. http://dx.doi.org/10.3102/00346543061002213
Bayraktar*, S. (2000). A meta-analysis of the effectiveness of computer assisted instruction in science education. Journal of Research on Technology in Education, 42 (2), 173–188. http://dx.doi.org/10.1080/15391523.2001.10782344
Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18 (1), 5–25. http://dx.doi.org/10.1080/0969594X.2010.513678
Berkeley, S., Scruggs, T. E., & Mastropieri, M. A. (2010). Reading comprehension instruction for students with learning disabilities, 1995–2006: A meta-analysis. Remedial and Special Education, 31 (6), 423–436. http://dx.doi.org/10.1177/0741932509355988
Berninger, V. W., Vaughan, K., Abbott, R. D., Begay, K., Coleman, K. B., Curtin, G., & Graham, S. (2002). Teaching spelling and composition alone and together: Implications for the simple view of writing. Journal of Educational Psychology, 94 (2), 291–304. http://dx.doi.org/10.1037/0022-0663.94.2.291
Bernstein, B. (1971). Class, codes and control: Theoretical studies towards a sociology of language. London: Routledge & Kegan Paul.
Biesta, G. (2007). Why ‘what works’ won’t work: Evidence‐based practice and the democratic deficit in educational research. Educational Theory, 57 (1), 1–22. http://dx.doi.org/10.1111/j.1741-5446.2006.00241.x
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5, 7–73. http://dx.doi.org/10.1080/0969595980050102
Black, P., & Wiliam, D. (2005). Lessons from around the world: How policies, politics and cultures constrain and afford assessment practices. Curriculum Journal, 16, 249–261. http://dx.doi.org/10.1080/09585170500136218
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21 (1), 5–31. http://dx.doi.org/10.1007/s11092-008-9068-5
Blok*, H., Oostdam, R., Otter, M. E., & Overmaat, M. (2002). Computer-assisted instruction in support of beginning reading instruction: A review. Review of Educational Research, 72 (1), 101–130. http://dx.doi.org/10.3102/00346543072001101
Bloom, B. S. (1964). Stability and change in human characteristics. New York: Wiley.
Bloom, B. S. (1984). The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring. Educational Researcher, 13 (6), 4–16. http://dx.doi.org/10.3102/0013189X013006004
Bloom, B. S., Hastings, J. T., & Madaus, G. F. (Eds.) (1971). Handbook on the formative and summative evaluation of student learning. New York: McGraw-Hill.
Borenstein, M., Hedges, L. V., Higgins, J., & Rothstein, H. R. (2009). Introduction to meta-analysis. London: John Wiley & Sons, Ltd.
Box, G. E., Hunter, W. G., & Hunter, J. S. (1978). Statistics for experimenters: An introduction to design, data analysis, and model building. New York: Wiley.
Bracht, G. H., & Glass, G. V. (1968). The external validity of experiments. American Educational Research Journal, 5 (4), 437–474. https://doi.org/10.3102/00028312005004437
Briggs, D. C. (2016). Can Campbell’s law be mitigated? In Braun, H. (Ed.), Meeting the challenges to measurement in an era of accountability (pp. 168–179). New York: Routledge.
Bus*, A. G., Van Ijzendoorn, M. H., & Pellegrini, A. D. (1995). Joint book reading makes for success in learning to read: A meta-analysis on intergenerational transmission of literacy. Review of Educational Research, 65 (1), 1–21. http://dx.doi.org/10.3102/00346543065001001
Camilli, G., Vargas, S., & Yurecko, M. (2003). Teaching children to read: The fragile link between science and federal education policy. Education Policy Analysis Archives, 11 (15). http://dx.doi.org/10.14507/epaa.v11n15.2003
Camilli*, G., Vargas, S., Ryan, S., & Barnett, W. S. (2008). Meta-analysis of the effects of early education interventions on cognitive and social development. Teachers College Record, 112 (3), 579–620.
Campbell, D. T. (1976). Assessing the impact of planned social change. Occasional Paper Series, #8. Kalamazoo, MI: Western Michigan University Evaluation Centre.
Campbell, M. K., Piaggio, G., Elbourne, D. R., & Altman, D. G. (2012). Consort 2010 statement: Extension to cluster randomised trials. British Medical Journal, 345, e5661. http://dx.doi.org/10.1136/bmj.e5661
Chall, J. S. (1983). Learning to read: The great debate. New York: McGraw-Hill.
Chalmers, I., & Altman, D. G. (1995). Systematic reviews. London: BMJ Publications.
Chalmers, I., Hedges, L. V., & Cooper, H. (2002). A brief history of research synthesis. Evaluation and the Health Professions, 25, 12–37.
Chan, M. E., & Arvey, R. D. (2012). Meta-analysis and the development of knowledge. Perspectives on Psychological Science, 7 (1), 79–92. http://dx.doi.org/10.1177/1745691611429355
Chauhan*, S. (2017). A meta-analysis of the impact of technology on learning effectiveness of elementary students. Computers & Education, 105, 14–30. http://dx.doi.org/10.1016/j.compedu.2016.11.005
Cheung*, A. C., & Slavin, R. E. (2012). How features of educational technology applications affect student reading outcomes: A meta-analysis. Educational Research Review, 7 (3), 198–215. http://doi.org/10.1016/j.edurev.2012.05.002
Cheung*, A. C., & Slavin, R. E. (2013). The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educational Research Review, 9, 88–113. http://doi.org/10.1016/j.edurev.2013.01.001
Cheung, A. C., & Slavin, R. E. (2015). How methodological features affect effect sizes in education. In Best Evidence Encyclopedia. Baltimore: Johns Hopkins University. http://www.bestevidence.org/word/methodological_Sept_21_2015.pdf
Childs, A., & Menter, I. (Eds.) Mobilising teacher researchers: Challenging educational inequality. London: Routledge.
Chiu*, C. W. T. (1998). Synthesizing metacognitive interventions: What training characteristics can improve reading performance? Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA, April 13–17, 1998. http://files.eric.ed.gov/fulltext/ED420844.pdf
Churches, R. (2016). Closing the gap: Test and learn. Nottingham: National College for Teaching and Leadership.
Cipriani, A., Higgins, J. P., Geddes, J. R., & Salanti, G. (2013). Conceptual and technical challenges in network meta-analysis. Annals of Internal Medicine, 159 (2), 130–137. http://dx.doi.org/10.7326/0003-4819-159-2-201307160-00008
Clark*, D. B., Tanner-Smith, E. E., & Killingsworth, S. S. (2016). Digital games, design, and learning: A systematic review and meta-analysis. Review of Educational Research, 86 (1), 79–122. http://dx.doi.org/10.3102/0034654315582065
Coe, R. (2002). It’s the effect size, stupid: What effect size is and why it is important. Paper presented at the Annual Conference of the British Educational Research Association, University of Exeter, England, September 12–14, 2002.
Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. (2004). Learning styles and pedagogy in post-16 learning: A systematic and critical review. London: Learning and Skills Research Centre.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
Coldwell, M., Greany, T., Higgins, S., Brown, C., Maxwell, B., Stiell, B., Stoll, L., Willis, B., & Burns, H. (2017). Evidence-informed teaching: An evaluation of progress in England (Research Report DFE-RR-696, July 2017). London: Department for Education. Retrieved from: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/625007/Evidence-informed_teaching_-_an_evaluation_of_progress_in_England.pdf
Comfort*, C. B. (2003). Evaluating the effectiveness of parent training to improve outcomes for young children: A meta-analytic review of the published research (Ph.D. in Applied Psychology). University of Calgary. http://dspace.ucalgary.ca/handle/1880/42284
Cooper, H. M., & Rosenthal, R. (1980). Statistical versus traditional procedures for summarizing research findings. Psychological Bulletin, 87 (3), 442. http://dx.doi.org/10.1037/0033-2909.87.3.442
Corak, M. (2013). Income inequality, equality of opportunity, and intergenerational mobility. The Journal of Economic Perspectives, 27 (3), 79–102. http://hdl.handle.net/10419/80702
Cordingley, P. (2008). Research and evidence‐informed practice: Focusing on practice and practitioners. Cambridge Journal of Education, 38 (1), 37–52. http://dx.doi.org/10.1080/03057640801889964
Cronbach, L. J., Ambron, S. R., Dornbusch, S. M., Hess, R. O., Hornik, R. C., Phillips, D. C., Walker, D. F., & Weiner, S. S. (1980). Toward reform of program evaluation: Aims, methods, and institutional arrangements. San Francisco, CA: Jossey-Bass.
Cuevas, J. (2015). Is learning styles-based instruction effective? A comprehensive analysis of recent research on learning styles. Theory and Research in Education, 13 (3), 308–333. http://dx.doi.org/10.1177/1477878515606621
Cummings, C., Laing, K., Law, J., McLaughlin, J., Papps, I., Todd, L., & Woolner, P. (2012). Can changing aspirations and attitudes impact on educational attainment? A review of interventions. York: Joseph Rowntree Foundation.
Dagenais, C., Lysenko, L., Abrami, P. C., Bernard, R. M., Ramde, J., & Janosz, M. (2012). Use of research-based information by school practitioners and determinants of use: A review of empirical research. Evidence & Policy: A Journal of Research, Debate and Practice, 8 (3), 285–309. http://dx.doi.org/10.1332/174426412X654031
D’Angelo*, C., Rutstein, D., Harris, C., Bernard, R., Borokhovski, E., & Haertel, G. (2014). Simulations for STEM learning: Systematic review and meta-analysis. Menlo Park, CA: SRI International. www.sri.com/education
Davies, F. W. J. (1973). Teaching reading in early England. London: Pitman and Sons, Ltd.
Davis*, D. S. (2010). A meta-analysis of comprehension strategy instruction for upper elementary and middle school students (Doctoral dissertation). Vanderbilt University, USA. http://etd.library.vanderbilt.edu/available/etd-06162010-100830/unrestricted/Davis_dissertation.pdf
Deaton, A., & Cartwright, N. (2016). Understanding and misunderstanding randomized controlled trials (Working Paper No. 22595). Cambridge, MA: National Bureau of Economic Research.
De Boer, H., Donker, A. S., & van der Werf, M. P. (2014). Effects of the attributes of educational interventions on students’ academic performance: A meta-analysis. Review of Educational Research, 84 (4), 509–545. http://dx.doi.org/10.3102/0034654314540006
Department for Education (2012). What is the research evidence on writing? (Research Report DFE-RR238). London: Department for Education. https://www.gov.uk/government/uploads/system/ … /DFE-RR238.pdf
DerSimonian, R., & Laird, N. (1986). Meta-analysis in clinical trials. Controlled Clinical Trials, 7 (3), 177–188. https://doi.org/10.1016/0197-2456(86)90046-2
Dignath*, C., Buettner, G., & Langfeldt, H. (2008). How can primary school students learn self-regulated learning strategies most effectively? A meta-analysis on self-regulation training programmes. Educational Research Review, 3 (2), 101–129. http://dx.doi.org/10.1016/j.edurev.2008.02.003
Dillon, J. T. (1982). Superanalysis. American Journal of Evaluation, 3 (3), 35–43.
Donker*, A. S., De Boer, H., Kostons, D., Dignath van Ewijk, C. C., & Van der Werf, M. P. C. (2014). Effectiveness of learning strategy instruction on academic performance: A meta-analysis. Educational Research Review, 11, 1–26. http://dx.doi.org/10.1016/j.edurev.2013.11.002
Duffin, J., & Simpson, A. (2000). Understanding their thinking: The tension between the cognitive and the affective. In Perspectives on adults learning mathematics (pp. 83–99). Springer Netherlands.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41 (3–4), 327–350. http://dx.doi.org/10.1007/s10464-008-9165-0
Edmonds*, M. S., Vaughn, S., Wexler, J., Reutebuch, C., Cable, A., Tackett, K. K., & Schnakenberg, J. W. (2009). A synthesis of reading interventions and effects on reading comprehension outcomes for older struggling readers. Review of Educational Research, 79 (1), 262–300. http://dx.doi.org/10.3102/0034654308325998
EEF (2017). Improving literacy in key stage two: Guidance report. London: Education Endowment Foundation.
Ehri*, L. C., Nunes, S. R., Stahl, S. A., & Willows, D. M. (2001). Systematic phonics instruction helps students learn to read: Evidence from the National Reading Panel’s meta-analysis. Review of Educational Research, 71 (3), 393–447. http://dx.doi.org/10.3102/00346543071003393
Elleman*, A. M., Lindo, E. J., Morphy, P., & Compton, D. L. (2009). The impact of vocabulary instruction on passage-level comprehension of school-age children: A meta-analysis. Journal of Research on Educational Effectiveness, 2 (1), 1–44. https://doi.org/10.1080/19345740802539200
Elliott, J. H., Turner, T., Clavisi, O., Thomas, J., Higgins, J. P., Mavergames, C., & Gruen, R. L. (2014). Living systematic reviews: An emerging opportunity to narrow the evidence-practice gap. PLoS Medicine, 11 (2), e1001603. http://dx.doi.org/10.1371/journal.pmed.1001603
Elwood, P. C., Cochrane, A. L., Burr, M. L., Sweetnam, P. M., Williams, G., Welsby, E., Hughes, S. J., & Renton, R. (1974). A randomized controlled trial of acetyl salicylic acid in the secondary prevention of mortality from myocardial infarction. British Medical Journal, 1 (5905), 436.
Epstein, J. L. (2009). School, family, and community partnerships: Your handbook for action (3rd ed.). Thousand Oaks, CA: Corwin Press.
Erlenmeyer-Kimling, L., & Jarvik, L. F. (1963). Genetics and intelligence: A review. Science, 142 (3598), 1477–1479. http://dx.doi.org/10.1126/science.142.3598.1477
Eysenck, H. J. (1952). The effects of psychotherapy: An evaluation. Journal of Consulting Psychology, 16 (5), 319–324.
Eysenck, H. J. (1978). An exercise in mega-silliness. American Psychologist, 33 (5), 517. http://dx.doi.org/10.1037/0003-066X.33.5.517.a
Fan, X., & Chen, M. (2001). Parental involvement and students’ academic achievement: A meta-analysis. Educational Psychology Review, 13 (1), 1–22. http://dx.doi.org/10.1023/A:1009048817385
Fauzan*, N. (2003). The effects of metacognitive strategies on reading comprehension: A quantitative synthesis and the empirical investigation (Doctoral dissertation). University of Durham. http://etheses.dur.ac.uk/1086/
Fisher, R. A. (1935). The design of experiments. Edinburgh: Oliver and Boyd.
Fisher, R. A. (1956). Statistical methods and scientific inference. Edinburgh: Oliver and Boyd.
Fitzgerald, J., & Shanahan, T. (2000). Reading and writing relations and their development. Educational Psychologist, 35 (1), 39–50. http://dx.doi.org/10.1207/S15326985EP3501_5
Fitz‐Gibbon, C. T. (1984). Meta‐analysis: An explication. British Educational Research Journal, 10 (2), 135–144. http://dx.doi.org/10.1080/0141192840100202
Fitz‐Gibbon, C. T. (1985). The implications of meta‐analysis for educational research. British Educational Research Journal, 11 (1), 45–49. http://dx.doi.org/10.1080/0141192850110105
Flesch, R. (1955). Why Johnny can’t read: And what you can do about it. New York: Harper & Brothers.
Francis, G. (2012). Too good to be true: Publication bias in two prominent studies from experimental psychology. Psychonomic Bulletin & Review, 19 (2), 151–156. http://dx.doi.org/10.3758/s13423-012-0227-9
Fraser, A., Paul, M., Goldberg, E., Acosta, C. J., & Leibovici, L. (2007). Typhoid fever vaccines: Systematic review and meta-analysis of randomised controlled trials. Vaccine, 25 (45), 7848–7857. http://dx.doi.org/10.1016/j.vaccine.2007.08.027
Fraser, B. J., Walberg, H. J., Welch, W. W., & Hattie, J. A. (1987). Syntheses of educational productivity research. International Journal of Educational Research, 11 (2), 147–252. http://dx.doi.org/10.1016/0883-0355(87)90035-8
Fuchs*, L. S., & Fuchs, D. (1986). Effects of systematic formative evaluation: A meta-analysis. Exceptional Children, 53 (3), 199–208. http://dx.doi.org/10.1177/001440298605300301
Fukkink*, R. G., & De Glopper, K. (1998). Effects of instruction in deriving word meaning from context: A meta-analysis. Review of Educational Research, 68 (4), 450–469. http://dx.doi.org/10.3102/00346543068004450
Furukawa, T. A., & Leucht, S. (2011). How to obtain NNT from Cohen’s d: Comparison of two methods. PloS One, 6 (4), e19070. http://dx.doi.org/10.1371/journal.pone.0019070
Galloway, A. M. (2003). Improving reading comprehension through metacognitive strategy instruction: Evaluating the evidence for the effectiveness of the reciprocal teaching procedure (Doctoral dissertation ETD collection). Lincoln, NE: University of Nebraska–Lincoln. AAI3092542. http://digitalcommons.unl.edu/dissertations/AAI3092542
Galuschka*, K., Ise, E., Krick, K., & Schulte-Körne, G. (2014). Effectiveness of treatment approaches for children and adolescents with reading disabilities: A meta-analysis of randomized controlled trials. PLoS One, 9 (2), e89900. http://dx.doi.org/10.1371/journal.pone.0089900
Garlinger*, D. K., & Frank, B. M. (1986). Teacher-student cognitive style and academic achievement: A review and a mini-meta analysis. Journal of Classroom Interaction, 21 (2), 2–8. http://www.jstor.org/stable/23869505
Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5 (10), 3–8. http://dx.doi.org/10.3102/0013189X005010003
Glass, G. V. (1977). Integrating findings: The meta-analysis of research. Review of Research in Education, 5 (1), 351–379. http://dx.doi.org/10.3102/0091732X005001351
Glass, G. V. (2000). Meta-analysis at 25. Retrieved from: http://www.gvglass.info/papers/meta25.html
Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage Publications.
Goldacre, B. (2010). Bad science: Quacks, hacks, and big pharma flacks. Toronto: McClelland & Stewart.
Goldacre, B. (2014). Bad pharma: How drug companies mislead doctors and harm patients. London: Macmillan.
Goldberg, A., Russell, M., & Cook, A. (2003). The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002. The Journal of Technology, Learning and Assessment, 2 (1). https://ejournals.bc.edu/ojs/index.php/jtla/article/view/1661
Goodwin, A. P., & Ahn, S. (2010). A meta-analysis of morphological interventions: Effects on literacy achievement of children with literacy difficulties. Annals of Dyslexia, 60 (2), 183–208. http://dx.doi.org/10.1080/10888438.2012.689791
Gorard, S. (2014). The widespread abuse of statistics by researchers: What is the problem and what is the ethical way forward? Psychology of Education Review, 38 (1), 3–10.
Gorard, S., & See, B. H. (2013). Do parental involvement interventions increase attainment? A review of the evidence. London: Nuffield Foundation. http://www.nuffieldfoundation.org/sites/default/files/files/Do_parental_involvement_interventions_increase_attainment1.pdf
Gorard, S., See, B. H., & Davies, P. (2012). The impact of attitudes and aspirations on educational attainment and participation. York: Joseph Rowntree Foundation.
Gorard, S., See, B. H., & Siddiqi, N. (2015). Philosophy for children (P4C) evaluation (Report). London: EEF.
Gough, P. B., & Tunmer, W. E. (1986). Decoding, reading, and reading disability. Remedial and Special Education, 7, 6–10. http://dx.doi.org/10.1177/074193258600700104
Graham, S., & Hebert, M. A. (2010). Writing to read: Evidence for how writing can improve reading (A Carnegie Corporation Time to Act Report). Washington, DC: Alliance for Excellent Education.
Graham, S., Hebert, M., & Harris, K. R. (2015). Formative assessment and writing. The Elementary School Journal, 115 (4), 523–547. http://dx.doi.org/10.1086/681947
Graham*, S., McKeown, D., Kiuhara, S., & Harris, K. R. (2012). A meta-analysis of writing instruction for students in the elementary grades. Journal of Educational Psychology, 104 (4), 879–896. http://dx.doi.org/10.1037/a0029185
Graham, S., & Perrin, D. (2007). A meta-analysis of writing instruction for adolescent students. Journal of Educational Psychology, 99 (3), 445. http://dx.doi.org/10.1037/0022-0663.99.3.445
Graham, S., Liu, X., Aitken, A., Ng, C., Bartlett, B., Harris, K. R., & Holzapfel, J. (2017). Effectiveness of literacy programs balancing reading and writing instruction: A meta‐analysis. Reading Research Quarterly (online early). Retrieved from: http://dx.doi.org/10.1002/rrq.194
Grant, M. J., & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information & Libraries Journal, 26 (2), 91–108. http://dx.doi.org/10.1111/j.1471-1842.2009.00848.x
Griffiths, Y. M., & Snowling, M. J. (2002). Predictors of exception word and nonword reading in dyslexic children: The severity hypothesis. Journal of Educational Psychology, 94 (1), 34–43. http://dx.doi.org/10.1037/0022-0663.94.1.34
Guthrie, J. T., McRae, A., & Klauda, S. L. (2007). Contributions of concept-oriented reading instruction to knowledge about interventions for motivations in reading. Educational Psychologist, 42 (4), 237–250. http://dx.doi.org/10.1080/00461520701621087
Guzzetti*, B. J., Snyder, T. E., Glass, G. V., & Gamas, W. S. (1993). Meta-analysis of instructional interventions from reading education and science education to promote conceptual change in science. Reading Research Quarterly, 28 (2), 116–161.
Haller*, E. P., Child, D. A., & Walberg, H. J. (1988). Can comprehension be taught? A quantitative synthesis of ‘metacognitive studies’. Educational Researcher, 17 (9), 5–8. http://dx.doi.org/10.3102/0013189X017009005
Harris, K. R., Graham, S., & Mason, L. H. (2006). Improving the writing, knowledge, and motivation of struggling young writers: Effects of self-regulated strategy development with and without peer support. American Educational Research Journal, 43 (2), 295–340. http://dx.doi.org/10.3102/00028312043002295
Hattie, J. (1992). Measuring the effects of schooling. Australian Journal of Education, 36 (1), 5–13. http://dx.doi.org/10.1177/000494419203600102
Hattie, J. A. (2008). Visible Learning. London: Routledge.
Hattie, J. A. (2011). Visible Learning for teachers. London: Routledge.
Hattie, J. (2015). The applicability of Visible Learning to higher education. Scholarship of Teaching and Learning in Psychology, 1 (1), 79. http://dx.doi.org/10.1037/stl0000021
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77 (1), 81–112. http://dx.doi.org/10.3102/003465430298487
Hedges, L. V. (1983). A random effects model for effect sizes. Psychological Bulletin, 93 (2), 388. http://dx.doi.org/10.1037/0033-2909.93.2.388
Hedges, L. V., & Olkin, I. (1980). Vote-counting methods in research synthesis. Psychological Bulletin, 88 (2), 359–369. http://dx.doi.org/10.1037/0033-2909.88.2.359
Hemsley-Brown, J. V., & Sharp, C. (2004). The use of research to improve professional practice: A systematic review of the literature. Oxford Review of Education, 29 (4), 449–470. http://dx.doi.org/10.1080/0305498032000153025
Higgins, S. (2003). Parlez-vous mathematics? In Thompson, I. (Ed.), Enhancing primary mathematics teaching and learning. Buckingham: Open University Press.
Higgins, S. E. (2013). Matching style of learning. In Hattie, J. & Anderman, E. M. (Eds.) International guide to student achievement (educational psychology handbook) (pp. 337–438). London: Routledge.
Higgins, S. (2016). Meta-synthesis and comparative meta-analysis of education research findings: Some risks and benefits. Review of Education, 4 (1), 31–53. http://dx.doi.org/10.1002/rev3.3067
Higgins, S. (2017). Room in the toolbox? The place of randomised controlled trials in educational research. In Childs, A. & Menter, I. (Eds.) Mobilising teacher researchers: Challenging educational inequality. London: Routledge.
Higgins, S., & Hall, E. (2004). Picking the strawberries out of the jam: Thinking critically about narrative reviews, systematic reviews and meta-analysis. Presented at the British Education Research Association conference, at Manchester Metropolitan University, September 2004. Retrieved from: http://www.leeds.ac.uk/educol/documents/00003835.htm
Higgins, S., & Katsipataki, M. (2015). Evidence from meta-analysis about parental involvement in education which supports their children’s learning. Journal of Children’s Services, 10 (3), 111. http://dx.doi.org/10.1108/JCS-02-2015-0009
Higgins, S., & Katsipataki, M. (2016). Communicating comparative findings from meta-analysis in educational research: Some examples and suggestions. International Journal of Research & Method in Education, 39 (3), 237–254. http://dx.doi.org/10.1080/1743727X.2016.1166486
Higgins, S., Katsipataki, M., Coleman, R., Henderson, P., Major, L. E., Coe, R., & Mason, D. (2016). The Sutton Trust-Education Endowment Foundation Teaching and Learning Toolkit: Oral language interventions. London: Education Endowment Foundation. https://educationendowmentfoundation.org.uk/toolkit/toolkit-a-z/oral-language-interventions/
Higgins, S., Katsipataki, M., Kokotsaki, D., Coleman, R., Major, L. E., & Coe, R. (2014). The Sutton Trust-Education Endowment Foundation Teaching and Learning Toolkit. London: Education Endowment Foundation. http://educationendowmentfoundation.org.uk/toolkit/
Higgins, S., Katsipataki, M., Kokotsaki, D., Coleman, R., Major, L. E., & Coe, R. (2013). The Sutton Trust-Education Endowment Foundation Teaching and Learning Toolkit: Technical Appendices. London: Education Endowment Foundation. Retrieved from: http://educationendowmentfoundation.org.uk/uploads/pdf/Technical Appendices (June_2013).pdf
Higgins, S., Kokotsaki, D., & Coe, R. (2011). Toolkit of strategies to improve learning: Summary for schools spending the Pupil Premium. London: Sutton Trust.
Higgins, S., Wall, K., Baumfield, V., Hall, E., Leat, D., Moseley, D., & Woolner, P. (2007). Learning to learn in schools phase 3 evaluation. Newcastle: Newcastle University.
Higgins*, S., Hall, E., Baumfield, V., & Moseley, D. (2005). A meta-analysis of the impact of the implementation of thinking skills approaches on pupils. In Research Evidence in Education Library. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London. http://eppi.ioe.ac.uk/cms/Default.aspx?tabid=339Google Scholar
Higgins, S., & Simpson, A. (2011). Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. By John AC Hattie: Book review. British Journal of Educational Studies, 59 (2), 197201. http://dx.doi.org/10.1080/00071005.2011.584660Google Scholar
Hill, C. J., Bloom, H. S., Black, A. R., & Lipsey, M. W. (2008). Empirical benchmarks for interpreting effect sizes in research. Child Development Perspectives, 2(3), 172177. http://dx.doi.org/10.1111/j.1750–8606.2008.00061.xGoogle Scholar
Hill*, N. E., & Tyson, D. F. (2009). Parental involvement in middle school: A meta-analytic assessment of the strategies that promote achievement. Developmental Psychology, 45 (3), 740. http://dx.doi.org/10.1037/a0015362Google Scholar
Hunt, M. (1997). How science takes stock: The story of meta-analysis. New York: Russell Sage Foundation.
Ioannidis, J. P. (2009). Integration of evidence from multiple meta-analyses: A primer on umbrella reviews, treatment networks and multiple treatments meta-analyses. Canadian Medical Association Journal, 181 (8), 488–493. http://dx.doi.org/10.1503/cmaj.081086
Ioannidis, J. P. A., & Trikalinos, T. A. (2007). The appropriateness of asymmetry tests for publication bias in meta-analyses: A large survey. Canadian Medical Association Journal, 176 (8). http://dx.doi.org/10.1503/cmaj.060410
Jacob, R., & Parkinson, J. (2015). The potential for school-based interventions that target executive function to improve academic achievement: A review. Review of Educational Research. Retrieved from: http://dx.doi.org/10.3102/0034654314561338
Jeynes*, W. (2012). A meta-analysis of the efficacy of different types of parental involvement programs for urban students. Urban Education, 47 (4), 706–742. http://dx.doi.org/10.1177/0042085912445643
Jeynes*, W. H. (2005). A meta-analysis of the relation of parental involvement to urban elementary school student academic achievement. Urban Education, 40 (3), 237–269. http://dx.doi.org/10.1177/0042085905274540
Jeynes*, W. H. (2007). The relationship between parental involvement and urban secondary school student academic achievement: A meta-analysis. Urban Education, 42 (1), 82–110. http://dx.doi.org/10.1177/0042085906293818
Jeynes*, W. H. (2008). A meta-analysis of the relationship between phonics instruction and minority elementary school student academic achievement. Education and Urban Society, 40 (2), 151–166. http://dx.doi.org/10.1177/0013124507304128
Kanadli*, S. (2016). A meta-analysis on the effect of instructional designs based on the learning styles models on academic achievement, attitude and retention. Educational Sciences: Theory and Practice, 16 (6), 2057–2086. http://dx.doi.org/10.12738/estp.2016.6.0084
Kao, G., & Thompson, J. S. (2003). Racial and ethnic stratification in educational achievement and attainment. Annual Review of Sociology, 29, 417–442. http://dx.doi.org/10.1146/annurev.soc.29.010202.100019
Katsipataki, M., & Higgins, S. (2016). What works or what’s worked? Evidence from education in the United Kingdom. Procedia-Social and Behavioral Sciences, 217, 903–909. http://dx.doi.org/10.1016/j.sbspro.2016.02.030
Kavale, K. A., & LeFever, G. B. (2007). Dunn and Dunn model of learning-style preferences: Critique of Lovelace meta-analysis. The Journal of Educational Research, 101 (2), 94–97. http://dx.doi.org/10.3200/JOER.101.2.94-98
Kavale, K., Hirshoren, A., & Forness, S. (1998). Meta-analytic validation of the Dunn-and-Dunn model of learning-style preferences: A critique of what was Dunn. Learning Disabilities Research and Practice, 13, 75–80.
Kavale*, K. A., & Forness, S. R. (1987). Substance over style: Assessing the efficacy of modality testing and teaching. Exceptional Children, 54 (3), 228–239. http://journals.sagepub.com/doi/pdf/10.1177/001440298705400305
Kazrin, A., Durac, J., & Agteros, T. (1979). Meta-meta analysis: A new method for evaluating therapy outcome. Behaviour Research and Therapy, 17 (4), 397–399. http://dx.doi.org/10.1016/0005-7967(79)90011-1
Kim*, J. S., & Quinn, D. M. (2013). The effects of summer reading on low-income children’s literacy achievement from kindergarten to grade 8: A meta-analysis of classroom and home interventions. Review of Educational Research, 83 (3), 386–431. http://dx.doi.org/10.3102/0034654313483906
Kingston*, N., & Nash, B. (2011). Formative assessment: A meta-analysis and call for research. Educational Measurement: Issues and Practice, 30 (4), 28–37. http://dx.doi.org/10.1111/j.1745-3992.2011.00220.x
Klauer*, K. J., & Phye, G. D. (2008). Inductive reasoning: A training approach. Review of Educational Research, 78 (1), 85–123. http://www.dx.doi.org/10.3102/0034654307313402
Klein, P. D. (2003). Rethinking the multiplicity of cognitive resources and curricular representations: Alternatives to ‘learning styles’ and ‘multiple intelligences’. Journal of Curriculum Studies, 35 (1), 45–81. http://dx.doi.org/10.1080/00220270210141891
Kluger*, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119 (2), 254. http://dx.doi.org/10.1037/0033-2909.119.2.254
Komarraju, M., Karau, S. J., Schmeck, R. R., & Avdic, A. (2011). The Big Five personality traits, learning styles, and academic achievement. Personality and Individual Differences, 51 (4), 472–477. https://doi.org/10.1016/j.paid.2011.04.019
Kulik*, J. A., & Fletcher, J. D. (2016). Effectiveness of intelligent tutoring systems: A meta-analytic review. Review of Educational Research, 86 (1), 42–78. https://doi.org/10.3102/0034654315581420
Kulik, C., Kulik, J., & Bangert-Drowns, R. (1990). Effectiveness of mastery learning programs: A meta-analysis. Review of Educational Research, 60 (2), 265–306. http://dx.doi.org/10.3102/00346543060002265
Kulik, J. A., & Kulik, C. L. C. (1989). The concept of meta-analysis. International Journal of Educational Research, 13 (3), 227–340. http://dx.doi.org/10.1016/0883-0355(89)90052-9
Kunkel*, A. K. (2015). The effects of computer-assisted instruction in reading: A meta-analysis (Doctoral dissertation). University of Minnesota. http://conservancy.umn.edu/handle/11299/175221
Langer, L., Tripney, J., & Gough, D. (2016). The science of using science: Researching the use of research evidence in decision-making. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.
Lather, P. (1986). Research as praxis. Harvard Educational Review, 56 (3), 257–278.
Layzer*, J. I., Goodson, B. D., Bernstein, L., & Price, C. (2001). National evaluation of family support programs. Final report volume A: The meta-analysis (ED462186). Washington, DC: Administration for Children, Youth, and Families (DHHS). http://files.eric.ed.gov/fulltext/ED462186.pdf
Leak, J., Duncan, G. J., Li, W., Magnuson, K., Schindler, H., & Yoshikawa, H. (2013). Is timing everything? How early childhood education program impacts vary by starting age, program duration and time since the end of the program (Working Paper). National Forum on Early Childhood Policy and Programs, Meta-analytic Database Project. Center on the Developing Child, Harvard University.
Lemons, C. J., Fuchs, D., Gilbert, J. K., & Fuchs, L. S. (2014). Evidence-based practices in a changing world: Reconsidering the counterfactual in education research. Educational Researcher, 43 (5), 242–252. http://dx.doi.org/10.3102/0013189X14539189
Levin, B. (2011). Mobilising research knowledge in education. London Review of Education, 9 (1), 15–26. http://dx.doi.org/10.1080/14748460.2011.550431
Lewis*, R. J., & Vosburgh, W. T. (1988). Effectiveness of kindergarten intervention programs: A meta-analysis. School Psychology International, 9 (4), 265–275. http://dx.doi.org/10.1177/0143034388094004
Li*, Q., & Ma, X. (2010). A meta-analysis of the effects of computer technology on school students’ mathematics learning. Educational Psychology Review, 22 (3), 215–243. http://dx.doi.org/10.1007/s10648-010-9125-8
Lipsey, M. W., & Wilson, D. (2000). Practical meta-analysis (applied social research methods). London: Sage Publications.
Losinski*, M., Cuenca-Carlino, Y., Zablocki, M., & Teagarden, J. (2014). Examining the efficacy of self-regulated strategy development for students with emotional or behavioral disorders: A meta-analysis. Behavioral Disorders, 40 (1), 52–67. http://dx.doi.org/10.17988/0198-7429-40.1.52
Lou*, Y., Abrami, P. C., & d’Apollonia, S. (2001). Small group and individual learning with technology: A meta-analysis. Review of Educational Research, 71 (3), 449–521. http://dx.doi.org/10.3102/00346543071003449
Lovelace, M. K. (2005). Meta-analysis of experimental research based on the Dunn and Dunn model. The Journal of Educational Research, 98 (3), 176–183. http://dx.doi.org/10.3200/JOER.98.3.176-183
Lovelace*, M. K. (2002). A meta-analysis of experimental research studies based on the Dunn and Dunn learning-style model (ProQuest Dissertations and Theses, p. 177). New York: School of Education and Human Services, St. John’s University. Retrieved from: http://search.proquest.com/docview/275698679
Luyten, H. (2006). An empirical assessment of the absolute effect of schooling: Regression/discontinuity applied to TIMSS-95. Oxford Review of Education, 32 (3), 397–429. http://dx.doi.org/10.1080/03054980600776589
Lysakowski*, R. S., & Walberg, H. J. (1982). Instructional effects of cues, participation, and corrective feedback: A quantitative synthesis. American Educational Research Journal, 19 (4), 559–578. http://dx.doi.org/10.3102/00028312019004559
Makel, M. C., & Plucker, J. A. (2014). Facts are more important than novelty: Replication in the education sciences. Educational Researcher, 43 (6), 304–316. http://dx.doi.org/10.3102/0013189X14545513
Manning*, M., Homel, R., & Smith, C. (2010). A meta-analysis of the effects of early developmental prevention programs in at-risk populations on non-health outcomes in adolescence. Children and Youth Services Review, 32 (4), 506–519. http://dx.doi.org/10.1016/j.childyouth.2009.11.003
Manz*, P. H., Hughes, C., Barnabas, E., Bracaliello, C., & Ginsburg-Block, M. (2010). A descriptive review and meta-analysis of family-based emergent literacy interventions: To what extent is the research applicable to low-income, ethnic-minority or linguistically-diverse young children? Early Childhood Research Quarterly, 25 (4), 409–431. http://dx.doi.org/10.1016/j.ecresq.2010.03.002
Marulis*, L. M., & Neuman, S. B. (2010). The effects of vocabulary intervention on young children’s word learning: A meta-analysis. Review of Educational Research, 80 (3), 300–335. http://www.dx.doi.org/10.3102/0034654310377087
Marzano, R. J. (1998). A theory-based meta-analysis of research on instruction. Aurora, CO: Mid-Continent Regional Educational Lab (McREL).
Mayer, R. E. (2011). Does styles research have useful implications for educational practice? Learning and Individual Differences, 21, 319–320. http://dx.doi.org/10.1016/j.lindif.2010.11.016
McArthur*, G., Eve, P. M., Jones, K., Banales, E., Kohnen, S., Anandakumar, T., Larsen, L., Marinus, E., Wang, H. C., & Castles, A. (2012). Phonics training for English-speaking poor readers. Cochrane Database of Systematic Reviews, 12, CD009115. http://dx.doi.org/10.1002/14651858.CD009115.pub2
McKenna, M. C., & Stahl, K. A. D. (2015). Assessment for reading instruction (3rd ed.). New York: Guilford Publications.
Means*, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. US Department of Education. http://files.eric.ed.gov/fulltext/ED505824.pdf
Melby-Lervåg, M., Lyster, S. A. H., & Hulme, C. (2012). Phonological skills and their role in learning to read: A meta-analytic review. Psychological Bulletin, 138 (2), 322. http://dx.doi.org/10.1037/a0026744
Mills, E. J., Thorlund, K., & Ioannidis, J. P. (2013). Demystifying trial networks and network meta-analysis. British Medical Journal, 346, f2914. http://dx.doi.org/10.1136/bmj.f2914
Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151 (4), 264–269. http://dx.doi.org/10.7326/0003-4819-151-4-200908180-00135
Mol, S. E., Bus, A. G., & de Jong, M. T. (2009). Interactive book reading in early education: A tool to stimulate print knowledge as well as oral language. Review of Educational Research, 79 (2), 979–1007. https://doi.org/10.3102/0034654309332561
Moran*, J., Ferdig, R. E., Pearson, P. D., Wardrop, J., & Blomeyer, R. L. (2008). Technology and reading performance in the middle-school grades: A meta-analysis with recommendations for policy and practice. Journal of Literacy Research, 40 (1), 6–58. http://www.dx.doi.org/10.1080/10862960802070483
Morphy*, P., & Graham, S. (2012). Word processing programs and weaker writers/readers: A meta-analysis of research findings. Reading and Writing, 25, 641–678. http://dx.doi.org/10.1007/s11145-010-9292-5
Moseley, D., Baumfield, V., Elliott, J., Higgins, S., Miller, J., & Newton, D. P. (2005). Frameworks for thinking: A handbook for teaching and learning. Cambridge: Cambridge University Press.
Nickerson, R. S. (2000). Null hypothesis significance testing: A review of an old and continuing controversy. Psychological Methods, 5 (2), 241–301. http://dx.doi.org/10.1037/1082-989X.5.2.241
Nye*, C., Schwartz, J., & Turner, H. (2006). Approaches to parent involvement for improving the academic performance of elementary school age children: A systematic review. Campbell Systematic Reviews, 2 (4). http://campbellcollaboration.org/lib/download/63/
O’Rourke, K. (2007). An historical perspective on meta-analysis: Dealing quantitatively with varying study results. Journal of the Royal Society of Medicine, 100 (12), 579–582. http://dx.doi.org/10.1177/0141076807100012020
Olejnik, S., & Algina, J. (2000). Measures of effect size for comparative studies: Applications, interpretations, and limitations. Contemporary Educational Psychology, 25 (3), 241–286. http://dx.doi.org/10.1006/ceps.2000.1040
Onuoha*, C. O. (2007). Meta-analysis of the effectiveness of computer-based laboratory versus traditional hands-on laboratory in college and pre-college science instructions (Order No. 3251334). ProQuest Dissertations & Theses Global. (304699656). Retrieved from: http://search.proquest.com/docview/304699656
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349 (6251), aac4716. http://dx.doi.org/10.1126/science.aac4716
Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9 (3), 106–119. http://dx.doi.org/10.1111/j.1539-6053.2009.01038.x
Pearson, K. (1904). Report on certain enteric fever inoculation statistics. The British Medical Journal, 2 (2288), 1243–1246. http://www.jstor.org/stable/20282622
Pearson, P. D., & Dole, J. A. (1987). Explicit comprehension instruction: A review of research and a new conceptualization of instruction. The Elementary School Journal, 88 (2), 151–165. http://dx.doi.org/10.1086/461530
Pearson*, P. D., Ferdig, R. E., Blomeyer Jr., R. L., & Moran, J. (2005). The effects of technology on reading performance in the middle-school grades: A meta-analysis with recommendations for policy. Oak Brook, IL: Learning Point Associates/North Central Regional Educational Laboratory (NCREL).
Peirce, C. S., & Jastrow, J. (1885). On small differences in sensation. Memoirs of the National Academy of Sciences, 3, 73–83. https://philpapers.org/archive/PEIOSD.pdf
Perry, V., Albeg, L., & Tung, C. (2012). Meta-analysis of single-case design research on self-regulatory interventions for academic performance. Journal of Behavioral Education, 21 (3), 217–229. http://www.dx.doi.org/10.1007/s10864-012-9156-y
Pratt, J. G., Smith, B. M., Rhine, J. B., Stuart, C. E., & Greenwood, J. A. (1940). Extra-sensory perception after sixty years: A critical appraisal of the research in extra-sensory perception. New York: Henry Holt and Company. http://dx.doi.org/10.1037/13598-000
Protzko, J. (2015). The environment in raising early intelligence: A meta-analysis of the fadeout effect. Intelligence, 53, 202–210. http://dx.doi.org/10.1016/j.intell.2015.10.006
Raudenbush, S. W. (1997). Statistical analysis and optimal design for cluster randomized trials. Psychological Methods, 2 (2), 173. http://dx.doi.org/10.1037/1082-989X.2.2.173
Roberts, R. W., & Coleman, J. C. (1958). An investigation of the role of visual and kinesthetic factors in reading failure. The Journal of Educational Research, 51 (6), 445–451. http://dx.doi.org/10.1080/00220671.1958.10882487
Rosen*, Y., & Salomon, G. (2007). The differential learning achievements of constructivist technology-intensive learning environments as compared with traditional ones: A meta-analysis. Journal of Educational Computing Research, 36 (1), 1–14. https://doi.org/10.2190/R8M4-7762-282U-554J
Rosenthal, R. (1966). Experimenter effects in behavioral research. New York: Appleton-Century-Crofts.
Rosenzweig, C. (2001). A meta-analysis of parenting and school success: The role of parents in promoting students’ academic performance. Paper presented at the Annual Meeting of the American Educational Research Association (Seattle, WA, April 10–14, 2001). ED452232. http://files.eric.ed.gov/fulltext/ED452232.pdf
Savage, R., Burgos, G., Wood, E., & Piquette, N. (2015). The simple view of reading as a framework for national literacy initiatives: A hierarchical model of pupil-level and classroom-level factors. British Educational Research Journal, 41 (5), 820–844. http://dx.doi.org/10.1002/berj.3177
Scammacca*, N. K., Roberts, G., Vaughn, S., & Stuebing, K. K. (2015). A meta-analysis of interventions for struggling readers in Grades 4–12: 1980–2011. Journal of Learning Disabilities, 48 (4), 369–390. http://www.dx.doi.org/10.1177/0022219413504995
Schagen, I., & Hodgen, E. (2009). How much difference does it make? Notes on understanding, using, and calculating effect sizes for schools. Wellington: NZCER. www.educationcounts.govt.nz/publications/schooling/36097/36098
Schunk, D. H. (2008). Metacognition, self-regulation, and self-regulated learning: Research recommendations. Educational Psychology Review, 20 (4), 463–467. http://www.dx.doi.org/10.1007/s10648-008-9086-3
Seidel, T., & Shavelson, R. J. (2007). Teaching effectiveness research in the past decade: The role of theory and research design in disentangling meta-analysis results. Review of Educational Research, 77 (4), 454–499. http://dx.doi.org/10.3102/0034654307310317
Sellke, T., Bayarri, M. J., & Berger, J. O. (2001). Calibration of p values for testing precise null hypotheses. The American Statistician, 55 (1), 62–71. http://dx.doi.org/10.1198/000313001300339950
Sénéchal*, M., & Young, L. (2008). The effect of family literacy interventions on children’s acquisition of reading from kindergarten to grade 3: A meta-analytic review. Review of Educational Research, 78 (4), 880–907. http://dx.doi.org/10.3102/0034654308320319
Seo*, Y. J., & Bryant, D. P. (2009). Analysis of studies of the effects of computer-assisted instruction on the mathematics performance of students with learning disabilities. Computers & Education, 53 (3), 913–928. http://dx.doi.org/10.1016/j.compedu.2009.05.002
Sheldon, S. B. (2009). Improving student outcomes with school, family, and community partnerships: A research review. In Epstein, J. L. et al. (Eds.), School, family, and community partnerships: Your handbook for action, 3rd ed. (pp. 40–56). Thousand Oaks, CA: Corwin Press.
Sherman*, K. H. (2007). A meta-analysis of interventions for phonemic awareness and phonics instruction for delayed older readers (Doctoral thesis UMI No. 3285626). University of Oregon. ProQuest Dissertations and Theses. Retrieved from: http://search.proquest.com/docview/304825094
Simpson, A. (2017). The misdirection of public policy: Comparing and combining standardized effect sizes. Journal of Education Policy, 32 (4), 450–466. https://doi.org/10.1080/02680939.2017.1280183
Sipe, T. A., & Curlette, W. L. (1996). A meta-synthesis of factors related to educational achievement: A methodological approach to summarizing and synthesizing meta-analyses. International Journal of Educational Research, 25 (7), 583–698. https://doi.org/10.1016/S0883-0355(96)80001-2
Slavin, R. E. (1986). Best-evidence synthesis: An alternative to meta-analytic and traditional reviews. Educational Researcher, 15 (9), 5–11. http://dx.doi.org/10.3102/0013189X015009005
Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31 (7), 15–21. http://dx.doi.org/10.3102/0013189X031007015
Slavin, R., & Madden, N. A. (2011). Measures inherent to treatments in program effectiveness reviews. Journal of Research on Educational Effectiveness, 4 (4), 370–380. http://dx.doi.org/10.1080/19345747.2011.558986
Slavin*, R. E., Lake, C., Davis, S., & Madden, N. A. (2011). Effective programs for struggling readers: A best-evidence synthesis. Educational Research Review, 6 (1), 1–26. http://dx.doi.org/10.1016/j.edurev.2010.07.002
Slavin, R., & Smith, D. (2009). The relationship between sample sizes and effect sizes in systematic reviews in education. Educational Evaluation and Policy Analysis, 31 (4), 500–506. http://dx.doi.org/10.3102/0162373709352369
Slemmer*, D. L. (2002). The effect of learning styles on student achievement in various hypertext, hypermedia and technology enhanced learning environments: A meta-analysis (Ph.D. dissertation, unpublished). Boise, ID: Boise State University (ProQuest Dissertations and Theses). https://www.editlib.org/p/122865/
Smith, E., & Gorard, S. (2005). They don’t give us our marks: The role of formative feedback in student progress. Assessment in Education, 12 (1), 21–38. http://dx.doi.org/10.1080/0969594042000333896
Smith, N. L. (1982). Evaluative applications of meta- and mega-analysis. American Journal of Evaluation, 34, 43–47. http://dx.doi.org/10.1177/109821408200300412
Smith, M. L., & Glass, G. V. (1977). Meta-analysis of psychotherapy outcome studies. American Psychologist, 32 (9), 752.
Snook, I., O’Neill, J., Clark, J., O’Neill, A. M., & Openshaw, R. (2009). Invisible learnings? A commentary on John Hattie’s book: Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. New Zealand Journal of Educational Studies, 44 (1), 93.
Speight, S., Callanan, M., Griggs, J., & Farias, J. (2016). Rochdale research into practice: Evaluation report and executive summary. London: EEF.
Stahl, S. A., & Miller, P. D. (1989). Whole language and language experience approaches for beginning reading: A quantitative research synthesis. Review of Educational Research, 59 (1), 87–116. http://dx.doi.org/10.3102/00346543059001087
Steenbergen-Hu*, S., & Cooper, H. (2013). A meta-analysis of the effectiveness of intelligent tutoring systems on K–12 students’ mathematical learning. Journal of Educational Psychology, 105 (4), 970–987. http://dx.doi.org/10.1037/a0032447
Stone, C. A. (1998). The metaphor of scaffolding: Its utility for the field of learning disabilities. Journal of Learning Disabilities, 31 (4), 344–364. http://dx.doi.org/10.1177/002221949803100404
Strong*, G. K., Torgerson, C. J., Torgerson, D., & Hulme, C. (2011). A systematic meta-analytic review of evidence for the effectiveness of the ‘Fast ForWord’ language intervention program. Journal of Child Psychology and Psychiatry, 52 (3), 224–235. http://www.dx.doi.org/10.1111/j.1469-7610.2010.02329.x
Susser, M. (1977). Judgment and causal inference: Criteria in epidemiologic studies. American Journal of Epidemiology, 105 (1), 1–15. http://dx.doi.org/10.1093/oxfordjournals.aje.a112349
Swanson, E., Vaughn, S., Wanzek, J., Petscher, Y., Heckert, J., Cavanaugh, C., & Tackett, K. (2011). A synthesis of read-aloud interventions on early reading outcomes among preschool through third graders at risk for reading difficulties. Journal of Learning Disabilities, 44 (3), 258–275. http://www.dx.doi.org/10.1177/0022219410378444
Sweet, M. A., & Appelbaum, M. I. (2004). Is home visiting an effective strategy? A meta-analytic review of home visiting programs for families with young children. Child Development, 75 (5), 1435–1456. http://dx.doi.org/10.1111/j.1467-8624.2004.00750.x
Tamir*, P. (1985). Meta-analysis of cognitive preferences and learning. Journal of Research in Science Teaching, 22 (1), 1–17. http://dx.doi.org/10.1002/tea.3660220101
Tenenbaum*, G., & Goldring, E. (1989). A meta-analysis of the effect of enhanced instruction: Cues, participation, reinforcement and feedback, and correctives on motor skill learning. Journal of Research and Development in Education, 22 (3), 53–64.
Terhart, E. (2011). Has John Hattie really found the Holy Grail of research on teaching? An extended review of Visible Learning. Journal of Curriculum Studies, 43 (3), 425–438.
Tingir*, S., Cavlazoglu, B., Caliskan, O., Koklu, O., & Intepe-Tingir, S. (2017). Effects of mobile devices on K–12 students’ achievement: A meta-analysis. Journal of Computer Assisted Learning (early view). Retrieved from: http://dx.doi.org/10.1111/jcal.12184
Todd, E. S., & Higgins, S. (1998). Powerlessness in professional and parent partnerships. British Journal of Sociology of Education, 19 (2), 227–236. http://dx.doi.org/10.1080/0142569980190205
Tokpah*, C. L. (2008). The effects of computer algebra systems on students’ achievement in mathematics (Order No. 3321336). ProQuest Dissertations & Theses Global. (304549974). Retrieved from: http://search.proquest.com/docview/304549974
Torgerson, D., Torgerson, C., Ainsworth, H., Buckley, H. M., Heaps, C. K., Hewitt, C., & Mitchell, N. (2014). Improving writing quality: Evaluation report and executive summary May 2014. London: EEF. http://educationendowmentfoundation.org.uk/uploads/pdf/EEF_Evaluation_Report_-_Improving_Writing_Quality_-_May_2014_v2.pdf
Torgerson*, C., & Zhu, D. (2003). A systematic review and meta-analysis of the effectiveness of ICT on literacy learning in English, 5–16. In Research Evidence in Education Library. London: EPPI-Centre, Social Science Research Unit, Institute of Education.
Torgerson*, C., Brooks, G., & Hall, J. (2006). A systematic review of the research literature on the use of phonics in the teaching of reading and spelling (DfES Research Report RR711). London: DfES Publications.
Torgerson*, C. J., & Elbourne, D. (2002). A systematic review and meta-analysis of the effectiveness of information and communication technology (ICT) on the teaching of spelling. Journal of Research in Reading, 25, 129–143. http://www.dx.doi.org/10.1111/1467-9817.00164
Tracy, B., Reid, R., & Graham, S. (2009). Teaching young students strategies for planning and drafting stories: The impact of self-regulated strategy development. The Journal of Educational Research, 102 (5), 323–332. http://dx.doi.org/10.3200/JOER.102.5.323-332
Tunnell, M. O., & Jacobs, J. S. (1989). Using ‘real’ books: Research findings on literature based reading instruction. The Reading Teacher, 42 (7), 470–477. http://www.jstor.org/stable/20200193
Tymms, P. (2004). Are standards rising in English primary schools? British Educational Research Journal, 30 (4), 477–494. http://dx.doi.org/10.1080/0141192042000237194
Umbach, B., Darch, C., & Halpin, G. (1989). Teaching reading to low performing first graders in rural schools: A comparison of two instructional approaches. Journal of Instructional Psychology, 16 (3), 112.Google Scholar
Underwood, B. J. (1957). Interference and forgetting. Psychological Review, 64 (1), 49. http://dx.doi.org/10.1037/h0044616Google Scholar
Valentine, J. C., Pigott, T. D., & Rothstein, H. R. (2010). How many studies do you need? A primer on statistical power for meta-analysis. Journal of Educational and Behavioral Statistics, 35 (2), 215247. http://dx.doi.org/10.3102/1076998609346961CrossRefGoogle Scholar
van de Ven, M., Voeten, M., Steenbeek-Planting, E. G., & Verhoeven, L. (2017). Post-primary reading fluency development: A latent change approach. Learning and Individual Differences, 55, 112. http://dx.doi.org/10.1016/j.lindif.2017.02.001Google Scholar
Van Steensel*, R., McElvany, N., Kurvers, J., & Herppich, S. (2011). How effective are family literacy programs? Results of a meta-analysis. Review of Educational Research, 81 (1), 6996. http://dx.doi.org/10.3102/0034654310388819Google Scholar
Van Voorhis, F. L., Maier, M. F., Epstein, J. L., Lloyd, C. M., & Leuong, T. (2013). The impact of family involvement on the education of children ages 3 to 8: A focus on literacy and math achievement outcomes and social-emotional skills. New York: Center on School, Family and Community Partnerships, MDRC.Google Scholar
Walberg, H. J. (1982). Educational productivity: Theory, evidence, and prospects. Australian Journal of Education, 26 (2), 115122. http://dx.doi.org/10.1177/000494418202600202Google Scholar
Wang, M. C., Haertel, G. D., & Walberg, H. J. (1993). Toward a knowledge base for school learning. Review of Educational Research, 63 (3), 249294. https://doi.org/10.3102/00346543063003249Google Scholar
Waxman, H. C., Lin, M-F., &Michko, G. M. (2002). A meta-analysis of the effectiveness of teaching and learning with technology on student outcomes. Napier, IL: Learning Point Associates.Google Scholar
Wigelsworth, M., Lendrum, A., Oldfield, J., Scott, A., ten Bokkel, I., Tate, K., & Emery, C. (2016). The impact of trial stage, developer involvement and international transferability on universal social and emotional learning programme outcomes: A meta-analysis. Cambridge Journal of Education, 46 (3), 347376. http://dx.doi.org/10.1080/0305764X.2016.1195791Google Scholar
Wiliam, D. (2010). Standardized testing and school accountability. Educational Psychologist, 45 (2), 107122. http://dx.doi.org/10.1080/00461521003703060Google Scholar
Wiliam, D., Lee, C., Harrison, C., & Black, P. (2004). Teachers developing assessment for learning: Impact on student achievement. Assessment in Education: Principles, Policy & Practice, 11 (1), 4965. http://dx.doi.org/10.1080/0969594042000208994Google Scholar
Wiseman, S. (Ed.). (1961). Examinations and English education. Manchester, UK: Manchester University Press.Google Scholar
Wouters*, P., Van Nimwegen, C., Van Oostendorp, H., & Van Der Spek, E. D. (2013). A meta-analysis of the cognitive and motivational effects of serious games. Journal of Educational Psychology, 105 (2), 249265. http://dx.doi.org/10.1037/a0031311Google Scholar
Wyse, D., & Styles, M. (2007). Synthetic phonics and the teaching of reading: The debate surrounding England’s ‘Rose Report’. Literacy, 41 (1), 35–42. http://dx.doi.org/10.1111/j.1467-9345.2007.00455.x
Xiao, Z., Kasim, A., & Higgins, S. E. (2016). Same difference? Understanding variation in the estimation of effect sizes from educational trials. International Journal of Educational Research, 77, 1–14. http://dx.doi.org/10.1016/j.ijer.2016.02.001
Yusuf, S., Peto, R., Lewis, J., Collins, R., & Sleight, P. (1985). Beta blockade during and after myocardial infarction: An overview of the randomized trials. Progress in Cardiovascular Diseases, 27 (5), 335–371. http://dx.doi.org/10.1016/S0033-0620(85)80003-7
Zheng*, L. (2016). The effectiveness of self-regulated learning scaffolds on academic performance in computer-based learning environments: A meta-analysis. Asia Pacific Education Review, 17 (2), 187–202. http://dx.doi.org/10.1007/s12564-016-9426-9
