Over the 10 years of ‘Closing the Gap’, several interventions designed to improve outcomes for Aboriginal and Torres Strait Islander students have been trialled. In 2014 the Australian Government announced the ‘Flexible Literacy for Remote Primary Schools Programme’ (FLFRPSP), designed primarily to improve the literacy outcomes of students in remote schools with mostly Aboriginal and Torres Strait Islander students. The programme, using Direct Instruction (DI) or Explicit Direct Instruction, was extended to 2019 with more than $30 million invested. By 2017, 34 remote schools were participating in the Northern Territory, Queensland and Western Australia. This paper analyses My School data for 25 ‘very remote’ FLFRPSP schools with more than 80% Aboriginal or Torres Strait Islander students. It considers Year 3 and Year 5 NAPLAN reading results and attendance rates for participating and non-participating primary schools in the 3 years before the programme's implementation and compares them with results since. Findings show that, compared to very remote schools without the FLFRPSP, the programme has not improved students' literacy results. Attendance rates for intervention schools have declined faster than for non-intervention schools. The paper questions the ethics of policy implementation and the role of evidence as a tool for accountability.
In an article published in this journal, Guenther and Osborne (2020) use data from the reading test of the National Assessment Program for Literacy and Numeracy (NAPLAN) to evaluate the effectiveness of the Flexible Literacy for Remote Primary Schools program in its first 3 years of implementation. However, their analysis has some serious flaws, including that the ‘post-intervention’ data were actually collected from the start of the implementation period. This calls into question their conclusion that the program was ineffective.
Guenther and Osborne's (2020) article ‘Did DI do it?’ raises concerns about the outcomes of a programme designed to improve literacy for First Nations students in remote schools. A critique of that article challenges its methods and findings. In this response, the authors address the criticism.
In the journal article ‘Did DI do it? The impact of a programme designed to improve literacy for Aboriginal and Torres Strait Islander students in remote schools’, Guenther and Osborne (2020) compare schoolwide NAPLAN reading scale scores for 25 Very Remote Indigenous schools implementing Direct Instruction through the Flexible Literacy for Remote Primary Schools Program (‘Flexible Literacy’ or ‘the program’) with those for 118 Very Remote Indigenous schools not involved with the program, to assert that the program has not improved literacy outcomes. Good to Great Schools Australia (GGSA) undertook an analysis of the same school data for Reading, Writing, Spelling, and Grammar and Punctuation scores. Our findings contradict theirs. In all areas, schools participating in the program show significant growth compared with all Australian and all Very Remote Indigenous schools. In Reading, schools involved in the program from 2015 to 2017 averaged 124% growth, while the average growth for comparable ages was 19% for Australian schools and 34% for Very Remote Indigenous schools. In Grammar and Punctuation, schools involved in the program in the same period grew 180%, whilst growth for Australian schools was 15%, and for Very Remote Indigenous schools, 28%. These contrasting results illustrate the importance of evaluating growth, rather than achievement alone, to assess the impact of educational programs, particularly in the case of Very Remote Indigenous schools where achievement levels are far below Australian grade levels. Guenther and Osborne's comparison of achievement across schools, rather than measurement of growth within schools, obscures real gains and is misleading.
The authors of the article ‘Did DI do it? The impact of a programme designed to improve literacy for Aboriginal and Torres Strait Islander students in remote schools’ respond to a critique of their analysis.