
Programmes, policies and implementation

Published online by Cambridge University Press: 08 May 2015



Type: Editorial
Copyright © The Authors 2015

This issue of Public Health Nutrition highlights two related topics: implementation and nutrition policy. In an invited commentary, van Nassau et al.(1) provide an excellent introduction to and overview of implementation research. While the foremost question in programme evaluation is most often 'did it work?', implementation research asks a different question: 'why did my programme (not) work?' Van Nassau et al. argue strongly in favour of research not only on the effect of obesity prevention programmes ('did it work?'), but also on the process of implementing them ('why (not)?').

Did it work? Why (not)?

Three studies in this issue ask the question 'did it work?' Knowing the answer is, of course, important and of interest. In the field study by Wansink and Just(2), cafeteria diners without trays did not take, eat and waste less food than diners with trays, as had been expected. The two intervention studies, one among kindergarteners in Israel(3) and the other among secondary-school students in the Netherlands(4), showed only limited effectiveness of an education and/or environmental intervention in changing behaviours and knowledge.

But of even greater interest is where to go from here: should the programmes be discarded or modified, and if modified, how? That is why the question 'why did my programme (not) work?' is so critical, whether or not the programme is an unqualified success. Information collected by Kocken et al. suggested that the education component of their programme had only limited reach (for example, only 20% of students reported completing an online lesson) and that not all schools complied with the environmental (vending machine) change component of the intervention. The authors suggest that the 'limited effects' observed in their study might have been due to poor implementation(4). The findings also suggest that next steps might involve a plan to promote better implementation of the programme. In this context, evaluating the effectiveness of a programme is only a starting point; the bigger goal is to know what to do next. Informed decisions are possible only if investigators incorporate into their studies some 'adequate and systematic measurement of the implementation process'(1).

The implementation plan

Besides the importance of process indicators, a second theme in this collection of studies is the importance of an implementation plan. In their randomized study, Johnston Molloy et al.(5) compared delivery of a pre-school nutrition and health intervention by training managers and staff v. training managers only. Their findings offer information useful for planning the implementation of a programme, and also for deciding how to assess that implementation, whether by self-report or by direct observation. The qualitative study by Blondin et al.(6) addresses a problem identified in an already implemented programme, in this case food waste in a universal free School Breakfast Program in the USA. Finally, a second invited commentary in this issue describes a programme that was developed and successfully implemented in Waikato, New Zealand and that has now been extended to all primary and intermediate schools in the Waikato region(7). Rush et al. provide a picture of a work in progress, the work being to demonstrate the effectiveness of the scaled-up programme. They also raise questions and issues that have emerged during the process of implementation: for example, how to address ethical concerns while collecting data on effect and process, and what alternative, meaningful and feasible outcome measures might be used to assess effectiveness. Together, these articles illustrate that programme implementation is an ongoing process and that an implementation plan must remain adaptable as it meets different settings within the 'real world'.

Adoption of policies and guidelines

Besides the implementation of programmes targeted at changing individual behaviours, the implementation of policies and guidelines is also potentially important for their influence on the food environment. Whether adoption of policies/guidelines has a measurable effect on individual behaviours and outcomes is addressed in a second editorial in this issue of Public Health Nutrition(8). But as Gregorič et al.(9) note, 'guidelines alone do not result in the required changes in practice'; a clear prerequisite is the uptake or adoption of those guidelines. Thus, a third theme among these studies is uptake: which institutions (schools, centres, facilities) are more likely to adopt policies/guidelines, and how can uptake be improved?

Four studies address this issue in a variety of ways. Farmer et al.(10) conducted a qualitative study to gain an understanding of the organizational characteristics and processes of two 'exemplary' child-care centres in Edmonton, Alberta, Canada that were early adopters of the Alberta Nutrition Guidelines for Children and Youth. Gregorič et al.(9) present a well-considered evaluation of the implementation of Slovenian national dietary guidelines, describing the proportion of primary schools complying with the guidelines, which aspects of the guidelines were implemented, and the characteristics of schools that were more or less successful in their implementation. Miller et al.(11) report on the extent of implementation of a state-wide policy to improve the food and drink supply in health facilities in Queensland, Australia. The fourth study, by Bell et al.(12), reports on an intervention to encourage adoption of healthy eating policies and practices in child-care services in a region of New South Wales, Australia. These studies are useful examples of how to identify and support institutions that are slow to adopt policies/guidelines, how to identify potential barriers to implementation, and how to determine which aspects of policies/guidelines are more challenging to implement. Such information is critical to efforts to 'scale up effective public health nutrition initiatives to a population level'(12), especially because, as Gregorič et al. found, implementation of guidelines tends to be 'achieved differently at distinct levels'(9).

Much of the emphasis on programmes and policies has been on whether they 'work'. But whether they work depends heavily on how well they are adopted and implemented. This is what distinguishes 'efficacy' from 'effectiveness'(13) and what transforms a programme from a subject of academic interest into a strategy with public health impact. Over 10 years ago, Glasgow et al. asserted that 'It is not enough to produce a highly efficacious intervention.' Rather, 'in terms of overall public health effect, adoption and implementation are as important as reach and efficacy'(13). Implementation research provides the necessary perspective and tools to evaluate and improve strategies, and hence to facilitate action. As such, it should be considered an essential aspect of the development of any programme or policy. If we want to see more progress from our many programmes and policies, or if we wonder why we do not see a bigger impact, then it is time to shift our focus and invest greater resources in understanding the process of implementation.

References

1. van Nassau, F, Singh, AS & van Mechelen, W (2015) Implementation evaluation of school-based obesity prevention programmes in youth; how, what and why? (Invited Commentary). Public Health Nutr 18, 1531–1534.
2. Wansink, B & Just, DR (2015) Trayless cafeterias lead diners to take less salad and relatively more dessert. Public Health Nutr 18, 1535–1536.
3. Lerner-Geva, L, Bar-Zvi, E, Levitan, G et al. (2015) An intervention for improving the lifestyle habits of kindergarten children in Israel: a cluster-randomised controlled trial investigation. Public Health Nutr 18, 1537–1544.
4. Kocken, PL, van Kesteren, NMC, Buijs, G et al. (2015) Students' beliefs and behaviour regarding low-calorie beverages, sweets or snacks: are they affected by lessons on healthy food and by changes to school vending machines? Public Health Nutr 18, 1545–1553.
5. Johnston Molloy, C, Kearney, J, Hayes, N et al. (2015) Pre-school manager training: a cost-effective tool to promote nutrition- and health-related practice improvements in the Irish full-day-care pre-school setting. Public Health Nutr 18, 1554–1564.
6. Blondin, SA, Djang, HC, Metayer, N et al. (2015) 'It's just so much waste.' A qualitative investigation of food waste in a universal free School Breakfast Program. Public Health Nutr 18, 1565–1577.
7. Rush, E, McLennan, S, Obolonkin, V et al. (2015) Beyond the randomised controlled trial and BMI – evaluation of effectiveness of through-school nutrition and physical activity programmes (Invited Commentary). Public Health Nutr 18, 1578–1581.
8. Olstad, DL & Ball, K (2015) Optimizing child-focused nutrition policies: considerations and controversies (Editorial). Public Health Nutr 18, 1528–1530.
9. Gregorič, M, Pograjc, L, Pavlovec, A et al. (2015) School nutrition guidelines: overview of the implementation and evaluation. Public Health Nutr 18, 1582–1592.
10. Farmer, AP, Nikolopoulos, H, McCargar, L et al. (2015) Organizational characteristics and processes are important in the adoption of the Alberta Nutrition Guidelines for Children and Youth in child-care centres. Public Health Nutr 18, 1593–1601.
11. Miller, J, Lee, A, Obersky, N et al. (2015) Implementation of A Better Choice Healthy Food and Drink Supply Strategy for staff and visitors in government-owned health facilities in Queensland, Australia. Public Health Nutr 18, 1602–1609.
12. Bell, LK, Hendrie, GA, Hartley, J et al. (2015) Impact of a nutrition award scheme on the food and nutrient intakes of 2- to 4-year-olds attending long day care. Public Health Nutr 18, 1610–1619.
13. Glasgow, RE, Lichtenstein, E & Marcus, AC (2003) Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health 93, 1261–1267.