Rapid tranquillisation – the parenteral administration of a sedating psychotropic – is frequently used to manage acute behavioural disturbance. Each mental health trust in England maintains its own rapid tranquillisation guideline, and these vary both in the therapeutic agents recommended and in the format in which this information is presented. Audits have identified poor adherence to rapid tranquillisation protocols; this may be due to a lack of guideline clarity allowing for personal interpretation. This service evaluation aims to determine the clarity and uniformity of the protocols outlined in mental health trust guidelines, and to analyse the outcomes of guideline testing to identify whether there is consistency between policies or whether outcomes vary depending on the trust guideline used.
Methods
Five reviewers (at differing stages of clinical training) assessed 52 guidelines: one from each mental health trust in England, plus the Maudsley and NICE guidelines. Each was tested against the same fictional scenario, which simulated a common presentation requiring rapid tranquillisation. Reviewers deduced the most appropriate therapeutic agent according to each guideline, rated the clarity of each guideline and were invited to leave comments highlighting the guideline's usability.
Results
Across the 52 guidelines, majority responses generated seven different management plans. Lorazepam was the most frequently selected therapeutic agent. Guidelines with better subjective ratings of clarity had more agreement between reviewers, but full agreement between reviewers was present for only 10 of the 52 guidelines. For 11 guidelines, no consensus was reached between reviewers. Qualitative analysis of comments identified the inclusion of past medical history, drug history and flow charts as positive sub-themes. Redundant language, contradictions and the suggestion to seek senior intervention before trialling a second agent were viewed negatively. Many guidelines did not sufficiently emphasise the need to perform an ECG before administering therapeutic agents such as haloperidol, which may cause potentially fatal arrhythmias.
Conclusion
There is no national consensus on the most appropriate rapid tranquillisation agents, with the available evidence being interpreted variously by different trusts and organisations. Poor guideline comprehensibility impacts clinician adherence and allows personal preference to influence choice of drug. Clear guidelines utilising flow charts to succinctly outline relevant doses and absolute contraindications were viewed favourably by reviewers. The findings of this project highlight to relevant stakeholders the attributes that should be implemented when improving future guidelines.
Both “all” and “none” approaches to the use of contact precautions for methicillin-resistant Staphylococcus aureus (MRSA) fail to recognize that transmission risk varies. This qualitative study assessed healthcare personnel perspectives regarding the feasibility of a risk-tailored approach to use contact precautions for MRSA more strategically in the acute care setting.
When The Viking-Age Gold and Silver of Scotland (AD 850–1100) was published in 1995, the catalogue detailed the contents of thirty-four hoards containing coins and/or ornaments and bullion (in the form of ingots and hack-silver) which could be linked to Viking activity and Norse settlement in Scotland during the 9th to 11th centuries, together with all of the known single finds. A subsequent paper (Graham-Campbell 2008) brought to attention more recent discoveries and some further (lost) antiquarian material (summarised below). In this update, it was pointed out (ibid.: 194) that the total of thirty-four hoards was in no way definitive, given the likelihood of some duplication, with [7] being recommended for deletion, as subsequently [16] and now also [21], but with the addition of a (lost) coin hoard reportedly discovered in St Kilda in the 18th century [2a], and since then a small gold hoard from Jura, found in 1869 [25a], and the Galloway (2014) Hoard [35]. Finally, renewed consideration is given to the inclusion of two further hoards, both from south-west Scotland [6a] and [30a], originally rejected as probably ‘non-Viking’. Their addition to the overall total means that the revised corpus currently consists of thirty-six hoards. For a recent survey of the use of silver in Scotland AD 75–1000, see Blackwell, Goldberg and Hunter, Scotland’s Early Silver (2017), and for Scandinavian Scotland, see Horne (2021).
Hoards: Addenda and corrigenda
[1] Port Glasgow (near), Renfrewshire (1699)
Hugh Pagan (2014) has recovered important information pertaining to this mixed hoard, augmenting our knowledge of both its provenance and contents, from letters written by (or addressed to) the Rev. Robert Wodrow (1679–1734), who at the time of its discovery was the newly appointed Librarian of the University of Glasgow. Pagan confirms that ‘a date of deposit of c. 970’ is ‘appropriate’ (ibid.: 419) and concludes that: … it can now be seen that the hoard was found ‘within a mile of Port Glasgow’ and on a ‘brae’ [‘by the falling doun of some earth’], i.e.
Already by the time our volume Vikings in Scotland: An Archaeological Survey came to fruition in 1998, it was becoming clear that even two authors could not fully encompass the range of developing evidence. Scientific endeavours, new methodologies and the explosion of environmental data, with burgeoning analysis, were beginning to dominate research agendas beyond our specialisms. During the succeeding decades, these aspects have developed into commonly applied approaches, complementing the study of antiquarian sources, place-names and historical documentation. Taken together, all these aspects provide a unique suite of interdisciplinary tools. The contents of this current – and most timely – volume highlight both the richness of the evidence and the results of a collegiate approach within our discipline as a whole.
Several different approaches are already enabling a much fuller – and potentially more accurate – understanding of the Scandinavians in Scotland. In combination with more commonly applied methods, new approaches and new scientific methodologies are already integrated, and all are providing a much wider platform for discussion. Commencing with a reassessment of accepted narratives, a number of issues can be addressed. Making use of new refinements in C14 determinations, artefactual studies (for example, Ashby on combs, this volume) and isotopic/aDNA studies, it is becoming more likely that we will be able to establish more clearly the dating of the arrival of the Vikings on our shores, as well as the nature of that arrival and interaction between native and incoming populations. The thorny issue of whether this was peaceful or violent is less commonly dictating the agenda now, being replaced with a more nuanced understanding of regional variations and continuing regimes of landscape exploitation (see, for example, Dockrill and Bond, and Macniven, this volume). The nature of this potential population replacement is informed through isotopic examination where the origin of individuals can be interrogated. The consideration of ethnic identities and their expression in newly settled areas has fascinating potential.
Major datasets ripe for reinterpretation and amplification include little-understood early settlement excavations, most notably Jarlshof in Shetland. As an oft-cited archaeological sequence of developing farmsteads, the issues with the stratigraphy and associated (or otherwise) artefact groups have cast a long shadow over the interpretation of many artefact assemblages from broadly contemporary sites. Items are uncritically considered to be securely dated in the Jarlshof sequence and are cited as datable parallels, when in fact the stratigraphical sequence is problematic.
The ‘Pagan Norse Graves of Scotland’ Research Project (PNGS) was initiated in 1995 with the award of a research grant to James Graham-Campbell (UCL) by the Leverhulme Trust. This, together with funding from the National Museums of Scotland (NMS), enabled the creation of a temporary post at NMS to catalogue and research Viking-Age gravefinds, to which Caroline Paterson (then Richardson) was duly appointed. In 1997, NMS with Historic Scotland funded a short extension for her to accession the finds from the Norwegian excavations of the Viking cemetery at Westness, Rousay, Orkney (Kaland 1993; Sellevold 1999, 2010), then newly returned from Bergen, in part intended for display in the new Museum of Scotland.
After a fallow period, PNGS was revived in 2017 thanks again to the Leverhulme Trust, with the award to Graham-Campbell of an Emeritus Fellowship, but in the meantime there were several lesser, but no less welcome, grants in support of PNGS (to be acknowledged in the final publication). The present outline, by Graham-Campbell and Paterson, includes a review of relevant recent literature on the subject by Stephen Harrison (University of Glasgow).
The point of departure for PNGS has been the catalogue of ‘Viking antiquities in Scotland’, published in 1940 by the Norwegian archaeologist Sigurd Grieg, based on a study-tour undertaken during a couple of months in 1925 (1940: 9–10). It formed the basis for general studies of the material by both Brøgger (1929, 1930) and Shetelig (1945, 1954) and is an indispensable work, but ‘the book teems with blunders’ (Thorsteinsson 1968: 164). For anyone wishing to map the distribution of pagan Norse graves in Scotland (for example, Crawford 1987, fig. 31), it is all that there has been to go on, with the result that Grieg’s inaccuracies and errors (including duplications) have inevitably been reproduced in terms of both overall numbers and individual examples of doubtful date/provenance. In the case of numbers, for example, PNGS has been able to increase Shetland’s three accepted graves to maybe thirteen (Graham-Campbell 2016, with additions), and for Orkney, including the two burial places at Pierowall, Westray and Westness, Rousay (see below), the total has reached a possible ninety-seven.
Clozapine is licensed for treatment-resistant psychosis and remains underutilised. This may be related to the stringent haematological monitoring requirements that are mandatory in most countries. We aimed to compare guidelines internationally and develop a novel Stringency Index. We hypothesised that the most stringent countries would have increased healthcare costs and reduced prescription rates.
Method
We conducted a survey of guidelines internationally. Guidelines were identified through a literature review and consultation with clinical academics. We focused on the haematological monitoring parameters, frequency, and thresholds for discontinuation and rechallenge after suspected clozapine-induced neutropenia. In addition, indicators reflecting monitoring guideline stringency were scored and visualised using a choropleth map. We developed a Stringency Index with an international panel of clozapine experts, through a modified Delphi survey. The Stringency Index was compared with health expenditure per capita and clozapine prescriptions per 100 000 persons.
Results
One hundred and two countries were included, from Europe (n = 35), Asia (n = 24), Africa (n = 20), South America (n = 11), North America (n = 7) and Oceania and Australia (n = 5). Guidelines differed in frequency of haematological monitoring and discontinuation thresholds. Overall, 5% of included countries had explicit guidelines for clozapine rechallenge and 40% explicitly prohibited clozapine rechallenge. Furthermore, 7% of included countries had modified discontinuation thresholds for benign ethnic neutropenia. None of the guidelines specified how long haematological monitoring should continue. The most stringent guidelines were in Europe, and the least stringent were in Africa and South America. There was a positive association (r = 0.43, p < 0.001) between a country's Stringency Index and healthcare expenditure per capita.
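The reported association is a standard Pearson correlation between two country-level series. As an illustration only — the stringency scores and expenditure values below are invented, not the study's data — the coefficient can be computed like this:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-country values (NOT the study's actual data):
stringency = [2, 5, 3, 8, 7, 4, 9, 1]
expenditure_per_capita = [300, 2500, 900, 4800, 3900, 1200, 5200, 250]
r = pearson_r(stringency, expenditure_per_capita)
print(round(r, 2))
```

A value of r = 0.43, as reported, would indicate a moderate positive association; significance testing would additionally require the sample size (here, 102 countries).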
Conclusions
Recommendations on how haematological function should be monitored in patients treated with clozapine vary considerably between countries. It would be useful to standardise guidelines on haematological monitoring worldwide.
Whole-genome sequencing (WGS) has traditionally been used in infection prevention to confirm or refute the presence of an outbreak after it has occurred. Due to decreasing costs of WGS, an increasing number of institutions have been utilizing WGS-based surveillance. Additionally, machine learning or statistical modeling to supplement infection prevention practice have also been used. We systematically reviewed the use of WGS surveillance and machine learning to detect and investigate outbreaks in healthcare settings.
Methods:
We performed a PubMed search using separate terms for WGS surveillance and/or machine-learning technologies for infection prevention through March 15, 2021.
Results:
Of 767 studies returned using the WGS search terms, 42 articles were included for review. Only 2 studies (4.8%) were performed in real time, and 39 (92.9%) studied only 1 pathogen. Nearly all studies (n = 41, 97.6%) found genetic relatedness between some isolates collected. Across all studies, 525 outbreaks were detected among 2,837 related isolates (average, 5.4 isolates per outbreak). Also, 35 studies (83.3%) only utilized geotemporal clustering to identify outbreak transmission routes. Of 21 studies identified using the machine-learning search terms, 4 were included for review. In each study, machine learning aided outbreak investigations by complementing methods to gather epidemiologic data and automating identification of transmission pathways.
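The per-outbreak average quoted above follows directly from the two reported totals; a one-line check:

```python
# Summary arithmetic from the review: 525 outbreaks were detected
# among 2,837 related isolates.
related_isolates = 2837
outbreaks = 525
isolates_per_outbreak = related_isolates / outbreaks
print(round(isolates_per_outbreak, 1))  # 5.4
```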
Conclusions:
WGS surveillance is an emerging method that can enhance outbreak detection. Machine learning has the potential to identify novel routes of pathogen transmission. Broader incorporation of WGS surveillance into infection prevention practice has the potential to transform the detection and control of healthcare outbreaks.
The 2020 update of the Canadian Stroke Best Practice Recommendations (CSBPR) for the Secondary Prevention of Stroke includes current evidence-based recommendations and expert opinions intended for use by clinicians across a broad range of settings. They provide guidance for the prevention of ischemic stroke recurrence through the identification and management of modifiable vascular risk factors. Recommendations address triage, diagnostic testing, lifestyle behaviors, vaping, hypertension, hyperlipidemia, diabetes, atrial fibrillation, other cardiac conditions, antiplatelet and anticoagulant therapies, and carotid and vertebral artery disease. This update of the previous 2017 guideline contains several new or revised recommendations. Recommendations regarding triage and initial assessment of acute transient ischemic attack (TIA) and minor stroke have been simplified, and selected aspects of the etiological stroke workup are revised. Updated treatment recommendations based on new evidence have been made for dual antiplatelet therapy for TIA and minor stroke; anticoagulant therapy for atrial fibrillation; embolic strokes of undetermined source; low-density lipoprotein lowering; hypertriglyceridemia; diabetes treatment; and patent foramen ovale management. A new section has been added to provide practical guidance regarding temporary interruption of antithrombotic therapy for surgical procedures. Cancer-associated ischemic stroke is addressed. A section on virtual care delivery of secondary stroke prevention services is included to highlight a shifting paradigm of care delivery made more urgent by the global pandemic. In addition, where appropriate, sex differences as they pertain to treatments have been addressed. The CSBPR include supporting materials such as implementation resources to facilitate the adoption of evidence into practice and performance measures to enable monitoring of uptake and effectiveness of recommendations.
Energy deficit is common during prolonged periods of strenuous physical activity and limited sleep, but the extent to which appetite suppression contributes is unclear. The aim of this randomised crossover study was to determine the effects of energy balance on appetite and physiological mediators of appetite during a 72-h period of high physical activity energy expenditure (about 9·6 MJ/d (2300 kcal/d)) and limited sleep designed to simulate military operations (SUSOPS). Ten men consumed an energy-balanced diet while sedentary for 1 d (REST) followed by energy-balanced (BAL) and energy-deficient (DEF) controlled diets during SUSOPS. Appetite ratings, gastric emptying time (GET) and appetite-mediating hormone concentrations were measured. Energy balance was positive during BAL (18 (sd 20) %) and negative during DEF (–43 (sd 9) %). Relative to REST, hunger, desire to eat and prospective consumption ratings were all higher during DEF (26 (sd 40) %, 56 (sd 71) %, 28 (sd 34) %, respectively) and lower during BAL (–55 (sd 25) %, −52 (sd 27) %, −54 (sd 21) %, respectively; Pcondition < 0·05). Fullness ratings did not differ from REST during DEF, but were 65 (sd 61) % higher during BAL (Pcondition < 0·05). Regression analyses predicted hunger and prospective consumption would be reduced and fullness increased if energy balance was maintained during SUSOPS, and energy deficits of ≥25 % would be required to elicit increases in appetite. Between-condition differences in GET and appetite-mediating hormones identified slowed gastric emptying, increased anorexigenic hormone concentrations and decreased fasting acylated ghrelin concentrations as potential mechanisms of appetite suppression. Findings suggest that physiological responses that suppress appetite may deter energy balance from being achieved during prolonged periods of strenuous activity and limited sleep.
Errors inherent in self-reported measures of energy intake (EI) are substantial and well documented, but correlates of misreporting remain unclear. Therefore, potential predictors of misreporting were examined. In Study One, fifty-nine individuals (BMI = 26·1 (sd 3·8) kg/m2, age = 42·7 (sd 13·6) years, females = 29) completed a 14-d stay in a residential feeding behaviour suite where eating behaviour was continuously monitored. In Study Two, 182 individuals (BMI = 25·7 (sd 3·9) kg/m2, age = 42·4 (sd 12·2) years, females = 96) completed two consecutive days in a residential feeding suite and five consecutive days at home. Misreporting was directly quantified by comparing covertly measured laboratory weighed intakes (LWI) with self-reported EI (weighed dietary record (WDR), 24-h recall, 7-d diet history, FFQ). Personal (age, sex and %body fat) and psychological traits (personality, social desirability, body image, intelligence quotient and eating behaviour) were used as predictors of misreporting. In Study One, those with lower psychoticism (P = 0·009), openness to experience (P = 0·006) and higher agreeableness (P = 0·038) reduced EI on days participants knew EI was being measured to a greater extent than on covert days. Isolated associations existed between personality traits (psychoticism and openness to experience), eating behaviour (emotional eating) and differences between the LWI and self-reported EI, but these were inconsistent between dietary assessment techniques and typically became non-significant after accounting for multiplicity of comparisons. In Study Two, sex was associated with differences between LWI and the WDR (P = 0·009), 24-h recall (P = 0·002) and diet history (P = 0·050) in the laboratory, but not home environment. Personal and psychological correlates of misreporting identified displayed no clear pattern across studies or dietary assessment techniques and had little utility in predicting misreporting.
The ‘jumping to conclusions’ (JTC) bias is associated with both psychosis and general cognition but their relationship is unclear. In this study, we set out to clarify the relationship between the JTC bias, IQ, psychosis and polygenic liability to schizophrenia and IQ.
Methods
A total of 817 first episode psychosis patients and 1294 population-based controls completed assessments of general intelligence (IQ), and JTC, and provided blood or saliva samples from which we extracted DNA and computed polygenic risk scores for IQ and schizophrenia.
Results
The estimated proportion of the total effect of case/control differences on JTC mediated by IQ was 79%. Schizophrenia polygenic risk score was non-significantly associated with a higher number of beads drawn (B = 0.47, 95% CI −0.21 to 1.16, p = 0.17), whereas IQ PRS significantly predicted the number of beads drawn (B = 0.51, 95% CI 0.25–0.76, p < 0.001) and was thus associated with reduced JTC bias. The JTC bias was more strongly associated with higher levels of psychotic-like experiences (PLEs) in controls, including after controlling for IQ (B = −1.7, 95% CI −2.8 to −0.5, p = 0.006), but did not relate to delusions in patients.
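The "79% mediated by IQ" figure is the standard proportion-mediated quantity: the indirect effect (the part of the total case/control effect on JTC that runs through IQ) divided by the total effect. As an illustration only — the effect sizes below are invented, not taken from the study — it can be sketched as:

```python
def proportion_mediated(total_effect, direct_effect):
    """Indirect effect (total minus direct) as a share of the total effect."""
    indirect = total_effect - direct_effect
    return indirect / total_effect

# Hypothetical example: a total effect of -1.00 beads drawn, of which
# -0.21 remains as a direct effect after adjusting for the mediator (IQ).
print(round(proportion_mediated(-1.00, -0.21), 2))  # 0.79
```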
Conclusions
Our findings suggest that the JTC reasoning bias in psychosis might not be a specific cognitive deficit but rather a manifestation or consequence of general cognitive impairment. In the general population, by contrast, the JTC bias is related to PLEs independently of IQ. The work has the potential to inform interventions targeting cognitive biases in early psychosis.
New δ13Ccarb and microfacies data from Hereford–Worcestershire and the West Midlands allow for a detailed examination of variations in the Homerian carbon isotope excursion (Silurian) and depositional environment within the Much Wenlock Limestone Formation of the Midland Platform (Avalonia), UK. These comparisons have been aided by a detailed sequence-stratigraphic and bentonite correlation framework. Microfacies analysis has identified regional differences in relative sea-level change and indicates an overall shallowing of the carbonate platform interior from Hereford–Worcestershire to the West Midlands. Based upon the maximum δ13Ccarb values for the lower and upper peaks of the Homerian carbon isotope excursion (CIE), the shallower depositional setting of the West Midlands is associated with values that are 0.7 ‰ and 0.8 ‰ higher than in Hereford–Worcestershire. At the scale of parasequences the effect of depositional environment upon δ13Ccarb values can also be observed, with a conspicuous offset in the position of the trough in δ13Ccarb values between the peaks of the Homerian CIE. This offset can be accounted for by differences in relative sea-level change and carbonate production rates. While such differences complicate the use of CIEs as a means of high-resolution correlation, and caution against correlations based purely upon the isotopic signature, it is clear that a careful analysis of the depositional environment can account for such differences and thereby improve the use of carbon isotopic curves as a means of correlation.
Successful scale-up of integrated primary mental healthcare requires routine monitoring of key programme performance indicators. A consensus set of mental health indicators has been proposed but evidence on their use in routine settings is lacking.
Aims
To assess the acceptability, feasibility, perceived costs and sustainability of implementing indicators relating to integrated mental health service coverage in six South Asian (India, Nepal) and sub-Saharan African countries (Ethiopia, Nigeria, South Africa, Uganda).
Method
A qualitative study using semi-structured key informant interviews (n = 128) was conducted. The ‘Performance of Routine Information Systems’ framework served as the basis for a coding framework covering three main categories related to the performance of new tools introduced to collect data on mental health indicators: (1) technical; (2) organisation; and (3) behavioural determinants.
Results
Most mental health indicators were deemed relevant and potentially useful for improving care, and therefore acceptable to end users. Exceptions were indicators on functionality, cost and severity. The simplicity of the data-capturing formats contributed to the feasibility of using forms to generate data on mental health indicators. Health workers reported increasing confidence in their capacity to record the mental health data and minimal additional cost to initiate mental health reporting. However, overstretched primary care staff and the time-consuming reporting process affected perceived sustainability.
Conclusions
Use of the newly developed, contextually appropriate mental health indicators in health facilities providing primary care services was seen largely to be feasible in the six Emerald countries, mainly because of the simplicity of the forms and continued support in the design and implementation stage. However, approaches to implementation of new forms generating data on mental health indicators need to be customised to the specific health system context of different countries. Further work is needed to identify ways to utilise mental health data to monitor and improve the quality of mental health services.
Current coverage of mental healthcare in low- and middle-income countries is very limited, not only in terms of access to services but also in terms of financial protection of individuals in need of care and treatment.
Aims
To identify the challenges, opportunities and strategies for more equitable and sustainable mental health financing in six sub-Saharan African and South Asian countries, namely Ethiopia, India, Nepal, Nigeria, South Africa and Uganda.
Method
In the context of a mental health systems research project (Emerald), a multi-methods approach was implemented consisting of three steps: a quantitative and narrative assessment of each country's disease burden profile, health system and macro-fiscal situation; in-depth interviews with expert stakeholders; and a policy analysis of sustainable financing options.
Results
Key challenges identified for sustainable mental health financing include the low level of funding accorded to mental health services, widespread inequalities in access and poverty, although opportunities exist in the form of new political interest in mental health and ongoing reforms to national insurance schemes. Inclusion of mental health within planned or nascent national health insurance schemes was identified as a key strategy for moving towards more equitable and sustainable mental health financing in all six countries.
Conclusions
Including mental health in ongoing national health insurance reforms represents the most important strategic opportunity in the six participating countries to secure enhanced service provision and financial protection for individuals and households affected by mental disorders and psychosocial disabilities.
Declaration of interest
D.C. is a staff member of the World Health Organization.
In most low- and middle-income countries (LMIC), routine mental health information is unavailable or unreliable, making monitoring of mental healthcare coverage difficult. This study aims to evaluate a new set of mental health indicators introduced in primary healthcare settings in five LMIC.
Method
A survey was conducted among primary healthcare workers (n = 272) to assess the acceptability and feasibility of eight new indicators monitoring mental healthcare needs, utilisation, quality and payments. Also, primary health facility case records (n = 583) were reviewed by trained research assistants to assess the level of completion (yes/no) for each of the indicators and subsequently the level of correctness of completion (correct/incorrect – with incorrect defined as illogical, missing or illegible information) of the indicators used by health workers. Assessments were conducted within 1 month of the introduction of the indicators, as well as 6–9 months afterwards.
Results
Across both time points and across all indicators, 78% of the measurements of indicators were complete. Among the best performing indicators (diagnosis, severity and treatment), this was significantly higher. With regards to correctness, 87% of all completed indicators were correctly completed. There was a trend towards improvement over time. Health workers' perceptions on feasibility and utility, across sites and over time, indicated a positive attitude in 81% of all measurements.
Conclusion
This study demonstrates high levels of performance and perceived utility for a set of indicators that could ultimately be used to monitor coverage of mental healthcare in primary healthcare settings in LMIC. We recommend that these indicators are incorporated into existing health information systems and adopted within the World Health Organization Mental Health Gap Action Programme implementation strategy.
There is a global drive to improve access to mental healthcare by scaling up integrated mental health into primary healthcare (PHC) systems in low- and middle-income countries (LMICs).
Aims
To investigate systems-level implications of efforts to scale-up integrated mental healthcare into PHC in districts in six LMICs.
Method
Semi-structured interviews were conducted with 121 managers and service providers. Transcribed interviews were analysed using framework analysis guided by the Consolidated Framework for Implementation Research and World Health Organization basic building blocks.
Results
Ensuring that interventions are synergistic with existing health system features and strengthening of the healthcare system building blocks to support integrated chronic care and task-sharing were identified as aiding integration efforts. The latter includes (a) strengthening governance to include technical support for integration efforts as well as multisectoral collaborations; (b) ring-fencing mental health budgets at district level; (c) a critical mass of mental health specialists to support task-sharing; (d) including key mental health indicators in the health information system; (e) psychotropic medication included on free essential drug lists and (f) enabling collaborative and community-oriented PHC-service delivery platforms and continuous quality improvement to aid service delivery challenges in implementation.
Conclusions
Scaling up integrated mental healthcare in PHC in LMICs is more complex than training general healthcare providers. Leveraging existing health system processes that are synergistic with chronic care services and strengthening healthcare system building blocks to provide a more enabling context for integration are important.