Background: There are uncertainties regarding the optimal management of acutely symptomatic carotid stenosis (“hot carotids”). We sought to explore the approaches of stroke physicians to anti-thrombotic management, imaging, and revascularization in patients with “hot carotids”. Methods: We used a qualitative descriptive methodology to examine decision-making approaches of physicians regarding the management of hot carotids. We conducted semi-structured interviews with 22 stroke physicians from various specialties in 16 centers across 4 continents. Results: Important themes regarding anti-thrombotic management included limitations of existing clinical trial evidence, competing physician preferences, antiplatelet therapy while awaiting revascularization, and various regional differences. Timely imaging availability, breadth of information gained, and surgeon/interventionalist preferences were important themes influencing the choice of imaging modality. The choice of revascularization intervention was influenced by healthcare system factors, such as use of multidisciplinary review and operating room/angiography suite availability, and patient factors, such as age and infarct size. Many themes related to uncertainties in the management of hot carotids were also discussed. Conclusions: Our study revealed themes that are important to international stroke experts. We highlight common and divergent practices while underscoring important areas of clinical equipoise and uncertainty. Teams designing international carotid trials may wish to accommodate identified variations in practice patterns and areas of uncertainty.
Background: Chordomas are rare malignant skull-base/spine cancers with devastating neurological morbidities and mortality. Unfortunately, no reliable prognostic factors exist to guide treatment decisions. This work identifies DNA methylation-based prognostic chordoma subtypes that are detectable non-invasively in plasma. Methods: Sixty-eight tissue samples underwent DNA methylation profiling, and plasma methylomes were obtained for available paired samples. Immunohistochemical staining and publicly available methylation and gene expression data were utilized for validation. Results: Unsupervised clustering identified two prognostic tissue clusters (log-rank p = 0.0062) predicting disease-specific survival independent of clinical factors (multivariable Cox: HR = 16.5, 95% CI 2.8-96, p = 0.0018). The poorer-performing cluster showed immune-related pathway promoter hypermethylation and higher immune cell abundance within tumours, which was validated with external RNA-seq data and immunohistochemical staining. The better-performing cluster showed higher tumour cellularity. Similar clusters were seen in external DNA methylation data. Plasma methylome-based models distinguished chordomas from differential diagnoses in independent testing sets (AUROC = 0.84, 95% CI 0.52-1.00). Plasma methylomes were highly correlated with tissue-based signals for both clusters (r = 0.69 and 0.67), and leave-one-out models identified the correct cluster in all plasma cases. Conclusions: Prognostic molecular chordoma subgroups are identified, characterized, and validated for the first time. Plasma methylomes can detect and subtype chordomas, which may transform chordoma treatment with personalized approaches tailored to prognosis.
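As a hedged illustration of the leave-one-out, plasma methylome-based classification summarised above, the sketch below shows how such a workflow might look in Python; the feature matrix, labels and the random-forest classifier are placeholder assumptions, not the authors' pipeline.

```python
# Illustrative sketch only, not the authors' method. Assumes a matrix X of plasma
# methylome features and binary labels y (1 = chordoma, 0 = differential diagnosis);
# both are synthetic placeholders here.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 100))       # placeholder methylation features
y = rng.integers(0, 2, size=20)      # placeholder diagnostic labels

loo = LeaveOneOut()
scores = np.empty_like(y, dtype=float)
for train_idx, test_idx in loo.split(X):
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    # probability assigned to the held-out case
    scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]

print("AUROC:", roc_auc_score(y, scores))  # ~0.5 on random placeholder data
```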
The fossil record of treeshrews, hedgehogs, and other micromammals from the Lower Siwaliks of India is sparse. Here, we report on a new genus and species of fossil treeshrew, specimens of the hedgehog Galerix, and other micromammals from the middle Miocene (Lower Siwalik) deposits surrounding Ramnagar (Udhampur District, Jammu and Kashmir), at a fossil locality known as Dehari. The treeshrew from Dehari (Sivatupaia ramnagarensis n. gen. n. sp.) currently represents the oldest record of fossil tupaiids in the Siwaliks, extending their time range by ca. 2.5–4.0 Myr in the region. Dietary analyses suggest that the new tupaiid was likely adapted for a less mechanically challenging or more frugivorous diet compared to other extant and fossil tupaiids. The occurrence of Galerix has only been recently documented from the Indian Siwaliks and the Dehari specimens help establish the likely presence of a relatively large Siwalik Galerix species in the Ramnagar region. In addition to the new treeshrew and hedgehogs, new specimens of the rodents Kanisamys indicus, Sayimys sivalensis, and Murinae indet. from Dehari help confirm that age estimates for the Ramnagar region are equivalent to the Chinji Formation in Pakistan, most likely corresponding to the middle to upper part of the Chinji Formation.
This study aimed: to evaluate the association between coronavirus disease 2019 infection and olfactory and taste dysfunction in patients presenting to the out-patient department with influenza-like illness, who underwent reverse transcription polymerase chain reaction testing for coronavirus; and to determine the sensitivity, specificity, and positive and negative predictive values of olfactory and taste dysfunction and other symptoms in these patients.
Methods
Patients presenting with influenza-like illness to the study centre in September 2020 were included in the study. The symptoms of patients who tested positive for coronavirus on reverse transcription polymerase chain reaction testing were compared to those with negative test results.
Results
During the study period, 909 patients, aged 12–70 years, presented with influenza-like illness; of these, 316 (34.8 per cent) tested positive for coronavirus. Only olfactory and taste dysfunction were significantly more frequent in patients who tested positive for coronavirus than in those who tested negative.
Conclusion
During the pandemic, patients presenting to the out-patient department with sudden loss of sense of smell or taste may be considered as positive for coronavirus disease 2019, until proven otherwise.
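The sensitivity, specificity and predictive values referred to in the aim of this study are straightforward to compute from a 2×2 table. The sketch below is illustrative only; the counts are a hypothetical split consistent with the reported totals (909 patients, 316 positive), not the study's actual data.

```python
# Minimal sketch of the diagnostic metrics described above, from a 2x2 table.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # proportion of RT-PCR positives with the symptom
    specificity = tn / (tn + fp)   # proportion of RT-PCR negatives without it
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# e.g. loss of smell/taste as the index symptom, RT-PCR as the reference test
# (hypothetical counts summing to the reported 316 positives and 593 negatives)
sens, spec, ppv, npv = diagnostic_metrics(tp=120, fp=40, fn=196, tn=553)
print(f"Sens {sens:.2f}  Spec {spec:.2f}  PPV {ppv:.2f}  NPV {npv:.2f}")
```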
Developmental adversities early in life are associated with later psychopathology. Clustering may be a useful approach to group multiple diverse risks together and study their relation with psychopathology. We aimed to generate risk clusters of children, adolescents and young adults based on adverse environmental exposures and developmental characteristics, and to examine the association of these risk clusters with manifest psychopathology. Participants (n = 8300) between 6 and 23 years of age were recruited from seven sites in India. We administered questionnaires to elicit history of previous exposure to adverse childhood environments, family history of psychiatric disorders in first-degree relatives, and a range of antenatal and postnatal adversities. We used these variables to generate risk clusters. The Mini-International Neuropsychiatric Interview-5 was administered to evaluate manifest psychopathology. Two-step cluster analysis revealed two clusters designated as high-risk cluster (HRC) and low-risk cluster (LRC), comprising 4197 (50.5%) and 4103 (49.5%) participants, respectively. HRC had higher frequencies of family history of mental illness, antenatal and neonatal risk factors, developmental delays, history of migration, and exposure to adverse childhood experiences than LRC. There were significantly higher risks of any psychiatric disorder [Relative Risk (RR) = 2.0, 95% CI 1.8–2.3], externalizing (RR = 4.8, 95% CI 3.6–6.4) and internalizing disorders (RR = 2.6, 95% CI 2.2–2.9), and suicidality (RR = 2.3, 95% CI 1.8–2.8) in HRC. Social-environmental and developmental factors could classify Indian children, adolescents and young adults into homogeneous clusters at high or low risk of psychopathology. These biopsychosocial determinants of mental health may have practice, policy and research implications for people in low- and middle-income countries.
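For readers unfamiliar with the relative-risk estimates quoted above, the following sketch shows a standard calculation of a relative risk with a log-based 95% confidence interval; the event counts are placeholders, not the study data.

```python
# Hedged sketch of a relative-risk calculation with a log-based 95% CI.
import math

def relative_risk(a, n1, c, n2):
    """a/n1 = event rate in the exposed group (HRC), c/n2 = rate in the unexposed (LRC)."""
    rr = (a / n1) / (c / n2)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

rr, lo, hi = relative_risk(a=800, n1=4197, c=400, n2=4103)  # illustrative counts only
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```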
To compare the nutritional composition of bovine milk and several plant-based drinks with a focus on protein and essential amino acid content and to determine the ratio of essential amino acids to greenhouse gas emission.
Design:
Nutritional information on the label was extracted for semi-skimmed milk, soy, oat, almond, coconut and rice drink from the Innova database between January 2017 and March 2020 for the Netherlands, Belgium, Germany, Spain, Italy and Sweden. Protein and amino acids were measured and carbon footprint was calculated for a selection of Dutch products. Protein quality was determined by calculating the contribution to the WHO essential amino acids requirements.
Setting:
The bovine milk and plant-based drinks market in Netherlands, Belgium, Germany, Spain, Italy and Sweden.
Participants:
Semi-skimmed bovine milk and soy, oat, almond, coconut and rice drink.
Results:
Nutritional label information was collected for 399 products. Milk naturally contains many micronutrients, e.g. vitamin B2, B12 and Ca. Approximately 50 % of the regular plant-based drinks were fortified with Ca, whereas the organic plant-based drinks were mostly unfortified. Protein quantity and quality were highest in milk. Soy drink had the best protein quality to carbon footprint ratio, and milk came second.
Conclusions:
The nutrition–climate change balance presented in this study is in line with previous literature, which shows that semi-skimmed bovine milk and fortified soy drink deserve a place in a sustainable diet.
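As an illustration of the protein quality to carbon footprint ratio used in this study, the sketch below scores each drink by its mean contribution to WHO essential amino acid requirements per kg CO2-equivalent; the amino acid contents, carbon footprints and the 50 g reference protein intake are assumed placeholder values, not the study's measurements.

```python
# Illustrative sketch only: all numeric values below are hypothetical placeholders.
# WHO/FAO adult scoring pattern, mg of each essential amino acid per g of protein.
who_requirement_mg_per_g_protein = {"lysine": 45, "leucine": 59, "threonine": 23}

def eaa_contribution(eaa_mg_per_serving, requirement=who_requirement_mg_per_g_protein,
                     reference_protein_g=50):
    """Mean fraction of the WHO EAA requirement (for a 50 g reference protein intake)
    met per serving, capped at 1 for each amino acid."""
    fractions = [min(eaa_mg_per_serving[aa] / (req * reference_protein_g), 1.0)
                 for aa, req in requirement.items()]
    return sum(fractions) / len(fractions)

drinks = {  # per 250 ml serving: EAA content (mg) and kg CO2-eq (placeholder values)
    "semi-skimmed milk": ({"lysine": 700, "leucine": 850, "threonine": 400}, 0.30),
    "soy drink":         ({"lysine": 450, "leucine": 600, "threonine": 300}, 0.10),
}
for name, (eaa, co2) in drinks.items():
    print(name, round(eaa_contribution(eaa) / co2, 2), "quality units per kg CO2-eq")
```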
This meta-analysis provides a quantitative measure of the otorhinolaryngological manifestations of coronavirus disease 2019 in children.
Methods
A structured literature review was carried out using PubMed, Embase and Cochrane Central, employing pertinent search terms. The statistical analysis was performed using Stata version 14.2 software, and the analysed data were expressed as the pooled prevalence of the symptoms with 95 per cent confidence intervals.
Results
The commonest symptoms noted were cough (38 per cent (95 per cent confidence interval = 33–42; I² = 97.5 per cent)), sore throat (12 per cent (95 per cent confidence interval = 10–14; I² = 93.7 per cent)), and nasal discharge (15 per cent (95 per cent confidence interval = 12–19; I² = 96.9 per cent)). Anosmia and taste disturbances showed a pooled prevalence of 8 per cent each. Hearing loss, vertigo and hoarseness were rarely reported.
Conclusion
Cough, sore throat and nasal discharge were the commonest otorhinolaryngological symptoms in paediatric patients with coronavirus disease 2019. Compared with adults, children infrequently reported anosmia and taste disturbances.
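A pooled prevalence with a 95 per cent confidence interval and I² of the kind reported above is commonly obtained from a random-effects model on logit-transformed proportions. The sketch below (DerSimonian–Laird) is illustrative only; the per-study counts are placeholders, and the authors used Stata 14.2 rather than this code.

```python
# Hedged sketch of a random-effects pooled prevalence with I² for heterogeneity.
import numpy as np

def pooled_prevalence(events, totals):
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = events / totals
    logit = np.log(p / (1 - p))
    var = 1 / events + 1 / (totals - events)   # variance of each logit proportion
    w = 1 / var
    fixed = np.sum(w * logit) / np.sum(w)      # fixed-effect estimate
    q = np.sum(w * (logit - fixed) ** 2)       # Cochran's Q
    df = len(p) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # DerSimonian-Laird between-study variance and random-effects pooling
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1 / (var + tau2)
    mu = np.sum(w_re * logit) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    to_p = lambda x: 1 / (1 + np.exp(-x))      # back-transform to a proportion
    return to_p(mu), to_p(mu - 1.96 * se), to_p(mu + 1.96 * se), i2

prev, lo, hi, i2 = pooled_prevalence(events=[40, 55, 120], totals=[100, 160, 300])
print(f"pooled prevalence {prev:.2f} (95% CI {lo:.2f}-{hi:.2f}), I2 = {i2:.1f}%")
```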
Asteroid and cometary impacts have been considered one of the possible routes for exogenous delivery of organics to the early Earth. It is well established that amino acids can be synthesized by impact-driven shock processing of simple molecules and that amino acids can survive the extreme conditions of impact events. In the present study, we simulate impact-induced shock conditions using a shock tube that can maintain a reflected shock temperature of about 5,500 K for a time scale of about 2 ms. We performed shock processing of various combinations of amino acids, with subsequent morphological analysis carried out using a scanning electron microscope (SEM), revealing that the shock-processed amino acids form an extensive range of complex structures. These results provide evidence for the further evolution of amino acids in impact-induced shock environments, leading to the formation of complex structures and thus providing a pathway for the origin of life.
Maize is the primary staple crop cultivated during the monsoon season in eastern India. However, yield gaps are large because of multiple factors, including low adoption rates of good agronomic management practices. This study aimed to narrow the maize yield gap using diverse agronomic and varietal interventions through field experiments over 2 years (2013–2014) in the rainfed plateau region of Odisha. As a result, maize yield increased by 0.9, 0.74, and 0.17 Mg ha−1 under optimum plant population, fertilizer management, and herbicide-based weed management, respectively, over farmers’ current practices (Check). Moreover, when all three interventions were combined (‘best’ management practice), grain yields increased by 1.7 Mg ha−1 in conservation tillage and 2.2 Mg ha−1 in conventional tillage. We also observed that the combination of long-duration hybrids and best management practices (BMPs) increased grain yield by 4.0 Mg ha−1 and profitability by $888 ha−1 over farmers’ current practices. In addition, Nutrient Expert decision support tool-based fertilizer management along with BMPs increased grain yield by 1.7 Mg ha−1 and profitability by $314 ha−1 over farmers’ fertilizer practices (Check). These results suggest that the combination of maize hybrids and BMPs can improve the productivity and profitability of rainfed maize in the plateau region of Odisha. However, these entry points for intensification need to be placed in the context of varying investment requirements, input and output market conditions, and matched with farmer preferences and risk.
Yarkoni's analysis clearly articulates a number of concerns limiting the generalizability and explanatory power of psychological findings, many of which are compounded in infancy research. ManyBabies addresses these concerns via a radically collaborative, large-scale and open approach to research that is grounded in theory-building, committed to diversification, and focused on understanding sources of variation.
This study aimed to explore the effects of adjunctive minocycline treatment on inflammatory and neurogenesis markers in major depressive disorder (MDD). Serum samples were collected from a randomised, placebo-controlled 12-week clinical trial of minocycline (200 mg/day, added to treatment as usual) for adults (n = 71) experiencing MDD, to determine changes in interleukin-6 (IL-6), lipopolysaccharide binding protein (LBP) and brain-derived neurotrophic factor (BDNF). Generalised estimating equation modelling explored moderation effects of baseline markers, and exploratory analyses investigated associations between markers and clinical outcomes. There was no difference between the adjunctive minocycline and placebo groups at baseline or week 12 in the levels of IL-6 (week 12: placebo 2.06 ± 1.35 pg/ml; minocycline 1.77 ± 0.79 pg/ml; p = 0.317), LBP (week 12: placebo 3.74 ± 0.95 µg/ml; minocycline 3.93 ± 1.33 µg/ml; p = 0.525) or BDNF (week 12: placebo 24.28 ± 6.69 ng/ml; minocycline 26.56 ± 5.45 ng/ml; p = 0.161). Higher IL-6 levels at baseline were a predictor of greater clinical improvement. Exploratory analyses suggested that the change in IL-6 levels was significantly associated with anxiety symptom (HAMA; p = 0.021) and quality of life (Q-LES-Q-SF; p = 0.023) scale scores. No other clinical outcomes showed this mediation effect, nor did the other markers (LBP or BDNF) moderate clinical outcomes. There were no overall changes in IL-6, LBP or BDNF following adjunctive minocycline treatment. Exploratory analyses suggest a potential role of IL-6 in mediating anxiety symptoms in MDD. Future trials may consider enriching recruitment by identifying several markers or a panel of factors that better represent an inflammatory phenotype in MDD, with a larger sample size.
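The following sketch illustrates a generalised estimating equation model of the kind described above, with repeated IL-6 measures and a group-by-time interaction; the data frame, column names and synthetic values are assumptions, not the trial data.

```python
# Hedged sketch of a GEE analysis of repeated IL-6 measures by treatment arm.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 2),                       # two visits per subject
    "week": np.tile([0, 12], n),
    "group": np.repeat(rng.choice(["placebo", "minocycline"], n), 2),
    "il6": rng.normal(2.0, 1.0, 2 * n),                          # placeholder IL-6, pg/ml
})

# The week x group interaction tests whether the IL-6 change differs between arms.
model = smf.gee("il6 ~ week * group", groups="subject", data=df,
                family=sm.families.Gaussian(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```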
Catatonic features can appear in autism spectrum disorders (ASDs), and there can be overlap in symptoms between catatonia and ASD. The overall aim of this review is to provide evidence for the presence of catatonic features in subjects with ASD.
Methods
A systematic literature search using the Web of Science database, from inception to July 10, 2021, was conducted following the PRISMA and MOOSE guidelines and a PROSPERO-registered protocol (CRD42021248615). Twelve studies with information about catatonia and ASD were reviewed. Data from a subset were used to conduct meta-analyses of the presence of catatonia in ASD.
Results
The systematic review included 12 studies, seven of which were used for the meta-analysis, comprising 969 individuals. The mean age was 21.25 (7.5) years. Two studies (16.6%) included only children and adolescents. Across studies, 70–100% of participants were male. Our meta-analysis showed that 10.4% (95% CI 5.8–18.0) of individuals with ASD have catatonia. Motor disturbances were common in ASD subjects with catatonia. No differences were found in comorbidity. Several treatments have been used in ASD with catatonic features, including benzodiazepines, antipsychotics, and electroconvulsive therapy (ECT). The findings of the systematic review showed that ECT might help manage catatonic symptoms.
Conclusions
Different features of catatonia can exist in individuals with ASD and core symptoms of catatonia are reported in ASD. Longitudinal and longer-term studies are required to understand the relationship between catatonia and ASD, and the response of catatonic symptoms to treatment.
The aim of this study was to compare the self-reported confidence of novices in using a smartphone-enabled video otoscope, a microscope and loupes for ear examination and external ear canal procedures.
Method
Medical students (n = 29) undertook a pre-study questionnaire to ascertain their knowledge of techniques for otoscopy and aural microsuction. Participants underwent teaching on ear anatomy, examination and procedural techniques using a microscope, loupes and smartphone-enabled video otoscopes. Confidence in and preference for each modality were rated using a Likert-like questionnaire.
Results
After teaching, confidence in ear examination increased significantly for all modalities (p < 0.0001), and post-teaching confidence was highest for the smartphone-enabled video otoscope (p = 0.015). The smartphone-enabled video otoscope was also the preferred modality for all other parameters assessed, including learning anatomy or pathology (51.72 per cent) and learning microsuction (65.51 per cent).
Conclusion
Smartphone-enabled video otoscopes provide an alternative approach to ear examination and aural microsuction that can be undertaken outside of a traditional clinical setting and can be used by novices.
The optimal preoperative therapy regimen for resectable retroperitoneal sarcoma (RPS) remains unclear. This study compares the impact of preoperative radiation, chemoradiation and chemotherapy on overall survival (OS) in RPS patients.
Materials and Methods:
The National Cancer Database (NCDB) was queried for patients with non-metastatic, resectable RPS (2006–15). The primary endpoint was OS, evaluated by Kaplan–Meier method, log-rank test, Cox multivariable analysis and propensity score matching.
Results:
A total of 1,253 patients met the inclusion criteria, with 210 patients (17%) receiving chemoradiation, 850 patients (68%) receiving radiation and 193 patients (15%) receiving chemotherapy. On Cox multivariable analysis, when compared to preoperative chemoradiation, preoperative radiation was not associated with improved OS (hazard ratio [HR] 0·98, 95% CI 0·76–1·25, p = 0·84), while preoperative chemotherapy was associated with worse OS (HR 1·64, 95% CI 1·24–2·18, p < 0·001). Similar findings were observed in 199 and 128 matched pairs for preoperative radiation and chemotherapy, respectively, when compared to preoperative chemoradiation.
Conclusion:
Our study suggested an OS benefit in using preoperative chemoradiation compared to chemotherapy alone, but OS outcomes were comparable between preoperative chemoradiation and radiation alone.
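As a hedged illustration of the survival analysis reported above, the sketch below fits a Cox proportional-hazards model with chemoradiation as the reference treatment level, using the lifelines package on synthetic placeholder data rather than the NCDB cohort.

```python
# Illustrative sketch only: synthetic data, not the study cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 300
treatment = rng.choice(["chemoradiation", "radiation", "chemotherapy"], n)
df = pd.DataFrame({
    "months": rng.exponential(40, n),      # follow-up time (placeholder)
    "death": rng.integers(0, 2, n),        # event indicator (placeholder)
    "age": rng.normal(60, 10, n),
})
# One-hot encode treatment with chemoradiation as the reference level,
# mirroring the comparison described above.
dummies = pd.get_dummies(pd.Series(treatment), prefix="tx", dtype=float)
df = pd.concat([df, dummies.drop(columns="tx_chemoradiation")], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")
cph.print_summary()   # hazard ratios are exp(coef), with 95% CIs
```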
Meeting the complex demands of conservation requires a multi-skilled workforce operating in a sector that is respected and supported. Although professionalization of conservation is widely seen as desirable, there is no consistent understanding of what that entails. Here, we review whether and how eight elements of professionalization observed in other sectors are applicable to conservation: (1) a defined and respected occupation; (2) official recognition; (3) knowledge, learning, competences and standards; (4) paid employment; (5) codes of conduct and ethics; (6) individual commitment; (7) organizational capacity; and (8) professional associations. Despite significant achievements in many of these areas, overall progress is patchy, and conventional concepts of professionalization are not always a good fit for conservation. Reasons for this include the multidisciplinary nature of conservation work, the disproportionate influence of elite groups on the development and direction of the profession, and under-representation of field practitioners and of Indigenous peoples and local communities with professional-equivalent skills. We propose a more inclusive approach to professionalization that reflects the full range of practitioners in the sector and the need for increased recognition in countries and regions of high biodiversity. We offer a new definition that characterizes conservation professionals as practitioners who act as essential links between conservation action and conservation knowledge and policy, and provide seven recommendations for building a more effective, inclusive and representative profession.
To determine: whether young adults (aged 18–24) not in education, employment or training (NEET) have different psychological treatment outcomes to other young adults; any socio-demographic or treatment-related moderators of differential outcomes; and whether service-level changes are associated with better outcomes for those who are NEET.
Methods
A cohort was formed of 20 293 young adults treated with psychological therapies in eight Improving Access to Psychological Therapies services. Pre-treatment characteristics, outcomes, and moderators of differential outcomes were compared for those who were and were not NEET. Associations between outcomes and the following were assessed for those that were NEET: missing fewer sessions, attending more sessions, having a recorded diagnosis, and waiting fewer days between referral and starting treatment.
Results
Those who were NEET had worse outcomes: odds ratio (OR) [95% confidence interval (CI)] for reliable recovery = 0.68 (0.63–0.74), for deterioration = 1.41 (1.25–1.60), and for attrition = 1.31 (1.19–1.43). Ethnic minority participants that were NEET had better outcomes than those that were White and NEET. Living in deprived areas was associated with worse outcomes. The intensity of treatment (high or low) did not moderate outcomes, but having more sessions was associated with improved outcomes for those that were NEET: odds (per one-session increase) of reliable recovery = 1.10 (1.08–1.12), deterioration = 0.94 (0.91–0.98), and attrition = 0.68 (0.66–0.71).
Conclusions
Earlier treatment, supporting those that are NEET to attend sessions, and in particular, offering them more sessions before ending treatment might be effective in improving clinical outcomes. Additional support when working with White young adults that are NEET and those in more deprived areas may also be important.
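The odds ratios reported above can be obtained from a logistic regression of each outcome on NEET status and covariates. The sketch below is illustrative only; the outcome, predictors and data are hypothetical placeholders, not the study cohort.

```python
# Hedged sketch of odds-ratio estimation via logistic regression (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "reliable_recovery": rng.integers(0, 2, n),   # placeholder outcome
    "neet": rng.integers(0, 2, n),                # NEET status (placeholder)
    "sessions": rng.integers(2, 15, n),           # sessions attended (placeholder)
    "age": rng.integers(18, 25, n),
})

res = smf.logit("reliable_recovery ~ neet + sessions + age", data=df).fit()
odds_ratios = np.exp(res.params)      # OR per unit increase in each predictor
conf_int = np.exp(res.conf_int())     # 95% CIs on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
# With purely random placeholder data, the ORs will hover around 1.
```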
Using a combination of simulated data and pyrite isotopic reference materials, we have refined a methodology to obtain quantitative δ34S measurements from atom probe tomography (APT) datasets. This study builds on previous attempts to characterize relative 34S/32S ratios in gold-containing pyrite using APT. We have also improved our understanding of the artifacts inherent in laser-pulsed APT of insulators. Specifically, we find the probability of multi-hit detection events increases during the APT experiment, which can have a detrimental effect on the accuracy of the analysis. We demonstrate the use of standardized corrected time-of-flight single-hit data for our isotopic analysis. Additionally, we identify issues with the standard methods of extracting background-corrected counts from APT mass spectra. These lead to inaccurate and inconsistent isotopic analyses due to human variability in peak ranging and issues with background correction algorithms. In this study, we use the corrected time-of-flight single-hit data, an adaptive peak fitting algorithm, and an improved deconvolution algorithm to extract 34S/32S ratios from the S2+ peaks. By analyzing against a standard material, acquired under similar conditions, we have extracted δ34S values to within ±5‰ (1‰ = 1 part per thousand) of the published values of our standards.
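For reference, the δ34S values quoted above follow standard delta notation: the per-mil deviation of the sample's 34S/32S ratio from that of the standard, here a standard material acquired under similar conditions. A minimal sketch, with placeholder ratios:

```python
# Minimal sketch of delta notation; the isotope ratios below are placeholders.
def delta_34S(ratio_sample, ratio_standard):
    """Return δ34S in per mil (‰): the relative deviation of the sample's
    34S/32S ratio from the standard's, multiplied by 1000."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

print(delta_34S(ratio_sample=0.0451, ratio_standard=0.0443))  # ≈ +18 ‰
```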
Population-based surveys commonly use point-of-care (POC) methods with capillary blood samples for estimating Hb concentrations; these estimates need to be validated by comparison with reference methods using venous blood. In a cross-sectional study in 748 participants (17–86 years, 708 women, Hb: 5·1 to 18·2 g/dl) from Hyderabad, India, we validated Hb measured from a pooled capillary blood sample by a POC autoanalyser (Horiba ABX Micros 60OT, Hb-C-AA) by comparison with venous blood Hb measured by two reference methods: POC autoanalyser (Hb-V-AA) and cyanmethemoglobin method (Hb-V-CM). These comparisons also allowed estimation of blood sample-related and equipment-related differences in the Hb estimates. We also conducted a longitudinal study in 426 participants (17–21 years) to measure differences in the Hb response to iron folate (IFA) treatment by the capillary blood POC method compared with the reference methods. In the cross-sectional study, Bland–Altman analyses showed trivial differences between source of blood (Hb-C-AA and Hb-V-AA; mean difference, limits of agreement: 0·1, −0·8 to 1·0 g/dl) and between analytical methods (Hb-V-AA and Hb-V-CM; mean difference, limits of agreement: < 0·1, −1·8 to 1·8 g/dl). Cross-sectional anaemia prevalence estimated using Hb-C-AA did not differ significantly from Hb-V-CM or Hb-V-AA. In the longitudinal study, the Hb increment in response to IFA intervention was not different when using Hb-C-AA (1·6 ± 1·7 g/dl) compared with Hb-V-AA (1·7 ± 1·7 g/dl) and Hb-V-CM (1·7 ± 1·7 g/dl). The pooled capillary blood–autoanalyser method (Hb-C-AA) offers a practical and accurate way forward for POC screening of anaemia.
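The Bland–Altman comparison described above reduces to a mean difference and 95% limits of agreement (mean ± 1.96 SD of the paired differences). A minimal sketch on synthetic placeholder Hb values:

```python
# Hedged sketch of a Bland-Altman agreement analysis; the paired Hb values
# are synthetic placeholders, not the study measurements.
import numpy as np

rng = np.random.default_rng(4)
hb_capillary = rng.normal(12.5, 2.0, 200)               # Hb-C-AA, g/dl (placeholder)
hb_venous = hb_capillary + rng.normal(0.1, 0.45, 200)   # Hb-V-AA, g/dl (placeholder)

diff = hb_capillary - hb_venous
mean_diff = diff.mean()
loa = 1.96 * diff.std(ddof=1)                           # half-width of the limits of agreement
print(f"mean difference {mean_diff:.2f} g/dl, "
      f"limits of agreement {mean_diff - loa:.2f} to {mean_diff + loa:.2f} g/dl")
```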
Background: There are no recommendations regarding endovascular treatment (EVT) for patients with acute ischemic stroke (AIS) due to primary medium vessel occlusion (MeVO). The aim of this study was to examine the willingness to perform EVT among stroke physicians in patients with mild, yet personally disabling, deficits due to MeVO. Methods: In an international survey consisting of 4 cases of primary MeVOs, participants were asked whether the presence of personally disabling deficits would influence their decision-making for EVT despite the patients having low NIHSS scores. Decision rates were calculated based on physician characteristics. Clustered univariable logistic regression was performed. Results: 366 participants from 44 countries provided 2562 answers. 56.9% opted to perform EVT in scenarios in which the deficit was relevant to the patient’s profession versus 41.0% in which no information regarding patient profession was provided (RR 1.39, p < 0.001). The largest effect sizes were seen for female participants (RR 1.68, 95% CI 1.35-2.09), participants >60 years of age (RR 1.61, 95% CI 1.23-2.10), participants with more neurointervention experience (RR 1.60, 95% CI 1.24-2.06), and those who personally performed >100 EVTs per year (RR 1.63, 95% CI 1.22-2.17). Conclusions: The presence of a patient-relevant deficit in low NIHSS AIS due to MeVO is an important factor for EVT decision-making. This may have relevance for the conduct and interpretation of low NIHSS EVT randomized trials.
In May 2021, the Scientific Advisory Committee on Nutrition (SACN) published a risk assessment on lower carbohydrate diets for adults with type 2 diabetes (T2D)(1). The purpose of the report was to review the evidence on ‘low’-carbohydrate diets compared with the current UK government advice on carbohydrate intake for adults with T2D. However, since there is no agreed and widely utilised definition of a ‘low’-carbohydrate diet, comparisons in the report were between lower and higher carbohydrate diets. SACN’s remit is to assess the risks and benefits of nutrients, dietary patterns, food or food components for health by evaluating scientific evidence and to make dietary recommendations for the UK based on its assessment(2). SACN has a public health focus and only considers evidence in healthy populations unless specifically requested to do otherwise. Since the Committee does not usually make recommendations relating to clinical conditions, a joint working group (WG) was established in 2017 to consider this issue. The WG comprised members of SACN and members nominated by Diabetes UK, the British Dietetic Association, Royal College of Physicians and Royal College of General Practitioners. Representatives from NHS England and NHS Health Improvement, the National Institute for Health and Care Excellence and devolved health departments were also invited to observe the WG. The WG was jointly chaired by SACN and Diabetes UK.