The sternocleidomastoid can be used as a pedicled flap in head and neck reconstruction. It has previously been associated with high complication rates, likely due in part to the variable nature of its blood supply.
To provide clinicians with an up-to-date review of clinical outcomes of sternocleidomastoid flap surgery in head and neck reconstruction, integrated with a review of vascular anatomical studies of the sternocleidomastoid.
A literature search of the Medline and Web of Science databases was conducted. Complications were analysed for each study. The trend in success rates was analysed by date of the study.
Reported complication rates have improved over time. The preservation of two vascular pedicles rather than one may have contributed to improved outcomes.
The sternocleidomastoid flap is a versatile option for patients where prolonged free flap surgery is inappropriate. Modern vascular imaging techniques could optimise pre-operative planning.
Space Infrared Telescope for Cosmology and Astrophysics (SPICA), the cryogenic infrared space telescope recently pre-selected for a ‘Phase A’ concept study as one of the three remaining candidates for the European Space Agency's (ESA's) fifth medium-class (M5) mission, is foreseen to include a far-infrared polarimetric imager [SPICA-POL, now called B-fields with BOlometers and Polarizers (B-BOP)], which would offer a unique opportunity to resolve major issues in our understanding of the nearby, cold magnetised Universe. This paper presents an overview of the main science drivers for B-BOP, including high dynamic range polarimetric imaging of the cold interstellar medium (ISM) in both our Milky Way and nearby galaxies. Thanks to a cooled telescope, B-BOP will deliver wide-field 100–350 μm images of linearly polarised dust emission in Stokes Q and U with a resolution, signal-to-noise ratio, and both intensity and spatial dynamic ranges comparable to those achieved by Herschel images of the cold ISM in total intensity (Stokes I). The B-BOP 200 μm images will also have a factor of ~30 higher resolution than Planck polarisation data. This will make B-BOP a unique tool for characterising the statistical properties of the magnetised ISM and probing the role of magnetic fields in the formation and evolution of the interstellar web of dusty molecular filaments giving birth to most stars in our Galaxy. B-BOP will also be a powerful instrument for studying the magnetism of nearby galaxies and testing Galactic dynamo models, constraining the physics of dust grain alignment, informing the problem of the interaction of cosmic rays with molecular clouds, tracing magnetic fields in the inner layers of protoplanetary disks, and monitoring accretion bursts in embedded protostars.
To determine the baseline individual characteristics that predicted symptom recovery and functional recovery at 10 years following a first episode of psychosis.
AESOP-10 is a 10-year follow-up of an epidemiological, naturalistic population-based cohort of individuals recruited at the time of their first episode of psychosis in two areas in the UK (South East London and Nottingham). Detailed information on demographic, clinical, and social factors was examined to identify which factors predicted symptom and functional remission and recovery over the 10-year follow-up. The study included 557 individuals with a first episode of psychosis. The main study outcomes were symptom recovery and functional recovery at 10-year follow-up.
At 10 years, 46.2% (n = 140 of 303) of patients achieved symptom recovery and 40.9% (n = 117) achieved functional recovery. The strongest predictor of symptom recovery at 10 years was symptom remission at 12 weeks (adj OR 4.47; CI 2.60–7.67); followed by a diagnosis of depression with psychotic symptoms (adj OR 2.68; CI 1.02–7.05). Symptom remission at 12 weeks was also a strong predictor of functional recovery at 10 years (adj OR 2.75; CI 1.23–6.11), together with being from Nottingham study centre (adj OR 3.23; CI 1.25–8.30) and having a diagnosis of mania (adj OR 8.17; CI 1.61–41.42).
Symptom remission at 12 weeks is an important predictor of both symptom and functional recovery at 10 years, with implications for illness management. The concepts of clinical and functional recovery overlap but should be considered separately.
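The adjusted odds ratios and confidence intervals quoted above come from the study's multivariable regression models, which are not reproduced here. As a minimal, hypothetical sketch of how an (unadjusted) odds ratio and its Wald 95% confidence interval are derived from a 2×2 table — the counts below are made up for illustration and are not AESOP-10 data:

```python
import math

# Hypothetical 2x2 table (illustrative counts only, not study data):
# rows: symptom remission at 12 weeks yes/no
# columns: symptom recovery at 10 years yes/no
a, b = 40, 30   # remitted:     recovered / not recovered
c, d = 20, 60   # not remitted: recovered / not recovered

odds_ratio = (a * d) / (b * c)
# Standard error of the log odds ratio (Woolf's method)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An adjusted OR, as reported in the abstract, would instead come from a logistic regression that controls for the other baseline covariates simultaneously.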
One of the long-standing puzzles in the political behavior literature concerns immigrants' low level of political participation: after achieving comparable, and sometimes even higher, levels of socioeconomic status relative to native-born citizens, why do immigrants still participate less in politics? We argue that the distinct formative-years experiences of immigrants who moved to the United States at an older age are the key to explaining the participation gap between immigrants and the native-born population. Using the 1994–2016 Current Population Survey and their Voting and Civic Engagement Supplements as data sources, we develop a hierarchical model that simultaneously accounts for region-, country-, and individual-level variables. The results are striking. We show that immigrants who move to the United States at a young age participate in politics at a rate that is indistinguishable from the native-born population; those who migrated at an older age participate less. The fact that over 60% of the immigrant population moved to the United States as adults is a main factor that contributes to the political participation gap between immigrants and the native-born population.
To describe and compare caffeinated energy drink adverse event (AE) report/exposure call data from the US Food and Drug Administration Center for Food Safety and Applied Nutrition’s Adverse Event Reporting System (CAERS) and the American Association of Poison Control Centers’ National Poison Data System (NPDS).
Data were evaluated from US-based CAERS reports and NPDS exposure calls, including report/exposure call year, age, sex, location, single v. multiple product consumption, outcome, symptom, intentionality (NPDS only), report type, product name (CAERS only).
The analysis defined participants (cases) by the number of caffeinated energy drink products indicated in each AE report or exposure call. Single product cases included 357 from CAERS and 12 822 from NPDS; multiple product cases included 153 from CAERS and 931 from NPDS.
CAERS v. NPDS single product cases were older and more frequently indicated serious symptoms. Multiple v. single product consumers were older in both. In CAERS, unlike NPDS, most multiple product consumers were female. CAERS single v. multiple product reports cited higher proportions of life-threatening events, but less often indicated hospitalization and serious events. NPDS multiple v. single product cases involved fewer ≤5-year-olds and were more often intentional.
Despite limitations, both data sources contribute to post-market surveillance and improve understanding of public health concerns.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
Peripheral low-grade inflammation in depression is increasingly seen as a therapeutic target. We aimed to establish the prevalence of low-grade inflammation in depression, using different C-reactive protein (CRP) levels, through a systematic literature review and meta-analysis.
We searched the PubMed database from its inception to July 2018, and selected studies that assessed depression using a validated tool/scale, and allowed the calculation of the proportion of patients with low-grade inflammation (CRP >3 mg/L) or elevated CRP (>1 mg/L).
After quality assessment, 37 studies comprising 13 541 depressed patients and 155 728 controls were included. Based on the meta-analysis of 30 studies, the prevalence of low-grade inflammation (CRP >3 mg/L) in depression was 27% (95% CI 21–34%); this prevalence was not associated with sample source (inpatient, outpatient or population-based), antidepressant treatment, participant age, BMI or ethnicity. Based on the meta-analysis of 17 studies of depression and matched healthy controls, the odds ratio for low-grade inflammation in depression was 1.46 (95% CI 1.22–1.75). The prevalence of elevated CRP (>1 mg/L) in depression was 58% (95% CI 47–69%), and the meta-analytic odds ratio for elevated CRP in depression compared with controls was 1.47 (95% CI 1.18–1.82).
About a quarter of patients with depression show evidence of low-grade inflammation, and over half of patients show mildly elevated CRP levels. There are significant differences in the prevalence of low-grade inflammation between patients and matched healthy controls. These findings suggest that inflammation could be relevant to a large number of patients with depression.
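The pooled prevalence estimates above were obtained by meta-analysis across studies. As an illustrative sketch (not the authors' code), here is inverse-variance pooling of study prevalences on the logit scale — a common approach for meta-analysis of proportions — using made-up study counts:

```python
import math

# Illustrative (cases, n) pairs per study; invented numbers, not the review's data
studies = [(30, 100), (55, 160), (27, 120)]

def logit(p):
    return math.log(p / (1 - p))

weights, estimates = [], []
for cases, n in studies:
    p = cases / n
    var = 1 / cases + 1 / (n - cases)  # delta-method variance of logit(p)
    weights.append(1 / var)
    estimates.append(logit(p))

# Inverse-variance (fixed-effect) pooled estimate on the logit scale
pooled_logit = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
se = math.sqrt(1 / sum(weights))
# Back-transform the estimate and its 95% CI to the proportion scale
pooled = 1 / (1 + math.exp(-pooled_logit))
lo = 1 / (1 + math.exp(-(pooled_logit - 1.96 * se)))
hi = 1 / (1 + math.exp(-(pooled_logit + 1.96 * se)))
print(f"pooled prevalence = {pooled:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

A published meta-analysis such as this one would typically use a random-effects model to allow for between-study heterogeneity; the fixed-effect version above is the simplest case of the same weighting idea.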
India has the second largest number of people with type 2 diabetes (T2D) globally. Epidemiological evidence indicates that consumption of white rice is positively associated with T2D risk, while intake of brown rice is inversely associated. Thus, we explored the effect of substituting brown rice for white rice on T2D risk factors among adults in urban South India. A total of 166 overweight (BMI ≥ 23 kg/m2) adults aged 25–65 years were enrolled in a randomised cross-over trial in Chennai, India. Interventions were a parboiled brown rice or white rice regimen providing two ad libitum meals/d, 6 d/week for 3 months with a 2-week washout period. Primary outcomes were blood glucose, insulin, glycosylated Hb (HbA1c), insulin resistance (homeostasis model assessment of insulin resistance) and lipids. High-sensitivity C-reactive protein (hs-CRP) was a secondary outcome. We did not observe significant between-group differences for primary outcomes among all participants. However, a significant reduction in HbA1c was observed in the brown rice group among participants with the metabolic syndrome (−0·18 (se 0·08) %) relative to those without the metabolic syndrome (0·05 (se 0·05) %) (P-for-heterogeneity = 0·02). Improvements in HbA1c, total and LDL-cholesterol were observed in the brown rice group among participants with a BMI ≥ 25 kg/m2 compared with those with a BMI < 25 kg/m2 (P-for-heterogeneity < 0·05). We observed a smaller increase in hs-CRP in the brown (0·03 (sd 2·12) mg/l) compared with white rice group (0·63 (sd 2·35) mg/l) (P = 0·04). In conclusion, substituting brown rice for white rice showed a potential benefit on HbA1c among participants with the metabolic syndrome and an elevated BMI. A small benefit on inflammation was also observed.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.
There is a clear need to educate and train the clinical research workforce to conduct scientifically sound clinical research. Meeting this need requires the creation of tools to assess both an individual’s preparedness to function efficiently in the clinical research enterprise and tools to evaluate the quality and effectiveness of programs that are designed to educate and train clinical research professionals. Here we report the development and validation of a competency self-assessment entitled the Competency Index for Clinical Research Professionals, version II (CICRP-II).
CICRP-II was developed using data collected from clinical research coordinators (CRCs) participating in the “Development, Implementation and Assessment of Novel Training In Domain-Based Competencies” (DIAMOND) project at four clinical and translational science award (CTSA) hubs and partnering institutions.
An exploratory factor analysis (EFA) identified a two-factor structure: the first factor measures self-reported competence to perform Routine clinical research functions (e.g., good clinical practice regulations (GCPs)), while the second factor measures competence to perform Advanced clinical functions (e.g., global regulatory affairs). We demonstrate between-groups validity by comparing CRCs working in different research settings.
The excellent psychometric properties of CICRP-II, its ability to distinguish between experienced CRCs at research-intensive CTSA hubs and CRCs working in less-intensive community-based sites, and the simplicity of the alternative methods for scoring respondents make it a valuable tool for gauging an individual's perceived preparedness to function in the role of CRC, as well as an equally valuable tool for evaluating the value and effectiveness of clinical research education and training programs.
Young people with 22q11.2 deletion syndrome (22q11.2DS) are at high risk for neurodevelopmental disorders. Sleep problems may play a role in this risk but their prevalence, nature and links to psychopathology and cognitive function remain undescribed in this population.
Sleep problems, psychopathology, developmental coordination and cognitive function were assessed in 140 young people with 22q11.2DS (mean age = 10.1, s.d. = 2.46) and 65 unaffected sibling controls (mean age = 10.8, s.d. = 2.26). Primary carers completed questionnaires screening for the children's developmental coordination and autism spectrum disorder.
Sleep problems were identified in 60% of young people with 22q11.2DS compared to 23% of sibling controls (OR 5.00, p < 0.001). Two patterns best described sleep problems in 22q11.2DS: restless sleep and insomnia. Restless sleep was linked to increased ADHD symptoms (OR 1.16, p < 0.001) and impaired executive function (OR 0.975, p = 0.013). Both patterns were associated with elevated symptoms of anxiety disorder (restless sleep: OR 1.10, p = 0.006 and insomnia: OR 1.07, p = 0.045) and developmental coordination disorder (OR 0.968, p = 0.0023, and OR 0.955, p = 0.009). The insomnia pattern was also linked to elevated conduct disorder symptoms (OR 1.53, p = 0.020).
Clinicians and carers should be aware that sleep problems are common in 22q11.2DS and index psychiatric risk, cognitive deficits and motor coordination problems. Future studies should explore the physiology of sleep and its links with neurodevelopment in these young people.
Introduction: Elder abuse is infrequently detected in the emergency department (ED), and less than 2% of cases are reported to the proper legal authorities by ED physicians. This study aims to examine the characteristics of community-dwelling older adults who screened positive for elder abuse during home care assessments and the epidemiology of ED visits by these patients relative to other home care patients. Methods: This study utilized a population-based retrospective cohort of home care patients in Canada between April 1, 2007 and March 31, 2015. Standardized, comprehensive home care assessments were extracted from the Home Care Reporting System. A positive screen for elder abuse was defined as meeting at least one of these criteria: fearful of a caregiver; unusually poor hygiene; unexplained injuries; or neglected, abused, or mistreated. Home care assessments were linked to the National Ambulatory Care Reporting System in the regions and time periods in which population-based estimates could be obtained, to identify all ED visits within 6 months of the home care assessment. Results: A total of 30,413 of the 2,401,492 patients (1.3%) screened positive for elder abuse during a home care assessment. They were more likely to be male (40.5% versus 35.3%, p < 0.001), to have a cognitive impairment (82.9% versus 65.3%, p < 0.001), to have a higher frailty index (0.27 versus 0.22, p < 0.001) and to exhibit more depressive symptoms (depression rating scale 1 or more: 68.7% versus 42.7%, p < 0.001). Patients who screened positive for elder abuse were less likely to be independent in activities of daily living (41.9% versus 52.7%, p < 0.001) and reported having fallen more frequently (44.2% versus 35.5%, p < 0.001). Caregiver distress was associated with elder abuse (35.3% versus 18.3%, p < 0.001), but a higher number of hours caring for the patient was not. Victims of elder abuse were more likely to attend the ED for low acuity conditions (Canadian Triage and Acuity Scale (CTAS) 4 or 5).
Diagnoses at discharge from the ED were similar, with the exception of acute intoxication, which was more frequent in patients who were victims of abuse. Conclusion: Elder abuse is infrequently detected during home care assessments in community-dwelling older adults. A higher frailty index, cognitive impairment, and depressive symptoms were associated with elder abuse during home care assessments. Patients who are victims of elder abuse attend EDs more frequently for low acuity conditions, but ED diagnoses at discharge, except for acute intoxication, are similar.
Ultrasound applications are widespread, and their uses in resource-limited environments are numerous. In disasters, the use of ultrasound can help reallocate resources by guiding decisions on management and transportation priorities. These interventions can occur on-scene, at triage collection points, during transport, and at the receiving medical facility. Literature related to this specific topic is limited. However, literature regarding prehospital use of ultrasound, ultrasound in combat situations, and some articles specific to disaster medicine allude to the potential growth of ultrasound utilization in disaster response.
To evaluate the utility of point-of-care ultrasound in a disaster response based on studies involving ultrasonography in resource-limited environments.
A narrative review of MEDLINE, MEDLINE InProcess, EPub, and Embase found 20 articles for inclusion.
Experiences from past disasters, prehospital care, and combat experiences have demonstrated the value of ultrasound both as a diagnostic and interventional modality.
Current literature supports the use of ultrasound in disaster response as a real-time, portable, safe, reliable, repeatable, easy-to-use, and accurate tool. While both false positives and false negatives were reported in prehospital studies, these values correlate to accepted false positive and negative rates of standard in-hospital point-of-care ultrasound exams. Studies involving austere environments demonstrate the ability to apply ultrasound in extreme conditions and to obtain high-quality images with only modest training and real-time remote guidance. Point-of-care ultrasound shows clear potential in the triage and management of mass casualty incidents. However, as these studies are heterogeneous and observational in nature, further research is needed on how to integrate ultrasound into the response and recovery phases.
Children with congenital heart disease are at high risk for malnutrition. Standardisation of feeding protocols has shown promise in decreasing some of this risk. With little standardisation between institutions’ feeding protocols and no understanding of protocol adherence, it is important to analyse the efficacy of individual aspects of the protocols.
Adherence to and deviation from a feeding protocol in high-risk congenital heart disease patients between December 2015 and March 2017 were analysed. Associations between adherence to and deviation from the protocol and clinical outcomes were also assessed. The primary outcome was change in weight-for-age z score between time intervals.
Increased adherence to and decreased deviation from individual instructions of a feeding protocol improved patients' change in weight-for-age z score between birth and hospital discharge (p = 0.031). Secondary outcomes such as markers of clinical severity and nutritional delivery were not statistically different between groups with high or low adherence or deviation rates.
High-risk feeding protocol adherence and fewer deviations are associated with weight gain independent of their influence on nutritional delivery and caloric intake. Future studies assessing the efficacy of feeding protocols should include the measures of adherence and deviations that are not merely limited to caloric delivery and illness severity.
In the USA, western Washington (WWA) and the Alaska (AK) Interior are two regions where maritime and continental climates, high latitude and cropping systems necessitate early maturing spring wheat (Triticum aestivum L.). Both regions aim to increase the production of hard spring bread wheat for human consumption to support regional agriculture and food systems. The Nordic region of Europe has a history of breeding for early maturing spring wheat and also experiences long daylengths with mixed maritime and continental climates. Nordic wheat also carries wildtype (wt) NAM-B1, an allele associated with accelerated senescence and increased grain protein and micronutrient content, at a higher frequency than global germplasm. Time to senescence, yield, protein and mineral content were evaluated on 42 accessions of Nordic hard red spring wheat containing wt NAM-B1 over 2 years on experimental stations in WWA and the AK Interior. Significant variation was found by location and accession for time to senescence, suggesting potential parental lines for breeding programmes targeting early maturity. Additionally, multiple regression analysis showed that decreased time to senescence correlated negatively with grain yield and positively with grain protein, iron and zinc content. Breeding for early maturity in these regions will need to account for this potential trade-off in yield. Nordic wt NAM-B1 accessions with early senescence yet with yields similar to regional checks are reported. Collaboration among alternative wheat regions can aid in germplasm exchange and varietal development as shown here for the early maturing trait.
Laser–solid interactions are highly suited as a potential source of high energy X-rays for nondestructive imaging. A bright, energetic X-ray pulse can be driven from a small source, making it ideal for high resolution X-ray radiography. By limiting the lateral dimensions of the target, we are able to confine the region over which X-rays are produced, enabling imaging with enhanced resolution and contrast. Using constrained targets, we experimentally demonstrate an X-ray source that improves image quality compared to unconstrained foil targets. Modelling demonstrates that a larger sheath field envelope around the perimeter of the constrained targets increases the proportion of electron current that recirculates through the target, driving a brighter source of X-rays.
Antibiotics are overprescribed for acute respiratory tract infections (ARIs). Guidelines provide criteria to determine which patients should receive antibiotics. We assessed congruence between documentation of ARI diagnostic and treatment practices with guideline recommendations, treatment appropriateness, and outcomes.
A multicenter quality improvement evaluation was conducted in 28 Veterans Affairs facilities. We included visits for pharyngitis, rhinosinusitis, bronchitis, and upper respiratory tract infections (URI-NOS) that occurred during the 2015–2016 winter season. A manual record review identified complicated cases, which were excluded. Data were extracted for visits meeting criteria, followed by analysis of practice patterns, guideline congruence, and outcomes.
Of 5,740 visits, 4,305 met our inclusion criteria: pharyngitis (n = 558), rhinosinusitis (n = 715), bronchitis (n = 1,155), URI-NOS (n = 1,475), or mixed diagnoses (>1 ARI diagnosis) (n = 402). Antibiotics were prescribed in 68% of visits: pharyngitis (69%), rhinosinusitis (89%), bronchitis (86%), URI-NOS (37%), and mixed diagnosis (86%). Streptococcal diagnostic testing was performed in 33% of pharyngitis visits; group A Streptococcus was identified in 3% of visits. Streptococcal tests were ordered less frequently for patients who received antibiotics (28%) than those who did not receive antibiotics (44%; P < .01). Although 68% of visits for rhinosinusitis had documentation of symptoms, only 32% met diagnostic criteria for antibiotics. Overall, 39% of patients with uncomplicated ARIs received appropriate antibiotic management. The proportion of 30-day return visits for ARI care was similar for appropriate (11%) or inappropriate (10%) antibiotic management (P = .22).
Antibiotics were prescribed in most uncomplicated ARI visits, indicating substantial overuse. Practice was frequently discordant with guideline diagnostic and treatment recommendations.
Although there are extensive data on clinical psychopathology in youth with suicidal ideation, data are lacking regarding their neurocognitive function.
To characterise the cognitive profile of youth with suicidal ideation in a community sample and evaluate gender differences and pubertal status effects.
Participants (N = 6151, age 11–21 years, 54.9% females) from the Philadelphia Neurodevelopmental Cohort, a non-help-seeking community sample, underwent detailed clinical evaluation. Cognitive phenotyping included executive functioning, episodic memory, complex reasoning and social cognitive functioning. We compared participants with suicidal ideation (N = 672) and without suicidal ideation (N = 5479). Regression models were employed to evaluate differences in cognitive performance and functional level, with gender and pubertal status as independent variables. Models controlled for lifetime depression or general psychopathology, and for covariates including age and socioeconomic status.
Youth with suicidal ideation showed greater psychopathology, poorer level of function but better overall neurocognitive performance. Greater functional impairment was observed in females with suicidal ideation (suicidal ideation × gender interaction, t = 3.091, P = 0.002). Greater neurocognition was associated with suicidal ideation post-puberty (suicidal ideation × puberty interaction, t = 3.057, P = 0.002). Exploratory analyses of specific neurocognitive domains showed that suicidal ideation-associated cognitive superiority was more prominent in post-pubertal males compared with females (Cohen's d = 0.32 and d = 0.11, respectively) across all cognitive domains.
Suicidal ideation was associated with poorer functioning yet better cognitive performance, especially in post-pubertal males, as measured by a comprehensive cognitive battery. Findings point to gender and pubertal-status specificity in the relationship between suicidal ideation, cognition and function in youth.
Declaration of interest
R.B. serves on the scientific board and reports stock ownership in ‘Taliaz Health’, with no conflict of interest relevant to this work. M.A.O. receives royalties for the commercial use of the Columbia-Suicide Severity Rating Scale from the Research Foundation for Mental Hygiene. Her family owns stock in Bristol-Myers Squibb. All other authors declare no potential conflict of interest.