The final, 1891–92 issue of Leaves of Grass (commonly called the “deathbed” edition) has declined in the opinion of critics, who see it as a negative benchmark within Whitman’s production because of its heightened conservative poetics, both thematic and stylistic. The first three editions of Leaves of Grass have instead been championed, beginning in the 1950s and continuing today. Our chapter asks what remains radical, experimental, and inspiring in the final edition, especially considering that it is still the most widely read version across the world and that it was the one with which American and international modernist writers were most familiar. Particular attention is given to the edition’s juxtaposition of traditional and innovative poems, its shifting styles, and its obsession with sound and with what remains unutterable, especially as seen in the edition’s annexes.
Optimal transition from child to adult services involves continuity, joint care, planning meetings and information transfer; commissioners and service providers therefore need data on how many people require such services. Although attention-deficit hyperactivity disorder (ADHD) frequently persists into adulthood, evidence on these transitions is limited.
Aims
To estimate the national incidence of young people taking medication for ADHD who require and complete transition to adult services, and to describe the proportion who experienced optimal transition.
Method
Surveillance over 12 months using the British Paediatric Surveillance Unit and Child and Adolescent Psychiatry Surveillance System, including baseline notification and follow-up questionnaires.
Results
Questionnaire response was 79% at baseline and 82% at follow-up. For those aged 17–19, the incidence rate of transition need (range adjusted for non-response) was 202–511 per 100 000 people aged 17–19 per year, with successful transition in 38–96 per 100 000 people aged 17–19 per year. Eligible young people with ADHD were mostly male (77%) and most had a comorbid condition (62%). Half were referred to specialist adult ADHD services and 25% to general adult mental health services; 64% had their referral accepted but only 22% attended a first appointment. Only 6% met optimal transition criteria.
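For readers unfamiliar with rates expressed per 100 000 per year, the short Python sketch below works through the basic arithmetic and one generic way a non-response adjustment can turn a single estimate into a range. The counts are invented and the study's actual adjustment method is not described in this abstract, so this is an illustration only, not a reproduction of the analysis.

```python
# Illustrative arithmetic for an incidence rate "per 100 000 per year" with a
# simple non-response adjustment. All counts are hypothetical; the study's
# actual adjustment method is not given in the abstract and may differ.
cases_reported = 150          # notified cases needing transition (hypothetical)
population_17_19 = 75_000     # denominator population aged 17-19 (hypothetical)
response_rate = 0.79          # baseline questionnaire response (from abstract)

crude_rate = cases_reported / population_17_19 * 100_000
# Lower bound: assume non-responders had no additional cases to report.
lower = crude_rate
# Upper bound: assume non-responders had cases at the same rate as responders.
upper = crude_rate / response_rate

print(f"crude {crude_rate:.0f}, adjusted range {lower:.0f}-{upper:.0f} "
      f"per 100 000 per year")
```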
Conclusions
As the inclusion criteria required participants to be on medication, these estimates represent the lower limit of transition need. Two critical points were apparent: referral acceptance and first-appointment attendance. The low rate of successful transition and limited guideline adherence indicate a significant need for commissioners and service providers to improve service transition experiences.
The archaeological site of Saruq al-Hadid, Dubai, United Arab Emirates, presents a long sequence of persistent temporary human occupation on the northern edge of the Rub’ al-Khali desert. The site is located in active dune fields, and evidence for human activity is stratified within a deep sequence of natural dune deposits that reflect complex taphonomic processes of deposition, erosion and reworking. This study presents the results of a program of radiocarbon (14C) and thermoluminescence dating of deposits from Saruq al-Hadid, allied with studies of material remains, which are amalgamated with the results of earlier absolute dating studies to provide a robust chronology for the use of the site from the Bronze Age to the Islamic period. The results of the dating program allow the various expressions of human activity at the site—ranging from subsistence activities such as hunting and herding, to multi-community ritual activities and large-scale metallurgical extraction—to be better situated chronologically, and thus in relation to current debates regarding the development of late prehistoric and early historic societies in southeastern Arabia.
Introduction: Depending on the time and day of initial Emergency Department (ED) presentation, some patients may be required to return to the ED the following day for an ultrasound examination. Return visits for ultrasound may be time- and resource-intensive for both patients and the ED. Qualitative experience suggests that a proportion of return ultrasounds could be performed at a non-ED facility. Our objective was to undertake a retrospective audit of return-for-ultrasound usage, patterns and outcomes at two academic EDs. Methods: A retrospective review of all adult patients returning to the ED for ultrasound at both LHSC ED sites in 2016 was undertaken. Each chart was independently reviewed by two emergency medicine consultants. Charts were assessed for day and time of initial presentation and return, type of ultrasound ordered, and length of ED stay on the initial presentation and the return visit. Reviewers also considered opinion-based questions, including the urgency of the diagnostic clarification required, whether symptoms were still present on return, and whether any medical or surgical treatment or follow-up was arranged based on the ultrasound results. Agreement between reviewers was assessed. Results: After eliminating charts for which the return visit was not for a scheduled ultrasound examination, 328 patient charts were reviewed. Of these patients, 63% were female and the median [IQR] age was 40 years [27–56]. Abdomen/pelvis examinations represented 50% of the ultrasounds, renal 24% and venous Doppler 15.9%. Symptoms were still present and documented in 79% of cases; 22% of cases required a medical intervention and 9% an immediate surgical intervention. 11% of patients were admitted to hospital on their return visit. Outpatient follow-up based on ultrasound results was initiated in 29% of cases. Median [IQR] combined length of stay was 479.5 minutes [358.5–621.75]. Agreement between reviewers for the opinion-based questions was poor (63%–96%). Conclusion: Ideally, formal ultrasound should be available on a 24-hour basis for ED patients in order to avoid return visits. A proportion of return-for-ultrasound examinations do not result in any significant change in treatment. Emergency departments should consider developing pathways to avoid return visits for follow-up ultrasound when possible. The low incidence of surgical treatment among those returning for ultrasound suggests that this population could be served in a non-hospital setting. Further research is required to support this conclusion.
Introduction: Trauma and injury account for a significant share of the population's burden of disease, yet limited research has evaluated trauma bypass protocols. The objective of this study was to assess the impact and effectiveness of a newly introduced prehospital field trauma triage (FTT) standard, which allows paramedics to bypass a closer hospital and transport directly to a trauma centre (TC) provided transport time is within 30 minutes. Methods: We conducted a 12-month, multi-centred health record review of paramedic call reports and emergency department health records following implementation of the four-step FTT standard (step 1: vital signs and level of consciousness; step 2: anatomical injury; step 3: mechanism; step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported urgently to hospital who met one of the four steps of the FTT standard, making them eligible for bypass consideration. We developed and piloted a standardized data collection tool and obtained consensus on all data definitions. The primary outcome was the rate of appropriate triage to a TC, defined as any of the following: injury severity score ≥12, admission to an intensive care unit, non-orthopedic operation, or death. We report descriptive and univariate analyses where appropriate. Results: 570 adult patients were included, with the following characteristics: mean age 48.8 years; male 68.9%; attended by an Advanced Care Paramedic 71.8%; mechanisms of injury: motor vehicle collision 20.2%, falls 29.6%, stab wounds 10.5%; median initial GCS 14; mean initial BP 132; prehospital fluid administered 26.8%; prehospital intubation 3.5%; transported to a TC 74.6%. Of those transported to a TC, 308 (72.5%) had bypassed a closer hospital before TC arrival. Of those who bypassed a closer hospital, 136 (44.2%) were determined to be appropriately triaged to a TC. Bypassed patients more often met step 1 or step 2 of the standard (186, 66.9%) than step 3 or step 4 (122, 39.6%). Appropriate triage to a TC occurred in 104 (55.9%) patients who met step 1 or 2 and 32 (26.2%) patients who met step 3 or 4 of the FTT standard. Conclusion: The FTT standard can identify patients who should be bypassed and transported to a TC; however, this comes at the cost of potentially burdening the system, given the standard's poor sensitivity. More work is needed to develop an FTT standard that assists paramedics in appropriately identifying patients who require a trauma centre.
Age at sexual debut is known to have implications for future sexual behaviours and health outcomes, including HIV infection, early pregnancy and maternal mortality, but it may also influence educational outcomes. Longitudinal data on schooling and sexual behaviour from a demographic surveillance site in Karonga district, northern Malawi, were analysed for 3153 respondents aged 12 to 25 years to examine the association between sexual debut and primary school dropout, and the role of prior school performance. Time to dropout was modelled using the Fine and Gray survival model to account for the competing event of primary school completion. To deal with the time-varying nature of age at sexual debut and school performance, models were fitted using landmark analyses. Sexual debut was associated with a five-fold increase in the rate of subsequent dropout for girls and a two-fold increase for boys (adjusted hazard ratio [aHR] 5.27, CI 4.22–6.57, and 2.19, CI 1.77–2.70, respectively). Among girls who were sexually active by age 16, only 16% ultimately completed primary schooling, compared with 70% of those aged 18 or older at sexual debut. Prior to sexual debut, girls had primary completion levels similar to those of boys. The association between sexual debut and school dropout could not be explained by prior poor school performance: the effect of sexual debut on dropout was as strong among those who were not behind in school as among those who were overage for their school grade. Girls who were sexually active were more likely to repeat a grade, with no such effect seen for boys. Pathways to dropout are complex and may differ for boys and girls. Interventions are needed to improve school progression so that children complete primary school before sexual debut, and to improve sex education and contraception provision.
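To illustrate the competing-risks setup described above (dropout as the event of interest, primary school completion as a competing event), the Python sketch below computes a cumulative incidence function from hypothetical follow-up data. It is not the authors' Fine and Gray model, which regresses on the subdistribution hazard and would normally be fitted with dedicated survival software; the times, event codes and estimator here are invented for illustration.

```python
import numpy as np

# Hypothetical follow-up data: years to event and event code
# (0 = censored, 1 = primary school dropout, 2 = completed primary school)
time  = np.array([1.0, 1.5, 2.0, 2.0, 2.5, 3.0, 3.5, 4.0, 4.0, 5.0])
event = np.array([1,   0,   2,   1,   1,   0,   2,   1,   0,   2])

def cumulative_incidence(time, event, cause=1):
    """Aalen-Johansen estimate of the cumulative incidence of one cause."""
    n_at_risk = len(time)
    surv = 1.0   # all-cause Kaplan-Meier survival just before each event time
    cif = 0.0
    curve = []
    for t in np.unique(time):                      # unique times, ascending
        at_t = time == t
        d_cause = np.sum(at_t & (event == cause))  # events of interest at t
        d_any = np.sum(at_t & (event != 0))        # events of either cause at t
        cif += surv * d_cause / n_at_risk
        surv *= 1.0 - d_any / n_at_risk
        n_at_risk -= np.sum(at_t)                  # drop events and censorings
        curve.append((float(t), cif))
    return curve

for t, c in cumulative_incidence(time, event, cause=1):
    print(f"t = {t:.1f} years  CIF(dropout) = {c:.3f}")
```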
Recommending nitrofurantoin to treat uncomplicated cystitis was associated with an increase in nitrofurantoin use from 3.53 to 4.01 prescriptions per 1,000 outpatient visits, but nitrofurantoin resistance in E. coli isolates remained stable at 2%. Concomitant levofloxacin resistance was a significant risk factor for nitrofurantoin resistance in E. coli isolates (odds ratio [OR], 2.72; 95% confidence interval [CI], 1.04–7.17).
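As a worked example of the kind of odds-ratio calculation reported above, the Python sketch below computes an OR and a Wald 95% confidence interval from a 2x2 table. The counts are hypothetical, since the study's underlying table is not given in the abstract.

```python
# Hypothetical worked example of an odds ratio with a Wald 95% CI.
# The counts below are invented for illustration only.
import math

# rows: levofloxacin-resistant vs -susceptible isolates
# cols: nitrofurantoin-resistant vs -susceptible isolates
a, b = 6, 94     # levofloxacin-resistant: nitrofurantoin R / S (hypothetical)
c, d = 20, 880   # levofloxacin-susceptible: nitrofurantoin R / S (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
```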
Rare copy number variants (CNVs) are associated with risk of neurodevelopmental disorders characterised by varying degrees of cognitive impairment, including schizophrenia, autism spectrum disorder and intellectual disability. However, the effects of many individual CNVs in carriers without neurodevelopmental disorders are not yet fully understood, and little is known about the effects of reciprocal copy number changes of known pathogenic loci.
Aims
We aimed to analyse the effect of CNV carrier status on cognitive performance and measures of occupational and social outcomes in unaffected individuals from the UK Biobank.
Method
We called CNVs in the full UK Biobank sample and analysed data from 420 247 individuals who passed CNV quality control, reported White British or Irish ancestry and were not diagnosed with neurodevelopmental disorders. We analysed 33 pathogenic CNVs, including their reciprocal deletions/duplications, for association with seven cognitive tests and four general measures of functioning: academic qualifications, occupation, household income and Townsend Deprivation Index.
Results
Most CNVs (24 out of 33) were associated with reduced performance on at least one cognitive test or measure of functioning. The effects on the cognitive tests were modest (an average reduction of 0.13 s.d.) but varied markedly between CNVs. All 12 schizophrenia-associated CNVs were associated with significant impairments on measures of functioning.
Conclusions
CNVs implicated in neurodevelopmental disorders, including schizophrenia, are associated with cognitive deficits even among unaffected individuals. These deficits may be subtle, but CNV carriers face significant disadvantages in educational attainment and in their ability to earn income in adult life.
Detractors have long criticized the use of courts to achieve social change because judicial victories tend to provoke counterproductive political backlashes. Backlash arguments typically assert or imply that if movement litigators had relied on democratic rather than judicial politics, their policy victories would have been better insulated from opposition. We argue that these accounts wrongly assume that the unilateral decision by a group of movement advocates to eschew litigation will lead to a reduced role for courts in resolving the relevant policy and political conflicts. To the contrary, such decisions will often result in a policy field with judges every bit as active, but with the legal challenges initiated and framed by the advocates' opponents. We document this claim and explore its implications for constitutional politics via a counterfactual thought experiment rooted in historical case studies of litigation involving abortion and the right to die.
OBJECTIVES/SPECIFIC AIMS: The aim of this study is to examine whether stable health insurance coverage is associated with improved type 2 diabetes (DM) control and with reduced racial/ethnic health disparities. METHODS/STUDY POPULATION: We utilized EMR data (2005–2013) from 2 large, urban academic health centers with a racially/ethnically diverse patient population to longitudinally examine insurance coverage, diabetes outcomes (A1C, LDL cholesterol, BP) and management measures (e.g., A1C and BP monitoring). We categorized insurance stability status during each 6-month interval into 6 categories based on insurance type (private, public, uninsured) and continuity of coverage (continuous, switches, or gaps in coverage). We will examine the association between insurance stability status and DM outcomes, adjusting for time, age, sex, comorbidities, site of care, education, and income. Additional analyses will examine whether insurance stability moderates the impact of race/ethnicity on DM outcomes. RESULTS/ANTICIPATED RESULTS: Overall, we anticipate that stable health insurance coverage will improve measures of DM care, particularly for racially/ethnically diverse patients. DISCUSSION/SIGNIFICANCE OF IMPACT: A finding that insurance stability moderates the effect of race/ethnicity on diabetes management and control would inform the national health care policy debate on the impact of stable health insurance.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool for identifying patients with sepsis, and to describe and compare the characteristics of paramedic-transported patients with an emergency department (ED) diagnosis of sepsis. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection; fever and/or history of fever; and 2 or more signs of hypoperfusion (e.g. SBP <90, HR >100, RR >24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or a documented fever of 38.3°C (101°F) who were transported to hospital. In-hospital management and outcomes were obtained, and descriptive statistics, t-tests and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared against an ED diagnosis of sepsis, and its predictive validity was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria, with the following characteristics: mean age 65.2 years [range 18–101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by an ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared with those who did not, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001). 44 (18.6%) patients met the RPPEO sepsis notification tool criteria and, of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values for the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
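The predictive values quoted above can be checked directly from the counts reported in the abstract (236 patients, 34 with an ED diagnosis of sepsis, 44 tool-positive, 12 of whom had an ED diagnosis of sepsis). The short Python snippet below reconstructs the implied 2x2 table; the variable names are ours, not the study's.

```python
# Reconstruction of the 2x2 table behind the predictive values reported above,
# using only the counts given in the abstract.
tp = 12                 # tool positive, ED sepsis diagnosis
fp = 44 - tp            # tool positive, no ED sepsis diagnosis
fn = 34 - tp            # tool negative, ED sepsis diagnosis
tn = 236 - 34 - fp      # tool negative, no ED sepsis diagnosis

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
# -> sensitivity 35.3%, specificity 84.2%, PPV 27.3%, NPV 88.5%
```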
OBJECTIVE
To determine the effect of mandatory and nonmandatory influenza vaccination policies on vaccination rates and symptomatic absenteeism among healthcare personnel (HCP).
DESIGN
Retrospective observational cohort study.
SETTING
This study took place at 3 university medical centers with mandatory influenza vaccination policies and 4 Veterans Affairs (VA) healthcare systems with nonmandatory influenza vaccination policies.
PARTICIPANTS
The study included 2,304 outpatient HCP at mandatory vaccination sites and 1,759 outpatient HCP at nonmandatory vaccination sites.
METHODS
To determine the incidence and duration of absenteeism in outpatient settings, HCP participating in the Respiratory Protection Effectiveness Clinical Trial at both mandatory and nonmandatory vaccination sites reported their influenza vaccination status and symptomatic days absent from work weekly throughout a 12-week period during the peak viral respiratory illness (VRI) season in each of 3 seasons (2012–2015). The adjusted effects of vaccination and other modulating factors on absenteeism rates were estimated using multivariable regression models.
RESULTS
The proportion of participants who received influenza vaccination was lower each year at nonmandatory than at mandatory vaccination sites (odds ratio [OR], 0.09; 95% confidence interval [CI], 0.07–0.11). Among HCP who reported at least 1 sick day, vaccinated HCP had fewer symptomatic days absent than unvaccinated HCP (OR for 2012–2013 and 2013–2014, 0.82; 95% CI, 0.72–0.93; OR for 2014–2015, 0.81; 95% CI, 0.69–0.95).
CONCLUSIONS
These data suggest that mandatory HCP influenza vaccination policies increase influenza vaccination rates and that HCP symptomatic absenteeism diminishes as rates of influenza vaccination increase. These findings should be considered in formulating HCP influenza vaccination policies.
Cauterisation techniques are commonly used and widely accepted for the management of epistaxis. This review assesses which methods of intranasal cautery should be endorsed as optimum treatment on the basis of benefits, risks, patient tolerance and economic assessment.
Method:
A systematic review of the literature was performed using a standardised methodology and search strategy.
Results:
Eight studies were identified: seven prospective controlled trials and one randomised controlled trial. Pooling of data was possible for three studies, yielding a total of 830 patients. Significantly lower re-bleed rates were identified with electrocautery (14.5 per cent) than with chemical cautery (35.1 per cent) (p < 0.01). No evidence suggested that electrocautery was associated with more adverse events or greater discomfort. Limited evidence supported the use of a vasoconstrictor agent and an operating microscope during the procedure. The included studies showed considerable heterogeneity in design and outcome measures.
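As a rough illustration of how the pooled re-bleed comparison above could be tested, the Python sketch below runs a two-proportion z-test (equivalent to a chi-squared test on the 2x2 table) using the reported rates. The per-arm sample sizes are not stated, so an even 415/415 split of the 830 pooled patients is assumed purely for illustration; the review's own analysis may have differed.

```python
# Illustrative two-proportion comparison of the pooled re-bleed rates quoted
# above (14.5% electrocautery vs 35.1% chemical cautery, 830 patients pooled).
from statistics import NormalDist
import math

n1, p1 = 415, 0.145   # electrocautery arm (hypothetical arm size)
n2, p2 = 415, 0.351   # chemical cautery arm (hypothetical arm size)

p_pool = (n1 * p1 + n2 * p2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1/n1 + 1/n2))
z = (p1 - p2) / se
p_value = 2 * NormalDist().cdf(-abs(z))
print(f"z = {z:.2f}, two-sided p = {p_value:.2g}")
```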
Conclusion:
Consistent evidence suggests that electrocautery has higher success rates than chemical cautery, and is not associated with increased complications or patient discomfort. Lower quality evidence suggests that electrocautery reduces costs and duration of hospital stay.
Despite the complexity of the retirement process, most research treats it as an abrupt and one-way transition. Our study takes a different approach by examining retirement reversals (unretirement) and their predictors. Using the British Household Panel Survey (1991–2008), and following participants into Understanding Society (2010–2015), we undertake a survival analysis to investigate retirement reversals among Britons aged 50–69 years who were born in 1920–1959 (N = 2,046). Unretirement was defined as: (a) reporting being retired and subsequently recommencing paid employment, or (b) beginning full-time work following partial retirement (the latter defined here as reporting being retired and working fewer than 30 hours per week). A cumulative proportion of around 25 per cent of participants experienced a retirement reversal after reporting being retired; about half of these reversals occurred within the first five years of retirement. Unretirement was more common for participants who were male, more educated, in better health, owned a house with a mortgage (compared to owning it outright) and whose partner was in paid work. However, unretirement rates were not higher for participants in greater financial need, whether measured as subjective assessment of finances or household income quintiles. These results suggest that unretirement is a strategy more often used by those who are already advantaged and that it has the potential to exacerbate income inequalities in later life.
The nutritional value of meat is an increasingly important factor influencing consumer preferences for poultry, red meat and processed meat products. Intramuscular fat content and composition, in addition to high-quality protein, trace minerals and vitamins, are important determinants of nutritional value. The fat content of meat at retail has decreased substantially over the past 40 years through advances in animal genetics, nutrition and management, and through changes in processing techniques. Evidence of the association between diet and the incidence of human non-communicable diseases has driven interest in developing production systems for lowering total SFA and trans fatty acid (TFA) content and enriching n-3 PUFA concentrations in meat and meat products. Typically, poultry and pork have a lower fat content, with higher PUFA and lower TFA concentrations, than lamb or beef. Animal genetics, nutrition and maturity, coupled with the rumen microbiome, are the main factors influencing tissue lipid content and the relative proportions of SFA, MUFA and PUFA. The scope for altering the fatty acid (FA) profile of lamb and beef is determined to a large extent by extensive plant and microbial lipolysis and subsequent microbial biohydrogenation of dietary lipid in the rumen, which is one of the major reasons for the differences in lipid composition between meat from monogastric animals and ruminants. Nutritional strategies can be used to align the fat content and FA composition of poultry, pork, lamb and beef with Public Health Guidelines for lowering the social and economic burden of chronic disease.
A numerical model for an interacting ice shelf and ocean is presented in which the ice-shelf base exhibits a channelized morphology similar to that observed beneath Petermann Gletscher’s (Greenland) floating ice shelf. Channels are initiated by irregularities in the ice along the grounding line and then enlarged by ocean melting. To a first approximation, spatially variable basal melting seaward of the grounding line acts as a steel-rule die or a stencil, imparting a channelized form to the ice base as it passes by. Ocean circulation in the region of high melt is inertial in the along-channel direction and geostrophically balanced in the transverse direction. Melt rates depend on the wavelength of imposed variations in ice thickness where it enters the shelf, with shorter wavelengths reducing overall melting. Petermann Gletscher’s narrow basal channels may therefore act to preserve the ice shelf against excessive melting. Overall melting in the model increases for a warming of the subsurface water. The same sensitivity holds for very slight cooling, but for cooling of a few tenths of a degree a reorganization of the spatial pattern of melting leads, surprisingly, to catastrophic thinning of the ice shelf 12 km from the grounding line. Subglacial discharge of fresh water along the grounding line increases overall melting. The eventual steady state depends on when discharge is initiated in the transient history of the ice, showing that multiple steady states of the coupled system exist in general.
Sequential satellite imagery and modeling are used to investigate crevasse patterns at the head of Ice Stream B tributary B1b. The crevasses, informally called the “chromosomes”, form at the upstream limit of B1b’s northern shear margin and chaotic crevasse zone. We find that the onset of crevasse formation, and by inference the onset of streaming flow, has migrated upstream over time at a mean rate of 230 ± 16 m a⁻¹. A possible cause of this migration is a change in net basal friction due to changes in basal water production rate and storage.
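To show how a migration rate and its uncertainty can be extracted from repeat imagery of the kind described above, the Python sketch below fits a straight line to the position of an onset feature over time. The dates, positions and uncertainty treatment are invented for illustration and are not the study's measurements.

```python
# Illustrative estimate of an upstream migration rate from repeat imagery.
import numpy as np

years = np.array([1963.0, 1974.0, 1986.0, 1992.0, 1997.0])  # hypothetical dates
# Along-flow position of the crevasse-onset feature relative to an arbitrary
# datum, in metres (negative direction = upstream). Hypothetical values.
position = np.array([0.0, -2500.0, -5300.0, -6700.0, -7800.0])

coeffs, cov = np.polyfit(years, position, deg=1, cov=True)
rate = coeffs[0]                # metres per year (negative = upstream)
rate_se = np.sqrt(cov[0, 0])    # 1-sigma uncertainty of the fitted slope
print(f"migration rate = {abs(rate):.0f} ± {rate_se:.0f} m/a upstream")
```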