The Pediatric Heart Network Normal Echocardiogram Database Study had unanticipated challenges. We sought to describe these challenges and lessons learned to improve the design of future studies.
Challenges were divided into three categories: enrolment, echocardiographic imaging, and protocol violations. Memoranda, Core Lab reports, and adjudication logs were reviewed. A centre-level questionnaire provided information regarding local processes for data collection. Descriptive statistics were used, and chi-square tests determined differences in imaging quality.
For the 19 participating centres, challenges with enrolment included variations in Institutional Review Board definitions of “retrospective” eligibility, overestimation of non-White participants, centre categorisation of Hispanic participants that differed from National Institutes of Health definitions, and exclusion of potential participants due to missing demographic data. Institutional Review Board amendments resolved many of these challenges. An unanticipated burden was imposed on centres by the high number of echocardiograms that were reviewed but failed to meet submission criteria. Additionally, image transfer software malfunctions delayed Core Lab image review and feedback. Between the early and late study periods, the proportion of unacceptable echocardiograms submitted to the Core Lab decreased (14% versus 7%, p < 0.01). Most protocol violations arose from eligibility violations and inadvertent disclosure of protected health information (overall rate 2.5%). Adjudication committee reviews led to protocol changes.
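As a concrete illustration of the chi-square comparison above, the sketch below contrasts early- and late-period proportions of unacceptable submissions. The counts are hypothetical (only the 14% and 7% proportions come from the study), and scipy is assumed available.

```python
# Hypothetical counts reproducing the reported 14% vs. 7% proportions;
# rows are study periods, columns are (unacceptable, acceptable) echoes.
from scipy.stats import chi2_contingency

table = [[140, 860],   # early period: 14% unacceptable (hypothetical n = 1000)
         [70, 930]]    # late period:   7% unacceptable (hypothetical n = 1000)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p:.2g}")  # p < 0.01 for these counts
```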
Numerous challenges encountered during the Normal Echocardiogram Database Study prolonged study enrolment. The retrospective design and flaws in image transfer software were key impediments to study completion and should be considered when designing future studies collecting echocardiographic images as a primary outcome.
Surgical site infections (SSIs) are among the most common healthcare-associated infections in low- and middle-income countries. To encourage establishment of actionable and standardized SSI surveillance in these countries, we propose simplified surveillance case definitions. Here, we use National Healthcare Safety Network (NHSN) reports to explore the concordance of these simplified definitions with NHSN definitions as the ‘reference standard.’
There is strong research evidence supporting the pharmacological treatment of post-traumatic stress disorder (PTSD) as a second-line approach after trauma-focused psychological interventions. Fluoxetine, paroxetine, sertraline and venlafaxine are the best-evidenced drugs, with lower-level evidence for other medications. It is important that prescribing for PTSD is evidence-based.
To evaluate the National Healthcare Safety Network (NHSN) hospital-onset Clostridioides difficile infection (HO-CDI) standardized infection ratio (SIR) risk adjustment for general acute-care hospitals with large numbers of intensive care unit (ICU), oncology unit, and hematopoietic cell transplant (HCT) patients.
Retrospective cohort study.
Eight tertiary-care referral general hospitals in California.
We used FY 2016 data and the published 2015 rebaseline NHSN HO-CDI SIR. We compared facility-wide inpatient HO-CDI events and SIRs, with and without ICU data, oncology and/or HCT unit data, and ICU bed adjustment.
For these hospitals, the median unmodified HO-CDI SIR was 1.24 (interquartile range [IQR], 1.15–1.34); 7 hospitals qualified for the highest ICU bed adjustment; 1 hospital received the second highest ICU bed adjustment; and all had oncology-HCT units with no additional adjustment per the NHSN. Removal of ICU data and the ICU bed adjustment decreased HO-CDI events (median, −25%; IQR, −20% to −29%) but increased the SIR at all hospitals (median, 104%; IQR, 90%–105%). Removal of oncology-HCT unit data decreased HO-CDI events (median, −15%; IQR, −14% to −21%) and decreased the SIR at all hospitals (median, −8%; IQR, −4% to −11%).
For tertiary-care referral hospitals with specialized ICUs and a large number of ICU beds, the ICU bed adjustor functions as a global adjustment in the SIR calculation, accounting for the increased complexity of patients in ICUs and non-ICUs at these facilities. However, the SIR decrease with removal of oncology and HCT unit data, even with the ICU bed adjustment, suggests that an additional adjustment should be considered for oncology and HCT units within general hospitals, perhaps similar to what is done for ICU beds in the current SIR.
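To make the arithmetic behind this behaviour concrete, here is a toy sketch (all numbers hypothetical) of how removing a unit’s data can cut observed events yet raise the SIR when the risk adjustment inflates the predicted count disproportionately:

```python
# SIR = observed / predicted HO-CDI events. Removing ICU data removes
# observed events, but if the ICU bed adjustment contributes even more
# to the predicted count, the ratio rises. Numbers are illustrative only.
observed_all, predicted_all = 100, 80    # facility-wide SIR = 1.25
icu_observed, icu_predicted = 25, 40     # ICU events and ICU-driven prediction

sir_all = observed_all / predicted_all
sir_without_icu = (observed_all - icu_observed) / (predicted_all - icu_predicted)
print(f"SIR with ICU data: {sir_all:.2f}; without: {sir_without_icu:.2f}")
# Events fall 25%, yet the SIR rises from 1.25 to 1.88.
```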
The urgency of symbolic reparation was brought to public attention in a very dramatic manner at the University of Cape Town (UCT) in 2015. Student protestors demanded, and shortly thereafter achieved, the removal of the bronze statue of British imperialist Cecil John Rhodes from its prominent perch on the main campus. Activists would, however, make the point repeatedly that the attack on the Rhodes statue was about something much bigger – a general sense of disaffection with, and alienation from, white institutions, in this case UCT (Ngcaweni and Ngcaweni 2018).
In many ways, the focus on Rhodes was a strategic masterstroke. The statue was a visible and tangible representation of black student discontent with the institutional cultures and practices of former white universities. The statue could therefore be lifted and removed in full sight, offering a political spectacle that demonstrated student power even as it symbolised the uprooting of a concrete symbol of whiteness. It should be no surprise, therefore, that the attack on the statue spread quickly to other former white campuses around the country, from the statue of King George V on the campus of the University of KwaZulu-Natal (UKZN) to the monument of President Steyn at the University of the Free State (UFS).
In real time, images of statues defaced, toppled or removed were carried through social media platforms across the country and around the world, thereby boosting the student cause as well as mobilising further activism around these visible representations of South Africa's colonial and apartheid past (Jansen 2017). In the minds of activists, these physical attacks on statues were a starting point for deeper changes to white institutions, from addressing the paucity of black professors to eliminating the Eurocentrism of the resident curriculum.
It would be a mistake, however, to think that student discontent started with the UCT student Chumani Maxwele pouring human excrement on the Rhodes statue; in fact, activism around race, culture and the curriculum at UCT began much earlier, when postgraduate students protested the closure of African Studies as an independent department, and when Senate decided to stop using race as the only measure of disadvantage in decisions on student admissions (UCT 2014). The word ‘decolonisation’ was used during these protests, long before the discontent boiled over in 2015.
The detection of fireball streaks in astronomical imagery can be carried out by a variety of methods. The Desert Fireball Network uses a network of cameras to track and triangulate incoming fireballs in order to recover meteorites with orbits and to build a fireball orbital dataset. Fireball detection is done on board each camera, but due to the design constraints imposed by remote deployment, the cameras are limited in processing power and time. We describe the processing software used for fireball detection under these constrained circumstances. Two different approaches were compared: (1) a single-layer neural network with 10 hidden units, trained using manually selected fireballs, and (2) a more traditional computational approach based on cascading steps of increasing complexity, whereby computationally simple filters are used to discard uninteresting portions of the images, allowing more computationally expensive analysis of the remainder. Both approaches allowed a full night’s worth of data (over a thousand 36-megapixel images) to be processed each day using a low-power single-board computer. We distinguish between large (likely meteorite-dropping) fireballs and smaller, fainter ones (typical ‘shooting stars’). Traditional processing and the neural network both performed well on large fireballs within an approximately 30 000-image dataset, with true positive detection rates of 96% and 100%, respectively, but the neural network was significantly more successful on smaller fireballs, with rates of 67% and 82%, respectively. However, this improved success came at the cost of significantly more false positives, and the neural network does not produce precise fireball coordinates within an image (as it classifies rather than localises). Simple consideration of the network geometry indicates that the overall detection rate for triangulated large fireballs is better than 99.7% and 99.9%, respectively, because any one fireball has multiple double-station detection opportunities. As such, both algorithms are considered sufficient for meteorite-dropping fireball event detection, with some consideration of the acceptable number of false positives relative to sensitivity.
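The “network geometry” argument lends itself to a short calculation. The sketch below (with an illustrative station count; only the per-camera rates come from the abstract) computes the probability that at least two of k independently observing stations detect a fireball, which is the requirement for triangulation:

```python
# Probability that >= 2 of k stations detect a fireball, assuming each
# station detects independently with probability p. The choice k = 4 is
# illustrative; the paper's quoted bounds reflect the actual geometry.
def p_triangulated(p: float, k: int) -> float:
    miss_all = (1 - p) ** k                    # no station detects
    exactly_one = k * p * (1 - p) ** (k - 1)   # only one station detects
    return 1.0 - miss_all - exactly_one

for p in (0.96, 0.82):  # large-fireball (traditional) and small-fireball (NN) rates
    print(f"per-camera rate {p:.2f} -> triangulation {p_triangulated(p, 4):.4f}")
```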
A higher intake of foods rich in flavonoids, such as quercetin, can reduce the risk of CVD. Enzymatically modified isoquercitrin (EMIQ®) has a bioavailability 17-fold higher than quercetin aglycone and has shown potential CVD-moderating effects in animal studies. The present study aimed to determine whether acute ingestion of EMIQ® improves endothelial function, blood pressure (BP) and cognitive function in human volunteers at risk of CVD. Twenty-five participants (twelve males and thirteen females) with at least one CVD risk factor completed this randomised, controlled, crossover study. In a random order, participants were given EMIQ® (2 mg aglycone equivalent/kg body weight) or placebo alongside a standard breakfast meal. Endothelial function, assessed by flow-mediated dilatation (FMD) of the brachial artery, was measured before and 1·5 h after intervention. BP, arterial stiffness, cognitive function, BP during cognitive stress and measures of quercetin metabolites, oxidative stress and markers of nitric oxide (NO) production were assessed post-intervention. After adjustment for pre-treatment measurements and treatment order, EMIQ® treatment resulted in a significantly higher FMD response compared with the placebo (1·80 (95 % CI 0·23, 3·37) %; P = 0·025). Plasma concentrations of quercetin metabolites were significantly higher (P < 0·001) after EMIQ® treatment compared with the placebo. No changes in BP, arterial stiffness, cognitive function or biochemical parameters were observed. In this human intervention study, the acute administration of EMIQ® significantly increased circulating quercetin metabolites and improved endothelial function. Further clinical trials are required to assess whether health benefits are associated with long-term EMIQ® consumption.
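The “adjustment for pre-treatment measurements and treatment order” is a standard ANCOVA-style analysis for a crossover design; a minimal sketch on made-up data (ignoring within-subject correlation for brevity) might look like this:

```python
# ANCOVA-style estimate of the treatment effect on FMD, adjusting for
# pre-treatment FMD and treatment order. All data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 25  # participants, each measured under both treatments
df = pd.DataFrame({
    "treatment": np.repeat(["EMIQ", "placebo"], n),
    "order": np.tile(rng.integers(0, 2, n), 2),      # which arm came first
    "fmd_pre": np.tile(rng.normal(5.5, 1.5, n), 2),  # pre-treatment FMD (%)
})
df["fmd_post"] = (df["fmd_pre"]
                  + np.where(df["treatment"] == "EMIQ", 1.8, 0.0)  # built-in effect
                  + rng.normal(0, 1.5, 2 * n))

fit = smf.ols("fmd_post ~ treatment + fmd_pre + order", data=df).fit()
print(fit.params["treatment[T.placebo]"])  # ~ -1.8: placebo minus EMIQ
```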
Distinguishing between hypertrophic cardiomyopathy and other causes of left ventricular hypertrophy can be difficult in children. We hypothesised that cardiac MRI T1 mapping could improve diagnosis of paediatric hypertrophic cardiomyopathy and that measures of myocardial function would correlate with T1 times and extracellular volume fraction.
Thirty patients with hypertrophic cardiomyopathy completed MRI with tissue tagging, T1 mapping, and late gadolinium enhancement. Left ventricular circumferential strain was calculated from tagged images. T1, partition coefficient, and synthetic extracellular volume were measured at the base, mid-ventricle, apex, and thickest area of myocardial hypertrophy. MRI measures were compared with those of a cohort of 19 healthy children and young adults. Mann–Whitney U tests, Spearman’s rho, and multivariable logistic regression were used for statistical analysis.
Hypertrophic cardiomyopathy patients had increased left ventricular ejection fraction and indexed mass, decreased global strain, and increased native T1 (strain: −14.3% [interquartile range −16.0, −12.1] versus −17.3% [−19.0, −15.7], p < 0.001; native T1: 1015 ms [991, 1026] versus 990 ms [972, 1001], p = 0.019). Partition coefficient and synthetic extracellular volume were not increased in hypertrophic cardiomyopathy. Global native T1 correlated inversely with ejection fraction (ρ = −0.63, p = 0.002) and directly with global strain (ρ = 0.51, p = 0.019). A logistic regression model using ejection fraction and native T1 distinguished between hypertrophic cardiomyopathy and control with an area under the receiver operating characteristic curve of 0.91.
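For readers unfamiliar with the modelling step, a minimal sketch of a two-predictor classifier of this kind (on synthetic data, not the study’s) is:

```python
# Logistic regression on ejection fraction and global native T1,
# evaluated by ROC AUC. T1 values are loosely centred on the reported
# medians; EF values are invented. The study's AUC of 0.91 is its own.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
hcm = np.column_stack([rng.normal(68, 5, 30), rng.normal(1015, 15, 30)])
ctl = np.column_stack([rng.normal(62, 5, 19), rng.normal(990, 15, 19)])
X, y = np.vstack([hcm, ctl]), np.array([1] * 30 + [0] * 19)

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
print(f"in-sample AUC = {roc_auc_score(y, model.predict_proba(X)[:, 1]):.2f}")
```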
In this cohort of paediatric hypertrophic cardiomyopathy, strain was decreased and native T1 was increased compared with controls. Native T1 correlated with both ejection fraction and strain, and a model using native T1 and ejection fraction differentiated patients with and without hypertrophic cardiomyopathy.
Soybean [Glycine max (L.) Merr.] has recently become a popular rotational crop in the Canadian Northern Great Plains, where herbicide-resistant (HR) soybean cultivars have been widely adopted. Intense reliance on herbicides has contributed to the development of HR weeds in soybean and other crops. Cultural weed management practices reduce the need for herbicides and lower the selection pressure for HR weed biotypes by improving the competitiveness of the crop. The effects of two row spacings, three target densities, and three cultivars on the critical weed-free period (CWFP) in soybean were evaluated as three separate experiments in southern Manitoba. In the row-spacing experiment, soybean grown in narrow rows shortened the CWFP by up to three soybean developmental stages at site-years with increased weed pressure. In the target density experiment, low-density soybean stands lengthened the CWFP by one soybean developmental stage compared with higher-density stands. The effect of soybean cultivar varied among locations yet tended to be consistent within a location over the 2-yr study, suggesting that competitive ability in these cultivars was linked to edaphic and/or environmental factors. The cultivar with the shortest days to maturity, which also had the shortest stature, consistently had a longer CWFP. Each of these cultural practices was effective at reducing the need for in-crop herbicide applications.
We undertook a quality improvement project to address challenges with pulmonary artery catheter (PAC) line maintenance in a setting of low baseline central-line infection rates. We observed a subsequent reduction in staphylococcal PAC line infections and a trend toward a reduction in overall PAC infection rates over 1 year.
In patients with β-lactam allergies, administration of non–β-lactam surgical prophylaxis is associated with increased risk of infection. Although many patients self-report β-lactam allergies, most are unconfirmed or mislabeled. A quality improvement process, utilizing a structured β-lactam allergy tool, was implemented to improve the utilization of preferred β-lactam surgical prophylaxis.
Seasonal influenza virus epidemics have a major impact on healthcare systems. Data on population susceptibility to emerging influenza virus strains during the interepidemic period can guide planning for resource allocation in an upcoming influenza season. This study sought to assess population susceptibility to representative emerging influenza virus strains collected during the interepidemic period. The microneutralisation antibody titers (MN titers) of a human serum panel against representative emerging influenza strains collected during the interepidemic period before the 2018/2019 winter influenza season (H1N1-inter and H3N2-inter) were compared with those against influenza strains representative of previous epidemics (H1N1-pre and H3N2-pre). A multifaceted approach, incorporating both genetic and antigenic data, was used in selecting these representative influenza virus strains for the MN assay. A significantly higher proportion of individuals had a ⩾4-fold reduction in MN titer between H1N1-inter and H1N1-pre than between H3N2-inter and H3N2-pre (28.5% (127/445) vs. 4.9% (22/445), P < 0.001). The geometric mean titer (GMT) against H1N1-inter was significantly lower than that against H1N1-pre (381 (95% CI 339–428) vs. 713 (95% CI 641–792), P < 0.001), while there was no significant difference in GMT between H3N2-inter and H3N2-pre. Since A(H1N1) predominated in the 2018/2019 winter influenza epidemic, our results were consistent with the predominant epidemic subtype.
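As a pointer to how these two summary measures are computed, here is a small sketch with made-up titer values:

```python
# Geometric mean titer (GMT) and the proportion of sera showing a
# >= 4-fold MN titer reduction from the pre-epidemic to the
# interepidemic strain. All titer values below are hypothetical.
import numpy as np

titers_pre = np.array([640, 1280, 320, 640, 2560])    # vs. H1N1-pre
titers_inter = np.array([160, 1280, 80, 320, 2560])   # vs. H1N1-inter

gmt_pre = np.exp(np.log(titers_pre).mean())
gmt_inter = np.exp(np.log(titers_inter).mean())
frac_4fold = np.mean(titers_pre / titers_inter >= 4)

print(f"GMT pre = {gmt_pre:.0f}, GMT inter = {gmt_inter:.0f}, "
      f">=4-fold reductions: {frac_4fold:.0%}")
```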
From the 1950s through the 1970s, American policymakers engaged in an extensive campaign against illegal gambling in an effort to turn the tide in the government’s crusade against organized crime. At the grassroots, however, voters endorsed a different form of state expansion to beat back the mob menace. Between 1963 and 1977, fourteen northeastern and Rust Belt states enacted the first government-run lotteries in the twentieth-century United States on the belief that legalized gambling would undercut the mob’s gambling profits. While gambling opponents pointed to Las Vegas as proof that organized crime would flourish following legalization, supporters argued that illegal gambling was already pervasive, so the state may as well profit from this irrepressible activity. The history of gambling legalization challenges narratives on the popularity of law-and-order politics and offers a new perspective on crime policy in the post–World War II period.
Consumption of sugar-sweetened beverages (SSB) by infants and young children is relatively unexplored in Asian populations. The Growing Up in Singapore Towards healthy Outcomes cohort study examined associations between SSB intake at 18 months and 5 years of age and adiposity measures at 6 years of age. We studied Singaporean infants/children with SSB intake assessed by FFQ at 18 months of age (n 555) and 5 years of age (n 767). Median SSB intake was 28 (interquartile range 5·5–98) ml at 18 months of age and 111 (interquartile range 57–198) ml at 5 years of age. Associations between SSB intake (in 100 ml/d increments and tertile categories) and adiposity measures (BMI standard deviation scores (sd units), sum of skinfolds (SSF)) and overweight/obesity status were examined using multivariable linear and Poisson regression models, respectively. After adjusting for confounders and additionally for energy intake, SSB intake at age 18 months was not significantly associated with later adiposity measures or overweight/obesity outcomes. In contrast, SSB intake at age 5 years, modelled in 100 ml/d increments, was associated with higher BMI by 0·09 (95 % CI 0·02, 0·16) sd units, higher SSF thickness by 0·68 (95 % CI 0·06, 1·44) mm and a 1·2 (95 % CI 1·07, 1·23) times increased risk of overweight/obesity at age 6 years. Trends were consistent when SSB intake was modelled as categorical tertiles. In summary, SSB intake in young childhood is associated with greater adiposity and higher risk of overweight/obesity. Public health policies working to reduce SSB consumption need to focus on prevention programmes targeted at young children.
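The risk-ratio estimates for overweight/obesity come from Poisson regression; a common way to implement this for a binary outcome is modified Poisson regression with robust standard errors, sketched below on synthetic data (the robust-variance choice is our assumption, not stated in the abstract):

```python
# Modified Poisson regression: risk ratio of overweight/obesity per
# 100 ml/d of SSB intake, adjusted for energy intake. Synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 767
ssb_100ml = rng.gamma(2.0, 0.8, n)           # intake in 100 ml/d units
energy = rng.normal(1400, 250, n)            # energy intake (kcal/d)
risk = 0.15 * np.exp(0.18 * ssb_100ml)       # built-in RR ~ 1.2 per unit
y = rng.binomial(1, np.clip(risk, 0, 1))     # overweight/obesity indicator

X = sm.add_constant(np.column_stack([ssb_100ml, energy]))
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(f"adjusted risk ratio per 100 ml/d = {np.exp(fit.params[1]):.2f}")
```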
In the United States, both tort and contract law are generally matters of state rather than federal law. While there is a common core to both tort and contract doctrines in the US, one can find variation – sometimes narrow but sometimes broad – across the 51 independent state jurisdictions. In this United States country report, we attempt to provide a synthesis of state law while also trying to point out states whose laws deviate significantly from the synthesis.
TRACING THE BORDERLINES
The bases for liability in tort and contract are thought of as distinct in the US. The phrase ‘law of obligations’ is largely unknown in the United States. Contractual obligations are derived from the agreement of the parties to a contract that sets forth the bilateral obligations of the parties. By contrast, tort duties are imposed on parties by operation of law. This clean conceptual separation can break down in a number of contexts in which a tort occurs in a relationship where there is also a contract or potential contract between the parties, and where a contract is meant to benefit a third party. Medical and other professional malpractice, products liability, landlords’ obligations for defects in rented premises, implied warranties and other terms in contracts, and contractual waivers of tort liability are among the instances in which this occurs.
A court's characterization of an action as tort or contract matters for several reasons. Statutes of limitations typically limit tort actions to two years from the date of discovery of injury; contract actions typically are given four years from the date of material breach. Insurance contracts and principles of governmental immunity sometimes turn on the nature of the claim. The standard for liability is also quite different – to prevail in the typical tort case, a plaintiff must prove negligence; breach of contract requires no such proof, but merely that the defendant breached the agreed-upon terms. In addition, the applicable damages rules are different – rules regarding mitigation, scope of liability (proximate cause), non-pecuniary losses, and punitive damages are much more permissive in the tort context than in contract. Indeed, even some rules that tort and contract actions share – eg causation – are applied differently in sometimes outcome-determinative ways.
Horseweed, also known as marestail, is a problematic weed for no-till soybean producers that can emerge from late summer through the following spring. Overwintering cover crops can reduce both the density and size of fall-emerged weeds such as horseweed and reduce further spring emergence, although typically cover crops do not provide complete control. Cover crops may be integrated with additional spring herbicide applications to control emerged horseweed, and selective herbicides such as 2,4-D may be used to target horseweed while maintaining small grain cover crop growth. However, cover crops may affect herbicide deposition, which could reduce their efficacy to control weeds. The objective of this study was to determine how the amount and variability of 2,4-D ester spray solution deposition, measured with water-sensitive paper, were affected by a cereal rye cover crop and fall-applied saflufenacil. We also examined deposition at the soil surface relative to the cereal rye row position. In a year with greater cereal rye biomass accumulation, there was 44% less coverage and average deposit size was 45% smaller immediately adjacent to cereal rye rows compared with between rows and areas without cereal rye. Greater variability in these measurements was also noted in this position. Percent spray solution coverage was also 22% greater in plots that received saflufenacil in the fall, and deposits were 28% larger. In a year with less cover crop and winter weed biomass, no differences in spray deposition were observed. This suggests that small horseweed plants and other weeds immediately adjacent to cereal rye cover crop rows may be more likely to survive early spring herbicide applications, though the suppressive effects of cover crops may mitigate this concern.
The first case of evolved protoporphyrinogen oxidase (PPO)-inhibitor resistance was observed in 2001 in common waterhemp [Amaranthus tuberculatus (Moq.) Sauer var. rudis (Sauer) Costea and Tardif]. This resistance in A. tuberculatus is most commonly conferred by deletion of the amino acid glycine at the 210th position (ΔGly-210) of the PPO enzyme (PPO2) encoded by PPX2. In a field in Kentucky in 2015, inadequate control of Amaranthus plants was observed following application of a PPO inhibitor. Morphological observations indicated that survivors included both A. tuberculatus and Palmer amaranth (Amaranthus palmeri S. Watson). Research was conducted to confirm species identities and resistance and then to determine whether resistance evolved independently in the two species or via hybridization. Results from a quantitative PCR assay based on the ribosomal internal transcribed spacer confirmed that both A. tuberculatus and A. palmeri coexisted in the field. The mutation conferring ΔGly-210 in PPO2 was identified in both species; phylogenetic analysis of a region of PPX2, however, indicated that the mutation evolved independently in the two species. Genotyping of greenhouse-grown plants that survived lactofen indicated that all A. tuberculatus survivors, but only a third of A. palmeri survivors, contained the ΔGly-210 mutation. Consequently, A. palmeri plants were evaluated for the presence of an arginine to glycine or methionine substitution at position 128 of PPO2 (Arg-128-Gly and Arg-128-Met). The Arg-128-Gly substitution was found to account for resistance that was not accounted for by the ΔGly-210 mutation in plants from the A. palmeri population. Results from this study provide a modern-day example of both parallel and convergent evolution occurring within a single field.