Electroanatomic mapping systems are increasingly used during ablations to decrease the need for fluoroscopy and therefore radiation exposure. For left-sided arrhythmias, transseptal puncture is a common procedure performed to gain access to the left side of the heart. We aimed to quantify the radiation exposure associated with transseptal puncture.
Data were retrospectively collected from the Catheter Ablation with Reduction or Elimination of Fluoroscopy registry. Patients with left-sided accessory pathway-mediated tachycardia, with a structurally normal heart, who had a transseptal puncture, and were under 22 years of age were included. Those with previous ablations, concurrent diagnostic or interventional catheterisation, and missing data for fluoroscopy use or procedural outcomes were excluded. Patients with a patent foramen ovale who did not have a transseptal puncture were selected as the control group using the same criteria. Procedural outcomes were compared between the two groups.
There were 284 patients in the transseptal puncture group and 70 in the patent foramen ovale group. The transseptal puncture group had a significantly higher mean procedure time (158.8 versus 131.4 minutes, p = 0.002), rate of fluoroscopy use (38% versus 7%, p < 0.001), and mean fluoroscopy time (2.4 versus 0.6 minutes, p < 0.001). The acute success and complication rates were similar.
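The comparisons above are standard two-group tests: a two-sample t test for mean procedure time and a chi-square test for the proportion using fluoroscopy. A minimal sketch of that arithmetic, assuming SciPy, with individual-level values simulated and event counts back-calculated from the reported percentages, not taken from the study data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated procedure times (minutes); only the group sizes and means
# echo the abstract -- the spreads and individual values are invented.
tsp = rng.normal(158.8, 60, 284)   # transseptal puncture group
pfo = rng.normal(131.4, 55, 70)    # patent foramen ovale group
t_stat, p_time = stats.ttest_ind(tsp, pfo, equal_var=False)  # Welch's t test

# Fluoroscopy use as a 2x2 chi-square: 38% of 284 vs 7% of 70,
# rounded to whole patients (an inference from the reported rates).
table = [[108, 284 - 108], [5, 70 - 5]]
chi2, p_fluoro, *_ = stats.chi2_contingency(table)
print(f"procedure time p = {p_time:.3g}; fluoroscopy use p = {p_fluoro:.3g}")
```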
Performing transseptal puncture remains a common reason to utilise fluoroscopy in the era of non-fluoroscopic ablation. Better tools are needed to make non-fluoroscopic transseptal puncture more feasible.
After implementing a coronavirus disease 2019 (COVID-19) infection prevention bundle, the incidence rate ratio (IRR) of non–severe acute respiratory syndrome coronavirus 2 (non–SARS-CoV-2) hospital-acquired respiratory viral infection (HA-RVI) was significantly lower than the IRR from the pre–COVID-19 period (IRR, 0.322; 95% CI, 0.266–0.393; P < .01). However, HA-RVI incidence rates mirrored community RVI trends, suggesting that hospital interventions alone did not significantly affect HA-RVI incidence.
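The rate ratio and interval quoted here follow the usual arithmetic: an IRR is the ratio of two incidence rates, with a Wald confidence interval built on the log scale. A minimal Python sketch of that computation, using illustrative counts rather than the study's data:

```python
import math

def irr_with_ci(events_a, time_a, events_b, time_b, z=1.96):
    """Incidence rate ratio (period A vs period B) with a Wald 95% CI
    on the log scale: exp(log(IRR) +/- z * sqrt(1/a + 1/b))."""
    irr = (events_a / time_a) / (events_b / time_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Illustrative counts only -- not the study's data.
irr, lo, hi = irr_with_ci(events_a=60, time_a=100_000,
                          events_b=180, time_b=100_000)
print(f"IRR = {irr:.3f} (95% CI, {lo:.3f}-{hi:.3f})")
```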
It is a traditional hope of comparative psychology that animal minds might be unitary, parsimonious, associative. In contrast, cognitive researchers acknowledge multiple learning systems, including humans’ capacity for explicit hypothesis testing and rule learning. The authors describe new paradigms that may dissociate the explicit from the associative and demonstrate animals’ explicit capabilities. These paradigms include matched tasks that foster explicit or associative category learning, and paradigms that disable crucial components of associative learning. Given this disabling, animals may adopt instead an alternative, more explicit learning system. The authors review this area, including research on humans, monkeys, rats, and pigeons. They also consider the evolutionary and fitness factors that might favor the development of complementary associative and explicit learning systems.
Background: SARS-CoV-2 N95 mask contamination in healthcare providers (HCPs) treating patients with COVID-19 is poorly understood.
Method: We performed a prospective observational study of HCP N95 respirator SARS-CoV-2 contamination during aerosol-generating procedures (AGPs) on SARS-CoV-2–positive patients housed in a COVID-19–specific unit at an academic medical center. Medical masks were used as surrogates for N95 respirators to avoid waste and were worn on top of HCP N95 respirators during study AGPs. Study masks were provided to HCPs while donning PPE and were retrieved during doffing. Additionally, during doffing, face shields were swabbed with Floq swabs premoistened with viral transport media (VTM) prior to disinfection. Medical masks were cut into 9 position-based pieces, placed in VTM, vortexed, and centrifuged (Fig. 1). RNA extraction and RT-PCR were completed on all samples. RT-PCR–positive samples were to undergo cell culture infection to detect cytopathic effects (CPE). Contamination was characterized by mask location and by the front and back of face shields. Patient COVID-19 symptoms were collected from routine clinical documentation. Study HCPs completed HCP-role–specific routine care (eg, assessing, administering medications, and maintaining oxygen supplementation) while in patient rooms and were observed by study team members.
Results: We enrolled 31 HCPs between September and December 2021. HCP and patient characteristics are presented in Table 1. In total, 330 individual samples were obtained from 31 masks and 26 face shields among 12 patient rooms. None of the 330 samples was positive for SARS-CoV-2 via RT-PCR. Positive controls were performed in the laboratory setting to confirm that the virus was recoverable using these methods. Notably, all samples were collected from HCPs caring for COVID-19 patients on high-flow, high-humidity Optiflow (an AGP), with an average of 960 seconds (IQR, 525–1,680) spent in each room. In addition to Optiflow and routine care, study speech pathologists completed an additional AGP, fiberoptic endoscopic evaluation of swallowing. Moreover, 29 (94%) of 31 study HCPs had physical contact with their patient.
Conclusions: Overall, mask contamination in HCPs treating patients with COVID-19 undergoing AGPs was not detectable when face shields were worn, despite patient contact and the performance of AGPs.
Field studies were conducted in North Carolina in 2018 and 2019 to determine sweetpotato tolerance to indaziflam and its effectiveness in controlling Palmer amaranth in sweetpotato. Treatments included indaziflam applied pretransplant, 7 d after transplanting (DATr), or 14 DATr at 29, 44, 58, or 73 g ai ha−1, plus weedy and weed-free checks. Indaziflam applied postemergence caused transient foliar injury to sweetpotato. Indaziflam applied pretransplant caused less injury to sweetpotato than the other application timings regardless of rate. Palmer amaranth control was greatest when indaziflam was applied pretransplant or 7 DATr. In a weed-free environment, sweetpotato marketable yield decreased as indaziflam application was delayed. No differences in storage root length-to-width ratio were observed.
To analyze the spread of a novel sequence type (ST1478) of vancomycin-resistant Enterococcus faecium (VRE) across Canadian hospitals.
Retrospective chart review of patients identified as having ST1478 VRE bloodstream infection.
Canadian hospitals that participate in the Canadian Nosocomial Infection Surveillance Program (CNISP).
From 2013 to 2018, VRE bloodstream isolates collected from participating CNISP hospitals were sent to the National Microbiology Laboratory (NML). ST1478 isolates were identified using multilocus sequence typing, and whole-genome sequencing was performed. Patient characteristics and location data were collected for patients with ST1478 bloodstream infection (BSI). The sequence and patient location information were used to generate clusters of infections and assess for intrahospital and interhospital spread.
ST1478 VRE BSI occurred predominantly in a small number of hospitals in central and western Canada. Within these hospitals, infections were clustered on certain wards, and isolates often had <20 single-nucleotide variant (SNV) differences from one another, suggesting a large component of intrahospital spread. Furthermore, some patients with bloodstream infections were identified as moving from one hospital to another, potentially leading to interhospital spread. Genomic analysis of all isolates revealed close relatedness between isolates at multiple different hospitals (<20 SNV) that was not predicted from our epidemiologic data.
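Clustering isolates by pairwise SNV distance, as described above, is commonly done by single linkage: any two isolates under the threshold join the same cluster. A minimal union-find sketch under that assumption; the isolate IDs and distances are hypothetical, and this is not the CNISP/NML pipeline:

```python
def cluster_by_snv(distances, threshold=20):
    """Single-linkage clustering: isolates are joined whenever their
    pairwise SNV distance falls below the threshold (union-find).
    `distances` maps frozenset({id1, id2}) -> SNV count."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for pair, snv in distances.items():
        a, b = tuple(pair)
        find(a); find(b)            # register both isolates
        if snv < threshold:
            union(a, b)

    clusters = {}
    for isolate in list(parent):
        clusters.setdefault(find(isolate), set()).add(isolate)
    return list(clusters.values())

# Hypothetical pairwise SNV distances between four isolates.
d = {frozenset(p): s for p, s in [(("A", "B"), 5), (("A", "C"), 12),
                                  (("B", "C"), 9), (("C", "D"), 154)]}
print(cluster_by_snv(d))  # e.g. [{'A', 'B', 'C'}, {'D'}]
```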
Both intrahospital and regional interhospital spread have contributed to the emergence of VRE ST1478 infections across Canada. Whole-genome sequencing provides evidence of spread that might be missed with epidemiologic investigation alone.
Copy number variants (CNVs) have been associated with the risk of schizophrenia, autism and intellectual disability. However, little is known about the spectrum of psychopathology in adult carriers.
We investigated the psychiatric phenotypes of adult CNV carriers and compared probands, who were ascertained through clinical genetics services, with carriers who were not. One hundred twenty-four adult participants (age 18–76), each bearing one of 15 rare CNVs, were recruited through a variety of sources including clinical genetics services, charities for carriers of genetic variants, and online advertising. A battery of psychiatric assessments was used to determine psychopathology.
The frequencies of psychopathology were consistently higher for the CNV group compared to general population rates. We found particularly high rates of neurodevelopmental disorders (NDDs) (48%), mood disorders (42%), anxiety disorders (47%) and personality disorders (73%), as well as high rates of psychiatric multimorbidity (median number of diagnoses: 2 in non-probands, 3 in probands). NDDs [odds ratio (OR) = 4.67, 95% confidence interval (CI) 1.32–16.51; p = 0.017] and psychotic disorders [OR = 6.8, 95% CI 1.3–36.3; p = 0.025] occurred significantly more frequently in probands (N = 45; NDD: 39 [87%]; psychosis: 8 [18%]) than in non-probands (N = 79; NDD: 20 [25%]; psychosis: 3 [4%]). Participants also had somatic diagnoses pertaining to all organ systems, particularly conotruncal cardiac malformations (in individuals with 22q11.2 deletion syndrome specifically), and musculoskeletal, immunological, and endocrine diseases.
Adult CNV carriers had a markedly increased rate of anxiety and personality disorders not previously reported and high rates of psychiatric multimorbidity. Our findings support in-depth psychiatric and medical assessments of carriers of CNVs and the establishment of multidisciplinary clinical services.
The fossil record is notoriously imperfect and biased in representation, hindering our ability to place fossil specimens into an evolutionary context. For groups with fossil records mostly consisting of disarticulated parts (e.g., vertebrates, echinoderms, plants), the limited morphological information preserved sparks concerns about whether fossils retain reliable evidence of phylogenetic relationships and lends uncertainty to analyses of diversification, paleobiogeography, and biostratigraphy in Earth's history. To address whether a fragmentary past can be trusted, we need to assess whether incompleteness affects the quality of phylogenetic information contained in fossil data. Herein, we characterize skeletal incompleteness bias in a large dataset (6585 specimens; 14,417 skeletal elements) of fossil squamates (lizards, snakes, amphisbaenians, and mosasaurs). We show that jaws + palatal bones, vertebrae, and ribs appear more frequently in the fossil record than other parts of the skeleton. This incomplete anatomical representation in the fossil record is biased against regions of the skeleton that contain the majority of morphological phylogenetic characters used to assess squamate evolutionary relationships. Despite this bias, parsimony- and model-based comparative analyses indicate that the most frequently occurring parts of the skeleton in the fossil record retain levels of phylogenetic signal similar to those of rarer skeletal parts. These results demonstrate that the biased squamate fossil record contains reliable phylogenetic information and support our ability to place incomplete fossils in the tree of life.
Obesity increases the risk of post-operative arrhythmias in adults undergoing cardiac surgery, but little is known regarding the impact of obesity on post-operative arrhythmias after surgery for congenital heart disease (CHD).
Patients undergoing CHD surgery from 2007 to 2019 were prospectively enrolled in the parent study. Telemetry was assessed daily, with documentation of all arrhythmias. Patients aged 2–20 years were categorised by body mass index percentile for age and sex (underweight <5th, normal 5th–85th, overweight 85th–95th, and obese >95th percentile). Patients aged >20 years were categorised using absolute body mass index. We investigated the impact of body mass index category on arrhythmias using univariate and multivariate analyses.
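The dual categorisation rule (percentile-based for ages 2–20, absolute body mass index for adults) translates directly into code. A minimal sketch; the adult cut points below are the conventional ones and are an assumption, since the abstract does not state them, and the percentile itself must come from an external age- and sex-specific growth reference:

```python
def bmi_category(age_years, bmi, bmi_percentile=None):
    """Mirror the study's grouping: percentile cut points for ages 2-20,
    absolute BMI for ages > 20. `bmi_percentile` must come from an age- and
    sex-specific growth reference (e.g. CDC charts); computing it is
    outside this sketch."""
    if 2 <= age_years <= 20:
        if bmi_percentile is None:
            raise ValueError("percentile required for ages 2-20")
        if bmi_percentile < 5:
            return "underweight"
        if bmi_percentile < 85:
            return "normal"
        if bmi_percentile < 95:
            return "overweight"
        return "obese"
    # Adults: conventional absolute BMI cut points (an assumption; the
    # abstract does not list the adult thresholds it used).
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal"
    if bmi < 30:
        return "overweight"
    return "obese"

print(bmi_category(12, bmi=22.1, bmi_percentile=91))  # overweight
print(bmi_category(24, bmi=31.0))                     # obese
```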
There were 1250 operative cases: 12% underweight, 65% normal weight, 12% overweight, and 11% obese. Post-operative arrhythmias were observed in 38%. Body mass index was significantly higher in those with arrhythmias (18.8 versus 17.8, p = 0.003). There was a linear relationship between body mass index category and incidence of arrhythmias: underweight 33%, normal 38%, overweight 42%, and obese 45% (p = 0.017 for trend). In multivariate analysis, body mass index category was independently associated with post-operative arrhythmias (p = 0.021), with an odds ratio (OR) of 1.64 in obese patients compared with normal-weight patients (p = 0.036). In addition, aortic cross-clamp time (OR 1.007, p = 0.002) and maximal vasoactive–inotropic score in the first 48 hours (OR 1.03, p = 0.04) were associated with post-operative arrhythmias.
Body mass index is independently associated with incidence of post-operative arrhythmias in children after CHD surgery.
Despite the numerous advantages of central venous catheters (CVCs), they have been associated with a variety of complications. Surveillance for mechanical complications of CVCs is not routine, so the true incidence and impact of this adverse patient outcome remain unclear.
Prospectively collected CVC data on mechanical complications were reviewed from a centralized database for all in-hospital patient days at our tertiary-care hospital from January 2001 to June 2016 in patients aged <19 years. Patient demographics, CVC characteristics, and rates of mechanical complications per 1,000 days of catheter use were described.
In total, 8,747 CVCs were placed in 5,743 patients during the study period, which captured 780,448 catheter days. The overall mechanical complication rate was 6.1 per 1,000 catheter days (95% confidence interval [CI], 5.9–6.3). The highest complication rates were in nontunneled lines; this was consistent throughout the 15-year study period. Also, 521 CVCs (∼6%) were removed due to mechanical complications before therapy termination. Catheters with tip location in the superior vena cava or right atrium had the fewest complications.
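The headline rate is simple arithmetic: events divided by catheter days, scaled to 1,000, with a Poisson-based interval. In the sketch below, the event count (~4,760) is back-calculated from the reported 6.1 per 1,000 over 780,448 catheter days; it is an inference, not a figure from the paper:

```python
import math

def rate_per_1000(events, catheter_days, z=1.96):
    """Rate per 1,000 catheter days with a normal-approximation 95% CI
    (sqrt(events)/catheter_days scales the Poisson standard error)."""
    rate = events / catheter_days * 1000
    se = math.sqrt(events) / catheter_days * 1000
    return rate, rate - z * se, rate + z * se

# ~4,760 events inferred from the reported 6.1/1,000 and 780,448
# catheter days -- a back-calculation, not a figure from the paper.
rate, lo, hi = rate_per_1000(4760, 780_448)
print(f"{rate:.1f} per 1,000 catheter days (95% CI, {lo:.1f}-{hi:.1f})")
```

Running this reproduces the reported interval (6.1; 95% CI, 5.9–6.3), which is a useful consistency check on the back-calculated count.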
Mechanical complications of CVCs are a common and significant event in the pediatric population. We propose that CVC-associated mechanical complications become a routinely reported patient safety outcome.
Bacillus pumilus SAFR-032, an endospore-forming bacterial strain, was investigated to determine how its methylation pattern (methylome) changed, compared to ground control, after direct exposure to space conditions onboard the International Space Station (ISS) for 1.5 years. The ISS-flown and non-flown strains were sequenced using the Nanopore MinION, and an in-house method and pipeline were used to identify methylated positions in the genome. Our analysis indicated that genomic variants and m6A methylation increased in the ISS-flown SAFR-032. To complement the broader omics investigation and explore phenotypic changes, ISS-flown and non-flown strains were compared in a series of laboratory-based chamber experiments using an X-ray irradiation source (doses of 250, 500, 750, 1000, and 1250 Gy); results show a potentially higher survival fraction of ISS-flown DS2 at the two highest exposures. Taken together, results from this study document lasting changes to the genome by methylation, potentially triggered by conditions in spaceflight, with functional consequences for the resistance of bacteria to stressors expected on long-duration missions beyond low Earth orbit.
Documenting phylogenetic diversity for conservation practice allows elucidation of ecosystem functioning and processes by highlighting the commonality and divergence of species’ functional traits within their evolutionary context. Conserving distinct evolutionary histories has intrinsic value, and the conservation of phylogenetically diverse communities is more likely to preserve distinct or relic evolutionary lineages. We explored the potential for anthropogenic forest fragmentation to act as a selective filter of avian phylogenetic diversity within the community of forest-dependent birds of the critically endangered Indian Ocean Coastal Belt Forest (IOCBF), South Africa. We conducted avian point count surveys during the austral breeding season and calculated fragmentation metrics of forest structural complexity, patch size, and isolation. We constructed a maximum likelihood phylogeny using the combined analysis of two mitochondrial genes and three nuclear markers and measured the influence of the fragmentation metrics on six measures of phylogenetic diversity. Our results indicated that the avian community was variously affected by anthropogenic forest fragmentation, with the different metrics of phylogenetic diversity responding with no definitive overall pattern. However, forest structural complexity emerged as an important metric explaining phylogenetic structuring. While the avian community’s phylogenetic diversity displayed resilience to anthropogenic fragmentation, previous research showed a reduction in functional diversity along the fragmentation gradient. Therefore, we recommend that studies, especially those aiming to guide conservation management, incorporate both phylogenetic and functional diversity measures to adequately interrogate communities’ resilience to the threats under investigation.
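Of the many phylogenetic diversity measures, Faith's PD (the total branch length spanned by a set of taxa on a phylogeny) is the most widely used; the abstract does not name its six measures, so the sketch below is illustrative only, with a hypothetical toy tree encoded as parent pointers:

```python
# Faith's phylogenetic diversity: sum of the branch lengths on the
# paths connecting a set of taxa (here, up to the root). The tree and
# branch lengths are hypothetical, purely for illustration.
TREE = {  # node -> (parent, branch length to parent)
    "A": ("n1", 1.0), "B": ("n1", 1.5), "C": ("n2", 2.0),
    "n1": ("n2", 0.5), "n2": (None, 0.0),
}

def faith_pd(taxa):
    used = set()
    total = 0.0
    for t in taxa:
        node = t
        while node is not None and node not in used:
            used.add(node)                 # count each branch once
            parent, length = TREE[node]
            total += length
            node = parent
    return total

print(faith_pd({"A", "B"}))       # 1.0 + 1.5 + 0.5 = 3.0
print(faith_pd({"A", "B", "C"}))  # adds C's 2.0 branch -> 5.0
```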
Herbicides with soil-residual activity have the potential for carryover into subsequent crops, resulting in injury to sensitive crops and limiting productivity if severe. The increased use of soil-residual herbicides in the United States for management of troublesome weeds in corn- and soybean-cropping systems has potential to result in more cases of carryover. Soil management practices have different effects on the soil environment, potentially influencing herbicide degradation and likelihood of carryover. Field experiments were conducted at three sites in 2019 and 2020 to determine the effects of corn (clopyralid and mesotrione) and soybean (fomesafen and imazethapyr) herbicides applied in the fall at reduced rates (25% and 50% of labeled rates) and three soil management practices (tillage, no-tillage, and a fall-established cereal rye cover crop) on subsequent growth and productivity of the cereal rye cover crop and the soybean and corn crops, respectively. Most response variables (cereal rye biomass and crop canopy cover at cover crop termination in the spring, early-season crop stand and herbicide injury ratings, and crop yield) were not affected by herbicide carryover. Corn yield was lower when soil was managed with a cereal rye cover crop compared with tillage at all three sites, while yield was lower for no-till compared with tillage at two sites. Soybean yield was lower when managed with a cereal rye cover crop compared with tillage and no-till at one site. Findings from this research indicate a low carryover risk for these herbicides across site-years when label rotational restrictions are followed and environmental conditions favorable for herbicide degradation exist, regardless of soil management practice on silt loam or silty clay loam soil types in the U.S. Midwest region.
We conducted a retrospective review of a hybrid antimicrobial restriction process demonstrating adherence to appropriate use criteria in 72% of provisional-only orders, in 100% of provisional orders followed by infectious diseases (ID) orders, and in 97% of ID-initiated orders. Therapy interruptions occurred in 24% of provisional orders followed by ID orders.
While understanding of health care workers' lived experience during coronavirus disease 2019 (COVID-19) is growing, the experiences of those utilizing emergency health care services (EHS) during the pandemic have yet to be fully appreciated.
The objective of this research was to explore lived experience of EHS utilization in Victoria, Australia during the COVID-19 pandemic from March 2020 through March 2021.
An explorative qualitative design underpinned by a phenomenological approach was applied. Data were collected through semi-structured, in-depth interviews, which were transcribed verbatim and analyzed using Colaizzi’s approach.
Qualitative data were collected from 67 participants aged 32 to 78 years (mean age, 52 years). Just over half of the participants were male (54%), and three-quarters lived in metropolitan regions (75%). Four key themes emerged from data analysis: (1) concerns regarding exposure and infection delayed EHS utilization among participants with chronic health conditions; (2) participants with acute health conditions expressed concern regarding the impact of COVID-19 on their care, but continued to access services as required; (3) participants caring for people with sensory and developmental disabilities identified unique communication needs during interactions with EHS during the pandemic, with communication with emergency health care workers wearing personal protective equipment (PPE) a key challenge and face masks especially problematic for people who are deaf or hard-of-hearing; and (4) children and older people also experienced communication challenges associated with PPE, and connection with emergency health care workers was important for a positive experience of EHS throughout the pandemic.
This research provides an important insight into the lived experience of EHS utilization during the COVID-19 pandemic, a perspective currently lacking in the published peer-reviewed literature.
Food manufacturers are under increasing pressure to limit the amount of free sugars in their products. Many have reformulated products to replace sucrose, glucose and fructose with alternative sweeteners, but some of these have been associated with additional health concerns. Rare sugars are ‘monosaccharides and their derivatives that hardly exist in nature’, and there is increasing evidence that they could have health benefits. This review aimed to scope the existing literature in order to identify the most commonly researched rare sugars, to ascertain their proposed health benefits, mechanisms of action and potential uses and to highlight knowledge gaps. A process of iterative database searching identified fifty-five relevant articles. The reported effects of rare sugars were noted, along with details of the research methodologies conducted. Our results indicated that the most common rare sugars investigated are d-psicose and d-tagatose, with the potential health benefits divided into three topics: glycaemic control, body composition and CVD. All the rare sugars investigated have the potential to suppress postprandial elevation of blood glucose and improve glycaemic control in both human and animal models. Some animal studies have suggested that certain rare sugars may also improve lipid profiles, alter the gut microbiome and reduce pro-inflammatory cytokine expression. The present review demonstrates that rare sugars could play a role in reducing the development of obesity, type 2 diabetes and/or CVD. However, understanding of the mechanisms by which rare sugars may exert their effects is limited, and their effectiveness when used in reformulated products is unknown.
Vast disparities between and within American states’ responses to the COVID-19 pandemic have evoked renewed attention to whether greater centralization might enhance investments in subnational capacity and remedy subnational inequalities or instead erode subnational organizational capacity. Developments in American public education (1997–2015) offer perspective on this puzzle, which we examine by applying interrupted time series analysis to a novel dataset to assess the implications of centralization on subnational investments in administrative and technical capacity, two dimensions of organizational capacity. We find simultaneous subnational erosion in administrative capacity and growth in technical capacity following centralization, both of which appear concentrated in low-poverty areas despite centralization’s explicit antipoverty purposes. Public education reforms highlight both the challenge of dismantling subnational inequality through centralization and the need for future research on policy designs that enable centralization to yield subnational capacity that is able to remedy inequality.
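Interrupted time series analysis of this kind is typically specified as a segmented regression: a pre-existing trend plus a level shift and a slope change that switch on at the break point. A minimal statsmodels sketch on simulated data; the break year, coefficients, and "capacity index" below are all invented for illustration and are not the paper's model or dataset:

```python
import numpy as np
import statsmodels.api as sm

# Segmented (interrupted time series) regression:
#   y_t = b0 + b1*t + b2*post_t + b3*(t - t0)*post_t + e_t
# b2 captures the level shift at centralization, b3 the slope change.
rng = np.random.default_rng(0)
t = np.arange(1997, 2016)                  # study window, 1997-2015
t0 = 2002                                  # hypothetical reform year
post = (t >= t0).astype(float)
y = 5.0 + 0.2 * (t - t[0]) - 1.5 * post - 0.3 * (t - t0) * post \
    + rng.normal(0, 0.3, t.size)           # simulated capacity index

X = sm.add_constant(np.column_stack([t - t[0], post, (t - t0) * post]))
fit = sm.OLS(y, X).fit()
print(fit.params)  # [intercept, pre-trend, level change, slope change]
```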