From Ayurvedic texts to botanical medicines to genomics, ideas and expertise about veterinary healing have circulated between cultures through travel, trade, and conflict. In this broad-ranging and accessible study spanning 400 years of history, Susan D. Jones and Peter A. Koolmees present the first global history of veterinary medicine and animal healing. Drawing on inter-disciplinary and multi-disciplinary perspectives, this book addresses how attitudes toward animals, disease causation theories, wars, problems of food insecurity and the professionalization and spread of European veterinary education have shaped new domains for animal healing, such as preventive medicine in intensive animal agriculture and the need for veterinarians specializing in zoo animals, wildlife, and pets. It concludes by considering the politicization of animal protection, changes in the global veterinary workforce, and concerns about disease and climate change. As mediators between humans and animals, veterinarians and other animal healers have both shaped, and been shaped by, the social, cultural, and economic roles of animals over time.
Objective:
To describe national trends in testing and detection of carbapenemases
produced by carbapenem-resistant Enterobacterales (CRE) and to associate
testing with culture and facility characteristics.
Design:
Retrospective cohort study.
Setting:
Department of Veterans Affairs medical centers (VAMCs).
Participants:
Patients seen at VAMCs between 2013 and 2018 with cultures positive for CRE,
defined by national VA guidelines.
Interventions:
Microbiology and clinical data were extracted from national VA data sets.
Carbapenemase testing was summarized using descriptive statistics.
Characteristics associated with carbapenemase testing were assessed with
bivariate analyses.
Results:
Of 5,778 standard cultures that grew CRE, 1,905 (33.0%) had evidence of
molecular or phenotypic carbapenemase testing and 1,603 (84.1%) of these had
carbapenemases detected. Among these cultures confirmed as
carbapenemase-producing CRE, 1,053 (65.7%) had molecular testing for
≥1 gene. Almost all testing included KPC (n = 1,047, 99.4%), with KPC
detected in 914 of 1,047 (87.3%) cultures. Testing and detection of other
enzymes was less frequent. Carbapenemase testing increased over the study
period from 23.5% of CRE cultures in 2013 to 58.9% in 2018. The South
(38.6%) and Northeast (37.2%) US Census regions had the highest
proportions of CRE cultures with carbapenemase testing. High complexity (vs
low) and urban (vs rural) facilities were significantly associated with
carbapenemase testing (P < .0001).
Conclusions:
Between 2013 and 2018, carbapenemase testing and detection increased in the
VA, largely reflecting increased testing and detection of KPC. Surveillance
of other carbapenemases is important due to global spread and increasing
antibiotic resistance. Efforts supporting the expansion of carbapenemase
testing to low-complexity, rural healthcare facilities and standardization
of reporting of carbapenemase testing are needed.
Background:
The advent of real-time quaking-induced conversion (RT-QuIC) assays has transformed the diagnostic approach to sporadic Creutzfeldt-Jakob disease (CJD), facilitating earlier recognition of affected patients. Recognizing this, we evaluated the performance of clinical features and diagnostic tests for CJD in the modern era.
Methods:
Clinical data were extracted from the electronic medical records of 115 patients with probable or definite CJD assessed at Mayo Clinic from 2014 to 2021. Clinical features and diagnostic tests were evaluated at presentation, and associations with diagnosis and prognosis were determined.
Results:
Mean age at symptom onset was 64.8±9.4 years; 68 patients were female (59%). The sensitivity of clinical markers (myoclonus) and tests historically considered in patients with suspected CJD was poor (stereotyped EEG abnormalities, 16%; CSF 14-3-3, 60%). Conversely, RT-QuIC (93%), t-tau >1149 pg/mL (88%), and characteristic signal abnormalities on MRI (77%) identified most patients. Multivariable linear regression confirmed shorter days-to-death in patients with myoclonus (125.9, CI95% 23.3-15.5, p=0.026), visual/cerebellar signs (180.19, CI95% 282.2-78.2, p<0.001), positive 14-3-3 (193, CI95% 304.9-82.9; p<0.001), and elevated t-tau (9.0, CI95% 1.0-18.0, for every 1000 pg/mL elevation; p=0.041).
Conclusions:
CSF RT-QuIC, elevated t-tau, and stereotyped MRI abnormalities were consistently detected in CJD patients. Myoclonus, EEG findings, and CSF protein 14-3-3 were less useful in the modern era.
A spacetime formulation is presented to solve unsteady aerodynamic problems involving large deformation or topological change such as store separation, slat and flap deployment or spoiler deflection. This technique avoids complex CFD meshing methods, such as Chimera, by the use of a finite-volume approach both in space and time, and permits a locally varying real timestep. The use of a central-difference scheme in the time direction can yield non-physical transient solutions as a consequence of information travelling backwards in time. Therefore, an upwind formulation is provided and validated against one-dimensional and two-dimensional test cases. A hybrid formulation (central in space, upwind in time) is also given and unsteady cases are computed for a spoiler and spoiler/flap deployment, with all three formulations compared, demonstrating that the use of an upwind time stencil yields more representative physical solutions and improves the rate of convergence.
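A generic illustration (not the authors' finite-volume spacetime solver) of why an upwind stencil respects the direction in which information travels: for 1D linear advection with positive wave speed, a left-biased update propagates the profile monotonically, whereas a centred stencil can introduce spurious oscillations. The same principle motivates upwinding in the time direction above.

```python
# First-order upwind update for u_t + a u_x = 0 with a > 0 (left-biased
# stencil). Grid and step sizes are illustrative.
def upwind_step(u, a, dt, dx):
    """One explicit first-order upwind step; assumes a > 0."""
    c = a * dt / dx  # CFL number; stable for 0 <= c <= 1
    return [u[0]] + [u[i] - c * (u[i] - u[i - 1]) for i in range(1, len(u))]

u = [1.0] * 5 + [0.0] * 5                      # step profile
for _ in range(4):
    u = upwind_step(u, a=1.0, dt=0.5, dx=1.0)  # c = 0.5
print(u)  # the front moves right, smeared but monotone (no oscillations)
```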
The goal for many PhD students in archaeology is tenure-track employment. Students primarily receive their training from tenure-track or tenured professors, and they are often tacitly expected—or explicitly encouraged—to follow in the footsteps of their advisor. However, the career trajectories that current and recent PhD students follow may bear little resemblance to the ones experienced by their advisors. To understand these different paths and to provide information for current PhD students considering pursuing a career in academia, we surveyed 438 archaeologists holding tenured or tenure-track positions in the United States. The survey, recorded in 2019, posed a variety of questions regarding the personal experiences of individual professors. The results are binned by the decade in which the respondent graduated. Evident patterns are discussed in terms of change over time. The resulting portraits of academic pathways through the past five decades indicate that although broad commonalities exist in the qualifications of early career academics, there is no singular pathway to obtaining tenure-track employment. We highlight the commonalities revealed in our survey to provide a set of general qualifications that might provide a baseline set of skills and experiences for an archaeologist seeking a tenure-track job in the United States.
Objective:
To assess the impact of the coronavirus disease 2019 (COVID-19) pandemic on healthcare-associated infections (HAIs) reported from 128 acute-care and 132 long-term care Veterans Affairs (VA) facilities.
Methods:
We compared rates of central-line–associated bloodstream infections (CLABSIs), ventilator-associated events (VAEs), catheter-associated urinary tract infections (CAUTIs), methicillin-resistant Staphylococcus aureus (MRSA) infections, and Clostridioides difficile infections reported from each facility monthly to a centralized database before the pandemic (February 2019 through January 2020) and during the pandemic (July 2020 through June 2021).
Results:
Nationwide VA COVID-19 admissions peaked in January 2021. Significant increases in the rates of CLABSIs, VAEs, and MRSA all-site HAIs (but not MRSA CLABSIs) were observed during the pandemic in acute-care facilities. There was no significant change in CAUTI rates, and C. difficile rates significantly decreased. There were no significant increases in HAIs in long-term care facilities.
Conclusions:
The COVID-19 pandemic had a differential impact on HAIs of various types in VA acute care, with many rates increasing. The decrease in CDI HAIs may be due, in part, to evolving diagnostic testing. The minimal impact of COVID-19 in VA long-term facilities may reflect differences in patient numbers and acuity and early recognition of the impact of the pandemic on nursing home residents leading to increased vigilance and optimization of infection prevention and control practices in that setting. These data support the need for building and sustaining conventional infection prevention and control strategies before and during a pandemic.
A comparison of computer-extracted and facility-reported counts of hospitalized coronavirus disease 2019 (COVID-19) patients for public health reporting at 36 hospitals revealed 42% of days with matching counts between the data sources. Miscategorization of suspect cases was a primary driver of discordance. Clear reporting definitions and data validation facilitate emerging disease surveillance.
Objective:
To investigate a cluster of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections in employees working on 1 floor of a hospital administration building.
Methods:
Contact tracing was performed to identify potential exposures and all employees were tested for SARS-CoV-2. Whole-genome sequencing was performed to determine the relatedness of SARS-CoV-2 samples from infected personnel and from control cases in the healthcare system with coronavirus disease 2019 (COVID-19) during the same period. Carbon dioxide levels were measured during a workday to assess adequacy of ventilation; readings >800 parts per million (ppm) were considered an indication of suboptimal ventilation. To assess the potential for airborne transmission, DNA-barcoded aerosols were released, and real-time polymerase chain reaction was used to quantify particles recovered from air samples in multiple locations.
Results:
Between December 22, 2020, and January 8, 2021, 17 coworkers tested positive for SARS-CoV-2, including 13 symptomatic and 4 asymptomatic individuals. Of the 5 cluster SARS-CoV-2 samples sequenced, 3 were genetically related, but these employees denied higher-risk contacts with one another. None of the sequences from the cluster were genetically related to the 17 control sequences of SARS-CoV-2. Carbon dioxide levels increased during a workday but never exceeded 800 ppm. DNA-barcoded aerosol particles were dispersed from the sites of release to locations throughout the floor; 20% of air samples had >1 log10 particles.
Conclusions:
In a hospital administration building outbreak, sequencing of SARS-CoV-2 confirmed transmission among coworkers. Transmission occurred despite the absence of higher-risk exposures and in a setting with adequate ventilation based on monitoring of carbon dioxide levels.
Many short gamma-ray bursts (GRBs) originate from binary neutron star mergers, and several theories predict the production of coherent, prompt radio signals prior to, during, or shortly following the merger, as well as persistent pulsar-like emission from the spin-down of a magnetar remnant. Here we present a low-frequency (170–200 MHz) search for coherent radio emission associated with nine short GRBs detected by the Swift and/or Fermi satellites using the Murchison Widefield Array (MWA) rapid-response observing mode. The MWA began observing these events within 30–60 s of their high-energy detection, enabling us to capture any dispersion-delayed signals emitted by short GRBs for a typical range of redshifts. We conducted transient searches at the GRB positions on timescales of 5 s, 30 s, and 2 min, resulting in the most constraining flux density limits on any associated transient of 0.42, 0.29, and 0.084 Jy, respectively. We also searched for dispersed signals at a temporal and spectral resolution of 0.5 s and 1.28 MHz, but none were detected. However, the fluence limit of 80–100 Jy ms derived for GRB 190627A is the most stringent to date for a short GRB. Assuming the formation of a stable magnetar for this GRB, we compared the fluence and persistent emission limits to short GRB coherent emission models, placing constraints on key parameters including the radio emission efficiency of the nearly merged neutron stars ($\epsilon_r\lesssim10^{-4}$), the fraction of magnetic energy in the GRB jet ($\epsilon_B\lesssim2\times10^{-4}$), and the radio emission efficiency of the magnetar remnant ($\epsilon_r\lesssim10^{-3}$). Comparing the limits derived for our full GRB sample (along with those in the literature) to the same emission models, we demonstrate that our fluence limits place only weak constraints on the prompt emission predicted from the interaction between the relativistic GRB jet and the interstellar medium for a subset of magnetar parameters. However, the 30-min flux density limits were sensitive enough to theoretically detect the persistent radio emission from magnetar remnants up to a redshift of $z\sim0.6$. Our non-detection of this emission could imply that some GRBs in the sample were not genuinely short or did not result from a binary neutron star merger, that the GRBs were at high redshifts, that these mergers formed atypical magnetars, that the radiation beams of the magnetar remnants pointed away from Earth, or that the majority did not form magnetars but instead collapsed directly into black holes.
We present a novel approach to developing a unified radiocarbon-based chronology for multiple sediment cores from a location where radiocarbon dating is challenging. We used 36 radiocarbon ages from eight terminal Pleistocene and Holocene sediment cores with correlated stratigraphies. Stratigraphic correlation was accomplished using a combination of high-resolution photography, high-resolution X-ray fluorescence-based elemental composition data, and volcanic tephra identification. Results show that despite problems associated with potential contamination or radiocarbon reservoir effect, a useful age-depth model has been created for the correlated lacustrine sections of these eight sediment cores, providing chronological controls for future paleoenvironmental analyses of the cores.
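The simplest form of an age-depth model like the one described above is linear interpolation between dated horizons; published chronologies more often use Bayesian accumulation models (e.g. Bacon) instead. The depths and calibrated ages below are invented for illustration.

```python
# Linear-interpolation age-depth model. Tie points are hypothetical.
def age_at(depth, tie_points):
    """Interpolate calibrated age (cal yr BP) at depth (cm).
    tie_points: sorted list of (depth_cm, age_cal_bp) pairs."""
    for (d0, a0), (d1, a1) in zip(tie_points, tie_points[1:]):
        if d0 <= depth <= d1:
            return a0 + (a1 - a0) * (depth - d0) / (d1 - d0)
    raise ValueError("depth outside the dated interval")

ties = [(10, 1200), (55, 5600), (120, 11500)]  # invented (depth, age) pairs
print(age_at(80, ties))
```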
Racial and ethnic groups in the USA differ in the prevalence of posttraumatic stress disorder (PTSD). Recent research, however, has not observed consistent racial/ethnic differences in posttraumatic stress in the early aftermath of trauma, suggesting that such differences in chronic PTSD rates may be related to differences in recovery over time.
Methods
As part of the multisite, longitudinal AURORA study, we investigated racial/ethnic differences in PTSD and related outcomes within 3 months after trauma. Participants (n = 930) were recruited from emergency departments across the USA and provided periodic (2 weeks, 8 weeks, and 3 months after trauma) self-report assessments of PTSD, depression, dissociation, anxiety, and resilience. Linear models were completed to investigate racial/ethnic differences in posttraumatic dysfunction with subsequent follow-up models assessing potential effects of prior life stressors.
Results
Racial/ethnic groups did not differ in symptoms over time; however, Black participants showed reduced posttraumatic depression and anxiety symptoms overall compared to Hispanic participants and White participants. Racial/ethnic differences were not attenuated after accounting for differences in sociodemographic factors. However, racial/ethnic differences in depression and anxiety were no longer significant after accounting for greater prior trauma exposure and childhood emotional abuse in White participants.
Conclusions
The present findings suggest prior differences in previous trauma exposure partially mediate the observed racial/ethnic differences in posttraumatic depression and anxiety symptoms following a recent trauma. Our findings further demonstrate that racial/ethnic groups show similar rates of symptom recovery over time. Future work utilizing longer time-scale data is needed to elucidate potential racial/ethnic differences in long-term symptom trajectories.
Necrotising otitis externa is a severe ear infection for which there are no established diagnostic or treatment guidelines.
Method
This study described clinical characteristics, management and outcomes for patients managed as necrotising otitis externa cases at a UK tertiary referral centre.
Results
A total of 58 (63 per cent) patients were classified as definite necrotising otitis externa cases, 31 (34 per cent) as probable cases and 3 (3 per cent) as possible cases. Median duration of intravenous and oral antimicrobial therapy was 6.0 weeks (0.49–44.9 weeks). Six per cent of patients relapsed a median of 16.4 weeks (interquartile range, 23–121) after stopping antimicrobials. Twenty-eight per cent of cases had complex disease. These patients were older (p = 0.042), had a longer duration of symptoms prior to imaging (p < 0.0001) and higher C-reactive protein at diagnosis (p = 0.005). Despite longer courses of intravenous antimicrobials (23 vs 14 days; p = 0.032), complex cases were more likely to relapse (p = 0.016).
Conclusion
A standardised case-definition of necrotising otitis externa is needed to optimise diagnosis, management and research.
Many conservation initiatives call for ‘transformative change’ to counter biodiversity loss, climate change, and injustice. The term connotes fundamental, broad, and durable changes to human relationships with nature. However, if oversimplified or overcomplicated, or not focused enough on power and the political action necessary for change, associated initiatives can perpetuate or exacerbate existing crises. This article aims to help practitioners deliberately catalyze and steer transformation processes. It provides a theoretically and practically grounded definition of ‘transformative conservation’, along with six strategic, interlocking recommendations. These cover systems pedagogy, political mobilization, inner transformation, as well as planning, action, and continual adjustment.
Technical summary
Calls for ‘transformative change’ point to the fundamental reorganization necessary for global conservation initiatives to stem ecological catastrophe. However, the concept risks being oversimplified or overcomplicated, and focusing too little on power and the political action necessary for change. Accordingly, its intersection with contemporary biodiversity and climate change mitigation initiatives needs explicit deliberation and clarification. This article advances the praxis of ‘transformative conservation’ as both (1) a desired process that rethinks the relationships between individuals, society, and nature, and restructures systems accordingly, and (2) a desired outcome that conserves biodiversity while justly transitioning to net zero emission economies and securing the sustainable and regenerative use of natural resources. It first reviews criticisms of area-based conservation targets, natural climate solutions, and nature-based solutions that are framed as transformative, including issues of ecological integrity, livelihoods, gender, equity, growth, power, participation, knowledge, and governance. It then substantiates six strategic recommendations designed to help practitioners deliberately steer transformation processes. These include taking a systems approach; partnering with political movements to achieve equitable and just transformation; linking societal with personal (‘inner’) transformation; updating how we plan; facilitating shifts from diagnosis and planning to action; and improving our ability to adjust to transformation as it occurs.
Social media summary
Curious about stemming the global biodiversity and climate crises? Browse this article on transformative conservation!
Dietary pattern analysis is typically based on dimension reduction and summarises the diet with a small number of scores. We assess ‘joint and individual variance explained’ (JIVE) as a method for extracting dietary patterns from longitudinal data that highlights elements of the diet that are associated over time. The Auckland Birthweight Collaborative Study, in which participants completed an FFQ at ages 3·5 (n 549), 7 (n 591) and 11 (n 617), is used as an example. Data from each time point are projected onto the directions of shared variability produced by JIVE to yield dietary patterns and scores. We assess the ability of the scores to predict future BMI and blood pressure measurements of the participants and make a comparison with principal component analysis (PCA) performed separately at each time point. The diet could be summarised with three JIVE patterns. The patterns were interpretable, with the same interpretation across age groups: a vegetable and whole grain pattern, a sweets and meats pattern and a cereal v. sweet drinks pattern. The first two PCA-derived patterns were similar across age groups and similar to the first two JIVE patterns. The interpretation of the third PCA pattern changed across age groups. Scores produced by the two techniques were similarly effective in predicting future BMI and blood pressure. We conclude that when data from the same participants at multiple ages are available, JIVE provides an advantage over PCA by extracting patterns with a common interpretation across age groups.
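The intuition behind extracting shared variability can be sketched by column-stacking the time-point matrices and finding the leading principal axis of the combined data; this is a simplification of JIVE, which additionally separates joint from individual structure. The data below are invented, with two "foods" co-varying at both ages.

```python
# Power iteration for the leading principal axis of stacked data.
# Data are toy values, not study measurements.
def leading_pc(rows, iters=200):
    """First principal axis of mean-centred rows, by power iteration."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    x = [[r[j] - means[j] for j in range(p)] for r in rows]
    v = [1.0] * p
    for _ in range(iters):
        scores = [sum(xi[j] * v[j] for j in range(p)) for xi in x]          # X v
        w = [sum(scores[i] * x[i][j] for i in range(n)) for j in range(p)]  # X^T X v
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

age3 = [[2, 2, 0], [4, 4, 1], [6, 6, 0], [8, 8, 1]]  # FFQ-like scores, age 3.5
age7 = [[1, 1, 1], [3, 3, 0], [5, 5, 1], [7, 7, 0]]  # same participants, age 7
stacked = [a + b for a, b in zip(age3, age7)]        # participants x (2 ages x 3 foods)
v = leading_pc(stacked)
print([round(c, 2) for c in v])  # large loadings on foods 0-1 at both ages
```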
The objective was to use bibliometric analysis to create an infographic of motor unit number estimation methods over the past 50 years. The original method was published in 1971, but secondary and tertiary waves of research using alternative methods occurred in the early 2000s and a decade later. A metric of influence was used to determine if different methods had clear peaks of use over the past 50 years. While the original method continues to register influence, the MUNIX method introduced in 2004 stands out as the most influential method to estimate the innervation status of skeletal muscles.
The metabolic syndrome is common in older adults and may be modified by the diet. The aim of this study was to examine associations between a posteriori dietary patterns and the metabolic syndrome in an older New Zealand population. The REACH study (Researching Eating, Activity, and Cognitive Health) included 366 participants (aged 65–74 years, 36 % male) living independently in Auckland, New Zealand. Dietary data were collected using a 109-item FFQ with demonstrated validity and reproducibility for assessing dietary patterns using principal component analysis. The metabolic syndrome was defined by the National Cholesterol Education Program Adult Treatment Panel III. Associations between dietary patterns and the metabolic syndrome, adjusted for age, sex, index of multiple deprivation, physical activity, and energy intake were analysed using logistic regression analysis. Three dietary patterns explained 18 % of dietary intake variation – ‘Mediterranean style’ (salad/leafy cruciferous/other vegetables, avocados/olives, alliums, nuts/seeds, shellfish and white/oily fish, berries), ‘prudent’ (dried/fresh/frozen legumes, soya-based foods, whole grains and carrots) and ‘Western’ (processed meat/fish, sauces/condiments, cakes/biscuits/puddings and meat pies/hot chips). No associations were seen between ‘Mediterranean style’ (OR = 0·75 (95 % CI 0·53, 1·06), P = 0·11) or ‘prudent’ (OR = 1·17 (95 % CI 0·83, 1·59), P = 0·35) patterns and the metabolic syndrome after co-variate adjustment. The ‘Western’ pattern was positively associated with the metabolic syndrome (OR = 1·67 (95 % CI 1·08, 2·63), P = 0·02). There was also a small association between an index of multiple deprivation (OR = 1·04 (95 % CI 1·02, 1·06), P < 0·001) and the metabolic syndrome. This cross-sectional study provides further support for a Western dietary pattern being a risk factor for the metabolic syndrome in an older population.
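The odds ratios quoted above come from exponentiating logistic-regression coefficients, with the 95% limits taken as the exponentiated Wald bounds. The coefficient and standard error below are hypothetical values chosen for illustration, not the study's fitted estimates.

```python
# Odds ratio and Wald 95% CI from a logit coefficient. Inputs are
# hypothetical, not the fitted 'Western' pattern estimates.
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower, upper) for a logit coefficient and its SE."""
    return tuple(math.exp(beta + k * z * se) for k in (0.0, -1.0, 1.0))

or_, lo, hi = odds_ratio_ci(beta=0.513, se=0.227)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```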
The Variables and Slow Transients Survey (VAST) on the Australian Square Kilometre Array Pathfinder (ASKAP) is designed to detect highly variable and transient radio sources on timescales from 5 s to $\sim\!5$ yr. In this paper, we present the survey description, observation strategy, and initial results from the VAST Phase I Pilot Survey. This pilot survey consists of $\sim\!162$ h of observations conducted at a central frequency of 888 MHz between 2019 August and 2020 August, with a typical rms sensitivity of $0.24\ \mathrm{mJy\ beam}^{-1}$ and an angular resolution of $12$–$20$ arcseconds. There are 113 fields, each observed with a 12-min integration time and between 5 and 13 repeats, at cadences between 1 day and 8 months. The total area of the pilot survey footprint is 5 131 square degrees, covering six distinct regions of the sky. An initial search of two of these regions, totalling 1 646 square degrees, revealed 28 highly variable and/or transient sources. Seven of these are known pulsars, including the millisecond pulsar J2039–5617. Another seven are stars, four of which have no previously reported radio detection (SCR J0533–4257, LEHPM 2-783, UCAC3 89–412162, and 2MASS J22414436–6119311). Of the remaining 14 sources, two are active galactic nuclei, six are associated with galaxies, and the other six have no multi-wavelength counterparts and are yet to be identified.
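Two standard statistics used to flag variable sources in surveys of this kind are the modulation index (fractional variability) and the chi-square of the light curve against a constant, weighted-mean flux. The sketch below uses invented flux densities with a per-epoch uncertainty comparable to the pilot survey's typical rms.

```python
# Variability metrics for a radio light curve. Flux values are invented.
def modulation_index(flux):
    """Sample standard deviation divided by the mean flux."""
    n = len(flux)
    mean = sum(flux) / n
    var = sum((f - mean) ** 2 for f in flux) / (n - 1)
    return var ** 0.5 / mean

def chi2_constant(flux, err):
    """Chi-square against the hypothesis of a constant (weighted-mean) flux."""
    w = [1.0 / e ** 2 for e in err]
    wmean = sum(wi * f for wi, f in zip(w, flux)) / sum(w)
    return sum(wi * (f - wmean) ** 2 for wi, f in zip(w, flux))

flux = [1.2, 5.6, 0.9, 4.8, 1.1]  # mJy, illustrative epochs
err = [0.24] * 5                  # comparable to the 0.24 mJy/beam rms above
print(round(modulation_index(flux), 2), round(chi2_constant(flux, err), 1))
```

A large chi-square relative to its degrees of freedom, or a high modulation index, marks a candidate variable source.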
Zirconium alloys are common fuel claddings in nuclear fission reactors and are susceptible to the effects of hydrogen embrittlement. There is a need to detect and image hydrogen at the atomic scale to gain the experimental evidence necessary to fully understand hydrogen embrittlement. Through the use of deuterium tracers, atom probe tomography (APT) is able to detect and spatially locate hydrogen at the atomic scale. Previous works have highlighted issues with quantifying deuterium concentrations using APT due to complex peak overlaps in the mass-to-charge-state ratio spectrum between molecular hydrogen and deuterium (H2 and D). In this work, we use new methods to analyze historic and simulated atom probe data, applying currently available data-analysis tools to optimize the solving of peak overlaps and thereby improve the quantification of deuterium. This method has been applied to literature data to quantify the deuterium concentrations in a concentration line profile across an α-Zr/deuteride interface.
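The overlap problem can be illustrated with a toy decomposition: H2+ and D+ both fall at a mass-to-charge ratio of ~2 Da, so the D+ signal must be recovered by estimating and subtracting the molecular-hydrogen contribution. The subtraction ratio and counts below are invented; real analyses estimate the H2/H ratio from the spectrum itself.

```python
# Toy peak-overlap correction at 2 Da. All numbers are illustrative.
def deuterium_counts(peak1_counts, peak2_counts, r_h2_per_h):
    """Estimate D+ counts at 2 Da after removing the H2+ contribution,
    where peak1_counts is the H+ peak at 1 Da and r_h2_per_h is the
    assumed ratio of H2+ counts to H+ counts."""
    h2 = r_h2_per_h * peak1_counts
    return max(peak2_counts - h2, 0.0)

print(deuterium_counts(peak1_counts=10000, peak2_counts=1500, r_h2_per_h=0.05))
```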