Debate on the use of lagged dependent variables has a long history in political science. The latest contribution to this discussion is Wilkins (2018, Political Science Research and Methods, 6, 393–411), which advocates the use of an ADL(2,1) model when there is serial dependence in the outcome and disturbance. While this specification does offer some insurance against serially correlated disturbances, it is never the best linear unbiased estimator and should not be pursued as a general strategy. First, this strategy is only appropriate when the data-generating process (DGP) actually implies a more parsimonious model. Second, when this is not the DGP (e.g., when lags of the predictors have independent effects), this strategy mischaracterizes the dynamic process. We clarify this issue and detail a Wald test that can be used to evaluate the appropriateness of the Wilkins approach. In general, we argue that researchers should always: (i) ensure models are dynamically complete and (ii) test whether more restrictive models are appropriate.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was the establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from 1 August to 8 December 2020. There were 62,970 COVID-19 tests conducted, with 1,435 people testing positive, for a positivity rate of 2.28%. A total of 1,670 COVID-19 cases were identified, of which 235 were self-reports. The mean number of tests per week was 3,500, with approximately 80 of these positive (11 per day). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of one per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
Retrospectively apply criteria from Center to Advance Palliative Care to a cohort of children treated in a cardiac ICU and compare children who received a palliative care consultation to those who were eligible for but did not receive one.
Medical records of children admitted to a cardiac ICU between January 2014 and June 2017 were reviewed. Selected criteria include cardiac ICU length of stay >14 days and/or ≥ 3 hospitalisations within a 6-month period.
Measurements and Results:
A consultation occurred in 17% (n = 48) of 288 eligible children. Children who received a consultation had longer cardiac ICU (27 versus 17 days; p < 0.001) and hospital (91 versus 35 days; p < 0.001) lengths of stay, more complex chronic conditions at the end of the first hospitalisation (3 versus 1; p < 0.001) and at the end of the study (4 versus 2; p < 0.001), and higher mortality (42% versus 7%; p < 0.001) when compared with the non-consulted group. Of the 142 pre-natally diagnosed children, only one received a pre-natal consultation and 23 received one post-natally. Children who received a consultation (n = 48) were almost 2 months of age at the time of the consultation.
Less than a quarter of eligible children received a consultation. The consultation usually occurred in the context of medical complexity, high risk of mortality, and at an older age, suggesting potential opportunities for more and earlier paediatric palliative care involvement in the cardiac ICU. Screening criteria to identify patients for a consultation may increase the use of palliative care services in the cardiac ICU.
The prespecification of the network is one of the biggest hurdles for applied researchers in undertaking spatial analysis. In this letter, we demonstrate two results. First, we derive bounds for the bias in nonspatial models with omitted spatially lagged predictors or outcomes. These bias expressions can be obtained without prior knowledge of the network, and are more informative than familiar omitted variable bias formulas. Second, we derive bounds for the bias in spatial econometric models with nondifferential error in the specification of the weights matrix. Under these conditions, we demonstrate that an omitted spatial input is the limit condition of including a misspecified spatial weights matrix. Simulated experiments further demonstrate that spatial models with a misspecified weights matrix weakly dominate nonspatial models. Our results imply that, where cross-sectional dependence is presumed, researchers should pursue spatial analysis even with limited information on network ties.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity would change the expected detection rates for post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Deaths due to opioid overdose have reached unprecedented levels in Canada; over 12,800 opioid-related deaths occurred between January 2016 and March 2019, and overdose death rates increased by approximately 50% from 2016 to 2018.1 In 2016, Health Canada declared the opioid epidemic a national public health crisis,2 and life expectancy increases have halted in Canada for the first time in decades.3 Children are not exempt from this crisis, and the Chief Public Health Officer of Canada has recently prioritized the prevention of problematic substance use among Canadian youth.4
The concentration of radiocarbon (14C) differs between ocean and atmosphere. Radiocarbon determinations from samples that obtained their 14C in the marine environment therefore need a marine-specific calibration curve and cannot be calibrated directly against the atmospheric-based IntCal20 curve. This paper presents Marine20, an update to the internationally agreed marine radiocarbon age calibration curve that provides a non-polar global-average marine record of radiocarbon from 0–55 cal kBP and serves as a baseline for regional oceanic variation. Marine20 is intended for calibration of marine radiocarbon samples from non-polar regions; it is not suitable for calibration in polar regions where variability in sea ice extent, ocean upwelling and air-sea gas exchange may have caused larger changes to concentrations of marine radiocarbon. The Marine20 curve is based upon 500 simulations with an ocean/atmosphere/biosphere box-model of the global carbon cycle that has been forced by posterior realizations of our Northern Hemispheric atmospheric IntCal20 14C curve and reconstructed changes in CO2 obtained from ice core data. These forcings enable us to incorporate carbon cycle dynamics and temporal changes in the atmospheric 14C level. The box-model simulations of the global-average marine radiocarbon reservoir age are similar to those of a more complex three-dimensional ocean general circulation model. However, the simplicity and speed of the box model allow us to use a Monte Carlo approach to rigorously propagate the uncertainty in both the historic concentration of atmospheric 14C and other key parameters of the carbon cycle through to our final Marine20 calibration curve. This robust propagation of uncertainty is fundamental to providing reliable precision for the radiocarbon age calibration of marine-based samples.
We make a first step towards deconvolving the contributions of different processes to the total uncertainty; discuss the main differences of Marine20 from the previous age calibration curve, Marine13; and identify the limitations of our approach together with key areas for further work. The updated values for ΔR, the regional marine radiocarbon reservoir age corrections required to calibrate against Marine20, can be found in the database at http://calib.org/marine/.
Preferential removal of W relative to other trace elements from zoned, W–Sn–U–Pb-bearing hematite coupled with disturbance of U–Pb isotope systematics is attributed to pseudomorphic replacement via coupled dissolution–reprecipitation reaction (CDRR). This hematite has been studied down to the nanoscale to understand the mechanisms leading to compositional and U/Pb isotope heterogeneity at the grain scale. High-Angle Annular Dark Field Scanning Transmission Electron Microscopy (HAADF STEM) imaging of foils extracted in situ from three locations across the W-rich to W-depleted domains shows lattice-scale defects and crystal structure modifications adjacent to twin planes. Secondary sets of twins and associated splays are common, but wider (up to ~100 nm) inclusion trails occur only at the boundary between the W-rich and W-depleted domains. STEM energy-dispersive X-ray mapping reveals W- and Pb-enrichment along 2–3 nm-wide features defining the twin planes; W-bearing nanoparticles occur along the splays. Tungsten and Pb are both present, albeit at low concentrations, within Na–K–Cl-bearing inclusions along the trails. HAADF STEM imaging of hematite reveals modifications relative to the ideal crystal structure. A two-fold hematite superstructure (a = b = c = 10.85 Å; α = β = γ = 55.28°) involving oxygen vacancies was constructed and assessed by STEM simulations, with a good match to the data. This model can account for significant W release during interaction with fluids percolating through twin planes and secondary structures as CDRR progresses from the zoned domain, otherwise apparently undisturbed at the micrometre scale. Lead remobilisation is confirmed here at the nanoscale and is responsible for the disturbance of U/Pb ratios in hematite affected by CDRR. Twin planes can provide pathways for fluid percolation and metal entrapment during post-crystallisation overprinting.
The presence of complex twinning can therefore be used to predict potential disturbance of isotope systems in hematite that will affect its performance as a robust geochronometer.
Prognosis and disposition among older emergency department (ED) patients with suspected infection remains challenging. Frailty is increasingly recognized as a predictor of poor prognosis among critically ill patients; however, its association with clinical outcomes among older ED patients with suspected infection is unknown.
We conducted a multicenter prospective cohort study at two tertiary care EDs. We included older ED patients (≥75 years) with suspected infection. Frailty at baseline (before index illness) was explicitly measured for all patients by the treating physicians using the Clinical Frailty Scale (CFS). We defined frailty as a CFS 5–8. The primary outcome was 30-day mortality. We used multivariable logistic regression to adjust for known confounders. We also compared the prognostic accuracy of frailty with the Systemic Inflammatory Response Syndrome (SIRS) and Quick Sequential Organ Failure Assessment (qSOFA) criteria.
We enrolled 203 patients, of whom 117 (57.6%) were frail. Frail patients were more likely to develop septic shock (adjusted odds ratio [aOR], 1.83; 95% confidence interval [CI], 1.08–2.51) and more likely to die within 30 days of ED presentation (aOR, 2.05; 95% CI, 1.02–5.24). Sensitivity for mortality was highest for the CFS (73.1%; 95% CI, 52.2–88.4), compared with SIRS ≥ 2 (65.4%; 95% CI, 44.3–82.8) or qSOFA ≥ 2 (38.4%; 95% CI, 20.2–59.4).
Frailty is a highly prevalent prognostic factor that can be used to risk-stratify older ED patients with suspected infection. ED clinicians should consider screening for frailty to optimize disposition in this population.
Introduction: A critical component for successful implementation of any innovation is an organization's readiness for change. Competence by Design (CBD) is the Royal College's major change initiative to reform the training of medical specialists in Canada. The purpose of this study was to measure readiness to implement CBD among the 2019 launch disciplines. Methods: An online survey was distributed to program directors of the 2019 CBD launch disciplines one month prior to implementation. Questions were developed based on the R = MC2 framework for organizational readiness. They addressed program motivation to implement CBD, general capacity for change, and innovation-specific capacity. Questions related to motivation and general capacity were scored using a 5-point scale of agreement. Innovation-specific capacity was measured by asking participants whether they had completed 33 key pre-implementation tasks (yes/no) in preparation for CBD. Bivariate correlations were conducted to examine the relationships between motivation, general capacity and innovation-specific capacity. Results: The survey response rate was 42% (n = 79). A positive correlation was found between all three domains of readiness (motivation and general capacity, r = 0.73, p < 0.01; motivation and innovation-specific capacity, r = 0.52, p < 0.01; general capacity and innovation-specific capacity, r = 0.47, p < 0.01). Most respondents agreed that successful launch of CBD was a priority (74%). Fewer felt that CBD was a move in the right direction (58%) and that implementation was a manageable change (53%). While most programs indicated that their leadership (94%) and faculty and residents (87%) were supportive of change, 42% did not have experience implementing large-scale innovation and 43% indicated concerns about adequate support staff. Programs had completed an average of 72% of pre-implementation tasks. No difference was found between disciplines (p = 0.11).
Activities related to curriculum mapping, competence committees and programmatic assessment had been completed by >90% of programs, while <50% of programs had engaged off-service rotations. Conclusion: Measuring readiness for change aids in the identification of factors that promote or inhibit successful implementation. These results highlight several areas where programs struggle in preparation for CBD launch. Emergency medicine training programs can use this data to target additional implementation support and ongoing faculty development initiatives.
The volume of evidence from scientific research and wider observation is greater than ever before, but much is inconsistent and scattered in fragments over increasingly diverse sources, making it hard for decision-makers to find, access and interpret all the relevant information on a particular topic, resolve seemingly contradictory results or simply identify where there is a lack of evidence. Evidence synthesis is the process of searching for and summarising a body of research on a specific topic in order to inform decisions, but is often poorly conducted and susceptible to bias. In response to these problems, more rigorous methodologies have been developed and subsequently made available to the conservation and environmental management community by the Collaboration for Environmental Evidence. We explain when and why these methods are appropriate, and how evidence can be synthesised, shared, used as a public good and benefit wider society. We discuss new developments with potential to address barriers to evidence synthesis and communication and how these practices might be mainstreamed in the process of decision-making in conservation.
The front-line nature of mental health crisis services and the complex and acute presentations of their clients require rapid decisions in response to medication requests. Prescribing is conducted by a combination of non-medical prescribers, advanced practitioners and medical professionals, with individual variation in prescribing habits and trends. This impacts the prescribing budget, with evident spiraling costs.
We undertook an audit of the prescriptions issued for emergency medications over 6 months. We incorporated audit standards, from NICE guidelines for prescribing of psychotropic medications and local trust prescribing advice. We audited data to examine whether cost of medication had been considered, in the context of the efficacy and safety of medications.
Of 138 prescriptions issued, 72 (52%) were prescribed by advanced practitioners, 7 (5%) by non-medical prescribers and the remainder (43%) by doctors. A total of 213 items were prescribed during this period, costing £2,828. We demonstrated, by introducing smarter prescribing methods, a reduction in the number of prescription items of 27.7% (59 items), resulting in a financial saving of £2,677 (94.6%). We recommended implementing an acute care formulary to act as a guide to smarter prescribing.
The guide includes recommending generic versions of medications instead of trade brands, asking primary care to initiate medications, reducing quantities of drugs prescribed, increased accountability for prescribing decisions and stopping use of expensive psychotropics, where cheaper alternatives with similar efficacy and side effect profile are available. To assess impact, we would reaudit in 6 months, before consideration towards adopting the policy trustwide.
We investigate how early exposure to parental externalizing behaviors (EB) may contribute to the development of alcohol use disorders (AUD) in young adulthood, testing a developmental cascade model focused on competencies in three domains (academic, conduct, and work) in adolescence and emerging adulthood, and examining whether high parental education can buffer negative effects of parental EB and other early risk factors. We use data from 451,054 Swedish-born men included in the national conscript register. Structural equation models showed parental EB was associated with academic and behavioral problems during adolescence, as well as with lower resilience, more criminal behavior, and reduced social integration during emerging adulthood. These pathways led to elevated rates of AUD in emerging and young adulthood. Multiple-groups analysis showed most of the indirect pathways from parental EB to AUD were present but buffered by higher parental education, suggesting early life experiences and competencies matter more for young men from lower socioeconomic status (SES) families than for those from higher SES families. Developmental competencies in school, conduct, and work are important precursors to the development of AUD by young adulthood that are predicted by parental EB. Occupational success may be an overlooked source of resilience for young men from low-SES families.
Yukon Territory (YT) is a remote region in northern Canada with ongoing spread of tuberculosis (TB). To explore the utility of whole genome sequencing (WGS) for TB surveillance and monitoring in a setting with detailed contact tracing and interview data, we used a mixed-methods approach. Our analysis included all culture-confirmed cases in YT (2005–2014) and incorporated data from 24-locus Mycobacterial Interspersed Repetitive Units-Variable Number of Tandem Repeats (MIRU-VNTR) genotyping, WGS and contact tracing. We compared field-based (contact investigation (CI) data + MIRU-VNTR) and genomic-based (WGS + MIRU-VNTR + basic case data) investigations to identify the most likely source of each person's TB and assessed the knowledge, attitudes and practices of programme personnel around genotyping and genomics using online, multiple-choice surveys (n = 4) and an in-person group interview (n = 5). Field- and genomics-based approaches agreed for 26 of 32 (81%) cases on the likely location of TB acquisition. There was less agreement in the identification of specific source cases (13/22 or 59% of cases). Single-locus MIRU-VNTR variants and limited genetic diversity complicated the analysis. Qualitative data indicated that participants viewed genomic epidemiology as a useful tool to streamline investigations, particularly in differentiating latent TB reactivation from recent transmission. Based on this, genomic data could be used to enhance CIs, focus resources, target interventions and aid in TB programme evaluation.
Ground-penetrating radar data acquired in the 2016/17 austral summer on Sørsdal Glacier, East Antarctica, provide evidence for meltwater lenses within porous surface ice that are conceptually similar to firn aquifers observed on the Greenland Ice Sheet and on Arctic and Alpine glaciers. These englacial water bodies are associated with a dry relict surface basin and are consistent with perennial drainage into an interconnected englacial drainage system, which may explain a large englacial outburst flood observed in satellite imagery in the early 2016/17 melt season. Our observations indicate the rarely documented presence of an englacial hydrological system in Antarctica, with implications for the storage and routing of surface meltwater. Future work should ascertain the spatial prevalence of such systems around the Antarctic coastline, and identify the degree of surface runoff redistribution and storage in the near surface, to quantify their impact on surface mass balance.
This article, the first detailed scholarly assessment of northern responses to the death of former Confederate President Jefferson Davis in December 1889, contributes to ongoing academic debates over the troubled process of sectional reconciliation after the Civil War. Southern whites used their leader's funeral obsequies to assert not only their affection for the deceased but also their devotion to the Lost Cause that he had championed and embodied. Based on an analysis of northern newspapers and mass-circulation magazines in the two weeks after Davis's death, the essay demonstrates that many northerners, principally Republican politicians and editors, Union veterans, and African Americans, were outraged by southerners’ flagrant willingness to laud a man whom they regarded as the arch-traitor and that they remained opposed to reconciliation on southern terms. However, despite continuing concerns about public displays of affection for the Confederacy evident at the time of Davis's reinterment in Richmond in May 1893, northern opposition to the Lost Cause waned rapidly in the last decade of the nineteenth century. Full-blown sectional reconciliation occurred after the Republicans gave up on their efforts to enforce black voting rights in the South and President William McKinley's imperialist foreign policy necessitated, and to some degree garnered, support from southern whites. The death of Jefferson Davis, therefore, can be seen as an important event in the difficult transition from a heavily sectionalized postwar polity to a North-South rapprochement based heavily on political pragmatism, sentiment, nationalism, and white supremacism.