The field of sclerochronology has long been known to paleobiologists. Yet, despite the central role of growth rate, age, and body size in questions related to macroevolution and evolutionary ecology, these types of studies and the data they produce have received only episodic attention from paleobiologists since the field's inception in the 1960s. It is time to reconsider their potential. Not only can sclerochronological data help to address long-standing questions in paleobiology, but they can also bring to light new questions that would otherwise have been impossible to address. For example, growth rate and life-span data, the very data afforded by chronological growth increments, are essential to answer questions related not only to heterochrony and hence evolutionary mechanisms, but also to body size and organism energetics across the Phanerozoic. While numerous fossil organisms have accretionary skeletons, bivalves offer perhaps one of the most tangible and intriguing pathways forward, because they exhibit clear, typically annual, growth increments and they include some of the longest-lived, non-colonial animals on the planet. In addition to their longevity, modern bivalves also show a latitudinal gradient of increasing life span and decreasing growth rate with latitude that might be related to the latitudinal diversity gradient. Is this a recently developed phenomenon or has it characterized much of the group's history? When and how did extreme longevity evolve in the Bivalvia? What insights can the growth increments of fossil bivalves provide about hypotheses for energetics through time? In spite of the relative ease with which the tools of sclerochronology can be applied to these questions, paleobiologists have been slow to adopt sclerochronological approaches. Here, we lay out an argument and the methods for a path forward in paleobiology that uses sclerochronology to answer some of our most pressing questions.
Concerns persist that false negative results may compromise COVID-19 containment. While a true false negative rate cannot be obtained, these real-world observations suggest a false negative rate of approximately 2.3%. Use of a sensitive, amplified RNA platform should reassure healthcare systems.
Cricothyrotomy and chest needle decompression (NDC) have a high failure and complication rate. This article sought to determine whether paramedics can correctly identify the anatomical landmarks for cricothyrotomy and chest NDC.
A prospective study using human models was performed. Paramedics were partnered and asked to identify the location for cricothyrotomy and chest NDC (both mid-clavicular and anterior axillary sites) on each other. A board-certified or board-eligible emergency medicine physician timed the process and confirmed location accuracy. All data were collected in de-identified form. Descriptive analysis was performed on continuous data; chi-square was used for categorical data.
A total of 69 participants were recruited, with one excluded for incomplete data. The paramedics had a range of 6 to 38 (median 14) years of experience. There were 28 medical training officers (MTOs) and 41 field paramedics. Cricothyrotomy location was correctly identified in 56 of 68 participants with a time to identification range of 2.0 to 38.2 (median 8.6) seconds. Chest NDC (mid-clavicular) location was correctly identified in 54 of 68 participants with a time to identification range of 3.4 to 25.0 (median 9.5) seconds. Chest NDC (anterior axillary) location was correctly identified in 43 of 68 participants with a time to identification range of 1.9 to 37.9 (median 9.6) seconds. Chi-square (2-tail) showed no difference between MTOs and field paramedics in cricothyrotomy site (P = .62), mid-clavicular chest NDC site (P = .21), or anterior axillary chest NDC site (P = .11). There was no difference in time to identification for any procedure between MTOs and field paramedics.
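The chi-square comparisons reported here are standard 2 × 2 Pearson tests. As an illustration only, here is a minimal sketch with hypothetical cell counts; the abstract does not report the group-wise splits, so the numbers below are invented:

```python
from math import erfc, sqrt

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table (df = 1) and its two-sided p-value.
    Rows are the two groups; columns are correct / incorrect identifications."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = erfc(sqrt(chi2 / 2))  # survival function of chi-square with 1 df
    return chi2, p

# Hypothetical split of the 56/68 correct cricothyrotomy identifications
# between MTOs (n = 28) and field paramedics (n = 40) -- not the study's counts.
chi2, p = chi_square_2x2(24, 4, 32, 8)
```

With these placeholder counts the statistic is small and the p-value is far above .05, which is the shape of result the abstract reports for all three sites.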
Both MTOs and field paramedics quickly identified cricothyrotomy and chest NDC placement sites. While time to identification was clinically acceptable, a significant proportion did not identify the correct landmarks.
To compare the prevalence of select cardiovascular risk factors (CVRFs) in patients with mild cognitive impairment (MCI) versus lifetime history of major depression disorder (MDD) and a normal comparison group using baseline data from the Prevention of Alzheimer’s Dementia with Cognitive Remediation plus Transcranial Direct Current Stimulation (PACt-MD) study.
Baseline data from a multi-centered intervention study of older adults with MCI, history of MDD, or combined MCI and history of MDD (PACt-MD) were analyzed.
Community-based, multi-centered study across 5 academic sites in Toronto.
Older adults with MCI, history of MDD, or combined MCI and history of MDD and healthy controls.
We examined the baseline distribution of smoking, hypertension and diabetes in three groups of participants aged 60+ years in the PACt-MD cohort study: MCI (n = 278), MDD (n = 95), and healthy older controls (n = 81). Generalized linear models were fitted to study the effect of CVRFs on MCI and MDD as well as neuropsychological composite scores.
In unadjusted analysis, the MCI cohort had higher odds of hypertension than healthy controls (p < .05). Statistical significance was lost after adjusting for age, sex, and education (p > .05). A history of hypertension was associated with lower performance in composite executive function (p < .05) and overall composite neuropsychological test score (p < .05) in a pooled cohort with MCI or MDD.
This study reinforces the importance of treating modifiable CVRFs, specifically hypertension, as a means of mitigating cognitive decline in patients with at-risk cognitive conditions.
Parent–Child Interaction Therapy (PCIT) has been shown to improve positive, responsive parenting and lower risk for child maltreatment (CM), including among families who are already involved in the child welfare system. However, higher risk families show higher rates of treatment attrition, limiting effectiveness. In N = 120 child welfare families randomized to PCIT, we tested behavioral and physiological markers of parent self-regulation and socio-cognitive processes assessed at pre-intervention as predictors of retention in PCIT. Results of multinomial logistic regressions indicate that parents who declined treatment displayed more negative parenting, greater perceptions of child responsibility and control in adult–child transactions, respiratory sinus arrhythmia (RSA) increases to a positive dyadic interaction task, and RSA withdrawal to a challenging, dyadic toy clean-up task. Increased odds of dropout during PCIT's child-directed interaction phase were associated with greater parent attentional bias to angry facial cues on an emotional go/no-go task. Hostile attributions about one's child predicted risk for dropout during the parent-directed interaction phase, and readiness for change scores predicted higher odds of treatment completion. Implications for intervening with child welfare-involved families are discussed along with study limitations.
In March 2018, the US Department of Defense (DOD) added the smallpox vaccination, using ACAM2000, to its routine immunizations, increasing the number of persons receiving the vaccine. The following month, Fort Hood reported a cluster of 5 myopericarditis cases. The Centers for Disease Control and Prevention and the DOD launched an investigation.
The investigation consisted of a review of medical records, establishment of case definitions, causality assessment, patient interviews, and active surveillance. A 2-sided exact rate ratio test was used to compare myopericarditis incidence rates.
This investigation identified 4 cases of probable myopericarditis and 1 case of suspected myopericarditis. No alternative etiology was identified as a cause. No additional cases were identified. There was no statistically significant difference in incidence rates between the observed cluster (5.23 per 1000 vaccinated individuals, 95% CI: 1.7–12.2) and the rate among symptomatic persons in the ACAM2000 clinical trials (2.29 per 1000 vaccinated individuals, 95% CI: 0.3–8.3).
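A 2-sided exact rate ratio test of the kind used here can be sketched by conditioning on the total event count: under the null of equal rates, the cluster's event count is binomial with probability proportional to its person-denominator. The cluster denominator of 956 below is consistent with the reported 5.23 per 1000 (5 cases), but the comparison denominator of 3050 is purely an illustrative placeholder, not a figure from the trials:

```python
from math import comb

def exact_rate_ratio_test(x1, t1, x2, t2):
    """Two-sided exact test of equal event rates in two groups.
    Conditional on the total event count k = x1 + x2, x1 is Binomial(k, p0)
    with p0 = t1 / (t1 + t2) under the null; the two-sided p-value sums
    all outcomes no more probable than the observed one."""
    k = x1 + x2
    p0 = t1 / (t1 + t2)
    pmf = [comb(k, i) * p0**i * (1 - p0) ** (k - i) for i in range(k + 1)]
    obs = pmf[x1]
    return sum(q for q in pmf if q <= obs + 1e-12)

# Cluster: 5 cases among 956 vaccinees (~5.23/1000); comparison arm: a
# hypothetical 7 cases among 3050 (~2.29/1000) -- illustrative denominators.
p = exact_rate_ratio_test(5, 956, 7, 3050)
```

With these placeholder denominators the two-sided p-value stays well above .05, matching the abstract's conclusion of no statistically significant difference.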
Vaccination with ACAM2000 is the presumptive cause of this cluster. Caution should be exercised before considering vaccination campaigns for smallpox, given the clinical morbidity and costs incurred by a case of myopericarditis. The risk of myopericarditis should be carefully weighed against the risk of exposure to smallpox.
Low birthweight has been related to an increased risk of adult cardiovascular disease (CVD). Transgenerational studies have been used to investigate likely mechanisms underlying this inverse association. However, previous studies mostly examined the association of offspring birthweight with CVD risk factors among parents. In this study, we investigated the association between offspring birthweight and individual CVD risk factors, an index of CVD risk factors, and education in their parents, aunts/uncles, and aunts’/uncles’ partners. Birth data (Medical Birth Registry, Norway (MBRN) (1967–2012)) were linked to CVD risk factor data (the County Study, Age 40 Program, and Cohort Norway (CONOR)) for the parents, aunts/uncles, and their partners. For body mass index (BMI), resting heart rate (RHR), systolic blood pressure (SBP), total cholesterol (TC), triglycerides (TG), and a risk factor index, the associations were examined by linear regression. For smoking and education, they were examined by logistic regression. Low birthweight was associated with an unfavorable risk factor profile in all familial relationships. For each kg increase in birthweight, the mean risk factor index decreased by −0.14 units (−0.15, −0.13) in mothers, −0.11 (−0.12, −0.10) in fathers, and −0.02 (−0.05, −0.00) to −0.07 (−0.09, −0.06) in aunts/uncles and their partners. The association was stronger in mothers than in fathers, and stronger in aunts/uncles than in their partners. Profound associations between birthweight and CVD risk factors in extended family members were observed that go beyond the expected genetic similarities in pedigrees, suggesting that mechanisms such as environmental factors, assortative mating, and genetic nurturing may explain these associations.
Critics have long maintained that the rights revolution and, by extension, the postwar turn to litigation as a regulatory tool, are the product of a cynical legislative choice. On this view, legislators choose rights and litigation over alternative regulatory approaches to shift costs from on-budget forms (for example, publicly funded social provisions, public enforcement actions by prosecutors or agencies) to off-budget forms (for example, rights-based statutory duties, enforced via private lawsuits). This “cost-shift” theory has never been subjected to sustained theoretical scrutiny or comprehensive empirical test. This article offers the first such analysis, examining a context where the cost-shift hypothesis is at its most plausible: disability discrimination laws, which shift costs away from social welfare programs by requiring that employers hire and “accommodate” workers with disabilities. Using a novel dataset of state-level disability discrimination laws enacted prior to the federal-level Americans with Disabilities Act (ADA) and a range of archival and other materials drawn from state-level legislative campaigns, we find only limited support for the view that cost shifting offered at least part of the motivation for these laws. Our findings offer a fresh perspective on long-standing debates about American disability law and politics, including judicial interpretation of the ADA and its state-level analogues and the relationship of disability rights activism to other rights-based political movements.
This chapter focuses on petroleum industry local content requirements as they affect Canadian Indigenous peoples. Though Canada is a developed country, these Indigenous nations are in a position analogous to developing countries affected by oil and gas development. Canada’s constitution, as judicially interpreted, recognizes and affirms Indigenous peoples’ “aboriginal and treaty rights,” which include the right to consultation and accommodation concerning proposed oil and gas development likely to affect them adversely. This duty to consult is typically discharged and implemented by means of impact benefit agreements (IBAs) between developers and Indigenous nations, and by enforceable conditions on government approvals of projects, including oil sands facilities and interjurisdictional pipelines, issued following public processes. The conclusion is that changes to the timing of government consultation would improve the effectiveness of these IBA tools.
Rising antimicrobial resistance (AMR) in primary care is a growing concern and a threat to community health. The rise of AMR can be slowed if general practitioners (GPs) and community pharmacists (CPs) work as a team to implement antimicrobial stewardship (AMS) programs for the optimal use of antimicrobials. However, the evidence supporting a GP–pharmacist collaborative AMS implementation model (GPPAS) in primary care remains limited.
With the aim of designing a GPPAS model for Australia, this paper outlines how the model will be developed.
This exploratory study undertakes a systematic review, a scoping review, nationwide surveys, and qualitative interviews to design the model. The Medical Research Council (MRC) framework and Normalization Process Theory are used as guides. The reviews will identify effective GPPAS interventions. Two AMS surveys and paired interviews of GPs and CPs across Australia will explore their convergent and divergent views on the GPPAS interventions, their attitudes towards collaboration in AMS, and the perceived challenges of implementing GPPAS interventions. The Systems Engineering Initiative for Patient Safety (SEIPS 2.0) model and factor analyses will guide the structure of the GPPAS model by identifying the determinants of GPPAS uptake. The implementable GPPAS strategies will be selected based on empirical feasibility assessment by AMS stakeholders using the APEASE (Affordability, Practicability, Effectiveness and cost-effectiveness, Acceptability, Side-effects and safety, Equity) criteria.
The GPPAS model may inform how to better involve GPs and CPs in AMS and how to improve collaborative services to optimize antimicrobial use and reduce AMR in primary care.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to COVID-19 with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return to work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in supplemental materials. This research agenda calls for translational studies from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and diverse geographic locations will be critical.
The Syriac term yaṣrā, “inclination,” “urge,” “wilfulness,” and its use in Syriac texts have not until recently been the subject of any detailed study. This is perhaps surprising, not only because of the term’s interest for an understanding of early Syriac Christian thought, but also because of its potential contribution to discussions of the origins and development of Jewish concepts of the yeṣer.
The direct carbonate procedure for accelerator mass spectrometry radiocarbon (AMS 14C) dating of submilligram samples of biogenic carbonate without graphitization is becoming widely used in a variety of studies. We compare the results of 153 paired direct carbonate and standard graphite 14C determinations on single specimens of an assortment of biogenic carbonates. A reduced major axis regression shows a strong relationship between direct carbonate and graphite percent Modern Carbon (pMC) values (m = 0.996; 95% CI [0.991–1.001]). An analysis of differences and a 95% confidence interval on pMC values reveals that there is no significant difference between direct carbonate and graphite pMC values for 76% of analyzed specimens, although variation in direct carbonate pMC is underestimated. The difference between the two methods is typically within 2 pMC, with 61% of direct carbonate pMC measurements being higher than their paired graphite counterpart. Of the 36 specimens that did yield significant differences, all but three missed the 95% significance threshold by 1.2 pMC or less. These results show that direct carbonate 14C dating of biogenic carbonates is a cost-effective and efficient complement to standard graphite 14C dating.
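Reduced major axis regression has a closed form: the slope is the sign of the correlation times the ratio of the standard deviations of the two variables. A minimal sketch with invented pMC pairs (not the study's 153 paired measurements):

```python
import numpy as np

def rma_slope(x, y):
    """Reduced major axis (geometric mean) regression slope:
    sign of the correlation times the ratio of sample standard deviations."""
    r = np.corrcoef(x, y)[0, 1]
    return np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)

# Hypothetical paired percent Modern Carbon values (graphite vs. direct
# carbonate) -- illustrative only, not the study's data.
graphite = np.array([12.5, 45.1, 78.3, 90.2, 101.4])
carbonate = np.array([12.9, 44.8, 79.0, 90.9, 101.0])

m = rma_slope(graphite, carbonate)
intercept = np.mean(carbonate) - m * np.mean(graphite)
```

Because these placeholder pairs track each other closely, the slope comes out near 1 and the intercept near 0, the pattern the study reports (m = 0.996).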
Benzodiazepine (BZD) prescription rates have increased over the past decade in the United States. Available literature indicates that sociodemographic factors may influence diagnostic patterns and/or prescription behaviour. Herein, the aim of this study is to determine whether the gender of the prescriber and/or patient influences BZD prescription.
Cross-sectional study using data from the Florida Medicaid Managed Medical Assistance Program from January 1, 2018 to December 31, 2018. Eligible recipients were ages 18 to 64, inclusive, were enrolled in the Florida Medicaid plan for at least 1 day, and were dually eligible. Recipients had either a serious mental illness (SMI), or non-SMI and anxiety.
A total of 125 463 cases were identified (i.e., received a BZD or non-BZD prescription). The interaction of patient and prescriber gender was not significant, F(1, 125 459) = 0.105, P = .745, partial η2 < 0.001. The relative risk (RR) of male prescribers prescribing a BZD compared to female prescribers was 1.540, 95% confidence interval (CI) [1.513, 1.567], whereas the RR of male patients being prescribed a BZD compared to female patients was 1.16, 95% CI [1.14, 1.18]. The main effects of patient and prescriber gender were statistically significant, F(1, 125 459) = 188.232, P < 0.001, partial η2 = 0.001 and F(1, 125 459) = 349.704, P < 0.001, partial η2 = 0.013, respectively.
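The relative risks above follow standard 2 × 2 arithmetic with a log-scale Wald 95% CI. A minimal sketch with invented cell counts, since the abstract does not report the underlying counts:

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR of group 1 vs group 2 with a 95% CI (log / Wald method).
    a/b: events/non-events in group 1; c/d: events/non-events in group 2."""
    p1 = a / (a + b)
    p2 = c / (c + d)
    rr = p1 / p2
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 300 of 1000 cases seen by male prescribers received a
# BZD vs 200 of 1000 seen by female prescribers -- not the study's counts.
rr, lo, hi = relative_risk(300, 700, 200, 800)
```

With these placeholder counts, RR = 1.5 with a CI roughly [1.28, 1.75]; the study's much narrower CI around 1.540 reflects its far larger sample.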
Male prescribers are more likely to prescribe BZDs, and male patients are more likely to receive BZDs. Further studies are required to characterize the factors that influence these gender differences in BZD prescribing.
Stereotactic radiosurgery (SRS) has proven itself as an effective tool in the treatment of intracranial lesions. Image-guided high dose single fraction treatments have the potential to deliver ablative doses to tumours; however, treatment times can be long. Flattening filter free (FFF) beams are available on most modern linacs and offer a higher dose rate compared to conventional flattened beams which should reduce treatment times. This study aimed to compare 6 MV FFF and 10 MV FFF to a 6 MV flattened beam for single fraction dynamic conformal arc SRS for a Varian Truebeam linac.
Materials and methods:
In total, 21 individual clinical treatment plans for 21 brain metastases treated with 6 MV were retrospectively replanned using both 6 MV FFF and 10 MV FFF. Plan quality and efficiency metrics were evaluated by analysing dose coverage, dose conformity, dose gradients, dose to normal brain, beam-on-time (BOT), treatment time and monitor units.
FFF resulted in a significant reduction in median BOT for both 6 MV FFF (57·9%; p < 0·001) and 10 MV FFF (76·3%; p < 0·001), which led to reductions in treatment time of 16·8% and 21·5%, respectively. However, 6 MV FFF showed superior normal brain dose sparing (p < 0·001) and dose gradient (p < 0·001) compared to 10 MV FFF. No differences were observed for conformity.
6 MV FFF offers a significant reduction in average treatment time compared to 6 MV (3·7 minutes; p = 0·002) while maintaining plan quality.
This study compared the level of education and tests from multiple cognitive domains as proxies for cognitive reserve.
The participants were educationally, ethnically, and cognitively diverse older adults enrolled in a longitudinal aging study. We examined independent and interactive effects of education, baseline cognitive scores, and MRI measures of cortical gray matter change on longitudinal cognitive change.
Baseline episodic memory was related to cognitive decline independent of brain and demographic variables and moderated (weakened) the impact of gray matter change. Education moderated (strengthened) the gray matter change effect. Non-memory cognitive measures did not incrementally explain cognitive decline or moderate gray matter change effects.
Episodic memory showed strong construct validity as a measure of cognitive reserve. Education effects on cognitive decline were dependent upon the rate of atrophy, indicating education effectively measures cognitive reserve only when atrophy rate is low. Results indicate that episodic memory has clinical utility as a predictor of future cognitive decline and better represents the neural basis of cognitive reserve than other cognitive abilities or static proxies like education.
During the Randomized Assessment of Rapid Endovascular Treatment (EVT) of Ischemic Stroke (ESCAPE) trial, patient-level micro-costing data were collected. We report a cost-effectiveness analysis of EVT, using ESCAPE trial data and Markov simulation, from a universal, single-payer system using a societal perspective over a patient’s lifetime.
Primary data collection alongside the ESCAPE trial provided a 3-month, trial-specific, non-model-based cost per quality-adjusted life year (QALY). A Markov model utilizing ongoing lifetime costs and life expectancy from the literature was built to simulate the cost per QALY adopting a lifetime horizon. Health states were defined using the modified Rankin Scale (mRS) scores. Uncertainty was explored using scenario analysis and probabilistic sensitivity analysis.
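The Markov cohort logic described here can be sketched in a few lines. Every number below (the collapse of mRS scores into three states, the transition probabilities, annual costs, utilities, and starting distributions) is an invented placeholder for illustration, not an ESCAPE estimate:

```python
import numpy as np

# Hypothetical 3-state model: independent (mRS 0-2), dependent (mRS 3-5), dead.
# Yearly transition matrix, annual costs ($), and utilities (QALY weights)
# are illustrative placeholders only -- not values from the ESCAPE analysis.
P = np.array([[0.92, 0.05, 0.03],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
cost = np.array([2_000.0, 40_000.0, 0.0])
utility = np.array([0.85, 0.40, 0.0])

def simulate(start, cycles=30, discount=0.03):
    """Run a cohort through the Markov chain, accumulating discounted
    costs and QALYs over a lifetime horizon."""
    state = np.array(start, dtype=float)
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t
        total_cost += d * state @ cost
        total_qaly += d * state @ utility
        state = state @ P
    return total_cost, total_qaly

# EVT shifts more of the cohort into the independent state at 3 months
# (hypothetical starting distributions).
c_evt, q_evt = simulate([0.55, 0.35, 0.10])
c_soc, q_soc = simulate([0.40, 0.45, 0.15])
inc_cost, inc_qaly = c_evt - c_soc, q_evt - q_soc
```

With these placeholders the intervention arm accrues more QALYs at lower lifetime cost (negative incremental cost, positive incremental QALYs), which is the "dominant" result the analysis reports for EVT.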
The 3-month trial-based analysis resulted in a cost per QALY of $201,243 for EVT compared to the best standard of care. In the model-based analysis, using a societal perspective and a lifetime horizon, EVT dominated the standard of care; EVT was both more effective and less costly than the standard of care (−$91). When the time horizon was shortened to 1 year, EVT remained cost saving compared to the standard of care (∼$15,376 per QALY gained with EVT). However, if the estimate of clinical effectiveness is 4% less than that demonstrated in ESCAPE, EVT is no longer cost saving compared to the standard of care.
Results support the adoption of EVT as a treatment option for acute ischemic stroke, as the increased cost of caring for EVT patients was recouped within the first year after stroke, and EVT continued to provide cost savings over a patient’s lifetime.
To describe epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled nursing facility (SNF), and the strategies that controlled transmission.
Design, Setting, and Participants:
Cohort study during March 22–May 4, 2020 of all staff and residents at a 780-bed SNF in San Francisco, California.
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPS) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2; whole genome sequencing (WGS) characterized viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movements between units; implementing surgical face masking facility-wide; and recommending PPE (isolation gown, gloves, N95 respirator, and eye protection) for clinical interactions in units with confirmed cases.
Of 725 staff and residents tested through targeted testing and serial PPS, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen (71%) were linked to a single unit. Targeted testing identified 17 (81%) cases; PPS identified 4 (19%). Most cases (71%) were identified prior to IPC intervention. WGS was performed on SARS-CoV-2 isolates from four staff and four residents; five were of Santa Clara County lineage and the other three were of distinct lineages.
Early implementation of targeted testing, serial PPS, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.