Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well-established, the directionality of these associations, specificity to stages of alcohol involvement, and extent of shared genetic liability remain unclear. We estimated longitudinal associations linking transitions between alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
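The Cox estimation step can be sketched in miniature. The example below fits a single binary covariate (standing in for, say, presence of conduct disorder symptoms) by maximizing the partial likelihood over a coarse grid; the event times and covariate values are invented for illustration and are not COGA data.

```python
import math

def cox_log_partial_likelihood(beta, times, events, x):
    """Log partial likelihood for one covariate, assuming no tied event times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ll = 0.0
    for k, i in enumerate(order):
        if not events[i]:
            continue  # censored observations contribute only to risk sets
        # Risk set: everyone still under observation at time t_i.
        risk = order[k:]
        denom = sum(math.exp(beta * x[j]) for j in risk)
        ll += beta * x[i] - math.log(denom)
    return ll

# Toy data: subjects with x = 1 tend to reach the milestone earlier.
times  = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
events = [1] * 10                      # all events observed (no censoring)
x      = [1, 1, 1, 0, 1, 0, 1, 0, 0, 0]

# Grid-search the maximum partial-likelihood estimate of beta.
grid = [b / 100 for b in range(-300, 301)]
beta_hat = max(grid, key=lambda b: cox_log_partial_likelihood(b, times, events, x))
hr = math.exp(beta_hat)                # hazard ratio for x = 1 vs x = 0
print(f"beta_hat = {beta_hat:.2f}, HR = {hr:.2f}")
```

A hazard ratio above 1, as reported throughout the Results, means the covariate is associated with earlier attainment of the milestone; production analyses would use a dedicated survival library rather than a grid search.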
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio [HR] ⩾ 1.16). Internalizing phenotypes (e.g. social anxiety) were associated with hazards for progression from first drink to severe AUD (HR ⩾ 1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR ⩾ 1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. Drinks per week PGS was associated with increased hazards for alcohol transitions (HR ⩾ 1.06). Problematic alcohol use PGS was associated with increased hazards for suicidal ideation (HR = 1.20).
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
To assess cost-effectiveness of late time-window endovascular treatment (EVT) in a clinical trial setting and a “real-world” setting.
Methods:
Data are from the randomized ESCAPE trial and a prospective cohort study (ESCAPE-LATE). Anterior circulation large vessel occlusion (LVO) patients presenting > 6 hours from last-known-well were included; collateral status was an inclusion criterion for ESCAPE but not ESCAPE-LATE. A Markov state transition model was built to estimate lifetime costs and quality-adjusted life-years (QALYs) for EVT in addition to best medical care vs. best medical care only in a clinical trial setting (comparing ESCAPE-EVT to ESCAPE control arm patients) and a “real-world” setting (comparing ESCAPE-LATE to ESCAPE control arm patients). We performed an unadjusted analysis using 90-day modified Rankin Scale (mRS) scores as model input, and an analysis adjusted for baseline factors. Acceptability of EVT was calculated using upper and lower willingness-to-pay thresholds of 100,000 USD and 50,000 USD per QALY.
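The Markov state transition approach can be illustrated with a minimal cohort sketch. All transition probabilities, state utilities, costs, and starting outcome distributions below are invented placeholders, not ESCAPE or ESCAPE-LATE inputs; the structure (annual cycles, discounting, incremental cost per QALY) is what the example shows.

```python
# States: 0 = independent, 1 = dependent, 2 = dead.
# Annual transition matrix (rows: from-state, columns: to-state).
P = [
    [0.94, 0.03, 0.03],
    [0.00, 0.85, 0.15],
    [0.00, 0.00, 1.00],
]
UTILITY  = [0.85, 0.35, 0.0]       # QALY weight per state-year (assumed)
COST     = [2000.0, 25000.0, 0.0]  # annual care cost in USD (assumed)
DISCOUNT = 0.03                    # 3% annual discount rate

def lifetime_outcomes(dist, horizon=40):
    """Discounted lifetime QALYs and costs for a starting cohort distribution."""
    dist = list(dist)
    qalys = costs = 0.0
    for year in range(horizon):
        w = 1.0 / (1.0 + DISCOUNT) ** year
        qalys += w * sum(d * u for d, u in zip(dist, UTILITY))
        costs += w * sum(d * c for d, c in zip(dist, COST))
        # Advance the cohort one annual cycle.
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return qalys, costs

# Hypothetical 90-day outcome distributions: EVT shifts patients toward
# functional independence, plus a one-off assumed procedure cost.
q_evt, c_evt = lifetime_outcomes([0.55, 0.30, 0.15])
q_ctl, c_ctl = lifetime_outcomes([0.35, 0.40, 0.25])
c_evt += 15000.0

icer = (c_evt - c_ctl) / (q_evt - q_ctl)   # USD per QALY gained
print(f"Incremental cost per QALY: {icer:,.0f} USD")
```

EVT is then deemed acceptable at a given willingness-to-pay threshold when the incremental cost per QALY falls below it; probabilistic versions repeat this calculation over sampled inputs to obtain acceptability percentages like those reported below.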
Results:
Two hundred and forty-nine patients were included (ESCAPE-LATE: n = 200, ESCAPE EVT arm: n = 29, ESCAPE control arm: n = 20). Late EVT in addition to best medical care was cost-effective in the unadjusted analysis both in the clinical trial and the real-world setting, with acceptability of 96.6%–99.0%. After adjusting for differences in baseline variables between the groups, late EVT was marginally cost-effective in the clinical trial setting (acceptability: 49.9%–61.6%), but not in the “real-world” setting (acceptability: 32.9%–42.6%).
Conclusion:
EVT for patients with LVO presenting beyond 6 hours was cost-effective in the clinical trial setting and the “real-world” setting, although this was largely related to baseline patient differences favoring the “real-world” EVT group. After adjusting for these, the EVT benefit was reduced in the trial setting and absent in the real-world setting.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in-ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
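As a quick aside on the reflectivity units: a reflectivity quoted in dB is a power ratio, so the corresponding amplitude reflection coefficient is its square root.

```python
import math

def db_to_amplitude(db):
    """Convert a power reflectivity in dB to an amplitude reflection coefficient."""
    return math.sqrt(10.0 ** (db / 10.0))

for db in (-60, -70):
    print(f"{db} dB -> amplitude reflection coefficient {db_to_amplitude(db):.1e}")
# -60 dB corresponds to an amplitude ratio of 1e-3; -70 dB to about 3.2e-4.
```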
Over the past fifteen years, a narrative has developed that IR scholars have become a “cult of the irrelevant,” with declining influence on and engagement with policy debates. Despite these assertions, the evidence for limited policy engagement has been anecdotal. We investigate the extent of policy engagement—the ways in which IR scholars participate in policy-making processes and/or attempt to shape those processes—by surveying IR scholars directly about their engagement activities. We find policy engagement is pervasive among IR scholars. We draw on theories of credit-claiming to motivate expectations about how and when scholars are likely to engage with practitioners. Consistent with our expectations, much of this engagement comes in forms that involve small time commitments and provide opportunities for credit-claiming, such as media appearances and short-form, bylined op-eds and blog posts. However, sizable minorities report engaging in consulting activities not for attribution/publication and writing policy briefs, and a majority of respondents indicate they engaged in these activities several times a year or more. We find only small differences in engagement across gender and rank. Our results demonstrate that, for IR scholars, some form of policy engagement is the norm.
Long-chain omega-3 polyunsaturated fatty acid (LC n-3 PUFA) supplements, rich in eicosapentaenoic acid and/or docosahexaenoic acid, are increasingly being recommended within athletic institutions. However, the wide range of doses, durations and study designs implemented across trials makes it difficult to provide clear recommendations. The importance of study design characteristics in LC n-3 PUFA trials has been detailed in cardiovascular disease research, and these considerations may guide LC n-3 PUFA study design in healthy cohorts. This systematic review examined the quality of studies and study design considerations used in evaluating the evidence for LC n-3 PUFA improving performance in physically trained adults. SCOPUS, PubMed and Web of Science electronic databases were searched to identify studies that supplemented LC n-3 PUFA in physically trained participants. Forty-six (n = 46) studies met the inclusion criteria. Most studies used a randomised controlled design. Risk of bias, assessed using the design-appropriate Cochrane Collaboration tool, revealed that studies had a predominant judgment of ‘some concerns’, ‘high risk’ or ‘moderate risk’ in randomised controlled, randomised crossover or non-randomised studies, respectively. A custom five-point quality assessment scale demonstrated that no study satisfied all recommendations for LC n-3 PUFA study design. This review has highlighted that the disparate range of study designs is likely contributing to the inconclusive state of outcomes pertaining to LC n-3 PUFA as a potential ergogenic aid. Further research must adequately account for the specific LC n-3 PUFA study design considerations, underpinned by a clear hypothesis, to achieve evidence-based dose, duration and composition recommendations for physically trained individuals.
Plasma jets are widely investigated both in the laboratory and in nature. Astrophysical objects such as black holes, active galactic nuclei and young stellar objects commonly emit plasma jets in various forms. With the availability of data from plasma jet experiments resembling astrophysical plasma jets, classification of such data would potentially aid not only in investigating the underlying physics of the experiments but also in the study of astrophysical jets. In this work we use deep learning to process all of the laboratory plasma images from the Caltech Spheromak Experiment spanning two decades. We found that cosine similarity can aid in feature selection, classify images through comparison of feature vector direction and serve as a loss function for the training of AlexNet for plasma image classification. We also develop a simple vector direction comparison algorithm for binary and multi-class classification. Using our algorithm we demonstrate 93% accurate binary classification to distinguish unstable columns from stable columns and 92% accurate five-way classification of a small, labelled data set which includes three classes corresponding to varying levels of kink instability.
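A minimal sketch of classification by feature-vector direction: each class is represented by a centroid, and a query is assigned to the class whose centroid direction is most similar. The three-dimensional vectors and class names below are hand-made stand-ins, not actual AlexNet features from the experiment.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: direction, not magnitude."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def classify(vec, centroids):
    """Assign vec to the class whose centroid direction best matches it."""
    return max(centroids, key=lambda label: cosine_similarity(vec, centroids[label]))

# Hand-made centroid directions standing in for averaged CNN feature vectors.
centroids = {
    "stable column":   [1.0, 0.1, 0.0],
    "unstable column": [0.1, 1.0, 0.2],
}
query = [0.9, 0.2, 0.0]
print(classify(query, centroids))
```

Because cosine similarity ignores vector magnitude, the comparison is robust to overall brightness or activation-scale differences between images, which is one reason direction-based comparison suits feature vectors.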
This article examines the institutional development of the U.S. Court of Claims (USCC), in order to shed new light on the nature of constitutional and institutional change in the early Republic. From the founding period through the mid-nineteenth century, members of Congress believed that empowering other institutions to award claimants monies from the Treasury would violate two core doctrines: separation of powers and sovereign immunity. However, as claims against the government ballooned over the first half of the nineteenth century, Congress fundamentally changed its interpretation of the Constitution's requirements in order to create the USCC and thus to alleviate its workload. This story of institutional development is an example of constitutional construction and creative syncretism in that the institutional development of the USCC came from continuous interactions among political actors, working iteratively to refashion institutions capable of solving practical problems of governance. This close study of the court's creation shows something important about American constitutional development: Certain fundamental ideas of the early Republic, including sovereign immunity and separation of powers, were altered or jettisoned not out of some grand rethinking of the nature of the American state, but out of the need to solve a mundane problem.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on six decades of experience, for the recommendations set out herein. They are based on empirical evidence and expert opinion—this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy—while taking patient preference into account. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Although associations among borderline personality disorder (BPD), social rejection, and frontal EEG alpha asymmetry scores (FAA, a neural correlate of emotion regulation and approach-withdrawal motivations) have been explored in different studies, relatively little work has examined these relations during adolescence in the same study. We examined whether FAA moderated the relation between BPD features and rejection sensitivity following a validated social exclusion paradigm, Cyberball. A mixed, clinical-community sample of 64 adolescents (females = 62.5%; mean age = 14.45 years; SD = 1.6; range = 11–17 years) completed psychodiagnostic interviews and a self-report measure of BPD (Time 1). Approximately two weeks later (Time 2), participants completed a resting EEG recording followed by Cyberball. FAA moderated the relation between BPD features and overall feelings of rejection following Cyberball: individuals with greater relative left FAA had the highest and lowest feelings of social rejection depending on whether they had high and low BPD feature scores, respectively. Results remained after controlling for age, sex, gender, depression, and BPD diagnosis. These results suggest that FAA may moderate the relation between BPD features and social rejection, and that left frontal brain activity at rest may be differentially associated with those feelings in BPD. Findings are discussed in terms of the role of left frontal brain activity in the regulation and dysregulation of social approach behaviors characteristic of BPD.
While ethics has been identified as a core component of health technology assessment (HTA), there are few examples of practical, systematic inclusion of ethics analysis in HTA. Some attribute the scarcity of ethics analysis in HTA to debates about appropriate methodology and the need for ethics frameworks that are relevant to local social values. The “South African Values and Ethics for Universal Health Coverage” (SAVE-UHC) project models an approach that countries can use to develop HTA ethics frameworks that are specific to their national contexts.
Methods
The SAVE-UHC approach consisted of two phases. In Phase I, the research team convened and facilitated a national multistakeholder working group to develop a provisional ethics framework through a collaborative, engagement-driven process. In Phase II, the research team refined the model framework by piloting it through three simulated HTA appraisal committee meetings. Each simulated committee reviewed two case studies of sample health interventions: opioid substitution therapy and either a novel contraceptive implant or seasonal influenza immunization for children under five.
Results
The methodology was fit-for-purpose, resulting in a context-specified ethics framework and producing relevant findings to inform application of the framework for the given HTA context.
Conclusions
The SAVE-UHC approach provides a model for developing, piloting, and refining an ethics framework for health priority-setting that is responsive to national social values. This approach also helps identify key facilitators and challenges for integrating ethics analysis into HTA processes.
Capacity development is increasingly recognized as central to conservation goals. Efforts to develop individual, organizational and societal capacity underpin direct investments in biodiversity conservation and natural resource management, and sustain their impact over time. In the face of urgent needs and increasingly complex contexts for conservation, the sector not only needs more capacity development; it needs new approaches to capacity development. The sector is embracing the dynamic relationships between the ecological, political, social and economic dimensions of conservation. Capacity development practitioners should ensure that individuals, organizations and communities are prepared to work effectively in these complex environments of constant change to transform the systems that drive biodiversity loss and unsustainable, inequitable resource use. Here we advocate for a systems view of capacity development. We propose a conceptual framework that aligns capacity development components with all stages of conservation efforts, fosters attention to context, and coordinates with parallel efforts to engage across practitioners and sectors for more systemic impact. Furthermore, we highlight a need for practitioners to target, measure and support vital elements of capacity that have traditionally received less attention, such as values and motivation, leadership and organizational culture, and governance and participation, by using approaches from psychology, the social sciences and systems thinking. Drawing from conservation and other sectors, we highlight examples of approaches that can support reflective practice, so capacity development practitioners can better understand the factors that favour or hinder effectiveness of interventions and influence system-wide change.
Paramedics received training in point-of-care ultrasound (POCUS) to assess for cardiac contractility during management of medical out-of-hospital cardiac arrest (OHCA). The primary outcome was the percentage of adequate POCUS video acquisition and accurate video interpretation during OHCA resuscitations. Secondary outcomes included POCUS impact on patient management and resuscitation protocol adherence.
Methods:
A prospective, observational cohort study of paramedics was performed following a four-hour training session, which included a didactic lecture and hands-on POCUS instruction. The Prehospital Echocardiogram in Cardiac Arrest (PECA) protocol was developed and integrated into the resuscitation algorithm for medical non-shockable OHCA. The ultrasound (US) images were reviewed by a single POCUS expert investigator to determine the adequacy of the POCUS video acquisition and accuracy of the video interpretation. Change in patient management and resuscitation protocol adherence data, including end-tidal carbon dioxide (EtCO2) monitoring following advanced airway placement, adrenaline administration, and compression pauses under ten seconds, were queried from the prehospital electronic health record (EHR).
Results:
Captured images were deemed adequate in 42/49 (85.7%) scans and paramedic interpretation of sonography was accurate in 43/49 (87.7%) scans. The POCUS results altered patient management in 14/49 (28.6%) cases. Paramedics adhered to EtCO2 monitoring in 36/36 (100.0%) patients with an advanced airway, adrenaline administration for 38/38 (100.0%) patients, and compression pauses under ten seconds for 36/38 (94.7%) patients.
Conclusion:
Paramedics were able to accurately obtain and interpret cardiac POCUS videos during medical OHCA while adhering to a resuscitation protocol. These findings suggest that POCUS can be effectively integrated into paramedic protocols for medical OHCA.
Early administration of antibiotics in sepsis is associated with improved patient outcomes, but safe and generalizable approaches to de-escalate or discontinue antibiotics after suspected sepsis events are unknown.
Methods:
We used a modified Delphi approach to identify safety criteria for an opt-out protocol to guide de-escalation or discontinuation of antibiotic therapy after 72 hours in non-ICU patients with suspected sepsis. An expert panel with expertise in antimicrobial stewardship and hospital epidemiology rated 48 unique criteria across 3 anonymous electronic survey rating tools. Criteria were rated primarily on their impact on patient safety and the feasibility of extracting them from electronic health record review. After each round, results were fed back to the expert panel participants until consensus was achieved to either retain or remove each criterion.
Results:
After 3 rounds, 22 unique criteria remained as part of the opt-out safety checklist. These criteria included high-risk comorbidities, signs of severe illness, lack of cultures during sepsis work-up or antibiotic use prior to blood cultures, or ongoing signs and symptoms of infection.
Conclusions:
The modified Delphi approach is a useful method to achieve expert-level consensus in the absence of evidence sufficient to provide validated guidance. The Delphi approach allowed for flexibility in the development of an opt-out trial protocol for sepsis antibiotic de-escalation. The utility of this protocol should be evaluated in a randomized controlled trial.
Cyclosporiasis is an illness characterised by watery diarrhoea caused by the food-borne parasite Cyclospora cayetanensis. The increase in annual US cyclosporiasis cases led public health agencies to develop genotyping tools that aid outbreak investigations. A team at the Centers for Disease Control and Prevention (CDC) developed a system based on deep amplicon sequencing and machine learning for detecting genetically related clusters of cyclosporiasis to aid epidemiologic investigations. An evaluation of this system during 2018 supported its robustness, indicating that it possessed sufficient utility to warrant further evaluation. However, the earliest version of CDC's system had some limitations from a bioinformatics standpoint. Namely, reliance on proprietary software, the inability to detect novel haplotypes and the absence of a strategy to select an appropriate number of discrete genetic clusters would limit the system's future deployment potential. We recently introduced several improvements that address these limitations, and the aim of this study was to reassess the system's performance to ensure that the changes introduced had no observable negative impacts. Comparison of epidemiologically defined cyclosporiasis clusters from 2019 to analogous genetic clusters detected using CDC's improved system reaffirmed its excellent sensitivity (90%) and specificity (99%), and confirmed its high discriminatory power. This C. cayetanensis genotyping system is robust and, with ongoing improvement, will form the basis of a US-wide C. cayetanensis genotyping network for clinical specimens.
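The idea of detecting genetically related clusters can be sketched with a toy single-linkage grouping on Hamming distance: specimens whose marker haplotypes differ at no more than a threshold number of positions are chained into the same cluster. The haplotype strings and threshold below are invented for illustration, not CDC markers or parameters.

```python
def hamming(a, b):
    """Number of positions at which two equal-length haplotype strings differ."""
    return sum(c1 != c2 for c1, c2 in zip(a, b))

def cluster(specimens, max_dist=1):
    """Single-linkage clustering under a distance threshold, via union-find."""
    parent = list(range(len(specimens)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i
    for i in range(len(specimens)):
        for j in range(i + 1, len(specimens)):
            if hamming(specimens[i], specimens[j]) <= max_dist:
                parent[find(i)] = find(j)  # link the two groups
    groups = {}
    for i in range(len(specimens)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Invented 5-marker haplotype strings for five specimens.
specimens = ["AACGT", "AACGA", "TTCGG", "TTCGG", "AACGT"]
clusters = cluster(specimens, max_dist=1)
print(clusters)   # two clusters: specimens {0, 1, 4} and {2, 3}
```

The real system operates on deep amplicon sequencing haplotypes and uses a learned ensemble distance rather than raw Hamming distance, but the cluster-detection step reduces to grouping specimens whose pairwise genetic distances fall below a chosen cutoff.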
This chapter presents an overview of the nature, assessment, and treatment of obsessive-compulsive and related disorders (OCRD), including obsessive-compulsive disorder (OCD), body dysmorphic disorder (BDD), hoarding disorder (HD), hair-pulling disorder (HPD), and skin-picking disorder (SPD). Specifically, we review the DSM-5 diagnostic criteria, epidemiology and impact, clinical features and course, and etiological insights for each of these disorders in turn. Next, we discuss key points to consider when making a differential diagnosis with disorders outside the OCRD category. From there, we turn to a discussion of the assessment and treatment of these disorders using pharmacological, cognitive-behavioral, and neuromodulation interventions. Future directions in the research on OCRDs then follow.
Stem cells give rise to the entirety of cells within an organ. Maintaining stem cell identity and coordinately regulating stem cell divisions are crucial for proper development. In plants, mobile proteins, such as WUSCHEL-RELATED HOMEOBOX 5 (WOX5) and SHORTROOT (SHR), regulate divisions in the root stem cell niche. However, how these proteins coordinately function to establish systemic behaviour is not well understood. We propose a non-cell autonomous role for WOX5 in the cortex endodermis initial (CEI) and identify a regulator, ANGUSTIFOLIA (AN3)/GRF-INTERACTING FACTOR 1, that coordinates CEI divisions. Here, we show with a multi-scale hybrid model integrating ordinary differential equations (ODEs) and agent-based modeling that quiescent center (QC) and CEI divisions have different dynamics. Specifically, by combining continuous models to describe regulatory networks and agent-based rules, we model systemic behaviour, which led us to predict cell-type-specific expression dynamics of SHR, SCARECROW, WOX5, AN3 and CYCLIND6;1, and experimentally validate CEI cell divisions. Taken together, our results show an interdependency between CEI and QC divisions.
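The hybrid ODE/agent-based idea can be sketched in a few lines: a continuous equation tracks the level of a division-promoting regulator (a stand-in for something like CYCLIND6;1 activation), while a discrete agent rule fires a division and resets the regulator when a threshold is crossed. All parameters here are illustrative, not fitted values from the model in the paper.

```python
def simulate(production=0.5, decay=0.1, threshold=3.0, dt=0.01, t_end=100.0):
    """Hybrid simulation: continuous regulator dynamics + discrete division rule."""
    level, divisions, t = 0.0, 0, 0.0
    while t < t_end:
        # Continuous part: dX/dt = production - decay * X  (forward Euler step)
        level += dt * (production - decay * level)
        # Agent rule: divide and reset the regulator when the threshold is hit.
        if level >= threshold:
            divisions += 1
            level = 0.0
        t += dt
    return divisions

print(simulate())
```

Because the steady-state level (production/decay = 5.0 here) exceeds the threshold, divisions recur periodically; lowering production below decay × threshold silences them, which is the kind of qualitative switch such hybrid models are used to probe.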
Food security status is a continuum ranging from high to very low food security. While marginal food security falls next to high food security on the spectrum, new quantitative research indicates marginal food security status is associated with negative health outcomes and poor academic performance among college students. Qualitative research focusing on college students experiencing marginal food security has not been conducted. The current study aims to qualitatively explore experiences of college students with marginal food security and to identify themes to better understand and provide context regarding how marginal food security impacts students.
Design:
Students were recruited for semi-structured interviews with questions designed to study the challenges associated with students’ food situations. All interviews were recorded and transcribed with themes identified via an inductive approach.
Setting:
A large public university on the US west coast.
Participants:
Thirty college students.
Results:
Key themes emerged: purchasing cheap, unhealthy foods; insufficient time to prepare and eat meals on a regular basis; stress and anxiety around the inability to eat healthy food and about future health issues; poor self-perception of health when eating badly, along with physical symptoms; and low academic motivation, with students not fully participating in their courses due to few healthy food options or missed meals.
Conclusion:
Marginal food security can potentially diminish students’ health and their capacity to learn and succeed in their coursework. The results emphasise that students experiencing marginal food security should not be grouped with students experiencing high food security.
Fundamental knowledge about the processes that control the biophysical functioning of ecosystems has expanded exponentially since the late 1960s. Scientists then had only rudimentary knowledge about C, N, P, S, and H2O cycles; plant, animal, and soil microbial interactions and dynamics; and land, atmosphere, and water interactions. With the advent of the systems ecology paradigm (SEP) and the explosion of technologies supporting field and laboratory research, scientists throughout the world were able to assemble the knowledge base known today as ecosystem science. This chapter describes, through the eyes of scientists associated with the Natural Resource Ecology Laboratory (NREL) at Colorado State University (CSU), the evolution of the SEP in discovering how biophysical systems at small scales (ecological sites, landscapes) function as systems. The NREL and CSU are epicenters of the development of ecosystem science. Later, that knowledge, including humans as components of ecosystems, was applied to small regions, regions, and the globe. Many research results that have formed the foundation for ecosystem science and the management of natural resources, terrestrial environments, and their waters are described in this chapter. Throughout are direct and implicit references to the vital collaborations with the global network of ecosystem scientists.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was the establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1,435 people testing positive, for a positivity rate of 2.28%. A total of 1,670 COVID-19 cases were identified, including 235 self-reports. The mean number of tests per week was 3,500, with approximately 80 of these positive (~11 per day). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
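The headline surveillance figures can be checked with one-line arithmetic (counts taken from the abstract; the tracer figure is the staffing implied by the stated 1-per-400 ratio):

```python
# Arithmetic check of the surveillance figures reported above.
tests, positives = 62_970, 1_435
positivity = 100 * positives / tests          # positivity rate in percent
print(f"positivity = {positivity:.2f}%")       # matches the reported 2.28%

members, ratio = 22_700, 400
tracers_implied = members / ratio              # tracers implied at 1 per 400
print(f"tracers implied at 1:400 -> {tracers_implied:.0f}")
```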