The coronavirus disease 2019 (COVID-19) pandemic has greatly impacted health-care systems worldwide, leading to an unprecedented rise in demand for health-care resources. In anticipation of an acute strain on established medical facilities in Dallas, Texas, federal officials worked in conjunction with local medical personnel to convert a convention center into a Federal Medical Station capable of caring for patients affected by COVID-19. A 200,000-square-foot event space was designated as a direct patient care area, with surrounding spaces repurposed to house ancillary services. Given the highly transmissible nature of the novel coronavirus, the donning and doffing of personal protective equipment (PPE) was of particular importance for personnel staffing the facility. Furthermore, nationwide shortages in the availability of PPE necessitated the reuse of certain protective materials. This article seeks to delineate the procedures implemented regarding PPE in the setting of a COVID-19 disaster response shelter, including workspace flow, donning and doffing procedures, PPE conservation, and exposure event protocols.
While negative affect reliably predicts binge eating, it is unknown how this association may decrease or ‘de-couple’ during treatment for binge eating disorder (BED), whether such change is greater in treatments targeting emotion regulation, or how such change predicts outcome. This study utilized multi-wave ecological momentary assessment (EMA) to assess changes in the momentary association between negative affect and subsequent binge-eating symptoms during Integrative Cognitive Affective Therapy (ICAT-BED) and Cognitive Behavior Therapy Guided Self-Help (CBTgsh). It was predicted that there would be stronger de-coupling effects in ICAT-BED than in CBTgsh, given the focus on emotion regulation skills in ICAT-BED, and that greater de-coupling would predict treatment outcomes.
Adults with BED were randomized to ICAT-BED or CBTgsh and completed 1-week EMA protocols and the Eating Disorder Examination (EDE) at pre-treatment, end-of-treatment, and 6-month follow-up (final N = 78). De-coupling was operationalized as a change in momentary associations between negative affect and binge-eating symptoms from pre-treatment to end-of-treatment.
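The abstract does not specify the statistical model, but a de-coupling effect of this kind is commonly tested as a negative affect × assessment-wave interaction in a multilevel model of the EMA data. The sketch below illustrates that idea in Python with statsmodels; the file name, column names, and random-intercept structure are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: variable names (neg_affect, binge_sx, wave,
# subject_id) and the random-intercept specification are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

ema = pd.read_csv("ema_observations.csv")  # hypothetical long-format EMA file

# Person-mean-center momentary negative affect so the slope reflects the
# within-person (momentary) association rather than between-person differences.
ema["na_centered"] = ema["neg_affect"] - ema.groupby("subject_id")["neg_affect"].transform("mean")

# The na_centered x wave interaction captures "de-coupling": a weakening of the
# momentary negative affect -> binge-symptom slope from pre- to end-of-treatment.
model = smf.mixedlm(
    "binge_sx ~ na_centered * wave",  # wave: 0 = pre-treatment, 1 = end-of-treatment
    data=ema,
    groups=ema["subject_id"],
)
result = model.fit()
print(result.summary())
```

A negative interaction coefficient would indicate that the momentary association weakened over treatment, i.e., de-coupling.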
There was a significant de-coupling effect at follow-up but not at end-of-treatment, and de-coupling did not differ between ICAT-BED and CBTgsh. Less de-coupling was associated with higher EDE global scores at end-of-treatment and higher binge frequency at follow-up.
Both ICAT-BED and CBTgsh were associated with de-coupling of momentary negative affect and binge-eating symptoms, which in turn related to cognitive and behavioral treatment outcomes. Future research is warranted to identify differential mechanisms of change across ICAT-BED and CBTgsh. Results also highlight the importance of developing momentary interventions to more effectively de-couple negative affect and binge eating.
As the climate changes and ecosystems shift toward novel combinations of species, the methods and metrics of conservation science are becoming less species-centric. To meet this growing need, marine conservation paleobiologists stand to benefit from the addition of new, taxon-free benthic indices to the live–dead analysis tool kit. These indices, which were developed to provide actionable, policy-specific data, can be applied to the readily preservable component of benthic communities (e.g., mollusks) to assess the ecological quality status of the entire community. Because these indices are taxon-free, they remain applicable even as the climate changes and novel communities develop—making them a potentially valuable complement to traditionally applied approaches for live–dead analysis, which tend to focus on maintaining specific combinations of species under relatively stable environmental conditions. Integrating geohistorical data with these established indices has potential to increase the salience of the live–dead approach in the eyes of resource managers and other stakeholders.
Exposure to glucocorticoid levels higher than appropriate for the current developmental stage induces offspring metabolic dysfunction. Overfed/obese (OB) ewes and their fetuses display elevated blood cortisol, while fetal adrenocorticotropic hormone (ACTH) remains unchanged. We hypothesized that OB pregnancies would show increased placental 11β-hydroxysteroid dehydrogenase 2 (11β-HSD2), which converts maternal cortisol to fetal cortisone as it crosses the placenta, and increased 11β-HSD system components responsible for peripheral tissue cortisol production, providing a mechanism for an ACTH-independent increase in circulating fetal cortisol. Control (CON) ewes ate 100% of National Research Council recommendations and OB ewes ate 150% of the CON diet from 60 days before conception until necropsy at day 135 of gestation. At necropsy, maternal jugular and umbilical venous blood, fetal liver, perirenal fat, and cotyledonary tissues were harvested. Maternal plasma cortisol and fetal plasma cortisol and cortisone were measured. Protein abundance of 11β-HSD1, hexose-6-phosphate dehydrogenase (H6PD), and 11β-HSD2 in fetal liver, perirenal fat, and cotyledonary tissue was determined by Western blot. Maternal plasma cortisol, fetal plasma cortisol, and fetal plasma cortisone were higher in OB vs. CON (p < 0.01). 11β-HSD2 protein was greater (p < 0.05) in OB cotyledonary tissue than in CON. 11β-HSD1 abundance was increased (p < 0.05) in OB vs. CON fetal liver and perirenal fat. Fetal H6PD, an 11β-HSD1 cofactor, was also increased (p < 0.05) in OB vs. CON perirenal fat and tended to be elevated in OB liver (p < 0.10). Our data provide evidence for increased 11β-HSD system components responsible for peripheral tissue cortisol production in fetal liver and adipose tissue, thereby providing a mechanism for an ACTH-independent increase in circulating fetal cortisol in OB fetuses.
The apolipoprotein E (APOE) ε4 allele increases the risk for mild cognitive impairment (MCI) and dementia, but not all carriers develop MCI/dementia. The purpose of this exploratory study was to determine if early and subtle preclinical signs of cognitive dysfunction and medial temporal lobe atrophy are observed in cognitively intact ε4 carriers who subsequently develop MCI.
Twenty-nine healthy, cognitively intact ε4 carriers (ε3/ε4 heterozygotes; ages 65–85) underwent neuropsychological testing and MRI-based measurements of medial temporal volumes over a 5-year follow-up interval; data were converted to z-scores based on a non-carrier group consisting of 17 ε3/ε3 homozygotes.
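Concretely, the z-score conversion standardizes each carrier's score against the non-carrier distribution:

$$ z = \frac{x - \mu_{\varepsilon3/\varepsilon3}}{\sigma_{\varepsilon3/\varepsilon3}} $$

where $\mu_{\varepsilon3/\varepsilon3}$ and $\sigma_{\varepsilon3/\varepsilon3}$ are the mean and standard deviation of the 17 ε3/ε3 homozygotes on the same neuropsychological measure or regional volume.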
At follow-up, 11 ε4 carriers (38%) converted to a diagnosis of MCI. At study entry, the MCI converters had significantly lower scores on the Mini-Mental State Examination, Rey Auditory Verbal Learning Test (RAVLT) Trials 1–5, and RAVLT Immediate Recall compared to non-converters. MCI converters also had smaller MRI volumes in the left subiculum than non-converters. Follow-up logistic regressions revealed that left subiculum volumes and RAVLT Trials 1–5 scores were significant predictors of MCI conversion.
Results from this exploratory study suggest that ε4 carriers who convert to MCI exhibit subtle cognitive and volumetric differences years prior to diagnosis.
Claims of conscience are a substantial area of concern in relation to healthcare decisions but are often considered only in a limited context. Broadening our understanding of claims of conscience, however, might invite the objection that we are moving back towards a doctor-centred understanding of medical care. This article argues that we can allow claims of conscience without unduly penalising patients by focusing on the responsibilities that ought to attach to conscience claims. It sets out three responsibilities – humility, universality and reciprocal respect – which ought to be part of any claim of conscience. The Charlie Gard case is then used as an example to explore the application of these responsibilities. The article concludes by considering possible issues that arise from this view.
Cancer is the second leading cause of death worldwide. Lifestyle choices play an important role in the aetiology of cancer with up to 4 in 10 cases potentially preventable. Interventions delivered by healthcare professionals (HCPs) that incorporate risk information have the potential to promote behaviour change. Our aim was to develop a very brief intervention incorporating cancer risk, which could be implemented within primary care.
Guided by normalisation process theory (NPT), we developed a prototype intervention using literature reviews, consultation with patient and public representatives and pilot work with patients and HCPs. We conducted focus groups and interviews with 65 HCPs involved in delivering prevention activities. Findings were used to refine the intervention before 22 HCPs completed an online usability test and provided further feedback via a questionnaire incorporating a modified version of the NoMAD checklist.
The intervention included a website where individuals could provide information on lifestyle risk factors, view their estimated 10-year risk of developing one or more of the five most common preventable cancers, and access lifestyle advice incorporating behaviour change techniques. Changes made in response to feedback from the focus groups and interviews included signposting to local services and websites and simplified wording and labelling of risk information. In the usability testing, all participants felt it would be easy to collect the risk information. Ninety-one percent felt the intervention would enable discussion about cancer risk and believed it had potential to be easily integrated into National Health Service (NHS) Health Checks. However, only 36% agreed it could be delivered within 5 min.
With the use of NPT, we developed a very brief intervention that is acceptable to HCPs in primary care and could be potentially integrated into NHS Health Checks. However, further work is needed to assess its feasibility and potential effectiveness.
Palmer amaranth is the most common and troublesome weed in North Carolina sweetpotato. Field studies were conducted in Clinton, NC, in 2016 and 2017 to determine the critical timing of Palmer amaranth removal in ‘Covington’ sweetpotato. Palmer amaranth was grown with sweetpotato from transplanting until 2, 3, 4, 5, 6, 7, 8, or 9 wk after transplanting (WAP), after which plots were maintained weed-free for the remainder of the season. Palmer amaranth height and shoot dry biomass increased as removal was delayed. Season-long Palmer amaranth interference reduced marketable yields by 85% and 95% in 2016 and 2017, respectively. Sweetpotato yield loss displayed a strong inverse linear relationship with Palmer amaranth height: a 0.6% and 0.4% decrease in yield was observed for every centimeter of Palmer amaranth growth in 2016 and 2017, respectively. The critical timing of Palmer amaranth removal, based on a 5% loss of marketable yield, was estimated by fitting a log-logistic model to the relative yield data and was found to be 2 WAP. These results show that Palmer amaranth is highly competitive with sweetpotato and should be managed as early as possible in the season. The need for such early weed removal to prevent yield loss emphasizes the importance of early-season scouting and Palmer amaranth removal in sweetpotato fields; any delay in removal can result in substantial yield reductions and fewer premium-quality roots.
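To make the critical-timing calculation concrete, the sketch below fits a four-parameter log-logistic curve to relative yield and solves for the interference duration giving 5% yield loss. The parameterization, starting values, and data points are assumptions for illustration; the abstract reports only that a log-logistic model was fit.

```python
# Hedged sketch: the exact parameterization and these example data are
# invented; only the log-logistic fit itself is stated in the abstract.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(t, b, c, d, e):
    """Four-parameter log-logistic: relative yield vs. weeks of interference.
    c/d = lower/upper asymptotes, e = inflection point, b = slope."""
    return c + (d - c) / (1.0 + np.exp(b * (np.log(t) - np.log(e))))

# Hypothetical data: weeks of Palmer amaranth interference after transplanting
# and relative marketable yield (% of season-long weed-free plots).
weeks = np.array([2, 3, 4, 5, 6, 7, 8, 9], dtype=float)
rel_yield = np.array([96, 88, 74, 57, 43, 30, 21, 15], dtype=float)

popt, _ = curve_fit(log_logistic, weeks, rel_yield, p0=[2.0, 10.0, 100.0, 5.0])
b, c, d, e = popt

# Critical timing of weed removal = interference duration giving 5% yield
# loss, i.e., where the fitted curve drops to 95% relative yield.
target = 95.0
t_crit = np.exp(np.log(e) + np.log((d - c) / (target - c) - 1.0) / b)
print(f"Critical timing of removal ~ {t_crit:.1f} weeks after transplanting")
```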
Field studies were conducted in 2016 and 2017 in Clinton, NC, to determine the interspecific and intraspecific interference of Palmer amaranth (Amaranthus palmeri S. Watson) or large crabgrass [Digitaria sanguinalis (L.) Scop.] in ‘Covington’ sweetpotato [Ipomoea batatas (L.) Lam.]. Amaranthus palmeri and D. sanguinalis were established 1 d after sweetpotato transplanting and maintained season-long, in both the presence and absence of sweetpotato, at 0, 1, 2, 4, and 8 plants m⁻¹ of row (A. palmeri) and 0, 1, 2, 4, and 16 plants m⁻¹ of row (D. sanguinalis). Predicted yield loss for sweetpotato was 35% to 76% for D. sanguinalis at 1 to 16 plants m⁻¹ of row and 50% to 79% for A. palmeri at 1 to 8 plants m⁻¹ of row. Weed dry biomass per meter of row increased linearly with increasing weed density. Individual dry biomass of A. palmeri and D. sanguinalis was not affected by weed density when grown in the presence of sweetpotato. When grown without sweetpotato, individual weed dry biomass decreased 71% and 62% from 1 to 4 plants m⁻¹ of row for A. palmeri and D. sanguinalis, respectively. Individual weed dry biomass was not affected above 4 plants m⁻¹ of row up to the highest densities of 8 and 16 plants m⁻¹ of row for A. palmeri and D. sanguinalis, respectively.
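The abstract reports predicted yield losses across densities but does not name the fitted model; a common choice in weed-interference studies is Cousens' rectangular hyperbola, shown here only as a plausible form:

$$ Y_L = \frac{I\,D}{1 + \dfrac{I\,D}{A}} $$

where $Y_L$ is percentage yield loss, $D$ is weed density (plants m⁻¹ of row), $I$ is the percentage loss per weed as density approaches zero, and $A$ is the asymptotic maximum yield loss at high density.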
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of the clinical and environmental isolates implicated in those clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
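For intuition, cluster identification from single-nucleotide variant (SNV) analysis can be reduced to grouping isolates whose pairwise SNV distances fall below a relatedness threshold. The sketch below shows this in Python; the isolate names, distance matrix, and 20-SNV threshold are invented for illustration, and the study's cloud-computing platform is not represented here.

```python
# Illustrative sketch of cluster identification from pairwise SNV distances.
# Isolate names, distances, and the <= 20 SNV threshold are assumptions.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

isolates = ["iso_A", "iso_B", "iso_C", "iso_D"]

# Hypothetical symmetric matrix of pairwise single-nucleotide variant counts.
snv_dist = np.array([
    [0,   4,   310, 295],
    [4,   0,   305, 290],
    [310, 305, 0,   12],
    [295, 290, 12,  0],
])

# Single-linkage clustering; isolates within the SNV threshold are flagged
# as a potential transmission cluster for epidemiologic review.
Z = linkage(squareform(snv_dist), method="single")
labels = fcluster(Z, t=20, criterion="distance")
for iso, lab in zip(isolates, labels):
    print(iso, "-> potential cluster", lab)
```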
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance, both by identifying cross-transmission events that would otherwise be missed and by excluding misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise for reducing healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
Field and greenhouse studies were conducted in 2016 and 2017 to determine sweetpotato tolerance to herbicides applied to plant propagation beds. Herbicide treatments included PRE application of flumioxazin (107 g ai ha⁻¹), S-metolachlor (800 g ai ha⁻¹), fomesafen (280 g ai ha⁻¹), flumioxazin plus S-metolachlor (107 g ai ha⁻¹ + 800 g ai ha⁻¹), fomesafen plus S-metolachlor (280 g ai ha⁻¹ + 800 g ai ha⁻¹), fluridone (1,120 or 2,240 g ai ha⁻¹), fluridone plus S-metolachlor (1,120 g ai ha⁻¹ + 800 g ai ha⁻¹), napropamide (1,120 g ai ha⁻¹), clomazone (420 g ai ha⁻¹), linuron (560 g ai ha⁻¹), linuron plus S-metolachlor (560 g ai ha⁻¹ + 800 g ai ha⁻¹), bicyclopyrone (38 or 49.7 g ai ha⁻¹), pyroxasulfone (149 g ai ha⁻¹), a pre-mix of flumioxazin plus pyroxasulfone (81.8 g ai ha⁻¹ + 104.2 g ai ha⁻¹), or metribuzin (294 g ai ha⁻¹). Paraquat plus non-ionic surfactant (280 g ai ha⁻¹ + 0.25% v/v) POST was also included. After plants in the propagation bed were cut and sweetpotato slip number, length, and weight had been determined, the slips were transplanted to containers and placed either in the greenhouse or on an outdoor pad to determine any effects of the herbicide treatments on initial sweetpotato growth. Sweetpotato slip number, length, and/or weight were affected by flumioxazin with or without S-metolachlor, S-metolachlor with or without fomesafen, clomazone, and all fluridone treatments. In the greenhouse studies, initial root growth of plants after transplanting was inhibited by fluridone (1,120 g ai ha⁻¹) and fluridone plus S-metolachlor. However, by 5 wk after transplanting, few differences were observed between treatments. Fomesafen, linuron with or without S-metolachlor, bicyclopyrone (38 or 49.7 g ai ha⁻¹), pyroxasulfone with or without flumioxazin, metribuzin, and paraquat did not cause injury to sweetpotato slips in any of the studies conducted.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference = 0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference = 0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
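For context, fixed-effects meta-analysis combines the cohort-specific regression coefficients with inverse-variance weights (the standard fixed-effects estimator; the consortium's exact software is not stated in the abstract):

$$ \hat{\beta}_{\mathrm{FE}} = \frac{\sum_i w_i \hat{\beta}_i}{\sum_i w_i}, \qquad w_i = \frac{1}{\mathrm{SE}_i^{2}}, \qquad \mathrm{SE}\big(\hat{\beta}_{\mathrm{FE}}\big) = \Big(\sum_i w_i\Big)^{-1/2} $$

where $\hat{\beta}_i$ is cohort $i$'s estimated change in FEV1 or FVC (ml) per 1 nmol/l higher 25(OH)D, so more precisely estimated cohorts contribute more to the pooled estimate.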
Network analysis is an emerging approach in the study of psychopathology, yet few applications have been seen in eating disorders (EDs). Furthermore, little research exists regarding changes in network strength after interventions. Therefore, the present study examined the network structures of ED and co-occurring depression and anxiety symptoms before and after treatment for EDs.
Participants from residential or partial hospital ED treatment programs (N = 446) completed assessments upon admission and discharge. Networks were estimated with regularized Gaussian graphical models using 38 items from the Eating Disorder Examination-Questionnaire, the Quick Inventory of Depressive Symptomatology, and the State-Trait Anxiety Inventory.
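As a sketch of what a regularized Gaussian graphical model estimates, the Python example below uses scikit-learn's graphical lasso, one such L1-regularized estimator (symptom-network studies often use EBIC-tuned glasso in R instead; the authors' software is not stated). The input file and column layout are hypothetical.

```python
# Hedged sketch: GraphicalLassoCV is one regularized Gaussian graphical model
# estimator, not necessarily the one used in the study. Data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.covariance import GraphicalLassoCV
from sklearn.preprocessing import StandardScaler

items = pd.read_csv("symptom_items.csv")  # hypothetical: 38 columns, one per item
X = StandardScaler().fit_transform(items)

# The L1 penalty shrinks small partial correlations to exactly zero, so the
# surviving edges represent conditional dependencies between symptoms.
model = GraphicalLassoCV().fit(X)
precision = model.precision_

# Convert the precision matrix to partial correlations (the edge weights).
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# Global network strength is then the sum of absolute edge weights.
strength = np.abs(np.triu(partial_corr, k=1)).sum()
print(f"Global network strength: {strength:.2f}")
```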
ED symptoms with high centrality indices included a desire to lose weight, guilt about eating, shape overvaluation, and wanting an empty stomach, while restlessness, self-esteem, lack of energy, and feeling overwhelmed bridged ED to depression and anxiety symptoms. Comparisons between admission and discharge networks indicated the global network strength did not change significantly, though symptom severity decreased. Participants with denser networks at admission evidenced less change in ED symptomatology during treatment.
Findings suggest that symptoms related to shape and weight concerns and guilt are central ED symptoms, while physical symptoms, self-esteem, and feeling overwhelmed are links that may underlie comorbidities in EDs. Results provided some support for the validity of network approaches, in that admission networks conveyed prognostic information. However, the lack of correspondence between symptom reduction and change in network strength indicates that future research is needed to examine network dynamics in the context of intervention and relapse prevention.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9,200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
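As a minimal sketch of programmatic access, the Python snippet below queries the public Neotoma REST API. The base URL, endpoint, and parameters shown are assumptions about the current v2.0 API and the example search term is invented; the neotoma R package offers equivalent functionality.

```python
# Hedged sketch of a Neotoma API query; endpoint and parameters are assumed.
import requests

BASE = "https://api.neotomadb.org/v2.0/data"  # assumed base URL

# Search for sites by name (hypothetical example query).
resp = requests.get(f"{BASE}/sites", params={"sitename": "%Lake%", "limit": 5})
resp.raise_for_status()

for site in resp.json().get("data", []):
    print(site.get("siteid"), site.get("sitename"))
```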
Use of ketamine in the prehospital setting may be advantageous due to its potent analgesic and sedative properties and favorable risk profile. Use in the military setting has demonstrated both efficacy and safety for pain relief. The purpose of this study was to assess ketamine training, use, and perceptions in the civilian setting among nationally certified paramedics (NRPs) in the United States.
A cross-sectional survey of NRPs was performed. The electronic questionnaire assessed paramedic training, authorization, use, and perceptions of ketamine. Included in the analysis were completed surveys of paramedics who held one or more state paramedic credentials, indicated “patient care provider” as their primary role, and worked in non-military settings. Descriptive statistics were calculated.
A total of 14,739 responses were obtained (response rate = 23%), of which 10,737 (73%) met inclusion criteria and constituted the study cohort. Over one-half (53%) of paramedics reported learning about ketamine during their initial paramedic training, while 42% reported seeking ketamine-related education on their own. Of all respondents, only 33% (3,421/10,737) were authorized by protocol to use ketamine. The most commonly authorized uses were rapid sequence intubation (RSI; 72%), chemical restraint/sedation (72%), and pain management (55%). One-third of authorized providers (1,107/3,350) had never administered ketamine, and another 32% (1,070/3,350) had administered it fewer than five times in their career. Ketamine was perceived to be safe and effective: the vast majority reported being comfortable with its use (94%) and would use it again in similar situations (95%).
This was the first large, national survey to assess ketamine training, use, and perceptions among paramedics in the civilian prehospital setting. While training related to ketamine use was commonly reported among paramedics, few were authorized to administer the drug by their agency’s protocols. Of those authorized to use ketamine, most paramedics had limited experience administering the drug. Future research is needed to determine why the prevalence of ketamine use is low and to assess the safety and efficacy of ketamine use in the prehospital setting.
Buckland DM, Crowe RP, Cash RE, Gondek S, Maluso P, Sirajuddin S, Smith ER, Dangerfield P, Shapiro G, Wanka C, Panchal AR, Sarani B. Ketamine in the Prehospital Environment: A National Survey of Paramedics in the United States. Prehosp Disaster Med. 2018;33(1):23–28.
Investigation of Lake Quinault in western Washington, including a reflection seismic survey, analysis of piston cores, and preliminary mapping in the steep, landslide-prone Quinault River catchment upstream of the lake, reveals evidence for three episodes of earthquake disturbance in the past 3000 yr. These earthquakes triggered failures on the lake’s underwater slopes and delta front, as well as subaerial landsliding, partial channel blockage, and forced fluvial sediment aggradation. The ages of the three Lake Quinault disturbance events overlap with those of coseismically subsided, coastal marsh soils nearby in southwest Washington that are interpreted to record ruptures of the Cascadia megathrust. Absent from Lake Quinault, however, are signals of obvious disturbance from five additional subduction earthquakes inferred to have occurred during the period of record. The lack of evidence for these events may reflect the limitations of the data set derived from the detrital, river-dominated lake stratigraphy but may also have bearing on debates about segmentation and the distribution of slip along the Cascadia subduction zone during prior earthquakes.
In 1968, the FAA adopted a high-density rule for the allocation of scarce landing and takeoff slots at four major airports (La Guardia, Washington National, Kennedy International, and O'Hare International). This rule establishes slot quotas for the control of airspace congestion at these airports.
Airport runway slots, regulated by these quotas, have a distinguishing feature that any proposed allocation procedure must accommodate: an airline's demand for a takeoff slot at a flight's originating airport is not independent of its demand for a landing slot at the flight's destination airport. Indeed, a given flight may take off and land in a sequence of several connected, demand-interdependent legs. For economic efficiency, it is desirable to develop an airport slot allocation procedure that allocates individual slots to those airline flights for which the demand (willingness to pay) is greatest.
Grether, Isaac, and Plott (1979, 1981; hereafter GIP) have proposed a practical market procedure for achieving this goal. Their procedure is based upon the growing body of experimental evidence on the performance of (1) the competitive (uniform-price) sealed-bid auction and (2) the oral double auction, such as is used on the organized stock and commodity exchanges. Under their proposal, an independent primary market for slots at each airport would be organized as a sealed-bid competitive auction held at timely intervals. Since the primary market allocation does not make provision for slot demand interdependence, a computerized form of the oral double auction (with block transaction capabilities) is proposed as an “after market” to allow airlines to freely purchase and sell primary market slots among themselves. This continuous after-market exchange would provide the institutional means by which individual airlines could acquire the slot packages that support their individual flight schedules. Thus, an airline that acquired slots at Washington National which did not flight-match the slots acquired at O'Hare could either buy additional O'Hare slots or sell its excess Washington slots in the after market.
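To make the primary-market mechanism concrete, the Python sketch below implements a uniform-price sealed-bid auction for a fixed slot quota at a single airport. The airline names and bids are invented, and the pricing-at-the-highest-rejected-bid convention is one common variant assumed here for illustration (uniform-price designs can also price at the lowest accepted bid).

```python
# Minimal sketch of a competitive (uniform-price) sealed-bid auction for a
# fixed quota of slots at one airport. Bids and the pricing rule (highest
# rejected bid) are assumptions for illustration, not GIP's exact design.
def uniform_price_auction(bids, quota):
    """bids: list of (airline, amount); quota: number of slots available.
    Returns the winning airlines and the single market-clearing price."""
    ranked = sorted(bids, key=lambda b: b[1], reverse=True)
    winners = ranked[:quota]
    # All winners pay the same price: here, the highest rejected bid
    # (0 if every bid can be accommodated within the quota).
    price = ranked[quota][1] if len(ranked) > quota else 0
    return [airline for airline, _ in winners], price

bids = [("AA", 900), ("UA", 850), ("DL", 700), ("WN", 600), ("B6", 450)]
winners, price = uniform_price_auction(bids, quota=3)
print(winners, "each pay", price)  # ['AA', 'UA', 'DL'] each pay 600
```

The GIP proposal layers the after market on top of this: airlines holding mismatched takeoff and landing slots would then trade in a continuous double auction until their slot portfolios support feasible flight schedules.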