Nutrition education programs for athletes aim to enhance nutrition knowledge and, more importantly, support positive dietary change to enhance performance, health, and well-being. This systematic review assessed changes in the dietary intakes of athletes in response to nutrition education programs. A search was conducted for studies providing quantitative dietary intake assessment of athletes of any calibre aged 12 to 65 years in response to a nutrition education program. Standardized differences (effect sizes) were calculated, when possible, for each dietary parameter. The search yielded 6285 papers, with 22 studies (974 participants; 71.9% female) eligible for inclusion. Studies described athletes competing at high school (n=3) through to college level or higher (n=19). Study designs were either single-arm with an intervention-only group (12 studies; n=241) or double-arm including an intervention and a control group (10 studies; n=689). No control groups received an alternative or ‘sham’ intervention. Face-to-face lectures (9/22) and individual nutrition counselling (6/22) were the most common education interventions. Non-weighed, three-day diet records (10/22) were the most frequently utilised dietary assessment method. Although 14/22 studies (n=5 single and n=9 double) reported significant change in at least one nutrition parameter, dietary changes were inconsistent. Poor study quality and heterogeneity of methods prohibit firm conclusions regarding overall intervention success or superior types of educational modalities. Of note, post-intervention carbohydrate intakes, when assessed, often failed to meet recommended guidelines (12/17 studies). Given the substantial investment made in nutrition education interventions with athletes, there is a need for well-designed and rigorous research to inform future best practice.
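The standardized differences mentioned above are typically computed as Cohen's d, the difference in means scaled by a pooled standard deviation. The review does not publish its exact formula, so the sketch below is illustrative only, with made-up pre- and post-intervention intakes.

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Standardized mean difference between post- and pre-intervention values,
    using the pooled standard deviation as the scaling unit."""
    n1, n2 = len(pre), len(post)
    pooled_sd = (((n1 - 1) * stdev(pre) ** 2 + (n2 - 1) * stdev(post) ** 2)
                 / (n1 + n2 - 2)) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical daily carbohydrate intakes (g/kg body mass) for one study arm
pre_intake = [4.1, 3.8, 4.5, 4.0, 3.6, 4.2]
post_intake = [5.0, 4.6, 5.3, 4.8, 4.4, 5.1]
print(round(cohens_d(pre_intake, post_intake), 2))
```

By convention, d around 0.2 is a small effect, 0.5 medium, and 0.8 large, which is one way a review can compare dietary changes across heterogeneous studies.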
Self-regulated learning (SRL) involves a system of cyclically related, goal-directed skills and processes that students can use to overcome academic challenges and to optimize their success in school. Although there are many complex models of SRL, the purpose of this chapter is to distill key themes among prominent SRL theories and to provide practical guidelines for incorporating SRL principles into classroom instruction or direct service activities with students. In this chapter, we describe how students can be taught to engage in a cyclical process of SRL involving the use of metacognitive skills (i.e., setting goals, planning, and evaluating), strategic thinking and action, and adaptive motivational beliefs. We also illustrate how educators can support SRL skills by fostering a supportive learning environment encompassing five key principles (e.g., helping students set clear and relevant goals, talking in the language of strategies) and/or by implementing established school-based SRL intervention programs. The characteristics of an SRL intervention program, the Self-Regulation Empowerment Program (SREP), concrete SRL case scenarios, and supplemental resources are also presented.
In 2009, the Institute of Medicine published guidelines for implementation of Crisis Standards of Care (CSC) at the state level in the United States (US). Based in part on concern at the time about the H1N1 pandemic, there was a recognized need for additional planning at the state level to maintain health system preparedness and conventional care standards when available resources become scarce. Despite the availability of this framework, in the years since, and despite repeated large-scale domestic events, implementation has remained mixed.
Coronavirus disease 2019 (COVID-19) has renewed concern for how health systems can maintain quality care when faced with unrelenting burden. This study seeks to outline which states in the US have developed CSC guidance and which areas of care have thus far been addressed.
An online search was conducted for all 50 states in 2015 and again in 2020. For states without CSC plans online, state officials were contacted by email and phone. Public protocols were reviewed to assess operational implementation capabilities, specifically highlighting guidance on ventilator use, burn management, Sequential Organ Failure Assessment (SOFA) score, pediatric standards, and reliance on influenza planning.
Thirty-six states in the US were actively developing (17) or had already developed (19) official CSC guidance; 14 states had no publicly acknowledged effort. Eleven of the 17 public plans had been updated within the previous five years, with a majority addressing ventilator usage (16/17), influenza planning (14/17), and pediatric care (15/17), but substantially fewer addressing care for burn patients (9/17).
Many states lacked publicly available guidance on maintaining standards of care during disasters, and many states with specific care guidelines had not sufficiently addressed the full spectrum of hazards to which their health care systems remain vulnerable.
Dinosaur body fossil material is rare in Scotland, previously known almost exclusively from the Great Estuarine Group on the Isle of Skye. We report the first unequivocal dinosaur fossil from the Isle of Eigg, belonging to a Bathonian (Middle Jurassic) taxon of uncertain affinity. The limb bone NMS G.2020.10.1 is incomplete, but through a combination of anatomical comparison and osteohistology, we determine it most likely represents a stegosaur fibula. The overall proportions and cross-sectional geometry are similar to the fibulae of thyreophorans. Examination of the bone microstructure reveals a high degree of remodelling and randomly distributed longitudinal canals in the remaining primary cortical bone. This contrasts with the histological signal expected of theropod or sauropod limb bones, but is consistent with previous studies of thyreophorans, specifically stegosaurs. Previous dinosaur material from Skye and broadly contemporaneous sites in England belongs to this group, including Loricatosaurus and Sarcolestes, as well as a number of indeterminate stegosaur specimens. Theropods such as Megalosaurus and sauropods such as Cetiosaurus are also known from these localities. Although we find strong evidence for a stegosaur affinity, diagnostic features are not observed on NMS G.2020.10.1, preventing us from referring it to any known genus. The presence of this large-bodied stegosaur on Eigg adds a significant new datapoint for dinosaur distribution in the Middle Jurassic of Scotland.
The increases in mortality and total prehospital time (TPT) seen in Qatar appear to be real. However, existing reports on the influence of TPT on mortality in trauma patients are conflicting. This study aimed to explore the impact of prehospital time on in-hospital outcomes.
A retrospective analysis of data on patients transferred alive by Emergency Medical Services (EMS) and admitted to Hamad Trauma Center (HTC) of Hamad General Hospital (HGH; Doha, Qatar) from June 2017 through May 2018 was conducted. The study drew on the National Trauma Registry database. Patients were categorized based on trauma triage activation and prehospital intervals, and comparative analysis was performed.
A total of 1,455 patients were included, of which nearly one-quarter required urgent and life-saving care at a trauma center (T1 activations). The overall TPT was 70 minutes and the on-scene time (OST) was 24 minutes. Compared to T2 activations, T1 patients were more likely to have been involved in road traffic injuries (RTIs); experienced head and chest injuries; presented with higher Injury Severity Score (ISS: median = 22); and had prolonged OST (27 minutes) and reduced TPT (65 minutes; P = .001). Prolonged OST was found to be associated with higher mortality in T1 patients, whereas TPT was not.
In-hospital mortality was independent of TPT but associated with longer OST in severely injured patients. The survival benefit may extend beyond the golden hour and may depend on the injury characteristics, prehospital, and in-hospital settings.
To describe the pattern of transmission of SARS-CoV-2 during 2 nosocomial outbreaks of COVID-19 with regard to the possibility of airborne transmission.
Contact investigations with active case finding were used to assess the pattern of spread from 2 COVID-19 index patients.
A community hospital and university medical center in the United States, in February and March, 2020, early in the COVID-19 pandemic.
Two index patients and 421 exposed health care workers.
Exposed staff were identified by analyzing the electronic medical record (EMR) and conducting active case finding in combination with structured interviews. Staff were tested for COVID-19 by obtaining oropharyngeal/nasopharyngeal specimens, with RT-PCR testing to detect SARS-CoV-2.
Two separate index patients were admitted in February and March 2020, without initial suspicion for COVID-19 and without contact or droplet precautions in place; both patients underwent several aerosol-generating procedures in this context. A total of 421 health care workers were exposed, and the case contact investigations identified 8 secondary infections in health care workers. In all 8 cases, the staff had close contact with the index patients without sufficient personal protective equipment. Importantly, despite multiple aerosol-generating procedures, there was no evidence of airborne transmission.
These observations suggest that, at least in a healthcare setting, a majority of SARS-CoV-2 transmission is likely to take place during close contact with infected patients through respiratory droplets, rather than by long-distance airborne transmission.
The coronavirus disease 2019 (COVID-19) pandemic has led to physical distancing measures in numerous countries in an attempt to control its spread. However, these measures are not without cost to the health and economies of the nations in which they are enacted. Nations are now looking for methods to remove physical distancing measures and return to full functioning. To prevent a massive second wave of infections, this must be done with a data-driven methodology. The purpose of this article is to propose an algorithm for COVID-19 testing that would allow physical distancing to be scaled back in a stepwise manner, limiting ensuing infections and protecting the capacity of the health care system.
Colombia is the fourth largest country in South America. It is an upper middle-income country with an estimated population of 49.2 million people, and road traffic collisions (RTCs) are the second most common cause of traumatic death. The United Nations (UN) proclaimed 2011 to 2020 as the “Decade of Action for Road Safety.” In this context, the government of Colombia established the National Road Safety Plan (PNSV) for the period 2011-2021, aiming to reduce RTC-related deaths by 26%. Some road safety laws (RSLs) were implemented before the PNSV, but their impact on deaths and injuries is still not known.
The aim of this study was to evaluate whether these RSLs have had a long-term effect on road safety in the country.
Data on RTC casualties, deaths, and injuries from January 1, 2001 through December 31, 2017 were collated from official Colombian governmental publications. Three different periods were considered for analysis: 2001-2010 to evaluate the Transit Code; 2011-2017 to evaluate the PNSV; and 2001-2017 to evaluate a composite of the full study period. Analyses of trends in deaths and injuries were related to dates of new RSLs.
A total of 102,723 deaths (12.7%) and 707,778 injuries (87.3%) were reported from 2001 through 2017. During the Transit Code period (2001-2010), deaths declined by 10.1% and injuries by 16.6%, and rates per 100,000 inhabitants and per 10,000 registered vehicles also declined. During the PNSV period (2011-2017), deaths increased by 16.6%, injuries decreased by 1.7%, and death rates per 100,000 inhabitants also increased. Over the full study period, the total number of casualties fell by 12.4%, and death and injury rates per 100,000 inhabitants decreased by 12.4% and 27.5%, respectively.
Despite the introduction of the PNSV, RTCs remain the second most common cause of preventable death in Colombia. Overall, while the absolute number of RTCs and deaths has been increasing, the rate of RTCs per 10,000 registered vehicles has been decreasing. This suggests that although the goals of the PNSV may not be realized, some of the laws emanating from it may be having a beneficial effect. Further study is required over a protracted period to determine the longer-term impact of these initiatives.
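The rates reported above are simple normalized counts: events per 100,000 inhabitants or per 10,000 registered vehicles. The abstract does not report the underlying denominators, so the figures below are hypothetical, used only to show the arithmetic; the population value is the one given in the abstract.

```python
def rate_per(count, denominator, scale):
    """Events per `scale` units of the denominator (e.g., per 100,000 inhabitants)."""
    return count * scale / denominator

# Hypothetical yearly figures for illustration only
deaths = 6_500
population = 49_200_000          # ~49.2 million inhabitants (from the abstract)
registered_vehicles = 13_000_000  # assumed fleet size

death_rate_pop = rate_per(deaths, population, 100_000)          # per 100,000 inhabitants
death_rate_veh = rate_per(deaths, registered_vehicles, 10_000)  # per 10,000 vehicles
print(round(death_rate_pop, 1), round(death_rate_veh, 1))
```

Tracking both denominators matters here: a growing vehicle fleet can push the per-vehicle rate down even while the absolute number of deaths rises, which is exactly the pattern the abstract describes.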
During the COVID-19 pandemic, the antimicrobial stewardship module in our electronic medical record was reconfigured for the management of COVID-19 patients. This change allowed our subspecialist providers to review charts quickly to optimize potential therapy and management during the patient surge.
As the climate changes and ecosystems shift toward novel combinations of species, the methods and metrics of conservation science are becoming less species-centric. To meet this growing need, marine conservation paleobiologists stand to benefit from the addition of new, taxon-free benthic indices to the live–dead analysis tool kit. These indices, which were developed to provide actionable, policy-specific data, can be applied to the readily preservable component of benthic communities (e.g., mollusks) to assess the ecological quality status of the entire community. Because these indices are taxon-free, they remain applicable even as the climate changes and novel communities develop—making them a potentially valuable complement to traditionally applied approaches for live–dead analysis, which tend to focus on maintaining specific combinations of species under relatively stable environmental conditions. Integrating geohistorical data with these established indices has potential to increase the salience of the live–dead approach in the eyes of resource managers and other stakeholders.
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but increased for subjects who had activation (48% versus 58%; 95% CI for the difference, 4.6% to 15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The intervention also decreased variability between medical centers. There was no associated change in average FMC2B.
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
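Positive predictive value here is simply confirmed cases (PCI or CABG) divided by total activations. The sketch below uses counts back-derived to match the reported PPVs (37.9% pre, 48.6% post), not the study's actual numbers, and computes the crude odds ratio between the two proportions.

```python
def ppv(true_positives, activations):
    """Positive predictive value: fraction of activations confirmed by PCI/CABG."""
    return true_positives / activations

def odds_ratio(p_post, p_pre):
    """Unadjusted odds ratio comparing two proportions."""
    return (p_post / (1 - p_post)) / (p_pre / (1 - p_pre))

# Hypothetical counts chosen only to reproduce the reported PPVs
ppv_pre = ppv(379, 1000)   # 37.9% pre-intervention
ppv_post = ppv(486, 1000)  # 48.6% post-intervention
print(round(odds_ratio(ppv_post, ppv_pre), 2))
```

The crude OR from these proportions comes out slightly above the study's reported 1.4, which is consistent with the published figure being model-adjusted rather than a raw comparison.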
Clinically, obsessive-compulsive (OC) checkers often report staring compulsions and "lack of action completion" sensations, which have been linked to alterations in self-agency. Belayachi and Van der Linden (2009) theoretically proposed that checkers' "abnormal" self-agency could be due to an over-reliance on environmental cues and to a tendency to specify actions in a procedural and inflexible way, conceiving of checkers as "low-level" agents. To date, no studies have experimentally addressed this issue.
To investigate self-agency in the OC-checker subtype, measuring gaze agency (the ability to understand that we can cause events through our eye movements) and taking into account both agency beliefs and agency feelings.
Thirteen OC-checkers and 13 healthy controls underwent two tasks. The "Discovery" task, a completely novel task, examined causal learning abilities: subjects watched bouncing balls on a computer screen with the aim of discovering the cause of concurrently presented acoustic beeps. The "Detection" task, a two-alternative forced-choice task, required subjects to report whether or not the beeps were generated by their own eye movements.
Compared with controls, checkers showed:
– lower performance scores and confidence ratings when they had to self-attribute the cause of the beeps, but no differences in eye behavior, during the Discovery task;
– lower confidence ratings, but a level of accuracy similar to that of controls, during the Detection task.
Checkers do not show altered self-agency per se, but rather what we have called a "doubtful" self-agency: we argue that the alterations in agency beliefs found during the Discovery task may be due to pathological doubt rather than to altered agency feelings.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Disasters are high-acuity, low-frequency events which require medical providers to respond in often chaotic settings. Because of this infrequency, skills can atrophy, so providers must train and drill to maintain them. Historically, drilling for disaster response has been costly, and thus infrequent. Virtual Reality Environments (VREs) have been demonstrated to be acceptable to trainees and useful for training Disaster Medicine skills. The lower cost of virtual reality training allows for more frequent simulation and drilling.
The objective was to create a novel Disaster Medicine VRE for training and drilling.
A VRE was created using SecondLife (Linden Lab; San Francisco, California USA) and adapted for use in Disaster Medicine training and drilling. It is easily accessible for the end-users (trainees) and is adaptable for multiple scenario types due to the presence of varying architecture and objects. Victim models were created that can be role-played by educators or serve as virtual dummies, and can be adapted for wide-ranging scenarios. Finally, a unique physiologic simulator was created that allows dummies to mimic disease processes, wounds, and treatment outcomes.
The VRE was created and has been used extensively in an academic setting to train medical students, as well as to train and drill disaster responders.
This manuscript presents a new VRE for the training and drilling of Disaster Medicine scenarios in an immersive, interactive experience for trainees.
Stressful experiences affect biological stress systems, such as the hypothalamic–pituitary–adrenal (HPA) axis. Life stress can potentially alter regulation of the HPA axis and has been associated with poorer physical and mental health. Little, however, is known about the relative influence of stressors that are encountered at different developmental periods on acute stress reactions in adulthood. In this study, we explored three models of the influence of stress exposure on cortisol reactivity to a modified version of the Trier Social Stress Test (TSST) by leveraging 37 years of longitudinal data in a high-risk birth cohort (N = 112). The cumulative stress model suggests that accumulated stress across the lifespan leads to dysregulated reactivity, whereas the biological embedding model implicates early childhood as a critical period. The sensitization model assumes that dysregulation should only occur when stress is high in both early childhood and concurrently. All of the models predict altered reactivity, but none anticipates its exact form. We found support for both cumulative and biological embedding effects. However, when pitted against each other, early life stress predicted more blunted cortisol responses at age 37 over and above cumulative life stress. Additional analyses revealed that stress exposure in middle childhood also predicted more blunted cortisol reactivity.
Sleep apnea is one of the most common sleep disorders and is related to multiple negative health consequences. Previous studies have shown that sleep apnea is influenced by genetic factors. However, studies have not investigated the genetic and environmental influences on symptoms of sleep apnea in young adults. Furthermore, the underpinnings of the relationship between apnea symptoms and internalizing/externalizing problems are unknown. The objectives of this study were to estimate the magnitude of: (1) genetic and environmental influences on self-reported apnea symptoms; (2) the relationship between self-reported apnea symptoms and internalizing/externalizing traits; and (3) genetic and environmental influences on the associations between self-reported apnea symptoms, internalizing behaviors, and externalizing behaviors.
In a twin/sibling study, univariate and multivariate models were fitted to estimate both individual variance and sources of covariance between symptoms of sleep apnea and internalizing/externalizing behaviors.
Our results show that genetic influences account for 40% of the variance in sleep apnea symptoms. Moreover, there are modest associations between depression, anxiety, and externalizing behaviors with apnea symptoms (r = 0.22 to 0.29). However, the origins of these associations differ. For example, whereas most of the covariation between symptoms of depression and sleep apnea can be explained by genes (95%), there was a larger role for the environment (53%) in the association between symptoms of anxiety and sleep apnea.
Genetic factors explain a significant proportion of variance in symptoms of apnea and most of the covariance with depression.
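Twin models like the one above partition phenotypic variance into genetic and environmental components. A back-of-envelope version of the genetic estimate is Falconer's formula, h² = 2(r_MZ − r_DZ), based on the different genetic sharing of monozygotic and dizygotic twins. The correlations below are invented for illustration, chosen so that h² matches the 40% genetic variance reported; the study itself fitted full univariate and multivariate models, not this shortcut.

```python
def falconer_h2(r_mz, r_dz):
    """Rough heritability estimate from monozygotic (r_mz) and dizygotic (r_dz)
    twin correlations: MZ twins share ~100% of genes, DZ twins ~50%, so twice
    the difference in correlations approximates the genetic share of variance."""
    return 2 * (r_mz - r_dz)

# Hypothetical twin correlations for apnea symptoms (illustrative only)
r_mz, r_dz = 0.45, 0.25
print(falconer_h2(r_mz, r_dz))
```

Formal model fitting is preferred in practice because it also separates shared from non-shared environmental variance and, in the multivariate case, decomposes the covariance between traits, which is how the abstract's 95% and 53% figures arise.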
The gomphotheres were a diverse and widespread group of proboscideans occupying Eurasia, North America, and South America throughout the Neogene. Their decline was temporally and spatially heterogeneous, and the gomphotheres ultimately became extinct during the late Pleistocene; however, the genus Cuvieronius is rarely represented in late Pleistocene assemblages in North America. Two alternative hypotheses have been invoked to explain this phenomenon: (1) competitive exclusion by sympatric mammoths and mastodons or (2) ecologic displacement due to an environmental transition from closed forests to open grasslands. To test whether competition for resources contributed to the demise of North American Cuvieronius, we present herein a large collection of stable isotope and dental microwear data from populations occupying their Pleistocene refugium in the Atlantic Coastal Plain. Results suggest that Cuvieronius consumed a wide range of resources with variable textural and photosynthetic properties and was not specialized on either grasses or browse. Further, we document evidence for the consumption of similar foods between contemporaneous gomphotheres, mammoths, and mastodons. The generalist feeding strategy of the gomphotheres likely facilitated their high Miocene abundance and diversity. However, this “jack of all trades and master of none” feeding strategy may have proved challenging following the arrival of mammoths and likely contributed to the extirpation of Cuvieronius in North America.
This chapter reviews the systematics of partial melting of mantle lithologies – like peridotite and eclogite – in the presence of carbon dioxide. It discusses the composition of mantle-derived magmas generated in the presence of carbon dioxide and whether magmas erupted on Earth’s surface resemble carbonated magmas from the mantle. It reviews how the production of carbon dioxide-rich magma in the mantle varies as a function of tectonic settings – beneath continents and oceans and in subduction zones – and time.