The White River Badlands (WRB) of South Dakota record eolian activity spanning the late Pleistocene through the latest Holocene (21 ka to modern), reflecting the effects of the last glacial period and Holocene climate fluctuations (Holocene Thermal Maximum, Medieval Climate Anomaly, and Little Ice Age). The WRB dune fields are important paleoclimate indicators in an area of the Great Plains with few climate proxies. The goal of this study is to use 1 m/pixel-resolution digital elevation models from drone imagery to distinguish Early to Middle Holocene parabolic dunes from Late Holocene parabolic dunes. Results indicate that relative ages of dunes are distinguished by slope and roughness (terrain ruggedness index). Morphological differences are attributed to postdepositional wind erosion, soil formation, and mass wasting. Early to Middle Holocene and Late Holocene paleowind directions, 324° ± 13.1° (N = 7) and 323° ± 3.0° (N = 19), respectively, are similar to the modern wind regime. Results suggest significant landscape resilience to wind erosion, which resulted in preservation of a mosaic of Early and Late Holocene parabolic dunes. Quantification of dune characteristics will help refine the chronology of eolian activity in the WRB, provide insight into drought-driven landscape evolution, and integrate WRB eolian activity in a regional paleoenvironmental context.
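The slope and roughness metrics used above to separate dune generations can be computed per cell from a gridded DEM. The sketch below follows the common Riley-style definition of the terrain ruggedness index (an assumption; the study's exact implementation is not specified) and derives slope from central-difference gradients on a 1 m/pixel grid:

```python
import numpy as np

def terrain_ruggedness_index(dem):
    """Riley-style TRI: for each cell, the square root of the summed
    squared elevation differences to its 8 neighbours (edge-padded)."""
    padded = np.pad(dem, 1, mode="edge")
    sq_diff = np.zeros_like(dem, dtype=float)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            neighbour = padded[1 + dr : 1 + dr + dem.shape[0],
                               1 + dc : 1 + dc + dem.shape[1]]
            sq_diff += (dem - neighbour) ** 2
    return np.sqrt(sq_diff)

def slope_deg(dem, cell_size=1.0):
    """Slope in degrees from central-difference gradients;
    cell_size is the grid spacing in the same units as elevation."""
    dzdy, dzdx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
```

Older dune surfaces, smoothed by erosion and soil formation, would then show lower TRI and gentler slopes than younger ones, which is the contrast the study exploits.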
Environmental conditions during the early life stages of birds can have significant effects on the quality of sexual signals in adulthood, especially song, and these ultimately have consequences for breeding success and fitness. This has wide-ranging implications for the rehabilitation protocols undertaken in wildlife hospitals which aim to return captive-reared animals to their natural habitat. Here we review the current literature on bird song development and learning in order to determine the potential impact that the rearing of juvenile songbirds in captivity can have on rehabilitation success. We quantify the effects of reduced learning on song structure and relate this to the possible effects on an individual's ability to defend a territory or attract a mate. We show the importance of providing a conspecific auditory model for birds to learn from in the early stages post-fledging, either via live- or tape-tutoring and provide suggestions for tutoring regimes. We also highlight the historical focus on learning in a few model species that has left an information gap in our knowledge for most species reared at wildlife hospitals.
Of the wildlife casualties admitted to rehabilitation centres in England, less than half are subsequently released back into the wild. If the factors associated with survival within rehabilitation centres can be determined, they may be used to focus efforts on individuals with high chances of successful recovery, and thus improve welfare by devoting resources to those animals that are more likely to benefit. We analysed the medical record cards of eight species admitted to four wildlife rehabilitation centres run by the Royal Society for the Prevention of Cruelty to Animals between 2000-2004 to determine those factors that affected the chance of survival in care until release, and whether trends in predictive factors occurred across taxonomic groups. We found that the most important predictive factor, across all species, was the severity of the symptoms of injury or illness. Factors commonly used as important indicators of rehabilitation success in published practice guidelines, such as mass and age, were not found to affect survival significantly. Our results highlight the importance of triage based on clinical diagnosis as soon as a wildlife casualty is admitted, and indicate that although the ethos of many rehabilitation centres is to attempt the treatment of all wildlife casualties, the attempted treatment of those with severe injuries may be adversely affecting welfare by prolonging suffering.
People who possess greater mathematical skills (i.e., numeracy) are generally more accurate in interpreting numerical data than less numerate people. However, recent evidence has suggested that more numerate people may use their numerical skills to interpret data only if their initial interpretation conflicts with their worldview. That is, if an initial, intuitive (but incorrect) interpretation of data appears to disconfirm one’s beliefs, then numerical skills are used to further process the data and reach the correct interpretation, whereas numerical skills are not used in situations where an initial incorrect interpretation of the data appears to confirm one’s beliefs (i.e., motivated numeracy). In the present study, participants were presented with several data problems, some with correct answers confirming their political views and others disconfirming their views. The difficulty of these problems was manipulated to examine how numeracy would influence the rate of correct responses on easier vs. more difficult problems. Results indicated that participants were more likely to answer problems correctly if the correct answer confirmed rather than disconfirmed their political views, and this response pattern did not depend on problem difficulty or numerical skill. Although more numerate participants were more accurate overall, this was true both for problems in which the correct answer confirmed and disconfirmed participants’ political views.
Falls in older adulthood can have serious consequences. It is therefore important to identify ways to prevent falls, particularly from the voice of older adults. Bottom-up qualitative exploration of the perspectives of older adults can provide rich insights that can help inform the development of effective fall prevention programmes. However, currently there is a dearth of such empirical data, especially among urban-dwelling older adults in high-density cities where fall rates are high. The current study aimed to examine qualitatively perceptions of neighbourhood physical environment in relation to falls, perceived risks and fear of falling, and strategies and behaviours for fall prevention in a sample of urban-dwelling older adults in the high-density city of Hong Kong. Face-to-face semi-structured in-depth interviews were conducted with 50 community-dwelling older adults. Interviews were transcribed verbatim and analysed using thematic analysis techniques. Five general themes were revealed: risks and circumstances of falls, consequences of falls, fear of falling and its consequences, neighbourhood environment, and strategies and behaviours of fall prevention. While older adults discussed the risks of falling and held a fear of falling, these beliefs were mixed. In addition to fall prevention strategies (e.g. keep balance), current findings highlighted the importance of establishing protective factors (e.g. flat and even walking paths) and reducing risk factors (e.g. neighbourhood clutter) in neighbourhood environments. For urban-dwelling older adults in high-density cities, current findings highlight the importance of focusing efforts at the built environment level in addition to strategies and behaviours of fall prevention at the individual level.
Objective: To determine the proportion of hospitals that implemented 6 leading practices in their antimicrobial stewardship programs (ASPs). Design: Cross-sectional observational survey.
Advance letters and electronic questionnaires were initiated in February 2020. Primary outcomes were the percentage of hospitals that (1) implemented facility-specific treatment guidelines (FSTG); (2) performed interactive prospective audit and feedback (PAF) either face-to-face or by telephone; (3) optimized diagnostic testing; (4) measured antibiotic utilization; (5) measured C. difficile infection (CDI); and (6) measured adherence to FSTGs.
Of 948 hospitals invited, 288 (30.4%) completed the questionnaire. Among them, 82 (28.5%) had <99 beds, 162 (56.3%) had 100–399 beds, and 44 (15.2%) had ≥400 beds. Also, 230 (79.9%) were healthcare system members. Moreover, 161 hospitals (54.8%) reported implementing FSTGs; 214 (72.4%) performed interactive PAF; 105 (34.9%) implemented procedures to optimize diagnostic testing; 235 (79.8%) measured antibiotic utilization; 258 (88.2%) measured CDI; and 110 (37.1%) measured FSTG adherence. Small hospitals were less likely to perform interactive PAF (61.0%; P = .0018). Small and nonsystem hospitals were less likely to optimize diagnostic testing: 25.2% (P = .030) and 21.0% (P = .0077), respectively. Small hospitals were less likely to measure antibiotic utilization (67.8%; P = .0010) and CDI (80.3%; P = .0038). Nonsystem hospitals were less likely to implement FSTGs (34.3%; P < .001).
Significant variation exists in the adoption of ASP leading practices. A minority of hospitals have taken action to optimize diagnostic testing and measure adherence to FSTGs. Additional efforts are needed to expand adoption of leading practices across all acute-care hospitals with the greatest need in smaller hospitals.
The coronavirus disease 2019 (COVID-19) pandemic rocked the world, spurring the collapse of national commerce, international trade, education, air travel, and tourism. The global economy has been brought to its knees by the rapid spread of infection, resulting in widespread illness and many deaths. The rise in nationalism and isolationism, ethnic strife, disingenuous governmental reporting, lockdowns, travel restrictions, and vaccination misinformation have caused further problems. This has brought into stark relief the need for improved disease surveillance and health protection measures. National and international agencies that should have provided earlier warning in fact failed to do so. A robust global health network that includes enhanced cooperation with Military Intelligence, Surveillance, and Reconnaissance (ISR) assets in conjunction with the existing international, governmental, and nongovernment medical intelligence networks and allies and partners would provide exceptional forward-looking and early-warning capability and is a proactive step toward making our future safe. This will be achieved by surveilling populations for new biothreats, fusing and disseminating data, and then targeting assistance to reduce disease spread in unprotected populations.
Adolescence is characterized by profound change, including increases in negative emotions. Approximately 84% of American adolescents own a smartphone, which can continuously and unobtrusively track variables potentially predictive of heightened negative emotions (e.g. activity levels, location, pattern of phone usage). The extent to which built-in smartphone sensors can reliably predict states of elevated negative affect in adolescents is an open question.
Adolescent participants (n = 22; ages 13–18) with low to high levels of depressive symptoms were followed for 15 weeks using a combination of ecological momentary assessments (EMAs) and continuously collected passive smartphone sensor data. EMAs probed negative emotional states (i.e. anger, sadness and anxiety) 2–3 times per day every other week throughout the study (total: 1145 EMA measurements). Smartphone accelerometer, location and device state data were collected to derive 14 discrete estimates of behavior, including activity level, percentage of time spent at home, sleep onset and duration, and phone usage.
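One of the derived behavioral estimates mentioned above, percentage of time spent at home, can be sketched directly from raw location samples. The 100 m home radius and the coordinates in the usage note are illustrative assumptions, not the study's parameters:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * r * asin(sqrt(a))

def percent_time_at_home(samples, home, radius_m=100.0):
    """Percentage of (lat, lon) samples falling within radius_m of home,
    assuming samples are taken at roughly uniform intervals."""
    if not samples:
        return 0.0
    at_home = sum(1 for lat, lon in samples
                  if haversine_m(lat, lon, home[0], home[1]) <= radius_m)
    return 100.0 * at_home / len(samples)
```

For example, two of four samples at the home coordinate and two a degree of latitude away would yield 50% time at home. Sleep and activity estimates would be derived analogously from accelerometer and device-state streams.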
A personalized ensemble machine learning model derived from smartphone sensor data outperformed other statistical approaches (e.g. linear mixed model) and predicted states of elevated anger and anxiety with acceptable discrimination ability (area under the curve (AUC) = 74% and 71%, respectively), but demonstrated more modest discrimination ability for predicting states of high sadness (AUC = 66%).
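The discrimination metric reported above, area under the ROC curve, is equivalent to the Mann-Whitney probability that a randomly chosen positive case outscores a randomly chosen negative one. A minimal sketch of that metric, plus a simple soft-voting combiner of the kind an ensemble might use (the study's actual ensemble components are not specified here):

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    random positive is scored above a random negative (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def ensemble_score(score_lists, weights=None):
    """Soft voting: weighted mean of per-model probability scores."""
    n_models = len(score_lists)
    weights = weights or [1.0 / n_models] * n_models
    return [sum(w * s[i] for w, s in zip(weights, score_lists))
            for i in range(len(score_lists[0]))]
```

On this scale, the reported AUCs of 0.74 and 0.71 mean the model ranks a randomly drawn high-anger (or high-anxiety) moment above a randomly drawn calm one about three times out of four.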
To the extent that smartphone data could provide reasonably accurate real-time predictions of states of high negative affect in teens, brief ‘just-in-time’ interventions could be immediately deployed via smartphone notifications or mental health apps to alleviate these states.
Sparse recent data are available on the epidemiology of surgical site infections (SSIs) in community hospitals. Our objective was to provide updated epidemiology data on complex SSIs in community hospitals and to characterize trends of SSI prevalence rates over time.
Retrospective cohort study.
SSI data were collected from patients undergoing 26 commonly performed surgical procedures at 32 community hospitals in the southeastern United States from 2013 to 2018. SSI prevalence rates were calculated for each year and were stratified by procedure and causative pathogen.
Over the 6-year study period, 3,561 complex (deep incisional or organ-space) SSIs occurred following 669,467 total surgeries (prevalence rate, 0.53 infections per 100 procedures). The overall complex SSI prevalence rate did not change significantly during the study period: 0.58 of 100 procedures in 2013 versus 0.53 of 100 procedures in 2018 (prevalence rate ratio [PRR], 0.84; 95% CI, 0.66–1.08; P = .16). Methicillin-sensitive Staphylococcus aureus (MSSA) complex SSIs (n = 480, 13.5%) were more common than complex SSIs caused by methicillin-resistant S. aureus (MRSA; n = 363, 10.2%).
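The crude prevalence arithmetic above is straightforward to reproduce; note that the published PRR of 0.84 comes from a regression model, so the crude ratio of the two yearly rates will not match it exactly. A minimal sketch:

```python
def prevalence_per_100(infections, procedures):
    """Infections per 100 procedures (crude prevalence rate)."""
    return 100.0 * infections / procedures

def rate_ratio(rate_b, rate_a):
    """Unadjusted prevalence rate ratio comparing period B with period A."""
    return rate_b / rate_a
```

For instance, 3,561 complex SSIs over 669,467 surgeries gives roughly 0.53 per 100 procedures, matching the overall rate reported.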
The complex SSI rate did not decrease in our cohort of community hospitals from 2013 to 2018, which is a change from prior comparisons. The reason for this stagnation is unclear. Additional research is needed to determine the proportion of remaining SSIs that are preventable and what measures would be effective to further reduce SSI rates.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations presented herein. They are based on empirical evidence and expert opinion—this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy—while taking patient preference into account. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Background: CIDP is an autoimmune polyneuropathy. Antibodies against the node of Ranvier have been described: NF155, NF140/186, and contactin-1. Methods: A retrospective review of patients with CIDP who tested positive for anti-nodal/paranodal antibodies via Western blot was conducted; 20 seronegative CIDP patients were also included. All patients met definite or probable EFNS criteria. Clinical and electrophysiological data and response to treatment were obtained. Results: Forty-five patients tested positive for the antibodies: 16 for NF155, 11 for NF140, and 5 for CNTN1; 11 were double positive for NF155 and NF140, and 3 were triple positive for NF155, NF140, and CNTN1. Age of onset was similar in seronegative (53.9 ± 3.1 yr) and seropositive (52.3 ± 2.4 yr) patients. Chronic presentation manifested in 85% of seronegative and 80% of seropositive patients. Interestingly, all triple-positive patients presented more acutely (i.e., <8 wk). Response to IVIg was seen in 7/20 seronegative (35%), 1/16 NF155, 6/11 NF140, 1/5 contactin, 2/11 double-positive, and 3/3 triple-positive patients (28%, 13/46). Conclusions: There were no major clinical or electrophysiological differences between groups. Triple-positive patients showed 100% response to IVIg. These results cast doubt on the specificity of the Western blot as a clinico-electrophysiologic discriminator. Future testing with cell-based assays will likely provide a robust measure to guide treatment decisions.
In recent years, the use of analytics and data mining – methodologies that extract useful information from large datasets – has become commonplace in science and business. When these methods are used in education, they are referred to as learning analytics (LA) and educational data mining (EDM). For example, adaptive learning platforms – those that respond uniquely to each learner – require learning analytics to model the learner’s current state of knowledge. The researcher can conduct second-by-second analyses of phenomena that occur over long periods of time or in an individual learning session. Large datasets are required for these analyses. In most cases, the data are gathered automatically – such as keystrokes, eye movement, or assessments – and are analyzed using algorithms based in learning sciences research. This chapter reviews prediction methods, structure discovery, relationship mining, and discovery with models.
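The prediction methods reviewed in such a chapter often start from something as simple as logistic regression over logged learner features. The sketch below is a minimal, from-scratch illustration; the feature (e.g., number of practice attempts) and the target (whether the next answer is correct) are hypothetical stand-ins for the automatically gathered data described above:

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit a logistic-regression predictor by per-sample gradient descent.
    xs: feature rows; ys: 0/1 labels (e.g., next answer correct)."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(b + sum(wi * xi for wi, xi in zip(w, x)))
            g = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Predicted probability that the label is 1."""
    return sigmoid(b + sum(wi * xi for wi, xi in zip(w, x)))
```

An adaptive platform would fit such a model per skill or per learner and use the predicted probability to decide what to present next; production systems typically use richer models (e.g., Bayesian knowledge tracing), but the prediction workflow is the same.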
Identifying the most effective ways to support career development of early stage investigators in clinical and translational science should yield benefits for the biomedical research community. Institutions with Clinical and Translational Science Awards (CTSA) offer KL2 programs to facilitate career development; however, the sustained impact has not been widely assessed.
A survey comprising quantitative and qualitative questions was sent to 2144 individuals who had previously received support through CTSA KL2 mechanisms. The 547 responses were analyzed with identifying information redacted.
Respondents held MD (47%), PhD (36%), and MD/PhD (13%) degrees. After KL2 support was completed, physicians’ time was divided 50% to research and 30% to patient care, whereas PhD respondents devoted 70% time to research. Funded research effort averaged 60% for the cohort. Respondents were satisfied with their career progression. More than 95% thought their current job was meaningful. Two-thirds felt confident or very confident in their ability to sustain a career in clinical and translational research. Factors cited as contributing to career success included protected time, mentoring, and collaborations.
This first large systematic survey of KL2 alumni provides valuable insight into the group’s perceptions of the program and outcome information. Former scholars are largely satisfied with their career choice and direction, national recognition of their expertise, and impact of their work. Importantly, they identified training activities that contributed to success. Our results and future analysis of the survey data should inform the framework for developing platforms to launch and sustain careers of translational scientists.
In the U.S. in 2017, approximately 11.4 million people misused prescription pain relievers and 2.1 million had an opioid use disorder (OUD). The Addictions Nursing Subspecialty was created to address this epidemic by expanding a workforce trained in OUD and substance use disorder (SUD) screening, treatment, and prevention. A curriculum was developed that placed students in integrated/telehealth health care settings in medical and mental health provider shortage areas during their last nine months of training. Courses were developed and taught by an interprofessional team of university faculty and informed by evidence-based guidelines and clinical competencies for effective OUD/SUD screening/prevention, assessment, treatment, and recovery. Courses were also offered as electives for nursing, clinical-counseling, social work, and other health science disciplines, emphasizing an interdisciplinary approach to healthcare.
Expand the OUD/SUD trained workforce in areas with high OUD/SUD mortality rates and high mental health provider shortages emphasizing team-based integrated care and telehealth settings.
Program curriculum was informed by evidence-based guidelines/clinical competencies for effective OUD/SUD screening/prevention, assessment, treatment, and recovery using integrated care. Competencies included: Core Competencies for Integrated Behavioral Health and Primary Care that have been set forth by the Center for Integrated Health Solutions, telehealth competencies outlined in the recommended competencies by the National Organization of Nurse Practitioner Faculties (NONPF), and Core Competencies for Addictions Medicine by the American Board of Addictions Medicine.
Approximately 11 students enrolled in the courses and received training in integrated/telehealth health care settings. Students responded positively in evaluations, citing timely feedback and the unique approach (i.e. interactive content, short videos, and discussions).
The Addictions Nursing subspecialty will continue to be offered, with enrollment for nurses twice a year.
Obesity remains a serious relevant public health concern throughout the world despite related countermeasures being well understood (i.e. mainly physical activity and an adjusted diet). Among different nutritional approaches, there is a growing interest in ketogenic diets (KD) to manipulate body mass (BM) and to enhance fat mass loss. KD reduce the daily amount of carbohydrate intake drastically. This results in increased fatty acid utilisation, leading to an increase in blood ketone bodies (acetoacetate, 3-β-hydroxybutyrate and acetone) and therefore metabolic ketosis. For many years, nutritional intervention studies have focused on reducing dietary fat with little or conflicting positive results over the long term. Moreover, current nutritional guidelines for athletes propose carbohydrate-based diets to augment muscular adaptations. This review discusses the physiological basis of KD and their effects on BM reduction and body composition improvements in sedentary individuals combined with different types of exercise (resistance training or endurance training) in individuals with obesity and athletes. Ultimately, we discuss the strengths and the weaknesses of these nutritional interventions together with precautionary measures that should be observed in both individuals with obesity and athletic populations. A literature search from 1921 to April 2021 using Medline, Google Scholar, PubMed, Web of Science, Scopus and Sportdiscus Databases was used to identify relevant studies. In summary, based on the current evidence, KD are an efficient method to reduce BM and body fat in both individuals with obesity and athletes. However, these positive impacts are mainly because of the appetite suppressive effects of KD, which can decrease daily energy intake. Therefore, KD do not have any superior benefits to non-KD in BM and body fat loss in individuals with obesity and athletic populations in an isoenergetic situation. 
In sedentary individuals with obesity, losses of fat-free mass (FFM) appear to be as great as, if not greater than, those following a low-fat diet. In terms of lean mass, following a KD can cause FFM loss in resistance-trained individuals. In contrast, the FFM-preserving effects of KD are more efficient in endurance-trained than in resistance-trained individuals.