Despite the impact of inappropriate prescribing on antibiotic resistance, data on surgical antibiotic prophylaxis in sub-Saharan Africa are limited. In this study, we evaluated antibiotic use and consumption in surgical prophylaxis in 4 hospitals located in 2 geographic regions of Sierra Leone.
Methods:
We used a prospective cohort design to collect data from surgical patients aged 18 years or older between February and October 2021. Data were analyzed using Stata version 16 software.
Results:
Of the 753 surgical patients, 439 (58.3%) were female, and 723 (96%) had received at least 1 dose of antibiotics. Only 410 (54.4%) patients had indications for surgical antibiotic prophylaxis consistent with local guidelines. Factors associated with preoperative antibiotic prophylaxis were the type of surgery, wound class, and consistency of surgical antibiotic prophylaxis with local guidelines. Postoperatively, type of surgery, wound class, and consistency of antibiotic use with local guidelines were important factors associated with antibiotic use. Of the 2,482 doses administered, 1,410 (56.8%) were given postoperatively; preoperative and intraoperative use accounted for 645 (26%) and 427 (17.2%) doses, respectively. The most commonly used antibiotic was ceftriaxone, with 949 doses (38.2%) and a consumption of 41.6 defined daily doses (DDD) per 100 bed days. Overall, antibiotic consumption was 117.9 DDD per 100 bed days, of which antibiotics in the WHO Access group accounted for 72.7 DDD per 100 bed days (61.7%).
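The consumption figures above follow the standard WHO ATC/DDD calculation: total grams of each antibiotic administered, divided by its WHO-assigned defined daily dose, normalized per 100 occupied bed days. Below is a minimal Python sketch of that arithmetic; the gram total and bed-day count are hypothetical values chosen only so the output matches the reported ceftriaxone figure of 41.6, not data from the study.

```python
# WHO ATC/DDD consumption metric as used above.
# Inputs are hypothetical, chosen to reproduce the reported 41.6 figure.

CEFTRIAXONE_DDD_G = 2.0  # WHO-assigned DDD for parenteral ceftriaxone, grams

def ddd_per_100_bed_days(total_grams: float, ddd_grams: float,
                         bed_days: float) -> float:
    """(grams administered / DDD in grams) / occupied bed days * 100."""
    return total_grams / ddd_grams / bed_days * 100

# Hypothetical example: 832 g of ceftriaxone over 1,000 occupied bed days.
print(round(ddd_per_100_bed_days(832, CEFTRIAXONE_DDD_G, 1000), 1))  # 41.6
```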
Conclusions:
We report a high rate of antibiotic consumption for surgical prophylaxis, most of which was not based on local guidelines. To address this growing threat, urgent action is needed to reduce irrational antibiotic prescribing for surgical prophylaxis.
Academic discovery in biomedicine is a growing enterprise with tens of billions of dollars in research funding available to universities and hospitals. Protecting and optimizing the resultant intellectual property is essential if the discoveries are to have an impact on society. To achieve that, institutions must create a multidisciplinary, collaborative system of review and support and utilize connections to industry partners. In this study, we outline the efforts of Case Western Reserve University, coordinated through its Clinical and Translational Science Collaborative (CTSC), to promote an entrepreneurial culture and achieve goals of product development and startup formation for biomedical and population health discoveries arising from the academic ecosystem in Cleveland. The CTSC Office of Translation and Innovation, together with the university’s Technology Transfer Office (TTO), helps identify and derisk promising IP while building interdisciplinary project teams to optimize the assets through key preclinical derisking steps. The benefits of coordinating funding across multiple programs, assuring dedicated project management to oversee optimization of the IP, and ensuring training to help improve proposals and encourage an entrepreneurial culture are discussed in the context of a case study of therapeutic assets, the Council to Advance Human Health. This case study highlights best practices in academic innovation.
This chapter combines all available evidence to reassess the archaeological signature of Corinth’s destruction by the Romans in 146 B.C. and its refoundation as a Roman colony.
Stretcher transport isolators provide mobile, high-level biocontainment outside the hospital for patients with highly infectious diseases, such as Ebola virus disease. Air quality within this confined space may pose human health risks.
Methods:
Ambient air temperature, relative humidity, and CO2 concentration were monitored within an isolator during 2 operational exercises with healthy volunteers, including a ground transport exercise of approximately 257 miles. In addition, failure of the blower unit providing ambient air to the isolator was simulated. A simple compartmental model was developed to predict CO2 and H2O concentrations within the isolator.
Results:
In both exercises, CO2 and H2O concentrations were elevated inside the isolator, reaching steady-state values of 4434 ± 1013 ppm CO2 and 22 ± 2 mbar H2O in the first exercise and 3038 ± 269 ppm CO2 and 20 ± 1 mbar H2O in the second exercise. When blower failure was simulated, CO2 concentration exceeded 10,000 ppm within 8 minutes. A simple compartmental model predicted CO2 and H2O concentrations by accounting for human emissions and blower air exchange.
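The simplest form such a compartmental model can take is a single well-mixed volume with a constant occupant source and blower air exchange. The sketch below illustrates that structure for CO2 only; the isolator volume, blower flow, and emission rate are assumed round numbers for illustration, not parameters from the study.

```python
import math

# One well-mixed compartment: V * dC/dt = Q * (C_amb - C) + E
# Closed form: C(t) = C_ss + (C0 - C_ss) * exp(-Q * t / V),
# with steady state C_ss = C_amb + E / Q.
# All parameter values below are assumptions for illustration.

V = 0.6        # isolator volume, m^3 (assumed)
Q = 0.10       # blower flow, m^3/min (assumed)
E = 3.5e-4     # occupant CO2 emission, m^3/min (~resting adult, assumed)
C_AMB = 420.0  # ambient CO2, ppm

def co2_ppm(t_min: float) -> float:
    """CO2 concentration (ppm) at time t with the blower running."""
    c_ss = C_AMB + E / Q * 1e6
    return c_ss + (C_AMB - c_ss) * math.exp(-Q * t_min / V)

def co2_ppm_blower_off(t_min: float, c0: float = 420.0) -> float:
    """Blower failure (Q = 0): concentration rises linearly at rate E/V."""
    return c0 + E / V * 1e6 * t_min

print(round(co2_ppm(60)))            # ~3920 ppm steady state
print(round(co2_ppm_blower_off(8)))  # ~5087 ppm after 8 min without flow
```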
Conclusions:
Attention to air quality within stretcher transport isolators (including adequate ventilation to prevent accumulation of CO2 and other bioeffluents) is needed to optimize patient safety.
Little is known about the determinants of community integration (i.e. recovery) for individuals with a history of homelessness, yet such information is essential to develop targeted interventions.
Methods
We recruited homeless Veterans with a history of psychotic disorders and evaluated four domains of correlates of community integration: perception, non-social cognition, social cognition, and motivation. Assessments occurred at baseline, after participants were engaged in supported housing services but before they received housing, and again 12 months later. Ninety-five homeless Veterans with a history of psychosis were assessed at baseline, and 53 returned after 12 months. We examined both cross-sectional and longitudinal relationships with 12-month community integration.
Results
The strongest longitudinal association was between a baseline motivational measure and social integration at 12 months. We also observed cross-sectional associations at baseline between motivational measures and community integration, including the social, work, and independent living domains. Cross-lagged panel analyses did not suggest causal associations for the motivational measures. Correlations with perception and non-social cognition were weak. One social cognition measure showed a longitudinal correlation with independent living at 12 months that remained significant in the cross-lagged analysis, consistent with a causal relationship and a potential treatment target.
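A cross-lagged panel design of this kind compares the two lagged cross-paths (baseline X to 12-month Y versus baseline Y to 12-month X) against the variables' stabilities and their synchronous correlation. Below is a minimal correlational sketch on simulated data; it illustrates the classic cross-lagged panel correlation logic rather than the exact model the authors fit, and all variable names and effect sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 95  # baseline sample size reported in this study

# Simulated stand-ins (illustrative only): motivation (x) and social
# integration (y) at baseline (1) and 12 months (2), with a built-in
# x1 -> y2 lagged effect and no y1 -> x2 effect.
x1 = rng.normal(size=n)
y1 = 0.3 * x1 + rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)
y2 = 0.5 * y1 + 0.3 * x1 + rng.normal(scale=0.8, size=n)

def r(a, b):
    """Pearson correlation between two samples."""
    return float(np.corrcoef(a, b)[0, 1])

# Cross-lagged comparison: does r(x1, y2) exceed r(y1, x2) once the
# stabilities (x1-x2, y1-y2) and synchronous r(x1, y1) are considered?
print(f"stabilities  r(x1,x2)={r(x1, x2):.2f}  r(y1,y2)={r(y1, y2):.2f}")
print(f"cross-lags   r(x1,y2)={r(x1, y2):.2f}  r(y1,x2)={r(y1, x2):.2f}")
```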
Conclusions
The relatively selective associations for motivational measures differ from what is typically seen in psychosis, in which all domains are associated with community integration. These findings are presented alongside a partner paper (Study 2), which examines an independent sample without a history of psychotic disorders, to evaluate the consistency of results on community integration across projects.
In an initial study (Study 1), we found that motivation predicted community integration (i.e. functional recovery) 12 months after receiving housing in formerly homeless Veterans with a psychotic disorder. The current study examined whether the same pattern would be found in a broader, more clinically diverse, homeless Veteran sample without psychosis.
Methods
We examined four categories of variables as potential predictors of community integration in non-psychotic Veterans: perception, non-social cognition, social cognition, and motivation, assessed at baseline (after participants were engaged in a permanent supported housing program but before receiving housing) and at a 12-month follow-up. A total of 82 Veterans had a baseline assessment and 41 returned for testing after 12 months.
Results
The strongest longitudinal association was between an interview-based measure of motivation (the motivation and pleasure subscale from the Clinical Assessment Interview for Negative Symptoms) at baseline and measures of social integration at 12 months. In addition, cross-lagged panel analyses were consistent with a causal influence of general psychiatric symptoms at baseline driving social integration at 12 months, and reduced expressiveness at baseline driving independent living at 12 months, but there were no significant causal associations with measures of motivation.
Conclusions
The findings from this study complement and reinforce those in Veterans with psychosis. Across these two studies, our findings suggest that motivational factors are associated with community integration both at baseline and at 12 months, and are particularly important for understanding and improving community integration in recently housed Veterans across psychiatric diagnoses.
The National Institutes of Health requires data and safety monitoring boards (DSMBs) for all phase III clinical trials. The National Heart, Lung, and Blood Institute requires DSMBs for all clinical trials involving more than one site and those involving cooperative agreements and contracts. These policies have resulted in the establishment of DSMBs for many implementation trials, with little consideration of whether DSMBs are appropriate for such trials and/or what key adaptations DSMBs need to monitor data quality and participant safety. In this perspective, we review the unique features of implementation trials and reflect on key questions regarding the justification for DSMBs and their potential role and monitoring targets within implementation trials.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9,200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
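As an illustration of programmatic access, the sketch below queries the public Neotoma web API over HTTP. The v2.0 endpoint path, query parameters, and response shape shown are assumptions based on the public API documentation and should be verified at https://api.neotomadb.org before use; R users would typically reach the same data through the neotoma package instead.

```python
import requests

# Query the public Neotoma web API for sites by name.
# Assumption: the v2.0 endpoint and JSON shape shown here match the
# documented API at https://api.neotomadb.org; verify before relying on it.
BASE = "https://api.neotomadb.org/v2.0"

resp = requests.get(f"{BASE}/data/sites",
                    params={"sitename": "%Marion%", "limit": 5},
                    timeout=30)
resp.raise_for_status()
for site in resp.json().get("data", []):
    print(site.get("siteid"), site.get("sitename"))
```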
Giant foxtail, woolly cupgrass, and wild-proso millet infest millions of hectares of land devoted to corn production in the midwestern U.S. Control of these species and effects on corn grain yield were evaluated at various timings using POST applications of nicosulfuron vs. applications of various PRE herbicides at 17 locations across the midwestern U.S. in 1992 and 1993. Nicosulfuron applied to 5 to 10 cm giant foxtail and woolly cupgrass provided greater control than that observed with selected PRE herbicides. Giant foxtail control with nicosulfuron averaged 88%, and control of woolly cupgrass averaged 77% across all sites. Nicosulfuron, applied to 5 to 10 cm wild-proso millet, provided a level of control similar to that of selected PRE herbicides. Corn grain yield was greater when giant foxtail was controlled POST with nicosulfuron vs. PRE control with selected soil-applied herbicides. Corn grain yields were similar when nicosulfuron was applied POST to 5 to 10 cm woolly cupgrass or wild-proso millet vs. PRE control of these grass weeds. Across a broad range of geographical locations, nicosulfuron, applied POST to 5 to 10 cm tall grass, provided greater or similar levels of weed control vs. the selected PRE herbicides, with no deleterious effect on grain yield.
A telephone survey was conducted with growers in Iowa, Illinois, Indiana, Nebraska, Mississippi, and North Carolina to discern the utilization of the glyphosate-resistant (GR) trait in crop rotations, weed pressure, tillage practices, herbicide use, and perception of GR weeds. This paper focuses on survey results regarding herbicide decisions made during the 2005 cropping season. Less than 20% of the respondents made fall herbicide applications. The most frequently used herbicides for fall applications were 2,4-D and glyphosate, and these herbicides were also the most frequently used for preplant burndown weed control in the spring. Atrazine and acetochlor were frequently used in rotations containing GR corn. As expected, crop rotations using a GR crop had a high percentage of respondents that made one to three POST applications of glyphosate per year. GR corn, GR cotton, and non-GR crops had the highest percentage of growers applying non-glyphosate herbicides during the 2005 growing season. A crop rotation containing GR soybean had the greatest negative impact on non-glyphosate use. Overall, glyphosate use has continued to increase, with concomitant decreases in utilization of other herbicides.
Corn and soybean growers in Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina, as well as cotton growers in Mississippi and North Carolina, were surveyed about their views on changes in problematic weeds and weed pressure in cropping systems based on a glyphosate-resistant (GR) crop. No growers using a GR cropping system for more than 5 yr reported heavy weed pressure. Over all cropping systems investigated (continuous GR soybean, continuous GR cotton, GR corn/GR soybean, GR soybean/non-GR crop, and GR corn/non-GR crop), 0 to 7% of survey respondents reported greater weed pressure after implementing rotations using GR crops, whereas 31 to 57% felt weed pressure was similar and 36 to 70% indicated that weed pressure was less. Pigweed, morningglory, johnsongrass, ragweed, foxtail, and velvetleaf were mentioned as their most problematic weeds, depending on the state and cropping system. Systems using GR crops improved weed management compared with the technologies used before the adoption of GR crops. However, the long-term success of managing problematic weeds in GR cropping systems will require the development of multifaceted integrated weed management programs that include glyphosate as well as other weed management tactics.
A phone survey was administered to 1,195 growers in six states (Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina). The survey measured producers' crop history, perception of glyphosate-resistant (GR) weeds, past and present weed pressure, tillage practices, and herbicide use as affected by the adoption of GR crops. This article describes the changes in tillage practice reported in the survey. The adoption of a GR cropping system resulted in a large increase in the percentage of growers using no-till and reduced-till systems. Tillage intensity declined more in continuous GR cotton and GR soybean (45 and 23%, respectively) than in rotations that included GR corn or non-GR crops. Tillage intensity declined more in the states of Mississippi and North Carolina than in the other states, with 33% of the growers in these states shifting to conservation tillage practices after the adoption of a GR crop. This was primarily due to the lower amount of conservation tillage adoption in these states before GR crop availability. Adoption rates of no-till and reduced-till systems increased as farm size decreased. Overall, producers in a crop rotation that included a GR crop shifted from a relatively more tillage-intense system to reduced-till or no-till systems after implementing a GR crop into their production system.
Over 175 growers in each of six states (Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina) were surveyed by telephone to assess their perceptions of the benefits of utilizing the glyphosate-resistant (GR) crop trait in corn, cotton, and soybean. The survey was also used to determine the weed management challenges growers were facing after using this trait for a minimum of 4 yr. This survey allowed the development of baseline information on how weed management and crop production practices have changed since the introduction of the trait. It provided useful information on common weed management issues that should be addressed through applied research and extension efforts. The survey also allowed an assessment of the perceived levels of concern among growers about glyphosate resistance in weeds and whether they believed they had experienced glyphosate resistance on their farms. Across the six states surveyed, producers reported 38, 97, and 96% of their corn, cotton, and soybean hectarage planted in a GR cultivar. The most widely adopted GR cropping system was a GR soybean/non-GR crop rotation system; second most common was a GR soybean/GR corn crop rotation system. The non-GR crop component varied widely, with the most common crops being non-GR corn or rice. A large range in farm size for the respondents was observed, with North Carolina having the smallest farms in all three crops. A large majority of corn and soybean growers reported using some type of crop rotation system, whereas very few cotton growers rotated out of cotton. Overall, rotations were much more common in Midwestern states than in Southern states. This is important information as weed scientists assist growers in developing and using best management practices to minimize the development of glyphosate resistance.
The final effort of the CLIMAP project was a study of the last interglaciation, a time of minimum ice volume some 122,000 yr ago coincident with the Substage 5e oxygen isotopic minimum. Based on detailed oxygen isotope analyses and biotic census counts in 52 cores across the world ocean, last interglacial sea-surface temperatures (SST) were compared with those today. There are small SST departures in the mid-latitude North Atlantic (warmer) and the Gulf of Mexico (cooler). The eastern boundary currents of the South Atlantic and Pacific oceans are marked by large SST anomalies in individual cores, but their interpretations are precluded by no-analog problems and by discordancies among estimates from different biotic groups. In general, the last interglacial ocean was not significantly different from the modern ocean. The relative sequencing of ice decay versus oceanic warming on the Stage 6/5 oxygen isotopic transition and of ice growth versus oceanic cooling on the Stage 5e/5d transition was also studied. In most of the Southern Hemisphere, the oceanic response marked by the biotic census counts preceded (led) the global ice-volume response marked by the oxygen-isotope signal by several thousand years. The reverse pattern is evident in the North Atlantic Ocean and the Gulf of Mexico, where the oceanic response lagged that of global ice volume by several thousand years. As a result, the very warm temperatures associated with the last interglaciation were regionally diachronous by several thousand years. These regional lead-lag relationships agree with those observed on other transitions and in long-term phase relationships; they cannot be explained simply as artifacts of bioturbational translations of the original signals.
Herbicides are the foundation of weed control in commercial crop-production systems. However, herbicide-resistant (HR) weed populations are evolving rapidly as a natural response to selection pressure imposed by modern agricultural management activities. Mitigating the evolution of herbicide resistance depends on reducing selection through diversification of weed control techniques, minimizing the spread of resistance genes and genotypes via pollen or propagule dispersal, and eliminating additions of weed seed to the soil seedbank. Effective deployment of such a multifaceted approach will require shifting from the current concept of basing weed management on single-year economic thresholds.
Macartney rose is an aggressive thorny shrub that displaces forage species and hinders cattle grazing in rangelands and pastures of the southern United States. Historically, Macartney rose has proven to be extremely difficult to control even with high rates of soil residual herbicides such as picloram. Recent advances in herbicide chemistry warrant testing on this troublesome species. We compared mowing and late summer broadcast applications of 13 herbicide treatments that included combinations of aminopyralid, fluroxypyr, metsulfuron, picloram, triclopyr, and 2,4-D. Treatments were applied to the same rose clumps for 2 consecutive yr. An additional mowing was done to one half of the rose clumps in each treatment 6 mo after the second herbicide treatment. At 11 mo after initial treatment (MAIT), mowing and all herbicide treatments performed very poorly and provided 35% control or less. At 12 mo after retreatment (24 MAIT), picloram + 2,4-D and aminopyralid + metsulfuron, both followed by mowing, were the most effective treatments, providing 72 to 91% control. All other treatments provided less than 70% control. However, complete clump mortality was very low across all treatments, ranging from 3 to 32%. These results indicate that Macartney rose suppression is possible with certain new herbicides, but complete clump kill is still lacking.
A survey of farmers from six U.S. states (Indiana, Illinois, Iowa, Nebraska, Mississippi, and North Carolina) was conducted to assess the farmers' views on glyphosate-resistant (GR) weeds and tactics used to prevent or manage GR weed populations in genetically engineered (GE) GR crops. Only 30% of farmers thought GR weeds were a serious issue. Few farmers thought field tillage and/or using a non-GR crop in rotation with GR crops would be an effective strategy. Most farmers did not recognize the role that the recurrent use of an herbicide plays in evolution of resistance. A substantial number of farmers underestimated the potential for GR weed populations to evolve in an agroecosystem dominated by glyphosate as the weed control tactic. These results indicate there are major challenges that the agriculture and weed science communities must face to implement long-term sustainable GE GR-based cropping systems within the agroecosystem.
Because individuals develop dementia as a manifestation of an underlying neurodegenerative or neurovascular disorder, there is a need for reliable approaches to identifying those disorders. We are undertaking an observational study (Ontario Neurodegenerative Disease Research Initiative [ONDRI]) that includes genomics, neuroimaging, and assessments of cognition as well as language, speech, gait, retinal imaging, and eye tracking. Disorders studied include Alzheimer’s disease, amyotrophic lateral sclerosis, frontotemporal dementia, Parkinson’s disease, and vascular cognitive impairment. Data from ONDRI will be collected into the Brain-CODE database to facilitate correlative analysis. ONDRI will provide a repertoire of endophenotyped individuals that will be a unique, publicly available resource.