Objective:
To compare 2 methods of communicating polymerase chain reaction (PCR) blood-culture results: an active approach utilizing on-call personnel versus a passive approach utilizing notifications in the electronic health record (EHR).
Design:
Retrospective observational study.
Setting:
A tertiary-care academic medical center.
Patients:
Adult patients hospitalized with ≥1 positive blood culture containing a gram-positive organism identified by PCR between October 2014 and January 2018.
Methods:
The standard protocol for reporting PCR results at baseline included a laboratory technician calling the patient’s nurse, who would report the critical result to the medical provider. The active intervention group consisted of an on-call pager system utilizing trained pharmacy residents, whereas the passive intervention group combined standard protocol with real-time in-basket notifications to pharmacists in the EHR.
Results:
Of 209 patients, 105, 61, and 43 were in the control, active, and passive groups, respectively. Median time to optimal therapy was shorter in the active group than in the passive and control groups (23.4 hours vs 42.2 hours and 45.9 hours, respectively; P = .028). De-escalation occurred 12 hours sooner in the active group. Among patients with contaminant cultures, empiric antibiotics were discontinued faster in the active group (0 hours) than in the control group (17.7 hours) and the passive group (7.2 hours) (P = .007). Time to active therapy and days of therapy were similar across groups.
Conclusions:
A passive, electronic method of reporting PCR results to pharmacists was not as effective in optimizing stewardship metrics as an active, real-time method utilizing pharmacy residents. Further studies are needed to determine the optimal method of communicating time-sensitive information.
The 2017 solar eclipse was associated with mass gatherings in many of the 14 states along the path of totality. The Kentucky Department for Public Health implemented an enhanced syndromic surveillance system to detect increases in emergency department (ED) visits and other health care needs near Hopkinsville, Kentucky, where the point of greatest eclipse occurred.
Methods:
EDs flagged visits of patients who participated in eclipse events from August 17–22. Data from 14 area emergency medical services and 26 first-aid stations were also monitored to detect health-related events occurring during the eclipse period.
Results:
Forty-four potential eclipse event-related visits were identified, primarily injuries, gastrointestinal illness, and heat-related illness. First-aid stations and emergency medical services commonly attended to patients with pain and heat-related illness.
Conclusions:
Kentucky’s experience during the eclipse demonstrated the value of patient visit flagging to describe the disease burden during a mass gathering and to investigate epidemiological links between cases. A close collaboration between public health authorities within and across jurisdictions, health information exchanges, hospitals, and other first-response care providers will optimize health surveillance activities before, during, and after mass gatherings.
The history of social work in the United Kingdom is a long and complex one, and there are no signs of it becoming less complex. If the theory and practice of UK social work are of interest to an international audience, it is not just because of the hegemony of the English language, but also because the UK has often been at the forefront of changes – for good and ill, perhaps. If there is something to learn from the UK experience, it might be as much from the wrong turns and difficulties of the occupation in that divided realm as from the advances in thinking and practice. This chapter focuses on the contemporary position of social work in the UK, and on the challenges to what is seen as a managerial-technicist version of social work (Harlow, 2003). If the UK is seen as having been at the forefront of the New Public Management and of managerialist reforms in the 1980s, those in other countries and contexts might seek to avoid some of the problems of an ‘early adopter’. It will be apparent here, however, that the future course of the social work occupation in the UK is by no means fully charted. At best we can point to certain possibilities and to social imaginaries that may prove to be the basis for revised forms of practice, however liminal these are. In this discussion, the focus will be the degree of professional autonomy of social workers as an occupational group as a feature of these possibilities. First, we focus on the situation from the 1990s to the present day, in which this managerial-technicist version of social work takes root and flourishes. Second, we focus on three alternatives that have emerged in recent years. In doing so, we do not intend to provide an indication of the likely direction of travel for UK social work, nor do we assume that these alternatives are the only ones available; we hope merely to draw attention to possible alternatives that might be worth considering and supporting.
The ‘profession’ of social work in the UK can be regarded as occupying a favourable position, in that social work is a protected title and only those who have undergone graduate-level education in social work, in approved courses, can describe themselves as ‘social workers’.
Rural landscapes are increasingly diverse and heterogeneous, involving a mix of small and large parcels, amenity and agricultural properties, and resident and absentee owners. Managing invasive plants in landscapes with changing ownership requires understanding the views and practices of different landowners. We surveyed landowners in two rural valleys with 26% absentee ownership and a large number of small parcels in Missoula County, Montana. Landowners indicated a high level of awareness and concern about weeds; more than 80% agreed that weeds are a problem in their valley. Seventy-eight percent of landowners managed weeds, but only 63% were effective at weed management. Absentee owners were far less likely to manage weeds on their properties and less likely to utilize herbicides, compared with resident landowners. Landowners reported that seeds coming from adjacent properties were the most significant barrier to effective weed control. Many landowners reported managing weeds to be a good neighbor and believed that cooperation between neighbors is critical to weed management.
To identify risk factors associated with methicillin-resistant Staphylococcus aureus (MRSA) acquisition in long-term care facility (LTCF) residents.
Design.
Multicenter, prospective cohort followed over 6 months.
Setting.
Three Veterans Affairs (VA) LTCFs.
Participants.
All current and new residents except those with short stay (<2 weeks).
Methods.
MRSA carriage was assessed by serial nares cultures and classified into 3 groups: persistent (all cultures positive), intermittent (at least 1 but not all cultures positive), and noncarrier (no cultures positive). MRSA acquisition was defined by an initial negative culture followed by more than 2 positive cultures with no subsequent negative cultures. Epidemiologic data were collected to identify risk factors, and MRSA isolates were typed by pulsed-field gel electrophoresis (PFGE).
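The carriage classification and acquisition criterion above are simple rules over a resident's sequence of culture results. A minimal sketch, assuming each result is recorded as a boolean (function names are illustrative, not from the study protocol):

```python
def classify_carriage(cultures):
    """Classify serial nares culture results (True = MRSA-positive).

    persistent:   all cultures positive
    intermittent: at least 1 but not all cultures positive
    noncarrier:   no cultures positive
    """
    if not cultures:
        raise ValueError("no culture results")
    if all(cultures):
        return "persistent"
    if any(cultures):
        return "intermittent"
    return "noncarrier"

def is_acquisition(cultures):
    """Acquisition per the definition above: an initial negative culture
    followed by more than 2 positive cultures with no subsequent negatives."""
    if not cultures or cultures[0]:
        return False  # must start negative
    positives = [i for i, c in enumerate(cultures) if c]
    if not positives:
        return False  # never turned positive
    tail = cultures[positives[0]:]
    # all results from the first positive onward must stay positive
    return len(tail) > 2 and all(tail)
```

Note that, under these rules, every resident meeting the acquisition criterion is also classified as an intermittent carrier, since the initial culture is negative.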
Results.
Among 412 residents at 3 LTCFs, overall MRSA prevalence was 58%, with similar distributions of carriage at all 3 facilities: 20% persistent, 39% intermittent, 41% noncarriers. Of 254 residents with an initial negative swab, 25 (10%) acquired MRSA over the 6 months; rates were similar at all 3 LTCFs, with no clusters evident. Multivariable analysis demonstrated that receipt of systemic antimicrobials during the study was the only significant risk factor for MRSA acquisition (odds ratio, 7.8 [95% confidence interval, 2.1–28.6]; P = .002). MRSA strains from acquisitions were related by PFGE to those from a roommate in 9/25 (36%) cases; 6 of these 9 roommate sources were persistent carriers.
Conclusions.
MRSA colonization prevalence was high at 3 separate VA LTCFs. MRSA acquisition was strongly associated with antimicrobial exposure. Roommate sources were often persistent carriers, but transmission from roommates accounted for only approximately one-third of MRSA acquisitions.
The Golden-cheeked Warbler Dendroica chrysoparia is a federally endangered Neotropical migrant that inhabits montane pine-oak forests in Mexico and northern Central America during the non-breeding season. Although it is known that Golden-cheeked Warblers are closely associated with ‘encino’ oaks (evergreen or holm oaks) such as Quercus sapotifolia, Q. elliptica and Q. elongata, which have shiny, narrow, elliptical, or oblong leaves, quantitative habitat targets are useful for effectively incorporating this information into conservation planning and forest management practices. We analysed data on wintering Golden-cheeked Warblers collected during the non-breeding season in Honduras from 1996 to 1998 to identify quantitative targets for habitat conditions for this species. Data on warbler abundance were collected using line transect surveys located in montane pine-oak forests in a stratified-random fashion. Habitat data were collected at five 0.04 ha plots on these same transects, and the averaged values were used as predictors of Golden-cheeked Warbler abundance. We found that Golden-cheeked Warblers were strongly associated with the basal area of encino oaks and the density of ‘roble’ oaks, such as Q. segoviensis, Q. purulhana and Q. rugosa, which have large, lobed leaves. Density of Golden-cheeked Warblers peaked at ≈5.6 m² ha⁻¹ basal area of encino oaks and ≈7 roble oaks ha⁻¹. These values can be used to identify quantitative habitat targets that can be directly incorporated into forest management practices to ensure that these activities maintain the habitat conditions necessary for use by Golden-cheeked Warblers.
Pseudomonas aeruginosa is a nosocomial pathogen capable of exhibiting a variety of resistance mechanisms against multiple classes of antibiotics. Fluoroquinolones, commonly used to treat a variety of infections in both ambulatory and hospitalized patients, have been increasingly linked to the development of resistance, both to fluoroquinolones and to other classes of antibiotics including β-lactams, cephalosporins, and carbapenems. In turn, as many as 95% of multidrug-resistant pseudomonal isolates may be resistant to fluoroquinolones. Although research has examined the effect of fluoroquinolone use on P. aeruginosa resistance, to our knowledge, no work has been published describing possible differences among individual fluoroquinolones related to resistance to other antibiotic classes. The purpose of this analysis was to assess the possible effects of varying usage of levofloxacin, gatifloxacin, and moxifloxacin on P. aeruginosa susceptibility to piperacillin-tazobactam, cefepime, and tobramycin. Data from January 2000 through December 2008 were obtained from clinical microbiology and pharmacy databases of the Medical University of South Carolina Medical Center, which is a 689-bed academic medical center and level 1 trauma center with adult and pediatric beds. This study was approved by the institutional review board.
Public perception of engineering recognizes its importance to national and international competitiveness, the economy, quality of life, security, and other fundamental areas of impact; but uncertainty about engineering among the general public remains. Federal funding trends for education underscore many of the concerns regarding teaching and learning in science, technology, engineering, and mathematics subjects in primary through grade 12 (P-12) education. Conflicting perspectives on the essential attributes that comprise the engineering design process result in a lack of coherent criteria against which teachers and administrators can measure the validity of a resource, assess its strengths and weaknesses, or grasp incongruities among competing process models. The literature suggests two basic approaches for representing engineering design: a phase-based, life cycle-oriented approach; and an activity-based, cognitive approach. Although these approaches serve various teaching and functional goals in undergraduate and graduate engineering education, as well as in practice, they tend to exacerbate the gaps in P-12 engineering efforts, where appropriate learning objectives that connect meaningfully to engineering are poorly articulated or understood. In this article, we examine some fundamental problems that must be resolved if preengineering is to enter the P-12 curriculum with meaningful standards and is to be connected, through learning outcomes, a shared understanding of engineering design, and other linkages, so as to vertically link P-12 engineering with higher education and the practice of engineering. We also examine historical aspects, various pedagogies, and current issues pertaining to undergraduate and graduate engineering programs.
As a case study, we hope to shed light on various kinds of interventions and outreach efforts, or at least to provide some insight into the major factors that shape and define the environments and cultures of P-12 and higher education (including epistemic perspectives, institutional objectives, and political constraints), which are very different and can compromise collaborative efforts between the two kinds of institution.
Studies were conducted in 2003 and 2004 over seven environments evaluating rice root growth inhibition (RGI) and foliar injury from penoxsulam at 30 and 60 g ai/ha and bispyribac-sodium at 30 g ai/ha applied to four- to five-leaf rice at three flood timings, 1, 7, and 14 d after herbicide treatment (DAT), for five rice cultivars, ‘Bengal’, ‘Cypress’, ‘Wells’, ‘Cocodrie’, and ‘XP712’. Flooding at 1 and 7 DAT resulted in greater RGI compared with flooding at 14 DAT when evaluated 1 wk after flood (WAF). By 2 WAF, RGI was greater with flooding at 1 DAT compared with flooding at 7 DAT for cultivars Bengal, Cypress, and Wells. For the 1 DAT flood timing, bispyribac-sodium reduced root growth of Bengal and Cypress compared with penoxsulam at 30 g/ha at 1 wk after treatment (WAT). At 2 WAT, RGI for Cocodrie was higher following penoxsulam at 60 g/ha when compared with bispyribac-sodium. By 3 WAT, RGI was higher following penoxsulam at 60 g/ha when compared with penoxsulam at 30 g/ha for Cocodrie and greater than bispyribac-sodium and penoxsulam at 30 g/ha for Cypress. Foliar injury following penoxsulam at both rates was less than injury following bispyribac-sodium for all cultivars except XP712 at 1 WAT. XP712 exhibited <5% RGI and <6% foliar injury at each evaluation. Rice grain yield was not affected by herbicide treatment for any cultivar compared with the standard treatment of propanil plus quinclorac.
Clomazone is an effective herbicide widely used for PRE grass control in rice. However, use of clomazone on sandy textured soils of the western Texas rice belt can cause serious rice injury. Two field experiments at three locations were conducted in 2002 and 2003 to determine the optimum rate range that maximizes barnyardgrass and broadleaf signalgrass control and minimizes rice injury across a wide variety of soil textures and planting dates. At Beaumont (silty clay loam), Eagle Lake (fine sandy loam), and Ganado (fine sandy loam), TX, PRE application of 0.34 kg ai/ha clomazone applied to rice planted in March, April, or May optimized barnyardgrass and broadleaf signalgrass control and rice yield while minimizing rice injury. Data suggest that, although injury might occur, clomazone is safe to use in rice on sandy textured soils.
Certain diseases have been associated with the administration of heavy elements as contrast agents to patients undergoing medical imaging procedures. Recently, the presence of gadolinium (Gd) administered as a paramagnetic contrast agent for MRI contrast studies was associated with the incidence of Nephrogenic Fibrosing Dermopathy (NFD), also called Nephrogenic Systemic Fibrosis (NSF). To determine specific causation, Gd and other metallic nanoparticles in various tissues must be detected directly and characterized in situ. This characterization supports the development of specific mechanisms for the chemical modification of the metal elements as the result of a biologic response. Fixed biopsies embedded in paraffin were sectioned at 3–5 μm thick, deparaffinized by hand (xylene and 100% ethyl alcohol), placed on carbon planchettes, and allowed to air dry. Deparaffinized tissues were examined using a field emission SEM (FE-SEM) to directly detect and image the presence of Gd as well as other metals. Backscatter electron (BSE) imaging (20 kV) was used to discern metal particles within tissues. Energy dispersive spectroscopy (EDS) (15 kV) was used to verify the specific elements present. This allowed for the spatial characterization of the nanoparticles within the tissues, but due to the physical limitations of SEM/EDS, quantification of the amount of metal was not possible. Mass concentration of the metal elements was determined using inductively coupled plasma mass spectrometry (ICP-MS) on digested tissues. Thick tissue sections, >30 μm, were used for ICP-MS to provide enough mass for detection. These sections were taken from the histology blocks adjacent to the thin sections used in the FE-SEM. Gadolinium was detected in skin, heart, lung, and liver tissues. The highest concentrations were found in heart and skin; both had average tissue concentrations greater than 200 μg/g (range, 100–450 μg/g).
In skin, gadolinium nanoparticulates were readily seen near cell body locations in autopsy samples and within the cells in biopsy samples. The cells in which gadolinium was most easily found were along blood vessels. Within the cells, the agglomerates appear granular, with a size of less than 100 nm. They are distributed throughout the cell but, as of this time, are not associated with any particular cell structure. Subsequent work using TEM will examine that aspect as well as the specific ultrastructure and chemistry of the nanoparticles. In this investigation, gadolinium was detected in the tissues of a number of patients with NSF. Although neither dispositive of a pathophysiologic mechanism nor proof of causation, the detection and quantification of gadolinium within tissues of NSF patients is supportive of the epidemiologic association between exposure to gadolinium-containing contrast material and development of the disease.
Nosocomial transmission of group A Streptococcus (GAS) has been well described. A recent report of an outbreak investigation suggested that transmission can be extensive and that standard infection control measures may not be adequate to prevent transmission from patients with severe, invasive disease to healthcare workers (HCWs).
Objective.
A case of pharyngitis in an HCW caring for a patient with GAS pharyngitis and necrotizing fasciitis prompted an investigation of the extent and risk factors for nosocomial transmission of GAS.
Setting.
A 509-bed, tertiary-care center in Portland, Oregon, with 631,100 patient visits (hospital and clinic) and 11,500 employees in the year 2003.
Methods.
HCWs with exposure to the index patient (“contacts”) were identified for streptococcal screening and culture and completion of a questionnaire regarding the location and duration of exposure, use of personal protective equipment, and symptoms of GAS infection.
Results.
We identified 103 contacts of the index patient; 89 (86%) submitted oropharyngeal swabs for screening and culture. Only 3 (3.4%) of the contacts had a culture that yielded GAS; emm typing results and pulsed-field gel electrophoresis patterns of GAS isolates from 2 HCWs were identical to those for the isolate from the index patient. Both HCWs were symptomatic with febrile pharyngitis, and both reported prolonged contact with the open wound of the patient in the operating room.
Conclusions.
In this investigation, nosocomial transmission was not extensive, and standard precautions provided adequate protection for the majority of HCWs. Transmission was restricted to individuals with prolonged intraoperative exposure to open wounds. As a result, infection control policy for individuals was modified only for HCWs with exposure to GAS in the operating room.
Field experiments were conducted in 2002 and 2003 in Beaumont, TX, to evaluate the effect of flood timing on red rice control with imazethapyr applied at different cultivated rice growth stages. Treatments included flood establishment at 1, 7, 14, and 21 d after postemergence (POST) herbicide treatment (DAT). Imazethapyr was applied preemergence at 70 g ai/ha followed by 70 g/ha POST when the imidazolinone-tolerant rice cultivar ‘CL-161’ was at the three- to four-leaf (EPOST) or five-leaf (LPOST) stage. Flood needed to be established within 14 DAT to achieve at least 95% red rice control when imazethapyr was applied EPOST. However, flood needed to be established within 7 DAT to provide at least 95% red rice control when imazethapyr was applied LPOST. Delaying the flood up to 21 DAT reduced rice grain yield for both application timings.
A study was conducted in 2001 and 2002 in Texas to evaluate red rice control and crop response of imidazolinone-tolerant rice with imazethapyr on coarse-textured soils. Because imazethapyr was not registered for use on imidazolinone-tolerant rice on coarse-textured soils in Texas, crop response was evaluated to determine whether imidazolinone-tolerant rice yields would be reduced with sequential applications of imazethapyr on soils having greater than 50% sand content. The treatment factors consisted of preemergence (PRE) applications of imazethapyr at 50, 70, or 87 g ai/ha followed by (fb) preflood (PREFLD) applications of 35 or 50 g/ha. Imazethapyr at 70 g/ha PRE fb 70 g/ha PREFLD was added as a seventh treatment. PRE applications were activated by rainfall or surface irrigation after application, and PREFLD applications were sprayed 1 to 2 d before application of the permanent flood. In both years, 100% red rice control was achieved with all rate combinations. Early-season visual rice injury ranged from 5 to 21% and did not result in yield losses, indicating that imazethapyr is safe on coarse-textured soils.
Clomazone has been successfully used for weed control in rice, but crop injury is a potential problem on light-textured soils. Experiments were conducted to determine the effect of soil characteristics and water potential on plant-available clomazone and rice injury. A centrifugal double-tube technique was used to determine plant-available concentration in soil solution (ACSS), total amount available in soil solution (TASS), and Kd values for clomazone on four soils at four water potentials. A rice bioassay was conducted parallel to the plant-available study to correlate biological availability with ACSS, TASS, and Kd. TASS was significantly different in all soils. The order of increasing TASS for the soils studied was Morey < Edna < Nada < Crowley, which correlated well with soil characteristics. The order of increasing TASS after equilibrium was −90 < −75 < −33 < 0 kPa. TASS values at 0 kPa were more than twice the TASS values at −90 kPa. It appears that severe rice injury from clomazone on these soils could occur if TASS > 110 ng g⁻¹ and Kd < 1.1 ml g⁻¹. We propose that the double-tube technique provides a more accurate estimate of available herbicide because the solution–soil ratios are <0.33:1 and would be more representative of a plant root–herbicide relationship. This technique, or some variation of it, could be further developed so that clomazone rates can be more clearly defined, particularly on lighter-textured soils. TASS may be a better predictor of plant-available herbicide than ACSS when evaluating moderately to highly water-soluble herbicides in a nonsaturated soil environment.
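The suggested injury threshold above is a simple joint condition on TASS and Kd, and can be sketched as a screening rule. A minimal sketch (function and parameter names are illustrative; the thresholds are those reported for the four soils studied, not a general recommendation):

```python
def severe_injury_risk(tass_ng_per_g, kd_ml_per_g):
    """Flag potential severe clomazone injury to rice for a given soil.

    Uses the thresholds suggested for the soils studied:
    TASS > 110 ng/g (total amount available in soil solution)
    and Kd < 1.1 ml/g (sorption coefficient, i.e. weak sorption).
    Both conditions must hold: high availability alone, or weak
    sorption alone, does not trigger the flag.
    """
    return tass_ng_per_g > 110 and kd_ml_per_g < 1.1
```

Such a rule would flag, for example, a sandy soil at high water potential (high TASS, low Kd) while leaving a finer-textured, more strongly sorbing soil unflagged at the same application rate.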