The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model can already contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
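The covariation logic HiTOP builds on is typically implemented with factor-analytic methods. As a rough illustration only (the simulated data, item structure, and two-factor choice below are assumptions for the example, not the HiTOP consortium's actual analysis pipeline), a minimal sketch in Python:

```python
# Minimal sketch of the factor-analytic logic behind quantitative nosology;
# data, item structure, and the two-factor choice are hypothetical.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 2))          # e.g., internalizing, externalizing
loadings = np.zeros((2, 12))
loadings[0, :6] = rng.uniform(0.5, 0.9, 6)  # items 1-6 track dimension 1
loadings[1, 6:] = rng.uniform(0.5, 0.9, 6)  # items 7-12 track dimension 2
symptoms = latent @ loadings + rng.normal(scale=0.5, size=(500, 12))

# Items that co-vary load on the same factor, suggesting one construct.
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(symptoms)
print(fa.components_.round(2))
```

Items that load on the same factor co-vary across respondents; that empirical signature is what groups signs and symptoms into a construct in this approach.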
The Minnesota Center for Twin and Family Research (MCTFR) comprises multiple longitudinal, community-representative investigations of twin and adoptive families that focus on psychological adjustment, personality, cognitive ability and brain function, with a special emphasis on substance use and related psychopathology. The MCTFR includes the Minnesota Twin Registry (MTR), a cohort of twins who have completed assessments in middle and older adulthood; the Minnesota Twin Family Study (MTFS) of twins assessed from childhood and adolescence into middle adulthood; the Enrichment Study (ES) of twins oversampled for high risk for substance-use disorders assessed from childhood into young adulthood; the Adolescent Brain (AdBrain) study, a neuroimaging study of adolescent twins; and the Siblings Interaction and Behavior Study (SIBS), a study of adoptive and nonadoptive families assessed from adolescence into young adulthood. Here we provide a brief overview of key features of these established studies and describe new MCTFR investigations that follow up and expand upon existing studies or recruit and assess new samples, including the MTR Study of Relationships, Personality, and Health (MTR-RPH); the Colorado-Minnesota (COMN) Marijuana Study; the Adolescent Brain Cognitive Development (ABCD) study; the Colorado Online Twins (CoTwins) study and the Children of Twins (CoT) study.
Medical procedures and patient care activities may facilitate environmental dissemination of healthcare-associated pathogens such as methicillin-resistant Staphylococcus aureus (MRSA).
Design:
Observational cohort study of MRSA-colonized patients to determine the frequency of and risk factors for environmental shedding of MRSA during procedures and care activities in carriers with positive nares and/or wound cultures. Bivariate analyses were performed to identify factors associated with environmental shedding.
Setting:
A Veterans Affairs hospital.
Participants:
This study included 75 patients in contact precautions for MRSA colonization or infection.
Results:
Of 75 patients in contact precautions for MRSA, 55 (73%) had MRSA in nares and/or wounds and 25 (33%) had positive skin cultures. For the 52 patients with MRSA in nares and/or wounds and at least 1 observed procedure, environmental shedding of MRSA occurred more frequently during procedures and care activities than in the absence of a procedure (59 of 138, 43% vs 8 of 83, 10%; P < .001). During procedures, increased shedding occurred ≤0.9 m versus >0.9 m from the patient (52 of 138, 38% vs 25 of 138, 18%; P = .0004). Contamination occurred frequently on surfaces touched by personnel (12 of 38, 32%) and on portable equipment used for procedures (25 of 101, 25%). By bivariate analysis, the presence of a wound with MRSA was associated with shedding (17 of 29, 59% vs 6 of 23, 26%; P = .04).
Conclusions:
Environmental shedding of MRSA occurs frequently during medical procedures and patient care activities. There is a need for effective strategies to disinfect surfaces and equipment after procedures.
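The bivariate comparisons reported in the Results are comparisons of proportions from 2×2 tables, so they can be sketched with a standard exact test. The abstract does not state which test the authors used, so the P value below may differ from the reported P = .04; the counts are taken from the wound-colonization comparison above.

```python
# 2x2 bivariate comparison using counts from the Results above:
# wound MRSA present: 17 of 29 shed; wound MRSA absent: 6 of 23 shed.
from scipy.stats import fisher_exact

table = [[17, 29 - 17],  # wound-positive: shed, did not shed
         [6, 23 - 6]]    # wound-negative: shed, did not shed
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p_value:.3f}")
```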
In “Toward a Theory of Race, Crime, and Urban Inequality,” Sampson and Wilson (1995) argued that racial disparities in violent crime are attributable in large part to the persistent structural disadvantages that are disproportionately concentrated in African American communities. They also argued that the ultimate causes of crime were similar for both Whites and Blacks, leading to what has been labeled the thesis of “racial invariance.” In light of the large-scale social changes of the past two decades and the renewed political salience of race and crime in the United States, this paper reassesses and updates evidence evaluating the theory. In so doing, we clarify key concepts from the original thesis, delineate the proper context of validation, and address new challenges. Overall, we find that the accumulated empirical evidence provides broad but qualified support for the theoretical claims. We conclude by charting a dual path forward: an agenda for future research on the linkages between race and crime, and policy recommendations that align with the theory’s emphasis on neighborhood-level structural forces but with causal space for cultural factors.
Antineuronal antibodies are associated with psychosis, although their clinical significance in first episode of psychosis (FEP) is undetermined.
Aims
To examine all patients admitted for treatment of FEP for antineuronal antibodies and describe clinical presentations and treatment outcomes in those who were antibody positive.
Method
Individuals admitted for FEP to six mental health units in Queensland, Australia, were prospectively tested for serum antineuronal antibodies. Antibody-positive patients were referred for neurological and immunological assessment and therapy.
Results
Of 113 consenting participants, six had antineuronal antibodies (anti-N-methyl-D-aspartate receptor antibodies [n = 4], voltage-gated potassium channel antibodies [n = 1] and antibodies against uncharacterised antigen [n = 1]). Five received immunotherapy, which prompted resolution of psychosis in four.
Conclusions
A small subgroup of patients admitted to hospital with FEP has antineuronal antibodies detectable in serum and is responsive to immunotherapy. Early diagnosis and treatment are critical to optimise recovery.
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m⁻². Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
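Growing degree days as used above are conventionally accumulated from daily temperature extremes against a base temperature. A minimal sketch, with a placeholder base temperature since the abstract does not state the one used:

```python
# Illustrative growing-degree-day (GDD) accumulation; the base temperature
# below is a placeholder, since the abstract does not state the one used.
T_BASE = 0.0  # deg C, hypothetical

def daily_gdd(t_max: float, t_min: float, t_base: float = T_BASE) -> float:
    """Degree days contributed by one day, floored at zero."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

# Toy spring temperature series, accumulated to the 168 GDD reported
# for 10% kochia emergence in Kansas.
total = 0.0
for t_max, t_min in [(10.0, 0.0), (14.0, 3.0), (18.0, 6.0)] * 20:
    total += daily_gdd(t_max, t_min)
    if total >= 168.0:
        break
print(round(total, 1))
```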
A survey was conducted in one sugarbeet-growing area and three corn-growing areas of Nebraska to determine the frequency and weed control impact of enhanced degradation of butylate or EPTC in field soils receiving repeated applications of these herbicides. All seven of the sugarbeet field soils exhibited enhanced EPTC degradation. In the corn areas, none of the 13 north central and southeast field soils displayed accelerated degradation; however, 10 of the 16 south central field soils did. In south central Nebraska, 60% and 45% of the surveyed growers were dissatisfied with weed control from butylate or EPTC in 1983 and 1984, respectively, compared to 24% and none in other survey areas. Enhanced herbicide degradation and the presence of shattercane were the main reasons for the disparity among areas.
Corn (Zea mays L. ‘Pioneer 3732’) showed little to no injury following the postemergence-directed application of sethoxydim {2-[1-(ethoxyimino)butyl]-5-[2-(ethylthio)propyl]-3-hydroxy-2-cyclohexen-1-one} plus crop oil concentrate (COC) at 56 g/ha plus 1.25% (v/v) at nine locations across the midwestern United States in 1984 and 1985. Little corn injury also occurred with the postemergence-directed application of sethoxydim plus COC at 110 g/ha plus 1.25% (v/v) at most locations in both years. Considerable variation in tolerance was seen across locations for over-the-top applications of sethoxydim at all rates tested and for the directed application at 220 g/ha. Although corn at most locations showed no yield reduction with the over-the-top application of sethoxydim plus COC at 56 g/ha plus 1.25% (v/v), a 70% yield reduction occurred at one location in one year. For an over-the-top application of sethoxydim plus COC at 110 g/ha plus 1.25% (v/v), yields ranged from 3 to 95% of the untreated check in 1984 and from 3 to 88% in 1985. Stand reductions from an over-the-top application of sethoxydim plus COC at 220 g/ha plus 1.25% (v/v) ranged from 0 to 99%. A significant negative correlation was found between yield of corn treated over the top with sethoxydim and precipitation on the day of application and in the week following application. Air temperature on the day of application was positively correlated with corn injury from over-the-top and directed sethoxydim applications, but no correlation existed between percent relative humidity and corn injury. Open pan evaporation and solar radiation before and after application were not correlated with corn injury from sethoxydim.
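The weather relationships above are simple bivariate correlations across location-years. A minimal sketch with hypothetical stand-in values (not the study's data):

```python
# Correlation of corn yield with day-of-application precipitation;
# all values below are hypothetical stand-ins, not the study's data.
from scipy.stats import pearsonr

yield_pct = [95, 88, 70, 40, 3, 60, 85, 20, 55]  # % of untreated check
rain_mm   = [0, 1, 3, 8, 15, 4, 1, 12, 5]        # rain on application day
r, p = pearsonr(rain_mm, yield_pct)
print(f"r = {r:.2f}, P = {p:.3f}")  # a negative r mirrors the reported result
```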
The effectiveness of rotary hoeing combined with cultivation was determined in dry edible bean, and an in-row cultivator was compared with a standard row-crop cultivator. The effectiveness of in-row cultivation conducted at various timings and frequencies was also examined. The in-row cultivator was more effective in reducing weed populations than the standard cultivator, although at least two mechanical weeding operations were needed to reduce weed populations to the levels of the herbicide check (EPTC [S-ethyl dipropyl carbamothioate] plus ethalfluralin). When in-row cultivation was delayed until the second trifoliolate stage or later, weed populations were greater than those in the herbicide check. In situations with high weed populations, rotary hoeing prior to cultivation was required to reduce weed populations to levels similar to the herbicide check. An in-row cultivator has potential to improve mechanical weed control options in a crop such as dry edible bean. The types of adjustments made, in combination with soil texture, soil moisture, and operator experience, affect overall weed control. Thus, the level of weed control can be expected to vary from year to year and even from field to field for the same operator.
The effects of the dimethylamine salt of dicamba (3,6-dichloro-2-methoxybenzoic acid) and the dimethylamine salt of 2,4-D [(2,4-dichlorophenoxy)acetic acid] on fieldbeans (Phaseolus vulgaris L. ‘Great Northern Valley’) were studied in order to assess the potential hazards of using these herbicides in areas adjoining fieldbean production. Dicamba and 2,4-D were applied to fieldbeans at three different rates (1.1, 11.2, and 112.5 g ai/ha) and four different growth stages (preemergence, second trifoliolate leaf, early bloom, and early pod). Application of 2,4-D preemergence or in the second trifoliolate leaf stage of growth did not reduce seed yield, delay maturity, or reduce germination of seed obtained from treated plants. Dicamba or 2,4-D applied at 112.5 g/ha to fieldbeans in the early bloom or early pod stages of growth consistently reduced seed yield, delayed maturity, and reduced germination percentage. Fieldbeans exhibited a greater overall sensitivity to dicamba than to 2,4-D.
Research conducted since 1979 in the north central United States and southern Canada demonstrated that after repeated annual applications of the same thiocarbamate herbicide to the same field, control of some difficult-to-control weed species was reduced. Laboratory studies of herbicide degradation in soils from these fields indicated that these performance failures were due to more rapid or “enhanced” biodegradation of the thiocarbamate herbicides after repeated use, with a shorter period during which effective herbicide levels remained in the soils. Weeds such as wild proso millet [Panicum miliaceum L. ssp. ruderale (Kitagawa) Tzevelev. # PANMI] and shattercane [Sorghum bicolor (L.) Moench. # SORVU], which germinate over long time periods, were most likely to escape these herbicides after repeated use. Adding dietholate (O,O-diethyl O-phenyl phosphorothioate) to EPTC (S-ethyl dipropyl carbamothioate) reduced problems caused by enhanced EPTC biodegradation in soils treated previously with EPTC alone but not in soils previously treated with EPTC plus dietholate. While previous use of other thiocarbamate herbicides frequently enhanced biodegradation of EPTC or butylate [S-ethyl bis(2-methylpropyl)carbamothioate], previous use of other classes of herbicides or the insecticide carbofuran (2,3-dihydro-2,2-dimethyl-7-benzofuranyl methylcarbamate) did not. Enhanced biodegradation of herbicides other than the thiocarbamates was not observed.
Field experiments, conducted from 1991 to 1994, generated information on weed seedbank emergence for 22 site-years from Ohio to Colorado and Minnesota to Missouri. Early spring seedbank densities were estimated through direct extraction of viable seeds from soil cores. Emerged seedlings were recorded periodically, as were daily values for air and soil temperature and precipitation. Percentages of weed seedbanks that emerged as seedlings were calculated from seedbank and seedling data for each species, and relationships between seedbank emergence and microclimatic variables were sought. Fifteen species were found in 3 or more site-years. Average emergence percentages (and coefficients of variation) of these species were as follows: giant foxtail, 31.2 (84); velvetleaf, 28.2 (66); kochia, 25.7 (79); Pennsylvania smartweed, 25.1 (65); common purslane, 15.4 (135); common ragweed, 15.0 (110); green foxtail, 8.5 (72); wild proso millet, 6.6 (104); hairy nightshade, 5.2 (62); common sunflower, 5.0 (26); yellow foxtail, 3.4 (67); pigweed species, 3.3 (103); common lambsquarters, 2.7 (111); wild buckwheat, 2.5 (63); and prostrate knotweed, 0.6 (79). Variation among site-years, for some species, could be attributed to microclimate variables thought to induce secondary dormancy in spring. For example, total seasonal emergence percentage of giant foxtail was related positively to the first date at which average daily soil temperature at 5 to 10 cm soil depth reached 16 C. Thus, if soil warmed before mid-April, secondary dormancy was induced and few seedlings emerged, whereas many seedlings emerged if soil remained cool until June.
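The Weibull curves referred to above are commonly written in a three-parameter form; the parameterization below is a generic one offered for orientation, since the abstract does not give the authors' exact specification:

```latex
% Generic three-parameter Weibull curve for cumulative emergence:
% E(x): cumulative emergence (%), x: accumulated GDD,
% M: maximum (asymptotic) emergence, \lambda: scale, k: shape.
\[
  E(x) = M\left[1 - \exp\!\left(-\left(\tfrac{x}{\lambda}\right)^{k}\right)\right]
\]
% The decline in viable seed (%) over burial time t can be fit with the
% same family, with x replaced by t and M by initial viability.
```

In a nonlinear mixed-effects version, parameters such as M or λ can carry random effects across site-years, consistent with the fixed and random effects described above.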
OBJECTIVE
To determine the impact of an environmental disinfection intervention on the incidence of healthcare-associated Clostridium difficile infection (CDI).
DESIGN
A multicenter randomized trial.
SETTING
In total, 16 acute-care hospitals in northeastern Ohio participated in the study.
INTERVENTION
We conducted a 12-month randomized trial to compare standard cleaning to enhanced cleaning that included monitoring of environmental services (EVS) personnel performance with feedback to EVS and infection control staff. We assessed the thoroughness of cleaning based on fluorescent marker removal from high-touch surfaces and the effectiveness of disinfection based on environmental cultures for C. difficile. A linear mixed model was used to compare CDI rates in the intervention and postintervention periods for control and intervention hospitals. The primary outcome was the incidence of healthcare-associated CDI.
RESULTS
Overall, 7 intervention hospitals and 8 control hospitals completed the study. The intervention resulted in significantly increased fluorescent marker removal in CDI and non-CDI rooms and decreased recovery of C. difficile from high-touch surfaces in CDI rooms. However, no reduction was observed in the incidence of healthcare-associated CDI in the intervention hospitals during the intervention and postintervention periods. Moreover, there was no correlation between the percentage of positive cultures after cleaning of CDI or non-CDI rooms and the incidence of healthcare-associated CDI.
CONCLUSIONS
An environmental disinfection intervention improved the thoroughness and effectiveness of cleaning but did not reduce the incidence of healthcare-associated CDI. Thus, interventions that focus only on improving cleaning may not be sufficient to control healthcare-associated CDI.
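The linear mixed model named in the methods might look broadly like the sketch below; the formula, variable names, and toy data are illustrative assumptions, not the trial's actual specification.

```python
# Hypothetical sketch of a linear mixed model for CDI rates; the formula,
# variable names, and toy data are assumptions, not the study's code.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "cdi_rate": [8.1, 7.4, 6.9, 9.0, 8.8, 8.2, 7.7, 7.1, 8.5, 7.9, 8.0, 7.6],
    "arm":      ["intervention"] * 6 + ["control"] * 6,
    "period":   ["intervention", "post"] * 6,
    "hospital": ["A", "A", "B", "B", "C", "C", "D", "D", "E", "E", "F", "F"],
})

# Random intercept per hospital; fixed effects for arm, period, interaction.
model = smf.mixedlm("cdi_rate ~ arm * period", data, groups=data["hospital"])
print(model.fit().summary())
```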
Field studies were conducted in 2003 and 2004 near Scottsbluff and Sidney, NE, to identify efficacious chemical weed-control options for irrigated and dryland chickpea production. Weed control had a greater relative effect on chickpea yield in the irrigated system than in the dryland system, with yield from the hand-weeded check exceeding the nontreated check by 1,500% in the irrigated system and 87% in the dryland system. Imazethapyr, applied preemergence at the rate of 0.053 kg ai/ha, reduced plant height, delayed plant maturity, and caused leaf chlorosis. At Scottsbluff, preplant-incorporated ethalfluralin caused significant crop injury in 2003, but the ethalfluralin treatment also maintained weed densities 4 wk after crop emergence that were not significantly different from the hand-weeded check at both locations in 2003 and 2004. Treatments containing sulfentrazone provided a similar level of weed control but without any evidence of crop injury. Pendimethalin and pendimethalin + dimethenamid-P applied preemergence provided acceptable weed control in the irrigated system, where water was applied within 4 d after herbicide application, but did not provide acceptable control in the dryland system.
Field studies were conducted in 1999, 2000, and 2001 to evaluate broadleaf weed control in glyphosate-resistant cotton by glyphosate plus CGA 362622 applied postemergence. Treatments included 560 and 1,120 g ai/ha glyphosate-isopropylamine alone or in mixtures with CGA 362622 at 3.8 and 7.5 g ai/ha, and CGA 362622 at 7.5 g/ha alone. Cotton injury 7 d after treatment (DAT) was 3 to 11% from glyphosate alone and 16 to 24% from glyphosate plus CGA 362622. Injury 28 DAT with CGA 362622 or herbicide mixtures did not exceed 6%. Broadleaf weed control by herbicide mixtures was generally more consistent than control from either herbicide applied alone. Glyphosate plus CGA 362622 controlled common cocklebur and smooth pigweed better than glyphosate alone. In most instances, the mixtures also controlled common ragweed, common lambsquarters, ivyleaf morningglory, pitted morningglory, and tall morningglory better than glyphosate applied alone. Common cocklebur and smooth pigweed were controlled at least 85% by all treatments. CGA 362622 did not control spurred anoda or jimsonweed. Cotton yields generally reflected weed control. According to these results, glyphosate plus CGA 362622 mixtures can consistently control many broadleaf weeds in cotton.
Field and greenhouse studies were conducted to evaluate mesotrione alone and in combination with low rates of atrazine and bentazon for control of yellow and purple nutsedge. Mesotrione alone at rates of 105 to 210 g ai/ha controlled yellow nutsedge 43 to 70%. Mixtures of mesotrione with atrazine at 280 g ai/ha did not always improve yellow nutsedge control over that by mesotrione alone, but increasing atrazine to 560 g ai/ha in these mixtures generally provided more consistent control of yellow nutsedge. Mesotrione at 105 g ai/ha mixed with bentazon at 280 or 560 g ai/ha controlled yellow nutsedge 88% or greater, which was similar to control from the standard, halosulfuron at 36 g ai/ha. Mesotrione, atrazine, and bentazon alone did not control purple nutsedge. Mixtures of mesotrione plus bentazon, however, did improve control of purple nutsedge over either herbicide applied alone, but this control was not considered commercially acceptable.
A segment of the debate surrounding the commercialization of genetically engineered (GE) crops, such as glyphosate-resistant (GR) crops, focuses on the theory that implementation of these traits is an extension of the intensification of agriculture that will further erode the biodiversity of agricultural landscapes. A large field-scale study was conducted in 2006 in the United States on 156 different field sites with a minimum 3-yr history of GR corn, cotton, or soybean in the cropping system. The impact of cropping system, crop rotation, frequency of using the GR crop trait, and several categorical variables on emerged weed density and diversity was analyzed. Species richness, evenness, Shannon's H′, proportion of forbs, erect growth habit, and C3 species diversity were all greater in agricultural sites that lacked crop rotation or were in a continuous GR crop system. Rotating between two GR crops (e.g., corn and soybean) or rotating to a non-GR crop resulted in less weed diversity than a continuous GR crop. The composition of the weed flora was more strongly related to location (geography) than any other parameter. The diversity of weed flora in agricultural sites with a history of GR crop production can be influenced by several factors relating to the specific method in which the GR trait is integrated (cropping system, crop rotation, GR trait rotation), the specific weed species, and the geographical location. The finding that fields with continuous GR crops demonstrated greater weed diversity is contrary to arguments opposing the use of GE crops. These results justify further research to clarify the complexities of crops grown with herbicide-resistance traits, or more broadly, GE crops, to provide a more complete characterization of their culture and local adaptation.
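For reference, species richness is simply the count S of species observed, and the Shannon and evenness indices named above have standard definitions (Pielou's J′ is the usual evenness measure, though the abstract does not name the exact index used):

```latex
% S: number of species, p_i: proportional abundance of species i.
\[
  H' = -\sum_{i=1}^{S} p_i \ln p_i \qquad \text{(Shannon diversity)}
\]
\[
  J' = \frac{H'}{\ln S} \qquad \text{(Pielou's evenness)}
\]
```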
Field studies were conducted in 1999, 2000, and 2001 to investigate weed control and glyphosate-resistant corn tolerance to postemergence applications of mesotrione at 70, 105, and 140 g ai/ha applied with and without glyphosate at 560 g ai/ha. Mesotrione alone and mixed with glyphosate controlled smooth pigweed greater than 97% and common lambsquarters 93 to 99%. Control of common ragweed and morningglory species was variable. Common ragweed control was generally best when mesotrione was applied at 105 or 140 g/ha, and control increased only in 2000 with the addition of glyphosate. Giant foxtail control was below 25% with all rates of mesotrione, but mixtures of mesotrione plus glyphosate controlled giant foxtail 65 to 75%. Mesotrione injured glyphosate-resistant corn 4 to 24% when averaged over glyphosate rates, and injury usually increased with higher mesotrione rates, with rainfall after herbicide application, and in mixtures with glyphosate. Injury was transient and did not reduce corn yields. Mesotrione injury on glyphosate-resistant corn was confirmed in the greenhouse, where all mesotrione treatments reduced glyphosate-resistant corn biomass 9 to 23% compared with the nontreated check.