Prestigious journals are widely admired for publishing quality scholarship, yet the primary indicators of journal prestige (i.e., impact factors) do not directly assess audience admiration. Moreover, the publication landscape has changed substantially in the last 20 years, with electronic publishing changing the way we consume scientific research. Given that it has been 18 years since the publication of the last journal prestige survey of SIOP members, the authors conducted a new survey and used these results to reflect on changing practices within industrial and organizational (I-O) psychology. SIOP members (n = 557) rated the prestige and relevance of I-O and management journals. Responses were analyzed according to job setting and compared with a survey conducted by Zickar and Highhouse (2001) in 2000. There was considerable consistency in prestige ratings across settings (i.e., management department vs. psychology department; academic vs. applied), especially among the top journals. There was considerable variance, however, in the perceived usefulness of different journals. Results also suggested considerable consistency across the two time periods, but with some increases in prestige among OB-oriented journals. Changes in the journal landscape are discussed, including the rise of occupational health psychology (OHP) as a topic of concentration in I-O. We suggest that I-O programs will continue to attract the top researchers in talent management and OHP, which should result in the use of a broader set of journals for judging I-O program impact.
Carbapenem-resistant Enterobacterales (CRE) are common causes of healthcare-associated infections and are often multidrug resistant with limited therapeutic options. Additionally, CRE can spread within and between healthcare facilities, amplifying potential harms.
To better understand the burden, risk factors, and source of acquisition of carbapenemase genes in clinical Escherichia coli and Klebsiella spp isolates from patients in Washington to guide prevention efforts.
Multicenter prospective surveillance study.
Escherichia coli and Klebsiella spp isolates meeting the Washington state CRE surveillance case definition were solicited from clinical laboratories and tested at Washington Public Health Laboratories using polymerase chain reaction (PCR) for the 5 most common carbapenemase genes: blaKPC, blaNDM, blaIMP, blaVIM, and blaOXA-48. Case patients positive by PCR were investigated by the public health department.
From October 2012 through December 2017, 363 carbapenem-resistant E. coli and Klebsiella spp isolates were tested. Overall, 45 of 115 carbapenem-resistant K. pneumoniae (39%), 1 of 8 K. oxytoca (12.5%), and 28 of 239 carbapenem-resistant E. coli (11.7%) were carbapenemase positive. Of 74 carbapenemase-positive isolates, blaKPC was most common (47%), followed by blaNDM (30%), blaOXA-48 (22%), and blaIMP (1%). Although all cases had healthcare exposure, blaKPC acquisition was associated with US health care, whereas non-blaKPC acquisition was associated with international health care or travel.
We report that blaKPC, the most prevalent carbapenemase in the United States, accounts for nearly half of carbapenemase cases in Washington state, and that most blaKPC cases are likely acquired through in-state health care.
Academic medical centers (AMCs) face challenges in conducting research among traditionally marginalized communities due to long-standing community mistrust. Evidence suggests that some AMC faculty and staff lack an understanding of the history of distrust and social determinants of health (SDH) affecting their communities. Wake Forest Clinical and Translational Science Institute Program in Community Engagement (PCE) aims to build bridges between communities and Wake Forest Baptist Health by equipping faculty, clinicians, administrators, and staff (FCAS) with a better understanding of SDH. The PCE collaborated with community partners to develop and implement community tours to improve cross-community AMC understanding and communication, enhance knowledge of SDH, and build awareness of community needs, priorities, and assets. Nine day-long tours have been conducted with 92 FCAS. Tours included routes through under-resourced neighborhoods and visits to community assets. Participant evaluations assessed program quality; 89% reported enhanced understanding of access-to-care barriers and how SDH affect health; 86% acknowledged the experience would improve future interactions with participants and patients; and 96% agreed they would recommend the tour to colleagues. This work supports the use of community tours as a strategy to improve cross-community AMC communication, build trust, and raise awareness of community needs, priorities, and assets.
Accurate perception of visual contours is essential for seeing and differentiating objects in the environment. Both the ability to detect visual contours and the influence of perceptual context created by surrounding stimuli are diminished in people with schizophrenia (SCZ). The central aim of the present study was to better understand the biological underpinnings of impaired contour integration and weakened effects of perceptual context. Additionally, we sought to determine whether visual perceptual abnormalities reflect genetic factors in SCZ and are present in other severe mental disorders.
We examined behavioral data and event-related potentials (ERPs) collected during the perception of simple linear contours embedded in similar background stimuli in 27 patients with SCZ, 23 patients with bipolar disorder (BP), 23 first-degree relatives of SCZ, and 37 controls.
SCZ exhibited impaired visual contour detection while BP exhibited intermediate performance. The orientation of neighboring stimuli (i.e. flankers) relative to the contour modulated perception across all groups, but SCZ exhibited weakened suppression by the perceptual context created by flankers. Late visual (occipital P2) and cognitive (centroparietal P3) neural responses showed group differences and flanker orientation effects, unlike earlier ERPs (occipital P1 and N1). Moreover, behavioral effects of flanker context on contour perception were correlated with modulation in P2 and P3 amplitudes.
In addition to replicating and extending findings of abnormal contour integration and visual context modulation in SCZ, we provide novel evidence that the abnormal use of perceptual context is associated with higher-order sensory and cognitive processes.
Early detection and intervention strategies in patients at clinical high-risk (CHR) for syndromal psychosis have the potential to contain the morbidity of schizophrenia and similar conditions. However, research criteria that have relied on severity and number of positive symptoms are limited in their specificity and risk high false-positive rates. Our objective was to examine the degree to which measures of recency of onset or intensification of positive symptoms [a.k.a., new or worsening (NOW) symptoms] contribute to predictive capacity.
We recruited 109 help-seeking individuals whose symptoms met criteria for the Progression Subtype of the Attenuated Positive Symptom Psychosis-Risk Syndrome defined by the Structured Interview for Psychosis-Risk Syndromes and followed them every three months for two years or until onset of syndromal psychosis.
Forty-one (40.6%) of 101 participants meeting CHR criteria developed a syndromal psychotic disorder [mostly (80.5%) schizophrenia], with half converting within 142 days (interquartile range: 69–410 days). Patients with more NOW symptoms were more likely to convert (converters: 3.63 ± 0.89; non-converters: 2.90 ± 1.27; p = 0.001). Patients with stable attenuated positive symptoms were less likely to convert than those with NOW symptoms. New symptoms in isolation, but not worsening symptoms, also predicted conversion.
Results suggest that the severity and number of attenuated positive symptoms are less predictive of conversion to syndromal psychosis than the timing of their emergence and intensification. These findings also suggest that the earliest phase of psychotic illness involves a rapid, dynamic process, beginning before the syndromal first episode, with potentially substantial implications for CHR research and understanding the neurobiology of psychosis.
To use latent class analysis (LCA) to identify patterns of cognitive functioning in a sample of older adults with clinical depression and without dementia, and to assess demographic, psychiatric, and neurobiological predictors of class membership.
Neuropsychological assessment data from 121 participants in the Alzheimer’s Disease Neuroimaging Initiative-Depression project (ADNI-D) were analyzed, including measures of executive functioning, verbal and visual memory, visuospatial and language functioning, and processing speed. These data were analyzed using LCA, with predictors of class membership such as depression severity, depression and treatment history, amyloid burden, and APOE e4 allele also assessed.
A two-class model of cognitive functioning best fit the data, with the Lower Cognitive Class (46.1% of the sample) performing approximately one standard deviation below the Higher Cognitive Class (53.9%) on most tests. When predictors of class membership were assessed, carrying an APOE e4 allele was significantly associated with membership in the Lower Cognitive Class. Demographic characteristics, age of depression onset, depression severity, history of psychopharmacological treatment for depression, and amyloid positivity did not predict class membership.
LCA allows for identification of subgroups of cognitive functioning in a mostly cognitively intact late life depression (LLD) population. One subgroup, the Lower Cognitive Class, more likely to carry an APOE e4 allele, may be at a greater risk for subsequent cognitive decline, even though current performance on neuropsychological testing is within normal limits. These findings have implications for early identification of those at greatest risk, risk factors, and avenues for preventive intervention.
Treatment for hoarding disorder is typically performed by mental health professionals, potentially limiting access to care in underserved areas.
We aimed to conduct a non-inferiority trial of group peer-facilitated therapy (G-PFT) and group psychologist-led cognitive–behavioural therapy (G-CBT).
We randomised 323 adults with hoarding disorder to 15 weeks of G-PFT or 16 weeks of G-CBT and assessed them at baseline, post-treatment and longitudinally (≥3 months post-treatment: mean 14.4 months, range 3–25). Predictors of treatment response were examined.
G-PFT (effect size 1.20) was as effective as G-CBT (effect size 1.21; between-group difference 1.82 points, t = −1.71, d.f. = 245, P = 0.04). More homework completion and ongoing help from family and friends resulted in lower severity scores at longitudinal follow-up (t = 2.79, d.f. = 175, P = 0.006; t = 2.89, d.f. = 175, P = 0.004).
Peer-led groups were as effective as psychologist-led groups, providing a novel treatment avenue for individuals without access to mental health professionals.
Declaration of interest
C.A.M. has received grant funding from the National Institutes of Health (NIH) and travel reimbursement and speakers’ honoraria from the Tourette Association of America (TAA), as well as honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. K.D. receives research support from the NIH and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. R.S.M. receives research support from the National Institute of Mental Health, National Institute of Aging, the Hillblom Foundation, Janssen Pharmaceuticals (research grant) and the Alzheimer's Association. R.S.M. has also received travel support from the National Institute of Mental Health for Workshop participation. J.Y.T. receives research support from the NIH, Patient-Centered Outcomes Research Institute and the California Tobacco Related Research Program, and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. All other authors report no conflicts of interest.
The authors developed a practical and clinically useful model to predict the risk of psychosis that utilizes clinical characteristics empirically demonstrated to be strong predictors of conversion to psychosis in clinical high-risk (CHR) individuals. The model is based upon the Structured Interview for Psychosis Risk Syndromes (SIPS) and accompanying clinical interview, and yields scores indicating one's risk of conversion.
Baseline data, including demographic and clinical characteristics measured by the SIPS, were obtained on 199 CHR individuals seeking evaluation in the early detection and intervention for mental disorders program at the New York State Psychiatric Institute at Columbia University Medical Center. Each patient was followed for up to 2 years or until they developed a syndromal DSM-IV disorder. A LASSO logistic fitting procedure was used to construct a model for conversion specifically to a psychotic disorder.
At 2 years, 64 patients (32.2%) converted to a psychotic disorder. The top five variables with relatively large standardized effect sizes included SIPS subscales of visual perceptual abnormalities, dysphoric mood, unusual thought content, disorganized communication, and violent ideation. The concordance index (c-index) was 0.73, indicating a moderately strong ability to discriminate between converters and non-converters.
The prediction model performed well in classifying converters and non-converters and revealed SIPS measures that are relatively strong predictors of conversion, comparable with the risk calculator published by NAPLS (c-index = 0.71), but requiring only a structured clinical interview. Future work will seek to externally validate the model and enhance its performance with the incorporation of relevant biomarkers.
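For readers unfamiliar with the concordance index reported above: for a binary outcome it is the fraction of (converter, non-converter) pairs in which the converter received the higher predicted risk, with ties counted as one half (equivalent to the area under the ROC curve). The sketch below illustrates this definition only; the function name and the toy risk scores are hypothetical and are not taken from the study.

```python
def c_index(risks, outcomes):
    """Concordance (c) index for a binary outcome: the fraction of
    (event, non-event) pairs in which the event case received the
    higher predicted risk; ties count as 0.5."""
    events = [r for r, y in zip(risks, outcomes) if y == 1]
    nonevents = [r for r, y in zip(risks, outcomes) if y == 0]
    concordant = 0.0
    for e in events:
        for n in nonevents:
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / (len(events) * len(nonevents))

# Toy example with hypothetical risk scores (not the study's data):
risks = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1, 1, 0, 1, 0, 0]
print(c_index(risks, outcomes))
```

A c-index of 0.5 corresponds to chance-level discrimination and 1.0 to perfect discrimination, which is why values near 0.71–0.73 are described as moderately strong.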
Posthodiplostomum minimum utilizes a three-host life cycle with multiple developmental stages. The metacercarial stage, commonly known as ‘white grub’, infects the visceral organs of many freshwater fishes and was historically considered a host generalist due to its limited morphological variation among a wide range of hosts. In this study, infection data and molecular techniques were used to evaluate the host and tissue specificity of Posthodiplostomum metacercariae in centrarchid fishes. Eleven centrarchid species from three genera were collected from the Illinois portion of the Ohio River drainage and necropsied. Posthodiplostomum infection levels differed significantly by host age, host genera and infection locality. Three Posthodiplostomum spp. were identified by DNA sequencing, two of which were relatively common within centrarchid hosts. Both common species were host specialists at the genus level, with one species restricted to Micropterus hosts and the other preferentially infecting Lepomis. Host specificity is likely dictated by physiological compatibility and deviations from Lepomis host specificity may be related to host hybridization. Posthodiplostomum species also differed in their utilization of host tissues. Neither common species displayed strong genetic structure over the scale of this study, likely due to their utilization of bird definitive hosts.
Depression and post-traumatic stress disorder (PTSD) are significant risk factors for suicide and other adverse events among US military personnel, but their prevalence among ship-assigned personnel at the onset of deployment is unknown.
To determine the prevalence of shipboard personnel who screen positive for PTSD and/or major depressive disorder (MDD) at the onset of deployment, and also those who reported these diagnoses made by a physician or healthcare professional in the year prior to deployment.
Active-duty ship-assigned personnel (N = 2078) completed anonymous assessments at the beginning of deployment. Depression was measured using the Center for Epidemiologic Studies Depression Scale (CES-D; score of ≥22), and PTSD was assessed using the PTSD Checklist–Civilian Version (PCL-C; both score and symptom criteria were used).
In total, 7.3% (n = 151 of 2076) screened positive for PTSD and 22% (n = 461 of 2078) for MDD at deployment onset. Only 6% and 15% of those who screened positive for PTSD or MDD, respectively, had been diagnosed by a healthcare professional in the past year.
Missed opportunities for mental healthcare among screen-positive shipboard personnel reduce the benefits associated with early identification and linkage to care. Improved methods of mental health screening that promote early recognition and referral to care may mitigate psychiatric events in theatre.
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H0) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 yrs, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 106 galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 106 galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using a purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline. 
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
Military personnel generally under-consume n-3 fatty acids and overconsume n-6 fatty acids. In a placebo-controlled, double-blinded study, we investigated whether a diet suitable for implementation in military dining facilities and civilian cafeterias could benefit n-3/n-6 fatty acid status of consumers. Three volunteer groups were provided different diets for 10 weeks. Control (CON) participants consumed meals from the US Military’s Standard Garrison Dining Facility Menu. Experimental, moderate (EXP-Mod) and experimental-high (EXP-High) participants consumed the same meals, but high n-6 fatty acid and low n-3 fatty acid containing chicken, egg, oils and food ingredients were replaced with products having less n-6 fatty acids and more n-3 fatty acids. The EXP-High participants also consumed smoothies containing 1000 mg n-3 fatty acids per serving, whereas other participants received placebo smoothies. Plasma and erythrocyte EPA and DHA in CON group remained unchanged throughout, whereas EPA, DHA and Omega-3 Index increased in EXP-Mod and EXP-High groups, and were higher than in CON group after 5 weeks. After 10 weeks, Omega-3 Index in EXP-High group had increased further. No participants exhibited changes in fasting plasma TAG, total cholesterol, LDL, HDL, mood or emotional reactivity. Replacing high linoleic acid (LA) containing foods in dining facility menus with similar high oleic acid/low LA and high n-3 fatty acid foods can improve n-6/n-3 blood fatty acid status after 5 weeks. The diets were well accepted and suitable for implementation in group feeding settings like military dining facilities and civilian cafeterias.
Aminocyclopyrachlor (AMCP) is a synthetic auxin herbicide used for broadleaf weed control in pasture and rangeland. The tolerance and fate of AMCP within pertinent grass species is not well understood. Research was conducted to establish the tolerance of four grass species to AMCP application and to observe their absorption, translocation, and metabolism of the herbicide. Results indicate that tall fescue is the most tolerant of AMCP at rates required for weed control, bahiagrass and bermudagrass are marginally tolerant, and cogongrass is the most sensitive. Tall fescue and bahiagrass absorbed more AMCP than bermudagrass and cogongrass, but cogongrass absorption was the most rapid, nearing completion within 2 days after treatment (DAT). Cogongrass and bermudagrass translocated the least AMCP out of the target area, whereas bahiagrass and tall fescue translocated the most. Radioisotope imaging revealed that tall fescue may sequester absorbed AMCP in leaf tips, which may underlie its greater tolerance relative to the other species evaluated. No metabolism of AMCP was detected in any grass species out to 42 DAT.
The development and spread of glyphosate-resistant (GR) horseweed has increased the use of dicamba as an alternative herbicide treatment. Research evaluated suspected glyphosate-resistant horseweed populations from DeKalb (GR-1) and Cherokee (GR-2) counties, Alabama, for response to glyphosate, dicamba, and glyphosate + dicamba. Populations used for resistance determination were tested at rosette and bolt growth stages. Glyphosate resistance evaluation treatments ranged from 0 to 36.0 kg ae ha−1. Data confirmed that GR-1 and GR-2 horseweed populations were 3.0 to 38 times more resistant to glyphosate than the susceptible population, depending on population, data type, and growth stage at treatment. GR-1 and GR-2 populations were further evaluated for response to dicamba. Dicamba was applied at 0 to 1.12 kg ai ha−1, both with and without the addition of glyphosate at 1.12 kg ae ha−1. All populations had similar tolerance to dicamba, with the exception of GR-2 treated at the rosette growth stage, which had ~2-fold greater tolerance. When glyphosate was tank-mixed with dicamba, the response of GR populations was similar to that of dicamba alone. Therefore, any potential resistance-management benefit of tank-mixing dicamba with glyphosate may be negated when attempting to control GR horseweed. Conversely, adding glyphosate to dicamba drastically enhanced control of the susceptible population at both growth stages.
Ethnohistorical and ethnographic observations from around the world indicate that projectiles were often made differently for warfare and hunting. Using experiential archaeology and analysis of a thousand years’ worth of data from the middle Gila River in Arizona, the authors argue that side-notched arrow points were produced for hunting large animals and were designed to be retrieved and reused, while unnotched points were intended for single use and for another purpose: to kill people. The data further suggest that the region witnessed a steady increase in levels of violence during the period under study.
Nearly 258 million ha (28%) of the United States is publicly owned land that is managed by federal government agencies. For example, the US Department of Agriculture's Forest Service (USFS) manages over 77 million ha of national forests and grasslands for the benefit of the American public. Given its legal directive to manage multiple uses, it is not surprising that conflicts arise among stakeholders over how this land should be used (Lansky, 1992). The USFS has much discretion in how land is managed, yet must often balance conflicting values of public use and benefit (Nie, 2004). As national priorities, social preferences and public awareness of national forest goods, services and values have changed over time, USFS managers have faced increased pressure to balance consumptive uses with the need for environmental protection. Competing stakeholder demands coupled with increased environmental risks (wildfires, tree diseases and insect epidemics) have resulted in an escalating conservation conflict that is manifested in administrative appeals, lawsuits and a growing distrust of the agency.
Over time, the USFS has embraced new directions and management paradigms to reduce conflict. Some of these have been ecosystem management, adaptive management and now collaborative management (e.g. Holling, 1978; Maser, 1988; Franklin, 1992; Boyce and Haney, 1997; Wondolleck and Yaffee, 2000; Brown et al., 2004). These approaches reflect changing societal values, political pressures and new scientific information.
A persistent conflict has been the logging of trees in national forests and related impacts on forest ecosystems (Lansky, 1992). The USFS’ timber sale programme has supported jobs and community stability through economic development. Logging has also been a mechanism to reduce the risk of wildfire by reducing tree density (fuel for fires) and vertical stand diversity (‘ladder’ fuels; North et al., 2009). However, logging can also negatively affect forest integrity, watershed quality, wildlife, aesthetic and spiritual values of forests (Satterfield, 2002; North et al., 2009).