Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and non-drug addictive behaviors arising from a dopamine deficiency, “hypodopaminergia.” The ongoing opioid-overdose epidemic in the USA may result in or worsen RDS. A paradigm shift is needed to combat a treatment system that is not working. This shift involves the recognition of dopamine homeostasis as the ultimate treatment goal for RDS, achieved via precision, genetically guided KB220 variants, an approach called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in a future DSM 6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
Lewy body dementia, consisting of both dementia with Lewy bodies (DLB) and Parkinson's disease dementia (PDD), is considerably under-recognised clinically compared with its frequency in autopsy series.
This study investigated the clinical diagnostic pathways of patients with Lewy body dementia to assess if difficulties in diagnosis may be contributing to these differences.
We reviewed the medical notes of 74 people with DLB and 72 with non-DLB dementia matched for age, gender and cognitive performance, together with 38 people with PDD and 35 with Parkinson's disease, matched for age and gender, from two geographically distinct UK regions.
The cases of individuals with DLB took longer to reach a final diagnosis (1.2 v. 0.6 years, P = 0.017), underwent more scans (1.7 v. 1.2, P = 0.002) and had more alternative prior diagnoses (0.8 v. 0.4, P = 0.002) than the cases of those with non-DLB dementia. Individuals diagnosed in one region of the UK had significantly more core features (2.1 v. 1.5, P = 0.007) than those in the other region, and were less likely to have dopamine transporter imaging (P < 0.001). For patients with PDD, more than 1.4 years before receiving a dementia diagnosis, 46% (12 of 26) had documented impairment of activities of daily living because of cognitive impairment, 57% (16 of 28) had cognitive impairment in multiple domains (38% [6 of 16] had both), and 39% (9 of 23) were already receiving anti-dementia drugs.
Our results show the pathway to diagnosis of DLB is longer and more complex than for non-DLB dementia. There were also marked differences between regions in the thresholds clinicians adopt for diagnosing DLB and also in the use of dopamine transporter imaging. For PDD, a diagnosis of dementia was delayed well beyond symptom onset and even treatment.
This study provides a morphological and phylogenetic characterization of two novel species of the order Haplosporida (Haplosporidium carcini n. sp., and H. cranc n. sp.) infecting the common shore crab Carcinus maenas collected at one location in Swansea Bay, South Wales, UK. Both parasites were observed in the haemolymph, gills and hepatopancreas. The prevalence of clinical infections (i.e. parasites seen directly in fresh haemolymph preparations) was low, at ~1%, whereas subclinical levels, detected by polymerase chain reaction, were slightly higher at ~2%. Although no spores were found in any of the infected crabs examined histologically (n = 334), the morphology of monokaryotic and dikaryotic unicellular stages of the parasites enabled differentiation between the two new species. Phylogenetic analyses of the new species based on the small subunit (SSU) rDNA gene placed H. cranc in a clade of otherwise uncharacterized environmental sequences from marine samples, and H. carcini in a clade with other crustacean-associated lineages.
We present a calibration component for the Murchison Widefield Array All-Sky Virtual Observatory (MWA ASVO) utilising a newly developed PostgreSQL database of calibration solutions. Since its inauguration in 2013, the MWA has recorded over 34 petabytes of data archived at the Pawsey Supercomputing Centre. According to the MWA Data Access policy, data become publicly available 18 months after collection. Therefore, most of the archival data are now available to the public. Access to public data was provided in 2017 via the MWA ASVO interface, which allowed researchers worldwide to download MWA uncalibrated data in standard radio astronomy data formats (CASA measurement sets or UV FITS files). The addition of the MWA ASVO calibration feature opens a new, powerful avenue for researchers without a detailed knowledge of the MWA telescope and data processing to download calibrated visibility data and create images using standard radio astronomy software packages. In order to populate the database with calibration solutions from the last 6 yr we developed fully automated pipelines. A near-real-time pipeline has been used to process new calibration observations as soon as they are collected and upload calibration solutions to the database, which enables monitoring of the interferometric performance of the telescope. Based on this database, we present an analysis of the stability of the MWA calibration solutions over long time intervals.
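As a rough illustration of the pipeline described above, the sketch below stores calibration solutions keyed by observation time and serves the nearest-in-time solution on request. It is a minimal stand-in, not the MWA ASVO implementation: the table layout, column names and lookup rule are invented for illustration, and Python's built-in SQLite is used in place of the actual PostgreSQL database so the example is self-contained.

```python
import sqlite3

# In-memory SQLite database standing in for the (real) PostgreSQL store.
# The schema below is a hypothetical simplification, not the MWA ASVO schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE calibration_solutions (
           obs_id   INTEGER PRIMARY KEY,  -- identifier of the calibrator scan
           obs_time INTEGER NOT NULL,     -- observation time in seconds
           solution BLOB NOT NULL         -- serialized per-tile gain solutions
       )"""
)

def upload_solution(obs_id, obs_time, solution_bytes):
    """Near-real-time pipeline step: store a freshly derived solution."""
    conn.execute(
        "INSERT OR REPLACE INTO calibration_solutions VALUES (?, ?, ?)",
        (obs_id, obs_time, solution_bytes),
    )

def nearest_solution(target_time):
    """Fetch the stored solution closest in time to a target observation."""
    return conn.execute(
        "SELECT obs_id, solution FROM calibration_solutions "
        "ORDER BY ABS(obs_time - ?) LIMIT 1",
        (target_time,),
    ).fetchone()

upload_solution(1234567890, 1234567890, b"gains-A")
upload_solution(1234571490, 1234571490, b"gains-B")
print(nearest_solution(1234570000))  # nearer to the second calibrator scan
```

A real service would additionally version solutions, track quality metrics for telescope monitoring, and serve results over an API rather than a direct database connection.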
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, all these treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent herbicide rates, indicating that injury is attributable mostly to 2,4-D in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of 2,4-D applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed for crop injury or sweetpotato yield when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be a concern to sweetpotato producers upon initial observation. However, in some cases, yield reduction of U.S. no.1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
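The quadratic rate-response relationship described above can be illustrated with an ordinary least-squares quadratic fit. The sketch below uses invented injury percentages, not the study's data; it shows only the shape of the analysis.

```python
# Illustrative sketch (not the study's actual analysis code): fitting a
# quadratic rate-response curve. Injury values are hypothetical; x is the
# applied fraction of the 1x field use rate.
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the normal equations."""
    # Power sums for the design matrix [1, x, x^2].
    s = [sum(x**k for x in xs) for k in range(5)]
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    rhs = [sum(y * x**k for x, y in zip(xs, ys)) for k in range(3)]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for col in range(i, 3):
                A[r][col] -= f * A[i][col]
            rhs[r] -= f * rhs[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coef[i] = (rhs[i] - sum(A[i][j] * coef[j] for j in range(i + 1, 3))) / A[i][i]
    return coef  # [a, b, c]

rates = [1/1000, 1/750, 1/500, 1/250, 1/100, 1/10]  # fraction of 1x rate
injury = [2, 3, 4, 7, 15, 55]                        # % injury, hypothetical
a, b, c = fit_quadratic(rates, injury)
```

In practice such fits are run per yield grade and application timing, with rate often log-transformed first.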
Antimicrobial stewardship of anti-infectives prescribed upon hospital discharge was implemented to improve the rate of appropriate prescribing at discharge. Appropriate prescribing significantly improved from 47.5% to 85.2% (P < .001), antimicrobial days of therapy decreased, and 30-day readmission rates decreased. Discharge antimicrobial stewardship was effective in improving anti-infective prescribing practices.
Accurate near-field measurements for the characterization of either deterministic or stochastic electromagnetic fields require a relevant process that removes the influence of the probes, transmission lines, and measurement circuits. The main part of the experimental work presented here relates to a calibration procedure for a test setup consisting of a microstrip test structure and a scanning loop probe. The calibration characteristic, obtained by comparing measured and simulated results, is then used to convert the measured voltage into the magnetic field across and along the microstrip line at a specific height above it. By performing measurements and simulations of the same test structure with the loop probe in the presence of an additional scanning probe, the influence of the additional probe on the measured output is thoroughly investigated and relevant corrections are given. These corrections can be important when a two-point correlation measurement is required, especially at scanning points where the two probes are close to each other.
OBJECTIVES/GOALS: Decision-making impairments in addiction can arise from dysfunction in distinct neural circuits. Such processes can be dissociated by measuring complex, computationally distinct behaviors within an economic framework. We aim to characterize computational changes conserved across models of addiction. METHODS/STUDY POPULATION: We used neuroeconomic tasks capable of dissociating neurally separable decision processes using behavioral analyses equally applicable to humans and rodents. We tested 12 human cocaine-users and 9 healthy controls on the Web-Surf task, designed to match the rodent Restaurant Row task, on which 27 mice were trained and then exposed to saline (n = 10), cocaine (n = 7), or morphine (n = 10). All subjects foraged for rewards (humans: entertaining videos; mice: food) of varying costs (1-30 s delays) and subjective value (humans: genres; mice: flavors) by making serial accept or reject decisions while on a limited time budget, balancing the utility of wanting desirable rewards despite conflicting costs. RESULTS/ANTICIPATED RESULTS: When encountering unique offers for rewards with a delay above one’s willingness to wait, cocaine-treated mice, like cocaine-exposed humans, were less likely to appropriately reject economically disadvantageous offers. Furthermore, these mice and humans did so despite spending more time deliberating between future options. In contrast, morphine-treated mice displayed distinct impairments when given the opportunity to correct past mistakes, a process we previously demonstrated was uniquely sensitive to alterations in strength of synaptic connectivity of the infralimbic-accumbens shell circuit in mice. We anticipate human opioid-users will mirror these latter, computationally distinct findings. DISCUSSION/SIGNIFICANCE OF IMPACT: These data elucidate facets of addiction shared across species yet fundamentally distinct between disease subtypes.
Our translational approach can help shed light on conserved pathophysiological mechanisms in order to identify novel diagnostic parameters and computational targets for intervention.
OBJECTIVES/GOALS: There is a high burden of lung cancer in persons living with HIV (PLWH). The effect of HIV status, by level of immune function and viral load, on survival from lung cancer is not fully understood. The study’s objectives were to assess 1) the association of HIV with survival in non-small cell lung cancer (NSCLC) and 2) prognostic factors in PLWH with NSCLC. METHODS/STUDY POPULATION: Participants were from a cohort of lung cancer patients diagnosed between 2004 and 2017 in the Bronx, NY, with vital status ascertainment at least annually. We compared survival from NSCLC diagnosis between HIV-negative patients (HIV-, N = 2881) and PLWH (N = 88), using Cox regression, accounting for clinical and sociodemographic factors including smoking status. In three separate comparisons to HIV-, PLWH were dichotomized by CD4 count (<200 vs. ≥200 cells/μL), CD4/CD8 ratio (median, <0.43 vs. ≥0.43) and HIV viral load (VL) suppression (<75 vs. ≥75 copies/mL). In PLWH only, we assessed the relationships of CD4 count, CD4/CD8 ratio, and VL at diagnosis with survival, adjusting for age, sex, and cancer stage. CD4 count and CD4/CD8 ratio were also examined as time-varying variables using a counting process approach. RESULTS/ANTICIPATED RESULTS: PLWH were younger (median 56 years, IQR 51-52 vs. 68, IQR 60-76) and more likely to be current smokers (58% vs. 37%) at diagnosis than HIV- patients. Median survival was lower in PLWH [1.1 years, 95% confidence interval (95% CI): 0.6-1.3] than in HIV- patients [1.6 (1.5-1.7)]. Survival in PLWH with a higher CD4/CD8 ratio was similar to that in HIV- patients [hazard ratio (HR), 95% CI: 0.63 (0.37-1.07)], but those with a lower CD4/CD8 ratio experienced worse survival (HR = 1.74, 95% CI: 1.07-3.89). Among PLWH, having a CD4 count < 200 cells/μL was associated with over twice the risk of death compared to those with CD4 ≥ 200 cells/μL (HR = 2.37, 95% CI: 1.14-4.92). VL and CD4/CD8 ratio were not associated with survival.
Lower time-updated CD4 count was also associated with worse survival (HR = 2.19 for CD4 <200 vs. ≥200 cells/μL, 95% CI: 1.16-4.13). DISCUSSION/SIGNIFICANCE OF IMPACT: Among persons with NSCLC, the CD4/CD8 ratio nearest diagnosis distinguished mortality risk in PLWH compared with HIV- patients. In addition, PLWH with low CD4 counts had a worse prognosis than PLWH with higher CD4 counts. These results suggest that HIV-related immune status is an essential component influencing survival in lung cancer.
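Median survival figures like those quoted above come from Kaplan–Meier estimates. A minimal pure-Python sketch, using invented follow-up data rather than the study's records:

```python
# Kaplan-Meier median survival sketch (hypothetical data, not the cohort's).
def km_median(times, events):
    """Smallest time at which the KM survival estimate drops to <= 0.5.

    times  : follow-up time for each patient (e.g. years)
    events : 1 if death observed, 0 if censored
    Returns None if the median is not reached.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_at_t = n_at_risk
        # Advance past every subject (death or censoring) with this time.
        j = i
        while j < len(data) and data[j][0] == t:
            j += 1
        n_at_risk -= (j - i)
        i = j
        if deaths:
            surv *= 1 - deaths / n_at_t
            if surv <= 0.5:
                return t
    return None

# Invented example: four deaths at 1, 2, 3 and 4 years -> median 2 years.
median = km_median([1, 2, 3, 4], [1, 1, 1, 1])
```

Group comparisons such as PLWH versus HIV- would then be made with a log-rank test or, as in the study, Cox regression.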
A major concern of sweetpotato producers is the potential negative effects from herbicide drift or sprayer contamination events when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects on sweetpotato of reduced rates of the N,N-Bis-(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or a combination of each dicamba salt with glyphosate, evaluated in separate trials. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rate of each dicamba formulation at 0.56 kg ha−1, glyphosate at 1.12 kg ha−1, and a combination of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury is mostly attributable to dicamba in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of dicamba applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor the significance of herbicide rate was observed for crop injury or sweetpotato yield when herbicide application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern to sweetpotato producers upon initial observation.
However, in some cases yield reduction of No.1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× application rates of dicamba alone or with glyphosate when applied at storage root development.
Across East Asia, the period after the Mongol retreat was one of rebuilding and reordering. As they solidified political power, new regimes in China, Korea and Japan aggressively established authority over the religious realm, demanding compliance with moral and ritual norms, managing certain types of religious pluralism and violently crushing deviant devotion and organised religious resistance. Violence pervaded religion itself. Theological exploration of ideas such as cosmic destruction and rebirth, divine retribution, enforcer deities and the morality of killing for a greater good created stylised roles for both victims and perpetrators of violence. These themes manifested differently across the region. After Japanese militarists destroyed Buddhist mountain strongholds, lay armies defending the dharma fought with the ferocity of the faithful. Persecuted Christian converts willingly met martyrdom in the Catholic idiom. In China, the undercurrent of millenarian ideas that circulated through banned texts and teachings proved impossible to contain. These ideas could quickly militarise in response to stress, feeding a devastating cycle of rebellion and repression that continued through the eighteenth and nineteenth centuries. Across the region, temples and monasteries fought for resources, and religious affiliations often provided a spark for local tensions to erupt into organised violence.
Over its long reign, the Qing imperial state aggressively pursued unauthorized religion, both to uphold its own spiritual hegemony, and to avert religious militarization. With growing social dislocation over the nineteenth century, the dynasty faced a massive explosion of religious violence – a seemingly irrepressible series of millenarian “White Lotus” movements in central China, Muslim uprisings in the north and southwest, and the pseudo-Christian Taiping Rebellion that divided the country for more than a decade. Together, these rebellions and their suppression claimed the lives of tens of millions. The anti-Christian Boxer Uprising was brutally extirpated by a coalition of foreign forces, but at least as deadly were the waves of recriminations between Chinese villages. After coming to power in 1949, the Communist regime moved quickly to contain religion, expelling Catholic missionaries and initiating a suppression of native groups like Yiguandao. Policy towards religion appeared to soften in the 1990s, and yet remained highly vigilant towards any hint of millenarianism or religious sedition. Even knowing this, few observers were prepared for the sheer brutality of the 1999 campaign against Falungong (Dharma Wheel Practice).
No evidence-based therapy for borderline personality disorder (BPD) exhibits a clear superiority. However, BPD is highly heterogeneous, and different patients may specifically benefit from the interventions of a particular treatment.
From a randomized trial comparing a year of dialectical behavior therapy (DBT) to general psychiatric management (GPM) for BPD, long-term (2-year-post) outcome data and patient baseline variables (n = 156) were used to examine individual and combined patient-level moderators of differential treatment response. A two-step bootstrapped and partially cross-validated moderator identification process was employed for 20 baseline variables. For identified moderators, 10-fold bootstrapped cross-validated models estimated response to each therapy, and long-term outcomes were compared for patients randomized to their model-predicted optimal v. non-optimal treatment.
Significant moderators surviving the two-step process included psychiatric symptom severity, BPD impulsivity symptoms (both GPM > DBT), dependent personality traits, childhood emotional abuse, and social adjustment (all DBT > GPM). Patients randomized to their model-predicted optimal treatment had significantly better long-term outcomes (d = 0.36, p = 0.028), especially if the model had a relatively stronger (top 60%) prediction for that patient (d = 0.61, p = 0.004). Among patients with a stronger prediction, this advantage held even when applying a conservative statistical check (d = 0.46, p = 0.043).
Patient characteristics influence the degree to which they respond to two treatments for BPD. Combining information from multiple moderators may help inform providers and patients as to which treatment is the most likely to lead to long-term symptom relief. Further research on personalized medicine in BPD is needed.
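The model-predicted optimal treatment idea above can be sketched schematically. The moderator weights below are invented for illustration and are not the coefficients estimated in the study; the point is only the mechanics of comparing predicted responses under each treatment and measuring prediction strength.

```python
# Schematic sketch: per-treatment outcome models over baseline moderators.
# Lower predicted symptom score = better predicted response. All weights are
# hypothetical, chosen only to mirror the direction of the reported moderators.
MODERATORS = ["symptom_severity", "impulsivity", "dependent_traits",
              "emotional_abuse", "social_adjustment"]

WEIGHTS = {
    "DBT": {"symptom_severity": 0.5, "impulsivity": 0.4,
            "dependent_traits": -0.3, "emotional_abuse": -0.2,
            "social_adjustment": -0.1},
    "GPM": {"symptom_severity": 0.2, "impulsivity": 0.1,
            "dependent_traits": 0.3, "emotional_abuse": 0.2,
            "social_adjustment": 0.3},
}

def predicted_outcome(patient, treatment):
    """Predicted long-term symptom score (lower is better) under a treatment."""
    w = WEIGHTS[treatment]
    return sum(w[m] * patient[m] for m in MODERATORS)

def optimal_treatment(patient):
    """Treatment with the lower predicted score, plus prediction strength
    (absolute gap between the two predictions; larger = stronger prediction)."""
    d = predicted_outcome(patient, "DBT")
    g = predicted_outcome(patient, "GPM")
    return ("DBT" if d < g else "GPM"), abs(d - g)

patient = {"symptom_severity": 2.0, "impulsivity": 1.5,
           "dependent_traits": 0.5, "emotional_abuse": 1.0,
           "social_adjustment": 0.8}
best, strength = optimal_treatment(patient)
```

In the study, the analogous models were fit with bootstrapped cross-validation, and long-term outcomes were compared between patients randomized to their model-optimal versus non-optimal arm.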
Methionine, an essential sulphur-containing amino acid (SAA), plays an integral role in many metabolic processes. Evidence for the methionine requirements of adult dogs is limited, and we employed the indicator amino acid oxidation (IAAO) method to estimate dietary methionine requirements in Labrador retrievers (n 21). Using semi-purified diets, the mean requirement was 0·55 (95 % CI 0·41, 0·71) g/4184 kJ. In a subsequent parallel design study, three groups of adult Labrador retrievers (n 52) were fed semi-purified diets with 0·55 g/4184 kJ (test diet 1), 0·71 g/4184 kJ (test diet 2) or 1·37 g/4184 kJ (control diet) of methionine for 32 weeks to assess the long-term consequences of feeding. The total SAA content (2·68 g/4184 kJ) was maintained through dietary supplementation of cystine. Plasma methionine did not decrease in either test group and increased significantly on test diet 1 at weeks 8 and 16 compared with control. Reducing dietary methionine did not have a significant effect on whole blood, plasma or urinary taurine or plasma N-terminal pro B-type natriuretic peptide. Significant effects in both test diets were observed for cholesterol, betaine and dimethylglycine. In conclusion, feeding methionine at the IAAO-estimated mean was sufficient to maintain plasma methionine over 32 weeks when total SAA was maintained. However, choline oxidation may have increased to support plasma methionine, with additional consequences for lipid metabolism. While the IAAO can be employed to assess essential amino acid requirements, such as methionine in the dog using semi-purified diets, further work is required to establish safe levels for commercial diet formats.
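Requirement estimates from IAAO data are typically obtained with a breakpoint ("broken-stick") model: oxidation of the indicator amino acid falls as test amino acid intake rises, then plateaus, and the join point estimates the requirement. A minimal grid-search sketch with invented data (not the study's analysis code):

```python
# Broken-stick breakpoint sketch for IAAO-style data (hypothetical values).
def broken_stick_breakpoint(intakes, oxidation, grid):
    """Grid-search the breakpoint minimizing SSE of a slope-then-plateau fit."""
    best = None
    for bp in grid:
        below = [(x, y) for x, y in zip(intakes, oxidation) if x < bp]
        above = [y for x, y in zip(intakes, oxidation) if x >= bp]
        if len(below) < 2 or not above:
            continue
        # Plateau level: mean oxidation at/above the candidate breakpoint.
        plateau = sum(above) / len(above)
        # Simple least-squares line through the points below the breakpoint.
        n = len(below)
        mx = sum(x for x, _ in below) / n
        my = sum(y for _, y in below) / n
        sxx = sum((x - mx) ** 2 for x, _ in below)
        slope = sum((x - mx) * (y - my) for x, y in below) / sxx
        sse = sum((y - (my + slope * (x - mx))) ** 2 for x, y in below)
        sse += sum((y - plateau) ** 2 for y in above)
        if best is None or sse < best[1]:
            best = (bp, sse)
    return best[0]

# Invented oxidation data that declines to a plateau near 0.55 g/4184 kJ.
intakes = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
oxidation = [9.0, 8.0, 7.0, 6.0, 5.0, 4.5, 4.5, 4.5]
requirement = broken_stick_breakpoint(intakes, oxidation, [0.35, 0.45, 0.55, 0.65])
```

Formal analyses fit both segments jointly with nonlinear regression and report a confidence interval for the breakpoint, as in the 95% CI quoted above.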
Residential mobility during upbringing, and especially adolescence, is associated with multiple negative mental health outcomes. However, whether associations are confounded by unmeasured familial factors, including genetic liability, is unclear.
We used a population-based case–cohort study to assess whether polygenic risk scores (PRSs) for schizophrenia, bipolar disorder and major depression were associated with mobility from ages 10–14 years, and whether PRS and parental history of mental disorder together explained associations between mobility and each disorder.
Information on cases (n = 4207 schizophrenia, n = 1402 bipolar disorder, n = 18 215 major depression) and a random population sample (n = 17 582), born 1981–1997, was linked between Danish civil and psychiatric registries. Genome-wide data were obtained from the Danish Neonatal Screening Biobank and PRSs were calculated based on results of separate, large meta-analyses.
PRSs for schizophrenia and major depression were weakly associated with moving once (odds ratio 1.07, 95% CI 1.00–1.16; and odds ratio 1.10, 95% CI 1.04–1.17, respectively), but not with moving twice or three or more times. Mobility was positively associated with each disorder, with more moves associated with greater risk. Adjustment for PRS produced slight reductions in the magnitude of associations. Adjustment for PRS and parental history of mental disorder together reduced estimates by 5–11%. In fully adjusted models, mobility was associated with all three disorders; hazard ratios ranged from 1.33 (95% CI 1.08–1.62; one move and bipolar disorder) to 3.05 (95% CI 1.92–4.86; three or more moves and bipolar disorder).
Associations of mobility with schizophrenia, bipolar disorder and depression do not appear to be attributable to genetic liability as measured here. Potential familial confounding of mobility associations may be predominantly environmental in nature.
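Odds ratios with Wald 95% confidence intervals, like those reported for the PRS associations above, are computed from a 2×2 table as follows. The counts in the usage example are invented for illustration, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Wald 95% CI from 2x2 counts:
       a = exposed cases, b = unexposed cases,
       c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the cell counts.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Invented counts: 30 movers among 400 cases vs. 250 movers among 4000 controls.
or_, lo, hi = odds_ratio_ci(30, 370, 250, 3750)
```

The study's estimates additionally adjust for covariates, which requires logistic regression rather than a raw 2×2 table, but the interpretation of the OR and CI is the same.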