Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and nondrug addictive behaviors arising from a dopamine deficiency (“hypodopaminergia”). There is an opioid-overdose epidemic in the USA, which may result in or worsen RDS. A paradigm shift is needed to combat a treatment system that is not working. This shift involves the recognition of dopamine homeostasis as the ultimate treatment goal for RDS, achieved via precision, genetically guided KB220 variants, an approach called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in the future DSM-6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
Turbulent fluxes make a substantial and growing contribution to the energy balance of ice surfaces globally, but are poorly constrained owing to challenges in estimating the aerodynamic roughness length (z0). Here, we used structure from motion (SfM) photogrammetry and terrestrial laser scanning (TLS) surveys to make plot-scale 2-D and 3-D microtopographic estimations of z0 and upscale these to map z0 across an ablating mountain glacier. At plot scales, we found spatial variability in z0 estimates of over two orders of magnitude with unpredictable z0 trajectories, even when classified into ice surface types. TLS-derived surface roughness exhibited strong relationships with plot-scale SfM z0 estimates. At the glacier scale, a consistent increase in z0 of ~0.1 mm d⁻¹ was observed. Space-for-time substitution based on time since surface ice was exposed by snow melt confirmed this gradual increase in z0 over 60 d. These measurements permit us to propose a scale-dependent temporal z0 evolution model where unpredictable variability at the plot scale gives way to more predictable changes of z0 at the glacier scale. This model provides a critical step towards deriving spatially and temporally distributed representations of z0 that are currently lacking in the parameterisation of distributed glacier surface energy balance models.
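Microtopographic z0 estimates of the kind described above are commonly derived from detrended elevation profiles. As an illustration only, and not the study's actual 2-D/3-D algorithm (which the abstract does not specify), here is a minimal sketch following Lettau's (1969) relation z0 ≈ 0.5·h·s/S, with the effective obstacle height taken as twice the standard deviation of the detrended profile; the helper `lettau_z0` and its synthetic inputs are hypothetical:

```python
import numpy as np

def lettau_z0(elev, dx):
    """Estimate aerodynamic roughness length z0 (units of `elev`) from a
    1-D microtopographic elevation profile sampled at spacing `dx`.
    Uses Lettau (1969): z0 = 0.5 * h * s / S, where h is the effective
    obstacle height, s the upwind silhouette area, and S the plot area
    (both per unit transect width). Hypothetical helper -- the study's
    actual plot-scale algorithm is not given in the abstract."""
    x = np.arange(elev.size) * dx
    detrended = elev - np.polyval(np.polyfit(x, elev, 1), x)  # remove linear trend
    h = 2.0 * detrended.std()            # effective obstacle height ~ 2*sigma
    rises = np.diff(detrended)
    s = rises[rises > 0].sum()           # silhouette area per unit width
    S = detrended.size * dx              # plot area per unit width
    return 0.5 * h * s / S

# Synthetic ~5 cm-amplitude ice hummocks sampled every 1 cm over 2 m:
x = np.arange(0.0, 2.0, 0.01)
z0 = lettau_z0(0.05 * np.sin(2 * np.pi * x), 0.01)   # z0 in metres
```

Because plot-scale z0 varies over two orders of magnitude (as the abstract reports), any single-profile estimate of this kind is sensitive to the choice of transect and detrending.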
We implemented universal severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing of patients undergoing surgical procedures as a means to conserve personal protective equipment (PPE). The rate of asymptomatic coronavirus disease 2019 (COVID-19) was <0.5%, which suggests that early local public health interventions were successful. Although our protocol was resource intensive, it prevented exposures to healthcare team members.
OBJECTIVES/GOALS: Decision-making impairments in addiction can arise from dysfunction in distinct neural circuits. Such processes can be dissociated by measuring complex, computationally distinct behaviors within an economic framework. We aim to characterize computational changes conserved across models of addiction. METHODS/STUDY POPULATION: We used neuroeconomic tasks capable of dissociating neurally separable decision processes using behavioral analyses equally applicable to humans and rodents. We tested 12 human cocaine-users and 9 healthy controls on the Web-Surf task designed to match the rodent Restaurant Row task on which 27 mice were trained and then exposed to saline (n = 10), cocaine (n = 7), or morphine (n = 10). All subjects foraged for rewards (humans: entertaining videos; mice: food) of varying costs (1–30 s delays) and subjective value (humans: genres; mice: flavors) by making serial accept or reject decisions while on a limited time budget, balancing the utility of wanting desirable rewards despite conflicting costs. RESULTS/ANTICIPATED RESULTS: When encountering unique offers for rewards with a delay above one’s willingness to wait, cocaine-treated mice, like cocaine-exposed humans, were less likely to appropriately reject economically disadvantageous offers. Furthermore, these mice and humans did so despite spending more time deliberating between future options. In contrast, morphine-treated mice displayed distinct impairments when given the opportunity to correct past mistakes, a process we previously demonstrated was uniquely sensitive to alterations in strength of synaptic connectivity of the infralimbic-accumbens shell circuit in mice. We anticipate human opioid-users will mirror these latter, computationally distinct findings. DISCUSSION/SIGNIFICANCE OF IMPACT: These data elucidate facets of addiction shared across species yet fundamentally distinct between disease subtypes.
Our translational approach can help shed light on conserved pathophysiological mechanisms in order to identify novel diagnostic parameters and computational targets for intervention.
The aging of any biological system results in quantifiable change which may affect the output of the system in subtle or substantial ways. Human cognitive aging is no exception and the manner in which the system, in this case the brain, is able to withstand and/or adapt to the effects of age-related physiological change will determine the individual cognitive trajectory. In this chapter, we review the emerging field of blood biomarkers of cognitive aging with a focus on specific metabolic pathways implicated in cognitive health including cellular energetics, lipid metabolism, the maintenance of redox state, and inflammation. Challenges to blood biomarker development, including methodological and inferential limitations, are also reviewed. Ultimately, blood biomarkers of age-related neurodegenerative disease and cognitive success will provide clues for how we might all age successfully, reducing health care burden on societies and improving quality of life for individuals.
In much of Europe, the advent of low-input cereal farming regimes between c. AD 800 and 1200 enabled landowners—lords—to amass wealth by greatly expanding the amount of land under cultivation and exploiting the labour of others. Scientific analysis of plant remains and animal bones from archaeological contexts is generating the first direct evidence for the development of such low-input regimes. This article outlines the methods used by the FeedSax project to resolve key questions regarding the ‘cerealization’ of the medieval countryside and presents preliminary results using the town of Stafford as a worked example. These indicate an increase in the scale of cultivation in the Mid-Saxon period, while the Late Saxon period saw a shift to a low-input cultivation regime and probably an expansion onto heavier soils. Crop rotation appears to have been practised from at least the mid-tenth century.
Analysis of human remains and a copper band found in the center of a Late Archaic (ca. 5000–3000 cal BP) shell ring demonstrate an exchange network between the Great Lakes and the coastal southeast United States. Similarities in mortuary practices suggest that the movement of objects between these two regions was more direct and unmediated than archaeologists previously assumed based on “down-the-line” models of exchange. These findings challenge prevalent notions that view preagricultural Native American communities as relatively isolated from one another and suggest instead that wide social networks spanned much of North America thousands of years before the advent of domestication.
The Centers for Disease Control and Prevention developed 15 National Public Health Emergency and Preparedness Response Capabilities (NPHPRCs) to serve as national standards for health-related core capabilities. The objective of this study is to determine the level of federal funding allocated for research related to NPHPRCs during 2008–2017.
An online search of http://www.USAspending.gov was performed to identify federal awards, grants, and contracts from 2008 to 2017 related to research associated with NPHPRCs. Inclusion criteria were: identifiable as research and disaster-related; US-based; and specific reference to any of the NPHPRCs. A panel of 3 experts reviewed each entry for inclusion.
The search identified 15 278 transactions representing US $29.2 billion in awards. After exclusions, 93 entries were found to be related to NPHPRCs, averaging US $2 783 136 annually. Funding notably dropped to US $168 684 in 2010 and ceased entirely in 2016. Ten (67%) of the NPHPRCs received funding. Eighty percent of funding focused on 4 capabilities. Three federal agencies funded 80% of research. Sixteen (34%) of the 47 recipients received 80% of all funding.
US federal investments in research and development related to NPHPRCs have been highly variable over the past decade. One-third of NPHPRCs receive no funding. There are notable gaps in funding, content, continuity, and scope of participation.
The major facilitator superfamily domain 2a protein was identified recently as a lysophosphatidylcholine (LPC) symporter with high affinity for LPC species enriched with DHA (LPC-DHA). To test the hypothesis that reproductive state and choline intake influence plasma LPC-DHA, we performed a post hoc analysis of samples available through 10 weeks of a previously conducted feeding study, which provided two doses of choline (480 and 930 mg/d) to non-pregnant (n 21), third-trimester pregnant (n 26), and lactating (n 24) women; all participants consumed 200 mg of supplemental DHA and 22 % of their daily choline intake as 2H-labelled choline. The effects of reproductive state and choline intake on total LPC-DHA (expressed as a percentage of LPC) and plasma enrichments of labelled LPC and LPC-DHA were assessed using mixed and generalised linear models. Reproductive state interacted with time (P = 0·001) to influence total LPC-DHA, which significantly increased by week 10 in non-pregnant women, but not in pregnant or lactating women. Contrary to total LPC-DHA, patterns of labelled LPC-DHA enrichments were discordant between pregnant and lactating women (P < 0·05), suggestive of unique, reproductive state-specific mechanisms that result in reduced production and/or enhanced clearance of LPC-DHA during pregnancy and lactation. Regardless of the reproductive state, women consuming 930 v. 480 mg choline per d exhibited no change in total LPC-DHA but higher d3-LPC-DHA (P = 0·02), indicating that higher choline intakes favour the production of LPC-DHA from the phosphatidylethanolamine N-methyltransferase pathway of phosphatidylcholine biosynthesis. Our results warrant further investigation into the effect of reproductive state and dietary choline on LPC-DHA dynamics and its contribution to DHA status.
Institutionally deprived young children often display distinctive patterns of attachment, classified as insecure/other (INS/OTH), with their adoptive parents. The associations between INS/OTH and developmental trajectories of mental health and neurodevelopmental symptoms were examined. Age 4 attachment status was determined for 97 Romanian adoptees exposed to up to 24 months of deprivation in Romanian orphanages and 49 nondeprived UK adoptees. Autism, inattention/overactivity and disinhibited-social-engagement symptoms, emotional problems, and IQ were measured at 4, 6, 11, and 15 years and in young adulthood. Romanian adoptees with over 6 months deprivation (Rom>6) were more often classified as INS/OTH than UK and Romanian adoptees with less than 6 months deprivation combined. INS/OTH was associated with cognitive impairment at age 4 years. The interaction between deprivation, attachment status, and age for autism spectrum disorder assessment was significant, with greater symptom persistence in Rom>6 INS/OTH(+) than other groups. This effect was reduced when IQ at age 4 was controlled for. Age 4 INS/OTH in Rom>6 was associated with worse autism spectrum disorder outcomes up to two decades later. Its association with cognitive impairment at age 4 is consistent with INS/OTH being an early marker of this negative developmental trajectory, rather than its cause.
We are developing the novel αIIbβ3 antagonist, RUC-4, for subcutaneously (SC)-administered first-point-of-medical-contact treatment for ST segment elevation myocardial infarction (STEMI).
We studied the (1) pharmacokinetics (PK) of RUC-4 at 1.0, 1.93, and 3.86 mg/kg intravenous (IV), intramuscular (IM), and SC in non-human primates (NHPs); (2) impact of aspirin on RUC-4 IC50 in human platelet-rich plasma (PRP); (3) effect of different anticoagulants on the RUC-4 IC50 in human PRP; and (4) relationship between αIIbβ3 receptor blockade by RUC-4 and inhibition of ADP-induced platelet aggregation.
(1) All doses of RUC-4 were well tolerated, but animals demonstrated variable temporary bruising. IM and SC RUC-4 reached dose-dependent peak levels within 5–15 minutes, with half-lives (T1/2) between 0.28 and 0.56 hours. Platelet aggregation studies in NHPs receiving IM RUC-4 demonstrated >80% inhibition of the initial slope of ADP-induced aggregation with all three doses 30 minutes post-dosing, with subsequent dose-dependent loss of inhibition over 4–5 hours. (2) The RUC-4 IC50 for ADP-induced platelet aggregation was unaffected by aspirin treatment (40±9 nM vs 37±5 nM; p = 0.39). (3) The RUC-4 IC50 was significantly higher in PRP prepared from D-phenylalanyl-prolyl-arginyl chloromethyl ketone (PPACK)-anticoagulated blood compared to citrate-anticoagulated blood using either thrombin receptor activating peptide (TRAP) (122±17 vs 66±25 nM; p = 0.05; n = 4) or ADP (102±22 vs 54±13 nM; p<0.001; n = 5). (4) There was a close correspondence between receptor blockade and inhibition of ADP-induced platelet aggregation, with aggregation inhibition beginning with ~40% receptor blockade and becoming nearly complete at >80% receptor blockade.
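The IC50 values compared above come from concentration–inhibition curves. As a rough illustration only (the abstract does not state the fitting procedure used), here is a minimal least-squares sketch that recovers an IC50 under a one-site inhibition model I(c) = 100·c/(c + IC50); the helper `fit_ic50` and the synthetic dose-response data are hypothetical:

```python
import numpy as np

def fit_ic50(conc_nM, pct_inhibition):
    """Grid-search least-squares fit of a one-site inhibition model
    I(c) = 100 * c / (c + IC50); returns IC50 in the units of `conc_nM`.
    Hypothetical helper -- the study's actual curve-fitting method is
    not stated in the abstract."""
    grid = np.logspace(np.log10(conc_nM.min()), np.log10(conc_nM.max()), 4000)
    sse = [np.sum((100.0 * conc_nM / (conc_nM + ic) - pct_inhibition) ** 2)
           for ic in grid]                      # sum of squared errors per candidate
    return grid[int(np.argmin(sse))]

# Synthetic dose-response with a true IC50 of 40 nM (cf. the 40 +/- 9 nM
# value reported for ADP-induced aggregation in citrated PRP):
conc = np.array([5.0, 10, 20, 40, 80, 160, 320])
inhib = 100.0 * conc / (conc + 40.0)
ic50 = fit_ic50(conc, inhib)   # close to 40 nM
```

A grid search is used here only to keep the sketch dependency-free and deterministic; a nonlinear least-squares fit of a four-parameter logistic would be the more conventional choice in practice.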
Based on these results and others, RUC-4 has now progressed to formal preclinical toxicology studies.
Disaster-related research funding in the United States has not been described. This study characterizes federal funding for disaster-related research for 5 professional disciplines: medicine, public health, social science, engineering, and emergency management.
An online keyword search was performed using the website www.USAspending.gov to identify federal awards, grants, and contracts during 2011–2016. A panel of experts then reviewed each entry for inclusion.
The search identified 9145 entries, of which 262 (3%) met inclusion criteria. Over 6 years, the Federal Government awarded US $69 325 130 for all disaster-related research. Total funding levels quadrupled in the first 3 years and then halved in the last 3 years. Half of the funding was for engineering, 3 times higher than social sciences and emergency management and 5 times higher than public health and medicine. Ten (11%) institutions received 52% of all funding. The search returned entries for only 12 of the 35 pre-identified disaster-related capabilities; 6 of 12 capabilities appear to have received no funding for at least 2 years.
US federal funding for disaster-related research was limited and highly variable during 2011–2016. There is no clear rationale for its apportionment, no apparent prioritization, and no evident strategy for aligning research with national disaster policies.
The objective of this study is to characterize US-based disaster training courses available to disaster response and disaster health professionals. Its purpose is to better inform policies and decision-making regarding workforce and professional development to improve performance.
Courses were identified from 4 inventories of courses: (1) National Library of Medicine Disaster Lit database; (2) TRAIN National Learning Network; (3) Federal Emergency Management Agency (FEMA) National Preparedness Course Catalog; and (4) Preparedness and Emergency Response Learning Centers. An online search used 30 disaster-related key words. Data included the course title, description, target audience, and delivery modality. Levels of learning, target capability, and function were categorized by 3 expert reviewers. Descriptive statistics were used.
There were 3662 trainings: 2380 (65%) for professionals (53% for public health); 83% of the courses were distance learning, with 16% via classroom. Half of all trainings focused on 3 of 37 disaster capabilities and 38% of them were related to chemical, biological, radiological, nuclear, and explosives (CBRNE). The educational approach was knowledge-based for all courses and 99.6% imparted only lower levels of learning.
Despite thousands of courses available, there remain significant gaps in target audience, subject matter content, educational approaches, and delivery modalities, particularly for health and public health professionals.
The early Middle Ages saw a major expansion of cereal cultivation across large parts of Europe thanks to the spread of open-field farming. A major project to trace this expansion in England by deploying a range of scientific methods is generating direct evidence for this so-called ‘Medieval Agricultural Revolution’.