Background: Saccade and pupil responses are potential neurodegenerative disease biomarkers due to overlap between oculomotor circuitry and disease-affected areas. Instruction-based tasks have previously been examined as biomarker sources, but they are arduous for patients with limited cognitive abilities; additionally, few studies have evaluated multiple neurodegenerative pathologies concurrently. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with Alzheimer’s disease (AD), mild cognitive impairment (MCI), amyotrophic lateral sclerosis (ALS), frontotemporal dementia, progressive supranuclear palsy, or Parkinson’s disease (PD). Patients (n=274, age 40-86) and healthy controls (n=101, age 55-86) viewed 10 minutes of frequently changing video clips without instruction while their eyes were tracked. We evaluated differences in saccade and pupil parameters (e.g. saccade frequency and amplitude, pupil size, responses to clip changes) between groups. Results: Preliminary data indicate low-level behavioural alterations in multiple disease cohorts: increased centre bias, lower overall saccade rate, and reduced saccade amplitude. After clip changes, patient groups generally demonstrated lower saccade rates but higher microsaccade rates, to varying degrees. Additionally, pupil responses were blunted (AD, MCI, ALS) or exaggerated (PD). Conclusions: This task may generate behavioural biomarkers even in cognitively impaired populations. Future work should explore the possible effects of factors such as medication and disease stage.
Background: Eye movements reveal neurodegenerative disease processes due to overlap between oculomotor circuitry and disease-affected areas. Characterizing oculomotor behaviour in the context of cognitive function may enhance disease diagnosis and monitoring. We therefore aimed to quantify cognitive impairment in neurodegenerative disease using saccade behaviour and neuropsychology. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with one of the following: Alzheimer’s disease, mild cognitive impairment, amyotrophic lateral sclerosis, frontotemporal dementia, Parkinson’s disease, or cerebrovascular disease. Patients (n=450, age 40-87) and healthy controls (n=149, age 42-87) completed a randomly interleaved pro- and anti-saccade task (IPAST) while their eyes were tracked. We explored the relationships of saccade parameters (e.g. task errors, reaction times) to one another and to cognitive domain-specific neuropsychological test scores (e.g. executive function, memory). Results: Task performance worsened with cognitive impairment across multiple diseases. Subsets of saccade parameters were interrelated and also differentially related to neuropsychology-based cognitive domain scores (e.g. antisaccade errors and reaction times were associated with executive function). Conclusions: IPAST detects global cognitive impairment across neurodegenerative diseases. Subsets of parameters associate with one another, suggesting disparate underlying circuitry, and with different cognitive domains. This may have implications for use of IPAST as a cognitive screening tool in neurodegenerative disease.
On March 11, 2020, the World Health Organization declared an outbreak of a new viral entity, coronavirus 2019 (COVID-19), to be a worldwide pandemic. The characteristics of this virus, as well as its short- and long-term implications, are not yet well understood. The objective of the current paper was to provide a critical review of the emerging literature on COVID-19 and its implications for neurological, neuropsychiatric, and cognitive functioning.
A critical review of recently published empirical research, case studies, and reviews pertaining to central nervous system (CNS) complications of COVID-19 was conducted by searching PubMed, PubMed Central, Google Scholar, and bioRxiv.
After considering the available literature, areas thought to be most pertinent to clinical and research neuropsychologists, including CNS manifestations, neurologic symptoms/syndromes, neuroimaging, and potential long-term implications of COVID-19 infection, were reviewed.
Once thought to be merely a respiratory virus, COVID-19 is now recognized by the scientific and medical communities to have broader effects on the renal, vascular, and neurological body systems. The question of cognitive deficits is not yet well studied, but neuropsychologists will undoubtedly play an important role in the years to come.
Our objective was to compare patterns of dental antibiotic prescribing in Australia, England, and North America (United States and British Columbia, Canada).
Population-level analysis of antibiotic prescription.
Outpatient prescribing by dentists in 2017.
Patients receiving an antibiotic dispensed by an outpatient pharmacy.
Population-adjusted prescription rates were compared overall and by antibiotic class. Contingency table analyses assessed differences in the proportion of each antibiotic class by country.
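The contingency-table comparison described above can be sketched in a few lines. The counts below are hypothetical illustrations, not the study's data; a Pearson chi-square statistic tests whether antibiotic-class proportions differ by country.

```python
# Pearson chi-square test of independence for a two-way contingency
# table (rows = countries, columns = antibiotic classes).
# All counts are hypothetical illustrations, not the study's data.

def chi_square_independence(table):
    """Return (statistic, degrees of freedom) for an r x c count table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical counts: penicillin-class vs. other prescriptions
# in two countries.
table = [[700, 300],   # country A
         [550, 450]]   # country B
stat, df = chi_square_independence(table)
```

The statistic is compared against a chi-square critical value (3.84 at df=1, alpha=0.05); in practice a library routine such as `scipy.stats.chi2_contingency` would also supply the p-value.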
In 2017, dentists in the United States had the highest antibiotic prescribing rate per 1,000 population and Australia had the lowest rate. The penicillin class, particularly amoxicillin, was the most frequently prescribed in all countries. The second most common agents prescribed were clindamycin in the United States and British Columbia (Canada) and metronidazole in Australia and England. Prescribing of the broad-spectrum agents amoxicillin-clavulanic acid and azithromycin was highest in Australia and the United States, respectively.
Marked differences exist in the antibiotics prescribed by dentists in Australia, England, the United States, and British Columbia. The United States had twice the antibiotic prescription rate of Australia, and the second most frequently prescribed antibiotic in the US was clindamycin. Significant opportunities exist for the global dental community to update its prescribing behavior relating to second-line agents for penicillin-allergic patients and to contribute to international efforts addressing antibiotic resistance. Patient safety improvements will result from optimizing dental antibiotic prescribing, especially for antibiotics associated with resistance (broad-spectrum agents) or C. difficile (clindamycin). Dental antibiotic stewardship programs are urgently needed worldwide.
Residual depressive symptoms are generally documented as a risk factor for recurrence. In the absence of a specific instrument for the assessment of residual symptoms, a new 25-item Depression Residual Symptom Scale (DRSS) was elaborated and tested for recurrence prediction over a 1-year follow-up.
Sampling and methods
Fifty-nine patients in remission after a major depressive episode (MDE) were recruited in two centres. They were assessed with the DRSS and the Montgomery-Asberg Depression Rating Scale (MADRS) at inclusion and followed for 1 year according to a seminaturalistic design. The DRSS included specific depressive symptoms and subjective symptoms of vulnerability, lack of return to usual self and premorbid level of functioning.
Severity of residual symptoms was not significantly associated with increased risk of recurrence. However, DRSS scores were significantly higher among patients with three or more previous episodes than among those with one or two episodes. Number of previous episodes and treatment interruption were not identified as significant predictors of recurrence.
The proposed instrument is not predictive of depressive recurrence, but is sensitive to increased perception of vulnerability associated with consecutive episodes. Limitations include small sample size, seminaturalistic design (no standardisation of treatment) and content of the instrument.
We review some of our recent results about the Radial Acceleration Relation (RAR) and its interpretation as either a fundamental or an emergent law. The former interpretation is in agreement with a class of modified gravity theories that dismiss the need for dark matter in galaxies (MOND in particular). Our most recent analysis, which includes refinements on the priors and the Bayesian test for compatibility between the posteriors, confirms that the hypothesis of a fundamental RAR is rejected at more than 5σ from the very same data that was used to infer the RAR.
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploring other commonly used methodologies for neurocognitive impairment (NCI) in HIV including the global deficit score (GDS). This study compares NCI classifications by Frascati, Meyer, and GDS methods, in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati criteria versus Meyer criteria: concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were formed using GDS criteria in place of Frascati criteria.
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
We describe the case of an 11-month-old girl with a rare cerebellar glioblastoma driven by a NACC2-NTRK2 (Nucleus Accumbens Associated Protein 2-Neurotrophic Receptor Tyrosine Kinase 2) fusion. Initial workup of our case demonstrated homozygous CDKN2A deletion, but immunohistochemistry for other driver mutations, including IDH1 R132H, BRAF V600E, and H3F3A K27M, was negative, and ATRX was retained. Tissue was subsequently submitted for personalized oncogenomic analysis, including whole genome and whole transcriptome sequencing, which demonstrated an activating NTRK2 fusion, as well as high PD-L1 expression, which was subsequently confirmed by immunohistochemistry. Furthermore, H3 and IDH demonstrated wildtype status. These findings suggested the possibility of treatment with either NTRK or immune checkpoint inhibitors through active clinical trials. Ultimately, the family pursued standard treatment that involved Head Start III chemotherapy and proton radiotherapy. Notably, at most recent follow-up, approximately two years from initial diagnosis, the patient is in disease remission and thriving, suggesting favorable biology despite histologic malignancy. This case illustrates the value of personalized oncogenomics, as the molecular profiling revealed two actionable changes that would not have been apparent through routine diagnostics. NTRK fusions are known oncogenic drivers in a range of cancer types, but this is the first report of a NACC2-NTRK2 fusion in a glioblastoma.
This presentation will enable the learner to:
1. Explore the current molecular landscape of pediatric high grade gliomas
2. Recognize the value of personalized oncogenomic analysis, particularly in rare and/or aggressive tumors
3. Discuss the current status of NTRK inhibitor clinical trials
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group, age: M=42.65 years, SD=8.93; 86% male; education: M=13.17, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education, and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (54% vs. 42%), and domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60; 71%) versus Mexican (n=79, 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry.
Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
A model for determination of economic thresholds, or minimum weed population densities justifying the use of postemergence herbicide treatment, for five weed species in soybeans [Glycine max (L.) Merr.] is presented. Sensitivity analysis was performed on the model with respect to economic, statistical, and agronomic variables. The model was refined to include uncertainty about lost field days during the spraying period. Predictions from both the simple and refined models were consistent with economic theory. It was also determined that the economic threshold is sensitive to choice of data-collection ranges and functional form in weed-interference studies.
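The core threshold logic described above can be illustrated with a minimal sketch. Assuming a linear yield-loss function (a simplification; the study examines the sensitivity of the threshold to functional form), the break-even weed density is the density at which the value of yield loss avoided by spraying equals the treatment cost. All parameter values below are hypothetical.

```python
# Break-even economic threshold under a linear yield-loss assumption:
# treat when expected revenue lost to weeds exceeds treatment cost.
# All parameter values are hypothetical illustrations.

def economic_threshold(treatment_cost, crop_price, weed_free_yield,
                       loss_fraction_per_weed, efficacy=1.0):
    """Weed density (plants per unit area) at which treatment breaks even.

    treatment_cost         -- herbicide + application cost per acre ($)
    crop_price             -- crop price per bushel ($)
    weed_free_yield        -- weed-free yield per acre (bu)
    loss_fraction_per_weed -- proportional yield loss per weed per unit area
    efficacy               -- fraction of weed-induced loss the treatment avoids
    """
    value_saved_per_weed = (crop_price * weed_free_yield
                            * loss_fraction_per_weed * efficacy)
    return treatment_cost / value_saved_per_weed

# Example: $12/acre treatment, $6/bu soybeans, 40 bu/acre weed-free
# yield, 0.5% yield loss per weed, fully effective herbicide.
threshold = economic_threshold(12.0, 6.0, 40.0, 0.005)
```

Densities above the returned threshold justify postemergence treatment; the sensitivity analyses in the study correspond to perturbing these inputs (price, cost, efficacy) and the loss function itself.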
Many studies on the adoption of precision technologies have used logit models to explain the adoption behavior of individuals. This study investigates factors affecting the intensity of precision agriculture technology adoption by cotton farmers. Particular attention is given to the role of spatial yield variability in the number of precision farming technologies adopted, using a count data estimation procedure and farm-level data. Results indicate that farmers with more within-field yield variability adopted a higher number of precision agriculture technologies. Being younger and better educated was also significantly correlated with the number of precision agriculture technologies adopted. Finally, farmers using computers for management decisions likewise adopted a higher number of precision agriculture technologies.
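A count-data model of the kind described (number of technologies adopted as a function of farm characteristics) can be sketched as a one-covariate Poisson regression fit by Newton's method. The data below are synthetic, and a real analysis would include more covariates and use a library such as statsmodels; this is only a sketch of the estimation principle.

```python
import math

# Minimal Poisson regression (count outcome, one covariate) fit by
# Newton's method; a sketch of count-data estimation, not the study's model.

def fit_poisson(x, y, iterations=100):
    """Return (b0, b1) maximizing the Poisson log-likelihood for
    y_i ~ Poisson(exp(b0 + b1 * x_i))."""
    b0, b1 = 0.0, 0.0
    for _ in range(iterations):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # Score vector and observed information (2 x 2), inverted by hand.
        g0 = sum(yi - mi for yi, mi in zip(y, mu))
        g1 = sum(xi * (yi - mi) for xi, yi, mi in zip(x, y, mu))
        h00 = sum(mu)
        h01 = sum(xi * mi for xi, mi in zip(x, mu))
        h11 = sum(xi * xi * mi for xi, mi in zip(x, mu))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Synthetic example: within-field yield variability index vs. number
# of precision technologies adopted (hypothetical values).
variability = [0, 1, 2, 3, 4, 5]
n_adopted   = [2, 2, 3, 4, 5, 7]
b0, b1 = fit_poisson(variability, n_adopted)
```

A positive b1 corresponds to the reported finding that greater yield variability is associated with more technologies adopted; exp(b1) is the multiplicative effect of a one-unit variability increase on the expected count.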
This study identified the factors that influenced whether farmers in the Southeastern United States perceived an improvement in environmental quality from adopting precision farming technologies (PFTs). Farmers with larger farms or higher yields were more likely to believe that they observed positive externalities associated with PFTs. Farmers who found PFTs profitable or who believed input reduction was important had higher probabilities of perceiving such benefits, whereas those with higher incomes or who were more dependent on farm income were less likely to perceive them. Interestingly, the importance placed on environmental quality and the length of time using PFTs were not found to affect the probability of perceiving an improvement in environmental quality.
Ray et al. have examined the national agricultural sector impacts of the 1996 FAIR Act using a stochastic simulation model based on the Policy Analysis System (POLYSYS). The model outcomes are predictions of various economic measures, including the coefficient of variation of net returns for corn, wheat, soybeans, and cotton. Knutson et al. then use the results of the Ray et al. simulations to predict the distribution of net farm income over the next 10 years for several types of representative southern farms. Any attempt to measure the changes in riskiness in southern agriculture is commendable—even heroic—and I applaud these efforts, both the national modeling effort by Ray et al. and the application to southern representative farms by Knutson et al. This task has a lot in common, I think, with trying to “tease out” the temperature changes associated with the greenhouse effect. The number of factors and mechanisms at work is mind-boggling. It's tough enough to try to get a handle on what might happen to the first moment of net farm income over the next 10 years, let alone the second.
Probit analysis identified factors that influence the adoption of precision farming technologies by Southeastern cotton farmers. Younger, more educated farmers who operated larger farms and were optimistic about the future of precision farming were most likely to adopt site-specific information technology. The probability of adopting variable-rate input application technology was higher for younger farmers who operated larger farms, owned more of the land they farmed, were more informed about the costs and benefits of precision farming, and were optimistic about the future of precision farming. Computer use was not important, possibly because custom hiring shifts the burden of computer use to agribusiness firms.
Binary logit analysis was used to identify the factors influencing adoption of Global Positioning System (GPS) guidance systems by cotton farmers in 11 Mid-south and Southeastern states. Results indicate that adoption was more likely by those who had already adopted other precision-farming practices and had used computers for farm management. In addition, younger and more affluent farmers were more likely to adopt. Farmers with larger farms and with relatively high yields were also more likely to adopt. Education was not a significant factor in a farmer's decision to adopt GPS guidance systems.
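The binary logit models used in these adoption studies reduce, in the simplest case of a single binary predictor, to a closed form: the logit slope equals the log odds ratio from the 2x2 adoption table. The sketch below uses hypothetical counts (computer use vs. GPS-guidance adoption), not the study's data.

```python
import math

# Logit coefficients from a 2x2 table: with one binary covariate, the
# maximum-likelihood logit slope equals the log odds ratio.
# All counts are hypothetical illustrations, not the study's data.

def logit_from_2x2(adopt_exposed, no_exposed, adopt_unexposed, no_unexposed):
    """Return (intercept, slope) of the logit model
    P(adopt) = 1 / (1 + exp(-(b0 + b1 * exposed)))."""
    b0 = math.log(adopt_unexposed / no_unexposed)      # log odds, unexposed group
    b1 = math.log((adopt_exposed / no_exposed)
                  / (adopt_unexposed / no_unexposed))  # log odds ratio
    return b0, b1

# Hypothetical: 30 of 50 computer users adopted GPS guidance,
# versus 15 of 50 non-users.
b0, b1 = logit_from_2x2(30, 20, 15, 35)
odds_ratio = math.exp(b1)  # adoption odds multiplier for computer users
```

With several covariates there is no closed form and the coefficients are found by maximum likelihood, but the interpretation is the same: exp(b1) is the change in adoption odds associated with the covariate, holding the others fixed.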
This article investigates how information from cotton yield monitors influences the perceptions of within-field yield variability of cotton producers. Using yield distribution modeling techniques and survey data from cotton producers in 11 southeastern states, we find that cotton farmers who responded to the survey tend to underestimate within-field yield variability (by approximately 5-18%) when not using site-specific yield monitor information. Results further indicate that surveyed cotton farmers who responded to a specific question about yield monitors place a value of approximately $20/acre/year (on average) on the additional information about within-field yield variability that the yield monitor technology provides.
Personal digital assistants (PDA) and handheld global positioning systems (GPS) have become increasingly important in cotton production but little is known about their use. This research analyzed the adoption of PDA/handheld GPS devices in cotton production. A younger farmer who used a computer in farm management and had a positive perception of Extension had a greater likelihood of adopting the devices. In addition, farmers who used complementary remote sensing, plant mapping, and grid soil sampling information were more likely to use PDA/handheld GPS devices. Finally, the COTMAN in-field decision support program from Extension also positively impacted adoption.