Although the efficacy of endovascular thrombectomy (EVT) for acute ischemic stroke caused by intracranial anterior circulation large vessel occlusion (LVO) is proven, demonstration of local effectiveness is critical for health system planning and resource allocation because of the complexity and cost of this treatment.
Using our prospective registry, we identified all patients who underwent EVT for out-of-hospital LVO stroke from February 1, 2013 through January 31, 2017 (n = 44), and matched them 1:1 in a hierarchical fashion with control patients not treated with EVT based on age (±5 years), prehospital functional status, stroke syndrome, severity, and thrombolysis administration. Demographics, in-hospital mortality, discharge disposition from acute care, length of hospitalization, and functional status at discharge from acute care and at follow-up were compared between cases and controls.
For EVT-treated patients (median age 66, 50% women), the median onset-to-recanalization interval was 247 min, and successful recanalization was achieved in 30/44 (91%). Alteplase was administered in 75% of cases and 57% of controls (p = 0.07). In-hospital mortality was 11% among the cases and 36% in the control group (p = 0.006); this survival benefit persisted during follow-up (p = 0.014). More EVT patients were discharged home from acute care (50% vs. 18%, p = 0.002). Among survivors, there were nonsignificant trends in favor of EVT for median length of hospitalization (14 vs. 41 days, p = 0.11) and functional independence at follow-up (51% vs. 32%, p = 0.079).
EVT improved survival and decreased disability. This demonstration of single-center effectiveness may help facilitate expansion of EVT services in similar health-care jurisdictions.
Endovascular thrombectomy (EVT) is efficacious for ischemic stroke caused by proximal intracranial large-vessel occlusion involving the anterior cerebral circulation. However, evidence of its cost-effectiveness, especially in a real-world setting, is limited. We assessed whether EVT ± tissue plasminogen activator (tPA) was cost-effective when compared with standard care ± tPA at our center.
We identified patients treated with EVT ± tPA after the Endovascular treatment for Small Core and Anterior circulation Proximal occlusion with Emphasis on minimizing computed tomography to recanalization times (ESCAPE) trial from our prospective stroke registry from February 1, 2013 to January 31, 2017. Patients admitted before February 2013 and treated with standard care ± tPA constituted the controls. The sample size was 88. Cost-effectiveness was assessed using the net monetary benefit (NMB). Differences in average costs and quality-adjusted life years (QALYs) were estimated using the augmented inverse probability weighted estimator. We accounted for sampling and methodological uncertainty in sensitivity analyses.
Patients treated with EVT ± tPA had a net gain of 2.89 [95% confidence interval (CI): 0.93–4.99] QALYs at an additional cost of $22,200 (95% CI: −$28,902 to $78,244) per patient compared with the standard care ± tPA group. The NMB was $122,300 (95% CI: −$4777 to $253,133), with a 0.85 probability of being cost-effective. The expected savings to the healthcare system would amount to $321,334 per year.
EVT ± tPA had higher costs and higher QALYs compared with the control, and is likely to be cost-effective at a willingness-to-pay threshold of $50,000 per QALY.
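The point estimates above can be sanity-checked against the standard net monetary benefit formula, NMB = λ × ΔQALY − ΔCost, where λ is the willingness-to-pay threshold. A minimal sketch using only the figures reported here (not patient-level data):

```python
# Sanity check of the reported net monetary benefit (NMB) using the
# standard formula NMB = WTP * delta_QALY - delta_cost.
# Values are the point estimates quoted above, not patient-level data.
WTP = 50_000          # willingness-to-pay threshold, $ per QALY
delta_qaly = 2.89     # incremental QALYs, EVT +/- tPA vs. standard care
delta_cost = 22_200   # incremental cost per patient, $

nmb = WTP * delta_qaly - delta_cost
print(round(nmb))  # -> 122300, matching the reported NMB of $122,300
```

The reported $122,300 therefore follows directly from the incremental QALY and cost point estimates at the stated $50,000/QALY threshold.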
Over recent decades, biomass gains in the remaining old-growth forests of Amazonia have declined due to environmental change. Amazonia’s huge size and complexity make understanding these changes, their drivers, and their consequences very challenging. Here, using a network of permanent monitoring plots at the Amazon–Cerrado transition, we quantify recent biomass carbon changes and explore their environmental drivers. Our study area covers 30 plots of upland and riparian forests sampled at least twice between 1996 and 2016 and subject to various levels of fire and drought. Using these plots, we aimed to: (1) estimate the long-term biomass change rate; (2) determine the extent to which forest changes are influenced by forest type; and (3) assess the threat to forests from ongoing environmental change. Overall, there was no net change in biomass, but there was clear variation among forest types. Burning occurred at least once in 8 of the 12 riparian forests, while only 1 of the 18 upland forests burned, resulting in losses of carbon in the burned riparian forests. Net biomass gains prevailed among the other riparian and upland forests, as observed throughout Amazonia. Our results reveal an unanticipated vulnerability of riparian forests to fire, likely aggravated by drought, which threatens ecosystem conservation at the southern margins of Amazonia.
Maternal insufficiency during fetal development can have long-lasting effects on offspring, most notably on nephron endowment. In polycystic kidney disease (PKD), severity of disease varies, and the maternal environment may be a modifying factor. In this study, we first established that in a rodent model of PKD, the Lewis polycystic kidney (LPK) rat, nephron number is 25% lower compared with wild-type animals. We then investigated the effects of the prenatal and postnatal maternal environment on phenotype and nephron number. LPK pups born from and raised by homozygous LPK dams (control) were compared with: LPK pups cross-fostered onto heterozygous LPK dams, to improve the postnatal environment; LPK pups born from and raised by heterozygous LPK dams, to improve both the prenatal and postnatal environment; and LPK pups born from and raised by Wistar Kyoto-LPK heterozygous dams, to improve both the prenatal and postnatal environment on a different genetic background. Improving both the prenatal and postnatal environment improved postnatal growth and renal function and reduced blood pressure, most notably in animals with the different genetic background. Animals with an improved postnatal environment only also showed improved growth and blood pressure, but to a lesser extent. All intervention groups showed increased nephron number compared with control LPK. In summary, the prenatal and postnatal environment had a significant effect in delaying progression and reducing severity of PKD, including nephron endowment.
Indigenous women and children experience some of the most profound health disparities globally. These disparities are grounded in historical and contemporary trauma secondary to colonial atrocities perpetrated by settler society. The health disparities that exist for chronic diseases may have their origins in early-life exposures that Indigenous women and children face. Mechanistically, there is evidence that these adverse exposures epigenetically modify genes associated with cardiometabolic disease risk. Interventions designed to support a resilient pregnancy and first 1000 days of life should abrogate disparities in early-life socioeconomic status. Breastfeeding, prenatal care and early child education are key targets for governments and health care providers to start addressing current health disparities in cardiometabolic diseases among Indigenous youth. Programmes grounded in cultural safety and co-developed with communities have successfully reduced health disparities. More work of this kind is needed to reduce inequities in cardiometabolic diseases among Indigenous women and children worldwide.
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. They and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide-resistance management, to become familiar with regional differences, and to identify decision-maker needs for addressing herbicide resistance. The messages shared by listening-session participants could be summarized in six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and to communicating the need to address herbicide resistance.
Introduction: With the current opioid crisis in Canada, presentations of acute opioid withdrawal (AOW) to emergency departments (ED) are increasing. Undertreated symptoms may result in relapse, overdose and death. Buprenorphine/naloxone (bup/nal) is a partial opioid agonist/antagonist used to mitigate symptoms of AOW, approved by Health Canada in 2007 for opioid use disorder. It is superior to clonidine, and increases follow-up with addiction treatment programs when initiated in the ED. Nevertheless, in our inner-city ED in 2014, bup/nal was rarely prescribed. We aimed to increase ED physician prescribing of bup/nal for AOW by 50% over a 26-month period. Methods: Commencing in 2014, an interprofessional team of ED physicians, nurses (RNs), pharmacists and QI specialists collaborated to improve the care of patients with AOW. PDSA cycles included: (1) a needs assessment of emergency physicians’ knowledge and practices in 2014; (2) Grand Rounds and a web-based information sheet in 2015; (3) ED stocking of bup/nal; (4) a convenience order set to standardize AOW management; (5) Grand Rounds in 2016; and (6) peer-coaching for RNs, including case-based discussions and pocket-card cognitive aids. The outcome was the number of times bup/nal was prescribed per month by ED physicians between September 2015 and October 2017. The process measure comprised the prescriber and use of the order set. The balancing measure was the number of patients referred to the Addiction Medicine Team who subsequently received bup/nal. Results: Bup/nal was prescribed 70 times by ED physicians and 14 times by the Addiction Medicine Team. With each PDSA cycle, there was an increase in prescribing, with no significant shifts or trends. Across all physicians, the median number of prescriptions was 3 per month, increasing from 2 to 4 prescriptions/month after nursing education. There was a smaller increase in the median, from 2 to 3 prescriptions/month, for ED physicians alone.
The order set was used 97% of the time. Conclusion: Bup/nal is safe, effective, and increases follow-up with addiction programs for comprehensive assessment and treatment planning. We met our goal of increasing bup/nal prescribing in the ED for AOW by 50%. Moreover, prescribing increased by 100% with the addition of patients who received bup/nal after a referral to the Addiction Medicine Team. The intervention with the greatest impact was RN education, demonstrating that peer-coaching and teaching by an interprofessional team are key to changing practice. Unfortunately, overall prescribing remains low, and ED physicians may still be hesitant to prescribe bup/nal, deferring instead to specialists. It is unclear if this is due to a low number of patients presenting with AOW, patients with contraindications to bup/nal, or ED physician factors. The next step is an audit of all patients with AOW to determine what percentage of those eligible are treated with bup/nal. A follow-up survey to determine ongoing barriers will inform further PDSA cycles.
We present multi-epoch VLBI observations of the methanol and water masers in the high-mass star formation region G 339.884−1.259, made using the Australian Long Baseline Array (LBA). Our sub-milliarcsecond precision measurements trace the proper motions of individual maser features in the plane of the sky. When combined with the direct line-of-sight radial velocity (v_LSR), these measure the 3D gas kinematics of the associated high-mass star formation region, allowing us to probe the dynamical processes to within 1000 AU of the core.
Knowledge regarding the association between dietary branched-chain amino acid (BCAA) intake and type 2 diabetes (T2D), and the contribution of BCAA from meat to T2D risk, is scarce. We evaluated associations between dietary BCAA intake, meat intake, the interaction between BCAA and meat intake, and risk of T2D. Data analyses were performed for 74 155 participants aged 50−79 years at baseline from the Women’s Health Initiative, for up to 15 years of follow-up. We excluded from analysis participants with treated T2D, factors potentially associated with T2D, or missing covariate data. BCAA and total meat intake were estimated from FFQ. Using Cox proportional hazards models, we assessed the relationship between BCAA intake, meat intake, and T2D, adjusting for confounders. A 20 % increment in total BCAA intake (g/d and %energy) was associated with a 7 % higher risk for T2D (hazard ratio (HR) 1·07; 95 % CI 1·05, 1·09). For total meat intake, a 20 % increment was associated with a 4 % higher risk of T2D (HR 1·04; 95 % CI 1·03, 1·05). The associations between BCAA intake and T2D were attenuated but remained significant after adjustment for total meat intake. These relations did not materially differ with or without adjustment for BMI. Our results suggest that dietary BCAA and meat intake are positively associated with T2D among postmenopausal women. The association of BCAA with diabetes risk was attenuated but remained positive after adjustment for meat intake, suggesting that BCAA intake contributes in part, but not in full, to the association of meat with T2D risk.
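Hazard ratios reported per 20 % intake increment can be rescaled to other increments under the usual log-linear Cox assumption: if a 20 % increment carries HR 1·07, the coefficient per log-unit of intake is β = ln(1.07)/ln(1.2), and the implied HR for a k-fold change is exp(β·ln k). A minimal arithmetic sketch (the log-linear form is an assumption for illustration, not the study's code):

```python
import math

# Rescaling a hazard ratio reported per 20% intake increment, assuming a
# log-linear Cox model (an illustrative assumption, not the study's analysis).
hr_per_20pct = 1.07                             # reported HR for a 20% BCAA increment
beta = math.log(hr_per_20pct) / math.log(1.2)   # coefficient per log-unit of intake
hr_per_doubling = math.exp(beta * math.log(2))  # implied HR if intake doubles
print(round(hr_per_doubling, 2))  # -> 1.29
```

Under this assumption, the modest 7 % excess risk per 20 % increment compounds to roughly a 29 % excess risk per doubling of intake.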
We present results from a multiwavelength study of the blazar PKS 1954–388 at radio, UV, X-ray, and gamma-ray energies. A RadioAstron observation at 1.66 GHz in June 2012 resulted in the detection of interferometric fringes on baselines of 6.2 Earth diameters. This suggests a source-frame brightness temperature greater than 2 × 10^12 K, well in excess of both the equipartition and inverse-Compton limits, implying the existence of Doppler boosting in the core. An 8.4-GHz TANAMI VLBI image, made less than a month after the RadioAstron observations, is consistent with a previously reported superluminal motion for a jet component. Flux density monitoring with the Australia Telescope Compact Array (ATCA) confirms previous evidence for long-term variability that increases with observing frequency. A search for more rapid variability revealed no evidence for significant day-scale flux density variation. The ATCA light curve reveals a strong radio flare beginning in late 2013, which peaks higher, and earlier, at higher frequencies. Comparison with the Fermi gamma-ray light curve indicates this followed ~9 months after the start of a prolonged gamma-ray high state, a radio lag comparable to that seen in other blazars. The multiwavelength data are combined to derive a spectral energy distribution, which is fitted by a one-zone synchrotron self-Compton (SSC) model with the addition of external Compton (EC) emission.
Functional neurological disorder (FND), also known as conversion disorder, comprises neurological symptoms that are not explained by a recognized neurological cause. The disorder is common, yet poorly understood. The symptoms are experienced as involuntary but have similarities to voluntary processes. Here we studied intention awareness in FND.
A total of 26 FND patients and 25 healthy volunteers participated in this functional magnetic resonance imaging study using Libet's clock.
FND was characterized by delayed awareness of the intention to move relative to the movement itself. The reporting of intention was also more precise, suggesting that these findings are reliable and unrelated to non-specific attentional deficits. That these findings were more prominent with aberrant positive functional movement symptoms than with negative symptoms may be relevant to impairments in the timing of an inhibitory veto process. Attention towards intention, relative to movement, was associated with lower right inferior parietal cortex activity in FND, a region implicated early in the processing of intention. During rest, aberrant functional connectivity was observed between the right inferior parietal cortex and other motor intention regions.
The results converge with observations of low inferior parietal activity when comparing involuntary with voluntary movement in FND, emphasizing core deficiencies in intention. The heightened precision of this impaired intention is consistent with Bayesian theories of impaired top-down priors that might influence the sense of involuntariness. A primary impairment in voluntary motor intention at an early processing stage might explain clinical observations of slowed, effortful voluntary movement and heightened self-directed attention, and might underlie functional movements. These findings further suggest novel therapeutic targets.
Identifying youth who may engage in future substance use could facilitate early identification of vulnerability to substance use disorder. We aimed to identify biomarkers that predicted future substance use in psychiatrically unwell youth.
LASSO regression for variable selection was used to predict substance use 24.3 months after neuroimaging assessment in 73 behaviorally and emotionally dysregulated youth (mean age 13.9 years, s.d. = 2.0; 30 female) from three clinical sites in the Longitudinal Assessment of Manic Symptoms (LAMS) study. Predictor variables included neural activity during a reward task, cortical thickness, and clinical and demographic variables.
Future substance use was associated with higher left middle prefrontal cortex activity, lower left ventral anterior insula activity, thicker caudal anterior cingulate cortex, higher depression and lower mania scores, not using antipsychotic medication, more parental stress, and older age. This combination of variables explained 60.4% of the variance in future substance use and accurately classified 83.6% of participants.
These variables explained a large proportion of the variance, were useful classifiers of future substance use, and showed the value of combining multiple domains to provide a comprehensive understanding of substance use development. This may be a step toward identifying neural measures of future substance use disorder risk that could act as targets for therapeutic interventions.
The purpose of this study was to evaluate a programme of lesion surgery carried out on patients with treatment-resistant depression (TRD).
This was a retrospective study looking at clinical and psychometric data from 45 patients with TRD who had undergone bilateral stereotactic anterior capsulotomy surgery over a period of 15 years, with the approval of the Mental Health Act Commission (37 with unipolar depression and eight with bipolar disorder). The Beck Depression Inventory (BDI) before and after surgery was used as the primary outcome measure. The Montgomery–Asberg Depression Rating Scale was administered and cognitive aspects of executive and memory functions were also examined. We carried out a paired-samples t test on the outcome measures to determine any statistically significant change in the group as a consequence of surgery.
Patients improved on the clinical measure of depression after surgery, with a mean reduction of 21.20 points on the BDI (a 52% change). There were no significant cognitive changes post-surgery. Six patients were followed up in 2013 by phone interview and reported a generally positive experience. No major surgical complications occurred.
With the limitations of an uncontrolled, observational study, our data suggest that capsulotomy can be an effective treatment for TRD. Performance on neuropsychological tests did not deteriorate.
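The paired-samples t test the study describes can be sketched with standard-library Python. The pre/post scores below are hypothetical illustrative values, not the study's patient data:

```python
import math
from statistics import mean, stdev

# Paired-samples t statistic on hypothetical pre/post BDI scores
# (illustrative values only; not the study's patient data).
pre  = [42, 38, 51, 45, 40, 47]   # baseline BDI scores
post = [20, 25, 22, 30, 18, 24]   # post-surgery BDI scores

diffs = [b - a for a, b in zip(pre, post)]                # post minus pre
t = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))  # t = d_bar / (s_d / sqrt(n))
print(round(t, 2))  # -> -8.7; a large negative t reflects lower (improved) scores
```

The statistic is then compared against the t distribution with n − 1 degrees of freedom to obtain the p-value, as in the study's outcome analysis.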
Genetically similar nulliparous Polled Hereford heifers from a closed pedigree herd were used to evaluate the effects of dietary protein during the first and second trimesters of gestation upon foetal, placental and postnatal growth. Heifers were randomly allocated into two groups at 35 days after artificial insemination (35 days post conception (dpc)) to a single bull and fed high (15.7% CP) or low (5.9% CP) protein in the first trimester (T1). At 90 dpc, half of each nutritional treatment group changed to a high- or low-protein diet for the second trimester until 180 dpc (T2). High protein intake in the second trimester increased birth weight in females (P=0.05), but there was no effect of treatment upon birth weight when taken over both sexes. Biparietal diameter was significantly increased by high protein in the second trimester, with the effect being greater in females (P=0.02) but also significant overall (P=0.05). Placental weight was positively correlated with birth weight, fibroblast volume and relative blood vessel volume (P<0.05). Placental fibroblast density was increased and trophoblast volume decreased in the high-protein first-trimester treatment group (P<0.05). There was a trend for placental weight to be increased by high protein in the second trimester (P=0.06). Calves from heifers fed the high-protein treatment in the second trimester weighed significantly more on all occasions preweaning (at 1 month (P=0.0004), 2 months (P=0.006), 3 months (P=0.002), 4 months (P=0.01), 5 months (P=0.03) and 6 months (P=0.001)), and grew at a faster rate over the 6-month period. By 6 months of age, calves from heifers fed the high-protein diet in the second trimester weighed 33 kg more than those from heifers fed the low-protein diet in the second trimester. These results suggest that dietary protein in early pregnancy alters the development of the bovine placenta and calf growth to weaning.
Annual bluegrass is a weed species in turfgrass environments known for exhibiting resistance to multiple herbicide modes of action, including photosystem II (PSII) inhibitors. To evaluate populations of annual bluegrass for susceptibility to PSII inhibitors of varied chemistries, populations were treated with herbicides from triazolinone, triazine, and substituted urea families: amicarbazone, atrazine, and diuron, respectively. Sequencing of the psbA gene confirmed the presence of a Ser264 to Gly amino acid substitution within populations that exhibited resistance to both atrazine and amicarbazone. A single biotype, DR3, which lacked any previously reported psbA gene point mutation, exhibited resistance to diuron, atrazine, and amicarbazone. DR3 had a significantly lower rate of absorption and translocation of atrazine and had enhanced atrazine metabolism when compared with both the Ser264 to Gly resistant mutant and susceptible biotypes. We thus report possible nontarget mechanisms of resistance to PSII-inhibiting herbicides in annual bluegrass.
Repeat rectal chlamydia infection is common in men who have sex with men (MSM) following treatment with 1 g azithromycin. This study describes the association between organism load and repeat rectal chlamydia infection, genovar distribution, and efficacy of azithromycin in asymptomatic MSM. Stored rectal chlamydia-positive samples from MSM were analysed for organism load and genotyped to assist differentiation between reinfection and treatment failure. Included men had follow-up tests within 100 days of index infection. Lymphogranuloma venereum and proctitis diagnosed symptomatically were excluded. Factors associated with repeat infection, treatment failure and reinfection were investigated. In total, 227 MSM were included – 64 with repeat infections [28·2%, 95% confidence interval (CI) 22·4–34·5]. Repeat positivity was associated with increased pre-treatment organism load [odds ratio (OR) 1·7, 95% CI 1·4–2·2]. Of 64 repeat infections, 29 (12·8%, 95% CI 8·7–17·8) were treatment failures and 35 (15·4%, 95% CI 11·0–20·8) were reinfections, 11 (17·2%, 95% CI 8·9–28·7) of which were definite reinfections. Treatment failure and reinfection were both associated with increased load (OR 2·0, 95% CI 1·4–2·7 and 1·6, 95% CI 1·2–2·2, respectively). The most prevalent genovars were G, D and J. Treatment efficacy for 1 g azithromycin was 83·6% (95% CI 77·2–88·8). Repeat positivity was associated with high pre-treatment organism load. Randomized controlled trials are urgently needed to evaluate azithromycin's efficacy and whether extended doses can overcome rectal infections with high organism load.
The mechanism behind the beneficial effects of enteral nutrition (EN) for patients with acute pancreatitis (AP) is largely unknown. Adipokines, as mediators of metabolism and inflammation, are one possible mechanism. This study aimed to investigate the effect of EN on adipokines early in the course of AP. Patients with AP were randomised to EN or nil-by-mouth (NBM). Blood samples were taken on the first 4 d of admission, and concentrations of the adipokines adiponectin, leptin, omentin, resistin and visfatin were determined by ELISA. A linear mixed model analysis was run to determine differences in adipokine concentrations between the two study groups. A total of thirty-two patients were included in the study. Omentin concentrations were significantly higher in patients who received EN compared with NBM across the first 4 d of admission (mean difference: 11·6 (95 % CI 1·0, 22·3) ng/ml; P = 0·033). Leptin concentrations were significantly higher in patients who received EN compared with NBM after adjusting for age, sex and BMI (mean difference: 2·3 (95 % CI 0·1, 4·5) ng/ml; P = 0·037). No significant difference in adiponectin, resistin or visfatin concentrations was observed between the two study groups. EN significantly increases omentin and leptin concentrations in AP. Future research should be directed towards understanding whether these adipokines are responsible for the therapeutic benefits of EN.
Offspring of parents with bipolar disorder (BD) (BO) are at higher risk of BD than offspring of parents with non-BD psychopathology (NBO), although both groups are at higher risk than offspring of psychiatrically healthy parents (HC) for other affective and psychiatric disorders. Abnormal functioning in reward circuitry has been demonstrated previously in individuals with BD. We aimed to determine whether activation and functional connectivity in this circuitry during risky decision-making differentiated BO, NBO and HC.
BO (n = 29; mean age = 13.8 years; 14 female), NBO (n = 28; mean age = 13.9 years; 12 female) and HC (n = 23; mean age = 13.7 years; 11 female) were scanned while performing a number-guessing reward task. Of the participants, 11 BO and 12 NBO had current non-BD psychopathology; five BO and four NBO were taking psychotropic medications.
A 3 (group) × 2 (conditions: win-control/loss-control) analysis of variance revealed a main effect of group on right frontal pole activation: BO showed significantly greater activation than HC. There was a significant main effect of group on functional connectivity between the bilateral ventral striatum and the right ventrolateral prefrontal cortex (Z > 3.09, cluster-p < 0.05): BO showed significantly greater negative functional connectivity than other participants. These between-group differences remained after removing youth with psychiatric disorders and psychotropic medications from analyses.
This is the first study to demonstrate that reward circuitry activation and functional connectivity distinguish BO from NBO and HC. The fact that the pattern of findings remained when comparing healthy BO v. healthy NBO v. HC suggests that these neuroimaging measures may represent trait-level neurobiological markers conferring either risk for, or protection against, BD in youth.
A symptom of mild cognitive impairment (MCI) and Alzheimer’s disease (AD) is a flat learning profile. Learning slope calculation methods vary, and the optimal method for capturing neuroanatomical changes associated with MCI and early AD pathology is unclear. This study cross-sectionally compared four different learning slope measures from the Rey Auditory Verbal Learning Test (simple slope, regression-based slope, two-slope method, peak slope) to structural neuroimaging markers of early AD neurodegeneration (hippocampal volume; cortical thickness in the parahippocampal gyrus, precuneus, and lateral prefrontal cortex) across the cognitive aging spectrum [normal control (NC), n=198, age=76±5; MCI, n=370, age=75±7; and AD, n=171, age=76±7] in ADNI. Within each diagnostic group, general linear models related the slope methods individually to neuroimaging variables, adjusting for age, sex, education, and APOE4 status. Among MCI participants, better learning performance on the simple slope, regression-based slope, and late slope (Trials 2–5) from the two-slope method related to larger parahippocampal thickness (all p-values<.01) and hippocampal volume (p<.01). Better regression-based slope (p<.01) and late slope (p<.01) were related to larger ventrolateral prefrontal cortex in MCI. No significant associations emerged between any slope and neuroimaging variables for NC (p-values ≥.05) or AD (p-values ≥.02). Better learning performances related to larger medial temporal lobe (i.e., hippocampal volume, parahippocampal gyrus thickness) and ventrolateral prefrontal cortex in MCI only. Regression-based slope and late slope were most highly correlated with neuroimaging markers and explained more variance above and beyond other common memory indices, such as total learning. Simple slope may offer an acceptable alternative given its ease of calculation. (JINS, 2015,