Impulsive and compulsive problem behaviours are associated with a variety of mental disorders. Latent phenotyping indicates the expression of impulsive and compulsive problem behaviours is predominantly governed by a transdiagnostic ‘disinhibition’ phenotype. In a cohort of 117 individuals, recruited as part of the Neuroscience in Psychiatry Network (NSPN), we examined how brain functional connectome and network properties relate to disinhibition. Reduced functional connectivity within a subnetwork of frontal (especially right inferior frontal gyrus), occipital and parietal regions was linked to disinhibition. Findings provide insights into neurobiological pathways underlying the emergence of impulsive and compulsive disorders.
Mortality risk is known to be associated with many physiological or biochemical risk factors, and polygenic risk scores (PRSs) may offer an additional or alternative approach to risk stratification. We have compared the predictive value of common biochemical tests, PRSs and information on parental survival in a cohort of twins and their families. Common biochemical test results were available for up to 13,365 apparently healthy men and women, aged 17–93 years (mean 49.0, standard deviation [SD] 13.7) at blood collection. PRSs for longevity were available for 14,169 study participants and reported parental survival for 25,784 participants. A search for information on date and cause of death was conducted through the Australian National Death Index, with median follow-up of 11.3 years. Cox regression was used to evaluate associations with mortality from all causes, cancers, cardiovascular diseases and other causes. Linear relationships with all-cause mortality were strongest for C-reactive protein, gamma-glutamyl transferase, glucose and alkaline phosphatase, with hazard ratios (HRs) of 1.16 (95% CI [1.07, 1.24]), 1.15 (95% CI [1.04, 1.21]), 1.13 (95% CI [1.08, 1.19]) and 1.11 (95% CI [1.05, 1.88]) per SD difference, respectively. Significant nonlinear effects were found for urea, uric acid and butyrylcholinesterase. Lipid risk factors were not statistically significant for mortality in our cohort. Family history and PRS showed weaker but significant associations with survival, with HRs in the range 1.05 to 1.09 per SD difference. In conclusion, biochemical tests currently predict long-term mortality more strongly than genetic scores based on genotyping or on reported parental survival.
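The hazard ratios per SD difference reported above can be illustrated with a small sketch: a Cox log-hazard coefficient estimated per unit of a biomarker is rescaled by the biomarker's SD before exponentiating. The coefficient, SD and standard error below are hypothetical, not values from the study.

```python
import math

def hr_per_sd(beta_per_unit, sd, se_per_unit, z=1.96):
    """Rescale a Cox log-hazard coefficient (per unit of the predictor)
    to a hazard ratio per SD difference, with a Wald 95% CI."""
    beta_sd = beta_per_unit * sd
    se_sd = se_per_unit * sd
    hr = math.exp(beta_sd)
    ci = (math.exp(beta_sd - z * se_sd), math.exp(beta_sd + z * se_sd))
    return hr, ci

# Hypothetical values: beta = 0.015 per mg/L of CRP, SD = 9.9 mg/L, SE = 0.004
hr, (lo, hi) = hr_per_sd(0.015, 9.9, 0.004)
```

The same rescaling applies to any of the biomarkers listed, since multiplying a coefficient by the predictor's SD is equivalent to standardizing the predictor before fitting.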
Late Pleistocene and Early Holocene aeolian deposits in Tasmania are extensive in the present subhumid climate zone but also occur in areas receiving >1000 mm of rain annually. Thermoluminescence, optically stimulated luminescence, and radiocarbon ages indicate that most of the deposits formed during periods of cold climate. Some dunes are remnants of longitudinal desert dunes sourced from now-inundated continental shelves which were previously semi-arid. Others formed near source, often in the form of lunettes east of seasonally-dry lagoons in the previously semi-arid Midlands and southeast of Tasmania, or as accumulations close to floodplains of major rivers, or as sandsheets in exposed areas. Burning of vegetation by the Aboriginal population after 40 ka is likely to have influenced sediment supply. A key site for determining climate variability in southern Tasmania is Maynes Junction which records three periods of aeolian deposition (at ca. 90, 32 and 20 ka), interspersed with periods of hillslope instability. Whether wind speeds were higher than at present during the last glacial period is uncertain, but shells in the Mary Ann Bay sandsheet near Hobart and particle size analysis of the Ainslie dunes in northeast Tasmania suggest stronger winds during the last glacial period than at present.
Use of extracorporeal membrane oxygenation (ECMO) for patients in severe cardiac or respiratory failure has grown rapidly. As a result, ECMO networks are being developed across the world using a “hub and spoke” model. Current guidelines call for all patients transported on ECMO to be accompanied by a physician during transport. However, as ECMO centers and networks grow, this mandate will limit the increasing number of transports.
The aim of this study was to compare rates of adverse events occurring during transport of ECMO patients with and without an additional clinician, defined as a physician, nurse practitioner (NP), or physician assistant (PA).
This is a retrospective cohort study of all adults transported while cannulated on ECMO from 2011 to 2018, via ground and air, between 21 hospitals in the northeastern United States, comparing transports with and without additional clinicians. The primary outcome was the rate of major adverse events, and the secondary outcome was the rate of minor adverse events.
Over the seven-year study period, 93 patients on ECMO were transported. Twenty-three transports (24.7%) were accompanied by a physician or other additional clinician. Major adverse events occurred in 21.5% of all transports. There was no difference in the total rate of major adverse events between accompanied and unaccompanied transports (P = .91). Multivariate analysis did not demonstrate any parameter as being predictive of major adverse events.
In a retrospective cohort study of transports of ECMO patients, there was no association between the overall rate of major adverse events in transport and the accompaniment of an additional clinician. No variables were associated with major adverse events in either cohort.
A Canadian outbreak investigation into a cluster of Escherichia coli O121 was initiated in late 2016. When initial interviews using a closed-ended hypothesis-generating questionnaire did not point to a common source, cases were centrally re-interviewed using an open-ended approach. The open-ended interviews led cases to describe exposures with greater specificity, as well as food preparation activities. Data collected supported hypothesis generation, particularly with respect to flour exposures. In March 2017, an open sample of Brand X flour from a case home, and a closed sample collected at retail of the same brand and production date, tested positive for the outbreak strain of E. coli O121. In total, 76% (16/21) of cases reported that they used or probably used Brand X flour or that it was used or probably was used in the home during their exposure period. Crucial hypothesis-generating techniques used during the course of the investigation included a centralised open-ended interviewing approach and product sampling from case homes. This was the first outbreak investigation in Canada to identify flour as the source of infection.
Background: With the emergence of antibiotic-resistant threats and the need for appropriate antibiotic use, laboratory microbiology information is important to guide clinical decision making in nursing homes, where access to such data can be limited. Susceptibility data are necessary to inform antibiotic selection and to monitor changes in resistance patterns over time. To contribute to existing data that describe antibiotic resistance among nursing home residents, we summarized antibiotic susceptibility data from organisms commonly isolated from urine cultures collected as part of the CDC multistate Emerging Infections Program (EIP) nursing home prevalence survey. Methods: In 2017, urine culture and antibiotic susceptibility data for selected organisms were retrospectively collected from nursing home residents’ medical records by trained EIP staff. Urine culture results reported as negative (no growth) or contaminated were excluded. Susceptibility results were recorded as susceptible, non-susceptible (resistant or intermediate), or not tested. The pooled mean percentage tested and percentage non-susceptible were calculated for selected antibiotic agents and classes using available data. Susceptibility data were analyzed for organisms with ≥20 isolates. The definition for multidrug-resistance (MDR) was based on the CDC and European Centre for Disease Prevention and Control’s interim standard definitions. Data were analyzed using SAS v9.4 software. Results: Among 161 participating nursing homes and 15,276 residents, 300 residents (2.0%) had documentation of a urine culture at the time of the survey, and 229 (76.3%) were positive. Escherichia coli, Proteus mirabilis, Klebsiella spp, and Enterococcus spp represented 73.0% of all urine isolates (N = 278). There were 215 (77.3%) isolates with reported susceptibility data (Fig. 1). Of these, data were analyzed for 187 (87.0%) (Fig. 2). All isolates tested for carbapenems were susceptible.
Fluoroquinolone non-susceptibility was most prevalent among E. coli (42.9%) and P. mirabilis (55.9%). Among Klebsiella spp, the highest percentages of non-susceptibility were observed for extended-spectrum cephalosporins and folate pathway inhibitors (25.0% each). Glycopeptide non-susceptibility was 10.0% for Enterococcus spp. The percentage of isolates classified as MDR ranged from 10.1% for E. coli to 14.7% for P. mirabilis. Conclusions: Substantial levels of non-susceptibility were observed for nursing home residents’ urine isolates, with 10% to 56% reported as non-susceptible to the antibiotics assessed. Non-susceptibility was highest for fluoroquinolones, an antibiotic class commonly used in nursing homes, and ≥ 10% of selected isolates were MDR. Our findings reinforce the importance of nursing homes using susceptibility data from laboratory service providers to guide antibiotic prescribing and to monitor levels of resistance.
Background: With an aging population, increasingly complex care, and frequent re-admissions, prevention of healthcare-associated infections (HAIs) in nursing homes (NHs) is a federal priority. However, few contemporary sources of HAI data exist to inform surveillance, prevention, and policy. Prevalence surveys (PSs) are an efficient approach to generating data to measure the burden and describe the types of HAI. In 2017, the Centers for Disease Control and Prevention (CDC) performed its first large-scale HAI PS through the Emerging Infections Program (EIP) to measure the prevalence and describe the epidemiology of HAI in NH residents. Methods: NHs from several states (CA, CO, CT, GA, MD, MN, NM, NY, OR, & TN) were randomly selected and asked to participate in a 1-day HAI PS between April and October 2017; participation was voluntary. EIP staff reviewed available medical records for NH residents present on the survey date to collect demographic and basic clinical information and infection signs and symptoms. HAIs with onset on or after NH day 3 were identified using revised McGeer infection definitions applied to data collected by EIP staff and were reported to the CDC through a web-based system. Data were reviewed by CDC staff for potential errors and to validate HAI classifications prior to analysis. HAI prevalence, the number of residents with ≥1 HAI per number of surveyed residents ×100, and 95% CIs were calculated overall (pooled mean) and for selected resident characteristics. Data were analyzed using SAS v9.4 software. Results: Among 15,296 residents in 161 NHs, 358 residents with 375 HAIs were identified. The most common HAI sites were skin (32%), respiratory tract (29%), and urinary tract (20%). Cellulitis, soft-tissue or wound infection, symptomatic UTI, and cold or pharyngitis were the most common individual HAIs (Fig. 1).
Overall HAI prevalence was 2.3 per 100 residents (95% CI, 2.1–2.6); at the NH level, the median HAI prevalence was 1.8 and ranged from 0 to 14.3 (interquartile range, 0–3.1). At the resident level (Fig. 2), HAI prevalence was significantly higher in persons admitted for postacute care with diabetes, with a pressure ulcer, receiving wound care, or with a device. Conclusions: In this large-scale survey, 1 in 43 NH residents had an HAI on a given day. Three HAI types comprised >80% of infections. In addition to identifying characteristics that place residents at higher risk for HAIs, these findings provide important data on HAI epidemiology in NHs that can be used to expand HAI surveillance and inform prevention policies and practices.
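The pooled prevalence of 2.3 per 100 residents (95% CI, 2.1–2.6) can be reproduced with a simple normal-approximation calculation; this is a sketch, and the survey may have used a different interval method.

```python
import math

def prevalence_per_100(cases, n, z=1.96):
    """Pooled prevalence per 100 residents with a Wald (normal-approximation) 95% CI."""
    p = cases / n
    se = math.sqrt(p * (1 - p) / n)
    return 100 * p, (100 * (p - z * se), 100 * (p + z * se))

# 358 residents with at least one HAI among 15,296 surveyed residents
prev, (lo, hi) = prevalence_per_100(358, 15296)
```

The point estimate also explains the "1 in 43" figure, since 100/2.3 ≈ 43.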
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among NH residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care. 
There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of the criteria used. At best, only half of antibiotic treatment met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
Introduction: Emergency department (ED) crowding is a major problem across Canada. We studied the ability of artificial intelligence methods to improve patient flow through the ED by predicting patient disposition using information available at triage and shortly after patients’ arrival in the ED. Methods: This retrospective study included all visits to an urban, academic, adult ED between May 2012 and June 2019. For each visit, 489 variables were extracted including triage data that had been collected for use in the Canadian Triage and Acuity Scale (CTAS) and information regarding laboratory tests, radiological tests, consultations and admissions. A training set consisting of all visits from April 2012 up to December 2018 was used to train 5 classes of machine learning models to predict admission to the hospital from the ED. The models were trained to predict admission at the time of the patient's arrival in the ED and every 30 minutes after arrival until 6 hours into their ED stay. The performance of models was compared using the area under the ROC curve (AUC) on a test set consisting of all visits from January 2019 to June 2019. Results: The study included 536,332 visits and the admission rate was 15.0%. Gradient boosting models generally outperformed other machine learning models. A gradient boosting model using all available data at 2 hours after patient arrival in the ED yielded a test set AUC 0.92 [95% CI 0.91-0.93], while a model using only data available at triage yielded an AUC 0.90 [95% CI 0.89-0.91]. The quality of predictions generally improved as predictions were made later in the patient's ED stay, leading to an AUC 0.95 [95% CI 0.93-0.96] at 6 hours after arrival. A gradient boosting model with 20 variables available at 2 hours after patient arrival in the ED yielded an AUC 0.91 [95% CI 0.89-0.93].
A gradient boosting model that makes predictions at 2 hours after arrival in the ED using only variables that are available at all EDs in the province of Quebec yielded an AUC 0.91 [95% CI 0.89-0.92]. Conclusion: Machine learning can predict admission to a hospital from the ED using variables that are collected as part of routine ED care. Machine learning tools may potentially be used to help ED physicians make faster and more appropriate disposition decisions, decrease unnecessary testing, and alleviate ED crowding.
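A minimal sketch of the modelling approach described above, using scikit-learn's gradient boosting on synthetic data; the real study used up to 489 ED variables, and every name and number here is illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for ED features and an admission label (~15% positive,
# mirroring the reported 15.0% admission rate)
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.85, 0.15], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Train a gradient boosting classifier and score it by AUC on held-out visits
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
```

In the study's design, a separate model of this form would be fitted for each prediction time point (arrival, then every 30 minutes), each using only the variables available by that time.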
Introduction: The Canadian Syncope Risk Score (CSRS) is a validated risk tool, developed using the best practices of conventional biostatistics, for predicting 30-day serious adverse events (SAE) after an Emergency Department (ED) visit for syncope. We sought to improve on the prediction ability of the CSRS, and compared it to physician judgement, using modern machine learning (ML) methods from artificial intelligence (AI) research. Methods: We used the prospective multicenter cohort data collected for the CSRS derivation and validation at 11 EDs across Canada over an 8-year period. The same 43 candidate variables considered for CSRS development were used to train and validate four classes of ML models to predict 30-day SAE (death, arrhythmias, MI, structural heart disease, pulmonary embolism, hemorrhage) after ED disposition. Physician judgement was modeled using two variables: referral for consultation and hospitalization. We compared the area under the curve (AUC) for the three models. Results: The proportion of patients who suffered 30-day SAE was 3.6% in the derivation cohort (N = 4030) and 3.4% in the validation cohort (N = 2290). Characteristics of the two cohorts were similar, with no distributional shift. The best-performing ML model, a gradient-boosted tree model, used all 43 variables as predictors, as opposed to the 9 final CSRS predictors. The AUCs for the three models on the validation data were: best ML model 0.91 (95% CI 0.87–0.93), CSRS 0.87 (95% CI 0.83–0.90) and physician judgement 0.79 (95% CI 0.74–0.84). The most important predictors in the ML model were the same as the CSRS predictors. Conclusion: An ML model developed for risk stratification of ED syncope showed slightly better discrimination than the CSRS, though the difference was not statistically significant. Both the ML model and the CSRS were better predictors of poor outcomes after syncope than physician judgement.
ML models can perform with similar discrimination abilities when compared to traditional statistical models and outperform physician judgement given their ability to use all candidate variables.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
Hope is considered an important factor in recovery from severe mental illness. So far it has been studied in patients with depression, anxiety disorders and post-traumatic stress disorder, whereas empirical studies involving people with psychosis are scarce and their results are inconclusive.
We aimed to evaluate the relationship between
(i) hope and positive as well as negative psychotic symptoms and
(ii) hope and depression in people with psychosis.
In this cross-sectional study 148 patients with schizophrenia and schizo-affective disorder were interviewed by a psychologist who rated the positive and negative symptoms on the Positive and Negative Syndrome Scale (PANSS). Hope and depression were measured using the self-assessment scales Integrative Hope Scale (IHS) and the Center for Epidemiologic Studies Depression Scale (CES-D).
No statistically significant correlation was found between hope and positive symptoms (r = .071, p = .414). Hope and negative symptoms, however, showed a statistically significant negative correlation (r = -.196, p = .023), as did hope and depression (r = -.255, p = .003). This latter relationship remained significant after controlling for negative symptoms in a partial correlation (r = -.216, p = .013).
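The partial correlation of hope and depression controlling for negative symptoms can be computed from the three pairwise correlations with the standard first-order formula; the checks below use placeholder values, not the study's full correlation matrix.

```python
import math

def partial_corr(r_xy, r_xz, r_yz):
    """First-order partial correlation of x and y, controlling for z."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Sanity check: if the control variable is uncorrelated with both x and y,
# nothing is partialled out and the zero-order correlation is unchanged
unchanged = partial_corr(-0.255, 0.0, 0.0)
```

The formula shows directly why the hope–depression association can shrink only modestly (from r = -.255 to r = -.216) when the control variable is only weakly related to both.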
While hope appears unrelated to positive symptoms, a significant correlation with negative symptoms and depression was found. These results emphasise the potential importance of hope as a target variable to support recovery in patients with psychosis. However, prospective studies are needed to clarify the causal relationships between hope and symptoms of psychotic disorders.
Reasons for differences between the US and the UK in the effect sizes of studies on complex interventions, such as assertive outreach, are a much-debated topic. One suggested explanation is a potential difference in the quality of standard care between the two countries.
We aimed to
(i) empirically establish the comparability of research results on complex interventions for people with severe mental illness (SMI) from the UK and the US, and
(ii) explore developments over time in standard care in both countries by comparing studies that use “treatment as usual” (TAU) as the control intervention.
We conducted a systematic review and meta-analysis of RCTs conducted in the UK or the US
(i) involving people with SMI,
(ii) comparing complex interventions with TAU, and
(iii) using the outcome relapse or readmission to hospital.
The risk ratios for relapse/readmission were very similar, favouring the experimental treatment in both the UK (RR 0.80, CI 0.73–0.88) and the US (RR 0.87, CI 0.79–0.95). The development over time of effects from experimental interventions relative to TAU shows a slightly different pattern for the two countries.
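A risk ratio for relapse/readmission and its 95% CI are conventionally computed on the log scale; the sketch below uses made-up counts, not the meta-analysis data.

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio (arm 1 vs arm 2) with a log-scale Wald 95% CI."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    log_rr = math.log(rr)
    return rr, (math.exp(log_rr - z * se), math.exp(log_rr + z * se))

# Hypothetical trial: 40/200 relapses under the intervention vs 50/200 under TAU
rr, (lo, hi) = risk_ratio_ci(40, 200, 50, 200)
```

Pooled meta-analytic estimates like those reported here are then obtained by inverse-variance weighting of the per-study log risk ratios.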
The broadly similar total RR for relapse/readmission in both countries confirms the comparability of studies conducted in the UK and the US and suggests no significant overall difference in the quality of standard care. The chronological development of effects, however, reflects developments in TAU over time which differ between the two countries.
Negative symptoms have previously been reported during the psychosis prodrome; however, our understanding of their relationship with treatment-phase negative symptoms remains unclear.
We report the prevalence of psychosis prodrome onset negative symptoms (PONS) and ascertain whether these predict negative symptoms at first presentation for treatment.
Presence of expressivity or experiential negative symptom domains was established at first presentation for treatment using the Scale for Assessment of Negative Symptoms (SANS) in 373 individuals with a first episode psychosis. PONS were established using the Beiser Scale. The relationship between PONS and negative symptoms at first presentation was ascertained and regression analyses determined the relationship independent of confounding.
PONS prevalence was 50.3% in the schizophrenia spectrum group (n = 155) and 31.2% in the non-schizophrenia spectrum group (n = 218). In the schizophrenia spectrum group, PONS had a significant unadjusted (χ2 = 10.41, P < 0.001) and adjusted (OR = 2.40, 95% CI = 1.11–5.22, P = 0.027) association with first presentation experiential symptoms; however, this relationship was not evident in the non-schizophrenia spectrum group. PONS did not predict expressivity symptoms in either diagnostic group.
PONS are common in schizophrenia spectrum diagnoses, and predict experiential symptoms at first presentation. Further prospective research is needed to examine whether negative symptoms commence during the psychosis prodrome.
Obsessive-compulsive disorder (OCD) is a highly disabling condition, with frequent early onset. Adult/adolescent OCD has been extensively investigated, but little is known about the prevalence and clinical characterization of geriatric patients with OCD (G-OCD: ≥ 65 years). The present study aimed to assess the prevalence of G-OCD and associated socio-demographic and clinical correlates in a large international sample.
Data from 416 outpatients, participating in the ICOCS network, were assessed and categorized into 2 groups, age < vs ≥ 65 years, and then divided on the basis of the median age of the sample (age < vs ≥ 42 years). Socio-demographic and clinical variables were compared between groups (Pearson Chi-squared and t tests).
G-OCD patients constituted a small minority of the sample compared with younger patients (6% vs 94%, P < .001), showing a significantly later age at onset (29.4 ± 15.1 vs 18.7 ± 9.2 years, P < .001), a more frequent adult onset (75% vs 41.1%, P < .001) and a less frequent use of cognitive-behavioural therapy (CBT) (20.8% vs 41.8%, P < .05). Female gender was more represented among G-OCD patients, though not at a statistically significant level (75% vs 56.4%, P = .07). When the whole sample was divided on the basis of the median age, previous results were confirmed for older patients, including a significantly higher proportion of women (52.1% vs 63.1%, P < .05).
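Group comparisons like these can be run as Pearson chi-squared tests on 2×2 counts. The table below is reconstructed approximately from the reported percentages (24 geriatric patients of 416; 20.8% vs 41.8% CBT use), so it is illustrative only.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 table [[a, b], [c, d]]
    (no continuity correction); compare against 3.84 for P < .05 at 1 df."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Approximate reconstructed counts: CBT use yes/no
# in geriatric (5 of 24) vs younger (164 of 392) patients
stat = chi2_2x2(5, 19, 164, 228)
```

With these reconstructed counts the uncorrected statistic exceeds the 3.84 critical value, consistent with the reported P < .05 for CBT use; a continuity-corrected test could give a different verdict near the threshold.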
G-OCD compared with younger patients represented a small minority of the sample and showed later age at onset, more frequent adult onset and lower CBT use. Age at onset may influence course and overall management of OCD, with additional investigation needed.
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploring other commonly used methodologies for neurocognitive impairment (NCI) in HIV including the global deficit score (GDS). This study compares NCI classifications by Frascati, Meyer, and GDS methods, in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati criteria versus Meyer criteria: concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)] which were impaired via Frascati criteria but unimpaired via Meyer criteria. To investigate the GDS versus Meyer criteria, the same groupings were utilized using GDS criteria instead of Frascati criteria.
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, higher rates of employment, and better health-related quality of life than non-SA participants. Conclusions: Despite combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)