Noonan syndrome is a genetic disorder with a high prevalence of congenital heart defects, such as pulmonary stenosis, atrial septal defect and hypertrophic cardiomyopathy. Scarce data exist regarding the safety of pregnancy in patients with Noonan syndrome, particularly in the context of maternal cardiac disease.
Study design:
We performed a retrospective chart review of patients at Yale-New Haven Hospital from 2012 to 2020 with diagnoses of Noonan syndrome and pregnancy. We analysed medical records for pregnancy details and cardiac health, including echocardiograms to quantify maternal cardiac dysfunction through measurements of pulmonary valve peak gradient, structural heart defects and interventricular septal thickness.
Results:
We identified five women with Noonan syndrome (10 pregnancies). Three of the five patients had pulmonary valve stenosis at the time of pregnancy, two of whom had undergone cardiac procedures. 50% of pregnancies (5/10) resulted in pre-term birth. 80% (8/10) of all deliveries were converted to caesarean section after a trial of labour. One pregnancy resulted in intra-uterine fetal demise, while nine pregnancies resulted in the birth of a living infant. 60% (6/10) of livebirths required care in the neonatal intensive care unit. One infant died at 5 weeks of age.
Conclusions:
The majority of mothers had pre-existing, though mild, heart disease. We found high rates of prematurity, conversion to caesarean section, and elevated level of care. No maternal complications resulted in long-term morbidity. Our study suggests that women with Noonan syndrome and low-risk cardiac lesions can become pregnant and deliver a healthy infant with counselling and risk evaluation.
Problematic anger is frequently reported by soldiers who have deployed to combat zones. However, evidence is lacking with respect to how anger changes over a deployment cycle, and which factors prospectively influence change in anger among combat-deployed soldiers.
Methods
Reports of problematic anger were obtained from 7298 US Army soldiers who deployed to Afghanistan in 2012. A series of mixed-effects growth models estimated linear trajectories of anger over a period of 1–2 months before deployment to 9 months post-deployment, and evaluated the effects of pre-deployment factors (prior deployments and perceived resilience) on average levels and growth of problematic anger.
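To make the modeling step concrete, here is a minimal sketch of a mixed-effects growth model with random intercepts and slopes, in the spirit of the analysis described above. It assumes a hypothetical long-format dataset; the column names (anger, months, soldier_id, prior_deployments, resilience) are illustrative, not the study's actual variables.

```python
# Hedged sketch: linear growth model of anger with random intercepts and
# slopes per soldier; pre-deployment factors enter as main effects and as
# moderators of growth (interaction with time).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("anger_panel.csv")  # hypothetical long-format panel data

model = smf.mixedlm(
    "anger ~ months * (prior_deployments + resilience)",
    data=df,
    groups=df["soldier_id"],
    re_formula="~months",  # random intercept and random slope for time
)
result = model.fit()
print(result.summary())
```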
Results
A model with random intercepts and slopes provided the best fit, indicating heterogeneity in soldiers' levels and trajectories of anger. First-time deployers reported the lowest anger overall, but the most growth in anger over time. Soldiers with multiple prior deployments displayed the highest anger overall, which remained relatively stable over time. Higher pre-deployment resilience was associated with lower reports of anger, but its protective effect diminished over time. First- and second-time deployers reporting low resilience displayed different anger trajectories (stable v. decreasing, respectively).
Conclusions
Change in anger from pre- to post-deployment varies based on pre-deployment factors. The observed differences in anger trajectories suggest that efforts to detect and reduce problematic anger should be tailored for first-time v. repeat deployers. Ongoing screening is needed even for soldiers reporting high resilience before deployment, as the protective effect of pre-deployment resilience on anger erodes over time.
Retrospective self-report is typically used for diagnosing previous pediatric traumatic brain injury (TBI). We used a new semi-structured interview instrument (New Mexico Assessment of Pediatric TBI; NewMAP TBI) to investigate test–retest reliability of TBI characteristics, both for the TBI that qualified for study inclusion and for lifetime history of TBI.
Method:
One hundred and eighty-four mTBI patients (aged 8–18), 156 matched healthy controls (HC), and their parents completed the NewMAP TBI within 11 days (subacute; SA) and 4 months (early chronic; EC) of injury, with a subset returning at 1 year (late chronic; LC).
Results:
The test–retest reliability of common TBI characteristics [loss of consciousness (LOC), post-traumatic amnesia (PTA), retrograde amnesia, confusion/disorientation] and of post-concussion symptoms (PCS) was examined across study visits. Aside from PTA, binary reporting (present/absent) for all TBI characteristics exhibited acceptable (≥0.60) test–retest reliability for both Qualifying and Remote TBIs across all three visits. In contrast, reliability for continuous data (exact duration) was generally unacceptable, with LOC and PCS meeting acceptable criteria at only half of the assessments. Transforming continuous self-report ratings into discrete categories based on injury severity resulted in acceptable reliability. Reliability was not strongly affected by the parent completing the NewMAP TBI.
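As an illustration of the binary test–retest analysis, here is a minimal sketch using Cohen's kappa with the 0.60 acceptability cutoff named above. The abstract does not specify which reliability statistic was used, so treat the choice of kappa, and the toy data, as assumptions.

```python
# Hedged sketch: agreement between two visits on a binary TBI characteristic.
from sklearn.metrics import cohen_kappa_score

# Toy data: loss of consciousness (LOC) reported as present (1) or absent (0)
# at two study visits for six hypothetical participants.
loc_visit1 = [1, 0, 1, 1, 0, 1]
loc_visit2 = [1, 0, 1, 0, 0, 1]

kappa = cohen_kappa_score(loc_visit1, loc_visit2)
print(f"kappa = {kappa:.2f}; acceptable (>= 0.60): {kappa >= 0.60}")
```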
Conclusions:
Categorical reporting of TBI characteristics in children and adolescents can aid clinicians in retrospectively obtaining reliable estimates of TBI severity up to a year post-injury. However, test–retest reliability is strongly impacted by the initial data distribution, selected statistical methods, and potentially by patient difficulty in distinguishing among conceptually similar medical concepts (i.e., PTA vs. confusion).
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
Methods:
263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) by study group were determined by Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
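A minimal sketch of the two analysis steps just described, assuming a hypothetical per-participant table; the column names are illustrative, not the study's actual variables.

```python
# Hedged sketch: (1) Kruskal-Wallis test for group differences in a CSF
# biomarker; (2) covariate-adjusted linear regression of a cognitive domain
# score on that biomarker, with the covariates named in the Methods.
import pandas as pd
from scipy.stats import kruskal
import statsmodels.formula.api as smf

df = pd.read_csv("biomarkers.csv")  # hypothetical per-participant data

groups = [g["csf_mcp1"].values for _, g in df.groupby("study_group")]
h_stat, p_value = kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")

fit = smf.ols(
    "learning ~ csf_mcp1 + age + sex + race + education + cd4",
    data=df,
).fit()
print(fit.summary())
```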
Results:
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Conclusions:
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis, and these chemokines were linked to the cognitive domain of learning, which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in mitigating the persistent inflammation and cognitive impacts of HIV.
The number of people over the age of 65 attending Emergency Departments (EDs) in the United Kingdom (UK) is increasing. Those who attend with a mental health-related problem may be referred to liaison psychiatry for assessment. Improving the responsiveness and integration of liaison psychiatry in general hospital settings is a national priority, which requires psychiatry teams to be adequately resourced and organised. However, trends in the number of referrals of older people from EDs to liaison psychiatry teams are unknown, making such planning difficult.
Method
We performed a national multi-centre retrospective service evaluation, analysing existing psychiatry referral data for people over 65 from EDs. Sites were selected from a convenience sample of older people's liaison psychiatry departments. Departments from all regions of the UK were invited to participate via the RCPsych liaison and older people's faculty email distribution lists. From the departments that returned data, we combined the data and described trends in the number and rate of referrals over a 7-year period.
Result
Referral data from up to 28 EDs across England and Scotland over a 7-year period were analysed (n = 18,828 referrals). There is a general trend towards increasing numbers of older people referred to liaison psychiatry year on year. Rates rose from 1.4 referrals per 1000 ED attenders (>65 years) in 2011 to 4.5 in 2019. There is inter- and intra-site variability in referral numbers per 1000 ED attendances between departments, ranging from 0.1 to 24.3.
Conclusion
To plan an effective healthcare system we need to understand the population it serves and have appropriate structures and processes within it. The overarching message of this study is clear: older people's mental health emergencies presenting in the ED are common and appear to be becoming more so. Without appropriate investment, either in EDs or in community mental health services, this is unlikely to improve.
The data also suggest highly variable inter-departmental referral rates. It is not possible to establish why rates differ so much from one department to another, or whether outcomes for the populations they serve are better or worse. The data do, however, highlight the importance of asking further questions about why the departments differ, and what impact that has on the patients they serve.
Early assessment, diagnosis and management for people living with dementia are essential, both for the patient and their carers. We recognised delays in established local pathways when patients had unplanned acute hospital admissions that prevented them from attending memory diagnostic appointments. The Psychiatric Liaison Team (PLT) Memory Pathway was introduced because we had the skills and expertise to resume the process and to find new, undetected patients.
Our aim was to determine how well the newly implemented PLT Memory Pathway follows the standards outlined in the National Institute for Health and Care Excellence (NICE) guideline NG97: Dementia: assessment, management and support for people living with dementia and their carers.
Method
A retrospective analysis of all PLT referrals from July 2018 to February 2020 (20 months) was performed to identify patients on the community memory pathway and those with possible undetected cognitive impairment. Data were collected from electronic patient records which included demographics, primary and collateral history, cognitive testing and imaging, dementia type among others. Results were analysed using Microsoft Excel.
Result
41 patients were included (59% female). 80% of patients were referred for memory problems or confusion. 63% had previous referrals to a memory service and were on the community memory pathway at the time of the referral. 34% were on anticholinergic medication, but this was documented as reviewed in only 14%. 100% were offered and underwent head imaging. Notably, no patients were from an ethnic minority background. 63% of patients were given a memory diagnosis and 34% had anti-dementia medication started. Patients' families were made aware of the diagnosis in 83% of cases; the remainder lacked next-of-kin details in the patient record. Primary Care was made aware in 100% of cases, and post-diagnostic support was offered in 100% of cases.
Conclusion
The PLT is well placed to bridge the service gap between the acute care trust and established community memory services when dealing with patients with dementia. A dedicated Memory Pathway has helped to close this gap, and adherence to NICE NG97 standards was good, but there is room for improvement. Particular foci will be improving documentation of anticholinergic medication reviews and exploring the absence of ethnic minority patients. Aiming to achieve 100% family involvement is also recommended.
This study has been submitted to the Royal College of Psychiatrists' Faculty of Old Age Annual Conference 2021.
Microstructures, including crystallographic fabric, within the margin of streaming ice can exert strong control on flow dynamics. To characterize a natural setting, we retrieved three cores, two of which reached the bed, from the flank of Jarvis Glacier, eastern Alaska Range, Alaska. The core sites lie ~1 km downstream of the source, with abundant water present in the extracted cores and at the base of the glacier. All cores exhibit dipping layers, a combination of debris bands and bubble-free domains. Grain sizes coarsen on average approaching the lateral margin. Crystallographic orientations are more clustered, with c-axes closer to horizontal, nearer the lateral margin. The measured fabric is sufficiently weak to induce little mechanical anisotropy, but the data suggest that despite the challenging conditions of warm ice, abundant water and a short flow distance, many aspects of the microstructure, including measurable crystallographic fabric, evolved in systematic ways.
The Subglacial Antarctic Lakes Scientific Access (SALSA) Project accessed Mercer Subglacial Lake using environmentally clean hot-water drilling to examine interactions among ice, water, sediment, rock, microbes and carbon reservoirs within the lake water column and underlying sediments. A ~0.4 m diameter borehole was melted through 1087 m of ice and maintained over ~10 days, allowing observation of ice properties and collection of water and sediment with various tools. Over this period, SALSA collected: 60 L of lake water and 10 L of deep borehole water; microbes >0.2 μm in diameter from in situ filtration of ~100 L of lake water; 10 multicores 0.32–0.49 m long; 1.0 and 1.76 m long gravity cores; three conductivity–temperature–depth profiles of borehole and lake water; five discrete depth current meter measurements in the lake; and images of ice, the lake water–ice interface and lake sediments. Temperature and conductivity data showed the hydrodynamic character of water mixing between the borehole and lake after entry. Models simulating melting of the ~6 m thick basal accreted ice layer imply that debris fall-out through the ~15 m water column to the lake sediments from borehole melting had little effect on the stratigraphy of surficial sediment cores.
Definition of disorder subtypes may facilitate precision treatment for posttraumatic stress disorder (PTSD). We aimed to identify PTSD subtypes and evaluate their associations with genetic risk factors, types of stress exposures, comorbidity, and course of PTSD.
Methods
Data came from a prospective study of three U.S. Army Brigade Combat Teams that deployed to Afghanistan in 2012. Soldiers with probable PTSD (PTSD Checklist for Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition ≥31) at three months postdeployment comprised the sample (N = 423) for latent profile analysis using Gaussian mixture modeling and PTSD symptom ratings as indicators. PTSD profiles were compared on polygenic risk scores (derived from external genomewide association study summary statistics), experiences during deployment, comorbidity at three months postdeployment, and persistence of PTSD at nine months postdeployment.
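For illustration, here is a minimal sketch of latent profile analysis via Gaussian mixture modeling, with the number of profiles selected by BIC. The symptom matrix is random stand-in data, not the study's PTSD Checklist ratings, and the 3-profile solution reported below is not assumed by the code.

```python
# Hedged sketch: fit Gaussian mixtures with 1-5 components over symptom
# ratings and pick the number of latent profiles by lowest BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.normal(size=(423, 20))  # stand-in for 20 PTSD symptom ratings

fits = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
        for k in range(1, 6)}
best_k = min(fits, key=lambda k: fits[k].bic(X))
profiles = fits[best_k].predict(X)  # profile assignment per soldier
print(f"best k by BIC: {best_k}")
```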
Results
Latent profile analysis revealed profiles characterized by prominent intrusions, avoidance, and hyperarousal (threat-reactivity profile; n = 129), anhedonia and negative affect (dysphoric profile; n = 195), and high levels of all PTSD symptoms (high-symptom profile; n = 99). The threat-reactivity profile had the most combat exposure and the least comorbidity. The dysphoric profile had the highest polygenic risk for major depression, and more personal life stress and co-occurring major depression than the threat-reactivity profile. The high-symptom profile had the highest rates of concurrent mental disorders and persistence of PTSD.
Conclusions
Genetic and trauma-related factors likely contribute to PTSD heterogeneity, which can be parsed into subtypes that differ in symptom expression, comorbidity, and course. Future studies should evaluate whether PTSD typology modifies treatment response and should clarify distinctions between the dysphoric profile and depressive disorders.
Ecosystem modeling, a pillar of the systems ecology paradigm (SEP), addresses questions such as: how much carbon and nitrogen are cycled within ecological sites, landscapes, or indeed the earth system? Or how are human activities modifying these flows? Modeling, when coupled with field and laboratory studies, represents the essence of the SEP in that models embody accumulated knowledge and generate hypotheses to test understanding of ecosystem processes and behavior. Initially, ecosystem models were primarily used to improve our understanding of how biophysical aspects of ecosystems operate. However, current ecosystem models are widely used to make accurate predictions about how large-scale phenomena such as climate change and management practices impact ecosystem dynamics, and to assess potential effects of these changes on economic activity and policy making. In sum, ecosystem models embedded in the SEP remain our best mechanism to integrate diverse types of knowledge regarding how the earth system functions and to make quantitative predictions that can be confronted with observations of reality. Modeling efforts discussed are the Century ecosystem model, the DayCent ecosystem model, the Grassland Ecosystem Model ELM, food web models, the Savanna model, agent-based and coupled systems modeling, and Bayesian modeling.
Feeding difficulty is a known complication of congenital heart surgery. Despite this, available data regarding risk factors, incidence, associated symptoms, and outcomes are relatively sparse.
Methods:
In this retrospective chart review, patients aged 0–18 years who underwent congenital heart surgery at a single institution between January and December, 2017 were reviewed. Patients with feeding difficulties before surgery, multiple surgeries, and potentially abnormal recurrent laryngeal nerve anatomy were excluded. Data collected included patient demographics, feeding outcomes, post-operative symptoms, flexible nasolaryngoscopy findings, and rates of readmission within a 1-year follow-up period. Multivariable regression analyses were performed to evaluate the risk of an alternative feeding plan at discharge and length of stay.
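A minimal sketch of the multivariable analyses described above, assuming a hypothetical chart-review extract; the outcome and predictor names are illustrative, not the study's exact covariates.

```python
# Hedged sketch: logistic regression for discharge with a feeding tube and
# a linear model for length of stay.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("chd_surgery.csv")  # hypothetical chart-review extract

tube = smf.logit(
    "feeding_tube ~ age_months + stat_score + aspiration + reflux",
    data=df,
).fit()
print(tube.summary())

los = smf.ols(
    "length_of_stay ~ feeding_tube + age_months + stat_score",
    data=df,
).fit()
print(los.summary())
```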
Results:
Three hundred and twenty-six patients met the inclusion criteria for this study. Seventy-two (22.09%) were discharged with a feeding tube, and 70 (97.22%) of this subgroup were younger than 12 months at the time of surgery. Variables that increased the risk of being discharged with a feeding tube included patient age, The Society of Thoracic Surgeons–European Association for Cardio-Thoracic Surgery score, procedure group, aspiration, and reflux. Speech-language pathology was the most frequently utilised consulting service for patients discharged with feeding tubes (90.28%), while other services were consulted infrequently. The median length of stay increased from 4 to 10 days for patients who required an enteral feeding tube at discharge.
Discussion:
Multidisciplinary management protocols and interventions should be developed and standardised to improve feeding outcomes following congenital heart surgery.
Biospecimen repositories play a vital role in enabling investigation of biologic mechanisms, identification of disease-related biomarkers, advances in diagnostic assays, recognition of microbial evolution, and characterization of new therapeutic targets for intervention. They rely on the complex integration of scientific need, regulatory oversight, quality control in collection, processing and tracking, and linkage to robust phenotype information. The COVID-19 pandemic amplified many of these considerations and illuminated new challenges, all while academic health centers were trying to adapt to unprecedented clinical demands and heightened research constraints not witnessed in over 100 years. The outbreak demanded rapid understanding of SARS-CoV-2 to develop diagnostics and therapeutics, prompting an immediate need for access to high-quality, well-characterized COVID-19-associated biospecimens. We surveyed 60 Clinical and Translational Science Award (CTSA) hubs to better understand the strategies and barriers encountered in biobanking before and in response to the COVID-19 pandemic. Feedback revealed a major shift in biorepository models and in specimen-acquisition and consent processes, from a combination of investigator-initiated and institutional protocols to an enterprise-serving strategy. CTSA hubs were well equipped to leverage established capacities and expertise to respond quickly to the scientific needs of this crisis through support of institutional approaches to biorepository management.
This study aimed to examine the predictors of cognitive performance in patients with pediatric mild traumatic brain injury (pmTBI) and to determine whether group differences in cognitive performance on a computerized test battery could be observed between pmTBI patients and healthy controls (HC) in the sub-acute (SA) and the early chronic (EC) phases of injury.
Method:
203 pmTBI patients recruited from emergency settings and 159 age- and sex-matched HC aged 8–18 rated their ongoing post-concussive symptoms (PCS) on the Post-Concussion Symptom Inventory and completed the Cogstate brief battery in the SA (1–11 days) phase of injury. A subset (156 pmTBI patients; 144 HC) completed testing in the EC (~4 months) phase.
Results:
Within the SA phase, a group difference was observed only for the visual learning task (One-Card Learning), with pmTBI patients being less accurate relative to HC. Follow-up analyses indicated that higher ongoing PCS and higher 5P clinical risk scores were significant predictors of lower One-Card Learning accuracy within the SA phase, while premorbid variables (estimates of intellectual functioning, parental education, and presence of learning disabilities or attention-deficit/hyperactivity disorder) were not.
Conclusions:
The absence of group differences at the EC phase supports cognitive recovery by 4 months post-injury. While the severity of ongoing PCS and the 5P score were better overall predictors of cognitive performance on the Cogstate at the SA phase than premorbid variables, the full regression model explained only 4.1% of the variance, highlighting the need for future work on predictors of cognitive outcomes.
Streaming ice accounts for a major fraction of global ice flux, yet we cannot fully explain the dominant controls on its kinematics. In this contribution, we use an anisotropic full-Stokes thermomechanical flow solver to characterize how mechanical anisotropy and temperature distribution affect ice flux. For the ice stream and glacier geometries we explored, ice flux increases by 1–3% per °C of temperature increase in the margin. Glaciers and ice streams with crystallographic fabric oriented approximately normal to the shear plane show comparable increases: an otherwise isotropic ice stream containing a concentrated transverse single-maximum fabric in the margin flows 15% faster than the reference case. Fabric and temperature variations independently impact ice flux, with slightly nonlinear interactions. We find that realistic variations in temperature and crystallographic fabric affect ice flux to similar degrees, with the exact effect a function of the local fabric and temperature distributions. Given this sensitivity, direct field-based measurements and models incorporating additional factors, such as water content and temporal evolution, are essential for explaining and predicting streaming ice dynamics.
Unit cohesion may protect service member mental health by mitigating effects of combat exposure; however, questions remain about the origins of potential stress-buffering effects. We examined buffering effects associated with two forms of unit cohesion (peer-oriented horizontal cohesion and subordinate-leader vertical cohesion) defined as either individual-level or aggregated unit-level variables.
Methods
Longitudinal survey data from US Army soldiers who deployed to Afghanistan in 2012 were analyzed using mixed-effects regression. Models evaluated individual- and unit-level interaction effects of combat exposure and cohesion during deployment on symptoms of post-traumatic stress disorder (PTSD), depression, and suicidal ideation reported at 3 months post-deployment (model n's = 6684 to 6826). Given the small effective sample size (k = 89), the significance of unit-level interactions was evaluated at a 90% confidence level.
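A minimal sketch of an individual-level buffering (moderation) model of the kind described above, with soldiers nested in units; the column names are illustrative assumptions, not the study's variables.

```python
# Hedged sketch: the combat x cohesion interaction term carries the
# buffering effect; a negative coefficient means cohesion dampens the
# exposure-symptom slope. Soldiers are grouped by unit.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical soldier-level data

model = smf.mixedlm(
    "ptsd ~ combat_exposure * horizontal_cohesion",
    data=df,
    groups=df["unit_id"],
)
print(model.fit().summary())
```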
Results
At the individual-level, buffering effects of horizontal cohesion were found for PTSD symptoms [B = −0.11, 95% CI (−0.18 to −0.04), p < 0.01] and depressive symptoms [B = −0.06, 95% CI (−0.10 to −0.01), p < 0.05]; while a buffering effect of vertical cohesion was observed for PTSD symptoms only [B = −0.03, 95% CI (−0.06 to −0.0001), p < 0.05]. At the unit-level, buffering effects of horizontal (but not vertical) cohesion were observed for PTSD symptoms [B = −0.91, 90% CI (−1.70 to −0.11), p = 0.06], depressive symptoms [B = −0.83, 90% CI (−1.24 to −0.41), p < 0.01], and suicidal ideation [B = −0.32, 90% CI (−0.62 to −0.01), p = 0.08].
Conclusions
Policies and interventions that enhance horizontal cohesion may protect combat-exposed units against post-deployment mental health problems. Efforts to support individual soldiers who report low levels of horizontal or vertical cohesion may also yield mental health benefits.
We developed a tilt sensor for studying ice deformation and installed our tilt sensor systems in two boreholes drilled close to the shear margin of Jarvis Glacier, Alaska, to obtain kinematic measurements of streaming ice. We used the collected tilt data to calculate borehole deformation by tracking the orientation of the sensors over time. The sensors' tilts generally trended down-glacier, with an element of cross-glacier flow in the borehole closer to the shear margin. We also evaluated our results against flow-dynamic parameters derived from Glen's flow law and explored the parameter space of the stress exponent n and enhancement factor E. Comparison with values from ice deformation experiments shows that the ice on Jarvis is characterized by higher n values than expected in regions of low stress, particularly at the shear margin (~3.4). The higher n values could be attributed to the observed high total strains coupled with potential dynamic recrystallization, causing development of anisotropy and consequently enhanced ice flow. Jarvis' n values place the creep regime of the ice between basal slip and dislocation creep. Tuning E towards a theoretical upper limit of 10 for anisotropic ice with a single-maximum fabric reduces the n values by 0.2.
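For reference, the flow law referred to above is commonly written in its enhanced form as

$$\dot{\varepsilon} = E\,A(T)\,\tau^{n},$$

where $\dot{\varepsilon}$ is the effective strain rate, $A(T)$ the temperature-dependent rate factor, $\tau$ the effective deviatoric stress, $n$ the stress exponent, and $E$ the enhancement factor. In this form, tuning $E$ upward for anisotropic ice, as described above, trades enhancement against the inferred value of $n$.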
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that the Frascati criteria are too liberal, resulting in a high false-positive rate. Meyer et al. recommended more conservative revisions to the HAND criteria, including exploring other methodologies commonly used for identifying neurocognitive impairment (NCI) in HIV, such as the global deficit score (GDS). This study compares NCI classifications by the Frascati, Meyer, and GDS methods in relation to neuroimaging markers of brain integrity in HIV.
Method:
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati criteria versus Meyer criteria: concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired via Frascati criteria but unimpaired via Meyer criteria. To investigate the GDS versus the Meyer criteria, the same groupings were constructed using GDS criteria instead of Frascati criteria.
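As context for the GDS method named above, here is a minimal sketch using the commonly published T-score-to-deficit-score conversion and the conventional GDS ≥ 0.5 impairment cutoff; treat these cutpoints as an assumption rather than this study's verified procedure.

```python
# Hedged sketch of the global deficit score (GDS) approach: T >= 40 maps to
# a deficit score of 0; each 5-point band below 40 adds one deficit point,
# capped at 5. GDS is the mean deficit score across test measures, with
# GDS >= 0.5 conventionally taken as impairment.
def deficit_score(t_score: float) -> int:
    if t_score >= 40:
        return 0
    return min(5, int((40 - t_score) // 5) + 1)

def global_deficit_score(t_scores: list[float]) -> float:
    return sum(deficit_score(t) for t in t_scores) / len(t_scores)

t_scores = [45, 38, 52, 33, 41, 47, 36]  # toy T-scores across measures
gds = global_deficit_score(t_scores)
print(f"GDS = {gds:.2f}, impaired = {gds >= 0.5}")
```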
Results:
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
Conclusions:
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
One of the foundations of product design is the division between design and production. This division manifests as designers aspiring to create fixed, iconic archetypes that production then replicates endlessly in thousands or millions. Today, innovation and technological change are challenging this idea of product design and manufacturing. The evolution of Rapid Prototyping into Additive Manufacturing (AM) is challenging the notion of mass manufacture and consumer value. As AM advances in capability and capacity, the ability to economically manufacture products in low numbers with high degrees of personalisation raises questions about the accepted product development process. Removing the need for dedicated, expensive tooling also eliminates the cyclical timescales and the commitment to fixed designs that investment in tooling demands. The ability to alter designs arbitrarily, frequently and responsively means that the traditional design process need not be applied, and design processes and practice might therefore be radically different in the future. In this paper, we explore this possible evolution by drawing parallels with principles and development models found in software development.
Whereas genetic susceptibility increases the risk for major depressive disorder (MDD), non-genetic protective factors may mitigate this risk. In a large-scale prospective study of US Army soldiers, we examined whether trait resilience and/or unit cohesion could protect against the onset of MDD following combat deployment, even in soldiers at high polygenic risk.
Methods
Data were analyzed from 3079 soldiers of European ancestry assessed before and after their deployment to Afghanistan. Incident MDD was defined as no MDD episode at pre-deployment, followed by an MDD episode after deployment. Polygenic risk scores (PRS) were constructed from a large-scale genome-wide association study of major depression. We first examined the main effects of the MDD PRS and each protective factor on incident MDD. We then tested the effects of each protective factor on incident MDD across strata of polygenic risk.
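A minimal sketch of the two-step analysis described above (main effects, then protective-factor effects within strata of polygenic risk), assuming a hypothetical analysis-ready table with illustrative column names.

```python
# Hedged sketch: logistic regression of incident MDD on a polygenic risk
# score and protective factors, then the cohesion effect within PRS strata.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("soldiers.csv")  # hypothetical analysis-ready data
df["prs_tertile"] = pd.qcut(df["mdd_prs"], 3, labels=["low", "mid", "high"])

# Main effects of polygenic risk and each protective factor.
main = smf.logit(
    "incident_mdd ~ mdd_prs + unit_cohesion + resilience", data=df
).fit()
print(main.summary())

# Protective effect of unit cohesion within each stratum of polygenic risk.
for tertile, sub in df.groupby("prs_tertile", observed=True):
    fit = smf.logit("incident_mdd ~ unit_cohesion", data=sub).fit(disp=False)
    print(tertile, fit.params["unit_cohesion"])
```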
Results
Polygenic risk showed a dose–response relationship with depression, such that soldiers at high polygenic risk had the greatest odds of incident MDD. Both unit cohesion and trait resilience were prospectively associated with reduced risk for incident MDD. Notably, the protective effect of unit cohesion persisted even in soldiers at the highest polygenic risk.
Conclusions
Polygenic risk was associated with new-onset MDD in deployed soldiers. However, unit cohesion – an index of perceived support and morale – was protective against incident MDD even among those at highest genetic risk, and may represent a potent target for promoting resilience in vulnerable soldiers. Findings illustrate the value of combining genomic and environmental data in a prospective design to identify robust protective factors for mental health.
Objectives:
Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA.
Methods:
734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences in neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status.
Results:
Neurocognitive status rates and demographic characteristics differed between PLWH (SA = 17%; CN = 38%; CI = 45%) and HIV-uninfected participants (SA = 35%; CN = 55%; CI = 11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, higher employment, and better health-related quality of life than non-SA participants.
Conclusions:
Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
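As an illustration of the multinomial logistic regression step described in the Methods, here is a minimal sketch under assumed column names (not the study's actual variables).

```python
# Hedged sketch: multinomial logistic regression of neurocognitive status
# (SuperAger / cognitively normal / impaired) on candidate predictors.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("plwh.csv")  # hypothetical participant-level data

fit = smf.mnlogit(
    "status ~ age + verbal_iq + diabetes + depressive_sx + cannabis_ud",
    data=df,
).fit()
print(fit.summary())
```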