Introduction: The Canadian Syncope Risk Score (CSRS) is a validated risk tool, developed using best practices of conventional biostatistics, for predicting 30-day serious adverse events (SAE) after an Emergency Department (ED) visit for syncope. We sought to improve on the predictive ability of the CSRS, and to compare it with physician judgement, using modern machine learning (ML) methods from artificial intelligence (AI) research. Methods: We used the prospective multicenter cohort data collected for CSRS derivation and validation at 11 EDs across Canada over an 8-year period. The same 43 candidate variables considered for CSRS development were used to train and validate four classes of ML models to predict 30-day SAE (death, arrhythmias, MI, structural heart disease, pulmonary embolism, hemorrhage) after ED disposition. Physician judgement was modeled using two variables: referral for consultation and hospitalization. We compared the area under the curve (AUC) for the three models. Results: The proportion of patients who suffered a 30-day SAE was 3.6% in the derivation cohort (N = 4030) and 3.4% in the validation cohort (N = 2290). Characteristics of the two cohorts were similar, with no distributional shift. The best-performing ML model, a gradient-boosted tree model, used all 43 variables as predictors, as opposed to the 9 final CSRS predictors. The AUCs for the three models on the validation data were: best ML model 0.91 (95% CI 0.87–0.93), CSRS 0.87 (95% CI 0.83–0.90) and physician judgement 0.79 (95% CI 0.74–0.84). The most important predictors in the ML model were the same as the CSRS predictors. Conclusion: An ML model developed for risk stratification of ED syncope showed slightly better, though not significantly different, discrimination than the CSRS. Both the ML model and the CSRS were better predictors of poor outcomes after syncope than physician judgement. ML models can match the discrimination of traditional statistical models and outperform physician judgement, given their ability to use all candidate variables.
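The comparison described above can be sketched in a few lines. The following is a minimal illustration, assuming one pandas DataFrame per cohort with the 43 candidate predictors (already numerically encoded) and a binary 30-day SAE outcome; all file and column names are hypothetical, and the study's actual gradient-boosting pipeline and tuning are not specified in the abstract.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

derivation = pd.read_csv("csrs_derivation.csv")  # hypothetical file
validation = pd.read_csv("csrs_validation.csv")  # hypothetical file

predictors = [c for c in derivation.columns if c != "sae_30d"]

# Gradient-boosted trees on all 43 candidate variables
gbm = GradientBoostingClassifier(random_state=0)
gbm.fit(derivation[predictors], derivation["sae_30d"])
auc_ml = roc_auc_score(validation["sae_30d"],
                       gbm.predict_proba(validation[predictors])[:, 1])

# "Physician judgement" proxy: referral for consultation + hospitalization
judgement_vars = ["referral_for_consultation", "hospitalization"]
lr = LogisticRegression()
lr.fit(derivation[judgement_vars], derivation["sae_30d"])
auc_md = roc_auc_score(validation["sae_30d"],
                       lr.predict_proba(validation[judgement_vars])[:, 1])

print(f"ML AUC: {auc_ml:.2f}  Physician-judgement AUC: {auc_md:.2f}")
```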
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents with possible ACMS and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
Integration of depression treatment into primary care could improve patient outcomes in low-resource settings. Losses along the depression care cascade limit integrated service effectiveness. This study identified patient-level factors that predicted detection of depressive symptoms by nurses, referral for depression treatment, and uptake of counseling, as part of integrated care in KwaZulu-Natal, South Africa.
This was an analysis of baseline data from a prospective cohort. Participants were adult patients with at least moderate depressive symptoms at primary care facilities in Amajuba, KwaZulu-Natal, South Africa. Participants were screened for depressive symptoms prior to routine assessment by a nurse. Generalized linear mixed-effects models were used to estimate associations between patient characteristics and service delivery outcomes.
Data from 412 participants were analyzed. Nurses successfully detected depressive symptoms in 208 [50.5%, 95% confidence interval (CI) 38.9–62.0] participants; of these, they referred 76 (36.5%, 95% CI 20.3–56.5) for depression treatment; of these, 18 (23.7%, 95% CI 10.7–44.6) attended at least one session of depression counseling. Depressive symptom severity, alcohol use severity, and perceived stress were associated with detection. These factors were not associated with referral or counseling uptake.
Nurses detected patients with depressive symptoms at rates comparable to primary care providers in high-resource settings, though gaps in referral and uptake persist. Nurses were more likely to detect symptoms among patients in more severe mental distress. Implementation strategies for integrated mental health care in low-resource settings should target improved rates of detection, referral, and uptake.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and for public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognizing CBRNE science as a distinct competency and establishing the CBRNE medical operations science support expert would inform the public of the enormous progress made, broadcast opportunities for new talent, and enhance the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Although dementia has been described in ancient texts over many centuries (e.g., “Be kind to your father, even if his mind fail him.” – Old Testament: Sirach 3:12), our knowledge of its underlying causes is little more than a century old. Alzheimer published his now famous case study only 110 years ago, and our modern understanding of the disease that bears his name, and its neuropsychological consequences, really only began to accelerate in the 1980s. Since then we have witnessed an explosion of basic and translational research into the causes, characterizations, and possible treatments for Alzheimer’s disease (AD) and other dementias. We review this lineage of work beginning with Alzheimer’s own writings and drawings, then jump to the modern era beginning in the 1970s and early 1980s and provide a sampling of neuropsychological and other contextual work from each ensuing decade. During the 1980s our field began its foundational studies of profiling the neuropsychological deficits associated with AD and its differentiation from other dementias (e.g., cortical vs. subcortical dementias). The 1990s continued these efforts and began to identify the specific cognitive mechanisms affected by various neuropathologic substrates. The 2000s ushered in a focus on the study of prodromal stages of neurodegenerative disease before the full-blown dementia syndrome (i.e., mild cognitive impairment). The current decade has seen the rise of imaging and other biomarkers to characterize preclinical disease before the development of significant cognitive decline. Finally, we suggest future directions and predictions for dementia-related research and potential therapeutic interventions. (JINS, 2017, 23, 818–831)
This paper highlights major developments over the past two to three decades in the neuropsychology of movement and its disorders. We focus on studies in healthy individuals and patients, which have identified cognitive contributions to movement control and animal work that has delineated the neural circuitry that makes these interactions possible. We cover advances in three major areas: (1) the neuroanatomical aspects of the “motor” system with an emphasis on multiple parallel circuits that include cortical, corticostriate, and corticocerebellar connections; (2) behavioral paradigms that have enabled an appreciation of the cognitive influences on the preparation and execution of movement; and (3) hemispheric differences (exemplified by limb praxis, motor sequencing, and motor learning). Finally, we discuss the clinical implications of this work, and make suggestions for future research in this area. (JINS, 2017, 23, 768–777)
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Identifying the transmission sources and reservoirs of Streptococcus pneumoniae (SP) is a long-standing question for pneumococcal epidemiology, transmission dynamics, and vaccine policy. Here we use serotype to identify SP transmission and examine acquisitions (in the same household, local community, and county, or of unidentified origin) in a longitudinal cohort of children and adults from the Navajo Nation and the White Mountain Apache American Indian Tribes. We found that adults acquire SP in the household relatively more often than other age groups do, and that children 2–8 years old typically acquire it in their own or surrounding communities. Age-specific transmission probability matrices show that transmissions within households were mostly from older to younger siblings. Outside the household, children most often transmit to other children in the same age group, showing age-assortative mixing behavior. We find toddlers and older children to be most involved in SP transmission and acquisition, indicating their role as key drivers of SP epidemiology. Although infants have high carriage prevalence, they do not play a central role in the transmission of SP compared with toddlers and older children. Our results are relevant for informing alternative pneumococcal conjugate vaccine dosing strategies and analytic efforts to optimize vaccine programs, as well as for assessing the transmission dynamics of pathogens transmitted by close contact in general.
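A transmission probability matrix of the kind described can be built by row-normalizing a matrix of observed transmitter-to-recipient counts. The sketch below uses made-up counts and illustrative age strata, not the study's data:

```python
import numpy as np

# Hypothetical counts of observed transmission pairs:
# rows = transmitter age group, columns = recipient age group
# (strata <2, 2-4, 5-8, 9-17, 18+ are illustrative only).
counts = np.array([
    [12,  8,  5,  2,  9],
    [15, 40, 22,  6, 11],
    [ 7, 25, 38, 10,  8],
    [ 3,  9, 12, 14,  6],
    [20, 18, 10,  5,  7],
])

# Row-normalize so entry (i, j) estimates P(recipient in group j |
# transmitter in group i); age-assortative mixing appears as large
# diagonal entries.
probs = counts / counts.sum(axis=1, keepdims=True)
print(np.round(probs, 2))
```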
Negative symptoms significantly contribute to disability and lack of community participation for low-functioning individuals with schizophrenia. Cognitive therapy has been shown to improve negative symptoms and functional outcome in this population. Elucidation of the mechanisms of the therapy would lead to a better understanding of negative symptoms and the development of more effective interventions to promote recovery. The objective of this study was to determine (1) whether guided success at a card-sorting task would produce improvement in defeatist beliefs, positive beliefs about the self, mood, and card-sorting performance, and (2) whether these changes in beliefs and mood would predict improvements in unguided card-sorting.
Individuals with schizophrenia having prominent negative symptoms and impaired neurocognitive performance (N = 35) were randomized to guided success (n = 19) or a control (n = 16) condition.
Controlling for baseline performance, the experimental group performed significantly better, endorsed defeatist beliefs to a lesser degree, reported greater positive self-concept, and reported better mood than the control condition immediately after the experimental session. A composite index of change in defeatist beliefs, self-concept, and mood was significantly correlated with improvements in card-sorting.
This analogue study supports the rationale of cognitive therapy and provides a general therapeutic model in which experiential interventions that produce success have a significant immediate effect on a behavioral task, mediated by changes in beliefs and mood. The rapid improvement is a promising indicator of the responsiveness of this population, often regarded as recalcitrant, to cognitively-targeted behavioral interventions.
Evidence for a relationship between neurocognition and functional outcome in important areas of community living is robust in serious mental illness research. Dysfunctional attitudes (defeatist performance beliefs and asocial beliefs) have been identified as intervening variables in this causal chain. This study seeks to expand upon previous research by longitudinally testing the link between neurocognition and community participation (i.e. time in community-based activity) through dysfunctional attitudes and motivation.
Adult outpatients with serious mental illness (N = 175) participated, completing follow-up assessments approximately 6 months after initial assessment. Path analysis tested relationships between baseline neurocognition, emotion perception, functional skills, dysfunctional attitudes, motivation, and outcome (i.e. community participation) at baseline and follow-up.
Path models demonstrated two pathways to community participation. The first linked neurocognition and community participation through functional skills, defeatist performance beliefs, and motivation. A second pathway linked asocial beliefs and community participation, via a direct path passing through motivation. Model fit was excellent for models predicting overall community participation at baseline and, importantly, at follow-up.
The existence of multiple pathways to community participation in a longitudinal model supports the utility of multi-modal interventions for serious mental illness (i.e. treatment packages that build upon individuals’ strengths while addressing the array of obstacles to recovery) that feature dysfunctional attitudes and motivation as treatment targets.
This paper seeks to establish good practice in setting inputs for operational risk models for banks, insurers and other financial service firms. It reviews Basel, Solvency II and other regulatory requirements as well as publicly available literature on operational risk modelling. It recommends a combination of historic loss data and scenario analysis for modelling of individual risks, setting out issues with these data, and outlining good practice for loss data collection and scenario analysis. It recommends the use of expert judgement for setting correlations, and addresses information requirements for risk mitigation allowances and capital allocation, before briefly covering Bayesian network methods for modelling operational risks.
The Solvency II Directive introduces the idea of a formal Actuarial Function with responsibility for delivering the requirements of Article 48 of the Directive. Article 48 describes these responsibilities as concerning technical provisions, an opinion on reinsurance adequacy, an opinion on underwriting policy and contributing to the risk management system. Considerable documentation has been produced on the subject by the Prudential Regulation Authority (PRA), the Institute and Faculty of Actuaries (IFoA) and the European Insurance and Occupational Pensions Authority, much of it published shortly before this paper.

The purpose of this paper is to provide the reader with some practical insights and suggestions for addressing the requirements of Article 48 of the Solvency II Directive in general insurance firms, taking into consideration the publications of the aforementioned regulatory authorities. It is not our intention to give advice, nor to be seen to give advice, but rather to make suggestions and observations that we hope the reader will find useful.

The Regulations lay down the tasks of the Actuarial Function, so insurers should consider the need for formal terms of reference, backed up by proportionate governance procedures. The Regulations also require the production of an Actuarial Function Report documenting the tasks undertaken by the Actuarial Function and their results. Such a report can be an aggregate report, made up of individual component reports completed at suitable points in the Actuarial Function’s work cycle, so long as it reports on all the required tasks. The technical provisions section should cover at least all the areas laid down in the Delegated Acts. The opinions required on reinsurance adequacy and underwriting policy are not formal “sign-offs”, but contributions to the effective running of the insurer, applying the skills and knowledge of actuaries to areas for which they are not normally responsible. Again, the Delegated Acts mandate the minimum contribution the Actuarial Function should make.

The responsibility for delivering the work of the Actuarial Function does not have to be given to a member of the IFoA; however, the PRA is going to require (at least) one person to be designated the “Chief Actuary”, defined as the person responsible for delivering the requirements of Article 48 of the Directive. In response, the IFoA has stated its intention to require its members holding the role of Chief Actuary, as defined by the PRA, to hold a practising certificate.

Any Actuarial Function will need to consider issues of governance, independence and conflicts of interest. The PRA intends to require the Actuarial Function to be independent of an insurer’s revenue-generating functions. In addition, normal good governance requires a degree of separation between those who perform Actuarial Function work and those who review and supervise it. There are numerous stakeholders in the Actuarial Function’s work: some will rely on its output, others will provide inputs to it. Setting out stakeholder responsibilities clearly, and in advance, will be vitally important, and good communication and coordination between these groups will be important to the efficient running of the insurer.
Bringing together issues of governance, independence and meeting the Directive and regulators’ requirements will require a suitable organisational structure which will also need to consider practical issues, such as the availability of suitable staff. Many such arrangements may be possible, but all will require trading off advantages and disadvantages. The Actuarial Function is primarily about good practice and getting the most out of the actuarial skills available. For many insurers, meeting the requirements should not be unduly burdensome.
Data were pooled from three Australian sentinel general practice influenza surveillance networks to estimate Australia-wide influenza vaccine coverage and effectiveness against community presentations for laboratory-confirmed influenza for the 2012, 2013 and 2014 seasons. Patients presenting with influenza-like illness at participating GP practices were swabbed and tested for influenza. The vaccination odds of patients testing positive were compared with those of patients testing negative to estimate influenza vaccine effectiveness (VE) by logistic regression, adjusting for age group, week of presentation and network. Pooling data across Australia increased the sample size for estimation from a minimum of 684 to 3,683 in 2012, from 314 to 2,042 in 2013 and from 497 to 3,074 in 2014. Overall VE was 38% [95% confidence interval (CI) 24–49] in 2012, 60% (95% CI 45–70) in 2013 and 44% (95% CI 31–55) in 2014. For A(H1N1)pdm09, VE was 54% (95% CI −28 to 83) in 2012, 59% (95% CI 33–74) in 2013 and 55% (95% CI 39–67) in 2014. For A(H3N2), VE was 30% (95% CI 14–44) in 2012, 67% (95% CI 39–82) in 2013 and 26% (95% CI 1–45) in 2014. For influenza B, VE was stable across years at 56% (95% CI 37–70) in 2012, 57% (95% CI 30–73) in 2013 and 54% (95% CI 21–73) in 2014. Overall VE against influenza was low in 2012 and 2014, when A(H3N2) was the dominant strain and the vaccine was poorly matched. In contrast, overall VE was higher in 2013, when A(H1N1)pdm09 dominated and the vaccine was a better match. Pooling data can increase the sample available and enable more precise subtype- and age-group-specific estimates, but limitations remain.
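In a test-negative design like this one, VE is derived from the adjusted odds ratio (OR) of vaccination among test-positive versus test-negative patients: VE = (1 − OR) × 100. A minimal sketch follows; the dataset and variable names are hypothetical, and vaccination status is assumed to be coded 0/1.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ili_swabs.csv")  # hypothetical pooled dataset

# Logistic regression of test result on vaccination status,
# adjusting for age group, week of presentation and network.
model = smf.logit(
    "flu_positive ~ vaccinated + C(age_group) + C(week) + C(network)",
    data=df,
).fit()

or_vax = float(np.exp(model.params["vaccinated"]))  # adjusted OR
ve = (1 - or_vax) * 100
print(f"Adjusted VE: {ve:.0f}%")
```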
To investigate familial influences on the full range of variability in attention and activity across adolescence, we collected maternal ratings of 339 twin pairs at ages 12, 14, and 16, and estimated the transmitted and new familial influences on attention and activity as measured by the Strengths and Weaknesses of Attention-Deficit/Hyperactivity Disorder Symptoms and Normal Behavior Scale. Familial influences were substantial for both traits across adolescence: genetic influences accounted for 54%–73% (attention) and 31%–73% (activity) of the total variance, and shared environmental influences accounted for 0%–22% of the attention variance and 13%–57% of the activity variance. The longitudinal stability of individual differences in attention and activity was largely accounted for by familial influences transmitted from previous ages. Innovations over adolescence were also partially attributable to familial influences. Studying the full range of variability in attention and activity may facilitate our understanding of attention-deficit/hyperactivity disorder's etiology and intervention.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
Genetic influences contribute significantly to co-morbidity between conduct disorder and substance use disorders. Estimating the extent of overlap can assist in the development of phenotypes for genomic analyses.
Multivariate quantitative genetic analyses were conducted using data from 9577 individuals, including 3982 complete twin pairs and 1613 individuals whose co-twin was not interviewed (aged 24–37 years) from two Australian twin samples. Analyses examined the genetic correlation between alcohol dependence, nicotine dependence and cannabis abuse/dependence and the extent to which the correlations were attributable to genetic influences shared with conduct disorder.
Additive genetic (a2 = 0.48–0.65) and non-shared environmental factors explained variance in substance use disorders. Familial effects on conduct disorder were due to additive genetic (a2 = 0.39) and shared environmental (c2 = 0.15) factors. All substance use disorders were influenced by shared genetic factors (rg = 0.38–0.56), with all genetic overlap between substances attributable to genetic influences shared with conduct disorder. Genes influencing individual substance use disorders were also significant, explaining 40–73% of the genetic variance per substance.
Among substance users in this sample, the well-documented clinical co-morbidity between conduct disorder and substance use disorders is primarily attributable to shared genetic liability. Interventions targeted at generally reducing deviant behaviors may address the risk posed by this shared genetic liability. However, there is also evidence for genetic and environmental influences specific to each substance. The identification of these substance-specific risk factors (as well as potential protective factors) is critical to the future development of targeted treatment protocols.
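As a back-of-envelope illustration of how twin designs partition variance (the study above fitted multivariate genetic models, not this shortcut), Falconer's formulas estimate additive genetic (a2), shared environmental (c2) and non-shared environmental (e2) components from MZ and DZ twin-pair correlations. The correlations below are hypothetical:

```python
# Classical twin-design variance decomposition via Falconer's formulas:
#   a2 = 2 * (rMZ - rDZ)
#   c2 = 2 * rDZ - rMZ
#   e2 = 1 - rMZ
def falconer(r_mz: float, r_dz: float) -> tuple[float, float, float]:
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return a2, c2, e2

# Hypothetical twin-pair correlations for a substance use phenotype
a2, c2, e2 = falconer(r_mz=0.60, r_dz=0.33)
print(f"a2={a2:.2f}  c2={c2:.2f}  e2={e2:.2f}")
```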
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual-level data on height and weight including repeated measurements, birth-related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
BACKGROUND
Policymakers may wish to align healthcare payment and quality of care while minimizing unintended consequences, particularly for safety net hospitals.
OBJECTIVE
To determine whether the 2008 Centers for Medicare and Medicaid Services Hospital-Acquired Conditions policy had a differential impact on targeted healthcare-associated infection rates in safety net compared with non–safety net hospitals.
DESIGN
Interrupted time-series design.
SETTING AND PARTICIPANTS
Nonfederal acute care hospitals that reported central line–associated bloodstream infection and ventilator-associated pneumonia rates to the Centers for Disease Control and Prevention’s National Health Safety Network from July 1, 2007, through December 31, 2013.
RESULTS
We did not observe changes in the slope of targeted infection rates in the postpolicy period compared with the prepolicy period for either safety net (postpolicy vs prepolicy ratio, 0.96 [95% CI, 0.84–1.09]) or non–safety net (0.99 [0.90–1.10]) hospitals. Controlling for prepolicy secular trends, we did not detect differences in an immediate change at the time of the policy between safety net and non–safety net hospitals (P for 2-way interaction, .87).
CONCLUSIONS
The Centers for Medicare and Medicaid Services Hospital-Acquired Conditions policy did not have an impact, either positive or negative, on already declining rates of central line–associated bloodstream infection in safety net or non–safety net hospitals. Continued evaluations of the broad impact of payment policies on safety net hospitals will remain important as the use of financial incentives and penalties continues to expand in the United States.
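An interrupted time-series analysis of this kind is typically fitted as a segmented regression with terms for the secular trend, a level change at the policy date, and a slope change afterwards. The sketch below simulates quarterly infection counts with an exposure offset; it is illustrative only and omits the hospital-level clustering and safety-net stratification of the published analysis.

```python
import numpy as np
import statsmodels.api as sm

quarters = np.arange(26)             # Jul 2007 - Dec 2013, quarterly
post = (quarters >= 5).astype(int)   # policy effective Oct 2008
time_post = np.where(post == 1, quarters - 5, 0)

# Simulated infections and central-line days as an example outcome
rng = np.random.default_rng(0)
line_days = rng.integers(9_000, 11_000, size=quarters.size)
rate = np.exp(-4.0 - 0.02 * quarters)  # secular decline, no policy effect
infections = rng.poisson(rate * line_days)

# Poisson segmented regression with log(line-days) as the exposure offset;
# coefficients: intercept, secular slope, level change, slope change
X = sm.add_constant(np.column_stack([quarters, post, time_post]))
fit = sm.GLM(infections, X, family=sm.families.Poisson(),
             offset=np.log(line_days)).fit()
print(fit.params)
```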