Background: Proper care and maintenance of central lines is essential to prevent central-line–associated bloodstream infections (CLABSI). Our facility implemented a hospital-wide central-line maintenance bundle based on CLABSI prevention guidelines. The objective of this study was to determine whether maintenance bundle adherence was influenced by nursing shift or day of the week. Methods: A central-line maintenance bundle was implemented in April 2018 at a 1,266-bed academic medical center. The maintenance bundle components included alcohol-impregnated disinfection caps on all ports and infusion tubing; infusion tubing dated; dressings not damp or soiled; no oozing at the insertion site greater than the size of a quarter; dressings occlusive with all edges intact; transparent dressing change recorded within 7 days; and no gauze dressings in place for >48 hours. To monitor bundle compliance, 4 non–unit-based nurse observers were trained to audit central lines. Observations were collected between August 2018 and October 2019 and were performed during all shifts, 7 days per week. Just-in-time feedback was provided for noncompliant central lines. Nursing shifts were defined as day (7:00 a.m. to 3:00 p.m.), evening (3:00 p.m. to 11:00 p.m.), and night (11:00 p.m. to 7:00 a.m.). Central-line bundle compliance between shifts was compared using multinomial logistic regression. Bundle compliance on weekdays versus weekends was compared using Mantel-Haenszel χ2 analysis. Results: Of the 25,902 observations collected, 11,135 (42.9%) occurred on the day shift, 11,559 (44.6%) on the evening shift, and 3,208 (12.4%) on the night shift. Overall, 22,114 (85.9%) observations occurred on a weekday versus 3,788 (14.6%) on a Saturday or Sunday (median observations per day of the week, 2,570; range, 1,680–6,800). In total, 4,599 central lines (17.8%) were noncompliant with ≥1 bundle component. The most common reasons for noncompliance were dressing not dated (n = 1,577; 44.0%) and dressings not occlusive with all edges intact (n = 1,340; 37.4%). The noncompliance rates for central-line observations by shift were 12.8% (1,430 of 11,135) on the day shift, 20.4% (2,361 of 11,559) on the evening shift, and 25.2% (808 of 3,208) on the night shift. Compared to the day shift, the evening shift (OR, 1.74; 95% CI, 1.62–1.87; P < .001) and night shift (OR, 2.29; 95% CI, 2.07–2.52; P < .001) were more likely to have a noncompliant central line. Compared to a weekday, observations on weekend days were more likely to find a noncompliant central line: 914 of 3,788 (24.4%) on weekend days versus 3,685 of 22,114 (16.7%) on weekdays (P < .001). Conclusions: Noncompliance with the central-line maintenance bundle was more likely on evening and night shifts and on weekends.
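As a rough arithmetic check, the unadjusted shift odds ratios can be reproduced directly from the counts reported above. The sketch below (Python, with scipy assumed available) uses simple 2×2 odds ratios and a plain chi-square test rather than the multinomial logistic regression and Mantel-Haenszel χ2 analysis the study actually used, so it is only an approximation, not the authors' code.

```python
# Illustrative check (not the authors' code): unadjusted odds ratios for the
# shift comparison and a plain chi-square for the weekend comparison, computed
# from the counts reported in the abstract.
import numpy as np
from scipy.stats import chi2_contingency

def odds_ratio(noncompliant_a, total_a, noncompliant_b, total_b):
    """Odds ratio of group A vs. reference group B, with a 95% Wald CI."""
    a, b = noncompliant_a, total_a - noncompliant_a
    c, d = noncompliant_b, total_b - noncompliant_b
    or_ = (a / b) / (c / d)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)      # SE of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
    return or_, (lo, hi)

# Evening and night shifts vs. day shift (reference)
print(odds_ratio(2361, 11559, 1430, 11135))   # ~ (1.74, (1.62, 1.87))
print(odds_ratio(808, 3208, 1430, 11135))     # ~ (2.29, (2.07, 2.52))

# Weekend vs. weekday noncompliance, 2x2 chi-square (the study used MH chi-square)
table = np.array([[914, 3788 - 914], [3685, 22114 - 3685]])
chi2, p, dof, expected = chi2_contingency(table)
print(round(chi2, 1), p)                      # p < .001
```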
OBJECTIVES/GOALS: The new CLIC Education & Career Development Gateway aims to be a translational science workforce ecosystem for CTSAs to share learning and training resources and career opportunities. The Gateway also provides individualized assistance to identify and implement TS learning and training resources. METHODS/STUDY POPULATION: The CLIC Education & Career Development Gateway, located on the CLIC website, is an entryway to: 1) the Education Clearinghouse, a platform where CTSA Program hubs can find and share educational resources individually or as part of resource kits; 2) the Opportunities Board, which includes jobs and mini-sabbaticals from CTSA Program hubs; and 3) the Education & Training Navigator, a personalized approach to education and training requests. These approaches help empower and support a cooperative learning and training community that is inclusive and collaborative, facilitating and amplifying opportunities for the sharing of educational resources throughout the translational science workforce. RESULTS/ANTICIPATED RESULTS: Through a person-centered, direct engagement approach, the anticipated outcomes of these efforts are to promote increased collaboration across CTSA Program Hubs and partners, and the amplification of accessible, relevant existing resources. Another anticipated outcome is increased production of educational materials through the reduction of work duplication and identification of gaps in education and training resources. The Gateway also provides an opportunity to communicate the work and efforts that consortium-level special groups (working groups, special interest groups, etc.) produce. Ongoing evaluations and suggestions will help determine future improvements and functionalities. DISCUSSION/SIGNIFICANCE OF IMPACT: CLIC’s education and training ecosystem promotes education as a community space to facilitate opportunities for collaboration and partnerships, amplifying visibility of the work created by members of the CTSA community, and encouraging a transformative career trajectory for trainees and scholars.
This study investigated metabolic, endocrine, appetite and mood responses to a maximal eating occasion in fourteen men (mean: age 28 (sd 5) years, body mass 77·2 (sd 6·6) kg and BMI 24·2 (sd 2·2) kg/m2) who completed two trials in a randomised crossover design. On each occasion, participants ate a homogenous mixed-macronutrient meal (pizza). On one occasion, they ate until ‘comfortably full’ (ad libitum) and on the other, until they ‘could not eat another bite’ (maximal). Mean energy intake was double in the maximal (13 024 (95 % CI 10 964, 15 084) kJ; 3113 (95 % CI 2620, 3605) kcal) compared with the ad libitum trial (6627 (95 % CI 5708, 7547) kJ; 1584 (95 % CI 1364, 1804) kcal). Serum insulin incremental AUC (iAUC) increased approximately 1·5-fold in the maximal compared with the ad libitum trial (mean: ad libitum 43·8 (95 % CI 28·3, 59·3) nmol/l × 240 min and maximal 67·7 (95 % CI 47·0, 88·5) nmol/l × 240 min, P < 0·01), but glucose iAUC did not differ between trials (ad libitum 94·3 (95 % CI 30·3, 158·2) mmol/l × 240 min and maximal 126·5 (95 % CI 76·9, 176·0) mmol/l × 240 min, P = 0·19). TAG iAUC was approximately 1·5-fold greater in the maximal v. ad libitum trial (ad libitum 98·6 (95 % CI 69·9, 127·2) mmol/l × 240 min and maximal 146·4 (95 % CI 88·6, 204·1) mmol/l × 240 min, P < 0·01). Total glucagon-like peptide-1, glucose-dependent insulinotropic peptide and peptide tyrosine–tyrosine iAUC were greater in the maximal compared with the ad libitum trial (P < 0·05). Total ghrelin concentrations decreased to a similar extent, but AUC was slightly lower in the maximal v. ad libitum trial (P = 0·02). There were marked differences in appetite and mood between trials; most notably, maximal eating caused a prolonged increase in lethargy. Healthy men have the capacity to eat twice the energy content required to achieve comfortable fullness at a single meal. Postprandial glycaemia is well regulated following initial overeating, with elevated postprandial insulinaemia probably contributing.
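For readers unfamiliar with the iAUC metric reported above, the following is a minimal sketch of one common incremental-AUC convention (trapezoidal area above the fasting baseline, with dips below baseline ignored). The time points and insulin values are hypothetical, and the paper may have used a different iAUC convention.

```python
# Sketch of an incremental AUC (iAUC) calculation: trapezoidal area above the
# fasting baseline, with dips below baseline ignored. The time points and
# insulin values below are hypothetical, not study data.
import numpy as np

def iauc(times_min, conc, baseline=None):
    """Incremental area under the curve (positive increments only)."""
    t = np.asarray(times_min, dtype=float)
    c = np.asarray(conc, dtype=float)
    base = c[0] if baseline is None else baseline
    increments = np.clip(c - base, 0.0, None)    # ignore values below baseline
    return np.trapz(increments, t)               # units: concentration x min

times = [0, 30, 60, 120, 180, 240]               # minutes after the meal
insulin = [0.04, 0.35, 0.50, 0.30, 0.15, 0.08]   # nmol/l, illustrative only
print(iauc(times, insulin))                      # nmol/l x min over 240 min
```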
The present study aims to investigate the effect of wholegrain and legume consumption on the incidence of age-related cataract in an older Australian population-based cohort. The Blue Mountains Eye Study (BMES) is a population-based cohort study of eye diseases among adults aged 49 years or older (1992–1994, n 3654). Of 2334 participants in the second examination of the BMES (BMES 2, 1997–2000), 1541 (78·3 % of survivors) who had wholegrain and legume consumption estimated from the FFQ at BMES 2 were examined 5 years later (BMES 3). Cataract was assessed from photographs taken during examinations, following the Wisconsin cataract grading system. Multivariable-adjusted logistic regression models were used to assess associations with the 5-year incidence of cataract from BMES 2 (baseline) to BMES 3. The 5-year incidence of cortical, nuclear and posterior subcapsular (PSC) cataract was 18·2, 16·5 and 5·9 %, respectively. After adjustment for age, sex and other factors, total wholegrain consumption at baseline was not associated with the incidence of any type of cataract. High consumption of legumes showed a protective association with incident PSC cataract (5th quintile: adjusted OR 0·37; 95 % CI 0·15, 0·92), although there was no significant trend in this association across quintiles (P = 0·08). In this older Australian population, we found no associations between wholegrain intake at baseline and the 5-year incidence of three cataract types. However, intake of legumes in the highest quintile, compared with the lowest quintile, may protect against PSC formation, a finding needing replication in other studies.
Excess energy intake is recognised as a strong contributing factor in the global rise of overweight and obesity. The aim of this paper was to investigate whether oral sensitivity to complex carbohydrate relates to ad libitum consumption of complex carbohydrate foods in a sample group of female adults. Participants’ (n 51 females; age 23·0 (sd 0·6) years, range 20·0–41·0 years; excluding restrained eaters) sensitivity towards maltodextrin (oral complex carbohydrate) and glucose (sweet taste) was assessed by measuring detection threshold (DT) and suprathreshold intensity perception (ST). A crossover design was used to assess consumption of two different iso-energetic preload milkshakes and ad libitum milkshakes: (1) a glucose-based milkshake and (2) a maltodextrin-based milkshake. Ad libitum intake (primary outcome) and eating rate, liking, hunger, fullness and prospective consumption ratings were measured. Participants who were more sensitive towards complex carbohydrate (maltodextrin DT) consumed significantly more maltodextrin-based milkshake than less-sensitive participants (P = 0·01), and this was independent of liking. Participants who had higher liking for the glucose-based milkshake consumed significantly more glucose-based milkshake than participants with lower hedonic ratings (P = 0·049). The results provide support for a role of oral sensitivity (potentially taste) to complex carbohydrate in the propensity to overconsume a complex carbohydrate-based milkshake in a single sitting.
The Kuramoto–Sivashinsky equation is a prototypical chaotic nonlinear partial differential equation (PDE) in which the size of the spatial domain plays the role of a bifurcation parameter. We investigate the changing dynamics of the Kuramoto–Sivashinsky PDE by calculating the Lyapunov spectra over a large range of domain sizes. Our comprehensive computation and analysis of the Lyapunov exponents and the associated Kaplan–Yorke dimension provides new insights into the chaotic dynamics of the Kuramoto–Sivashinsky PDE, and the transition to its one-dimensional turbulence.
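As a concrete illustration of the Kaplan–Yorke dimension mentioned above, the short sketch below computes it from an ordered Lyapunov spectrum using the standard formula D_KY = j + (Σ_{i≤j} λ_i)/|λ_{j+1}|. The example spectrum is invented for illustration and is not taken from the Kuramoto–Sivashinsky computations reported in the paper.

```python
# Sketch: Kaplan-Yorke dimension from a Lyapunov spectrum,
#   D_KY = j + (sum_{i<=j} lambda_i) / |lambda_{j+1}|,
# where j is the largest index (counting from 1) with a non-negative cumulative
# sum of exponents. The example spectrum is invented for illustration.
import numpy as np

def kaplan_yorke_dimension(lyapunov_exponents):
    lam = np.sort(np.asarray(lyapunov_exponents, dtype=float))[::-1]  # descending
    csum = np.cumsum(lam)
    nonneg = np.where(csum >= 0)[0]
    if nonneg.size == 0:
        return 0.0                       # all directions contracting
    j = nonneg[-1]                       # zero-based index of last non-negative sum
    if j == lam.size - 1:
        return float(lam.size)           # cumulative sum never becomes negative
    return (j + 1) + csum[j] / abs(lam[j + 1])

print(kaplan_yorke_dimension([0.09, 0.02, 0.0, -0.05, -0.20]))  # -> 4.3
```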
Both India and Nepal are prone to a wide range of natural and man-made disasters. Almost 85% of India’s area is vulnerable to one or more hazards, and more than 80% of the total population of Nepal is at risk of natural hazards. In terms of the number of people affected in reported disastrous events, India is in the top 10 and Nepal is in the top 20 globally. Over the last two decades, India and Nepal have taken steps to establish their respective National Disaster Management organizations, which provide essential disaster responses. However, key gaps remain in trained clinical capacity for managing the impacts of various disasters. Our review of the region has shown that large parts of the population suffer injuries, diseases, disabilities, and psychosocial and other health-related problems from disasters.
Aim:
To develop disaster medicine clinical capacity to reduce morbidity and mortality from disasters.
Methods:
Independent published data and work undertaken by the lead author in various disasters in India and Nepal since 1993 formed the basis for establishing the Faculty of Disaster Medicine for South Asia. The Faculty of Disaster Medicine - India and Nepal (FDMIN) was launched from Pune in March 2015. This initiative is supported by the National Association of Primary Care (UK), Public Health England, the Faculty of Pre-hospital Care of the Royal College of Surgeons of Edinburgh, and CRIMEDIM (Novara, Italy).
Discussion:
FDMIN has international expert advisors and has outlined a 16-module training curriculum for health care professionals. FDMIN currently has partnerships with 3 medical universities and 12 major health care providers to teach its disaster medicine programme. Six pilot training programmes have been conducted in Pune, Delhi, Chennai, and Kochi. Work is underway to submit an application to the Indian regulatory bodies for approval to establish a postgraduate diploma and a Master’s degree in Disaster Medicine.
Astrophysics Telescope for Large Area Spectroscopy Probe is a concept for a National Aeronautics and Space Administration probe-class space mission that will achieve ground-breaking science in the fields of galaxy evolution, cosmology, Milky Way, and the Solar System. It is the follow-up space mission to Wide Field Infrared Survey Telescope (WFIRST), boosting its scientific return by obtaining deep 1–4 μm slit spectroscopy for ∼70% of all galaxies imaged by the ∼2 000 deg2 WFIRST High Latitude Survey at z > 0.5. Astrophysics Telescope for Large Area Spectroscopy will measure accurate and precise redshifts for ∼200 M galaxies out to z < 7, and deliver spectra that enable a wide range of diagnostic studies of the physical properties of galaxies over most of cosmic history. Astrophysics Telescope for Large Area Spectroscopy Probe and WFIRST together will produce a 3D map of the Universe over 2 000 deg2, the definitive data sets for studying galaxy evolution, probing dark matter, dark energy and modifications of General Relativity, and quantifying the 3D structure and stellar content of the Milky Way. Astrophysics Telescope for Large Area Spectroscopy Probe science spans four broad categories: (1) Revolutionising galaxy evolution studies by tracing the relation between galaxies and dark matter from galaxy groups to cosmic voids and filaments, from the epoch of reionisation through the peak era of galaxy assembly; (2) Opening a new window into the dark Universe by weighing the dark matter filaments using 3D weak lensing with spectroscopic redshifts, and obtaining definitive measurements of dark energy and modification of General Relativity using galaxy clustering; (3) Probing the Milky Way’s dust-enshrouded regions, reaching the far side of our Galaxy; and (4) Exploring the formation history of the outer Solar System by characterising Kuiper Belt Objects. Astrophysics Telescope for Large Area Spectroscopy Probe is a 1.5 m telescope with a field of view of 0.4 deg2, and uses digital micro-mirror devices as slit selectors. It has a spectroscopic resolution of R = 1 000, and a wavelength range of 1–4 μm. The lack of slit spectroscopy from space over a wide field of view is the obvious gap in current and planned future space missions; Astrophysics Telescope for Large Area Spectroscopy fills this big gap with an unprecedented spectroscopic capability based on digital micro-mirror devices (with an estimated spectroscopic multiplex factor greater than 5 000). Astrophysics Telescope for Large Area Spectroscopy is designed to fit within the National Aeronautics and Space Administration probe-class space mission cost envelope; it has a single instrument, a telescope aperture that allows for a lighter launch vehicle, and mature technology (we have identified a path for digital micro-mirror devices to reach Technology Readiness Level 6 within 2 yr). Astrophysics Telescope for Large Area Spectroscopy Probe will lead to transformative science over the entire range of astrophysics: from galaxy evolution to the dark Universe, from Solar System objects to the dusty regions of the Milky Way.
Objective
To describe the epidemiology of surgical site infections (SSIs) after pediatric ambulatory surgery.
Design
Observational cohort study with 60 days of follow-up after surgery.
Setting
The study took place in 3 ambulatory surgical facilities (ASFs) and 1 hospital-based facility in a single pediatric healthcare network.
Participants
Children <18 years undergoing ambulatory surgery were included in the study. Of 19,777 eligible surgical encounters, 8,502 patients were enrolled.
Methods
Data were collected through parental interviews and from chart reviews. We assessed 2 outcomes: (1) National Healthcare Safety Network (NHSN)–defined SSI and (2) evidence of possible infection using a definition developed for this study.
Results
We identified 21 NHSN SSIs, for a rate of 2.5 SSIs per 1,000 surgical encounters: 2.9 per 1,000 at the hospital-based facility and 1.6 per 1,000 at the ASFs. After restricting the analysis to procedures performed at both types of facilities and adjusting for patient demographics, there was no difference in the risk of NHSN SSI between the 2 types of facilities (odds ratio, 0.7; 95% confidence interval, 0.2–2.3). Within 60 days after surgery, 404 surgical patients had some or strong evidence of possible infection obtained from parental interview and/or chart review (rate, 48 possible infections per 1,000 surgical encounters). Of 306 cases identified through parental interviews, 176 cases (57%) did not have chart documentation. In our multivariable analysis, older age and black race were associated with a reduced risk of possible infection.
Conclusions
The rate of NHSN-defined SSI after pediatric ambulatory surgery was low, although a substantial additional burden of infectious morbidity related to surgery might not have been captured by standard surveillance strategies and definitions.
In 2017, Public Health England South East Health Protection Team (HPT) were involved in the management of an outbreak of Mycobacterium bovis (the causative agent of bovine tuberculosis) in a pack of working foxhounds. This paper summarises the actions taken by the team in managing the public health aspects of the outbreak, and lessons learned to improve the management of future potential outbreaks. A literature search was conducted to identify relevant publications on M. bovis. Clinical notes from the Public Health England (PHE) health protection database were reviewed and key points extracted. Animal and public health stakeholders involved in the management of the situation provided further evidence through unstructured interviews and personal communications. The PHE South East team initially provided ‘inform and advise’ letters to human contacts whilst awaiting laboratory confirmation to identify the infectious agent. Once M. bovis had been confirmed in the hounds, an in-depth risk assessment was conducted, and contacts were stratified into risk pools. Eleven out of 20 exposed persons with the greatest risk of exposure were recommended to attend TB screening, and one tested positive but had no evidence of active TB infection. The number of human contacts working with foxhound packs can be large and varied. HPTs should undertake a comprehensive risk assessment of all potential routes of exposure, involve all other relevant stakeholders from an early stage and undertake regular risk assessments. Current guidance should be revised to account for the unique risks to human health posed by exposure to infected working dogs.
Oral anticoagulation (OAC) reduces stroke risk in patients with atrial fibrillation (AF) or atrial flutter (AFL). However, OAC initiation rates in patients discharged directly from the emergency department (ED) are low. We aimed to address this care gap by implementing a quality improvement intervention.
Methods
The study was performed in four Canadian urban EDs between 2015 and 2016. Patients were included if they had an electrocardiogram (ECG) documenting AF/AFL in the ED, were directly discharged from the ED, and were alive after 90 days. Baseline rates of OAC initiation were determined prior to the intervention. Between June and December 2016, we implemented our intervention in two EDs (ED-intervention), with the remaining sites acting as controls (ED-control). The intervention included a reminder statement, manually added to ECGs with a preliminary interpretation of AF/AFL, prompting OAC initiation according to guideline recommendations, along with a decision-support algorithm that included a referral sheet. The primary outcome was the rate of OAC initiation within 90 days of the ED visit.
Results
Prior to the intervention, 37.2% of OAC-naïve patients with ECG-documented AF/AFL were initiated on OAC. Following implementation of the intervention, the rate of OAC initiation increased from 38.6% to 47.5% (absolute increase of 8.5%; 95% CI, 0.3% to 16.7%; p=0.04) among the ED-intervention sites, whereas the rate remained unchanged in ED-control sites (35.3% to 35.9%, p=0.9).
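The before/after comparison reported above is, in essence, a difference in proportions. The sketch below shows one way such an estimate, its 95% CI, and a p-value could be computed; the counts are hypothetical, since the abstract reports only percentages, and the study's own analysis may have differed.

```python
# Sketch of a before/after comparison as a difference in proportions with a
# 95% Wald CI and a pooled two-proportion z-test. The counts are hypothetical:
# the abstract reports percentages but not denominators.
from math import sqrt
from scipy.stats import norm

def diff_in_proportions(x_before, n_before, x_after, n_after):
    p1, p2 = x_before / n_before, x_after / n_after
    diff = p2 - p1
    se = sqrt(p1 * (1 - p1) / n_before + p2 * (1 - p2) / n_after)
    ci = (diff - 1.96 * se, diff + 1.96 * se)
    p_pool = (x_before + x_after) / (n_before + n_after)
    z = diff / sqrt(p_pool * (1 - p_pool) * (1 / n_before + 1 / n_after))
    p_value = 2 * (1 - norm.cdf(abs(z)))
    return diff, ci, p_value

# e.g. 116/300 (38.7%) pre-intervention vs. 147/310 (47.4%) post -- made-up numbers
print(diff_in_proportions(116, 300, 147, 310))
```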
Conclusions
Implementation of a quality improvement intervention consisting of a reminder and decision-support tool increased initiation of OAC in high-risk patients. This support package can be readily implemented in other jurisdictions to improve OAC rates for AF/AFL.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
The History, Electrocardiogram (ECG), Age, Risk Factors, and Troponin (HEART) score is a decision aid designed to risk stratify emergency department (ED) patients with acute chest pain. It has been validated for ED use, but it has yet to be evaluated in a prehospital setting.
Hypothesis
A prehospital modified HEART score can predict major adverse cardiac events (MACE) among undifferentiated chest pain patients transported to the ED.
Methods
A retrospective cohort study of patients with chest pain transported by two county-based Emergency Medical Service (EMS) agencies to a tertiary care center was conducted. Adults without ST-elevation myocardial infarction (STEMI) were included. Inter-facility transfers and those without a prehospital 12-lead ECG or an ED troponin measurement were excluded. Modified HEART scores were calculated by study investigators using a standardized data collection tool for each patient. All MACE (death, myocardial infarction [MI], or coronary revascularization) were determined by record review at 30 days. The sensitivity and negative predictive values (NPVs) for MACE at 30 days were calculated.
Results
Over the study period, 794 patients met inclusion criteria. A MACE at 30 days was present in 10.7% (85/794) of patients, with 12 deaths (1.5%), 66 MIs (8.3%), and 12 coronary revascularizations without MI (1.5%). The modified HEART score identified 33.2% (264/794) of patients as low risk. Among low-risk patients, 1.9% (5/264) had MACE (two MIs and three revascularizations without MI). The sensitivity and NPV for 30-day MACE were 94.1% (95% CI, 86.8-98.1) and 98.1% (95% CI, 95.6-99.4), respectively.
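The sensitivity and NPV above follow directly from the reported counts (80 of 85 MACE cases detected; 259 of 264 low-risk patients event-free). The sketch below reproduces them with exact (Clopper–Pearson) confidence intervals, which are approximately consistent with the reported CIs but may not be the exact method the authors used.

```python
# Recomputing sensitivity and NPV from the counts reported above, with exact
# (Clopper-Pearson) 95% CIs via the beta distribution. Illustrative only; the
# authors' CI method is not stated in the abstract.
from scipy.stats import beta

def proportion_with_exact_ci(successes, n, alpha=0.05):
    lo = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return successes / n, (lo, hi)

n_mace, n_low_risk, missed = 85, 264, 5                           # from the abstract
print(proportion_with_exact_ci(n_mace - missed, n_mace))          # sensitivity, 80/85
print(proportion_with_exact_ci(n_low_risk - missed, n_low_risk))  # NPV, 259/264
```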
Conclusions
Prehospital modified HEART scores have a high NPV for MACE at 30 days. A study in which prehospital providers prospectively apply this decision aid is warranted.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9,200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
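To illustrate the programmatic access routes mentioned above, here is a minimal Python sketch against the public Neotoma web API. The endpoint path, parameter names, and response fields are assumptions based on the v2.0 API and may need adjusting against the current documentation; the neotoma R package and the Explorer interface are the documented alternatives.

```python
# Hypothetical sketch of querying the Neotoma web API from Python. The base
# URL, endpoint path ("/data/sites"), parameter names, and response fields are
# assumptions and may need adjusting against the current API documentation.
import requests

BASE = "https://api.neotomadb.org/v2.0"   # assumed API root

def search_sites(site_name, limit=5):
    """Search Neotoma sites by name and return the parsed JSON payload."""
    resp = requests.get(f"{BASE}/data/sites",
                        params={"sitename": site_name, "limit": limit},
                        timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    payload = search_sites("Devils Lake")
    for site in payload.get("data", []):
        print(site.get("siteid"), site.get("sitename"))
```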
In the context of an austere financial climate, local health care budget holders are increasingly expected to make and enact decisions to decommission (reduce or stop providing) services. However, little is currently known about the experiences of those seeking to decommission. This paper presents the first national study of decommissioning in the English National Health Service drawing on multiple methods, including: an interview-based review of the contemporary policy landscape of health care decommissioning; a national online survey of commissioners of health care services responsible for managing and enacting budget allocation decisions locally; and illustrative vignettes provided by those who have led decommissioning activities. Findings are presented and discussed in relation to four themes: national-local relationships; organisational capacity and resources for decommissioning; the extent and nature of decommissioning; and intended outcomes of decommissioning. Whilst it is unlikely that local commissioners will be able to ‘successfully’ implement decommissioning decisions unless aspects of engagement, local context and outcomes are addressed, it remains unclear what ‘success’ looks like in terms of a decommissioning process.
Dementia is a major health problem, with a growing number of people affected by the condition, both directly and indirectly through caring for someone with dementia. Many live at home, but little is known about the range and intensity of the support they receive. Previous studies have mainly reported on discrete services within a single geographical area. This paper presents a protocol for a study of different services across several sites in England. The aim is to explore the presence, effects, and cost-effectiveness of approaches to home support for people in the later stages of dementia and their carers.
Methods:
This is a prospective observational study employing mixed methods. At least 300 participants (people with dementia and their carers) from geographical areas with demonstrably different ranges of services available for people with dementia will be selected. Within each area, participants will be recruited from a range of services. Participants will be interviewed on two occasions and data will be collected on their characteristics and circumstances, quality of life, carer health and burden, and informal and formal support for the person with dementia. The structured interviews will also collect qualitative data to explore the perceptions of older people and carers.
Conclusions:
This national study will explore the components of appropriate and effective home support for people with late stage dementia and their carers. It aims to inform commissioners and service providers across health and social care.
Socializing a client to the cognitive behavioural model is advised in almost every cognitive behavioural therapy (CBT) textbook, but there is limited evidence for whether socialization is measurable or important. The aim of the study was to pilot a written and interview-based measure of socialization to investigate whether socialization to the model can be measured in a sample of young people who have completed CBT. Sixteen participants (mean age 14.9 years, 75% female) completed a semi-structured socialization interview and a novel written measure of socialization. Treating clinicians were asked to provide subjective ratings of participant socialization. The structure and content of these measures were examined. A moderate but non-significant correlation was found between the novel written measure of socialization and the clinician rating of socialization (r = .37). The concept of ‘socialization’ is not well understood, and the socialization interview produced mixed, unclear results. This may be due to issues with the design, but it may also be that socialization, as currently understood, is more complex than can be captured in this way. The important aspect of this study is that it introduces the concept of measuring socialization and factors that may be important in future research. Socialization to the model is an important construct within CBT but at present is a challenging concept to measure. Future research will need to focus on operationalizing the concept further and refining measures so that it can be accurately captured. Understanding which therapist and client behaviours contribute to the process of socialization could conceivably improve outcomes, but this cannot be done until this area is understood more fully.
Stress-related pathophysiology drives comorbid trajectories that elude precise prediction. Allostatic load algorithms that quantify biological “wear and tear” represent a comprehensive approach to detect multisystemic disease processes of the mind and body. However, the multiple morbidities directly or indirectly related to stress physiology remain enigmatic. Our aim in this article is to propose that biological comorbidities represent discrete pathophysiological processes captured by measuring allostatic load. This has applications in research and clinical settings to predict physical and psychiatric comorbidities alike. The reader will be introduced to the concepts of allostasis, allostatic states, allostatic load, and allostatic overload as they relate to stress-related diseases and the proposed prediction of biological comorbidities, extending to the understanding of psychopathologies. In our transdisciplinary discussion, we will integrate perspectives related to (a) mitochondrial biology as a key player in the allostatic load time course toward diseases that “get under the skin and skull”; (b) epigenetics related to child maltreatment and biological embedding that shapes stress perception throughout lifespan development; and (c) evolutionary drivers of distinct personality profiles and biobehavioral patterns that are linked to dimensions of psychopathology.