Childhood exposure to interpersonal violence (IPV) may be linked to distinct manifestations of mental illness, yet the nature of this relationship remains poorly understood. Network analysis can provide unique insights by contrasting the interrelatedness of symptoms underlying psychopathology across exposed and non-exposed youth, with potential clinical implications for a treatment-resistant population. We anticipated marked differences in symptom associations among IPV-exposed youth, particularly in terms of ‘hub’ symptoms holding outsized influence over the network, as well as formation and influence of communities of highly interconnected symptoms.
Participants from a population-representative sample of youth (n = 4433; ages 11–18 years) completed a comprehensive structured clinical interview assessing mental health symptoms, diagnostic status, and history of violence exposure. Network analytic methods were used to model the pattern of associations between symptoms, quantify differences across diagnosed youth with (IPV+) and without (IPV–) IPV exposure, and identify transdiagnostic ‘bridge’ symptoms linking multiple disorders.
Symptoms organized into six ‘disorder’ communities (e.g. Intrusive Thoughts/Sensations, Depression, Anxiety) that exhibited considerably greater interconnectivity in IPV+ youth. Five symptoms emerged in IPV+ youth as highly trafficked ‘bridges’ between symptom communities, compared with 11 in IPV– youth.
IPV exposure may alter mutually reinforcing symptom co-occurrence in youth, thus contributing to greater psychiatric comorbidity and treatment resistance. The presence of a condensed and unique set of bridge symptoms suggests trauma-enriched nodes which could be therapeutically targeted to improve outcomes in violence-exposed youth.
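The ‘bridge’ idea above can be illustrated with a toy calculation. One common definition, bridge strength, sums a symptom’s association weights to symptoms outside its own community; everything below (symptom names, community labels, edge weights) is hypothetical, not the study’s data or analytic code.

```python
# Toy sketch of "bridge strength": the summed association weight linking a
# symptom to symptoms outside its own community. All names, community labels,
# and weights are invented; the study used dedicated network-analytic tools.
community = {
    "sad_mood": "depression", "anhedonia": "depression", "sleep": "depression",
    "worry": "anxiety", "restless": "anxiety", "irritability": "anxiety",
}

# (symptom_a, symptom_b, association weight), e.g. regularized partial correlations
edges = [
    ("sad_mood", "anhedonia", 0.50),
    ("sad_mood", "sleep", 0.30),
    ("worry", "restless", 0.40),
    ("worry", "irritability", 0.35),
    ("sleep", "irritability", 0.25),      # cross-community edge
    ("anhedonia", "irritability", 0.15),  # cross-community edge
    ("sad_mood", "worry", 0.10),          # cross-community edge
]

def bridge_strength(node):
    """Sum of this node's edge weights that cross community boundaries."""
    return sum(w for a, b, w in edges
               if node in (a, b) and community[a] != community[b])

ranked = sorted(community, key=bridge_strength, reverse=True)
```

In this toy network ‘irritability’ carries the most cross-community weight, making it the kind of transdiagnostic node that, per the abstract, could be targeted therapeutically.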
Vitamin D deficiency is associated with an increased risk of acute respiratory infection. There is an excess of respiratory infections and deaths in schizophrenia, a condition where vitamin D deficiency is especially prevalent. This potentially offers a modifiable risk factor to reduce the risk for and the severity of respiratory infection in people with schizophrenia, although there is as yet no evidence regarding the risk of COVID-19. In this narrative review, we describe the prevalence of vitamin D deficiency in schizophrenia, report the research examining the relationship between vitamin D levels and COVID-19 and discuss the associations between vitamin D deficiency and respiratory infection, including its immunomodulatory mechanism of action.
Compulsory admission procedures for patients with mental disorders vary between countries in Europe. The Ethics Committee of the European Psychiatric Association (EPA) launched a survey on involuntary admission procedures of patients with mental disorders in 40 countries, gathering information from all National Psychiatric Associations that are members of the EPA, with the aim of developing recommendations for improving involuntary admission processes and promoting voluntary care.
The survey focused on legislation of involuntary admissions and key actors involved in the admission procedure as well as most common reasons for involuntary admissions.
We analyzed the survey's categorical data thematically; the themes highlight that both medical and legal actors are involved in involuntary admission procedures.
We conclude that legal reasons for compulsory admission should be reworded in order to remove stigmatization of the patient, that raising awareness about involuntary admission procedures and patient rights with both patients and family advocacy groups is paramount, that communication about procedures should be widely available in lay-language for the general population, and that training sessions and guidance should be available for legal and medical practitioners. Finally, people working in the field need to be constantly aware about the ethical challenges surrounding compulsory admissions.
Maternal obesity is an established risk factor for poor infant neurodevelopmental outcomes; however, the link between maternal weight and fetal development in utero is unknown. We investigated whether maternal obesity negatively influences fetal autonomic nervous system (ANS) development. Fetal heart rate variability (HRV) is an index of the ANS that is associated with neurodevelopmental outcomes in the infant. Maternal–fetal magnetocardiograms were recorded using a fetal biomagnetometer at 36 weeks (n = 46). Fetal HRV was represented by the standard deviation of sinus beat-to-beat intervals (SDNN). Maternal weight was measured at enrollment (12–20 weeks) and 36 weeks. The relationships between fetal HRV and maternal weight at both time points were modeled using adjusted ordinary least squares regression models. Higher maternal weight at both enrollment and 36 weeks was associated with lower fetal HRV, an indicator of poorer ANS development. Further study is needed to better understand how maternal obesity influences fetal autonomic development and long-term neurodevelopmental outcomes.
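SDNN, the HRV index used above, is simply the standard deviation of the normal-to-normal (sinus) beat intervals. A minimal sketch, with invented interval values rather than study data:

```python
# SDNN = standard deviation of sinus (normal-to-normal) beat-to-beat
# intervals, usually in milliseconds. These fetal RR intervals are invented
# for illustration only.
import statistics

nn_intervals_ms = [410, 405, 420, 415, 398, 402, 418, 412]

sdnn = statistics.stdev(nn_intervals_ms)  # sample SD: the usual SDNN definition
```

A lower SDNN means less beat-to-beat variability, interpreted in the abstract as an indicator of less mature autonomic regulation.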
Clusters of Salmonella Enteritidis cases were identified by the Minnesota Department of Health using both pulsed-field gel electrophoresis (PFGE) and whole genome sequencing (WGS) single nucleotide polymorphism analysis from 1 January 2015 through 31 December 2017. The median turnaround time for obtaining WGS results was 11 days longer than for PFGE (12 vs. 1 day). WGS analysis more than doubled the number of clusters compared to PFGE analysis, but reduced the total number of cases included in clusters by 34%. The median cluster size was two cases for WGS compared to four for PFGE, and the median duration of WGS clusters was 27 days shorter than PFGE clusters. While the percentage of PFGE clusters with a confirmed source (46%) was higher than WGS clusters (32%), a higher percentage of cases in clusters that were confirmed as outbreaks reported the vehicle or exposure of interest for WGS (78%) than PFGE (46%). WGS cluster size was a significant predictor of an outbreak source being confirmed. WGS data have enhanced S. Enteritidis cluster investigations in Minnesota by improving the specificity of cluster case definitions, and WGS has become an integral part of the S. Enteritidis surveillance process.
Children treated for brain tumors often experience social and emotional difficulties, including challenges with emotion regulation; our goal was to investigate the attention-related component processes of emotion regulation, using a novel eye-tracking measure, and to evaluate its relations with emotional functioning and white matter (WM) organization.
Fifty-four children participated in this study; 36 children treated for posterior fossa tumors, and 18 typically developing children. Participants completed two versions of an emotion regulation eye-tracking task, designed to differentiate between implicit (i.e., automatic) and explicit (i.e., voluntary) subprocesses. The Emotional Control scale from the Behavior Rating Inventory of Executive Function was used to evaluate emotional control in daily life, and WM organization was assessed with diffusion tensor imaging.
We found that emotional faces captured attention across all groups (F(1,51) = 32.18, p < .001, η2p = .39). However, unlike typically developing children, patients were unable to override the attentional capture of emotional faces when instructed to (emotional face-by-group interaction: F(2,51) = 5.58, p = .006, η2p = .18). Across all children, our eye-tracking measure of emotion regulation was modestly associated with the parent-report emotional control score (r = .29, p = .045), and in patients it was associated with WM microstructure in the body and splenium of the corpus callosum (all t > 3.03, all p < .05).
Our findings suggest that an attention-related component process of emotion regulation is disrupted in children treated for brain tumors, and that it may relate to their emotional difficulties and WM organization. This work provides a foundation for future theoretical and mechanistic investigations of emotional difficulties in brain tumor survivors.
Why patients with psychosis use cannabis remains debated. The self-medication hypothesis has received some support but other evidence points towards an alleviation of dysphoria model. This study investigated the reasons for cannabis use in first-episode psychosis (FEP) and whether strength in their endorsement changed over time.
FEP inpatients and outpatients at the South London and Maudsley, Oxleas and Sussex NHS Trusts UK, who used cannabis, rated their motives at baseline (n = 69), 3 months (n = 29) and 12 months (n = 36). A random intercept model was used to test the change in strength of endorsement over the 12 months. Paired-sample t-tests assessed the differences in mean scores between the five subscales on the Reasons for Use Scale (enhancement, social motive, coping with unpleasant affect, conformity and acceptance and relief of positive symptoms and side effects), at each time-point.
Time had a significant effect on scores when controlling for reason; average scores on each subscale were higher at baseline than at 3 months and 12 months. At each time-point, patients endorsed ‘enhancement’ followed by ‘coping with unpleasant affect’ and ‘social motive’ more highly for their cannabis use than any other reason. ‘Conformity and acceptance’ followed closely. ‘Relief of positive symptoms and side effects’ was the least endorsed motive.
Patients endorsed their reasons for use at 3 months and 12 months less strongly than at baseline. Little support for the self-medication or alleviation of dysphoria models was found. Rather, patients rated ‘enhancement’ most highly for their cannabis use.
The updated Common Rule for human subjects research requires that consents “begin with a ‘concise and focused’ presentation of the key information that will most likely help someone make a decision about whether to participate in a study” (Menikoff, Kaneshiro, Pritchard. The New England Journal of Medicine. 2017; 376(7): 613–615.). We utilized a community-engaged technology development approach to inform feature options within the REDCap software platform centered around collection and storage of electronic consent (eConsent) to address issues of transparency, clinical trial efficiency, and regulatory compliance for informed consent (Harris, et al. Journal of Biomedical Informatics 2009; 42(2): 377–381.). eConsent may also improve recruitment and retention in clinical research studies by addressing: (1) barriers for accessing rural populations by facilitating remote consent and (2) cultural and literacy barriers by including optional explanatory material (e.g., defining terms by hovering over them with the cursor) or the choice of displaying different videos/images based on participant’s race, ethnicity, or educational level (Phillippi, et al. Journal of Obstetric, Gynecologic, & Neonatal Nursing. 2018; 47(4): 529–534.).
We developed and pilot tested our eConsent framework to provide a personalized consent experience whereby users are guided through a consent document that utilizes avatars, contextual glossary information supplements, and videos, to facilitate communication of information.
The eConsent framework includes a portfolio of eight features, reviewed by community stakeholders, and tested at two academic medical centers.
Early adoption and utilization of this eConsent framework have demonstrated acceptability. Next steps will emphasize testing efficacy of features to improve participant engagement with the consent process.
In the UK, mental illness is a major source of disease burden, costing in the region of £105 billion. mHealth is a novel and emerging field in psychiatric and psychological care for the treatment of mental health difficulties such as psychosis.
To develop an intelligent real-time therapy (iRTT) mobile intervention (TechCare) which assesses participants' symptoms in real time and responds with a personalised self-help-based psychological intervention, with the aim of reducing symptom severity. The system will utilise intelligence at two levels:
– intelligently increasing the frequency of assessment notifications if low mood/paranoia is detected;
– an intelligent machine learning algorithm which provides interventions in real-time and also provides recommendations on the most popular selected interventions.
The aim of the current project is to develop a mobile phone intervention for people with psychosis, and to conduct a feasibility study of the TechCare App.
The study consists of both qualitative and quantitative components. The study will be run across three strands:
– qualitative work;
– test run and intervention refinement;
– feasibility trial.
Preliminary analysis of qualitative data from Strand 2 (test run and intervention refinement) in-depth interviews with service users (n = 2) and focus group with health professionals (n = 1), highlighted main themes around security of the device, multimedia and the acceptability of psychological interventions being delivered via the TechCare App.
Research in this area can be potentially helpful in addressing the demand on mental health services globally, particularly improving access to psychological interventions.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Transoral laser microsurgery for glottic squamous cell carcinoma is the standard of care at many institutions. Repeat transoral laser microsurgery for recurrence may avoid the need for radiotherapy and total laryngectomy. This study aimed to identify oncological and functional outcomes in a cohort of patients who had undergone repeat transoral laser microsurgery procedures.
A retrospective review of prospectively collected data of patients treated with transoral laser microsurgery for carcinoma in situ or tumour stages T1 or T2 glottic cancer, from 2003 to 2018.
Twenty patients were identified. Additional treatment was not needed in 45 per cent of patients. The five-year overall survival rate was 90 per cent. The disease-specific survival rate was 100 per cent. The laryngeal preservation rate was 85 per cent. There was an improvement in mean Voice Handicap Index-10 scores following repeat transoral laser microsurgery treatment, when comparing the pre- and post-operative periods, although this did not reach statistical significance (mean scores = 15.5 vs 11.5, p = 0.373).
Repeat transoral laser microsurgery can be an oncologically safe alternative to other salvage therapies for glottic squamous cell carcinoma recurrence, without sacrificing functional outcomes.
TwinsUK is the largest cohort of community-dwelling adult twins in the UK. The registry comprises over 14,000 volunteer twins (14,838 including mixed, single and triplets); it is predominantly female (82%) and middle-aged (mean age 59). In addition, over 1800 parents and siblings of twins are registered volunteers. During the last 27 years, TwinsUK has collected numerous questionnaire responses, physical/cognitive measures and biological measures on over 8500 subjects. Data were collected alongside four comprehensive phenotyping clinical visits to the Department of Twin Research and Genetic Epidemiology, King’s College London. Such collection methods have resulted in very detailed longitudinal clinical, biochemical, behavioral, dietary and socioeconomic cohort characterization; it provides a multidisciplinary platform for the study of complex disease during the adult life course, including the process of healthy aging. The major strength of TwinsUK is the availability of several ‘omic’ technologies for a range of sample types from participants, which include genome-wide scans of single-nucleotide variants, next-generation sequencing, metabolomic profiles, microbiomics, exome sequencing, epigenetic markers, gene expression arrays, RNA sequencing and telomere length measures. TwinsUK facilitates and actively encourages sharing the ‘TwinsUK’ resource with the scientific community — interested researchers may request data via the TwinsUK website (http://twinsuk.ac.uk/resources-for-researchers/access-our-data/) for their own use or future collaboration with the study team. In addition, further cohort data collection is planned via the Wellcome Open Research gateway (https://wellcomeopenresearch.org/gateways). The current article presents an up-to-date report on the application of technological advances, new study procedures in the cohort and future direction of TwinsUK.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
We have observed the G23 field of the Galaxy And Mass Assembly (GAMA) survey using the Australian Square Kilometre Array Pathfinder (ASKAP) in its commissioning phase to validate the performance of the telescope and to characterise the detected galaxy populations. This observation covers ~48 deg2 with a synthesised beam of 32.7 arcsec by 17.8 arcsec at 936 MHz, and ~39 deg2 with a synthesised beam of 15.8 arcsec by 12.0 arcsec at 1320 MHz. At both frequencies, the root-mean-square (r.m.s.) noise is ~0.1 mJy/beam. We combine these radio observations with the GAMA galaxy data, which includes spectroscopy of galaxies that are i-band selected with a magnitude limit of 19.2. Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry is used to determine which galaxies host an active galactic nucleus (AGN). In properties including source counts, mass distributions, and IR versus radio luminosity relation, the ASKAP-detected radio sources behave as expected. Radio galaxies have higher stellar mass and luminosity in IR, optical, and UV than other galaxies. We apply optical and IR AGN diagnostics and find that they disagree for ~30% of the galaxies in our sample. We suggest possible causes for the disagreement. Some cases can be explained by optical extinction of the AGN, but for more than half of the cases we do not find a clear explanation. Radio sources are more likely (~6%) to have an AGN than radio-quiet galaxies (~1%), but the majority of AGN are not detected in radio at this sensitivity.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of both CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Weaning of beef calves is a stressful event that negatively impacts health and performance. A variety of interventions have been proposed to reduce stress and improve gains following weaning. This study used 288 7- to 8-month-old calves from two separate locations, to examine four different weaning strategies, as well as the impact of shipment. Calves were blocked by weight and sex, and then randomly assigned to one of four treatments: abrupt weaning (AW), where calves were separated from the dam on day 0 (D0) and allowed no further contact with the dam; fence line (FL), where calves were weaned on D0 but had fence line contact with dams for 7 days; nose flap (NF), where on day -6 calves received a nose flap that interferes with suckling, then had the flap removed and were weaned from the dam on D0; and intermittent separation (SEP), where calves were removed from dams for 24-h intervals on day -13 and day -6, then weaned on D0, but allowed fence line contact with the dam for 7 days. Each treatment group was further divided into two subgroups, one of which was shipped early (D0 for AW, day 7 for others) or shipped later (day 28). Body weight and sickness were recorded for all groups. Results showed a negative impact on gain for early shipping compared to later shipping, and poorer gain in AW calves than most other treatments. Results of the analyses of morbidity were inconclusive. This study found that delayed shipment following FL weaning improves performance under common management conditions for the US cow–calf industry.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting that other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
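The odds ratios above come from adjusted logistic regression, but the underlying quantity can be illustrated from a simple 2×2 table. The counts below are invented for illustration; the study itself adjusted for baseline demographic factors, which this unadjusted sketch does not.

```python
# Hypothetical 2x2 table: exposure reporting by SES group. Shows where an
# odds ratio and its Wald 95% CI come from; counts are invented, and the
# study's ORs were estimated by adjusted logistic regression instead.
import math

a, b = 60, 40   # most advantaged: exposure reported yes / no
c, d = 30, 45   # most disadvantaged: exposure reported yes / no

odds_ratio = (a * d) / (b * c)                # odds(exposed | advantaged) / odds(exposed | disadvantaged)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An OR above 1 with a CI excluding 1, as in several exposures reported above, indicates the advantaged group reported the exposure significantly more often.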
To describe the relationship between adherence to distinct dietary patterns and nutrition literacy.
We identified distinct dietary patterns using principal covariates regression (PCovR) and principal components analysis (PCA) from the Diet History Questionnaire II. Nutrition literacy was assessed using the Nutrition Literacy Assessment Instrument (NLit). Cross-sectional relationships between dietary pattern adherence and global and domain-specific NLit scores were tested by multiple linear regression. Mean differences in diet pattern adherence among three predefined nutrition literacy performance categories were tested by ANOVA.
Metropolitan Kansas City, USA.
Adults (n 386) with at least one of four diet-related diseases.
Three diet patterns of interest were derived: a PCovR prudent pattern and PCA-derived Western and Mediterranean patterns. After controlling for age, sex, BMI, race, household income, education level and diabetes status, PCovR prudent pattern adherence positively related to global NLit score (P < 0·001, β = 0·36), indicating more intake of prudent diet foods with improved nutrition literacy. Validating the PCovR findings, PCA Western pattern adherence inversely related to global NLit (P = 0·003, β = −0·13) while PCA Mediterranean pattern positively related to global NLit (P = 0·02, β = 0·12). Using predefined cut points, those with poor nutrition literacy consumed more foods associated with the Western diet (fried foods, sugar-sweetened beverages, red meat, processed foods) while those with good nutrition literacy consumed more foods associated with prudent and Mediterranean diets (vegetables, olive oil, nuts).
Nutrition literacy predicted adherence to healthy/unhealthy diet patterns. These findings warrant future research to determine if improving nutrition literacy effectively improves eating patterns.
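Adherence to a derived dietary pattern is typically scored by weighting each standardized food-group intake by the pattern's loading and summing. A minimal sketch; the food groups, loadings, and intakes below are invented, not the study's derived values.

```python
# Hypothetical sketch of diet-pattern adherence scoring: standardized
# food-group intakes weighted by the pattern's loadings and summed.
# Loadings and intakes are invented for illustration.
foods = ["vegetables", "olive_oil", "nuts", "fried_food", "sugary_drinks"]
western_loadings = [-0.2, -0.1, -0.1, 0.6, 0.5]  # hypothetical PCA loadings
z_intakes = [1.2, 0.8, 0.5, -0.4, -0.9]          # one participant's z-scored intakes

western_score = sum(l * z for l, z in zip(western_loadings, z_intakes))
```

A strongly negative Western score like this one corresponds to the eating profile the abstract associates with good nutrition literacy.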
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP < 100 mmHg or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550–2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458–1916 mL); and cardiogenic control 768 mL (194–1341 mL) vs. cardiogenic PoCUS 981 mL (341–1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive.
Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
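The inclusion rule used in both SHoC-ED analyses (SBP < 100 mmHg or shock index > 1) can be made explicit in code; shock index is heart rate divided by systolic blood pressure. A minimal sketch:

```python
# SHoC-ED inclusion rule: undifferentiated hypotension defined as
# SBP < 100 mmHg or shock index > 1, where shock index = HR / SBP.
def eligible(sbp_mmhg: float, hr_bpm: float) -> bool:
    shock_index = hr_bpm / sbp_mmhg
    return sbp_mmhg < 100 or shock_index > 1
```

For example, a patient with SBP 110 mmHg and HR 120 bpm qualifies via shock index (120/110 > 1) despite not being frankly hypotensive.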
Introduction: Point of care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (No PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses including shock category were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled with follow-up for primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity 95.5% (90.0 to 98.1%), LR+ve 17.9 (7.34 to 43.8), LR-ve 0.21 (0.08 to 0.58), diagnostic OR 85.6 (18.2 to 403.6) and accuracy 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity 93.8% (87.8 to 97.0%), LR+ve 14.8 (7.1 to 30.9), LR-ve 0.09 (0.01 to 0.58), diagnostic OR 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)).
Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
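All of the diagnostic metrics quoted above derive from a 2×2 confusion table. The counts below are hypothetical, chosen merely to land near the PoCUS figures reported; they are not the SHoC-ED data.

```python
# Diagnostic metrics from a 2x2 table (true shock category x test call).
# Counts are hypothetical, chosen to roughly echo the PoCUS row above.
tp, fn = 12, 3     # cardiogenic shock: correctly identified / missed
fp, tn = 8, 170    # non-cardiogenic: false alarms / correct rule-outs

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
lr_pos = sensitivity / (1 - specificity)   # rule-in strength
lr_neg = (1 - sensitivity) / specificity   # rule-out strength
diagnostic_or = (tp * tn) / (fp * fn)      # equals lr_pos / lr_neg
accuracy = (tp + tn) / (tp + fn + fp + tn)
```

A large LR+ combined with a middling LR−, as here, is exactly the "good rule-in test" profile the conclusion describes.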