Birnbaum (2020) reanalyses the data from Butler and Pogrebna (2018) using his ‘true and error’ test of choice patterns. His results generally support the evidence we presented in that paper. Here we reiterate the reasons for our agnosticism as to the direction any cycles might take, even though the paradox that motivated our study takes a ‘probable winner’ direction. We conclude by returning to the potential significance of predictably intransitive preferences for decision theory generally.
The transitivity axiom is common to nearly all descriptive and normative utility theories of choice under risk. Contrary to both intuition and common assumption, the little-known ‘Steinhaus–Trybula paradox’ shows the relation ‘stochastically greater than’ will not always be transitive, in contradiction of Weak Stochastic Transitivity. We design bespoke pairs of lotteries inspired by the paradox, over which individual preferences might cycle. We run an experiment to look for evidence of cycles, and violations of expansion/contraction consistency between choice sets. Even after considering possible stochastic but transitive explanations, we show that cycles can be the modal preference pattern over these simple lotteries, and we find systematic violations of expansion/contraction consistency.
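The failure of transitivity for ‘stochastically greater than’ can be illustrated with a classic textbook example of intransitive random variables, Efron's dice (an illustrative sketch, not the lotteries constructed in the paper): each die beats the next with probability 2/3, yet the relation cycles.

```python
from fractions import Fraction
from itertools import product

def p_greater(x, y):
    """Exact P(X > Y) for independent uniform draws from face lists x and y."""
    wins = sum(1 for a, b in product(x, y) if a > b)
    return Fraction(wins, len(x) * len(y))

# Efron's dice: A beats B, B beats C, C beats D, and D beats A,
# each with probability 2/3 -- 'stochastically greater than' cycles.
A = [4, 4, 4, 4, 0, 0]
B = [3, 3, 3, 3, 3, 3]
C = [6, 6, 2, 2, 2, 2]
D = [5, 5, 5, 1, 1, 1]

for x, y in [(A, B), (B, C), (C, D), (D, A)]:
    assert p_greater(x, y) == Fraction(2, 3)
```

Weak Stochastic Transitivity would require that if P(A > B) > 1/2 and P(B > C) > 1/2 then P(A > C) > 1/2; the cycle above shows pairwise dominance alone cannot guarantee this.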
Data from neurocognitive assessments may not be accurate in the context of factors impacting validity, such as disengagement, unmotivated responding, or intentional underperformance. Performance validity tests (PVTs) were developed to address these phenomena and assess underperformance on neurocognitive tests. However, PVTs can be burdensome, rely on cutoff scores that reduce information, do not examine potential variations in task engagement across a battery, and are typically not well-suited to acquisition of large cognitive datasets. Here we describe the development of novel performance validity measures that could address some of these limitations by leveraging psychometric concepts using data embedded within the Penn Computerized Neurocognitive Battery (PennCNB).
We first developed these validity measures using simulations of invalid response patterns with parameters drawn from real data. Next, we examined their application in two large, independent samples: 1) children and adolescents from the Philadelphia Neurodevelopmental Cohort (n = 9498); and 2) adult servicemembers from the Marine Resiliency Study-II (n = 1444).
Our performance validity metrics detected patterns of invalid responding in simulated data, even at subtle levels. Furthermore, a combination of these metrics significantly predicted previously established validity rules for these tests in both developmental and adult datasets. Moreover, most clinical diagnostic groups did not show reduced validity estimates.
These results provide proof-of-concept evidence for multivariate, data-driven performance validity metrics. These metrics offer a novel method for determining the performance validity for individual neurocognitive tests that is scalable, applicable across different tests, less burdensome, and dimensional. However, more research is needed into their application.
Clozapine is the only drug licensed for treatment-resistant schizophrenia (TRS) but the real-world clinical and cost-effectiveness of community initiation of clozapine is unclear.
The aim was to assess the feasibility and cost-effectiveness of community initiation of clozapine.
This was a naturalistic study of community patients recommended for clozapine treatment.
Of 158 patients recommended for clozapine treatment, 88 (56%) agreed to clozapine initiation and, of these, 58 (66%) were successfully established on clozapine. The success rate for community initiation was 65.4%, which was not significantly different from that for in-patient initiation (58.82%, χ2(1,88) = 0.47, P = 0.49). Following clozapine initiation, there was a significant reduction in median out-patient visits over 1 year (from 24.00 (interquartile range (IQR) = 14.00–41.00) to 13.00 visits (IQR = 5.00–24.00), P < 0.001), and 2 years (from 47.50 visits (IQR = 24.75–71.00) to 22.00 (IQR = 11.00–42.00), P < 0.001), and a 74.71% decrease in psychiatric hospital bed days (z = −2.50, P = 0.01). Service-use costs decreased (1 year: –£963/patient (P < 0.001); 2 years: –£1598.10/patient (P < 0.001)). Subanalyses for community-only initiation also showed significant cost reductions relative to costs prior to starting clozapine (1 year: –£827.40/patient (P < 0.001); 2 years: –£1668.50/patient (P < 0.001)). Relative to before initiation, symptom severity was improved in patients taking clozapine at discharge (median Positive and Negative Syndrome Scale total score: initial visit: 80 (IQR = 71.00–104.00); discharge visit: 50.5 (IQR = 44.75–75.00), P < 0.001) and at 2-year follow-up (Health of Nation Outcome Scales total score: median initial visit: 13.00 (IQR = 9.00–15.00); 2-year follow-up: 8.00 (IQR = 3.00–13.00), P = 0.023).
These findings indicate that community initiation of clozapine is feasible and is associated with significant reductions in costs, service use and symptom severity.
OBJECTIVES/GOALS: Osteoarthritis (OA) is a cartilage destroying disease. We are investigating abaloparatide (ABL) activation of parathyroid hormone receptor type 1 (PTH1R), which is expressed by articular chondrocytes in OA. We propose ABL treatment is chondroprotective in murine PTOA via stimulation of matrix production and inhibition of chondrocyte maturation. METHODS/STUDY POPULATION: 16-week-old C57BL/6 male mice received destabilization of the medial meniscus (DMM) surgery to induce knee PTOA. Beginning 2 weeks post-DMM, 40 μg/kg of ABL (or saline) was administered daily via subcutaneous injection and tissues were harvested after 6 weeks of daily injections and 8 weeks after DMM surgery. Harvested joint tissues were used for histological and molecular assessment of OA using three 5 μm thick sagittal sections from each joint, 50 μm apart, cut from the medial compartment of injured knees. Safranin O/Fast Green tissue staining and immunohistochemistry-based detection of type 10 collagen (Col10) and lubricin (Prg4) was performed using standard methods. Histomorphometric quantification of tibial cartilage area and larger hypertrophic-like cells was performed using the Osteomeasure system. RESULTS/ANTICIPATED RESULTS: Safranin O/Fast Green stained sections showed decreased cartilage loss in DMM joints from ABL-treated versus saline-treated mice. Histomorphometric analysis of total tibial cartilage area revealed preservation of cartilage tissue on the tibial surface. Immunohistochemical analyses showed that upregulation of Col10 in DMM joints was mitigated in the cartilage of ABL-treated mice, and chondrocyte expression of Prg4 was increased in uncalcified cartilage areas in the ABL-treated group. The Prg4 finding suggests a matrix anabolic effect that may counter OA cartilage loss.
Quantification of chondrocytes in uncalcified and calcified tibial cartilage areas revealed a reduction in the number of larger hypertrophic-like cells in ABL-treated mice, suggesting deceleration of hypertrophic differentiation. DISCUSSION/SIGNIFICANCE: Cartilage preservation/regeneration therapies would fill a critical unmet need. We demonstrate that an osteoporosis drug targeting PTH1R decelerates PTOA in mice. ABL treatment was associated with preservation of cartilage, decreased Col10, increased Prg4, and a decreased number of large hypertrophic-like chondrocytes in the tibial cartilage.
This chapter reviews five decades of research on reactions to mirrors and self-recognition in nonhuman primates, starting with Gallup’s (1970) pioneering experimental demonstration of self-recognition in chimpanzees and its apparent absence in monkeys. Taking a decade-by-decade approach, developments in the field are presented separately for great apes on the one hand, and all other primates on the other (prosimians, monkeys, and so-called lesser apes), considering both empirical studies and theoretical issues. The literature clearly shows that among nonhuman primates the most compelling evidence for something approaching human-like visual self-recognition is seen only in great apes, despite an impressive range of sometimes highly original procedures employed to study many monkey species. In the past decade, research has been shifting from simple questions about whether great apes can self-recognize (now considered beyond doubt), to addressing possible biological bases for individual and species differences in the strength of self-recognition, analysis of possible adaptive functions of the capacity for self-visualization, and searching for evidence of self-recognition in a range of nonprimate species.
This paper explores the interaction of informal constraints on human behaviour by examining the evolution of English football jerseys. The jersey provides an excellent setting to demonstrate how informal constraints emerge from formal rules and shape human behaviour. Customs, approved norms and habits are all observed in this setting. The commercialisation of football in recent decades has resulted in these informal constraints, in many cases dating back over a century, co-existing with branding, goodwill and identity effects. Combined, these motivate clubs to maintain the status quo. As a result, club colours have remained remarkably resilient within a frequently changing landscape.
An early economic evaluation to inform the translation into clinical practice of a spectroscopic liquid biopsy for the detection of brain cancer. Two specific aims are (1) to update an existing economic model with results from a prospective study of diagnostic accuracy and (2) to explore the potential of brain tumor-type predictions to affect patient outcomes and healthcare costs.
A cost-effectiveness analysis from a UK NHS perspective of the use of spectroscopic liquid biopsy in primary and secondary care settings, as well as a cost–consequence analysis of the addition of tumor-type predictions was conducted. Decision tree models were constructed to represent simplified diagnostic pathways. Test diagnostic accuracy parameters were based on a prospective validation study. Four price points (GBP 50-200, EUR 57-228) for the test were considered.
In both settings, the use of liquid biopsy produced QALY gains. In primary care, at test costs below GBP 100 (EUR 114), testing was cost saving. At GBP 100 (EUR 114) per test, the ICER was GBP 13,279 (EUR 15,145), whereas at GBP 200 (EUR 228), the ICER was GBP 78,300 (EUR 89,301). In secondary care, the ICER ranged from GBP 11,360 (EUR 12,956) to GBP 43,870 (EUR 50,034) across the range of test costs.
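The ICERs reported above come from the authors' decision-tree models, but the underlying arithmetic is simply the incremental cost of the testing strategy divided by its incremental QALYs versus the comparator. A minimal sketch, using hypothetical increments rather than values from the study:

```python
def icer(incremental_cost, incremental_qalys):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    if incremental_qalys == 0:
        raise ValueError("ICER is undefined when incremental QALYs are zero")
    return incremental_cost / incremental_qalys

# Hypothetical example: a strategy costing GBP 1,000 more per patient that
# yields 0.08 extra QALYs gives GBP 12,500 per QALY gained, below the
# commonly cited NICE threshold range of GBP 20,000-30,000 per QALY.
example = icer(1000.0, 0.08)
```

A negative incremental cost with a QALY gain (as in the sub-GBP 100 primary-care scenario above) means the strategy is dominant: cheaper and more effective, so no threshold comparison is needed.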
The results demonstrate the potential for the technology to be cost-effective in both primary and secondary care settings. Additional studies of test use in routine primary care practice are needed to resolve the remaining issues of uncertainty—prevalence in this patient population and referral behavior.
The Late Triassic fauna of the Lossiemouth Sandstone Formation (LSF) from the Elgin area, Scotland, has been pivotal in expanding our understanding of Triassic terrestrial tetrapods. Frustratingly, due to their odd preservation, interpretations of the Elgin Triassic specimens have relied on destructive moulding techniques, which only provide incomplete, and potentially distorted, information. Here, we show that micro-computed tomography (μCT) could revitalise the study of this important assemblage. We describe a long-neglected specimen that was originally identified as a pseudosuchian archosaur, Ornithosuchus woodwardi. μCT scans revealed dozens of bones belonging to at least two taxa: a small-bodied pseudosuchian and a specimen of the procolophonid Leptopleuron lacertinum. The pseudosuchian skeleton possesses a combination of characters that are unique to the clade Erpetosuchidae. As a basis for investigating the phylogenetic relationships of this new specimen, we reviewed the anatomy, taxonomy and systematics of other erpetosuchid specimens from the LSF (all previously referred to Erpetosuchus). Unfortunately, due to the differing representation of the skeleton in the available Erpetosuchus specimens, we cannot determine whether the erpetosuchid specimen we describe here belongs to Erpetosuchus granti (to which we show it is closely related) or if it represents a distinct new taxon. Nevertheless, our results shed light on rarely preserved details of erpetosuchid anatomy. Finally, the unanticipated new information extracted from both previously studied and neglected specimens suggests that fossil remains may be much more widely distributed in the Elgin quarries than previously recognised, and that the richness of the LSF might have been underestimated.
To develop a regional antibiogram within the Chicagoland metropolitan area and to compare regional susceptibilities against individual hospitals within the area and national surveillance data.
Multicenter retrospective analysis of antimicrobial susceptibility data from 2017 and comparison to local institutions and national surveillance data.
Setting and participants:
The analysis included 51 hospitals from the Chicago–Naperville–Elgin Metropolitan Statistical Area within the state of Illinois. Overall, 18 individual collaborator hospitals provided antibiograms for analysis, and data from 33 hospitals were provided in aggregate by the Becton Dickinson Insights Research Database.
All available antibiogram data from calendar year 2017 were combined to generate the regional antibiogram. The final Chicagoland antibiogram was then compared internally to collaborators and externally to national surveillance data to assess its applicability and utility.
In total, 167,394 gram-positive, gram-negative, fungal, and mycobacterial isolates were collated to create a composite regional antibiogram. The regional data represented the local institutions well, with 96% of the collaborating institutions falling within ±2 standard deviations of the regional mean. The regional antibiogram was able to include 4–5-fold more gram-positive and -negative species with ≥30 isolates than the median reported by local institutions. Against national surveillance data, 18.6% of assessed pathogen–antibiotic combinations crossed prespecified clinical thresholds for disparity in susceptibility rates, with notable trends for resistant gram-positive and gram-negative bacteria.
Developing an accurate, reliable regional antibiogram is feasible, even in one of the largest metropolitan areas in the United States. The antibiogram is useful in assessing susceptibilities to less commonly encountered organisms and providing clinicians with a more accurate representation of local antimicrobial resistance rates compared to national surveillance databases.
Psychosis is more prevalent among people in prison compared with the community. Early detection is important to optimise health and justice outcomes; for some, this may be the first time they have been clinically assessed.
Determine factors associated with a first diagnosis of psychosis in prison and describe time to diagnosis from entry into prison.
This retrospective cohort study describes individuals identified for the first time with psychosis in New South Wales (NSW) prisons (2006–2012). Logistic regression was used to identify factors associated with a first diagnosis of psychosis. Cox regression was used to describe time to diagnosis from entry into prison.
Of the 38 489 diagnosed with psychosis for the first time, 1.7% (n = 659) occurred in prison. Factors associated with an increased likelihood of being diagnosed in prison (versus community) were: male gender (odds ratio (OR) = 2.27, 95% CI 1.79–2.89), Aboriginality (OR = 1.81, 95% CI 1.49–2.19), older age (OR = 1.70, 95% CI 1.37–2.11 for 25–34 years and OR = 1.63, 95% CI 1.29–2.06 for 35–44 years) and disadvantaged socioeconomic area (OR = 4.41, 95% CI 3.42–5.69). Eight out of ten were diagnosed within 3 months of reception.
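The odds ratios above were estimated by logistic regression; for a single binary factor, the underlying quantity can be sketched from a 2 × 2 table with a Wald confidence interval (illustrative counts, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% confidence interval.
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30 of 100 in one group vs 10 of 100 in the other
# first diagnosed in prison. A CI excluding 1 indicates a significant association.
or_, lo, hi = odds_ratio_ci(30, 70, 10, 90)
```

The multivariable models in the study adjust each odds ratio for the other factors simultaneously, which a single 2 × 2 table cannot do; this sketch only shows where the ratio-and-interval form of the reported estimates comes from.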
Among those diagnosed with psychosis for the first time, only a small number were identified during incarceration with most identified in the first 3 months following imprisonment. This suggests good screening processes are in place in NSW prisons for detecting those with serious mental illness. It is important these individuals receive appropriate care in prison, have the opportunity to have matters reheard and possibly diverted into treatment, and are subsequently connected to community mental health services on release.
The Interplay of Genes and Environment across Multiple Studies (IGEMS) is a consortium of 18 twin studies from 5 different countries (Sweden, Denmark, Finland, United States, and Australia) established to explore the nature of gene–environment (GE) interplay in functioning across the adult lifespan. Fifteen of the studies are longitudinal, with follow-up as long as 59 years after baseline. The combined data from over 76,000 participants aged 14–103 at intake (including over 10,000 monozygotic and over 17,000 dizygotic twin pairs) support two primary research emphases: (1) investigation of models of GE interplay involving early-life adversity and social factors at micro- and macro-environmental levels, with diverse outcomes including mortality, physical functioning and psychological functioning; and (2) improved understanding of risk and protective factors for dementia by incorporating unmeasured and measured genetic factors with a wide range of exposures measured in young adulthood, midlife and later life.
The National Academy of Sciences-National Research Council (NAS-NRC) Twin Registry is one of the oldest, national population-based twin registries in the USA. It comprises 15,924 White male twin pairs born in the years 1917–1927 (N = 31,848), both of whom served in the armed forces, chiefly during World War II. This article updates activities in this registry since the most recent report in Twin Research and Human Genetics (Page, 2006). Records-based data include information from enlistment charts and Veterans Administration data linkages. There have been three major epidemiologic questionnaires and an education and earnings survey. Separate data collection efforts with the NAS-NRC registry include the National Heart, Lung, and Blood Institute (NHLBI) subsample, the Duke Twins Study of Memory in Aging and a clinically based study of Parkinson’s disease. Progress has been made on consolidating the various data holdings of the NAS-NRC Twin Registry. Data that had been available through the National Academy of Sciences are now freely available through the National Archive of Computerized Data on Aging (NACDA).
The voting paradox occurs when a democratic society seeking to aggregate individual preferences into a social preference reaches an intransitive ordering. However it is not widely known that the paradox may also manifest for an individual aggregating over attributes of risky objects to form a preference over those objects. When this occurs, the relation ‘stochastically greater than’ is not always transitive and so transitivity need not hold between those objects. We discuss the impact of other decision paradoxes to address a series of philosophical and economic arguments against intransitive (cyclical) choice, before concluding that intransitive choices can be justified.
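The aggregation step can be made concrete with the classic Condorcet profile, read here as a single individual ranking three risky objects on three equally weighted attributes (an illustrative sketch, not the paper's construction):

```python
def majority_prefers(rankings, x, y):
    """True if x is ranked above y on a strict majority of attribute rankings."""
    above = sum(1 for r in rankings if r.index(x) < r.index(y))
    return above > len(rankings) / 2

# Three attributes, each giving a strict ranking of objects A, B, C.
rankings = [["A", "B", "C"], ["B", "C", "A"], ["C", "A", "B"]]

# Pairwise attribute-majorities cycle: A > B and B > C, yet C > A.
assert majority_prefers(rankings, "A", "B")
assert majority_prefers(rankings, "B", "C")
assert majority_prefers(rankings, "C", "A")
```

Each pairwise comparison is decided by a 2-to-1 majority of attributes, so no single pairwise judgement looks unreasonable, yet no transitive ordering of the three objects is consistent with all of them.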
With significant numbers of individuals in the criminal justice system having mental health problems, court-based diversion programmes and liaison services have been established to address this problem.
To examine the effectiveness of the New South Wales (Australia) court diversion programme in reducing re-offending among those diagnosed with psychosis by comparing the treatment order group with a comparison group who received a punitive sanction.
Those with psychoses were identified from New South Wales Ministry of Health records between 2001 and 2012 and linked to offending records. Cox regression models were used to identify factors associated with re-offending.
A total of 7743 individuals were identified as diagnosed with a psychotic disorder prior to their court finalisation date for their first principal offence. Overall, 26% of the cohort received a treatment order and 74% received a punitive sanction. The re-offending rate in the treatment order group was 12% lower than the punitive sanction group. ‘Acts intended to cause injury’ was the most common type of the first principal offence for the treatment order group compared with the punitive sanction group (48% v. 27%). Drug-related offences were more likely to be punished with a punitive sanction than a treatment order (12% v. 2%).
Among those with a serious mental illness (i.e. psychosis), receiving a treatment order by the court rather than a punitive sanction was associated with reduced risk of subsequent offending. We further examined actual mental health treatment received and found that receiving no treatment following the first offence was associated with an increased risk of re-offending, highlighting the importance of treatment for those with serious mental illness in the criminal justice system.
We retrospectively evaluated the effect of penicillin adverse drug reaction (ADR) labeling on surgical antibiotic prophylaxis. Cefazolin was administered in 86% of penicillin ADR-negative (−) and 28% penicillin ADR-positive (+) cases. Broad-spectrum antibiotic use was more common in ADR(+) cases and was more commonly associated with perioperative adverse drug events.
Introduction: Situational awareness (SA) is essential for maintenance of scene safety and effective resource allocation in mass casualty incidents (MCI). Unmanned aerial vehicles (UAV) can potentially enhance SA with real-time visual feedback during chaotic and evolving or inaccessible events. The purpose of this study was to test the ability of paramedics to use UAV video from a simulated MCI to identify scene hazards, initiate patient triage, and designate key operational locations. Methods: A simulated MCI, including fifteen patients of varying acuity (blast type injuries), plus four hazards, was created on a college campus. The scene was surveyed by UAV capturing video of all patients, hazards, surrounding buildings and streets. Attendees of a provincial paramedic meeting were invited to participate. Participants received a lecture on SALT Triage and the principles of MCI scene management. Next, they watched the UAV video footage. Participants were directed to sort patients according to SALT Triage step one, identify injuries, and localize the patients within the campus. Additionally, they were asked to select a start point for SALT Triage step two, identify and locate hazards, and designate locations for an Incident Command Post, Treatment Area, Transport Area and Access/Egress routes. Summary statistics were performed and a linear regression model was used to assess relationships between demographic variables and both patient triage and localization. Results: Ninety-six individuals participated. Mean age was 35 years (SD 11), 46% (44) were female, and 49% (47) were Primary Care Paramedics. Most participants (80 (84%)) correctly sorted at least 12 of 15 patients. Increased age was associated with decreased triage accuracy [-0.04(-0.07,-0.01);p=0.031]. Fifty-two (54%) were able to localize 12 or more of the 15 patients to a 27 × 20 m grid area.
Advanced paramedic certification and local residency were associated with improved patient localization [2.47(0.23,4.72);p=0.031], [-3.36(-5.61,-1.1);p=0.004]. The majority of participants (78 (81%)) chose an acceptable location to start SALT Triage step two and 84% (80) identified at least three of four hazards. Approximately half (53 (55%)) of participants designated four or more of five key operational areas in appropriate locations. Conclusion: This study demonstrates the potential of UAV technology to remotely provide emergency responders with SA in an MCI. Additional research is required to further investigate optimal strategies to deploy UAVs in this context.
Campylobacteriosis, the most frequent bacterial enteric disease, shows a clear yet unexplained seasonality. The study purpose was to explore the influence of seasonal fluctuations in the contamination of, and behavioural exposures to, two important sources of Campylobacter on the seasonality of campylobacteriosis. Time series analyses were applied to data collected through an integrated surveillance system in Canada in 2005–2010. Data included sporadic, domestically-acquired cases of Campylobacter jejuni infection, contamination of retail chicken meat and of surface water by C. jejuni, and exposure to each source through barbequing and swimming in natural waters. Seasonal patterns were evident for all variables, with a peak in summer for human cases and for both exposures, in fall for chicken meat contamination, and in late fall for water contamination. Time series analyses showed that the observed campylobacteriosis summer peak could only be significantly linked to behavioural exposures rather than source contamination (swimming rather than water contamination, and barbequing rather than chicken meat contamination). The results indicate that the observed summer increase in human cases may be more the result of amplification through more frequent risky exposures than of an increase in Campylobacter source contamination.