A nonparametric test of dispersion for paired-replicates data is described, which involves jackknifing a logarithmic transformation of the ratio of variance estimates for the pre- and post-treatment populations. Results from a Monte Carlo simulation show that the test performs well under the null hypothesis and has good power properties. Examples are given of applying the procedure to psychiatric data.
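The procedure described above can be sketched in a few lines: compute the log of the post/pre variance ratio, form delete-one jackknife pseudo-values, and refer the resulting statistic to a t distribution. The function below is a hypothetical minimal implementation; the paper's exact statistic and any pairing adjustments may differ.

```python
import numpy as np
from scipy import stats

def jackknife_dispersion_test(pre, post):
    """Jackknife test on the log ratio of post- to pre-treatment variance.

    Sketch under stated assumptions: delete one pair at a time, recompute
    the log variance ratio, form pseudo-values, and apply a one-sample
    t-test against zero (no change in dispersion).
    """
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    n = len(pre)
    theta_full = np.log(np.var(post, ddof=1) / np.var(pre, ddof=1))
    pseudo = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i          # leave pair i out
        theta_i = np.log(np.var(post[keep], ddof=1) / np.var(pre[keep], ddof=1))
        pseudo[i] = n * theta_full - (n - 1) * theta_i
    est = pseudo.mean()                    # jackknife estimate of log ratio
    se = pseudo.std(ddof=1) / np.sqrt(n)   # jackknife standard error
    t = est / se
    p = 2 * stats.t.sf(abs(t), df=n - 1)   # two-sided p-value
    return est, t, p
```

A positive estimate indicates increased post-treatment dispersion; the log transformation makes the statistic symmetric in the two variances.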
The need to maintain transport during a pandemic places transport workers at higher risk of infection and can have other effects on health and well-being. The aim of this study was to understand the current state of research on the impact of respiratory diseases on transport workers and to identify any existing evidence-based recommendations that can help mitigate the risks associated with these diseases in the transport industry. A scoping review was undertaken as per PRISMA guidelines. A search was conducted in English-language databases for peer-reviewed research articles. We reviewed research articles published over 20 years (2002–2022). We found 12,540 articles, of which 39 were deemed relevant and analysed. The review highlighted the high risk of transport workers’ exposure to respiratory diseases during pandemics, exacerbated by structural inequalities, including the significant number holding precarious/non-standard jobs. Increased financial strains led to poorer mental health outcomes and risks of behaviours detrimental to health. Economic measures implemented by governments were found to be insufficient in addressing these issues. The review found that transport is a significant transmission point for pandemics of respiratory diseases, and it suggests some remedies to best meet these challenges.
Public Health is essential to disaster preparedness, mitigation, response, and recovery. This has never been more evident than during the COVID-19 pandemic when public health was the disaster response lead. However, students are graduating from accredited schools and colleges of public health with limited or no education in disaster management. This is a crisis unto itself, and it is incumbent upon The Council on Education for Public Health (CEPH) to take immediate action. Public health preparedness should be recognized as a core element in public health curricula, and practical experiences, such as drills and simulations, are necessary to equip students with the confidence and competencies needed in high-stress situations. The need for such preparedness education extends beyond the COVID-19 pandemic. It is a crucial step for creating a resilient and competent public health workforce capable of safeguarding community health in the face of complex and emerging challenges.
How was trust created and reinforced between the inhabitants of medieval and early modern cities? And how did the social foundations of trusting relationships change over time? Current research highlights the role of kinship, neighbourhood, and associations, particularly guilds, in creating ‘relationships of trust’ and social capital in the face of high levels of migration, mortality, and economic volatility, but tells us little about their relative importance or how they developed. We uncover a profound shift in the contribution of family and guilds to trust networks among the middling and elite of one of Europe's major cities, London, over three centuries, from the 1330s to the 1680s. We examine almost 15,000 networks of sureties created to secure orphans’ inheritances to measure the presence of trusting relationships connected by guild membership, family, and place. We find a marked increase in the role of kinship – a re-embedding of trust within the family – and a decline in the importance of shared guild membership in connecting Londoners who secured orphans’ inheritances together. These developments indicate a profound transformation in the social fabric of urban society.
Chapter 11 focuses on ancient ‘contracts’, with specific reference to commerce, property and other economic activities for which there is relevant evidence. The chapter begins with urbanization in southern Mesopotamia in the fourth millennium bce, bringing together archaeological, material and written evidence in order to introduce a broad working idea of ‘contracts’. The next section moves on to a discussion of technical ancient terms and concepts, noting the ‘considerable terminological instability in the common English translations of the original terms’. The following section turns to ‘contracts’ between states, whilst the next develops a comparative analysis of ‘oaths in interpersonal agreements’. The following two sections analyse specific questions surrounding the use of writing and ‘the contract of sale’, noting that there is surviving evidence for the use of (different forms of) contracts of sale across every ancient legal system. The chapter concludes by drawing together a set of generalized conceptions of ‘contract’ and briefly suggesting that long-distance trade, among other factors, may lie behind some of the similarities (for example, the use of seals) evident across the extant ancient evidence.
Non-traumatic posterior fossa haemorrhage accounts for approximately 10% of all intracranial haematomas and 1.5% of all strokes. Because the posterior fossa is a small compartment, even a small amount of mass effect can have dramatic consequences. These can result from immediate transmission of pressure to the brainstem, or from occlusion of the aqueduct of Sylvius or compression of the fourth ventricle, leading to acute obstructive hydrocephalus with the risk of tonsillar herniation. Timely investigation and management are essential to maximise good outcomes. This Element offers a brief overview of posterior fossa haemorrhage. It covers the anatomy, aetiology, management, and surgical options, with a review of the available evidence to guide practice.
A 30-year data series on the infection dynamics of European eel (Anguilla anguilla L.) with the non-native invasive nematode Anguillicola crassus Kuwahara, Niimi & Itagaki, 1974 is presented. Parasite burden was evaluated over 30 years in inland and coastal waters of Mecklenburg-Western Pomerania, from 1991 to 2020. The total prevalence, mean intensity and damage status of the swim bladders were very high during the first decade (1991–2000) and decreased significantly in both marine and freshwater eel populations in the following decades (2001–2010, 2011–2020). The parasite intensity of eels in coastal waters was significantly lower than in the freshwater systems (61.3% vs 79.5% in the first decade), indicating the vulnerability of the parasite to brackish water conditions and the fact that the life cycle of A. crassus cannot be completed under highly saline conditions. Eels caught in the western part of the Baltic Sea (west of the Darss sill) had the lowest mean infection (51.8% in the first decade), compared with 63.8% in the eastern part. Thus, besides the different infection patterns caused by environmental conditions, a temporal trend towards reduced parasite intensity and a more balanced parasite–host relationship developed over the 30 years of interaction following the first invasion. Possible reasons and mechanisms for the observed trends in parasite–host interactions are discussed.
Excellence is that quality that drives continuously improving outcomes for patients. Excellence must be measurable. We set out to measure excellence in forensic mental health services according to four levels of organisation and complexity (basic, standard, progressive and excellent) across seven domains: values and rights; clinical organisation; consistency; timescale; specialisation; routine outcome measures; research and development.
Aims
To validate the psychometric properties of a measurement scale to test which objective features of forensic services might relate to excellence: for example, university linkages, service size and integrated patient pathways across levels of therapeutic security.
Method
A survey instrument was devised by a modified Delphi process. Forensic leads, either clinical or academic, in 48 forensic services across 5 jurisdictions completed the questionnaire.
Results
Regression analysis found that the number of security levels, linked patient pathways, number of in-patient teams and joint university appointments predicted total excellence score.
Conclusions
Larger services organised according to stratified therapeutic security and with strong university and research links scored higher on this measure of excellence. A weakness is that these were self-ratings. Reliability could be improved with peer review and with objective measures such as quality and quantity of research output. For the future, studies are needed of the determinants of other objective measures of better outcomes for patients, including shorter lengths of stay, reduced recidivism and readmission, and improved physical and mental health and quality of life.
In the United Kingdom, the government was initially slow to recognize the profound dangers of the COVID-19 pandemic, but soon after Prime Minister Boris Johnson's initial plea to the public to ‘stay at home’ in March 2020, emergency legislation was rushed through parliament. On 25 March, the 350-page Coronavirus Act 2020 received royal assent, bringing the biggest restrictions on civil liberties in a generation into law the following day. Overnight, the Coronavirus Act, along with the broader raft of legal restrictions under The Health Protection (Coronavirus) Regulations 2020, made it unlawful to undertake a wide range of hitherto economically essential, prosocial and noncriminal activities. Even as the Act was rushed through parliament, civil liberties organizations were alerting parliamentarians to its dangers (Gidda, 2020).
As antiracist commentators and academics forewarned (Frazer-Carroll, 2020; Khan, 2020), racial disproportionality in policing has endured and often increased through the pandemic. As the first ‘lockdown’ came into effect, stop and search practices ‘surged’ despite the steep drop in crime rates (Grierson, 2020). Limited and prone to undercounting as they may be, Home Office data show that in the year ending March 2021, stop and search practices (under Section 1 of the Police and Criminal Evidence Act 1984) increased significantly to reach their highest level in seven years, impacting most on racially minoritized men (Home Office, 2022). Home Office data (2021) also show an increase in use of force for the year ending March 2021. This was racially disproportionate too, with Black people accounting for 16 per cent of those affected (though they make up just 3 per cent of the population according to the 2011 Census), and Asian people accounting for 8 per cent (7 per cent of the population according to the 2011 Census). In the summer of 2020, these patterns coalesced with mass global protests against racist police violence. The police murder of George Floyd in the United States catalyzed millions to march under the banner of Black Lives Matter (BLM) and spoke to the ongoing police brutality faced by racially minoritized people in Britain (Joseph-Salisbury et al, 2020).
Objectives: The dissemination of Escherichia coli producing extended-spectrum β-lactamase (ESBL-Ec) is evident in the community. In this population-based spatial analysis, we sought to describe the distribution of ESBL-Ec and to identify predictors of incidence in the community. Methods: The study population was defined as individuals with an ESBL-Ec isolate in Queensland, Australia, from 2010 to 2019. Annual choropleth maps and a global Moran index were constructed to describe ESBL-Ec distribution. Getis-Ord Gi* was performed to identify “hot spots” of statistical significance. Using demographic factors and incidence per postal area from 2016, multivariable analyses with or without spatially structured random effects were performed. Results: In total, 12,786 individuals with an ESBL-Ec isolate were identified. The incidence rate increased annually from 9.1 per 100,000 residents in 2010 to 49.8 per 100,000 residents in 2019. The geographical distribution changed from random to clustered in 2014. Hot spots were more frequently identified in the Outback and Far North Queensland, where remote communities and hotter weather are prevalent. Multivariable spatial analysis suggested that higher community socioeconomic status (RR, 0.66; 95% CI, 0.55–0.79 per 100 units) and employment in the agricultural industry (RR, 0.79; 95% CI, 0.67–0.95 per 10%) were associated with lower ESBL-Ec incidence. After accounting for multiple demographic factors, the residual, structured, random-effects model indicated that hot spots were still detected in more remote communities but also in several city regions. Conclusions: The change in distribution of ESBL-Ec across Queensland suggests the presence of area-level specific risk factors that enhance spread in the community. Risk factors for spread appear different between remote and city settings, and future research should be tailored to understand the respective area-level risk factors.
Factors such as local temperature, antibiotic consumption, and access to services should be validated. Future public health measures to reduce transmission should be focused on the identified hot spots.
The dissemination of Escherichia coli producing extended-spectrum beta-lactamase (ESBL-Ec) is evident in the community. A population-based spatial analysis is necessary to investigate community risk factors for ESBL-Ec occurrence. The study population was defined as individuals with ESBL-Ec isolated in Queensland, Australia, from 2010 to 2019. Choropleth maps, the global Moran's index and Getis-Ord Gi* were used to describe ESBL-Ec distribution and identify hot spots. Multivariable Poisson regression models with or without spatially structured random effects were performed. A total of 12 786 individuals with an ESBL-Ec isolate were identified. The crude incidence rate increased annually from 9.1 per 100 000 residents in 2010 to 49.8 per 100 000 residents in 2019. The geographical distribution of ESBL-Ec changed from random to clustered after 2014, suggesting the presence of community-specific factors that can enhance occurrence. Hot spots were more frequently identified in Outback and Far North Queensland; future public health measures to reduce transmission should prioritise these communities. Communities with higher socioeconomic status (RR = 0.66, 95% CI 0.55–0.79, per 100 units increase) and a higher proportion of residents employed in the agricultural industry (RR = 0.79, 95% CI 0.67–0.95, per 10% increase) had lower ESBL-Ec incidence. Risk factors for occurrence appear to differ between remote and city settings, and this should be further investigated.
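The global Moran's index used in these analyses to detect the shift from a random to a clustered spatial pattern has a compact closed form. In practice dedicated spatial packages (e.g., PySAL/esda) are used; the function below is only a minimal sketch, assuming a symmetric spatial weights matrix with a zero diagonal.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for areal data.

    Minimal sketch: `values` holds one observation per area (e.g., incidence
    per postal area) and `weights` is a symmetric, zero-diagonal spatial
    weights matrix (1 where two areas are neighbours, 0 otherwise).
    Values near 0 suggest spatial randomness; positive values suggest
    clustering of similar rates.
    """
    x = np.asarray(values, float)
    w = np.asarray(weights, float)
    n = len(x)
    z = x - x.mean()                       # deviations from the mean rate
    num = (w * np.outer(z, z)).sum()       # weighted cross-products of neighbours
    den = (z ** 2).sum()
    return (n / w.sum()) * (num / den)
```

Significance is usually assessed against a randomisation null (permuting values across areas), which the sketch omits.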
Whole-genome sequencing (WGS) shotgun metagenomics (metagenomics) attempts to sequence the entire genetic content directly from the sample. Its diagnostic advantages lie in the ability to detect unsuspected, uncultivable, or very slow-growing organisms.
Objective:
To evaluate the clinical and economic effects of using WGS and metagenomics for outbreak management in a large metropolitan hospital.
Design:
Cost-effectiveness study.
Setting:
Intensive care unit and burn unit of large metropolitan hospital.
Patients:
Simulated intensive care unit and burn unit patients.
Methods:
We built a complex simulation model to estimate pathogen transmission, associated hospital costs, and quality-adjusted life years (QALYs) during a 32-month outbreak of carbapenem-resistant Acinetobacter baumannii (CRAB). Model parameters were determined using microbiology surveillance data, genome sequencing results, hospital admission databases, and local clinical knowledge. The model was calibrated to the actual pathogen spread within the intensive care unit and burn unit (scenario 1) and compared with early use of WGS (scenario 2) and early use of WGS and metagenomics (scenario 3) to determine their respective cost-effectiveness. Sensitivity analyses were performed to address model uncertainty.
Results:
On average compared with scenario 1, scenario 2 resulted in 14 fewer patients with CRAB, 59 additional QALYs, and $75,099 cost savings. Scenario 3, compared with scenario 1, resulted in 18 fewer patients with CRAB, 74 additional QALYs, and $93,822 in hospital cost savings. The likelihoods that scenario 2 and scenario 3 were cost-effective were 57% and 60%, respectively.
Conclusions:
The use of WGS and metagenomics in infection control processes was predicted to produce favorable economic and clinical outcomes.
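The scenario comparisons reported above reduce to a standard incremental analysis of costs and QALYs. The helper below is a generic sketch of that calculation, not the study's simulation model, and the $50,000-per-QALY willingness-to-pay threshold is an assumed illustrative value.

```python
def incremental_analysis(cost_base, qaly_base, cost_new, qaly_new, wtp=50_000):
    """Incremental cost-effectiveness comparison of two scenarios.

    Generic sketch: returns the incremental cost, incremental QALYs, and
    net monetary benefit (NMB) of the new scenario versus the base case
    at the given willingness-to-pay (wtp) threshold per QALY. NMB > 0
    means the new scenario is cost-effective at that threshold.
    """
    d_cost = cost_new - cost_base
    d_qaly = qaly_new - qaly_base
    nmb = wtp * d_qaly - d_cost
    return d_cost, d_qaly, nmb
```

An intervention that both saves money and adds QALYs, as in scenarios 2 and 3, is said to dominate the base case: its NMB is positive at any non-negative threshold.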
Research was conducted using a functional malachite green colorimetric assay to evaluate acetyl-coenzyme A carboxylase (ACCase) activity previously identified as resistant to sethoxydim and select aryloxyphenoxypropionate (FOP) herbicides, fenoxaprop and fluazifop. Two resistant southern crabgrass [Digitaria ciliaris (Retz.) Koeler] biotypes, R1 and R2, containing an Ile-1781-Leu amino acid substitution and previously identified as resistant to sethoxydim, pinoxaden, and fluazifop but not clethodim, were utilized as the resistant chloroplastic ACCase source and compared with a known susceptible (S) ACCase. Dose-response studies with sethoxydim, clethodim, fluazifop-p-butyl, and pinoxaden (0.6 to 40 µM) were conducted to compare the ACCase–herbicide interactions of R1, R2, and S using the malachite green functional assay. Assay results indicated that R biotypes required more ACCase-targeting herbicide to inhibit ACCase activity than S did. IC50 values of all four herbicides for R biotypes were consistently an order of magnitude greater than those of S. No sequencing differences in the carboxyltransferase domain were observed for R1 and R2; however, R2 IC50 values were greater across all herbicides. These results indicate the malachite green functional assay is effective in evaluating ACCase activity of R and S biotypes in the presence of ACCase-targeting herbicides and can be used as a replacement for 14C-based radiometric functional assays.
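IC50 estimation of the kind reported here is typically done by fitting a log-logistic curve to enzyme-activity readings across the dose series. The sketch below uses a generic four-parameter fit; the function names and starting values are illustrative and not the study's analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, top, bottom, ic50, hill):
    """Four-parameter log-logistic dose-response curve.

    `top`/`bottom` are the upper and lower activity plateaus, `ic50` the
    dose giving half-maximal inhibition, and `hill` the slope parameter.
    """
    return bottom + (top - bottom) / (1 + (dose / ic50) ** hill)

def fit_ic50(doses, activity):
    """Estimate IC50 by nonlinear least squares (generic sketch).

    Starting values are rough guesses from the data; real analyses would
    also report confidence intervals on the fitted parameters.
    """
    p0 = [activity.max(), activity.min(), float(np.median(doses)), 1.0]
    popt, _ = curve_fit(log_logistic, doses, activity, p0=p0, maxfev=10_000)
    return popt[2]  # fitted IC50
```

An order-of-magnitude difference in fitted IC50 between R and S biotypes, as reported above, corresponds to the R curves being shifted roughly one log unit to the right along the dose axis.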
Discussions about increasing diversity in economics have ignored the role that associations play in the engagement of underrepresented economists. We continue work on diversity and inclusion in the Northeastern Agriculture and Resource Economics Association (NAREA) and other associations by analyzing membership and meeting attendance to promote diversity in economics. We estimate a vector error correction model (VECM) to identify the determinants of membership and meeting attendance and use member survey data to model membership and meeting attendance behavior. We find inequalities across gender, income, and professional status. Recommendations include locating meetings in accessible cities, increasing networking opportunities, and providing more services supporting underrepresented groups.
Studies have shown that the reduction in serum TAG concentrations with long-chain n-3 fatty acid supplementation is highly variable among individuals. The objectives of the present study were to compare the proportions of individuals whose TAG concentrations lowered after high-dose DHA and EPA, and to identify the predictors of response to both modalities. In a double-blind, controlled, crossover study, 154 men and women were randomised to three supplemented phases of 10 weeks each: (1) 2·7 g/d of DHA, (2) 2·7 g/d of EPA and (3) 3 g/d of maize oil, separated by 9-week washouts. As secondary analyses, the mean intra-individual variation in TAG was calculated using the standard deviation from the mean of four off-treatment samples. The response remained within the intra-individual variation (±0·25 mmol/l) in 47 and 57 % of participants after DHA and EPA, respectively. Although there was a greater proportion of participants with a reduction >0·25 mmol/l after DHA than after EPA (45 v. 32 %; P < 0·001), the mean TAG reduction was comparable between groups (–0·59 (sem 0·04) v. –0·57 (sem 0·05) mmol/l). Participants with a reduction >0·25 mmol/l after both DHA and EPA had higher non-HDL-cholesterol, TAG and insulin concentrations compared with other responders at baseline (all P < 0·05). In conclusion, supplementation with 2·7 g/d DHA or EPA had no meaningful effect on TAG concentrations in a large proportion of individuals with normal mean TAG concentrations at baseline. Although DHA lowered TAG in a greater proportion of individuals compared with EPA, the magnitude of TAG lowering among them was similar.
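The responder classification above rests on a simple rule: a post-treatment change counts as a response only if it exceeds the ±0·25 mmol/l intra-individual variation band estimated from repeated off-treatment samples. A hypothetical helper illustrating that rule (the study's exact baseline handling may differ):

```python
import numpy as np

def classify_response(off_treatment_samples, post_treatment, threshold=0.25):
    """Classify a TAG response relative to intra-individual variation.

    Hypothetical illustration: `off_treatment_samples` are repeated
    off-treatment TAG measurements (mmol/l), `post_treatment` the
    on-treatment value, and `threshold` the ±0.25 mmol/l band from
    the abstract. Changes inside the band are not distinguishable
    from normal within-person fluctuation.
    """
    baseline = float(np.mean(off_treatment_samples))
    change = post_treatment - baseline
    if change < -threshold:
        return "reduction"
    if change > threshold:
        return "increase"
    return "within intra-individual variation"
```

Under this rule a participant can show a numerically lower TAG value yet still be classified as a non-responder, which is why the reported responder proportions are smaller than the proportions with any decrease.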
Currently it is estimated that about 1 billion people globally have non-alcoholic fatty liver disease (NAFLD), a condition in which liver fat exceeds 5 % of liver weight in the absence of significant alcohol intake. Due to the central role of the liver in metabolism, the prevalence of NAFLD is increasing in parallel with the prevalence of obesity, insulin resistance and other risk factors of metabolic diseases. However, the contribution of liver fat to the risk of type 2 diabetes mellitus and CVD, relative to other ectopic fat depots and to other risk markers, is unclear. Various studies have suggested that the accumulation of liver fat can be reduced or prevented via dietary changes. However, the amount of liver fat reduction that would be physiologically relevant, and the timeframes and dose–effect relationships for achieving this through different diet-based approaches, are unclear. Also, it is still uncertain whether the changes in liver fat per se or the associated metabolic changes are relevant. Furthermore, the methods available to measure liver fat, or even individual fatty acids, differ in sensitivity and reliability. The present report summarises key messages of presentations from different experts and related discussions from a workshop intended to capture current views and research gaps relating to the points above.
The watershed events of September 11, 2001; the anthrax attacks; Hurricane Katrina; and H1N1 necessitated that the United States define alternative mechanisms for disaster response. Specifically, there was a need to shift from a capacity-building approach to a capabilities-based approach that would place more emphasis on the health care community rather than just first responders. Georgia responded to this initiative by creating a Regional Coordinating Hospital (RCH) infrastructure that was responsible for coordinating regional responses within its individual geographic footprint. However, it was quickly realized that hospitals could not accomplish community-wide preparedness as a single entity and that siloed planning must come to an end. To reconcile this issue, Georgia responded to the 2012 US Department of Health and Human Services concept of coalitions. Georgia utilized the existing RCH boundaries to define its coalition regions and began inviting all medical and nonmedical response partners to the planning table (nursing homes, community health centers, volunteer groups, law enforcement, etc.). This new collaboration effectively enhanced emergency response practices in Georgia, but also identified additional preparedness-related gaps that will require attention as our coalitions continue to grow and mature. (Disaster Med Public Health Preparedness. 2016;10:174–179)