Abnormal effort-based decision-making represents a potential mechanism underlying motivational deficits (amotivation) in psychotic disorders. Previous research identified effort allocation impairment in chronic schizophrenia and focused mostly on the physical effort modality. No study has investigated cognitive effort allocation in first-episode psychosis (FEP).
Cognitive effort allocation was examined in 40 FEP patients and 44 demographically-matched healthy controls using the Cognitive Effort-Discounting (COGED) paradigm, which quantifies participants’ willingness to expend cognitive effort in terms of explicit, continuous discounting of monetary rewards based on parametrically-varied cognitive demands (levels N of the N-back task). The relationship between reward-discounting and amotivation was investigated. Group differences in reward-magnitude and effort-cost sensitivity, and the differential associations of these sensitivity indices with amotivation, were explored.
Patients displayed significantly greater reward-discounting than controls. Such discounting was most pronounced in patients with high levels of amotivation, even when N-back performance and reward base amount were taken into consideration. Moreover, patients exhibited reduced reward-benefit and effort-cost sensitivity relative to controls; decreased sensitivity to reward-benefit, but not to effort-cost, was correlated with diminished motivation. Reward-discounting and sensitivity indices were generally unrelated to other symptom dimensions, antipsychotic dose and cognitive deficits.
This study provides the first evidence of cognitive effort-based decision-making impairment in FEP, and indicates that decreased effort expenditure is associated with amotivation. Our findings further suggest that abnormal effort allocation and amotivation might primarily be related to blunted reward valuation. Prospective research is required to clarify the utility of effort-based measures in predicting amotivation and functional outcome in FEP.
Better understanding of interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but has not been applied in FEP.
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality-of-life and psychosocial functioning in 323 adult FEP patients in Hong Kong. The graphical Least Absolute Shrinkage and Selection Operator (LASSO), combined with extended Bayesian information criterion (EBIC) model selection, was used for network construction. The importance of individual nodes in the generated network was quantified by centrality analyses.
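The network-construction step described here can be sketched compactly. The following is an illustrative Python sketch, not the authors' code: it fits a graphical LASSO over a grid of regularisation values, keeps the model minimising the EBIC, and ranks nodes by strength centrality (the sum of absolute partial correlations, the index the study uses). The grid of alphas and the EBIC hyperparameter gamma = 0.5 are assumptions.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def ebic(emp_cov, precision, n_samples, gamma=0.5):
    """Extended BIC for a Gaussian graphical model."""
    p = precision.shape[0]
    _, logdet = np.linalg.slogdet(precision)
    loglik = (n_samples / 2.0) * (logdet - np.trace(emp_cov @ precision))
    # Count non-zero off-diagonal entries (edges) in the upper triangle.
    n_edges = int((np.abs(precision[np.triu_indices(p, k=1)]) > 1e-8).sum())
    return -2.0 * loglik + n_edges * np.log(n_samples) + 4.0 * n_edges * gamma * np.log(p)

def fit_ebic_glasso(X, alphas=np.logspace(-2, 0, 20), gamma=0.5):
    """Fit a graphical LASSO for each alpha and keep the EBIC-minimising model."""
    emp_cov = np.cov(X, rowvar=False)
    best = None
    for alpha in alphas:
        model = GraphicalLasso(alpha=alpha, max_iter=200).fit(X)
        score = ebic(emp_cov, model.precision_, X.shape[0], gamma)
        if best is None or score < best[0]:
            best = (score, model)
    return best[1]

def node_strength(precision):
    """Strength centrality: sum of absolute partial correlations per node."""
    d = np.sqrt(np.diag(precision))
    pcorr = -precision / np.outer(d, d)
    np.fill_diagonal(pcorr, 0.0)
    return np.abs(pcorr).sum(axis=0)
```

In practice the rows of `X` would be patients and the columns the psychopathology, cognition and functioning variables; the node with the largest strength is the most central, as reported for amotivation below.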
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of the two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and was weakly linked to several other variables. Within the cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure and global strength.
Our results suggest the pivotal role of amotivation in the psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation for functional outcome in the early course of psychotic illness.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Background: There is an unmet need for blood-based biomarkers that can reliably detect MS disease activity. Serum biomarkers of interest include neurofilament light chain (NfL), glial fibrillary acidic protein (GFAP) and Tau. Bone marrow transplantation (BMT) is reserved for aggressive forms of MS and has been shown to halt detectable CNS inflammatory activity for prolonged periods. Significant pre-treatment tissue damage followed by inflammatory disease abeyance should therefore be reflected in longitudinal sera collected from these patients. Methods: Sera were collected from 23 MS patients pre-treatment and at 3, 6, 9 and 12 months following BMT, in addition to sera from 33 non-inflammatory neurological controls. Biomarker quantification was performed with SiMoA. Results: Pre-AHSCT levels of serum NfL and GFAP, but not Tau, were elevated compared to controls (p=0.0001), and NfL correlated with lesion-based disease activity (6-month relapse, MRI-T2 and gadolinium enhancement). At 3 months post-treatment, while NfL levels remained elevated, Tau and GFAP paradoxically increased (p=0.0023 and 0.0017, respectively). These increases at 3 months correlated with MRI ‘pseudoatrophy’ at 6 months. NfL and Tau levels dropped to those of controls by 6 months (p=0.0036 and 0.0159). GFAP levels dropped progressively after 6 months, although even at 12 months they remained higher than controls (p=0.004). Conclusions: NfL was the closest correlate of MS disease activity and treatment response. Chemotherapy-related toxicity may account for transient increases in NfL, Tau and MRI brain atrophy post-BMT.
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLRs) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 ± 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%; iLR 0.20 [95% CI 0.091–0.44]); Moderate (probability 1.3%; iLR 0.79 [0.68–0.92]); High (probability 2.6%; iLR 2.2 [1.9–2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056–0.52], Medium iLR 0.89 [0.75–1.1], High iLR 2.0 [1.6–2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients’ risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke-specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
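An interval likelihood ratio of the kind reported in this abstract is the proportion of outcome-positive patients falling in a score stratum divided by the proportion of outcome-negative patients in the same stratum. A minimal Python sketch, using hypothetical stratum counts (the abstract reports the resulting iLRs, not the underlying counts):

```python
import math

def interval_lr(events_in_stratum, total_events,
                nonevents_in_stratum, total_nonevents):
    """Interval likelihood ratio with a Wald 95% CI on the log scale."""
    p_event = events_in_stratum / total_events
    p_nonevent = nonevents_in_stratum / total_nonevents
    ilr = p_event / p_nonevent
    # Standard error of log(LR) from the two binomial proportions.
    se = math.sqrt((1 - p_event) / events_in_stratum
                   + (1 - p_nonevent) / nonevents_in_stratum)
    lo = math.exp(math.log(ilr) - 1.96 * se)
    hi = math.exp(math.log(ilr) + 1.96 * se)
    return ilr, lo, hi

# Hypothetical counts: 9 of the 181 outcomes and 3,400 of the 7,388
# outcome-free patients falling in the low-risk stratum.
ilr, lo, hi = interval_lr(9, 181, 3400, 7388)
```

An iLR below 1 in the low-risk stratum, as in the result above, means a low score makes the outcome less likely than the pre-test probability.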
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks at the second patient, except for one outbreak involving >1 transmission route that was detected at the eighth patient. Up to 40 or 34 infections (78% or 66% of possible preventable infections, respectively) could have been prevented if data mining had been implemented in real time, assuming the initiation of an effective intervention within 7 or 14 days of identification of the transmission route, respectively.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
This study evaluated tumour necrosis factor-α, interleukins 10 and 12, and interferon-γ levels, peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 expression in unilateral sudden sensorineural hearing loss.
Twenty-four patients with unilateral sudden sensorineural hearing loss, and 24 individuals with normal hearing and no history of sudden sensorineural hearing loss (who were attending the clinic for other problems), were enrolled. Peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 were isolated and analysed. Plasma and supernatant levels of tumour necrosis factor-α, interferon-γ, and interleukins 10 and 12 were measured.
There were no significant differences with respect to age and gender. Monocyte population, mean tumour necrosis factor-α level and cluster of differentiation 86 expression were significantly increased in the study group compared to the control group. However, interferon-γ and interleukin 12 levels were significantly decreased. The difference in mean interleukin 10 level was not significant.
Increases in tumour necrosis factor-α level and monocyte population might play critical roles in sudden sensorineural hearing loss. This warrants detailed investigation and further studies on the role of dendritic cells in sudden sensorineural hearing loss.
Recovery of multidrug-resistant (MDR) Pseudomonas aeruginosa and Klebsiella pneumoniae from a cluster of patients in the medical intensive care unit (MICU) prompted an epidemiologic investigation for a common exposure.
Clinical and microbiologic data from MICU patients were retrospectively reviewed, MICU bronchoscopes underwent culturing and borescopy, and bronchoscope reprocessing procedures were reviewed. Bronchoscope and clinical MDR isolates epidemiologically linked to the cluster underwent molecular typing using pulsed-field gel electrophoresis (PFGE) followed by whole-genome sequencing.
Of the 33 case patients, 23 (70%) were exposed to a common bronchoscope (B1). Both MDR P. aeruginosa and K. pneumoniae were recovered from the bronchoscope’s lumen, and borescopy revealed a luminal defect. Molecular testing demonstrated genetic relatedness among case patient and B1 isolates, providing strong evidence for horizontal bacterial transmission. MDR organism (MDRO) recovery in 19 patients was ultimately linked to B1 exposure, and 10 of 19 patients were classified as belonging to an MDRO pseudo-outbreak.
Surveillance of bronchoscope-derived clinical culture data was important for early detection of this outbreak, and whole-genome sequencing was important for the confirmation of findings. Visualization of bronchoscope lumens to confirm integrity should be a critical component of device reprocessing.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Planning mental health carer services requires information about the number of carers, their characteristics, service use and unmet support needs. Available Australian estimates vary widely due to different definitions of mental illness and the types of carers included. This study aimed to provide a detailed profile of Australian mental health carers using a nationally representative household survey.
The number of mental health carers, characteristics of carers and their care recipients, caring hours and tasks provided, service use and unmet service needs were derived from the national 2012 Survey of Disability, Ageing and Carers. Co-resident carers of adults with a mental illness were compared with those caring for people with physical health and other cognitive/behavioural conditions (e.g., autism, intellectual disability, dementia) on measures of service use, service needs and aspects of their caring role.
In 2012, there were 225 421 co-resident carers of adults with mental illness in Australia, representing 1.0% of the population, and an estimated further 103 813 mental health carers not living with their care recipient. The majority of co-resident carers supported one person with mental illness, usually their partner or adult child. Mental health carers were more likely than physical health carers to provide emotional support (68.1% v. 19.7% of carers) and less likely to assist with practical tasks (64.1% v. 86.6%) and activities of daily living (31.9% v. 48.9%). Of co-resident mental health carers, 22.5% or 50 828 people were confirmed primary carers – the person providing the most support to their care recipient. Many primary mental health carers (37.8%) provided more than 40 h of care per week. Only 23.8% of primary mental health carers received government income support for carers and only 34.4% received formal service assistance in their caring role, while 49.0% wanted more support. Significantly more primary mental health than primary physical health carers were dissatisfied with received services (20.0% v. 3.2%), and 35.0% did not know what services were available to them.
Results reveal a sizable number of mental health carers with unmet needs in the Australian community, particularly with respect to financial assistance and respite care, and that these carers are poorly informed about available supports. The prominence of emotional support and their greater dissatisfaction with services indicate a need to better tailor carer services. If implemented carefully, recent Australian reforms including the Carer Gateway and National Disability Insurance Scheme hold promise for improving mental health carer supports.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
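The adjusted odds ratios (aORs) in this abstract come from matched case-control modelling; the crude, unadjusted version of such an estimate is a simple 2×2-table calculation. A minimal Python sketch with hypothetical exposure counts (the abstract reports only the adjusted estimates):

```python
import math

def odds_ratio(exposed_cases, unexposed_cases,
               exposed_controls, unexposed_controls, z=1.96):
    """Crude odds ratio from a 2x2 table with a Wald 95% CI."""
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    # Woolf's standard error of log(OR): sqrt of the sum of reciprocal cell counts.
    se = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                   + 1 / exposed_controls + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 30 of 199 cases vs 12 of 381 controls reporting an exposure.
or_, lo, hi = odds_ratio(30, 169, 12, 369)
```

A confidence interval excluding 1, as with the study's reported risk factors, indicates a statistically significant association; the adjusted estimates additionally control for the matching and covariates.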
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are priorities in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized education training or complex motor skill learning, where it has the potential to make a significant impact. The purpose of this study was to determine whether a resuscitation course taught in a spaced format, compared to the usual massed instruction, results in improved retention of procedural skills. Methods: EMS providers (paramedics and emergency medical technicians (EMTs)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after and 3 months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, and infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course.
Three months following course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p=0.012), without statistically significant differences in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p=0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p=0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p=0.831) or adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p=0.728). Conclusion: Procedural skills taught in a spaced format result in learning at least as good as the traditional massed format; more complex skills taught in a spaced format may result in better long-term retention than traditional massed training, as there was a clear difference in BVMV and a trend toward a difference in IO insertion.
Introduction: The cricothyroid membrane is used as a landmark for emergent surgical airway access. Ultrasound identification of the cricothyroid membrane is more accurate than landmarking by palpation. The objective of this study was to determine whether head-of-bed elevation affects the position of the cricothyroid membrane as identified by ultrasound. Methods: This was a prospective, observational study on a convenience sample of adult patients presenting to the emergency department. Participants underwent ultrasound scans by trained physicians at 0, 30 and 90 degrees of head-of-bed elevation to identify the cricothyroid membrane. The cricothyroid membrane position identified at 0 degrees was used as a reference, and the change in position of the external landmark of the cricothyroid membrane with the patient at 30 and 90 degrees was measured. Additionally, the patients’ gender, age, body mass index (BMI) and Mallampati score were recorded for comparison. Linear mixed-effects models with 95% confidence intervals were used to determine the effect of head-of-bed elevation, age, BMI and Mallampati score on the differences between measured distances. Results: One hundred and two patients were enrolled in the study. The average change in position from reference was statistically significant at both 30 degrees (2.72 ± 0.77 mm, p<0.01) and 90 degrees (4.23 ± 0.77 mm, p<0.01) of head-of-bed elevation. The adjusted linear mixed-effects model showed that age greater than 70, BMI over 30 and higher Mallampati score were associated with greater change in distance between cricothyroid membrane landmarks. Conclusion: There was a statistically significant difference in the position of the cricothyroid membrane comparing 0 degrees to 30 and 90 degrees of head-of-bed elevation. However, the relatively small differences suggest that this finding is not clinically relevant. Further study is required to evaluate whether these differences affect the actual successful performance of cricothyrotomy.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when resource allocation on the farm is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performance of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in the southwest United Kingdom, this paper proposes a novel, information-driven approach to carrying out comprehensive assessments of the economic-environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for enrolling scientifically sound and biologically informative metrics for agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.
The objective of this study was to assess determinants of poor sleep quality, an under-diagnosed and under-treated problem in elderly patients with diabetes mellitus, hyperlipidemia and hypertension.
Poor sleep quality is linked to decreased quality of life, increased morbidity and mortality. Poor sleep quality is common in the elderly population with associated cardiometabolic risk factors such as diabetes, hyperlipidemia and hypertension.
This is a cross-sectional study undertaken in the primary healthcare setting (Singhealth Polyclinics-Outram) in Singapore. Singaporeans aged 65 years and above who had at least one of the three cardiometabolic risk factors (diabetes, hypertension and hyperlipidemia) were identified. Responders’ sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) questionnaire, and responders were divided into those with good-quality sleep and those with poor-quality sleep based on the PSQI score. Information on demographics, co-morbidities and lifestyle practices was collected. Descriptive and multivariate analyses of determinants of poor sleep were performed.
There were 199 responders (response rate 88.1%). Nocturia (adjusted prevalence rate ratio 1.54, 95% confidence interval 1.06–2.26) was associated with an increased risk of poor sleep quality in elderly patients with diabetes mellitus, hypertension and hyperlipidaemia. As nocturia is a prevalent problem in the Asian elderly population, it is imperative to identify and treat patients with nocturia to improve their sleep quality.
Exercise and physical training are known to affect gastrointestinal function and digestibility in horses and can lead to inaccurate estimates of nutrient and energy digestibility when markers are used. The effect of exercise on apparent nutrient digestibility and faecal recoveries of ADL and TiO2 was studied in six Welsh pony geldings subjected to either a low- (LI) or high-intensity (HI) exercise regime according to a cross-over design. Ponies performing LI exercise were walked once per day for 45 min in a horse walker (5 km/h) for 47 consecutive days. Ponies submitted to HI exercise were gradually trained for the same 47 days according to a standardized protocol. Throughout the experiment, the ponies received a fixed level of feed, and the daily rations consisted of 4.7 kg DM of grass hay and 0.95 kg DM of concentrate. The diet was supplemented with minerals, vitamins and TiO2 (3.0 g Ti/day). Total tract digestibility of DM, organic matter (OM), CP, crude fat, NDF, ADF, starch, sugar and energy was determined with the total faeces collection (TFC) method. In addition, DM and OM digestibility was estimated using internal ADL and the externally supplemented Ti as markers. Urine was collected on the final 2 days of each experimental period. Exercise did not affect apparent digestibility of CP, crude fat, starch and sugar. Digestibility of DM (DMD), OM (OMD), ADF and NDF tended to be lower, and DE was decreased, when ponies received the HI exercise regime. For all treatments combined, mean faecal recoveries of ADL and Ti were 87.8±1.7% and 99.3±1.7%, respectively. Ti was not detected in the urine, indicating that intestinal integrity was maintained with exercise. Dry matter digestibility estimated with the TFC method, ADL and Ti for ponies subjected to LI exercise was 66.3%, 60.3% and 64.8%, respectively, while DMD for HI ponies was 64.2%, 60.3% and 65.2%, respectively.
In conclusion, physical exercise influences the GE digestibility of the feed in ponies provided with equivalent levels of feed intake. In addition, comparison of the two markers used for estimating apparent DMD and OMD indicates that externally supplemented Ti, unlike dietary ADL, is a suitable marker for determining the digestibility of nutrients in horses performing exercise.
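The marker-based estimates in this study rest on the standard indigestible-marker ratio method: apparent digestibility = 100 × (1 − (marker in feed / marker in faeces) × (nutrient in faeces / nutrient in feed)), alongside the direct TFC calculation from intake and faecal output. A minimal Python sketch with illustrative numbers (not the study's data):

```python
def marker_digestibility(marker_feed, marker_faeces,
                         nutrient_feed, nutrient_faeces):
    """Apparent digestibility (%) of a nutrient via an indigestible marker.

    All arguments are concentrations (e.g. g/kg DM) of the marker or
    the nutrient in feed and in faeces.
    """
    return 100.0 * (1.0 - (marker_feed / marker_faeces)
                    * (nutrient_faeces / nutrient_feed))

def total_collection_digestibility(intake, faecal_output):
    """Apparent digestibility (%) by total faeces collection (TFC)."""
    return 100.0 * (intake - faecal_output) / intake

# Illustrative only: a marker at 0.5 g/kg DM in feed concentrating to
# 1.5 g/kg DM in faeces implies roughly two-thirds of the DM disappeared.
dmd = marker_digestibility(0.5, 1.5, 1000.0, 1000.0)  # for DM, "nutrient" = 1000 g/kg
```

Incomplete faecal recovery of a marker (as with ADL's 87.8% here) inflates the faecal marker concentration denominator used in this formula and biases digestibility estimates, which is why near-complete recovery (Ti's 99.3%) matters.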
Simulation models are used widely in pharmacology, epidemiology and health economics (HE). However, there have been no attempts to incorporate models from these disciplines into a single integrated model. Accordingly, we explored this linkage to evaluate the epidemiological and economic impact of oseltamivir dose optimisation in supporting pandemic influenza planning in the USA. An HE decision-analytic model was linked to a pharmacokinetic/pharmacodynamic (PK/PD) dynamic transmission model simulating the impact of pandemic influenza with low virulence and low transmissibility, and with high virulence and high transmissibility. The cost-utility analysis was from the payer and societal perspectives, comparing oseltamivir 75 and 150 mg twice daily (BID) to no treatment over a 1-year time horizon. Model parameters were derived from published studies. Outcomes were measured as cost per quality-adjusted life year (QALY) gained. Sensitivity analyses were performed to examine the integrated model's robustness. Under both pandemic scenarios, compared to no treatment, the use of oseltamivir 75 or 150 mg BID led to a significant reduction of influenza episodes and influenza-related deaths, translating to substantial savings of QALYs. Overall drug costs were offset by the reduction of both direct and indirect costs, making these two interventions cost-saving from both perspectives. The results were sensitive to the proportion of inpatient presentation at the emergency visit and to patients’ quality of life. Integrating PK/PD–EPI/HE models is achievable. Whilst further refinement of this novel linkage model to more closely mimic reality is needed, the current study has generated useful insights to support influenza pandemic planning.
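The cost-utility comparison in this abstract reduces to an incremental cost-effectiveness ratio (ICER) with a dominance check: an intervention that is both cheaper and produces more QALYs than its comparator is "cost-saving" (dominant), as reported for oseltamivir here. A hedged Python sketch with made-up inputs (the abstract reports conclusions, not the underlying per-person cost and QALY totals):

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio with dominance handling.

    Returns 'dominant' when the new strategy is cheaper and more
    effective (cost-saving), 'dominated' for the reverse, and otherwise
    the incremental cost per QALY gained.
    """
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"
    if d_cost > 0 and d_qaly <= 0:
        return "dominated"
    return d_cost / d_qaly

# Made-up example: treatment adds 0.02 QALYs and saves $15 per person,
# mirroring the cost-saving pattern reported for oseltamivir.
result = icer(985.0, 0.82, 1000.0, 0.80)
```

In a full analysis these totals would come out of the linked PK/PD transmission and HE models, separately for the payer and societal perspectives.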
Arthropod communities in the tropics are increasingly impacted by rapid changes in land use. Because species showing distinct seasonal patterns of activity are thought to be at higher risk of climate-related extirpation, global warming is generally considered a lower threat to arthropod biodiversity in the tropics than in temperate regions. To examine changes associated with land use and weather variables in tropical arthropod communities, we deployed Malaise traps at three major anthropogenic forest types (secondary reserve forest, oil palm forest and urban ornamental forest (UOF)) in Peninsular Malaysia and collected arthropods continuously for 12 months. We used metabarcoding protocols to characterize the diversity within weekly samples. We found that changes in the composition of arthropod communities were significantly associated with maximum temperature in all three forests, but the shifts were reversed in the UOF compared with the other forests. This suggests that arthropods in the forests of Peninsular Malaysia face a double threat: community shifts and biodiversity loss due to the exploitation and disturbance of forests, which consequently puts species at further risk from global warming. We highlight the positive feedback mechanism between land use and temperature, which poses a threat to arthropod communities and has further implications for ecosystem functioning and human well-being. Consequently, conservation and mitigation plans are urgently needed.