Alzheimer’s disease (AD) studies are increasingly targeting earlier (pre)clinical populations, in which the expected degree of observable cognitive decline over a certain time interval is reduced as compared to the dementia stage. Consequently, endpoints to capture early cognitive changes require refinement. We aimed to determine the sensitivity to decline of widely applied neuropsychological tests at different clinical stages of AD as outlined in the National Institute on Aging – Alzheimer’s Association (NIA-AA) research framework.
Amyloid-positive individuals (as determined by positron emission tomography or cerebrospinal fluid) with longitudinal neuropsychological assessments available were included from four well-defined study cohorts and subsequently classified among the NIA-AA stages. For each stage, we investigated the sensitivity to decline of 17 individual neuropsychological tests using linear mixed models.
1103 participants (age = 70.54 ± 8.7, 47% female) were included: n = 120 Stage 1, n = 206 Stage 2, n = 467 Stage 3 and n = 309 Stage 4. Neuropsychological tests were differentially sensitive to decline across stages. For example, Category Fluency captured significant 1-year decline as early as Stage 1 (β = −.58, p < .001). Word List Delayed Recall (β = −.22, p < .05) and Trail Making Test (β = 6.2, p < .05) became sensitive to 1-year decline in Stage 2, whereas the Mini-Mental State Examination did not capture 1-year decline until Stages 3 (β = −1.13, p < .001) and 4 (β = −2.23, p < .001).
We demonstrated that commonly used neuropsychological tests differ in their ability to capture decline depending on clinical stage within the AD continuum (preclinical to dementia). This implies that stage-specific cognitive endpoints are needed to accurately assess disease progression and increase the chance of successful treatment evaluation in AD.
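The stage-specific β coefficients reported above are fixed time effects from linear mixed models fitted to repeated test scores. As a simplified illustration only — ordinary least squares pooled across simulated subjects rather than a true mixed model with random effects, and with entirely hypothetical numbers — an annual-decline coefficient can be estimated like so:

```python
import numpy as np

rng = np.random.default_rng(42)

def annual_decline_slope(years, scores):
    """Least-squares slope of score on time - a simplified stand-in
    for the fixed time effect of a linear mixed model."""
    slope, _intercept = np.polyfit(years, scores, 1)
    return slope

# Hypothetical cohort: 50 subjects, 4 annual visits, with a built-in
# decline of 0.6 points/year on a fluency-like test.
years = np.tile(np.arange(0.0, 4.0), 50)
scores = 20.0 - 0.6 * years + rng.normal(0.0, 1.0, years.size)

beta = annual_decline_slope(years, scores)
print(round(beta, 2))  # recovers a slope close to the simulated -0.6
```

A real longitudinal analysis would add per-subject random intercepts and slopes (e.g. via statsmodels' `MixedLM`) so that repeated measures from the same participant are not treated as independent.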
Surgical antimicrobial prophylaxis (SAP) is commonly administered in orthopedic procedures. Research regarding SAP appropriateness for specific orthopedic procedures is limited and is required to facilitate targeted orthopedic prescriber behavior change.
To describe SAP prescribing and appropriateness for orthopedic procedures in Australian hospitals.
Design, setting, and participants:
Multicenter, national quality-improvement study with retrospective analysis of data collected from Australian hospitals via Surgical National Antimicrobial Prescribing Survey (Surgical NAPS) audits from January 1, 2016, to April 15, 2019.
Logistic regression identified hospital, patient and surgical factors associated with appropriateness. Adjusted appropriateness was calculated from the multivariable model. Additional subanalyses were conducted on smaller subsets to calculate the adjusted appropriateness for specific orthopedic procedures.
In total, 140 facilities contributed to orthopedic audits in the Surgical NAPS, including 4,032 orthopedic surgical episodes and 6,709 prescribed doses. Overall appropriateness was low, 58.0% (n = 3,894). This differed for prescribed procedural (n = 3,978, 64.7%) and postprocedural doses (n = 2,731, 48.3%). The most common reasons for inappropriateness, when prophylaxis was required, were timing for procedural doses (50.9%) and duration for postprocedural prescriptions (49.8%). The adjusted appropriateness of each orthopedic procedure group was low for procedural SAP (from knee surgery, 54.1%, to total knee joint replacement, 74.1%). The adjusted appropriateness for postprocedural prescription was also low (from hand surgery, 40.7%, to closed reduction fractures, 68.7%).
Orthopedic surgical specialties demonstrated differences across procedural and postprocedural appropriateness. The metric of appropriateness identifies targets for quality improvement and is meaningful for clinicians. Targeted quality improvement projects for orthopedic specialties need to be developed to support optimization of antimicrobial use.
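The appropriateness analysis above rests on logistic regression of a binary "appropriate" outcome on hospital, patient and surgical factors. A minimal gradient-descent sketch on fabricated data — the single factor, effect sizes and sample size below are invented for illustration, not taken from the study — looks like this:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression - a minimal sketch of
    relating a surgical factor to the odds of appropriate prophylaxis."""
    X = np.column_stack([np.ones(len(X)), X])  # add intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w += lr * X.T @ (y - p) / len(y)       # average log-likelihood gradient
    return w

# Fabricated data: a binary "postprocedural dose" flag that lowers the
# odds of a prescription being judged appropriate (assumed effects).
rng = np.random.default_rng(0)
post = rng.integers(0, 2, 1000)
logit = 0.6 - 1.0 * post
y = (rng.random(1000) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

w = fit_logistic(post.reshape(-1, 1), y)
print(w)  # intercept near 0.6, coefficient near -1.0 (the assumed values)
```

Adjusted appropriateness, as reported in the study, would then come from averaging predicted probabilities from the fitted multivariable model over the observed covariate distribution.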
The current study aimed to investigate availability and placement of healthy and discretionary (less healthy) food in supermarkets in Victoria, Australia, and examine variation by supermarket chain and area-level socio-economic disadvantage.
Cross-sectional supermarket audit. Measures included: (i) proportion of shelf space (in square metres) allocated to selected healthy and discretionary food and beverages; (ii) proportion of end-of-aisle, checkout and island bin displays containing discretionary food and beverages and (iii) proportion of space within end-of-aisle, checkout and island bin displays devoted to discretionary food and beverages.
Metropolitan areas of Melbourne and Geelong, Australia. Assessment: June–July 2019.
Random sample of 104 stores, with equal numbers from each supermarket group (Coles, Woolworths, Aldi and Independent stores) within strata of area-level socio-economic position.
Proportion of shelf space devoted to selected discretionary foods was greater for Independent stores (72·7 %) compared with Woolworths (65·7 %), Coles (64·8 %) and Aldi (63·2 %) (all P < 0·001). Proportion of shelf space devoted to selected discretionary food for all Coles, Woolworths and Aldi stores was 9·7 % higher in the most compared with the least disadvantaged areas (P = 0·002). Across all stores, 90 % of staff-assisted checkout displays and 50 % of end-of-aisle displays included discretionary food. Aldi was less likely to feature discretionary food in end-of-aisle and checkout displays compared with other supermarket groups.
Extensive marketing of discretionary food in all Australian supermarket chains was observed, which is likely to strongly influence purchasing patterns and population diets. Findings should be used to inform private and public sector policies to reduce marketing of discretionary food in supermarkets.
We undertook a strengths, weaknesses, opportunities, and threats (SWOT) analysis of Northern Hemisphere tree-ring datasets included in IntCal20 in order to evaluate their strategic fit with the demands of archaeological users. Case studies on wiggle-matching single tree rings from timbers in historic buildings and Bayesian modeling of series of results on archaeological samples from Neolithic long barrows in central-southern England exemplify the archaeological implications that arise when using IntCal20. The SWOT analysis provides an opportunity to think strategically about future radiocarbon (14C) calibration so as to maximize the utility of 14C dating in archaeology and safeguard its reputation in the discipline.
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most reactive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
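The authors' Markov models are not reproduced in the abstract, so the sketch below is only a generic Monte Carlo analogue: it draws study characteristics from assumed ranges (loosely echoing the reported drivers, such as patient counts and seconds of manual abstraction per data field) and compares data-related costs under the two designs.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_cost_difference(n_draws=10_000):
    """Monte Carlo sketch (not the authors' actual Markov model) of the
    data-related cost difference between a standard trial and a
    registry-based trial. All distributions are illustrative assumptions."""
    patients = rng.integers(200, 5_000, n_draws)
    fields_per_patient = rng.integers(50, 500, n_draws)
    seconds_per_field = rng.uniform(3, 30, n_draws)   # manual abstraction speed
    wage_per_hour = rng.uniform(25, 60, n_draws)      # coordinator wage

    abstraction_cost = (patients * fields_per_patient
                        * seconds_per_field / 3600 * wage_per_hour)
    registry_fixed_cost = rng.uniform(5_000, 100_000, n_draws)  # linkage/licensing
    return abstraction_cost - registry_fixed_cost     # > 0: registry cheaper

savings = simulate_cost_difference()
print(f"registry design cheaper in {np.mean(savings > 0):.0%} of simulations")
```

Sweeping one input at a time over its range while holding the others fixed would reproduce the one-way sensitivity analyses and threshold-finding described in the study.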
Heat stress is a global issue constraining pig productivity, and it is likely to intensify under future climate change. Technological advances in earth observation have made tools available that enable identification and mapping of livestock species at risk of exposure to heat stress due to climate change. Here, we present a methodology to map the current and likely future heat stress risk in pigs using R software by combining the effects of temperature and relative humidity. We applied the method to growing-finishing pigs in Uganda. We mapped monthly heat stress risk and quantified the number of pigs exposed to heat stress using 18 global circulation models and projected impacts in the 2050s. Results show that more than 800 000 pigs in Uganda will be affected by heat stress in the future. The results can feed into evidence-based policy, planning and targeted resource allocation in the livestock sector.
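The core of such a mapping workflow is combining temperature and relative humidity into a heat-stress indicator and thresholding it per grid cell. The formula below is a widely used livestock temperature-humidity index (THI) and is an assumption here — the paper's exact formulation and threshold for growing-finishing pigs are not given in the abstract — shown in Python rather than the authors' R:

```python
import numpy as np

def thi(temp_c, rh_pct):
    """Temperature-humidity index (a common livestock THI formulation;
    assumed here, not taken from the paper)."""
    temp_c = np.asarray(temp_c, dtype=float)
    rh_pct = np.asarray(rh_pct, dtype=float)
    return 0.8 * temp_c + (rh_pct / 100.0) * (temp_c - 14.4) + 46.4

def heat_stress_risk(temp_c, rh_pct, threshold=75.0):
    """Flag grid cells whose THI exceeds an assumed alert threshold."""
    return thi(temp_c, rh_pct) >= threshold

# Toy "map": two grid cells, one hot and humid, one mild.
temps = np.array([32.0, 22.0])
rhs = np.array([80.0, 50.0])
print(heat_stress_risk(temps, rhs))  # [ True False]
```

Applied cell-by-cell to monthly climate rasters from each global circulation model, and overlaid on a pig-density layer, this yields the kind of exposure counts the study reports.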
The identity, richness, and abundance of true flies (Diptera) from the nests of three cavity-nesting raptors (Aves) were investigated in northern Nova Scotia, Canada. After fledging, flies were extracted from the nest material using Berlese funnels within an emergence chamber. Thirty-one species/morphospecies from 14 families were collected, including eight new records for Nova Scotia and two new records for eastern North America.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides a scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of the study of nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control and one nested case–cohort study, comprising 8320 BC cases out of 31 551 participants. Dietary data, on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status) were available. Primarily, five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; meats and meat products were associated with BC risk. Since these food groups correspond with previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
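C5.0 grows its tree by repeatedly choosing the feature with the greatest information gain. The sketch below shows only that core splitting criterion — a one-level decision stump on fabricated binary intake data, not the full C5.0 algorithm with pruning and boosting — to illustrate how a tree ranks food groups by importance:

```python
import numpy as np

def entropy(y):
    """Shannon entropy (bits) of a binary label vector."""
    p = np.bincount(y, minlength=2) / len(y)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def best_split(X, y):
    """Rank binary features by information gain - the criterion
    C5.0-style trees use to pick the most informative food group."""
    base = entropy(y)
    gains = []
    for j in range(X.shape[1]):
        left, right = y[X[:, j] == 0], y[X[:, j] == 1]
        cond = (len(left) * entropy(left)
                + len(right) * entropy(right)) / len(y)
        gains.append(base - cond)
    return int(np.argmax(gains)), gains

# Fabricated data: feature 1 ("high intake of food group B") is
# associated with case status; feature 0 is pure noise.
rng = np.random.default_rng(7)
f0 = rng.integers(0, 2, 2000)
f1 = rng.integers(0, 2, 2000)
y = np.where(f1 == 1, rng.random(2000) < 0.7,
             rng.random(2000) < 0.3).astype(int)
X = np.column_stack([f0, f1])

best, gains = best_split(X, y)
print(best)  # 1 - the informative food-group feature wins
```

Recursing on the resulting subsets, then cross-validating the whole procedure 10-fold, gives the kind of ranked food-group extraction the study describes.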
Clinical diagnostics in sudden onset disasters have historically been limited. We set out to design, implement, and evaluate a mobile diagnostic laboratory accompanying a type 2 emergency medical team (EMT) field hospital.
Available diagnostic platforms were reviewed and selected against in-field need. Platforms included HemoCue301/WBC DIFF, i-STAT, BIOFIRE FILMARRAY multiplex rt-PCR, Olympus BX53 microscopy, ABO/Rh grouping, and specific rapid diagnostic tests. This equipment was trialed in Katherine, Australia, and Dili, Timor-Leste.
During the initial deployment, an evaluation of FilmArray tests was successful using blood culture identification, gastrointestinal, and respiratory panels. HemoCue301 (n = 20) hemoglobin values were compared on Sysmex XN 550 (r = 0.94). HemoCue WBC DIFF had some variation, dependent on the cell, when compared with Sysmex XN 550 (r = 0.88 to 0.16). i-STAT showed nonsignificant differences against Vitros 250. Further evaluation of FilmArray in Dili, Timor-Leste, diagnosed 117 pathogens on 168 FilmArray pouches, including 25 separate organisms on blood culture and 4 separate cerebrospinal fluid pathogens.
This mobile laboratory represents a major advance in sudden onset disaster response. Setup of the service was quick (< 24 hr) and transport to site rapid. Future deployment in fragmented health systems after sudden onset disasters with EMT2 will now allow broader diagnostic capability.
This guidance paper from the European Psychiatric Association (EPA) aims to provide evidence-based recommendations on early intervention in clinical high risk (CHR) states of psychosis, assessed according to the EPA guidance on early detection. The recommendations were derived from a meta-analysis of current empirical evidence on the efficacy of psychological and pharmacological interventions in CHR samples. Eligible studies had to investigate conversion rate and/or functioning as a treatment outcome in CHR patients defined by the ultra-high risk and/or basic symptom criteria. Besides analyses on treatment effects on conversion rate and functional outcome, age and type of intervention were examined as potential moderators. Based on data from 15 studies (n = 1394), early intervention generally produced significantly reduced conversion rates at 6- to 48-month follow-up compared to control conditions. However, early intervention failed to achieve significantly greater functional improvements because both early intervention and control conditions produced similar positive effects. With regard to the type of intervention, both psychological and pharmacological interventions produced significant effects on conversion rates, but not on functional outcome relative to the control conditions. Early intervention in youth samples was generally less effective than in predominantly adult samples. Seven evidence-based recommendations for early intervention in CHR samples could be formulated, although more studies are needed to investigate the specificity of treatment effects and potential age effects in order to tailor interventions to the individual treatment needs and risk status.
The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this end, we conducted a meta-analysis of studies reporting on conversion rates to psychosis in non-overlapping samples meeting at least one of the main CHR criteria: ultra-high risk (UHR) and/or basic symptoms criteria. Further, effects of potential moderators (different UHR criteria definitions, single UHR criteria and age) on conversion rates were examined. Conversion rates in the identified 42 samples with altogether more than 4000 CHR patients who had mainly been identified by UHR criteria and/or the basic symptom criterion ‘cognitive disturbances’ (COGDIS) showed considerable heterogeneity. While UHR criteria and COGDIS were related to similar conversion rates until 2-year follow-up, conversion rates of COGDIS were significantly higher thereafter. Differences in onset and frequency requirements of symptomatic UHR criteria or in their different consideration of functional decline, substance use and co-morbidity did not seem to impact on conversion rates. The ‘genetic risk and functional decline’ UHR criterion was rarely met and showed only a nonsignificant pooled effect. However, age significantly affected UHR conversion rates with lower rates in children and adolescents. Although more research into potential sources of heterogeneity in conversion rates is needed to facilitate improvement of CHR criteria, six evidence-based recommendations for an early detection of psychosis were developed as a basis for the EPA guidance on early intervention in CHR states.
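Pooling conversion rates across samples is the central computation in such a meta-analysis. A minimal fixed-effect, inverse-variance sketch on the logit scale — with invented counts, and without the heterogeneity statistics and moderator analyses the guidance paper actually reports — looks like this:

```python
import numpy as np

def pooled_rate(events, totals):
    """Fixed-effect inverse-variance pooling of proportions on the
    logit scale (a minimal sketch of meta-analytic rate pooling)."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    p = events / totals
    logit = np.log(p / (1 - p))
    var = 1 / events + 1 / (totals - events)  # logit-scale variance
    w = 1 / var                               # inverse-variance weights
    pooled_logit = (w * logit).sum() / w.sum()
    return 1 / (1 + np.exp(-pooled_logit))    # back to a proportion

# Invented conversion counts from three hypothetical CHR cohorts
rate = pooled_rate([20, 35, 12], [100, 140, 80])
print(round(rate, 3))
```

Given the considerable heterogeneity the paper reports, a real analysis would use a random-effects model and examine moderators such as age and criterion type rather than a single fixed-effect pool.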
Terrestrial plant macrofossils from the sedimentary record of Lake Suigetsu, Japan, provide the only quasi-continuous direct atmospheric record of radiocarbon (14C) covering the last 50 ka cal BP (Bronk Ramsey et al. 2012). Since then, new high precision data have become available on U-Th dated speleothems from Hulu Cave China, covering the same time range (Cheng et al. 2018). In addition, an updated varve-based chronology has also been published for the 2006 core from Lake Suigetsu (SG06) based on extended microscopic analysis of the sediments and improved algorithms for interpolation (Schlolaut et al. 2018). Here we reanalyze the radiocarbon dataset from Suigetsu based on the new varve counting information and the constraints imposed by the speleothem data. This enables the new information on the calendar age scale of the Suigetsu dataset to be used in the construction of the consensus IntCal calibration curve. Comparison of the speleothem and plant macrofossil records provides insight into the mechanisms underlying the incorporation of carbon into different types of record and the relative strengths of different types of archive for calibration purposes.
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water-only (WAT), or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % ten repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
Campylobacteriosis is the most common notifiable disease in New Zealand. While the risk of campylobacteriosis has been found to be strongly associated with the consumption of undercooked poultry, other risk factors include rainwater-sourced drinking water, contact with animals and consumption of raw dairy products. Despite this, there has been little investigation of raw milk as a risk factor for campylobacteriosis. Recent increases in demand for untreated or ‘raw’ milk have also raised concerns that this exposure may become a more important source of disease in the future. This study describes the cases of notified campylobacteriosis from a sentinel surveillance site. Previously collected data from notified cases of raw milk-associated campylobacteriosis were examined and compared with campylobacteriosis cases who did not report raw milk consumption. Raw milk campylobacteriosis cases differed from non-raw milk cases on comparison of age and occupation demographics, with raw milk cases more likely to be younger and categorised as children or students for occupation. Raw milk cases were more likely to be associated with outbreaks than non-raw milk cases. Study-suggested motivations for raw milk consumption (health reasons, natural product, produced on farm, inexpensive or to support locals) were not strongly supported by cases. More information about the raw milk consumption habits of New Zealanders would be helpful to better understand the risks of this disease, especially with respect to increased disease risk observed in younger people. Further discussion with raw milk consumers around their motivations may also be useful to find common ground between public health concerns and consumer preferences as efforts continue to manage this ongoing public health issue.
Electrochemical capacitors featuring a modified acetonitrile (AN) electrolyte and a binder-free, activated carbon fabric electrode material were assembled and tested at <−40 °C. The melting point of the electrolyte was depressed relative to the standard pure AN solvent through the use of a methyl formate cosolvent, to enable operation at temperatures lower than the rated limit of typical commercial cells (−40 °C). Based on earlier electrolyte formulation studies, a 1:1 ratio of methyl formate to AN (by volume) was selected, to maximize freezing point depression while maintaining a sufficient salt solubility. The salt spiro-(1,1′)-bipyrrolidinium tetrafluoroborate was used, based on its improved conductivity at low temperatures, relative to linear alkyl ammonium salts. The carbon fabric electrode supported a relatively high rate capability at temperatures as low as −65 °C with a modest increase in cell resistance at this reduced temperature. The capacitance was only weakly dependent on temperature, with a specific capacitance of ∼110 F/g.
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
All personnel who entered the SCDU and were required to measure their temperatures and complete a symptom questionnaire twice daily were eligible.
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3 vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or a response to LF antigens.
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupation-related symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients with an infection from a special pathogen.
Cardiac fibromas are primary cardiac tumours more common in children than in adults. Surgical intervention is often not required except in the case of limited cardiac output or significant arrhythmia burden. We present a symptomatic 3-month-old infant who had successful surgical intervention for a giant right ventricle fibroma found on prenatal imaging.
Finely resolved geodetic data provide an opportunity to assess the extent and morphology of crevasses and their change over time. Crevasses have the potential to bias geodetic measurements of elevation and mass change unless they are properly accounted for. We developed a framework that automatically maps and extracts crevasse geometry and masks them where they interfere with surface mass-balance assessment. Our study examines airborne light detection and ranging digital elevation models (LiDAR DEMs) from Haig Glacier, which is experiencing a transient response in its crevassed upper regions as the glacier thins, using a self-organizing map algorithm. This method successfully extracts and characterizes ~1000 crevasses, with an overall accuracy of 94%. The resulting map provides insight into stress and flow conditions. The crevasse mask also enables refined geodetic estimates of summer mass balance. From differencing of September and April LiDAR DEMs, the raw LiDAR DEM gives a 9% overestimate in the magnitude of glacier thinning over the summer: −5.48 m compared with a mean elevation change of −5.02 m when crevasses are masked out. Without identification and removal of crevasses, the LiDAR-derived summer mass balance therefore has a negative bias relative to the glaciological surface mass balance.
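The refinement step described above — masking crevassed cells before differencing repeat DEMs — reduces the thinning bias that crevasse floors introduce. The toy example below uses invented numbers (not Haig Glacier values) to show the mechanics:

```python
import numpy as np

# Toy DEMs (metres): September minus April gives summer elevation change.
# Crevassed cells show spuriously large apparent thinning; masking them
# out refines the geodetic estimate. All values are illustrative.
april = np.full((4, 4), 100.0)
september = np.full((4, 4), 95.0)            # ~5 m of real surface lowering
crevasse_mask = np.zeros((4, 4), dtype=bool)  # True = crevassed cell
crevasse_mask[1, 1] = crevasse_mask[2, 2] = True
september[crevasse_mask] -= 6.0               # crevasse floors bias the DEM

dh = september - april
raw_mean = dh.mean()                  # biased: includes crevasse cells
masked_mean = dh[~crevasse_mask].mean()
print(raw_mean, masked_mean)          # raw estimate overstates thinning
```

This mirrors the study's finding that the unmasked DEM difference overstated the magnitude of summer thinning (−5.48 m raw versus −5.02 m with crevasses masked).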
Pigweed is difficult to manage in grain sorghum because of widespread herbicide resistance, a limited number of registered effective herbicides, and the synchronous emergence of pigweed with grain sorghum in Kansas. The combination of cultural and mechanical control tactics with an herbicide program is commonly recognized as a best management strategy; however, limited information is available to adapt these strategies to dryland systems. Our objective for this research was to assess the influence of four components on pigweed control in a dryland system: a winter wheat cover crop (CC), row-crop cultivation, three row widths, and the presence or absence of an herbicide program. Field trials were implemented during 2017 and 2018 at three locations for a total of 6 site-years. The herbicide program component resulted in excellent control (>97%) in all treatments at 3 and 8 weeks after planting (WAP). CC provided approximately 50% reductions in pigweed density and biomass for both timings in half of the site-years; however, mixed results were observed in the remaining site-years, ranging from no attributable difference to a 170% increase in weed density at 8 WAP in 1 site-year. Treatments including row-crop cultivation reduced pigweed biomass and density in most site-years 3 and 8 WAP. An herbicide program is required to achieve pigweed control and should be integrated with row-crop cultivation or narrow row widths to reduce the risk of herbicide resistance. Additional research is required to optimize the use of CC as an integrated pigweed management strategy in dryland grain sorghum.