This article emerged as humanity collectively experienced the worst global pandemic in a century. With a long view of the ecological, economic, social, and political factors that promote the emergence and spread of infectious disease, archaeologists are well positioned to examine the antecedents of the present crisis. In this article, we bring together a variety of perspectives on the emergence, spread, and effects of disease in both American and Afro-Eurasian contexts. Recognizing that the human populations most severely impacted by COVID-19 are typically descendants of marginalized groups, we investigate pre- and postcontact disease vectors among Indigenous and Black communities in North America, outlining the systemic impacts of diseases and the conditions that exacerbate their spread. We examine how material culture both reflects and changes as a result of social transformations brought about by disease, the insights that paleopathology provides into the ancient human condition, and the impacts of ancient globalization on the spread of disease worldwide. By understanding the differential effects of past epidemics on diverse communities and contributing to more equitable sociopolitical agendas, archaeology can play a key role in the pursuit of a more just future.
Although no pharmacological treatment has proved to be highly effective for reducing cocaine dependence, several medications have been tested over the last decade and have shown promising efficacy. Modafinil (Provigil), known as a treatment for daytime sleepiness, and topiramate (Topamax), an anti-epileptic medication also prescribed for migraine, have been shown to be effective in controlled clinical trials. We have recently started a major study utilizing Positron Emission Tomography (PET) brain imaging to monitor the progress of pharmacotherapy with modafinil or topiramate in cocaine-dependent and methadone-maintained cocaine-dependent patients. Patients will be assessed before treatment, and again after 4 weeks of pharmacotherapy. The aims of the project are to study the effects of the two medications on cocaine dependence and craving, and on dopamine binding in the brain. At each assessment session, patients will undergo PET with [11C] raclopride to image the dopamine receptor DRD2. To trigger craving, patients will then be exposed to a videotape showing cocaine use; a questionnaire will be used to record their subjective responses, and a second PET scan will be performed with [18F] fluorodeoxyglucose (FDG) to image cerebral glucose metabolism during craving. This protocol was designed to enable us to study changes resulting from pharmacotherapy on dopamine binding in the brain, and on craving as reflected both in subjective measures and in regional cerebral glucose metabolism. In addition, we will investigate the association between subjective measures of craving for cocaine and the level of dopamine DRD2 receptor occupancy in the brain before and after treatment. Notwithstanding the complexity of the clinical and therapeutic reality characterizing cocaine dependence, we hope to present preliminary evidence for the relative efficacy of these two promising medications in the treatment of cocaine dependence.
This evidence could also elucidate the brain mechanisms underlying cocaine craving and dependence in cocaine-dependent patients.
Introduction: Oxygen is commonly administered to prehospital patients presenting with acute myocardial infarction (AMI). We conducted a systematic review to determine whether oxygen administration in AMI impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, the Cochrane Database of Systematic Reviews, Cochrane Central, clinicaltrials.gov and ISRCTN for relevant randomized controlled trials and observational studies comparing oxygen administration with no oxygen administration. The outcomes of interest were: mortality (≤30 days, in-hospital, and intermediate 2-11 months), infarct size, and major adverse cardiac events (MACE). Risk-of-bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. A meta-analysis was performed using RevMan 5 software. Results: Our search yielded 1192 citations, of which 48 studies were reviewed as full texts and a total of 8 studies were included in the analysis. All evidence was considered low or very low quality. Five studies reported on mortality, providing low quality evidence of no benefit or harm from supplemental oxygen administration. Similarly, no benefit or harm was found for MACE or infarct size (very low quality). Normoxia was defined as oxygen saturation measured via pulse oximetry at ≥90% in one recent study and ≥94% in another. Conclusion: We found low and very low quality evidence that the administration of supplemental oxygen to normoxic patients experiencing AMI provides no clear benefit or harm for mortality or MACE. The evidence on infarct size was inconsistent and warrants further prospective examination.
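The pooled effect estimates behind a review like this are typically computed by inverse-variance weighting of per-study log risk ratios. The sketch below illustrates the fixed-effect version of that calculation with hypothetical 2×2 counts; the numbers are invented for illustration and are not data from the studies reviewed here, and RevMan performs this (and random-effects variants) internally.

```python
import math

# Hypothetical per-study 2x2 counts: (events_tx, n_tx, events_ctl, n_ctl).
# These counts are illustrative only, not taken from the review.
studies = [
    (10, 200, 12, 198),
    (25, 430, 22, 425),
    (4, 150, 6, 149),
]

log_rrs, weights = [], []
for a, n1, c, n2 in studies:
    rr = (a / n1) / (c / n2)                       # per-study risk ratio
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2          # variance of log RR
    log_rrs.append(math.log(rr))
    weights.append(1 / var)                        # inverse-variance weight

# Fixed-effect inverse-variance pooling of the log risk ratios.
pooled_log_rr = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
pooled_rr = math.exp(pooled_log_rr)
ci = (math.exp(pooled_log_rr - 1.96 * se), math.exp(pooled_log_rr + 1.96 * se))
print(f"pooled RR = {pooled_rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

A 95% confidence interval that crosses 1.0 is what a finding of "no clear benefit or harm" reflects.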
Introduction: Opioids are routinely administered for analgesia to prehospital patients experiencing chest discomfort from acute myocardial infarction (AMI). We conducted a systematic review to determine if opioid administration impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, the Cochrane Database of Systematic Reviews, Cochrane Central and clinicaltrials.gov for relevant randomized controlled trials and observational studies of opioid administration in AMI patients from 1990 to 2017. The outcomes of interest were: all-cause short-term mortality (≤30 days), major adverse cardiac events (MACE), platelet activity and aggregation, immediate adverse events, infarct size, and analgesia. Included studies were hand-searched for additional citations. Risk-of-bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. Results: Our search yielded 3001 citations, of which 19 studies were reviewed as full texts and a total of 9 studies were included in the analysis. The studies predominantly reported on morphine as the opioid. Five studies reported on mortality (≤30 days), seven on MACE, four on platelet activity and aggregation, two on immediate adverse events, two on infarct size and none on analgesic effect. We found low quality evidence suggesting no benefit or harm in terms of mortality or MACE. However, low quality evidence indicates that opioids increase infarct size. Low quality evidence also shows reduced serum active metabolite levels of P2Y12 inhibitors (e.g. clopidogrel and ticagrelor) and increased platelet reactivity in the first several hours post-administration, accompanied by an increase in vomiting.
Conclusion: We found low and very low quality evidence that the administration of opioids in STEMI may be adversely associated with vomiting and with some surrogate outcomes, including increased infarct size, reduced serum P2Y12 inhibitor active metabolite levels, and increased platelet reactivity. We found no clear benefit or harm for patient-oriented clinical outcomes, including mortality.
Lowering the protein level in diets for piglets requires knowledge of the piglet’s requirements for essential amino acids (AA) and their interactions. The present studies aimed to determine the interaction between the dietary levels of valine (Val) and tryptophan (Trp), and the effect of AA imbalance at two levels of dietary Val, on the growth performance of post-weaning piglets. In Experiment 1 (duration 4 weeks), the effects of supplementation of free l-Val (1.0 g/kg) and/or l-Trp (0.5 g/kg) in a low-CP diet (CP 17.7%), marginal in Trp and Val, were studied in a 2×2 factorial design with an additional reference treatment (CP 19.5%). In Experiment 2 (duration 5 weeks), the influence of a stepwise increase in excess supply of isoleucine (Ile), histidine (His) and leucine (Leu), up to 10%, 10% and 30% above their requirement values respectively, was evaluated at 60% or 70% standardized ileal digestible (SID) Val relative to SID lysine, using a 3×2 factorial design. In Experiment 1, over the whole experimental period, feed intake (FI) was affected by the dietary Trp level (P<0.05) and feed conversion ratio (FCR) by both the Trp and Val levels in the diet (both P<0.05). Increasing the Trp level increased FI and decreased FCR, while increasing the dietary Val level reduced FI and increased FCR. For BW gain (BWG), there was an interaction between the dietary levels of Trp and Val (P<0.05): Val supplementation decreased BWG with a diet marginal in Trp, whereas it increased BWG with a Trp-sufficient diet. Piglets fed the low-CP diet with adequate levels of Val and Trp performed at least as well as piglets fed the high-CP reference diet. In Experiment 2, increasing dietary Val improved FI and BWG (P<0.001) and tended to improve FCR. Dietary AA excess for Ile, His and Leu reduced FI and BWG (P<0.05) and affected FCR (P<0.01) only in the 1st week of the study.
The dietary Val level and AA excess did not show interactive effects, except for FCR over the final 2 weeks of the study (P<0.05). In conclusion, an interaction exists between the dietary supply of Val and Trp on the zootechnical performance of post-weaning piglets, and dietary AA excess for Ile, Leu and His reduces the growth performance of piglets fed low-protein diets, independent of the dietary Val level.
In order to control and optimize chicken product quality, it is necessary to improve the description of the responses to dietary amino acid (AA) concentration in terms of carcass composition and meat quality, especially during the finishing period. The aim of this study was to investigate the effects of lysine (Lys, i.e. a limiting AA used as a reference in AA nutrition) and of AA other than Lys (AA effect). In total, 12 experimental diets were formulated with four levels of digestible Lys content (7, 8.5, 10 and 11.5 g/kg) combined with either a low (AA−), adequate control (AAc) or high (AA+) amount of other essential AA (EAA) expressed as a proportion of Lys. They were distributed to male Ross PM3 broilers from 3 to 5 weeks of age. No significant AA×Lys interaction was found for growth performance or carcass composition. Body weight and feed conversion ratio were significantly improved by the addition of Lys but were impaired in broilers receiving the AA− diets, whereas breast meat yield and abdominal fat were affected only by Lys. No additional benefit was found when the relative amount of other EAA was increased. There was a significant AA×Lys interaction on most of the meat quality traits, including ultimate pH, color and drip loss, with a significant effect of both AA and Lys. For example, AA− combined with a reduced Lys level favored the production of meat with a high ultimate pH (>6.0), dark color and low drip loss, whereas more acid (<5.85), light and exudative meat was produced with AA+ combined with a low Lys level. In conclusion, growth performance, carcass composition and meat quality are affected by the levels of dietary Lys and AA in finishing broilers. In addition, interactive responses to Lys and AA were found for meat quality traits, leading to great variations in breast ultimate pH, color and drip loss according to AA balance or imbalance.
The Single Ventricle Reconstruction trial randomised neonates with hypoplastic left heart syndrome to one of two shunt strategies. Patients received care according to usual institutional practice. We analysed practice variation at Stage II surgery to identify areas where reduced variation and improved process control might be possible.
Prospectively collected data were available in the Single Ventricle Reconstruction public-use database. Practice variation across 14 centres was described for 397 patients who underwent Stage II surgery. Data are centre-level specific and reported as interquartile ranges across all centres, unless otherwise specified.
Preoperative Stage II median age and weight across centres were 5.4 months (interquartile range 4.9–5.7) and 5.7 kg (5.5–6.1), with 70% of operations performed electively. Most patients had pre-Stage-II cardiac catheterisation (98.5–100%). Digoxin was used by 11/14 centres, in 25% of patients (23–31%), and 81% of patients had some oral feeds (68–84%). The majority of centres (86%) performed a bidirectional Glenn rather than a hemi-Fontan. Median cardiopulmonary bypass time was 96 minutes (75–113). In aggregate, 26% of patients had deep hypothermic circulatory arrest >10 minutes; in the 13/14 centres using deep hypothermic circulatory arrest, 12.5% of patients exceeded 10 minutes (8–32%). Seven centres extubated patients in the operating room (5% of patients, 2–40%). Postoperatively, ICU length of stay was 4.8 days (4.0–5.3) and total length of stay was 7.5 days (6–10).
In the Single Ventricle Reconstruction Trial, practice varied widely among centres for nearly all perioperative factors surrounding Stage II. Further analysis may facilitate establishing best practices by identifying the impact of practice variation.
We performed a spatial-temporal analysis to assess household risk factors for Ebola virus disease (Ebola) in a remote, severely affected village. We defined a household as a family's shared living space, and a case-household as a household with at least one resident who became a suspect, probable, or confirmed Ebola case from 1 August 2014 to 10 October 2014. We used Geographic Information System (GIS) software to calculate inter-household distances, performed space-time cluster analyses, and fitted generalized estimating equation (GEE) models. Village X consisted of 64 households; 42% of households became case-households over the observation period. Two significant space-time clusters occurred among households in the village; temporal effects outweighed spatial effects. The GEE models demonstrated that the odds of becoming a case-household increased by 4·0% for each additional person per household (P < 0·02) and by 2·6% per day (P < 0·07). An increasing number of persons per household and, to a lesser extent, the passage of time after onset of the outbreak were risk factors for household Ebola acquisition, emphasizing the importance of prompt public health interventions that prioritize the most populated households. Using GIS with GEE can reveal complex spatial-temporal risk factors, which can inform prioritization of response activities in future outbreaks.
Reducing the dietary CP content is an efficient way to limit nitrogen excretion in broilers but, as reported in the literature, it often reduces performance, probably because of an inadequate provision of amino acids (AA). The aim of this study was to investigate the effect of decreasing the CP content in the diet on animal performance, meat quality and nitrogen utilization in growing-finishing broilers using an optimized dietary AA profile based on the ideal protein concept. Two experiments (1 and 2) were performed using 1-day-old PM3 Ross male broilers (1520 and 912 for experiments 1 and 2, respectively) using the minimum AA:Lys ratios proposed by Mack et al. with modifications for Thr and Arg. The digestible Thr (dThr):dLys ratio was increased from 63% to 68% and the dArg:dLys ratio was decreased from 112% to 108%. In experiment 1, the reduction of dietary CP from 19% to 15% (five treatments) did not alter feed intake or BW, but the feed conversion ratio was increased for the 16% and 15% CP diets (+2.4% and +3.6%, respectively), while in experiment 2 (three treatments: 19%, 17.5% and 16% CP) there was no effect of dietary CP on performance. In both experiments, dietary CP content did not affect breast meat yield. However, abdominal fat content (expressed as a percentage of BW) was increased by the decrease in CP content (up to +0.5 and +0.2 percentage point, in experiments 1 and 2, respectively). In experiment 2, meat quality traits responded to dietary CP content, with a higher ultimate pH and lower lightness and drip loss values for the low-CP diets. Nitrogen retention efficiency increased when reducing the CP content in both experiments (+3.5 points/CP percentage point). The main consequence of this higher efficiency was a decrease in nitrogen excretion (−2.5 g N/kg BW gain) and volatilization (expressed as a percentage of excretion: −5 points/CP percentage point).
In conclusion, this study demonstrates that with an adapted AA profile, it is possible to reduce the dietary CP content to at least 17% in growing-finishing male broilers without altering animal performance or meat quality. Such a feeding strategy could therefore help improve the sustainability of broiler production, as it is an efficient way to reduce the environmental burden associated with nitrogen excretion.
Few studies have evaluated the impact of clinical trial results on practice in paediatric cardiology. The Infant Single Ventricle (ISV) Trial results published in 2010 did not support routine use of the angiotensin-converting enzyme inhibitor enalapril in infants with single-ventricle physiology. We sought to assess the influence of these findings on clinical practice.
A web-based survey was distributed via e-mail to over 2000 paediatric cardiologists, intensivists, cardiothoracic surgeons, and cardiac advanced practice nurses during three distribution periods. The results were analysed using McNemar’s test for paired data and Fisher’s exact test.
The response rate was 31.5% (69% cardiologists and 65% with >10 years of experience). Among respondents familiar with the trial results, 74% reported current practice consistent with the trial findings versus 48% before trial publication (p<0.001); 19% used angiotensin-converting enzyme inhibitor in this population “almost always” versus 36% in the past (p<0.001), and 72% reported a change in management or improved confidence in treatment decisions involving this therapy based on the trial results. Respondents familiar with the trial results (78%) were marginally more likely to report practice consistent with the trial results than those unfamiliar (74% versus 67%, p=0.16). Among all respondents, 28% reported less frequent use of angiotensin-converting enzyme inhibitor over the last 3 years.
Within 5 years of publication, the majority of respondents was familiar with the Infant Single Ventricle Trial results and reported less frequent use of angiotensin-converting enzyme inhibitor in single-ventricle infants; however, 28% reported not adjusting their clinical decisions based on the trial’s findings.
Pertussis epidemics have displayed substantial spatial heterogeneity in countries with high socioeconomic conditions and high vaccine coverage. This study aimed to investigate the relationship between socio-environmental factors and the spatio-temporal variation underlying pertussis infection. We obtained daily case numbers of pertussis notifications by postal area from Queensland Health, Australia, for the period January 2006 to December 2012. A Bayesian spatio-temporal model was used to quantify the relationship between monthly pertussis incidence and socio-environmental factors. The socio-environmental factors included monthly mean minimum temperature (MIT), monthly mean vapour pressure (VAP), the Queensland school calendar pattern (SCP), and the socioeconomic index for areas (SEIFA). An increase in pertussis incidence was observed from 2006 to 2010 and a slight decrease from 2011 to 2012. Spatial analyses showed pertussis incidence across Queensland postal areas to be low and more spatially homogeneous during 2006–2008; incidence was higher and more spatially heterogeneous after 2009. The results also showed that the average decrease in monthly pertussis incidence was 3·1% [95% credible interval (CrI) 1·3–4·8] for each 1 °C increase in monthly MIT, while the average increases in monthly pertussis incidence were 6·2% (95% CrI 0·4–12·4) during SCP periods and 2% (95% CrI 1–3) for each 10-unit increase in SEIFA. This study demonstrated that pertussis transmission is significantly associated with MIT, SEIFA, and SCP. Mapping derived from this work highlights areas for future investigation and for focusing future control strategies.
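Effect sizes such as the 3.1% decrease per 1 °C come from exponentiating regression coefficients of a log-link incidence model. A minimal sketch of that conversion, using the reported point estimate as a hypothetical coefficient (not the fitted model itself):

```python
import math

def pct_change(beta: float, delta: float = 1.0) -> float:
    """Percent change in incidence for a `delta`-unit covariate increase
    under a log-link model: 100 * (exp(beta * delta) - 1)."""
    return 100.0 * (math.exp(beta * delta) - 1.0)

# Back out the coefficient implied by the reported -3.1% per 1 degC for MIT.
beta_mit = math.log(1 - 0.031)               # approx -0.0315 per degC

print(round(pct_change(beta_mit, 1.0), 1))   # -3.1 (recovers the reported value)
print(round(pct_change(beta_mit, 2.0), 1))   # -6.1, effect of a 2 degC rise
```

Because the effects are multiplicative on the incidence scale, a 2 °C rise gives slightly less than twice the 1 °C percentage change.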
We examined functional outcomes and quality of life after whole brain radiotherapy (WBRT) with an integrated fractionated stereotactic radiotherapy boost (FSRT) for the treatment of brain metastases. Methods: Eighty-seven people with 1-3 brain metastases were enrolled on this phase II trial of WBRT (30 Gy in 10 fractions) + simultaneous FSRT (60 Gy in 10 fractions). Results: Mean (min-max) baseline KPS, Mini Mental Status Exam (MMSE) and FACT-BR quality of life scores were 83 (70-100), 28 (21-30) and 143 (98-153), respectively. Lower baseline MMSE (but not KPS or FACT-Br) was associated with worse survival after adjusting for age, number of metastases, and primary and extra-cranial disease status. Crude rates of deterioration (>10-point decrease from baseline for KPS and FACT-Br; MMSE falling to <27) ranged from 26-38% for KPS, 32-59% for FACT-Br and 0-16% for MMSE depending on the time point assessed, with higher rates generally noted at earlier time points (<6 months post-treatment). Using a linear mixed-models analysis, significant declines from baseline were noted for KPS and FACT-Br (largest effects at 6 weeks to 3 months), with no significant change in MMSE. Conclusions: The effects on function and quality of life of this integrated WBRT + simultaneous FSRT treatment were similar to those of other published series combining WBRT + SRS.
Patients with psychosis display the so-called ‘Jumping to Conclusions’ bias (JTC) – a tendency for hasty decision-making in probabilistic reasoning tasks. So far, only a few studies have evaluated the JTC bias in ‘at-risk mental state’ (ARMS) patients, specifically in ARMS samples fulfilling ‘ultra-high risk’ (UHR) criteria, thus not allowing for comparisons between different ARMS subgroups.
In the framework of the PREVENT (secondary prevention of schizophrenia) study, a JTC task was applied to 188 patients either fulfilling UHR criteria or presenting with cognitive basic symptoms (BS). Similar data were available for 30 healthy control participants matched for age, gender, education and premorbid verbal intelligence. ARMS patients were identified by the Structured Interview for Prodromal Symptoms (SIPS) and the Schizophrenia Proneness Instrument – Adult Version (SPI-A).
The mean number of draws to decision (DTD) significantly differed between ARMS subgroups: UHR patients made significantly fewer draws to reach a decision than ARMS patients with only cognitive BS. Furthermore, UHR patients tended to fulfil the behavioural criteria for JTC more often than BS patients. In a secondary analysis, ARMS patients were much hastier in their decision-making than controls. In patients, DTD was moderately associated with positive and negative symptoms, as well as with disorganization and excitement.
Our data indicate an enhanced JTC bias in the UHR group compared with ARMS patients with only cognitive BS. This underscores the importance of reasoning deficits within cognitive theories of developing psychosis. Interactions with the liability to psychotic transition and with therapeutic interventions should be unravelled in longitudinal studies.
We studied the removal of seeds of three species of large-seeded tree (Astrocaryum standleyanum, Attalea butyracea and Dipteryx oleifera) from three different heights within six study plots in a lowland forest in central Panama. Fresh fruits with intact seeds fitted with industrial sewing bobbins were placed within semi-permeable exclosures. Removed seeds were tracked to deposition sites, and seed fate was determined. Removals were likely perpetrated by two small rodents, the strictly terrestrial Proechimys semispinosus and the scansorial Sciurus granatensis, because they were the most abundant small rodents in the study site during the study period and were of sufficient size to remove large seeds. Rodent abundance and fruit availability were estimated by conducting censuses. Nine microhabitat variables were measured at each deposition site to determine whether these two rodents were preferentially depositing seeds in sites with certain characteristics or were depositing seeds at random. During the study, rodents handled 98 seeds, 85 of which were not preyed upon and could potentially germinate. Removal rates were not influenced by rodent abundance or fruit availability. Seeds were most frequently moved <3 m and deposited with the fruit eaten and the seed intact. However, some seeds did experience relatively long-distance dispersal (>10 m). Rodents preferentially deposited seeds in locations with large logs (>10 cm diameter), dense herbaceous cover, and an intact canopy; the number of large logs at deposition sites differed from that at random locations. Despite not being able to determine long-term fate (beyond c. 1 y), we show that these small rodents are not primarily seed predators and may in fact be important mutualists, dispersing seeds relatively long distances to favourable germination sites.