Dispersal is a key ecological process affecting community dynamics and the maintenance of populations. There is increasing awareness of the need to understand individual dispersal potential to better inform population-level dispersal, allowing more accurate models of the spread of invasive and beneficial insects and aiding crop and pest management strategies. Here, the fine-scale movements of Poecilus cupreus, an important agricultural carabid predator, were recorded using a locomotion compensator, and key movement characteristics were quantified. Net displacement increased more rapidly than predicted by a simple correlated random walk model, with near-ballistic behaviour observed. Individuals displayed a latent ability to hold a constant bearing for protracted periods, despite no clear evidence of a population-level global orientation bias. Intermittent bouts of movement and non-movement were observed, with both the frequency and duration of movement bouts varying at the inter- and intra-individual level. Analysis suggests that individuals have the potential to disperse rapidly over a wider area than predicted by simple movement models parametrised at the population level. This highlights the importance of considering individual variation when analysing movement and attempting to predict dispersal distances.
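The contrast between correlated-random-walk and ballistic displacement described above can be sketched with a minimal simulation. This is an illustrative model only: the step length, Gaussian turning-angle distribution, and parameter values are assumptions for demonstration, not the fitted values from this study. It shows why persistent headings yield net displacement growing almost linearly with the number of steps, while strong turning yields diffusive (square-root) spread:

```python
import math
import random

def crw_net_displacement(n_steps, turn_sd, step_len=1.0, seed=0):
    """Simulate one 2D correlated random walk.

    Each step keeps the previous heading plus a random turning angle
    drawn from N(0, turn_sd); returns the final net displacement.
    """
    rng = random.Random(seed)
    x = y = 0.0
    heading = rng.uniform(0.0, 2.0 * math.pi)
    for _ in range(n_steps):
        heading += rng.gauss(0.0, turn_sd)
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
    return math.hypot(x, y)

def mean_displacement(n_steps, turn_sd, reps=200):
    """Average net displacement over many independent walkers."""
    return sum(
        crw_net_displacement(n_steps, turn_sd, seed=i) for i in range(reps)
    ) / reps

# Small turning variance -> near-ballistic paths (displacement ~ n_steps);
# large turning variance -> diffusive paths (displacement ~ sqrt(n_steps)).
persistent = mean_displacement(400, turn_sd=0.05)
diffusive = mean_displacement(400, turn_sd=1.5)
```

Comparing the two averages makes the abstract's point concrete: an individual that holds a near-constant bearing covers far more ground in the same number of steps than a population-level random-walk model would predict.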
Compulsory admission procedures for patients with mental disorders vary between countries in Europe. The Ethics Committee of the European Psychiatric Association (EPA) launched a survey on involuntary admission procedures for patients with mental disorders in 40 countries, gathering information from all National Psychiatric Associations that are members of the EPA, with the aim of developing recommendations for improving involuntary admission processes and promoting voluntary care.
The survey focused on legislation of involuntary admissions and key actors involved in the admission procedure as well as most common reasons for involuntary admissions.
We analyzed the survey's categorical data thematically; the resulting themes highlight that both medical and legal actors are involved in involuntary admission procedures.
We conclude that legal reasons for compulsory admission should be reworded in order to remove stigmatization of the patient, that raising awareness about involuntary admission procedures and patient rights with both patients and family advocacy groups is paramount, that communication about procedures should be widely available in lay language for the general population, and that training sessions and guidance should be available for legal and medical practitioners. Finally, people working in the field need to remain constantly aware of the ethical challenges surrounding compulsory admissions.
To evaluate total usual intakes and biomarkers of micronutrients, overall dietary quality and related health characteristics of US older adults who were overweight or obese compared with a healthy weight.
Two 24-h dietary recalls, nutritional biomarkers and objective and subjective health characteristic data were analysed from the National Health and Nutrition Examination Survey 2011–2014. We used the National Cancer Institute method to estimate distributions of total usual intakes from foods and dietary supplements for eleven micronutrients of potential concern and the Healthy Eating Index (HEI)-2015 score.
Older adults aged ≥60 years (n 2969) were categorised by sex and body weight status, using standard BMI categories. Underweight individuals (n 47) were excluded due to small sample size.
A greater percentage of obese older adults compared with their healthy-weight counterparts was at risk of inadequate Mg (both sexes), Ca, vitamin B6 and vitamin D (women only) intakes. The proportion of those with serum 25-hydroxyvitamin D < 40 nmol/l was higher in obese (12 %) than in healthy-weight older women (6 %). Mean overall HEI-2015 scores were 8·6 (men) and 7·1 (women) points lower in obese than in healthy-weight older adults. In addition, compared with healthy-weight counterparts, obese older adults were more likely to self-report fair/poor health, use ≥ 5 medications and have limitations in activities of daily living and cardio-metabolic risk factors; and obese older women were more likely to be food-insecure and have depression.
Our findings suggest that obesity may coexist with micronutrient inadequacy in older adults, especially among women.
Nanosized, well-dispersed titania particles were synthesized via a hydrothermal method using multiwalled carbon nanotubes (MWCNTs) as structural modifiers during the nucleation process to decrease aggregation. Synthesized TiO2/MWCNT composites containing different amounts of MWCNTs were characterized using N2 physisorption, XRD, spectroscopic techniques (Raman, UV-visible, and X-ray photoelectron), and electron microscopy to illuminate the morphology, crystal structure, and surface chemistry of the composites. Photocatalytic performance was evaluated by measuring the degradation of acetaldehyde in a batch reactor under UV illumination. Average rate constants decrease in the following order: TiO2/MWCNT-1% > TiO2 > TiO2/MWCNT-5%. Addition of MWCNTs beyond the optimum loading ratio of 1:100 (MWCNT:TiO2) diminishes the effectiveness of the photocatalyst and the synergistic effect between MWCNTs and TiO2. The primary mechanism for photocatalytic activity enhancement in TiO2/MWCNT-1% is thought to be due to increased porosity, hydroxyl enrichment on the surface, and high dispersion of TiO2 particles.
Introduction: Lacerations are common in children presenting to the emergency department (ED). Children are often uncooperative when sutures are needed and may require procedural sedation. Few studies have evaluated intranasal (IN) ketamine for procedural sedation in children, with doses from 3 to 9 mg/kg used mostly for dental procedures. In a previous dose escalation trial, DosINK-1, 6 mg/kg was found to be the optimal IN ketamine dose for procedural sedation for sutures in children. In this trial, we aimed to further evaluate the efficacy of this dose. Methods: We conducted a multicentre single-arm clinical trial. A convenience sample of 30 uncooperative children aged 1 to 12 years (10 to 30 kg), with no cardiac or kidney disease, no active respiratory infection, and no prior administration of opioid or sedative agents, received 6 mg/kg of IN ketamine using an atomizer for laceration repair with sutures in the ED. The primary outcome was defined as the proportion (95% CI) of patients who achieved adequate procedural sedation, evaluated with the PERC/PECARN consensus criteria. Results: Thirty patients were recruited from April 2018 to November 2019 in 2 pediatric EDs. The median age was 3.2 (interquartile range (IQR), 1.9 to 4.7) years old, with a laceration of more than 2 cm in 20 (67%) patients and a facial laceration in 21 (70%) cases. Sedation was effective in 18 out of 30 children (60%; 95% CI, 45 to 80), suboptimal in 6 patients (20%), with the procedure completed with minimal difficulties, and unsuccessful in the remaining 6 (20%), all without serious adverse events. Similarly, 21/30 (70%) physicians were willing to reuse IN ketamine at the same dose, and 25 parents (83%) would agree to the same sedation in the future. Median time to return to baseline status was 58 min (IQR, 33 to 73). One patient desaturated during the procedure and required transitory oxygen and repositioning.
After the procedure, 1 (3%) patient had headache, 1 (3%) had nausea, and 2 (7%) vomited. Conclusion: A single dose of 6 mg/kg of IN ketamine for laceration repair with sutures in uncooperative children is safe; it facilitated the procedure in 60% (95% CI, 45 to 80) of patients, was suboptimal in 20%, and was unsuccessful in 20%. As seen with IV ketamine, the availability of an additional dose of IN ketamine for some children could potentially increase the proportion of successful sedations. However, the safety and efficacy of repeated doses need to be addressed.
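The binomial proportions with 95% confidence intervals reported above (e.g. 18/30 successes, 60%) can be computed with a standard score interval. The trial does not state which interval method was used, so the Wilson score interval below is an assumption chosen for illustration; other methods (exact Clopper-Pearson, normal approximation) give somewhat different bounds at this sample size:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion.

    z = 1.96 corresponds to a 95% interval. Returns (lower, upper)
    as fractions; multiply by 100 for percentages.
    """
    p = successes / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lower, upper = wilson_ci(18, 30)  # the trial's 18/30 successful sedations
```

For 18/30 this gives an interval of roughly 42% to 75%, in the same neighbourhood as the published bounds; the exact figures depend on the method chosen.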
Introduction: Venipuncture is a frequent cause of pain and distress in the pediatric emergency department (ED). Distraction, which can improve patient experience, remains the most studied psychological intervention. Virtual reality (VR) is a method of immersive distraction that can contribute to the multi-modal management of procedural pain and distress. Methods: The main objectives of this study were to determine the feasibility and acceptability of VR distraction for pain management during venipunctures and to examine its preliminary effects on pain and distress in the pediatric ED. Children aged 7-17 years requiring a venipuncture in the pediatric ED were recruited. Participants were randomized to either a control group (standard care) or an intervention group (standard care + VR). The principal clinical outcome was the mean level of procedural pain, measured with the verbal numerical rating scale (VNRS). Distress was also measured using the Child Fear Scale (CFS) and the Procedure Behavior Check List (PBCL), and memory of pain using the VNRS. Side effects were documented. Results: A total of 63 patients were recruited. Results showed the feasibility and acceptability of VR in the pediatric ED (79% recruitment rate of eligible families, 90% rate of VR game completion, and overall high mean satisfaction levels). There was a significantly higher level of satisfaction among healthcare providers in the intervention group, and 93% of those were willing to use this technology again for the same procedure. Regarding clinical outcomes, no significant difference was observed between groups in procedural pain. Distress evaluated by proxy (10/40 vs 13.2/40, p = 0.007) and memory of pain at 24 hours (2.4 vs 4.2, p = 0.027) were significantly lower in the VR group. Venipuncture was successful on the first attempt in 23/31 patients (74%) in the VR group and 15/30 (50%) in the control group (p = 0.039).
Five of the 31 patients (16%) in the VR group reported side effects. Conclusion: The addition of VR to standard care is feasible and acceptable for pain and distress management during venipunctures in the pediatric ED. There was no difference in self-reported procedural pain between groups. Levels of procedural distress and memory of pain at 24 hours were lower in the VR group.
Introduction: Emergency department (ED) buprenorphine/naloxone inductions for opioid use disorder are an effective and safe way to initiate addictions care in the ED. Kelowna General Hospital's ED buprenorphine/naloxone (KEDSS) program was implemented in September 2018 in order to respond to a community need for accessible and evidence-based addictions care. The objective of our program evaluation study was to examine the implementation of the first five months of the KEDSS program by evaluating patient characteristics and service outcomes. Methods: The KEDSS treatment pathway consists of a standardized protocol (pre-printed order set) to facilitate buprenorphine/naloxone induction and stabilization in the acute care setting (ED and inpatient wards) at Kelowna General Hospital, a community academic hospital. All patients referred to the outpatient addictions clinic via the order set during September 2018-January 2019 (the first 5 months) were included in the study population. A retrospective descriptive chart review was completed. Outcome measures included population characteristics (sociodemographic information, clinical characteristics) and service outcomes (number of patients initiated, patient follow-up). Descriptive statistics and bivariate analyses using t-tests or Pearson's χ2 statistic, as appropriate, were conducted to compare the ED-initiated group with the inpatient-initiated group. Results: During the first five months of the KEDSS program, a total of 35 patients (26% female, mean age 36.6 years, 54% homeless) were started on the treatment pathway, 16 (46%) in the ED. Compared to the inpatient-initiated group, the ED-initiated group had fewer psychiatric comorbidities (mean ED 1.0 vs. inpatient 1.5, p = 0.002), were less likely to require methadone or sustained-release oral morphine (ED 13% vs. inpatient 37%, p = 0.048), and were less likely to have attended follow-up (ED 56% vs. inpatient 84%, p = 0.004).
Conclusion: This study provides a preliminary look at a new opioid agonist therapy (OAT) treatment pathway (KEDSS) at Kelowna General Hospital, and provides insight into the population that is accessing the program. We found that the majority of patients who are started on buprenorphine/naloxone in the ED are seen in follow-up at the addictions clinic. Future work will examine ongoing follow-up and OAT adherence rates in the study population to quantify the program's impact on improving access to addictions treatment within this community hospital setting.
In the UK, blood investigations ordered by Mental Health Trusts are usually carried out by acute hospitals. The results are not immediately accessible to Mental Health Trust staff on their computers because of confidentiality requirements and a lack of shared software access between trusts. This has a significant impact on the care management of psychiatric patients, often resulting in delays to clinical decisions.
We encountered a similar problem: the results of tests ordered by our staff were not immediately accessible to them, as these investigations are carried out by the local acute hospital. To address this issue, a project was chartered.
The aim of the project was to develop a protocol between the trusts so that Mental Health Trust staff could access investigation results on local computers as soon as they became available, and to evaluate its impact on the service.
A steering committee including a Specialty Registrar, a Pharmacist and a Matron was constituted. The committee met regularly and developed a strategy with representatives of the Acute Trust. The main concerns of the Acute Trust were patient confidentiality and software access. Following regular meetings and correspondence, a shared protocol was developed.
It was agreed that the acute trust would install its software on all Mental Health Trust computers. Staff would be trained to use the software and access results. To address the issue of confidentiality, a flowchart of sponsorships for the shared protocol was developed: all medics would be sponsored for access by the Medical Director, nursing staff by the Matron, and pharmacists by the Chief Pharmacist. This protocol ensured that all trained staff were accounted for and that the IT department could monitor any unauthorised access to data.
We have noticed a marked improvement in the quality of clinical practice as a result. Unnecessary delays in clinical decisions have been avoided. We feel such a shared protocol could be developed in other hospitals facing similar access issues.
Selenium (Se) is an essential element for human health. However, our knowledge of the prevalence of Se deficiency is less than for other micronutrients of public health concern such as iodine, iron and zinc, especially in sub-Saharan Africa (SSA). Studies of food systems in SSA, in particular in Malawi, have revealed that human Se deficiency risks are widespread and influenced strongly by geography. Direct evidence of Se deficiency risks includes nationally representative data of Se concentrations in blood plasma and urine as population biomarkers of Se status. Long-range geospatial variation in Se deficiency risks has been linked to soil characteristics and their effects on the Se concentration of food crops. Selenium deficiency risks are also linked to socio-economic status including access to animal source foods. This review highlights the need for geospatially-resolved data on the movement of Se and other micronutrients in food systems which span agriculture–nutrition–health disciplinary domains (defined as a GeoNutrition approach). Given that similar drivers of deficiency risks for Se, and other micronutrients, are likely to occur in other countries in SSA and elsewhere, micronutrient surveillance programmes should be designed accordingly.
To examine public commitments to encouraging United States consumers to make healthy dietary purchases with their Supplemental Nutrition Assistance Program (SNAP) benefits among prevalent SNAP-authorised retailers.
The national SNAP-authorised retail landscape, in addition to stores located in California and Virginia, two states targeted for a Partnership for a Healthier America pilot social marketing campaign.
SNAP-authorised retailers with the most store locations in selected settings.
A review of retailers’ publicly available business information was conducted (November 2016–February 2017). Webpages and grey literature sources were accessed to identify corporate social responsibility (CSR) reports and commitments describing strategies to encourage healthy consumer purchases aligned with the 2015–2020 Dietary Guidelines for Americans. Evidence was organised using a marketing-mix and choice-architecture (MMCA) framework to characterise strategies used among eight possible types (i.e. place, profile, portion, pricing, promotion, priming, prompting and proximity).
Of the SNAP-authorised retailers (n 38) reviewed, more than half (n 20; 52·6 %) provided no information in the public domain relevant to the research objective. Few retailers (n 8; 21·1 %) had relevant CSR information; grey literature sources (n 52 articles across seventeen retailers) were more commonly identified. The majority of SNAP-authorised retailers committed to increasing the number of healthy products available for purchase (profile).
Substantial improvements are needed to enhance the capacity and commitments of SNAP-authorised retailers to use diverse strategies to promote healthy purchases among SNAP recipients. Future research could explore feasible approaches to improve dietary behaviours through sector changes via public–private partnerships, policy changes, or a combination of government regulatory and voluntary business actions.
An experiment was conducted to test the hypothesis that meat products have digestible indispensable amino acid scores (DIAAS) >100 and that various processing methods will increase standardised ileal digestibility (SID) of amino acids (AA) and DIAAS. Nine ileal-cannulated gilts were randomly allotted to a 9 × 8 Youden square design with nine diets and eight 7-d periods. Values for SID of AA and DIAAS for two reference patterns were calculated for salami, bologna, beef jerky, raw ground beef, cooked ground beef and ribeye roast heated to 56, 64 or 72°C. The SID of most AA was not different among salami, bologna, beef jerky and cooked ground beef, but was less (P < 0·05) than the values for raw ground beef. The SID of AA for 56°C ribeye roast was not different from the values for raw ground beef and 72°C ribeye roast, but greater (P < 0·05) than those for 64°C ribeye roast. For older children, adolescents and adults, the DIAAS for all proteins, except cooked ground beef, were >100 and bologna and 64°C ribeye roast had the greatest (P < 0·05) DIAAS. The limiting AA for this age group were sulphur AA (beef jerky), leucine (bologna, raw ground beef and cooked ground beef) and valine (salami and the three ribeye roasts). In conclusion, meat products generally provide high-quality protein with DIAAS >100 regardless of processing. However, overcooking meat may reduce AA digestibility and DIAAS.
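The DIAAS values discussed above follow a standard definition: for each indispensable amino acid, the digestible content (amino acid concentration × SID) is divided by the corresponding value in the age-appropriate reference pattern, and the lowest ratio × 100 is the score, with that amino acid deemed limiting. A minimal sketch of the calculation follows; the amino-acid contents and SID fractions below are invented for illustration and are not data from this experiment, and only three of the amino acids mentioned in the abstract are included:

```python
def diaas(aa_mg_per_g_protein, sid, reference_pattern):
    """Digestible Indispensable Amino Acid Score.

    For each indispensable amino acid:
        ratio = 100 * content (mg/g protein) * SID / reference (mg/g protein)
    DIAAS is the lowest ratio; that amino acid is the limiting one.
    """
    ratios = {
        aa: 100.0 * aa_mg_per_g_protein[aa] * sid[aa] / reference_pattern[aa]
        for aa in reference_pattern
    }
    limiting = min(ratios, key=ratios.get)
    return round(ratios[limiting]), limiting

# Reference pattern for older children, adolescents and adults
# (mg/g protein) for three amino acids discussed above.
reference = {"leucine": 61, "valine": 40, "sulphur AA": 23}

# Hypothetical cooked-beef values: content (mg AA per g protein)
# and SID expressed as a fraction.
content = {"leucine": 70, "valine": 48, "sulphur AA": 32}
sid = {"leucine": 0.95, "valine": 0.96, "sulphur AA": 0.92}

score, limiting_aa = diaas(content, sid, reference)
```

With these illustrative inputs, leucine gives the lowest ratio and is therefore limiting, mirroring how the abstract identifies leucine as the limiting amino acid for ground beef.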
The range of chronological methods described by the term “luminescence dating” provides a rich set of tools for dating many types of events relevant to archaeological research. These include assessing the depositional age of sediments (the time elapsed since those sediments were deposited by, for example, water, wind, or human activity), and estimating the time since pottery, casting cores, or stones were last fired/heated. Following an initial suggestion by Daniels et al. (1953), luminescence dating methods were introduced into the archaeological context by Aitken et al. (1964) with the thermoluminescence (TL) dating of pottery. Since then, considerable improvements in understanding the basic underlying physical mechanisms have been translated into significant methodological breakthroughs. Notable among these was the development of optically stimulated luminescence (OSL) methods (Huntley et al. 1985) and the improved confidence in dating sedimentary material that this brought. A more recent technologically driven advance was the dating of individual sand grains (so-called single-grain dating), allowing more in-depth assessments of dating reliability and widening the applicability of OSL dating (also referred to as “optical dating”).
Improved budgeting of heat loads arising from radiogenic heating in high level wastes (HLW) could allow enhanced usage of geological disposal facility space. Separation of high heat generating nuclides from HLW, such as Cs, would simplify management of heat loads. A potential host matrix for Cs-disposal is hollandite. The incorporation of Cs into the hollandite phase is aided by substitution of cations on the B-site of the structure; these ions may include Ni and Zn. Two series of hollandites, Ni-substituted and Zn-substituted, were synthesised via an alkoxide-nitrate route and consolidated by cold uniaxial pressing and sintering or by hot isostatic pressing (HIP). Characterisation of the resultant material by X-ray diffraction and scanning electron microscopy found that hollandite was formed for all levels of substitution. Materials produced via HIP were found to be denser, indicating lower Cs loss. HIPed Ni hollandites were found to contain fewer secondary phases, and it was concluded that they were the most suitable candidates.
A growing body of research shows that women legislators outperform their male counterparts in the legislative arena, but scholars have yet to examine whether this pattern emerges in non-policy aspects of representation. We conducted an audit study of 6,000 U.S. state legislators to analyze whether women outperform or underperform men on constituency service in light of the extra effort they spend on policy. We find that women are more likely to respond to constituent requests than men, even after accounting for their heightened level of policy activity. Female legislators are the most responsive in conservative districts, where women may see the barriers to their election as especially high. We then demonstrate that our findings are not a function of staff responsiveness, legislator ideology, or responsiveness to female constituents or gender issues. The results provide additional evidence that women perform better than their male counterparts across a range of representational activities.
Prenatal exposure to persistent organic pollutants (POPs) has been associated with the development of metabolic syndrome-related diseases in offspring. Epidemiological studies suggest that environmental effects can be transmitted through the paternal as well as the maternal line, influencing offspring health. Moreover, maternal prenatal dietary folic acid (FA) may beneficially impact offspring health. The objective was to investigate whether prenatal FA supplementation can overcome the deleterious effects of prenatal exposure to POPs on lipid homeostasis and inflammation in three generations of male rat descendants through the paternal lineage. Female Sprague-Dawley rats (F0) were exposed to a POPs mixture (or corn oil) +/− FA supplementation for 9 weeks before and during gestation. F1 and F2 males were mated with untreated females. Plasma and hepatic lipids were measured in F1, F2 and F3 males after a 12-h fast. Gene expression of inflammatory cytokines was determined by qPCR in epididymal adipose tissue. In F1 males, prenatal POPs exposure increased plasma lipids at 14 weeks old and hepatic lipids at 28 weeks old, and prenatal FA supplementation decreased plasma total cholesterol at 14 weeks old. Prenatal POPs exposure decreased plasma triglycerides at 14 weeks old in F2 males. No change was observed in inflammatory markers. Our results show an impact of the paternal lineage on lipid homeostasis in rats up to the F2 male generation. FA supplementation of the F0 diet, regardless of POPs exposure, lowered plasma cholesterol in F1 males but failed to attenuate the deleterious effects of prenatal POPs exposure on plasma and hepatic lipids in F1 males.
In the present study, we aimed to compare anthropometric indicators as predictors of mortality in a community-based setting.
We conducted a population-based longitudinal study nested in a cluster-randomized trial. We assessed weight, height and mid-upper arm circumference (MUAC) on children 12 months after the trial began and used the trial’s annual census and monitoring visits to assess mortality over 2 years.
Children aged 6–60 months during the study.
Of 1023 children included in the study at baseline, height-for-age Z-score, weight-for-age Z-score, weight-for-height Z-score and MUAC classified 777 (76·0 %), 630 (61·6 %), 131 (12·9 %) and eighty (7·8 %) children as moderately to severely malnourished, respectively. Over the 2-year study period, fifty-eight children (5·7 %) died. MUAC had the greatest AUC (0·68, 95 % CI 0·61, 0·75) and had the strongest association with mortality in this sample (hazard ratio = 2·21, 95 % CI 1·26, 3·89, P = 0·006).
MUAC appears to be a better predictor of mortality than other anthropometric indicators in this community-based, high-malnutrition setting in Niger.
Hurricane Maria caused catastrophic damage in Puerto Rico, increasing the risk for morbidity and mortality in the post-impact period. We aimed to establish a syndromic surveillance system to describe the number and type of visits at 2 emergency health-care settings in the same hospital system in Ponce, Puerto Rico.
We implemented a hurricane surveillance system by interviewing patients with a short questionnaire about the reason for visit at a hospital emergency department and associated urgent care clinic in the 6 mo after Hurricane Maria. We then evaluated the system by comparing findings with data from the electronic medical record (EMR) system for the same time period.
The hurricane surveillance system captured information from 5116 participants across the 2 sites, representing 17% of all visits captured in the EMR for the same period. Most visits were associated with acute illness/symptoms (79%), followed by injury (11%). The hurricane surveillance and EMR data were similar, proportionally, by sex, age, and visit category.
The hurricane surveillance system provided timely and representative data about the number and type of visits at 2 sites. This system, or an adapted version using available electronic data, should be considered in future disaster settings.
The aim of this study was to describe individuals seeking care for injury at a major emergency department (ED) in southern Puerto Rico in the months after Hurricane Maria on September 20, 2017.
After informed consent, we used a modified version of the Natural Disaster Morbidity Surveillance Form to determine why patients were visiting the ED during October 16, 2017–March 28, 2018. We analyzed visits where injury was reported as the primary reason for visit and whether it was hurricane-related.
Among 5116 patients, 573 (11%) reported injury as the primary reason for a visit. Of these, 10% were hurricane-related visits. The most common types of injuries were abrasions, lacerations, and cuts (43% of all injury visits and 50% of hurricane-related visits). The most common mechanisms of injury were falls, slips, trips (268, 47%), and being hit by/or against an object (88, 15%). Most injury visits occurred during the first 3 months after the hurricane.
Surveillance after Hurricane Maria identified injury as the reason for a visit for about 1 in 10 patients visiting the ED, providing evidence on the patterns of injuries in the months following a hurricane. Public health and emergency providers can use this information to anticipate health care needs after a disaster.
Human alteration of the planet’s terrestrial landscapes for agriculture, habitation and commerce is reshaping wildlife communities. The threat of land cover change to wildlife is pronounced in regions with rapidly growing human populations. We investigated how species richness and species-specific occurrence of bats changed as a function of land cover and canopy (tree) cover across a rapidly changing region of Florida, USA. Contrary to our predictions, we found negligible effects of agriculture and urban development on the occurrence of all species. In contrast, we found that a remotely sensed metric of canopy cover on a broad scale (25 km2) was a good predictor of the occurrence of eight out of ten species. The occurrence of all smaller bats (vespertilionids) in our study increased with 0–50% increases in canopy cover, while larger bats showed different patterns. Occurrence of Brazilian free-tailed bats (Tadarida brasiliensis) decreased with increasing canopy cover, and Florida bonneted bats (Eumops floridanus) were not influenced by canopy cover. We conclude that remotely sensed measures of canopy cover can provide a more reliable predictor of bat species richness than land-cover types, and efforts to prevent the loss of bat diversity should consider maintaining canopy cover across mosaic landscapes with diverse land-cover types.