This article reports findings on the use of a molybdenum–tungsten (MoW) interlayer for diamond thin-film deposition on steel substrates. The main focus was the post-deposition stress within the diamond films and its impact on the coating's tribological properties. The effects of MoW interlayer thickness and chemical vapor deposition (CVD) process temperature were investigated. Nanocrystalline diamond films were deposited on steel substrates with MoW interlayers (1.1, 4.5, and 8.3 μm thick) at two deposition temperatures (650 and 875 °C). It was found that, to deposit good-quality diamond films on steel substrates, increasing interlayer thickness and decreasing CVD process temperature must be considered jointly to obtain the optimal result. The diamond-coated steel substrates with the 8.3 μm interlayer deposited at the lower CVD processing temperature exhibited the least residual stress combined with excellent mechanical properties.
Stimulating appetite in the sick or elderly remains a challenge, with few safe therapeutic options. Ghrelin is an orexigenic hormone, increasing appetite and subsequent food intake. It has received considerable attention as a therapeutic target to stimulate food intake in patients with anorexia. The identification of food-grade bioactives with proven orexigenic effects would mark significant progress in the treatment of disease-related malnutrition. This study therefore investigated the effects of two milk-derived ghrelinergic peptides on appetite and energy intake in healthy humans.
A single-blind, placebo-controlled, 3-arm (placebo, casein bioactive MF1145 and whey bioactive UL-2-141) cross-over trial was conducted in healthy male volunteers. Participants received 26 mg/kg of each bioactive and of the placebo. The main outcome measures were energy and protein intake from a set breakfast and an ad libitum lunch, and subjective appetite sensations as assessed by visual analogue scale (VAS). Basal and postprandial levels of active ghrelin (AG) were measured. Dietary intakes were analysed using Nutritics software. Statistical analyses were performed in R.
Overall, 22 male participants (mean age 27 years) were included; mean BMI was 24.6 kg/m2 (range 19.8–30.2 kg/m2). Mean energy and protein intakes at lunch when treated with placebo were 1343 kcal (95% CI: 1215–1471 kcal) and 74 g (95% CI: 66–81 g), respectively. Energy and protein intakes were not significantly different from placebo for either treatment (p = 0.918 and p = 0.319 for UL-2-141; p = 0.889 and p = 0.959 for MF1145). Similarly, appetite, hunger and satiety responses on VAS were not significantly different from placebo for either treatment. The post-lunch AG peak on placebo was 653 pg/ml (95% CI: 511–794 pg/ml). Treatment with UL-2-141 resulted in a 139 pg/ml reduction in post-prandial AG compared to placebo, and treatment with MF1145 in a 114 pg/ml reduction. This pattern was significant for both treatments (p = 0.021 and p = 0.045, respectively); however, when controlling for fasting AG, it was no longer significant (p = 0.590 and p = 0.877, respectively). Pre-prandial AG peaks were not significantly different across treatments.
While these peptides have previously demonstrated ghrelinergic effects in rats, this study identified no effect on appetite or food intake in humans. This may be attributable to the small sample size or low dose. However, since healthy adults are often not in tune with their own physiological hunger, they may not respond strongly to simple physiological modulators, and repeating the study in subjects with established anorexia may be prudent.
Over the past decade, the World Health Summit (WHS) has provided a global platform for policy-makers and decision-makers to interact with academics and practitioners on global health. Recently, the WHS adopted health security into its agenda for transnational disease risks (eg, Ebola and antimicrobial resistance) that increasingly threaten multiple sectors. Global health engagement (GHE) focuses efforts across interdisciplinary and interorganizational lines to identify critical threats and provide rapid deployment of key resources at the right time for addressing health security risks. As a product of subject matter experts convening at the WHS, a special side-group has organically arisen, with leadership and coordination from the German Institute for Defense and Strategic Studies, in support of GHE activities across governmental, academic, and industry partners. Through novel approaches and targeted methodology that maximize outcomes and streamline global health operational processes, the Global Health Security Alliance (GloHSA) was born. This short conference report describes the GloHSA in more detail.
Blockchain is a distributed ledger technology for storing and transmitting information (value) that is secure, verifiable, and auditable. Two specific use-case opportunities exist: identity management and payment systems.
A secure and auditable solution for disaster refugee support.
Gap analysis, literature search, and synthesis using existing technologies.
Strategy foundation: A blockchain identity management system that utilizes the Hyperledger Fabric framework; identification on a large scale, in a distributed model that provides immutable record capabilities to prevent fraud, with the ability to incorporate biometrics and DNA; deploy applications that will provide supply-chain capabilities; cryptocurrency for recipients and other relief functions for refugees/disaster victims; components such as consensus, membership services, and Smart Contracts; cloud-based, with redundancies across multiple vendors and additional complex government cloud requirements/certifications, leveraging NIST 800-53 by utilizing a hybrid public-permissioned architecture.
There are an estimated 68 million refugees worldwide at any time. Valid identification is needed by most refugees to qualify for government or international donor relief. That identification is crucial in getting refugees and victims access to the aid supply chain. Blockchain stores data on a large number of computer nodes connected over the Internet. Each node contains an identical copy that is time-stamped and protected by a cryptographic technique called hashing, and control is decentralized. This blockchain strategy will revolutionize the way the government manages the $30 billion in foreign aid to refugees. It will build upon the identities established to deploy applications that will provide supply-chain capabilities, cryptocurrency for recipients, and other relief functions for refugees/disaster victims. Stakeholders beyond government will also benefit tremendously. The distributed nature of our application will provide visibility to NGOs, nonprofits, host nation stakeholders, and other relief organizations. A single system that provides information to everyone involved will almost instantaneously change the face of relief.
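The tamper-evidence property described above (identical time-stamped copies linked by hashing) can be illustrated with a minimal hash-chain sketch in Python. This is a toy illustration of the concept only, not the Hyperledger Fabric implementation; the record fields are invented for the example.

```python
import hashlib
import json
import time

def block_hash(body):
    # Deterministic SHA-256 over the block's contents
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def add_block(chain, record):
    # Each block stores the previous block's hash, forming the chain
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"timestamp": time.time(), "record": record, "prev_hash": prev}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return chain

def verify(chain):
    # Tampering with any earlier block breaks its hash and every later link
    for i, b in enumerate(chain):
        body = {k: v for k, v in b.items() if k != "hash"}
        if b["hash"] != block_hash(body):
            return False
        if i > 0 and b["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, {"refugee_id": "R-001", "aid": "food parcel"})  # hypothetical record
add_block(chain, {"refugee_id": "R-002", "aid": "medical kit"})
assert verify(chain)
chain[0]["record"]["aid"] = "cash"   # tamper with an early record
assert not verify(chain)             # verification now fails
```

In a production permissioned ledger the same guarantee is provided by consensus across nodes rather than by a single local check, but the hash-linking shown here is the underlying mechanism.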
Use of Point-of-Care Ultrasound (US) has grown considerably in resource-limited and wilderness environments because of a combination of features, including portability, durability, and safety. However, the optimal method of powering US devices in such environments is not well established.
This project has the following aims:
1. Develop a solar power generation and storage system that maximizes power capacity and minimizes weight while being easily transportable by a single person.
2. Test the system in a real-world environment to evaluate actual performance relative to stated performance.
3. Determine the approximate US scan-time where solar systems would outperform pre-charged batteries with respect to weight.
We developed multiple solar collecting systems using a combination of polycrystalline, monocrystalline, and thin-film solar arrays paired with different powerbanks, and tested them with a variety of US systems. From this, we calculated the duration of usage beyond which a solar power generation system becomes a superior option to pre-charged batteries.
Lithium-ion energy storage was found to be superior to lead-acid batteries for multiple reasons, most prominently weight. Several models of US system were tested, revealing that portable US systems consume between 30 and 50 watts. Tri-fold monocrystalline solar panels coupled with lithium-ion powerbanks provided the best combination of weight and transportability. The total weight of the combined solar array, powerbank, and US system is 10 kilograms, and it easily packs into a backpack carrier. Systems with solar generating capacity were found to become superior to pre-charged powerbanks with respect to weight at approximately 14 hours of scanning time.
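The weight break-even behind the ~14-hour figure can be sketched as a simple calculation: solar wins once the stack of pre-charged batteries needed for the mission would outweigh the fixed-weight solar system. The device power below is the mid-range of the reported 30–50 W draw; the solar-system weight and battery specific energy are illustrative assumptions chosen to land near the reported crossover, not measured values from the study.

```python
# Break-even scan time: pre-charged batteries vs a solar generation system.
# Assumed values (not the study's measurements):
US_POWER_W = 40            # mid-range of the reported 30-50 W draw
BATTERY_WH_PER_KG = 150    # typical lithium-ion specific energy (assumed)
SOLAR_SYSTEM_KG = 3.7      # fixed weight of panel + buffer powerbank (assumed)

def battery_weight_kg(scan_hours):
    # Weight of pre-charged lithium-ion packs for a given total scan time
    return scan_hours * US_POWER_W / BATTERY_WH_PER_KG

def breakeven_hours():
    # Scan time at which the battery stack equals the solar system's weight
    return SOLAR_SYSTEM_KG * BATTERY_WH_PER_KG / US_POWER_W

print(f"batteries for 14 h of scanning: {battery_weight_kg(14):.1f} kg")
print(f"break-even scan time: {breakeven_hours():.1f} h")
```

Beyond the break-even duration, every additional hour of scanning adds battery weight but no solar weight, which is why solar dominates for extended deployments.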
While these results are not fully generalizable due to seasonal and geographic variability as well as the type of US system used, use of solar generating capacity to power US systems is optimal for extended durations of use in resource-limited environments.
Examination gloves have previously been noted as a possible barrier to hand hygiene. We performed a prospective quantitative and qualitative study to investigate this. Glove usage was found to be a potential barrier to hand hygiene; this was driven by a desire for personal safety and was potentially learned during professional training.
We sought to assess the universal salt iodization (USI) strategy in Armenia by characterizing dietary iodine intake from naturally occurring iodine, salt-derived iodine in processed foods and salt-derived iodine in household-prepared foods.
Using a cross-sectional cluster survey model, we collected urine samples which were analysed for iodine and sodium concentrations (UIC and UNaC) and household salt samples which were analysed for iodine concentration (SI). SI and UNaC data were used as explanatory variables in multiple linear regression analyses with UIC as dependent variable, and the regression parameters were used to estimate the iodine intake sources attributable to native iodine and iodine from salt in processed foods and household salt.
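The regression-based attribution can be sketched as follows. This toy Python example uses synthetic data with invented coefficients, not the survey data; it only illustrates how the intercept and the SI and UNaC terms of such a model decompose mean UIC into native-iodine, household-salt and processed-food-salt sources.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
SI = rng.uniform(20, 50, n)        # household salt iodine, mg/kg (synthetic)
UNaC = rng.uniform(50, 250, n)     # urinary sodium concentration (synthetic)
# Synthetic UIC with known coefficients; the intercept plays the role of
# natively occurring dietary iodine
UIC = 60 + 2.0 * SI + 0.3 * UNaC + rng.normal(0, 10, n)

# Multiple linear regression: UIC ~ intercept + SI + UNaC
X = np.column_stack([np.ones(n), SI, UNaC])
beta, *_ = np.linalg.lstsq(X, UIC, rcond=None)
b0, b1, b2 = beta

# Decompose mean UIC into the three attributable sources
parts = {
    "native iodine": b0,
    "household salt iodine": b1 * SI.mean(),
    "processed-food salt iodine": b2 * UNaC.mean(),
}
total = sum(parts.values())
for src, val in parts.items():
    print(f"{src}: {100 * val / total:.0f}%")
```

Because OLS with an intercept passes through the means, the three parts sum exactly to mean UIC, which is what makes this decomposition into percentage contributions well defined.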
Armenia is naturally iodine deficient; in 2004, the government mandated a USI strategy.
We recruited school-age children (SAC), pregnant women (PW) and non-pregnant women of reproductive age (WRA).
From thirteen sites covering all provinces, sufficient urine and table salt samples were obtained from 312 SAC, 311 PW and 332 WRA. Findings revealed significant differences between groups: the contribution of native iodine ranged from 81% in PW to 46% in SAC, while household salt-derived iodine contributed from 19% in SAC to 1% in PW.
Differences between groups may reflect differences in diet. In all groups, household and processed food salt constituted a significant part of total iodine intake, highlighting the success and importance of USI in ensuring iodine sufficiency. There appears to be leeway to reduce salt intake without adversely affecting the iodine status of the population in Armenia.
Nitrogen (N) is a difficult nutrient to manage in organic farming systems, and yield reductions related to N deficiency have been reported in organic systems. Legume-based cover crops offer opportunities for biologically fixed N; however, improved quantification of N contribution is needed for cost-effective N management. A 2-yr experiment was conducted near Corvallis, OR, USA, in 2007 and 2008 to (1) evaluate biomass production and N accumulation from selected cover crop treatments, (2) compare the effects of fall-planted cover crops on broccoli [Brassica oleracea L. (Italica group)] yield, and (3) estimate the quantity of feather meal-N replaced by cover crops. Cover crop treatments included common vetch (Vicia sativa L.), phacelia (Phacelia tanacetifolia Benth.), oats (Avena sativa L.), the mixtures phacelia plus vetch and oats plus vetch, and a no-cover-crop (fallow) treatment as the control. Using feather meal as an N source, four rates of N fertilizer (0, 100, 200 and 300 kg N ha−1) were randomized within each cover crop treatment in a randomized split-plot design. Cover crop biomass and N accumulation differed between the 2 yr of the study. In 2007, total biomass accumulation ranged from 5000 to 10,000 kg ha−1, whereas in 2008, cover crop accumulation was 1500 to 5000 kg ha−1. Biomass of both phacelia and vetch (in mixtures or as sole crops) was reduced by 80% from 2007 to 2008, whereas oat biomass and weed biomass in the fallow plots were reduced by only 40% between the 2 yr. The accumulation of N was also reduced in 2008, with vetch (either as a sole crop or in mixtures) contributing less than a third of the total N produced in 2007. In 2007, vetch and vetch-based cover crop mixtures increased broccoli yield compared with the fallow, providing 100–135 kg fertilizer-equivalent N ha−1. But owing to decreased cover crop biomass and N accumulation in 2008, vetch and vetch-based mixtures failed to increase broccoli yield, providing <20 kg N ha−1 fertilizer equivalence.
In 2007, oats grown as a sole cover crop reduced broccoli yield when no supplemental N was applied. In 2008, both phacelia and oats reduced broccoli yield at all N levels, with estimated N fertilizer equivalence values of −80 to −95 kg N ha−1. Although legume and legume mixtures increased broccoli yield in only 1 yr of the experiment, addition of vetch to the mixtures reduced yield loss in both years compared with oats and phacelia grown as sole crops.
Seabird bycatch is widely regarded as the greatest threat globally to procellariiform seabirds. Although measures to reduce seabird–fishery interactions have been in existence for many years, uptake in fleets with high risk profiles remains variable. We recorded seabird bycatch and other interactions in the Namibian demersal longline fishery. Interaction rates were estimated for seasonal and spatial strata and scaled up to fishing effort data. Bycatch rates were 0.77 (95% CI 0.24–1.39) and 0.37 (95% CI 0.11–0.72) birds per 1,000 hooks in winter and summer, respectively. Scaling up to 2010, the most recent year for which complete data are available, suggests 20,567 (95% CI 6,328–37,935) birds were killed in this fishery that year. We compared bycatch rates to those from experimental fishing sets using mitigation measures (one or two bird-scaring lines and the replacement of standard concrete weights with 5 kg steel weights). All mitigation measures significantly reduced the bycatch rate. This study confirms the Namibian longline fishery has some of the highest known impacts on seabirds globally, but implementing simple measures could rapidly reduce those impacts. In November 2015 the Ministry of Fisheries and Marine Resources introduced regulations requiring the use of bird-scaring lines, line weighting and night setting in this fishery. A collaborative approach between NGOs, industry and government was important in achieving wide understanding and acceptance of the proposed mitigation measures in the lead up to the introduction of new fishery regulations.
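The scale-up from stratum-specific bycatch rates to a total mortality estimate is a straightforward calculation that can be sketched as follows. The per-1,000-hook rates are those reported above; the per-stratum hook counts are hypothetical placeholders, since the fishery's actual 2010 effort data are not given here.

```python
# Scale stratum-specific bycatch rates up to total fishing effort.
RATES = {"winter": 0.77, "summer": 0.37}   # birds per 1,000 hooks (reported)
EFFORT_HOOKS = {                           # hooks set per stratum (assumed)
    "winter": 18_000_000,
    "summer": 12_000_000,
}

def estimated_mortality(rates, effort):
    # Sum over strata: (birds per hook) x (hooks set in that stratum)
    return sum(rates[s] / 1000 * effort[s] for s in rates)

print(f"estimated birds killed: {estimated_mortality(RATES, EFFORT_HOOKS):,.0f}")
```

In practice the confidence interval on the total is obtained by propagating the uncertainty in each stratum's rate (e.g., by bootstrapping observed sets) rather than from the point estimates alone.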
Public agencies at all levels of government and other organizations that manage archaeological resources often face the problem of many undertakings that collectively impact large numbers of individually significant archaeological resources. Such situations arise when an agency is managing a large area, such as a national forest, land management district, park unit, wildlife refuge, or military installation. These situations also may arise in regard to large-scale development projects, such as energy developments, highways, reservoirs, transmission lines, and other major infrastructure projects that cover substantial areas. Over time, the accumulation of impacts from small-scale projects to individual archaeological resources may degrade landscape- or regional-scale cultural phenomena. Typically, these impacts are mitigated at the site level without regard to how the impacts to individual resources affect the broader population of resources. Actions to mitigate impacts are rarely designed to do more than avoid resources or ensure some level of data recovery at single sites. Such mitigation activities are incapable of addressing research questions at a landscape or regional scale.
We identified several factors affecting the use of quaternary ammonium-based (Quat) disinfectants in our facility. Microfiber wipers, cotton towels, and 1 of 2 types of disposable wipes soaked in a Quat disinfectant showed significant binding of the disinfectant. Concentrations of Quat delivered by automated disinfectant dispensers varied widely.
Infect. Control Hosp. Epidemiol. 2016;37(3):340–342
This paper employs survey experiments to examine how contextualizing the claims made in negative political advertising affects perceptions of their fairness. This has implications for the components of fairness judgments (e.g., if “truth” is a component of fairness, being informed that a claim is untrue should undermine perceptions of its fairness), as well as for the efficacy of “fact-checking.” Our experiments on a random national telephone sample show some effects of being informed that a claim is untrue, but few when it is characterized as taken out of context or as irrelevant. These findings imply that: (a) while evaluations of the truth of claims appear to be a component of fairness, considerations such as whether claims are the “whole story” or “relevant” to the decision at hand do not, and (b) contextualizing the claims of ads in fact-checks has very little impact on perceptions of their fairness.
The public health burden of alcohol is unevenly distributed across the life course, with levels of use, abuse, and dependence increasing across adolescence and peaking in early adulthood. Here, we leverage this temporal patterning to search for common genetic variants predicting developmental trajectories of alcohol consumption. Comparable psychiatric evaluations measuring alcohol consumption were collected in three longitudinal community samples (N = 2,126, obs = 12,166). Repeated consumption measurements spanning adolescence and early adulthood were analyzed using linear mixed models, estimating individual consumption trajectories, which were then tested for association with Illumina 660W-Quad genotype data (866,099 SNPs after imputation and QC). Association results were combined across samples using standard meta-analysis methods. Four meta-analysis associations satisfied our pre-determined genome-wide significance criterion (FDR < 0.1) and six others met our ‘suggestive’ criterion (FDR < 0.2). Genome-wide significant associations were highly biologically plausible, including associations within the GABA transporter 1 gene, SLC6A1 (solute carrier family 6, member 1), and exonic hits in LOC100129340 (mitofusin-1-like). Pathway analyses elaborated the single-marker results, indicating significantly enriched associations with intuitive biological mechanisms, including neurotransmission, xenobiotic pharmacodynamics, and nuclear hormone receptors (NHR). These findings underscore the value of combining longitudinal behavioral data and genome-wide genotype information to study developmental patterns and improve statistical power in genomic studies.
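The two-stage design (estimate each individual's consumption trajectory, then test trajectories for genetic association) can be sketched with synthetic data. This toy example replaces the study's linear mixed models and meta-analysis with per-subject least-squares slopes and a single-SNP regression; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_obs = 200, 6
ages = np.linspace(14, 24, n_obs)        # adolescence to early adulthood
geno = rng.integers(0, 3, n_subj)        # SNP dosage 0/1/2 (synthetic)

# Simulate consumption with a genotype-dependent growth rate (+0.3 per allele)
true_slope = 1.0 + 0.3 * geno
cons = (true_slope[:, None] * (ages - ages[0])
        + rng.normal(0, 1.5, (n_subj, n_obs)))

# Stage 1: per-subject trajectory (slope of consumption on age);
# a simple stand-in for the random slopes a linear mixed model would estimate
slopes = np.array([np.polyfit(ages, y, 1)[0] for y in cons])

# Stage 2: regress the trajectory measure on genotype dosage (OLS slope)
g = geno - geno.mean()
assoc = (g @ (slopes - slopes.mean())) / (g @ g)
print(f"estimated effect per allele: {assoc:.2f}")
```

A full analysis would fit both stages jointly in a mixed model (so that trajectory uncertainty propagates into the association test) and combine the per-sample results by meta-analysis, but the two-stage logic is the same.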
Our understanding of the range of exhibition spaces for Victorian popular science, and the correspondingly diverse experiences they provided for audiences, has undergone significant modification over the last ten years. Whilst metropolitan establishments such as the Royal Polytechnic Institution, Egyptian Hall, Wyld's Great Globe and Royal Adelaide Gallery probably remain the best-known examples of institutions providing a potent mixture of instruction, amusement and spectacle, a wealth of recent scholarship has demonstrated that there was also an increasing variety of alternative exhibition spaces that have for too long been in the shadow of these large and iconic London establishments. A trip to the Polytechnic or Egyptian Hall might well form part of the itinerary of a visitor to London; for most, though, such trips were the exception rather than the rule. Two key research trends have driven this emergence of a more nuanced picture of Victorian popular science. The first is a move away from studies of the metropolis towards an exploration of the provision of lectures, demonstrations, classes and exhibitions in British provincial towns and cities. For inhabitants of towns and cities beyond London (and potentially for many Londoners as well), a lecture at the local mechanics' institute, a magic lantern show of natural history given as a Sunday school treat, the hullabaloo of a travelling menagerie arriving in the marketplace, or a freak show at the annual fair were more likely to have constituted their experiences of popular science.