Here, we develop and characterize high thermal conductivity/high thermal shock-resistant bulk Ce-doped Al2O3 and propose it as a new phosphor converting capping layer for high-powered/high-brightness solid-state white lighting (SSWL). The bulk, dense Ce:Al2O3 ceramics have a 0.5 at.% Ce:Al concentration (significantly higher than the equilibrium solubility limit) and were produced using a simultaneous solid-state reactive current activated pressure-assisted densification (CAPAD) approach. Ce:Al2O3 exhibits a broadband emission from 400 to 600 nm, which encompasses the entire blue and green portions of the visible spectrum when pumped with ultraviolet (UV) light that is now commercially available in UV light–emitting devices and laser diodes (LD). These broadband phosphors can be used in the commonly used scheme of mixing with other UV-converting capping layers that emit red light to produce white light. Alternatively, they can be used in a novel composite down-converter approach that ensures improved thermal–mechanical properties of the converting phosphor capping layer. In this configuration, Ce:Al2O3 is used with proven phosphor conversion materials such as Ce:YAG as an active encapsulant or as a capping layer to produce SSWL with an improved bandwidth in the blue portion of the visible spectrum. To study the effect of crystallinity on the Ce photoluminescent (PL) emission, we synthesize Ce:YAG ceramics using high-pressure CAPAD at moderate temperatures to obtain varying crystallinity (amorphous through fully crystalline). We investigate the PL characteristics of Ce:Al2O3 and Ce:YAG from 295 to 4 K, revealing unique crystal field effects from the matrix on the Ce dopants. The unique PL properties in conjunction with the superior thermal–mechanical properties of Ce:Al2O3 can be used in high-powered/high-brightness–integrated devices based on high-efficiency UV-LD that do not suffer efficiency droop at high drive currents to pump the solid-state capping phosphors.
This paper focuses on the problem of skin corrosion on the upper wing surfaces of rib-stiffened aircraft. For maritime and military transport aircraft, this often results in multiple co-located repairs. The common approach to corrosion damage in operational aircraft is to blend out the corrosion and rivet a mechanical doubler over the region. In particular, this paper describes the results of a combined numerical and experimental investigation into the ability of an additive metal technology, Supersonic Particle Deposition (SPD), to restore the load-carrying capacity of rib-stiffened wing planks with simulated skin corrosion. The experimental results reveal that unrepaired skin corrosion can result in failure by yielding. They also reveal that SPD repairs to skin corrosion can restore the stress field in the structure and can ensure that the load-carrying capability of the repaired structure remains above proof load.
In September 2015, Volkswagen's "clean diesel" technology was exposed as a sham. Not only were the company's vehicles discharging dangerously high levels of nitrogen oxide, but VW had intentionally rigged its emissions systems to cheat on environmental tests. In the wake of resignations and criminal investigations, the company's governance system came under justifiable attack. Were VW's famously worker-friendly governance policies to blame? This Chapter examines the root causes of the emissions scandal and concludes that VW's governance culture suffered from dictatorial leadership as well as a cozy relationship between management and labor leaders. This culture of complacency led to a lack of accountability at key levels, including executives, shareholders, and regulators. In addition, despite its worker-oriented governance structure, Volkswagen's internal management is still organized along traditional hierarchical lines. Empowered workers, participating at all levels of company governance, would provide a stronger internal culture of compliance, innovation, and sustainability.
Alcohol and cannabis remain the substances most widely used by adolescents. Better understanding of the dynamic relationship between trajectories of substance use in relation to neuropsychological functioning is needed. The aim of this study was to examine the different impacts of within- and between-person changes in alcohol and cannabis use on neuropsychological functioning over multiple time points.
Hierarchical linear modeling examined the effects of alcohol and cannabis use on neuropsychological functioning over the course of 14 years in a sample of 175 adolescents (aged 12–15 years at baseline).
Time-specific fluctuations in alcohol use (within-person effect) predicted worse performance across time on the Wechsler Abbreviated Scale of Intelligence Block Design subtest (B = −.05, SE = .02, p = .01). Greater mean levels of percent days of cannabis use across time (between-person effect) were associated with an increased contrast score between Delis–Kaplan Executive Function System Color Word Inhibition and Color Naming conditions (B = .52, SE = .14, p < .0001) and poorer performance over time on Block Design (B = −.08, SE = .04, p = .03). Neither alcohol nor cannabis use over time was associated with performance in the verbal memory and processing speed domains.
Greater cumulative cannabis use over adolescence may be linked to poorer inhibitory control and visuospatial functioning performance, whereas more proximal increases in alcohol consumption during adolescence may drive alcohol-related performance decrements in visuospatial functioning. Results from this prospective study add to the growing body of literature on the impact of alcohol and cannabis use on cognition from adolescence to young adulthood.
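The distinction above between within-person effects (time-specific fluctuations around an individual's own average) and between-person effects (differences in individuals' averages) is typically operationalized by person-mean centering the repeated-measures predictor before fitting the hierarchical linear model. The sketch below is a minimal illustration of that decomposition, assuming simple person-mean centering; the data and variable names are illustrative and not taken from the study.

```python
# Minimal sketch of the within-/between-person decomposition used in
# hierarchical linear modeling: each repeated measure is split into a
# person mean (between-person component) and per-occasion deviations
# from that mean (within-person component).

def decompose(observations):
    """observations: dict mapping person id -> list of repeated measures.
    Returns (between, within): the person means, and each occasion's
    deviation from that person's own mean."""
    between, within = {}, {}
    for pid, values in observations.items():
        mean = sum(values) / len(values)
        between[pid] = mean
        within[pid] = [v - mean for v in values]
    return between, within

# Hypothetical example: two people measured on three occasions.
use = {"p1": [0.0, 2.0, 4.0], "p2": [10.0, 10.0, 10.0]}
b, w = decompose(use)
print(b["p1"])  # between-person component (person mean): 2.0
print(w["p1"])  # within-person deviations: [-2.0, 0.0, 2.0]
```

Entering both components as separate predictors lets the model estimate, as in the abstract above, whether deviations from one's own average use (within-person) and one's average level of use (between-person) have distinct associations with the outcome.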
The United States has a multidimensional set of employment law protections. From minimum wage and health and safety standards to antidiscrimination and antiretaliation protections, the law provides specific standards and structures to shield workers from egregious employer behavior and to remedy the harms inflicted. These mandatory protections dovetail with the organizational power that labor law is intended to confer. The National Labor Relations Act (NLRA) provides for worker representation and obligates employers to bargain with those representatives over terms and conditions of employment. And labor law professors have marveled at the spare commands of the NLRA and the depth of the Board’s interpretive nuance, as refined over 80 years.
Both blood- and milk-based biomarkers have been analysed for decades in research settings, although often only in one herd, and without focus on the variation in the biomarkers that is specifically related to herd or diet. Biomarkers can be used to detect physiological imbalance and disease risk and may have a role in precision livestock farming (PLF). For use in PLF, it is important to quantify normal variation in specific biomarkers and the source of this variation. The objective of this study was to estimate the between- and within-herd variation in a number of blood metabolites (β-hydroxybutyrate (BHB), non-esterified fatty acids, glucose and serum IGF-1), milk metabolites (free glucose, glucose-6-phosphate, urea, isocitrate, BHB and uric acid), milk enzymes (lactate dehydrogenase and N-acetyl-β-D-glucosaminidase (NAGase)) and composite indicators for metabolic imbalances (Physiological Imbalance-index and energy balance), to help facilitate their adoption within PLF. Blood and milk were sampled from 234 Holstein dairy cows from 6 experimental herds, each in a different European country, and offered a total of 10 different diets. Blood was sampled on 2 occasions, at approximately 14 days-in-milk (DIM) and 35 DIM. Milk samples were collected twice weekly (in total 2750 samples) from DIM 1 to 50. Multilevel random regression models were used to estimate the variance components and to calculate the intraclass correlations (ICCs). The ICCs for the milk metabolites, when adjusted for parity and DIM at sampling, demonstrated that between 12% (glucose-6-phosphate) and 46% (urea) of the variation in the metabolites’ levels could be associated with the herd-diet combination. Intraclass correlations related to the herd-diet combination were generally higher for blood metabolites, from 17% (cholesterol) to approximately 46% (BHB and urea). The high ICCs for urea suggest that this biomarker can be used for monitoring at herd level.
The low within-cow variance for NAGase indicates that few samples would be needed to describe a cow's status, and potentially a general reference value could be used. The low ICC for most of the biomarkers, together with the larger within-cow variation, emphasises that multiple samples, most likely from individual cows, would be needed to make the biomarkers useful for monitoring. The majority of biomarkers were influenced by parity and DIM, which indicates that these should be accounted for if a biomarker is to be used for monitoring.
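The ICCs reported above are the share of total biomarker variance attributable to the grouping level (here, the herd-diet combination), computed from the variance components of the multilevel models. The sketch below shows that calculation in its simplest two-level form; the variance components used are illustrative numbers, not estimates from the study.

```python
# Minimal sketch of the intraclass correlation (ICC) for a two-level
# random-intercept model: the fraction of total variance attributable
# to the grouping level (e.g., herd-diet combination) versus the
# residual (within-group) level.

def icc(var_between, var_within):
    """ICC = sigma^2_between / (sigma^2_between + sigma^2_within)."""
    return var_between / (var_between + var_within)

# Hypothetical variance components: if the herd-diet level contributes
# 0.46 units of variance and the within-herd residual 0.54 units,
# the ICC is 0.46, i.e., 46% of variation sits at the herd-diet level.
print(round(icc(0.46, 0.54), 2))
```

An ICC near the top of the reported range (like urea's ~46%) means group membership explains much of the spread, supporting herd-level monitoring; a low ICC (like glucose-6-phosphate's 12%) means most variation is within-group, so repeated sampling of individual cows is needed.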
The terms “missed injury” and “delayed diagnosis” have evolved in their academic meaning over the last several decades of trauma care. “Missed injury” is typically reserved for an unidentified injury for which the opportune moment for intervention has passed. “Delayed diagnosis” is the term given to injuries not identified on the primary or secondary survey of the initial trauma evaluation. There is obvious overlap in the ways these terms are employed throughout trauma care, and specific institutions may have their own interpretations. Many emergency medicine texts list a missed injury as one that is discovered after the patient has left the Emergency Department (ED), whether discharged home or admitted. This version of the definition would include injuries that were suspected in the ED (not truly “missed”) but not officially found, owing to appropriate delays in imaging while more acute issues were being resolved in the operating room (OR) or Intensive Care Unit (ICU). The national trauma database of the American College of Surgeons defines missed injury as an “injury-related diagnosis discovered after initial workup is completed and admission diagnosis is determined.”1 “Delayed diagnosis” was proposed to describe diagnoses that were not found on primary and secondary survey. The tertiary survey was intended to identify many of these injuries,2 though some literature still defines injuries found during the tertiary survey as “delayed.”3,4 In any case, a tertiary survey should be employed in all trauma evaluations, as it leads to a reduction in clinically significant initially unidentified injuries.5 Trauma surgery has also created leveling algorithms based on the mechanism of injury to help activate appropriate resources for trauma patients. Finally, multiple evidence-based decision tools (e.g., Ottawa knee rules, Canadian head computed tomography rules) exist to help delineate imaging decisions.
MD–PhD training programs prepare physician-scientists to pursue careers involving both clinical care and research, but decreasing numbers of physician-scientists stay engaged in clinical research. We sought to identify the clinical research training methods currently utilized by MD–PhD programs and to assess how effective they are in promoting self-efficacy for clinical research.
US MD–PhD students were surveyed in April–May 2018. Students identified the clinical research training methods they had participated in, and self-efficacy in clinical research was determined using a modified 12-item Clinical Research Appraisal Inventory.
Responses were received from 647 MD–PhD students, in all years of training, at 61 of 108 MD–PhD institutions. Reported clinical research training ranged from none to various combinations of didactics, mentored clinical research, and a clinical research practicum. Students with didactics plus mentored clinical research had self-efficacy similar to those with didactics plus a clinical research practicum. Training activities that differentiated students who did and did not have the clinical research practicum experience, and that were associated with higher self-efficacy, included exposure to Institutional Review Boards and participation in human subject recruitment.
A clinical research practicum was found to be an effective option for MD–PhD students conducting basic science research to gain experience in clinical research skills. Clinical research self-efficacy was correlated with the amount of clinical research training and specific clinical research tasks, which may inform curriculum development for a variety of clinical and translational research training programs, for example, MD–PhD, TL1, and KL2.
Spotted fever group rickettsiae (SFG) are a neglected group of bacteria, belonging to the genus Rickettsia, that represent a large number of new and emerging infectious diseases with a worldwide distribution. The diseases are zoonotic and are transmitted by arthropod vectors, mainly ticks, fleas and mites, to hosts such as wild animals. Domesticated animals and humans are accidental hosts. In Asia, local people in endemic areas as well as travellers to these regions are at high risk of infection. In this review we compare SFG molecular and serological diagnostic methods and discuss their limitations. While there is a large range of molecular diagnostics and serological assays, both approaches have limitations and a positive result is dependent on the timing of sample collection. There is an increasing need for less expensive and easy-to-use diagnostic tests. However, despite many tests being available, their lack of suitability for use in resource-limited regions is of concern, as many require technical expertise, expensive equipment and reagents. In addition, many existing diagnostic tests still require rigorous validation in the regions and populations where these tests may be used, in particular to establish coherent and worthwhile cut-offs. It is likely that the best strategy is to use a real-time quantitative polymerase chain reaction (qPCR) and immunofluorescence assay in tandem. If the specimen is collected early enough in the infection there will be no antibodies but there will be a greater chance of a PCR positive result. Conversely, when there are detectable antibodies it is less likely that there will be a positive PCR result. It is therefore extremely important that a complete medical history is provided especially the number of days of fever prior to sample collection. More effort is required to develop and validate SFG diagnostics and those of other rickettsial infections.
Sleep disturbances are prevalent in cancer patients, especially those with advanced disease. There are few published intervention studies that address sleep issues in advanced cancer patients during the course of treatment. This study assesses the impact of a multidisciplinary quality of life (QOL) intervention on subjective sleep difficulties in patients with advanced cancer.
This randomized trial investigated the comparative effects of a multidisciplinary QOL intervention (n = 54) vs. standard care (n = 63) on sleep quality in patients with advanced cancer receiving radiation therapy as a secondary endpoint. The intervention group attended six intervention sessions, while the standard care group received informational material only. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) and Epworth Sleepiness Scale (ESS), administered at baseline and weeks 4 (post-intervention), 27, and 52.
At week 4, the intervention group showed a statistically significant improvement relative to the control group in the PSQI total score and in two of its components, sleep quality and daytime dysfunction. At week 27, although both groups showed improvements in sleep measures from baseline, there were no statistically significant differences between groups in the PSQI total or component scores or in the ESS. At week 52, the intervention group used less sleep medication relative to baseline than control patients (p = 0.04) and had a lower ESS score (7.6 vs. 9.3, p = 0.03).
Significance of results
A multidisciplinary intervention to improve QOL can also improve the sleep quality of advanced cancer patients undergoing radiation therapy. Patients who completed the intervention also reported less use of sleep medication.
Microcredit – joint-liability loans to the poorest of the poor – has been touted as a powerful approach for combatting global poverty, but sustainability varies dramatically across banks. Efforts to improve the sustainability of microcredit have assumed defaults are caused by free-riding. Here, we point out that the response of other group members to delinquent groupmates also plays an important role in defaults. Even in the absence of any free-rider problem, some people will be unable to make their payments due to bad luck. It is other group members’ unwillingness to pitch in extra – due to, among other things, not wanting to have less than other group members – that leads to default. To support this argument, we utilize the Ultimatum Game (UG), a standard paradigm from behavioral economics for measuring one's aversion to inequitable outcomes. First, we show that country-level variation in microloan default rates is strongly correlated (overall r = 0.81) with country-level UG rejection rates, but not free-riding measures. We then introduce a laboratory model ‘Microloan Game’ and present evidence that defaults arise from inequity-averse individuals refusing to make up the difference when others fail to pay their fair share. This perspective suggests a suite of new approaches for combatting defaults that leverage findings on reducing UG rejections.
Field studies were conducted in 2016 and 2017 at Clinton, NC, to quantify the effects of season-long interference of large crabgrass [Digitaria sanguinalis (L.) Scop.] and Palmer amaranth (Amaranthus palmeri S. Watson) on ‘AG6536’ soybean [Glycine max (L.) Merr.]. Weed density treatments consisted of 0, 1, 2, 4, and 8 plants m−2 for A. palmeri and 0, 1, 2, 4, and 16 plants m−2 for D. sanguinalis, with (interspecific interference) and without (intraspecific interference) soybean, to determine the impacts on weed biomass, soybean biomass, and seed yield. Biomass per square meter increased with increasing weed density for both weed species, with and without soybean present. Biomass per square meter of D. sanguinalis was 617% and 37% greater when grown without soybean than with soybean for 1 and 16 plants m−2, respectively. Biomass per square meter of A. palmeri was 272% and 115% greater when grown without soybean than with soybean for 1 and 8 plants m−2, respectively. Biomass per plant for D. sanguinalis and A. palmeri grown without soybean was greatest at the 1 plant m−2 density. Biomass per plant of D. sanguinalis across measured densities was 33% to 83% greater when grown without soybean than with soybean for 1 and 16 plants m−2, respectively. Similarly, biomass per plant for A. palmeri was 56% to 74% greater when grown without soybean for 1 and 8 plants m−2, respectively. Biomass per plant of either weed species was not affected by weed density when grown with soybean, owing to interspecific competition with soybean. Yield loss for soybean grown with A. palmeri ranged from 14% to 37% for densities of 1 to 8 plants m−2, respectively, with a maximum yield loss estimate of 49%. Similarly, predicted yield loss for soybean grown with D. sanguinalis ranged from 0% to 37% for densities of 1 to 16 plants m−2, with a maximum yield loss estimate of 50%. Soybean biomass was not affected by weed species or density.
Results from these studies indicate that A. palmeri is more competitive than D. sanguinalis at lower densities, but that similar yield loss can occur when densities greater than 4 plants m−2 of either weed are present.
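The "maximum yield loss estimate" language above is characteristic of fitting yield loss as a saturating function of weed density, most commonly the rectangular hyperbola used in weed-density studies. The sketch below shows that functional form under the assumption that such a model applies; the parameter values are illustrative, not fitted values from this study.

```python
# Sketch of the rectangular-hyperbola yield-loss model commonly used in
# weed-interference studies: loss rises roughly linearly at low weed
# density, then saturates toward an asymptotic maximum.

def yield_loss(density, i, a):
    """Percent soybean yield loss at weed density `density` (plants m^-2).
    i: initial slope (% loss per plant m^-2 as density -> 0)
    a: asymptotic maximum % yield loss as density -> infinity."""
    return (i * density) / (1 + (i * density) / a)

# Illustrative parameters (not the study's estimates): a steep initial
# slope and a ~50% asymptote, in the spirit of the reported maxima.
for d in (1, 4, 16, 100):
    print(d, round(yield_loss(d, i=15.0, a=50.0), 1))
```

The shape explains the abstract's conclusion: the two species differ most in the low-density (initial-slope) regime, while at higher densities both approach similar asymptotic losses.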
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.