Viral pneumonia is an important cause of death and morbidity among infants worldwide. Household transmission of non-influenza respiratory viruses can inform preventative interventions but has not been well characterised in South Asia. From April 2011 to April 2012, household members of pregnant women enrolled in a randomised trial of influenza vaccine in rural Nepal were surveyed weekly for respiratory illness until 180 days after birth. Nasal swabs from symptomatic individuals were tested by polymerase chain reaction for respiratory viruses. A transmission event was defined as a secondary case of the same virus within 14 days of initial infection within a household. From 555 households, 825 initial viral illness episodes occurred, resulting in 79 transmission events. The overall incidence of transmission was 1.14 events per 100 person-weeks. Transmission incidence was associated with an index case aged 1–4 years (incidence rate ratio (IRR) 2.35; 95% confidence interval (CI) 1.40–3.96), coinfection as the initial infection (IRR 1.94; 95% CI 1.05–3.61) and no household electricity (IRR 2.70; 95% CI 1.41–5.00). Preventive interventions targeting preschool-age children in households in resource-limited settings may decrease the risk of transmission to vulnerable household members, such as young infants.
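The person-time incidence and rate-ratio arithmetic used in the abstract above can be sketched as follows; the person-week denominator below is a hypothetical round figure chosen so the result matches the reported rate, not a number taken from the study.

```python
# Hedged sketch of person-time incidence and IRR calculations.
# The 6,930 person-week denominator is illustrative, not study data.

def incidence_per_100_person_weeks(events, person_weeks):
    """Incidence expressed per 100 person-weeks of follow-up."""
    return 100.0 * events / person_weeks

def incidence_rate_ratio(events_a, pw_a, events_b, pw_b):
    """IRR comparing an exposed group A against a reference group B."""
    return (events_a / pw_a) / (events_b / pw_b)

# 79 transmission events over ~6,930 person-weeks reproduces the
# reported overall incidence of ~1.14 per 100 person-weeks.
overall = incidence_per_100_person_weeks(79, 6930)
print(round(overall, 2))
```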
One goal of public health campaigns is to reduce health inequalities by encouraging responsible and prudent health choices among groups that exhibit higher rates of disease, especially groups with low socio-economic status (SES).[1] In Australia, examples include Queensland Health’s ‘Deadly Choices’ campaign, which encourages members of the Indigenous community to adopt healthy practices,[2] and the national ‘Quit Now’ campaign to reduce smoking.[3] In the United States, the Centers for Disease Control and Prevention’s ‘VERB’ campaign encourages exercise and activity in at-risk and obese youth.[4] At the heart of these campaigns is a conception of the person as an autonomous being who is ultimately responsible for the decisions that impact his or her health.
Moral reasoning and decision making help guide behavior and facilitate interpersonal relationships. Accounts of morality that position commonsense psychology as the foundation of moral development (i.e., rationalist theories) have dominated research on morality in autism spectrum disorder (ASD). Given the well-documented differences in commonsense psychology among autistic individuals, researchers have investigated whether the development and execution of moral judgement and reasoning differ in this population compared with neurotypical individuals. In light of the diverse findings of investigations of moral development and reasoning in ASD, a summation and critical evaluation of the literature could help make sense of what is known about this important social-cognitive skill in ASD. To that end, we conducted a systematic review of the literature investigating moral decision making among autistic children and adults. Our search identified 29 studies. In this review, we synthesize the research in the area and provide suggestions for future research. Such research could include the application of an alternative theoretical framework to studying morality in ASD that does not assume a deficits-based perspective.
This article involved a broad search of the applied sciences for milestone technologies that we deem the most significant innovations applied by the North American pork industry during the past 10 to 12 years. Several innovations shifted the trajectory of improvement or resolved significant production limitations. Each is being integrated into practice, the exception being gene editing technology, which is undergoing the federal approval process. Advances in molecular genomics have been applied to gene editing for control of porcine reproductive and respiratory syndrome and to identifying piglet genome contributions from each parent. Post-cervical artificial insemination technology is not novel, but it is now used extensively to accelerate the rate of genetic progress. A milestone was achieved with the discovery that dietary essential fatty acids during lactation were limiting reproduction. Their provision resulted in a dose-related response for pregnancy, pregnancy maintenance and litter size, especially in maturing sows, and ultimately resolved seasonal infertility. The benefit of segregated early weaning (12 to 14 days of age) was realized for specific pathogen removal in genetic nucleus and multiplication herds. Application was premature for commercial practice, as piglet mortality and morbidity increased. Early weaning impairs intestinal barrier and mucosal innate immune development, which coincides with diminished resilience to pathogens and viability later in life. Two important milestones were achieved to improve precision nutrition for growing pigs. The first was the updated publication of the National Research Council nutrient requirements for pigs, a collaboration between scientists from America and Canada. Precision nutrition advanced further when ingredient descriptions for metabolically available amino acids and net energy (by source plant) became a private-sector nutrition product.
The past decade also led to fortuitous discoveries of health-improving components in ingredients (xylanase, soybeans). Finally, two technologies converged to facilitate timely detection of multiple pathogens in a population: oral fluids sampling and polymerase chain reaction (PCR) for pathogen analysis. Most critical diseases in North America are now routinely monitored by oral fluid sampling and prepared for analysis using PCR methods.
We consider a theoretical model for the flow of Newtonian fluid through a long flexible-walled channel which is formed from four compliant and rigid compartments arranged alternately in series. We drive the flow using a fixed upstream flux and derive a spatially one-dimensional model using a flow profile assumption. The compliant compartments of the channel are assumed subject to a large external pressure, so the system admits a highly collapsed steady state. Using both a global (linear) stability eigensolver and fully nonlinear simulations, we show that these highly collapsed steady states admit a primary global oscillatory instability similar to observations in a single channel. We also show that in some regions of the parameter space the system admits a secondary mode of instability which can interact with the primary mode and lead to significant changes in the structure of the neutral stability curves. Finally, we apply the predictions of this model to the flow of blood through the central retinal vein and examine the conditions required for the onset of self-excited oscillation. We show that the neutral stability curve of the primary mode of instability discussed above agrees well with canine experimental measurements of the onset of retinal venous pulsation, although there is a large discrepancy in the oscillation frequency.
A theoretically based relationship for the Darcy–Weisbach friction factor
for rough-bed open-channel flows is derived and discussed. The derivation procedure is based on the double averaging (in time and space) of the Navier–Stokes equation followed by repeated integration across the flow. The obtained relationship explicitly shows that the friction factor can be split into at least five additive components, due to: (i) viscous stress; (ii) turbulent stress; (iii) dispersive stress (which in turn can be subdivided into two parts, due to bed roughness and secondary currents); (iv) flow unsteadiness and non-uniformity; and (v) spatial heterogeneity of fluid stresses in a bed-parallel plane. These constitutive components account for the roughness geometry effect and highlight the significance of the turbulent and dispersive stresses in the near-bed region where their values are largest. To explore the potential of the proposed relationship, an extensive data set has been assembled by employing specially designed large-eddy simulations and laboratory experiments for a wide range of Reynolds numbers. Flows over self-affine rough boundaries, which are representative of natural and man-made surfaces, are considered. The data analysis focuses on the effects of roughness geometry (i.e. spectral slope in the bed elevation spectra), relative submergence of roughness elements and flow and roughness Reynolds numbers, all of which are found to be substantial. It is revealed that at sufficiently high Reynolds numbers the roughness-induced and secondary-currents-induced dispersive stresses may play significant roles in generating bed friction, complementing the dominant turbulent stress contribution.
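Schematically, the decomposition described above can be written as follows; the symbols are illustrative shorthand chosen here, not necessarily the paper's own notation:

```latex
% Darcy--Weisbach friction factor as a sum of additive contributions
% (notation assumed for illustration):
f = \underbrace{f_{\nu}}_{\text{viscous}}
  + \underbrace{f_{t}}_{\text{turbulent}}
  + \underbrace{f_{d}}_{\text{dispersive}}
  + \underbrace{f_{u}}_{\text{unsteadiness/non-uniformity}}
  + \underbrace{f_{h}}_{\text{stress heterogeneity}},
\qquad
f_{d} = f_{d,r} + f_{d,s},
% with f_{d,r} the bed-roughness part and f_{d,s} the
% secondary-currents part of the dispersive contribution.
```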
Introduction: Emergency Department (ED) health care professionals are responsible for providing team-based care to critically ill patients. Given this complex responsibility, simulation training is paramount. In situ simulation (ISS) has many cited benefits as a training strategy that targets on-duty staff and occurs in the actual patient environment. Several evidence-based frameworks identify staff buy-in as essential for successful ISS implementation; however, the attitudes of interdisciplinary front-line ED staff in this regard are unknown. The purpose of this study is to identify contextual trends in interdisciplinary opinions on routine ISS in the ED. Methods: A qualitative and quantitative review exploring the self-reported attitudes of interdisciplinary ED staff before, during and after the implementation of a routine ISS pilot program (5 sessions in 5 months) at the Charles V Keating Emergency and Trauma Center in Halifax from February to November 2018. Results: 149 surveys were received. Baseline support for ISS was high: 83% of respondents believed that the advantages of ISS outweigh the challenges, 47% favoured simulation in the ED relative to the sim bay (26%), and 28% were indifferent. The attitudes of direct participants in ISS were very positive, with 88% believing that the benefits outweighed the challenges after participation and 91% believing that they personally benefited from participating. A department-wide post-ISS pilot survey suggested a slight decrease in support: support for ISS dropped from 83% to 67%, a reduction that was not statistically significant (p = 0.098) but sizeable enough to warrant further investigation. Most notably, respondents reported increased support for simulation training in a simulation bay relative to ISS in the ED. Respondents still regarded simulation highly overall. Interestingly, when the results were stratified by position, staff physicians were the least positive.
Conclusion: Pre-pilot or baseline opinions of ISS were very positive, and participants all responded positively to the simulations. This study generates valuable insight into the perceptions of interdisciplinary ED staff regarding the implementation and perceived impact of routine ISS. This evidence can be used to inform future programming, though further investigation is warranted into why opinions post-intervention may have changed at the department level.
Research indicates that people suffering from obsessive compulsive disorder (OCD) possess several cognitive biases, including a tendency to over-estimate threat and avoid risk. Studies have suggested that people with OCD not only over-estimate the severity of negative events, but also under-estimate their ability to cope with such occurrences. What is less clear is whether they also miscalculate the extent to which they will be emotionally impacted by a given experience.
The aim of the current study was twofold. First, we examined whether people with OCD are especially poor at predicting their emotional responses to future events (i.e. affective forecasting). Second, we analysed the relationship between affective forecasting accuracy and risk assessment across a broad domain of behaviours.
Forty-one subjects with OCD, 42 non-anxious subjects, and 40 socially anxious subjects completed an affective forecasting task and a self-report measure of risk-taking.
Findings revealed that affective forecasting accuracy did not differ among the groups. In addition, there was little evidence that affective forecasting errors are related to how people assess risk in a variety of situations.
The results of our study suggest that affective forecasting is unlikely to contribute to the phenomenology of OCD or social anxiety disorder. However, the finding that people over-estimate the hedonic impact of negative events might have interesting implications for the treatment of OCD and other disorders treated with exposure therapy.
OBJECTIVES/SPECIFIC AIMS: The primary objective of this study was to evaluate the performance of a fracture-targeted, systemically administrable bone anabolic as a potential therapeutic for bone fracture repair. Currently, all bone fracture repair therapeutics require local administration during surgery. However, the population most in need of assistance in repairing bone fractures is often not eligible for surgery. Our goal was therefore to design an injectable therapeutic to assist in bone fracture repair and reduce invasiveness. The injectable formulation allows repeated administration of the bone anabolic and therapeutic effect throughout the entire bone fracture healing process. Targeting it to the bone fracture site reduces toxicity and increases efficacy. METHODS/STUDY POPULATION: To achieve the above objective, a bone mineral- (hydroxyapatite-) targeting oligopeptide was conjugated to the non-signaling end of an engineered parathyroid hormone-related protein fragment 1-46 with substitutions at Glu22,25, Leu23,28,31, Aib29 and Lys26,30 (ePTHrP). The negatively charged oligopeptide has been shown to target raw hydroxyapatite with remarkable specificity, while the attached PTHrP has been demonstrated to induce sustained and accelerated bone growth under control of endogenous morphogenic regulatory factors. The conjugate’s specificity arises from the fact that raw hydroxyapatite is only exposed when a bone is fractured, surgically cut, grafted, or induced to undergo accelerated remodeling. The hydroxyapatite-targeted conjugate can therefore be administered systemically (i.e. without invasive surgery or localized injection) and still accumulate on the exposed hydroxyapatite at the fracture site, where it accelerates the healing process. Murine in vivo experiments were conducted on female Swiss Webster mice (10 per group). Femoral fractures were induced with a 3-point bending device and stabilized.
Mice were dosed with 3 nmol/kg/d of targeted-ePTHrP, non-conjugated (free) ePTHrP, or saline. Following the 4-week study, fracture callus densities were measured using microCT. Canine in vivo experiments were conducted on 1-year-old male beagles, which underwent a 10 mm bilateral ulnar ostectomy. Two dogs in the treatment group and three dogs in the control group were dosed daily with targeted-ePTHrP 0.5 nmol/kg/d or saline, respectively. Dogs were x-rayed weekly for the first 6 weeks and every other week thereafter. One-tailed ANOVA followed by Dunnett’s post-hoc test was used to establish significance. All animal experiments were conducted as described in approved IACUC protocols. P<0.05 was considered significant. RESULTS/ANTICIPATED RESULTS: In the murine studies we observed a marked increase in fracture callus size and a 2-fold increase in bone deposition in the targeted-ePTHrP group over the saline group (P<0.01). A significant doubling in bone density was also observed. Fractured femurs in the targeted-ePTHrP group achieved their pre-fracture strength as early as 3 weeks, compared with 9 weeks in the saline mice, representing a 66% reduction in healing time. In the canine studies, we observed significantly greater closure of the ostectomy gap than in saline controls (P<0.05). In addition, no significant differences in weight were observed between the treatment and saline groups, and no significant difference between the control and treatment groups was found in a histological investigation of the organs. DISCUSSION/SIGNIFICANCE OF IMPACT: Although attempts have been made to develop a systemically administered therapeutic for fracture repair, i.e. teriparatide, to date no such anabolic has been approved for this use.
In these studies there was evidence that anabolic activity was occurring at the fracture site, but at a level that did not meet FDA-required end-points.[2] It is plausible that if sufficient drug were delivered to a fracture site, improved fracture repair would be possible. In previous studies, we demonstrated that fracture-specific accumulation of bone anabolics can be achieved by modifying the drug with acidic oligopeptides.[3] Here, by modifying a safe, clinically proven parathyroid hormone receptor agonist with an acidic oligopeptide, we observe improved bone deposition and strength in mice. Furthermore, when administered to canine critical-sized defect ostectomies, a more relevant and difficult model, we observe improved ostectomy closure. CLINICAL RELEVANCE: The ability to accelerate bone fracture repair is a fundamental need that has not been addressed by conventional methods. By targeting bone anabolic agents to bone fractures, we can deliver sufficient concentrations of anabolic agent to the fracture site to accelerate healing, thus avoiding surgery and the ectopic bone growth associated with locally applied bone anabolic agents.
To evaluate the long-term safety and tolerability of deutetrabenazine in patients with tardive dyskinesia (TD) at 2 years.
In the 12-week ARM-TD and AIM-TD studies, deutetrabenazine showed clinically significant improvements in Abnormal Involuntary Movement Scale scores compared with placebo, and there were low rates of overall adverse events (AEs) and discontinuations associated with deutetrabenazine.
Patients who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. Safety measures included incidence of AEs, serious AEs (SAEs), and AEs leading to withdrawal, dose reduction, or dose suspension. Exposure-adjusted incidence rates (EAIRs; incidence/patient-years) were used to compare AE frequencies for long-term treatment with those for short-term treatment (ARM-TD and AIM-TD). This analysis reports results up to 2 years (Week 106).
343 patients were enrolled (111 patients received placebo in the parent study and 232 received deutetrabenazine). There were 331.4 patient-years of exposure in this analysis. Through Week 106, EAIRs of AEs were comparable to or lower than those observed with short-term deutetrabenazine and placebo, including AEs of interest (akathisia/restlessness [long-term EAIR: 0.02; short-term EAIR range: 0–0.25], anxiety [0.09; 0.13–0.21], depression [0.09; 0.04–0.13], diarrhea [0.06; 0.06–0.34], parkinsonism [0.01; 0–0.08], somnolence/sedation [0.09; 0.06–0.81], and suicidality [0.02; 0–0.13]). The frequency of SAEs (EAIR 0.15) was similar to those observed with short-term placebo (0.33) and deutetrabenazine (range 0.06–0.33) treatment. AEs leading to withdrawal (0.08), dose reduction (0.17), and dose suspension (0.06) were uncommon.
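Exposure-adjusted incidence rates of the kind quoted above are event counts divided by cumulative patient-years of exposure; a minimal sketch follows, in which the event count is a hypothetical value consistent with the reported figures, not data from the study.

```python
# Minimal sketch of an exposure-adjusted incidence rate (EAIR),
# i.e. events per patient-year of exposure. The event count below is
# a hypothetical number chosen for illustration, not study data.

def eair(n_events, patient_years):
    """Exposure-adjusted incidence rate: events / patient-years."""
    return n_events / patient_years

# With the reported 331.4 patient-years of exposure, ~50 serious AEs
# would correspond to the quoted SAE EAIR of ~0.15.
print(round(eair(50, 331.4), 2))
```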
These results confirm the safety outcomes seen in the ARM-TD and AIM-TD parent studies, demonstrating that deutetrabenazine is well tolerated for long-term use in TD patients.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
To evaluate the long-term efficacy of deutetrabenazine in patients with tardive dyskinesia (TD) by examining response rates from baseline in Abnormal Involuntary Movement Scale (AIMS) scores. Preliminary results of the responder analysis are reported here.
In the 12-week ARM-TD and AIM-TD studies, the odds of response to deutetrabenazine treatment were higher than the odds of response to placebo at all response levels, and there were low rates of overall adverse events and discontinuations associated with deutetrabenazine.
Patients with TD who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. The cumulative proportion of AIMS responders from baseline was assessed. Response was defined as a percent improvement from baseline for each patient from 10% to 90% in 10% increments. AIMS scores were assessed by local site ratings for this analysis.
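The cumulative responder definition above (a patient is a responder at threshold t if their AIMS score improved by at least t percent from baseline, for t = 10% to 90% in 10% steps) can be sketched as follows; the scores are synthetic, not trial data.

```python
# Hedged sketch of a cumulative responder analysis. All AIMS score
# pairs below are synthetic illustrations, not study data.

def percent_improvement(baseline, current):
    """Percent improvement in AIMS total score from baseline."""
    return 100.0 * (baseline - current) / baseline

def cumulative_responders(scores, thresholds=range(10, 100, 10)):
    """Proportion of patients meeting each 10%-step improvement threshold.

    scores: list of (baseline, current) AIMS score pairs (synthetic).
    """
    improvements = [percent_improvement(b, c) for b, c in scores]
    n = len(scores)
    return {t: sum(imp >= t for imp in improvements) / n for t in thresholds}

# Four synthetic patients with 30%, 50%, 0% and 80% improvements.
rates = cumulative_responders([(10, 7), (12, 6), (8, 8), (20, 4)])
print(rates[30], rates[50])  # 0.75 0.5
```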
343 patients enrolled in the extension study (111 patients received placebo in the parent study and 232 patients received deutetrabenazine). At Week 54 (n=145; total daily dose [mean±standard error]: 38.1±0.9 mg), 63% of patients receiving deutetrabenazine achieved ≥30% response, 48% achieved ≥50% response, and 26% achieved ≥70% response. At Week 80 (n=66; total daily dose: 38.6±1.1 mg), 76% achieved ≥30% response, 59% achieved ≥50% response, and 36% achieved ≥70% response. Treatment was generally well tolerated.
Patients who received long-term treatment with deutetrabenazine achieved response rates higher than those observed in positive short-term studies, indicating clinically meaningful long-term treatment benefit.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
The aim of this study was to examine the metabolic response to feed deprivation of up to 48 h in low and high yielding lamb genotypes. It was hypothesised that Terminal sired lambs would have decreased plasma glucose and increased plasma non-esterified fatty acid (NEFA) and β-hydroxybutyrate (BHOB) concentrations in response to feed deprivation compared with Merino sired lambs. In addition, it was hypothesised that the metabolic changes due to feed deprivation would be greater in progeny of sires with breeding values for greater growth, muscling and leanness. Eighty-nine lambs (45 ewes, 44 wethers) from Merino dams with Merino or Terminal sires with a range in Australian Sheep Breeding Values (ASBVs) for post-weaning weight (PWT), post-weaning eye muscle depth and post-weaning fat depth (PFAT) were used in this experiment. Blood samples were collected via jugular cannulas every 6 h from time 0 to 48 h of feed deprivation for the determination of plasma glucose, NEFA, BHOB and lactate concentrations. From 12 to 48 h of feed deprivation, plasma glucose concentration decreased (P < 0.05) by 25% from 4.04 ± 0.032 mmol/l to 3.04 ± 0.032 mmol/l. From 6 h, NEFA concentration increased (P < 0.05) from 0.15 ± 0.021 mmol/l by almost 10-fold to 1.34 ± 0.021 mmol/l at 48 h of feed deprivation. Feed deprivation also influenced BHOB concentrations, which increased (P < 0.05) from 0.15 ± 0.010 mmol/l to 0.52 ± 0.010 mmol/l between 12 and 48 h. Merino sired lambs had an 8% greater reduction in glucose and 29% and 10% higher NEFA and BHOB responses, respectively, compared with Terminal sired lambs (P < 0.05). In Merino sired lambs, increasing PWT was also associated with an increase in glucose and a decline in NEFA and BHOB concentrations (P < 0.05). In Terminal sired lambs, increasing PFAT was associated with an increase in glucose and a decline in NEFA concentration (P < 0.05).
Contrary to the hypothesis, Merino sired lambs showed the greater metabolic response to fasting, especially with regard to fat metabolism.
Under current Australian industry pre-slaughter guidelines, lambs may be off feed for up to 48 h before slaughter. The purpose of this study was to examine what proportion of circulating metabolites at slaughter is due to stress and feed deprivation, and whether this response differs between Merino and Terminal genotypes. In addition, the effect of feed deprivation on carcass weight and meat quality was examined. Jugular blood samples were collected from 88 Merino and Terminal sired lambs at rest and at slaughter following 24, 36 and 48 h of feed deprivation, and plasma was analysed for glucose, lactate, non-esterified fatty acids (NEFA) and β-hydroxybutyrate (BHOB). From the same carcasses, hot carcass weight (HCWT) was measured, along with a suite of meat quality traits such as M. longissimus lumborum (loin) and M. semitendinosus pH at 24 h post-mortem. Loin samples were also analysed for intramuscular fat content and Warner–Bratzler shear force. Merino sired lambs had a higher NEFA response than Terminal sired lambs at slaughter after 24, 36 and 48 h of feed deprivation, with NEFA levels up to 35% higher than previously reported in the same animals at rest under animal house conditions, whereas the BHOB response to feed deprivation was not affected by sire type (P>0.05) and was similar to that previously reported at rest. In addition to the metabolic effects, increasing feed deprivation beyond 36 h was associated with a 3% reduction in HCWT and dressing percentage, as well as increased ultimate pH in the M. semitendinosus of Merino sired lambs. Findings from this study demonstrate that Merino and Terminal sired lambs differ in their metabolic response to feed deprivation under commercial slaughter conditions. In addition, commercial feed deprivation appears to have a negative effect on ultimate pH and carcass weight, and warrants further investigation.
The choice of animal-based traits to identify and deal with production diseases is often a challenge for pig farmers, researchers and other related professionals. This systematic review focused on production diseases, that is, the diseases that arise from management practices, affecting the digestive, locomotory and respiratory systems of pigs. The aim was to classify all traits that have been measured and to conduct a meta-analysis quantifying the impact of diseases on these traits so that they can be used as indicators for intervention. Data were extracted from 67 peer-reviewed publications selected from 2339 records. Traits were classified as productive (performance and carcass composition), behavioural, biochemical and molecular traits. A meta-analysis based on mixed models was performed on traits assessed more than five times across studies, using the metafor package in R. A total of 524 unique traits were recorded 1 to 31 times in a variety of sample materials, including blood, muscle, articular cartilage, bone or the whole animal. No behavioural traits were recorded in the included experiments. Only 14 traits were measured on more than five occasions across studies. Traits within the biochemical, molecular and productive trait groups were reported most frequently in the published literature and were most affected by production diseases; among these were some cytokines (interleukin (IL) 1-β, IL6, IL8 and tumour necrosis factor-α), acute phase proteins (haptoglobin) and daily weight gain. Quantification of the influence of factors relating to animal characteristics or husbandry practices was not possible, owing to the low frequency of reporting throughout the literature. To conclude, this study has permitted a holistic assessment of traits measured in the published literature to study production diseases occurring at various stages of the production cycle of pigs.
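The random-effects pooling that packages such as metafor perform can be sketched with a DerSimonian–Laird estimator; the effect sizes and variances below are synthetic illustrations, not values from the review.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effect sizes under a random-effects model using the
    DerSimonian-Laird between-study variance estimator."""
    w = [1.0 / v for v in variances]                       # inverse-variance weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se

# Synthetic standardized mean differences (e.g. haptoglobin in diseased
# vs. healthy pigs) from three hypothetical studies.
pooled, se = dersimonian_laird([0.5, 1.2, 0.2], [0.04, 0.09, 0.05])
print(round(pooled, 2), round(se, 2))
```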
It shows the lack of consensus on, and of common measurements of, traits for characterising production diseases within the scientific literature. Specific traits, most of them relating to performance characteristics or the immunological response of pigs, are proposed for further study as potential tools for the prognosis and study of production diseases.
Many studies have identified changes in the brain associated with obsessive–compulsive disorder (OCD), but few have examined the relationship between genetic determinants of OCD and brain variation.
We present the first genome-wide investigation of overlapping genetic risk for OCD and genetic influences on subcortical brain structures.
Using single nucleotide polymorphism effect concordance analysis, we measured genetic overlap between the first genome-wide association study (GWAS) of OCD (1465 participants with OCD, 5557 controls) and recent GWASs of eight subcortical brain volumes (13 171 participants).
We found evidence of significant positive concordance between OCD risk variants and variants associated with greater nucleus accumbens and putamen volumes. When conditioning OCD risk variants on brain volume, variants influencing putamen, amygdala and thalamus volumes were associated with risk for OCD.
These results are consistent with current OCD neurocircuitry models. Further evidence will clarify the relationship between putamen volume and OCD risk, and the roles of the detected variants in this disorder.
Declaration of interest
The authors have declared that no competing interests exist.
Background: Graduating residents require general palliative care skills. In Canada, there is no standardized palliative care curriculum for specialty trained residents. The objective of this research is to develop an evidence-based palliative care curriculum designed to provide neurology residents with the general palliative care skills required for providing patient care along the continuum of life. Methods: A needs assessment was performed in Neurology at Western University using qualitative analysis techniques. Residents completed the Kolb learning style inventory (LSI), a knowledge pre-test and the Palliative Medicine Comfort and Confidence Survey. A curricular outline was developed based on these results and a review of the literature. Two iterations of the curriculum have been developed. Results: Residents identified a need for additional training in supportive and palliative care skills. Based on the Kolb LSI, 9/16 (56.3%) of neurology residents are “accommodators”. General principles identified for inclusion were: symptom management, communication, psychosocial aspects of care, care coordination and access, and myths and pitfalls in palliative care. Conclusions: This project is designed to identify the current palliative educational needs of Neurology residents. The results suggest that specialty trained residents are receptive to embedding training in the principles of palliative care within their training programs.
British Anglo-Catholic and high church Anglicans promoted a new set of foreign missionary initiatives in the Pacific and South and East Africa in the 1860s. Theorizing new indigenizing models for mission inspired by Tractarian medievalism, the initiatives envisioned a different and better engagement with ‘native’ cultures. Despite setbacks, the continued use of Anglican sisters in Hawai‘i and brothers in Melanesia, Africa and India created a potent new imaginative space for missionary endeavour, but one problematized by the uneven reach of empire: from contested, as in the Pacific, to normal and pervasive, as in India. Of particular relevance was the Sandwich Islands mission, invited by the Hawaiian crown, where Bishop T. N. Staley arrived in 1862, followed by Anglican missionary sisters in 1864. Immensely controversial in Britain and America, where among evangelicals in particular suspicion of ‘popish’ religious practice ran high, Anglo-Catholic methods and religious communities mobilized discussion, denunciation and reaction. Particularly in the contested imperial space of an independent indigenous monarchy, Anglo-Catholics criticized what they styled the cruel austerities of evangelical American ‘puritanism’ and the ambitions of American imperialists; in the process they catalyzed a reconceptualized imperial reformism with important implications for the shape of the late Victorian British empire.