Language and cognitive impairments are common consequences of stroke. These difficulties persist: 60% of stroke survivors continue to experience memory problems, 50% attention deficits and 61% communication problems long after stroke onset. Such deficits are ‘invisible’ – evident only through patient report, behavioural observation or formal assessment. Their impacts are considerable and can include prolonged hospital stays, poorer functional recovery and reduced quality of life. Effective and timely rehabilitation of language (auditory comprehension, expressive language, reading and writing) and cognitive abilities (memory, attention, spatial awareness, perception and executive function) is crucial to optimise recovery after stroke. In this chapter we review the current evidence base and relevant clinical guidelines relating to language and cognitive impairments, and consider the implications for stroke rehabilitation practice and future research. Speech and language therapy offers benefit to people with aphasia after stroke; intensive intervention, if tolerated, likely augments the benefits. Interventions for deficits in all non-language cognitive domains exist, but need refining and evaluating more thoroughly with a wider range of methodologies.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
The potential effectiveness of harvest weed seed control (HWSC) systems depends on seed shatter of the target weed species at crop maturity, enabling collection and processing of the seed at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
External urinary collection devices (EUCDs) may reduce indwelling catheter usage and catheter-associated urinary tract infections (CAUTIs). In this retrospective quasi-experimental study, we demonstrated that EUCD implementation in women was associated with significantly decreased indwelling catheter usage and a trend (P = .10) toward decreased CAUTI per 1,000 patient days.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
To scale out an experiential teaching kitchen in Parks and Recreation centres’ after-school programming in a large urban setting among predominantly low-income, minority children.
We evaluated the implementation of a skills-based, experiential teaching kitchen to gauge programme success. Effectiveness outcomes included pre–post measures of child-reported cooking self-efficacy, attitudes towards cooking, fruit and vegetable preference, intention to eat fruits and vegetables and willingness to try new fruits and vegetables. Process outcomes included attendance (i.e., intervention dose delivered), cost, fidelity and adaptations to the intervention.
After-school programming in Parks and Recreation Community centres in Nashville, TN.
Predominantly low-income minority children aged 6–14 years.
Of the twenty-five city community centres, twenty-one successfully implemented the programme, and nineteen of twenty-five implemented seven or more of the eight planned sessions. Among children with pre–post data (n 369), mean age was 8·8 (sd 1·9) years, and 53·7 % were female. All five effectiveness measures significantly improved (P < 0·001). Attendance varied widely: 36·3 % of children attended no sessions, while 36·6 % attended at least four. Across all centres, fidelity was 97·5 %. The average food cost per serving was $1·37.
This type of nutritional education and skills building experiential teaching kitchen can be successfully implemented in a community setting with high fidelity, effectiveness and organisational alignment, while also expanding reach to low-income, underserved children.
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
The study included 6 acute-care hospitals within the Veterans’ Health Administration across the United States.
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not significantly decrease relative to the preintervention period (5.9% decrease; P = 0.8) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.
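The segmented regression behind this interrupted time-series design can be sketched in outline. This is a hypothetical illustration, not the study's code: the design-matrix layout (intercept, underlying trend, post-intervention level change, post-intervention trend change) is the standard ITS parameterisation, and the rate calculation matches the stated outcome of urine cultures per 1,000 patient days; all numbers are invented.

```python
# Hedged sketch of an interrupted time-series (ITS) design for segmented
# regression of monthly urine-culture counts. Not the study's actual code.

def its_design_row(month, intervention_month):
    """Design-matrix row: intercept, time trend, post-intervention level
    change, and post-intervention trend change (segmented regression)."""
    post = 1 if month >= intervention_month else 0
    return [1, month, post, post * (month - intervention_month)]

def rate_per_1000_patient_days(cultures, patient_days):
    """Primary outcome: urine cultures per 1,000 patient days."""
    return 1000.0 * cultures / patient_days

# Synthetic 12-month series with the intervention at month 6.
design = [its_design_row(m, intervention_month=6) for m in range(12)]

# Example: 150 cultures over 30,000 patient days -> 5.0 per 1,000.
print(rate_per_1000_patient_days(150, 30000))  # 5.0
```

In practice these design columns would feed a negative binomial model (as the study used) with patient days as the exposure term; the matrix construction itself is the part worth making explicit.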
Hadrosaurid dinosaurs, the dominant large-bodied terrestrial herbivores in most Laurasian Late Cretaceous ecosystems, have an exceptional fossil record consisting of many species known from partial ontogenetic series, making them an ideal clade with which to conduct life-history studies. Previous research considered the Dinosaur Park Formation (DPF) of Alberta as an attritional, or time-averaged, sample and interpreted size–frequency distribution of long bones collected from the DPF with three size classes to suggest that hadrosaurids from the DPF attained near-asymptotic body size in under 3 years. This conflicted with previously published osteohistological estimates of 6+ years for penecontemporaneous hadrosaurids from the Two Medicine Formation (TMF) of Montana, suggesting either extreme variation in hadrosaurid growth rates or that size–frequency distributions and/or osteohistology and growth modeling inaccurately estimate ontogenetic age.
We tested the validity of the previously proposed size–age relationship of hadrosaurids from the DPF by significantly increasing sample size and combining data from size–frequency distributions and osteohistology across multiple long-bone elements. The newly constructed size–frequency distributions typically reveal four relatively distinct size–frequency peaks that, when integrated with the osteohistological data, aligned with growth marks. The yearling size class was heavily underrepresented in the size–frequency distribution. If not due to preservation, this suggests that either juvenile (<2 years of age) hadrosaurids from the DPF had increased survivorship following an initially high nestling mortality rate or that yearlings were segregated from adults. A growth-curve analysis revealed asymptotic body size was attained in approximately 7 years, which is consistent with hadrosaurids from the TMF. The data suggest size–frequency distributions of attritional samples underestimate age and overestimate growth rates, but when paired with osteohistology can provide unique life-history insights.
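The growth-curve analysis is not specified in detail above. As a hedged illustration only, a von Bertalanffy model (a common choice in osteohistological growth studies, assumed here rather than taken from the text) makes the link between growth rate and time to near-asymptotic size explicit:

```python
import math

def von_bertalanffy(t, L_inf, k, t0):
    """Body size at age t: L(t) = L_inf * (1 - exp(-k * (t - t0))).
    L_inf is asymptotic size, k the growth-rate constant (illustrative)."""
    return L_inf * (1.0 - math.exp(-k * (t - t0)))

def age_at_fraction(frac, k, t0):
    """Age at which the curve reaches a given fraction of asymptotic size,
    solved analytically from the model above."""
    return t0 - math.log(1.0 - frac) / k

# With an invented k = 0.45/yr and t0 = 0, 95% of asymptotic size is
# reached at about 6.7 years -- the order of magnitude the abstract reports.
print(round(age_at_fraction(0.95, k=0.45, t0=0.0), 1))  # 6.7
```

The parameter values are invented for illustration; the point is that "asymptotic size in ~7 years" corresponds to a modest growth constant, not to the <3-year estimate from the earlier size–frequency interpretation.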
Over the past 15 years, there has been substantial growth in web-based psychological interventions. We summarize evidence regarding the efficacy of web-based self-directed psychological interventions on depressive, anxiety and distress symptoms in people living with a chronic health condition.
We searched Medline, PsycINFO, CINAHL, EMBASE databases and Cochrane Database from 1990 to 1 May 2019. English language papers of randomized controlled trials (usual care or waitlist control) of web-based psychological interventions with a primary or secondary aim to reduce anxiety, depression or distress in adults with a chronic health condition were eligible. Results were assessed using narrative synthases and random-effects meta-analyses.
In total, 70 eligible studies across 17 health conditions [most commonly: cancer (k = 20), chronic pain (k = 9), arthritis (k = 6), multiple sclerosis (k = 5), diabetes (k = 4) and fibromyalgia (k = 4)] were identified. Interventions were based on CBT principles in 46 (66%) studies and 42 (60%) included a facilitator. When combining all chronic health conditions, web-based interventions were more efficacious than control conditions in reducing symptoms of depression (g = 0.30, 95% CI 0.22–0.39), anxiety (g = 0.19, 95% CI 0.12–0.27) and distress (g = 0.36, 95% CI 0.23–0.49).
Evidence regarding effectiveness for specific chronic health conditions was inconsistent. While self-guided online psychological interventions may help to reduce symptoms of anxiety, depression and distress in people with chronic health conditions in general, it is unclear if these interventions are effective for specific health conditions. More high-quality evidence is needed before definite conclusions can be made.
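The pooled effect sizes above come from random-effects meta-analyses. A minimal sketch of the standard DerSimonian–Laird pooling step, with invented study-level inputs (the review's actual data are not reproduced here):

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effect sizes (e.g. Hedges' g) under a random-effects
    model with the DerSimonian-Laird estimate of between-study variance."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * gi for wi, gi in zip(w, effects)) / sum(w)
    q = sum(wi * (gi - fixed) ** 2 for wi, gi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)         # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * gi for wi, gi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three invented studies: effect sizes and their sampling variances.
g, ci = dersimonian_laird([0.25, 0.40, 0.30], [0.02, 0.03, 0.025])
```

The pooled g always falls within the range of the study effects, and the 95% CI widens as between-study heterogeneity (tau²) grows.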
Recent declines of wild pollinators and infections in honey, bumble and other bee species have raised concerns about pathogen spillover from managed honey and bumble bees to other pollinators. Parasites of honey and bumble bees include trypanosomatids and microsporidia that often exhibit low host specificity, suggesting potential for spillover to co-occurring bees via shared floral resources. However, experimental tests of trypanosomatid and microsporidial cross-infectivity outside of managed honey and bumble bees are scarce. To characterize potential cross-infectivity of honey and bumble bee-associated parasites, we inoculated three trypanosomatids and one microsporidian into five potential hosts – including four managed species – from the apid, halictid and megachilid bee families. We found evidence of cross-infection by the trypanosomatids Crithidia bombi and C. mellificae, with evidence for replication in 3/5 and 3/4 host species, respectively. These include the first reports of experimental C. bombi infection in Megachile rotundata and Osmia lignaria, and C. mellificae infection in O. lignaria and Halictus ligatus. Although inability to control amounts inoculated in O. lignaria and H. ligatus hindered estimates of parasite replication, our findings suggest a broad host range in these trypanosomatids, and underscore the need to quantify disease-mediated threats of managed social bees to sympatric pollinators.
We investigate the turbulence statistics in a multiphase plume made of heavy particles (particle Reynolds number at terminal velocity is 450). Using refractive-index-matched stereoscopic particle image velocimetry, we measure the locations of particles whose buoyancy drives the formation of a multiphase plume, together with the local velocity of the induced flow in the ambient salt water. Measurements of the mean axial flow in the plume centreplane follow Gaussian profiles, and the mean radial flow is consistent with integral plume theory. The turbulence characteristics resemble those measured in a bubble plume, including strong anisotropy in the normal Reynolds stresses. However, we observe structural differences between the two multiphase plumes. First, the skewness of the probability density function of the axial velocity fluctuations is not that which would be predicted by simply reversing the direction of a bubble plume. Second, in contrast to a bubble plume, the particle plume has a non-negligible fluid-shear production term in the turbulent kinetic energy (TKE) budget. Third, the radial decay of all measured terms in the TKE budget is slower than those in a bubble plume. Despite these dissimilarities, a bigger picture emerges that applies to both flows. The TKE production by particles (or bubbles) roughly balances the viscous dissipation, except near the plume centreline. The one-dimensional power spectra of the velocity fluctuations show a power law that puts both the particle and bubble plume in a category different from single-phase shear-flow turbulence.
OBJECTIVES/GOALS: Primary graft dysfunction (PGD) is acute lung injury in the first three days after lung transplant. Patients who experience PGD have increased mortality and an increased risk of chronic lung allograft dysfunction. The pathogenesis is thought to be an ischemia-reperfusion injury but is incompletely understood, and there are no specific therapies. We investigated the role of the microbiome in PGD and its associations with inflammation and markers of aspiration. METHODS/STUDY POPULATION: We collected airway lavage samples from lung transplant donors before procurement and from recipients after reperfusion. We extracted DNA, amplified the bacterial 16S rRNA gene, and sequenced on the Illumina MiSeq platform. QIIME2 and Deblur were used for bioinformatic analysis. R packages were used for downstream analysis and visualizations. The host response was quantified using the Millipore 41-plex Luminex panel and an ELISA for pepsin. Clinical data were collected by the Penn Lung Transplant Outcomes Group. PGD was assessed by degree of hypoxemia and chest X-ray findings in the 72 hours after transplant. RESULTS/ANTICIPATED RESULTS: There was no significant difference in alpha diversity (Shannon index, p = 0.51), biomass (via comparison of 16S amplicon PicoGreen, p = 0.6), or beta diversity (weighted UniFrac, p = 0.472, PERMANOVA) between subjects with PGD grade 3 (n = 36) and those without (n = 96). On taxonomic analysis, we found an enrichment of Prevotella in donor and recipient lungs that went on to develop PGD (p = 0.05). To follow up on this finding, we measured immune response and pepsin concentrations in recipient lungs. We found elevated levels of 35/41 cytokines measured in subjects that developed PGD, as well as an elevation in pepsin and a correlation between pepsin concentration and Prevotella relative abundance (Figure 1). Additionally, Prevotella relative abundance had statistically significant positive correlations with multiple cytokines such as IL-6 (Pearson’s r = 0.26, p = 0.009) and eotaxin (Pearson’s r = 0.24, p = 0.016). DISCUSSION/SIGNIFICANCE OF IMPACT: There is an enrichment of oral anaerobes in lung allografts that eventually develop PGD. This is associated with elevated levels of pepsin and markers of inflammation. These lines of evidence suggest that aspiration contributes to priming the allograft for PGD.
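Of the diversity metrics mentioned, the Shannon index is simple enough to state exactly. A minimal sketch (the study's own computation was done in QIIME2/R; this pure-Python version is only illustrative):

```python
import math

def shannon_index(counts):
    """Shannon alpha diversity H' = -sum(p_i * ln p_i), computed from
    per-taxon read counts; zero-count taxa are skipped."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

# Four equally abundant taxa give the maximum diversity ln(4) ~ 1.386;
# a community dominated by one taxon gives H' = 0.
print(round(shannon_index([25, 25, 25, 25]), 3))
```

Comparing such per-sample indices between the PGD and non-PGD groups is the alpha-diversity test the abstract reports as non-significant.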
Introduction: Prehospital field trauma triage (FTT) standards were reviewed and revised in 2014 based on the recommendations of the Centers for Disease Control and Prevention. The FTT standard allows a hospital bypass and direct transport, within 30 min, to a lead trauma hospital (LTH). Our objectives were to assess the impact of the newly introduced prehospital FTT standard and to describe the emergency department (ED) management and outcomes of patients who had bypassed closer hospitals. Methods: We conducted a 12-month multi-centred health record review of paramedic and ED records following the implementation of the 4-step FTT standard (step 1: vital signs and level of consciousness (physiologic), step 2: anatomical injury, step 3: mechanism, and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported as urgent who met the FTT standard, regardless of transport time. We developed and piloted a data collection tool and obtained consensus on all definitions. The primary outcome was the rate of appropriate triage to a LTH, defined as: ISS ≥12, admission to an intensive care unit (ICU), non-orthopedic surgery, or death. We report descriptive statistics. Results: 570 patients were included: mean age 48.8, male 68.9%, falls 29.6%, motor vehicle collisions 20.2%, stab wounds 10.5%, transported to a LTH 76.5% (n = 436). 72.2% (n = 315) of patients transported to a LTH had bypassed a closer hospital, and 126/306 (41.2%) of those were determined to be an appropriate triage to LTH (9 patients had missing outcomes). ED management included: CT head/cervical spine 69.9%, ultrasound 53.6%, x-ray 51.6%, intubation 15.0%, sedation 11.1%, tranexamic acid 9.8%, blood transfusion 8.2%, fracture reduction 6.9%, tube thoracostomy 5.9%. Outcomes included: ISS ≥ 12 32.7%, admitted to ICU 15.0%, non-orthopedic surgery 11.1%, death 8.8%.
Others included: admission to hospital 57.5%, mean LOS 12.8 days, orthopedic surgery 16.3% and discharged from ED 37.3%. Conclusion: Despite a high number of admissions, the majority of trauma patients bypassed to a LTH were considered over-triaged, with a low number of ED procedures and non-orthopedic surgeries. Continued work is needed to appropriately identify patients requiring transport to a LTH.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides a way to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. To derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control studies and one nested case–cohort study, comprising 8320 BC cases out of 31 551 participants. Dietary data on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status), were available. Five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond with previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
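C5.0 itself is a proprietary implementation, but its split criterion (the C4.5-family gain ratio) can be sketched directly. A toy illustration with invented case/control labels, not the study's data:

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in {l: labels.count(l) for l in set(labels)}.values())

def gain_ratio(labels, groups):
    """C4.5/C5.0 split criterion: information gain of a candidate split,
    normalised by the split information (entropy of the partition sizes)."""
    n = len(labels)
    gain = entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)
    split_info = -sum((len(g) / n) * math.log2(len(g) / n) for g in groups)
    return gain / split_info if split_info > 0 else 0.0

# Toy example: splitting 8 subjects on a binary food-group attribute.
labels = ['case'] * 4 + ['control'] * 4
groups = [labels[:4], labels[4:]]   # a perfect split
print(gain_ratio(labels, groups))   # 1.0
```

The tree builder greedily picks the attribute with the highest gain ratio at each node, which is how "key food groups" emerge in order of importance.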
As part of the development of the International Classification of Diseases (ICD-11), the working groups developed proposals aimed at improving the clinical utility of the classification. These proposals are tested via the internet platform of the Global Clinical Practice Network (« Réseau Mondial de Pratique Clinique », RMPC), which allows electronic clinical studies to be conducted in the official WHO languages, including French. This study focuses on the diagnostic categories of feeding and eating disorders (EDs). New diagnoses have been proposed, such as binge eating disorder and avoidant/restrictive food intake disorder.
– to assess the impact of the ED-specific changes between ICD-10 and ICD-11 among French-speaking members of the RMPC;
– to assess the validity and clinical utility of the new proposals, and inter-rater agreement among participants.
A mixed-methods, international study conducted over the internet among French-speaking members of the RMPC.
RMPC members fluent in French (approximately 1000 professionals) engaged in clinical practice.
The target population will receive an invitation email. Participants will be asked to read two vignettes, assign diagnoses and answer additional questions, based on either ICD-10 or ICD-11, which they will have received at random.
The vignettes will depict real clinical cases and reflect the specific changes between ICD-10 and ICD-11. They will be presented in pairs (8 possible pairs).
– between-participants comparisons, on the diagnostic system used (ICD-10 or ICD-11) and the diagnosis assigned as a function of the specific changes;
– within-participant comparisons, on the evaluation of the vignette pairs.
This study should make it possible to evaluate the new ICD proposals in French, taking into account the cultural and linguistic specificities of the French-speaking world.
In addition to positive and negative symptoms, schizophrenia is associated with a variety of cognitive impairments, in particular episodic memory deficits. Functional neuroimaging studies have begun exploring the potential neural correlates of memory deficits, but there are few reports of structural brain abnormalities underlying memory impairment in schizophrenia. We investigated the potential association between morphological brain abnormalities, as revealed by cortical thickness measures, and episodic memory performance on a face recognition task. Differences in regional cortical thickness between 27 patients with a DSM-IV diagnosis of schizophrenia and 28 matched control subjects were investigated using MRI T1 images and computerized image analysis (CIVET pipeline; Lerch and Evans, 2005). Cortical thickness was estimated as the shortest distance between the pial surface of the cerebral cortex and the white-matter/gray-matter interface at numerous points (40 962 vertices) across the cortical mantle. Consistent with previous studies, a group comparison revealed thinner cortex in the patient group relative to controls in the right prefrontal cortex and parahippocampal gyrus. Interestingly, in the schizophrenia group, memory performance correlated positively with cortical thickness of the anterior cingulate bilaterally and of the right parahippocampal gyrus; that is, the thinner the cortex in these regions, the poorer the patients' memory performance relative to healthy participants.
The practice of medicine is a permanent compromise between life and death, between medical power and the risk of failure. It is an exercise made all the more complex by the constraints of a shifting institutional organisation and a growing workload. By its very nature, psychiatry exposes practitioners to an intense emotional load in their exchanges with suffering and traumatised patients, all the more so as they are expected to show empathy. Physicians thus face a substantial risk of burnout, with, for example, 49% emotional exhaustion reported among Italian psychiatrists. The comorbidities of burnout remain depression, suicide and addictions. Suicide risk is higher among physicians (male physicians are 1.4 times more likely to die by suicide than male non-physicians), and only one in five report that they would seek help if they suffered from a mental illness. Working as a therapist with trauma victims can lead to cumulative psychological suffering manifesting as post-traumatic symptoms indicative of vicarious or secondary trauma. Compassion fatigue, a term sometimes used as a synonym, is nevertheless conceptualised as the sum of two entities: vicarious trauma and burnout. Caregivers' vulnerability to these cognitive changes is all the greater when they have substantial personal exposure to traumatic events. The development of validated rating scales makes it possible to study these different dimensions (compassion fatigue, vicarious trauma, burnout, etc.), sometimes understood as harmful consequences of overwhelmed coping strategies. In France, the development of care for victims of psychological trauma should lead to studying its impact on caregiving staff.
The Time to Change (TTC) anti-stigma campaign, launched in January 2009 in England, intends to make fundamental improvements across England in public knowledge, attitudes and discriminatory behaviour in relation to people with mental illness. To be effective and valid, the campaign must reach a wide range of diverse audiences. This study explores the attitudes of people from ethnic minority communities in relation to mental health.
The study investigates:
1) General attitudes and perceptions about mental illness in ethnic minority communities
2) How we might increase awareness about mental wellbeing and decrease stigma in ethnic minority communities.
Ten focus groups with members of ethnic minority groups were conducted. Five groups consisted of service users and five were composed of non-service users. Two groups comprised participants of Indian origin, two of Somali origin, two of Afro-Caribbean origin, and the remaining groups were mixed.
We will present findings regarding the ways in which traditional perceptions of mental health and the personal experiences of ethnic minority service users affect their perceptions of sources of support such as family, friends, medical staff and religion, and how this feedback could inform anti-stigma interventions.
The study suggests that in order to maximise the impact of anti-stigma campaigns, attention should be given to sources of discrimination and traditional perceptions of mental illness which are emphasised by ethnic minority groups. When planning anti-stigma campaigns it is important to incorporate experiences and perceptions from a wide range of audiences.
A range of decision-makers, including policy-makers, NGOs and local communities, have a stake in developing conservation interventions that are to be implemented on the ground. In order to ensure that decision-making is evidence-informed, the science community needs to engage these communities of policy and practice effectively. This chapter brings together work which explores how scientists can work effectively with decision-makers, using global case studies from South America, Australia, New Zealand and elsewhere to identify what works. It identifies 10 key tips for successful engagement: (1) know who you need to talk to, (2) engage early, (3) make it easy to engage, (4) include multiple knowledges, perspectives and worldviews, (5) think hard about power, (6) build trust, (7) good facilitation is key, (8) learn new engagement skills, (9) make use of existing spaces of collaboration, and (10) don't give up. While executing these tips will not guarantee successful engagement in every case, it will improve the chances for mutually beneficial relationships and hence better conservation outcomes.