Triage at mass gatherings in Australia is commonly performed by staff members with first aid training. There have been no evaluations of the performance of first aid staff with respect to diagnostic accuracy or identification of presentations requiring ambulance transport to hospital.
It was hypothesized that triage decisions by first aid staff would be considered correct in at least 61% of presentations.
A retrospective audit of 1,048 presentations to a single supplier of event health care services in Australia was conducted. The presentations were assessed based on the first measured set of physiological parameters, and the primary triage decision was classified as “expected” if the primary and secondary triage classifications were the same or “not expected” if they differed. The performance of the two triage systems was compared using area under the receiver operating characteristic curve (AUROC) analysis.
The expected decision was made by first aid staff in 674 (71%) of the presentations. Under-triage occurred in 131 (14%) presentations and over-triage in 142 (15%) presentations. The primary triage strategy had an AUROC of 0.7644, while the secondary triage strategy had an AUROC of 0.6280; the difference was statistically significant (P = .0199).
The results support the continued use of first aid trained staff members in triage roles at Australian mass gatherings. Triage tools should be simple, and the addition of physiological variables to improve the sensitivity of triage tools is not recommended because such an approach does not improve the discriminatory capacity of the tools.
To examine mediation by (i) diet quality and (ii) diet quantity in the associations of mindful eating domains with 3-year change in depressive symptoms.
Depressive symptoms were measured with the Center for Epidemiologic Studies Depression scale at baseline and 3-year follow-up. Four mindful eating domains (Focused Eating; Eating in response to Hunger and Satiety Cues; Eating with Awareness; Eating without Distraction) were measured with the Mindful Eating Behavior Scale. Food intake was measured with a 238-item FFQ. Diet quality was defined as the Mediterranean Diet Score (MDS). Diet quantity was defined as total energy intake (kcal/d; 1 kcal = 4·184 kJ). Mediation analyses with percentile-corrected bootstrap confidence intervals were conducted to calculate indirect effects.
Longitudinal Aging Study Amsterdam.
Adults aged 55 years or above (n 929).
Diet quality (MDS) did not mediate associations of any of the four mindful eating domains with change in depressive symptoms. In contrast, total energy intake did mediate the associations with change in depressive symptoms for the mindful eating domains Eating with Awareness (indirect effects fully adjusted models: B = −0·014, 95 % CI −0·037, −0·002) and Eating without Distraction (B = −0·013, 95 % CI −0·033, −0·001), but not for the other two domains. Post hoc multiple mediation analyses showed similar results.
Higher scores on two mindful eating domains were associated with a decrease in depressive symptoms through lower total energy intake. Diet quantity, but not diet quality, could be a possible underlying mechanism in the associations between mindful eating and change in depressive symptoms.
When tuberculosis (TB) and depression co-occur, there is greater risk for comorbidities, disability, suffering, and health-related costs. Depression is also associated with poor treatment adherence in patients with TB. The major aim of this study was to assess the symptoms of depression and associated factors among TB patients currently receiving directly observed treatment short-course (DOTS) treatment.
A cross-sectional study was conducted among TB patients currently undergoing treatment in 27 DOTS centers in three districts of Kathmandu Valley. The study included 250 TB patients within 2 months of treatment initiation, aged 18 years and above. The previously validated Nepali Patient Health Questionnaire (PHQ-9) was used to screen for depression and semi-structured interviews were conducted to collect socio-demographic information and other factors related to TB and/or depression. Data analysis was conducted using IBM SPSS Statistics version 20.
The study found the mean PHQ-9 score to be 2.84 (s.d. 4.92, range 0–25). Among the respondents, 10% (n = 25) had PHQ-9 scores ⩾10, suggestive of probable depression. Multivariate linear regression indicated that depressive symptoms were significantly associated with being separated/widowed/divorced (p < 0.001) and having lower education (p = 0.003). In addition, smoking (p = 0.02), alcohol use (p = 0.001), and experience of side effects from TB medications (p = 0.001) were risk factors for higher PHQ-9 scores.
Our findings suggest that patients on TB treatment have a higher risk of depression, and efforts should be made by the National Tuberculosis Program to address this issue.
Navigating the research domain at an academic medical center can be challenging, even for seasoned investigators. To address this, Duke University launched two initiatives: (1) a research navigation “hotline” to provide brief assistance with a variety of research questions; and (2) researcher onboarding and consultation, a one-to-one tailored offering to ensure that researchers are equipped to navigate research resources and processes effectively. The services are provided by the myRESEARCHnavigators (MRN) team, funded by Duke’s CTSA. The diverse scientific backgrounds of the six team members align well with those of the research community, allowing for a good match between the researcher and MRN team member. The MRN team answers approximately 30 questions per month, and has provided consultations to almost 400 researchers. Both services receive high satisfaction ratings (4 or 5 stars [out of 5 stars] given to 90% of hotline answers, and 99% of researcher onboarding/consultation sessions). As of July 2019, the School of Medicine has determined that the consultations are critical to its mission and has made them a requirement for new research faculty. The team will continue marketing both services to encourage adoption.
Review findings on the role of dietary patterns in preventing depression are inconsistent, possibly due to variation in assessment of dietary exposure and depression. We studied the association between dietary patterns and depressive symptoms in six population-based cohorts and meta-analysed the findings using a standardised approach that defined dietary exposure, depression assessment and covariates.
Included were cross-sectional data from 23 026 participants in six cohorts: InCHIANTI (Italy), LASA, NESDA, HELIUS (the Netherlands), ALSWH (Australia) and Whitehall II (UK). Analysis of incidence was based on three cohorts with repeated measures of depressive symptoms at 5–6 years of follow-up in 10 721 participants: Whitehall II, InCHIANTI, ALSWH. Three a priori dietary patterns were investigated in relation to depressive symptoms: the Mediterranean diet score (MDS), the Alternative Healthy Eating Index (AHEI-2010) and the Dietary Approaches to Stop Hypertension (DASH) diet. Analyses at the cohort level adjusted for a fixed set of confounders; the meta-analysis used a random-effects model.
Cross-sectional and prospective analyses showed statistically significant inverse associations of the three dietary patterns with depressive symptoms (continuous and dichotomous). In cross-sectional analysis, the association of diet with depressive symptoms using a cut-off yielded an adjusted OR of 0.87 (95% confidence interval 0.84–0.91) for MDS, 0.93 (0.88–0.98) for AHEI-2010, and 0.94 (0.87–1.01) for DASH. Similar associations were observed prospectively: 0.88 (0.80–0.96) for MDS; 0.95 (0.84–1.06) for AHEI-2010; 0.90 (0.84–0.97) for DASH.
Population-scale observational evidence indicates that adults following a healthy dietary pattern have fewer depressive symptoms and lower risk of developing depressive symptoms.
Spatial models are increasingly being used to target the most suitable areas for biodiversity conservation. This study investigates how the spatial tool Marxan with Zones (MARZONE) can be used to support the design of cost-effective biodiversity conservation policy. Novel in this study is the spatial analysis of the costs and effectiveness of different agro-environmental measures (AEMs) for habitat and biodiversity conservation in the Montado ecosystem in Portugal. A distinction is made between the financial costs paid to participating landowners and farmers for adopting AEMs and the broader economic opportunity costs of the corresponding land-use changes. Habitat and species conservation targets are furthermore defined interactively with the local government agency responsible for the management of protected areas, while the costs of agro-forestry activities and alternative land uses are estimated in direct consultation with local landowners. MARZONE identifies the spatial distribution of priority areas for conservation and the associated costs, some of which overlap with existing protected areas. These results provide useful insights into the trade-offs between nature conservation and the opportunity costs of protecting ecologically vulnerable areas, helping to improve current and future conservation policy design.
While numerous studies have recently shown that variation in input quantity predicts children’s rate of acquisition across a range of language skills, comparatively little is known about the impact of variation in input quality on (bilingual) children’s language development. This study investigated the relation between specific quality-oriented properties of bilingual children’s input and measures of children’s language development across a number of skills while at the same time taking family constellation into account. Participants were bilingual preschoolers (n = 50) acquiring Dutch alongside another language. Preschoolers’ receptive and productive vocabulary and morphosyntax in Dutch were assessed. Parental questionnaires were used to derive estimates of input quality. Family constellation was first operationalized as presence of a native-speaker parent and subsequently in terms of patterns of parental language use. Results showed that proportion of native input and having a native-speaker parent were never significant predictors of children’s language skills, whereas the degree of non-nativeness in the input, family constellation in terms of parental language use, and language richness were. This study shows that what matters is not how much exposure bilingual children have to native rather than non-native speakers, but how proficient any non-native speakers are.
Families of children with neurodevelopmental disorders engage in varied types of therapies to address behavioural, communication and cognitive challenges. Research suggests that consistent therapy adherence predicts positive therapy outcomes. The present study examined therapy adherence in 55 parent-child dyads in which all children had been diagnosed with ASD, ADHD, and/or ID. Parents completed questionnaires assessing demographics, therapy type, adherence to child treatment, parental stress, and challenging child behaviour. The researchers proposed a new scale, the Child Therapy Adherence Scale (CTAS), which initial testing supported as a reliable measure of therapy adherence. Significant relationships were found between parental stress, annual household income and therapy adherence, with parental stress being a notably strong predictor of therapy adherence. No significant relationships were observed between child challenging behaviour, single parent status and therapy adherence. These findings have implications for practitioners, in that parent levels of stress and demographic influences may impact capacity to adhere to recommended home practice and interventions for children with neurodevelopmental disorders.
After five positive randomized controlled trials showed benefit of mechanical thrombectomy in the management of acute ischemic stroke with emergent large-vessel occlusion, a multi-society meeting was organized during the 17th Congress of the World Federation of Interventional and Therapeutic Neuroradiology in October 2017 in Budapest, Hungary. This multi-society meeting was dedicated to establishing standards of practice in acute ischemic stroke intervention, aiming for a consensus on the minimum requirements for centers providing such treatment. In an ideal situation, all patients would be treated at a center offering a full spectrum of neuroendovascular care (a level 1 center). However, for geographical reasons, some patients are unable to reach such a center in a reasonable period of time. With this in mind, the group paid special attention to defining recommendations on the prerequisites of organizing stroke centers providing mechanical thrombectomy for acute ischemic stroke, but not for other neurovascular diseases (level 2 centers). Finally, some centers will have a stroke unit and offer intravenous thrombolysis, but not any endovascular stroke therapy (level 3 centers). Together, these level 1, 2, and 3 centers form a complete stroke system of care. The multi-society group provides recommendations and a framework for the development of mechanical thrombectomy services worldwide.
OBJECTIVES/SPECIFIC AIMS: 1. Assess changes in the clinical research workforce landscape at Duke. 2. Optimize and evaluate efficacy of a tier advancement process for clinical research career progression at Duke. 3. Implement CRP engagement as a change management mechanism for workforce innovation. METHODS/STUDY POPULATION: We evaluated 857 clinical research positions at Duke to understand changes in the workforce (demographics, numbers in each position and tier) since implementing the tier advancement process in 2016. To understand the efficacy of the tier assessment process, data from a subset of this population (n=84) who underwent the advancement process was examined for success rate. Individual employees and their managers were surveyed to understand their perception of the advancement process and identify areas for improvement. We also describe implementation of multiple mechanisms of community engagement to manage expectations around the tier advancement process and to provide opportunities for employees to self-manage their career planning, including portfolio planning and leadership opportunities. RESULTS/ANTICIPATED RESULTS: The clinical research workforce has grown by 5.5% since tiering began (2016, n=810; 2018, n=857). Nearly 13% of that growth has been in managerial or senior positions (2016, n=111; 2018, n=127). Distributions across job classifications changed only slightly, representing realignment of positions with study-level responsibilities over department-level responsibilities. Notably, clinical research nurses (CRNC & CRNC Sr.) were the only category, including tiered and non-tiered positions, to drop in overall numbers between 2016 (n=136) and 2018 (n=115), representing a shift in the workforce from research nurses to research coordinators. General demographics (gender, age) remained largely the same.
A total of 359 positions have been hired during this time frame, nearly half of which were entry-level positions (175/359); 47 of these positions represent expansion of the workforce. Of the 359 new hires since 2016, 271 currently still work in one of the research roles. Of the 84 employees who underwent the tier advancement process, 84% (43/51) succeeded in tier 2 advancement, 70% (14/20) succeeded in tier 3 advancement (CRC, CRNC, and regulatory coordinators), and 77% (10/13) of research program leaders (RPLs) succeeded in achieving tier 2, which is the highest tier for this group. Fifty-five employees (65%) and 32 managers responded to a voluntary feedback survey. Overall confidence in the process improved in both groups from 2016 to 2018, most notably among managers. Both groups indicated a 10-hour reduction in time required to complete the tier advancement process (employees: 35 hr in 2016 vs 25 hr in 2018; managers: 25 hr in 2016 vs 15 hr in 2018). DISCUSSION/SIGNIFICANCE OF IMPACT: The use of objectively assessed competencies is an important step in the development of a workforce. By 1) maintaining alignment with industry standards for competencies, 2) upholding high standards, and 3) offering a consistent approach to career growth, Duke is working to develop and maintain a workforce that supports high quality research. Since the implementation of standardized job classifications and competency-based tier advancement, the positions have undergone rigorous competency-based needs evaluation. This leads to jobs better matched to study needs as well as increased standardization across the clinical research workforce. We believe that the subtle workforce shifts represent alignment of our positions with the business needs of our clinical research enterprise. Additionally, approximately 15% of our clinical research workforce has taken advantage of the opportunity to advance their own careers.
We have made significant improvements in the following tier advancement processes: standardization of assessments, scoring guides, and modes; a change from LMS to REDCap delivery of the knowledge assessments; streamlined utilization of electronic documentation; and additional guidance for employees and managers regarding portfolio inclusions. These improvements have increased satisfaction with and acceptance of the advancement process and were made possible through strategic communication across the workforce. Regular town hall meetings and focus group feedback sessions have included the clinical research community in discussions of WE-R initiatives and provided a much-needed feedback loop for process improvement and change management. Moreover, inclusion of WE-R discussion in our Research Professional Network events has provided opportunities to discuss career advancement strategies as well as volunteer opportunities to grow and demonstrate leadership competencies.
For a long time, sexual violence in conflict has been seen and largely prosecuted as a crime committed by men against women. Only recently has the international community become aware of the brutality of sexual violence perpetrated against men in conflict situations, and that the number of these atrocities is much higher than previously assumed. Similarly, women and girls, like men, are capable of committing atrocious forms of conflict-related sexual violence against men, boys, women and girls alike, and indeed have done so.
According to Rosalind Petchesky, ‘acts of sexual violence against men, women … are named and experienced differently, which is what it means to say they are gendered’. Building on this insight, this chapter deals with two often marginalised categories of conflict situations with regard to sexual violence, namely male victims and female perpetrators. It first explores the lack of awareness and knowledge surrounding both male victims and female perpetrators of sexual violence and the underlying reasons for such violence. It then discusses how gender norms regarding sexual violence impact on all aspects of the international justice process. In the case of male victims of sexual violence, attention is drawn to the tendency to erase the sexual nature of the crime by using a more general label, such as torture. It will be argued that this state of affairs derives from gender norms, which make male victimisation, and especially male sexual victimisation, inconsistent with ideals of masculinity. In the case of female perpetrators of conflict-related sexual violence, it will be argued that the existence of female perpetrators of sexual violence in conflict is at odds with ideals of femininity. Attention will, inter alia, be drawn to the tendency of these women to deny their ability – as women – to commit sexual violence. The chapter ends with some concluding remarks on the ways forward in understanding and prosecuting male sexual victimisation and sexual violence perpetrated by women.
This study investigated bidirectional associations between intake of food groups and depressive symptoms in 1058 Italian participants (aged 20–102 years) of the Invecchiare in Chianti study. Dietary intake, assessed with a validated FFQ, and depressive symptoms, measured with the Center for Epidemiologic Studies Depression scale (CES-D), were assessed at baseline and after 3, 6 and 9 years. Associations of repeated measurements of intakes of thirteen food groups with 3-year changes in depressive symptoms, and vice versa, were analysed using linear mixed models and logistic generalised estimating equations. Fish intake was inversely (quartile (Q)4 v. Q1, B=–0·97, 95 % CI –1·74, –0·21) and sweet food intake positively (Q4 v. Q1, B=1·03, 95 % CI 0·25, 1·81) associated with subsequent CES-D score. In the other direction, higher CES-D scores were associated with decreases in intakes of vegetables (ratio: 0·995, 95 % CI 0·990, 0·999) and red and processed meat (B=–0·006, 95 % CI –0·010, –0·001), an increase in dairy product intake (ratio: 1·008, 95 % CI 1·004, 1·013), and increasing odds of eating savoury snacks (OR: 1·012, 95 % CI 1·000, 1·024). Fruit, nuts and legumes, potatoes, wholegrain bread, olive oil, sugar-sweetened beverages, and coffee and tea were not significantly associated in either direction. Our study confirmed bidirectional associations between food group intakes and depressive symptoms. Fish and sweet food intakes were associated with 3-year improvement and deterioration in depressive symptoms, respectively. Depressive symptoms were associated with 3-year changes in vegetable, meat, dairy product and savoury snack intakes. Trials are necessary to examine the causal associations between food groups and depression.
Pregnant and lactating women and breastfed infants are at risk of vitamin D deficiency. The supplemental vitamin D dose that optimises maternal vitamin D status and breast milk antirachitic activity (ARA) is unclear. Healthy pregnant women were randomised to 10 (n 10), 35 (n 11), 60 (n 11) and 85 (n 11) µg vitamin D3/d from 20 gestational weeks (GW) to 4 weeks postpartum (PP). The participants also received increasing dosages of fish oil supplements and a multivitamin. Treatment allocation was not blinded. Parent vitamin D and 25-hydroxyvitamin D (25(OH)D) were measured in maternal plasma at 20 GW, 36 GW and 4 weeks PP, and in milk at 4 weeks PP. Median 25(OH)D and parent vitamin D at 20 GW were 85 (range 25–131) nmol/l and ‘not detectable (nd)’ (range nd–40) nmol/l. Both increased, seemingly dose dependent, from 20 to 36 GW and decreased from 36 GW to 4 weeks PP. In all, 35 µg vitamin D/d was needed to increase 25(OH)D to adequacy (80–249 nmol/l) in >97·5 % of participants at 36 GW, while >85 µg/d was needed to reach this criterion at 4 weeks PP. The 25(OH)D increments from 20 to 36 GW and from 20 GW to 4 weeks PP diminished with supplemental dose and related inversely to 25(OH)D at 20 GW. Milk ARA related to vitamin D3 dose, but the infant adequate intake of 513 IU/l was not reached. Vitamin D3 dosages of 35 and >85 µg/d were needed to reach adequate maternal vitamin D status at 36 GW and 4 weeks PP, respectively.
This study investigated whether cross-linguistic differences affect semantic prediction. We assessed this by looking at two languages, Dutch and Turkish, that differ in word order and thus vary in how words come together to create sentence meaning. In an eye-tracking task, Dutch and Turkish four-year-olds (N = 40), five-year-olds (N = 58), and adults (N = 40) were presented with a visual display containing two familiar objects (e.g., a cake and a tree). Participants heard semantically constraining (e.g., “The boy eats the big cake”) or neutral sentences (e.g., “The boy sees the big cake”) in their native language. The Dutch data revealed a prediction effect for children and adults; however, it was larger for the adults. The Turkish data revealed a prediction effect for the adults only, not for the children. These findings reveal that experience with word order structures and/or automatization of language processing routines may lead to time-course differences in semantic prediction.
Zn deficiency and diarrhoea are prevalent and may coexist in children living in low-resource settings. Recently, a novel approach for delivering Zn via microbiologically treated, Zn-fortified water was shown to be effective in improving Zn status in West African schoolchildren. We assessed the effectiveness of Zn-fortified, microbiologically purified water delivered as a household intervention on Zn intake, status and morbidity in children aged 2–6 years from rural western Kenya.
Randomised controlled trial. The intervention comprised households assigned to a water treatment device with (ZFW) or without (FW) Zn delivery capability.
Rural households in Kisumu, western Kenya.
Children aged 2–6 years.
The ZFW group had higher dietary Zn intake compared with the FW group. ZFW contributed 36 and 31 % of daily requirements for absorbable Zn in children aged 2–3 and 4–6 years, respectively, in the ZFW group. Consumption of Zn-fortified water resulted in lower prevalence of reported illness (risk ratio; 95 % CI) in the ZFW group compared with the FW group: for cold with runny nose (0·91; 0·83, 0·99; P=0·034) and abdominal pain (0·70; 0·56, 0·89; P=0·003) in the intention-to-treat analysis and for diarrhoea (0·72; 0·53, 0·96; P=0·025) in the per-protocol analysis. We did not detect an effect of treatment on plasma Zn concentration.
Daily consumption of Zn-fortified, microbiologically treated water results in increased intake of absorbable dietary Zn and may help in preventing childhood infections in pre-school children in rural Africa.
Innovation platforms are fast becoming part of the mantra of agricultural research for development projects and programmes. Their basic tenet is that stakeholders depend on one another to achieve agricultural development outcomes, and hence need a space where they can learn, negotiate and coordinate to overcome challenges and capture opportunities through a facilitated innovation process. Although much has been written on how to implement and facilitate innovation platforms efficiently, few studies support ex-ante appraisal of when and for what purpose innovation platforms provide an appropriate mechanism for achieving development outcomes, and what kinds of human and financial resource investments and enabling environments are required. Without these insights, innovation platforms run the risk of being promoted as a panacea for all problems in the agricultural sector. This study makes clear that not all constraints will require innovation platforms and, if there is a simpler and cheaper alternative, that should be considered first. Based on the review of critical design principles and plausible outcomes of innovation platforms, this study provides a decision support tool for research, development and funding agencies that can enhance more critical thinking about the purposes and conditions under which innovation platforms can contribute to achieving agricultural development outcomes.
OBJECTIVES/SPECIFIC AIMS: Describe (1) the components of the research navigation service and consultation/onboarding program, (2) use and adoption of the services, and (3) the overall satisfaction from the research community. METHODS/STUDY POPULATION: Duke offers 2 programs to support researchers: Research Navigation and Researcher Onboarding. The services aim to connect researchers to resources, offices, funding opportunities, and other collaborators. The general Research Navigation Service is an on-demand “hotline,” where navigators answer questions from researchers across the institution, helping them understand processes, best practices, and how to locate resources or potential collaborators. Navigators can be reached via the myRESEARCHhome portal, email, or by phone. The researcher onboarding program is a free 1:1 consultative service, focused on the researcher’s individual portfolio, stage of career, and immediate plans in the research arena. The goal is to equip researchers “from the start” to be successful. Researchers are identified via the new faculty hire list, or by referral. Both services are provided by the myRESEARCHnavigators team, who are trained in a variety of research areas, from basic to clinical to social sciences, and are familiar with Duke. RESULTS/ANTICIPATED RESULTS: Use of both services has increased substantially over the year. Of the almost 200 faculty members hired into the School of Medicine in 2017, ~75% have taken part in the onboarding program, and 91% have given the service a 5-star rating. The content of the sessions will be described. The Research Navigation service has fielded hundreds of calls since its inception, with topics including Equipment and Facilities (55 requests), Study Start-Up (44 requests), Innovation and Technology (15 requests), and Regulation and Policy (25 requests). Categorization of requests, users of the services, and other information about the programs will be described.
DISCUSSION/SIGNIFICANCE OF IMPACT: The navigation and onboarding services are proving to be a successful way to increase efficiency and understanding of processes and resources across the institution. Feedback from the users, coupled with high referral rates to the programs, suggests that the program is helping researchers feel better equipped with regard to their research planning, conduct, and analysis.
OBJECTIVES/SPECIFIC AIMS: Describe the framework for tier advancement of research professionals. Describe the various forms of assessments of competencies. Describe how competencies are used to provide transparency into professional development opportunities. Discuss the results of the first tier advancement opportunity for research staff. METHODS/STUDY POPULATION: These processes were developed at Duke, an academic medical center with over 2000 active clinical research protocols and 300 new clinical trials per year. Roughly 500 employees are categorized into tiered classifications, allowing them opportunities for advancement through competency testing. Approximately 10% opted for tier testing, and their results will be shared. RESULTS/ANTICIPATED RESULTS: Competency assessments were developed for all 42 of Duke’s research professional competencies, some using 2 modalities of testing. Almost 12% of the research professionals classified in tiered positions opted to attempt the tier advancement process. Of those, 37 completed, and the vast majority reached their desired tier. Results by competency will be provided. DISCUSSION/SIGNIFICANCE OF IMPACT: The use of objectively assessed competencies is an important step in the development of a workforce. By (1) maintaining alignment with industry standards for competencies, (2) holding staff to a high bar, and (3) offering a consistent approach to career growth, Duke is working to develop and maintain a workforce that supports high quality research.