This study examines the association between attention-deficit/hyperactivity disorder (ADHD) and overweight/obesity in a large-scale longitudinal study of children, while controlling for a range of psychosocial factors.
Data were obtained from Growing Up in Ireland, a nationally representative and longitudinal study of approximately 6500 children who were assessed at 9 and 13 years of age. Body mass index (BMI) was determined using measured height and weight, ADHD status was determined by parent reports of professional diagnoses and ADHD symptoms were measured using the Strengths and Difficulties Questionnaire (SDQ).
The associations between ADHD status, ADHD symptoms (SDQ) and BMI category at age 9 and 13 years were evaluated using logistic regression. Adjustments were made for child factors (sex, developmental coordination disorder, emotional symptoms, conduct problems, birth weight and exercise) and parental factors (socio-economic status, parental BMI, parental depression, and maternal smoking and alcohol use during pregnancy). Logistic regression indicated that ADHD status was not associated with BMI category at 9 or 13 years of age, but children with ADHD at 9 years were significantly more likely to be overweight/obese at 13 years than those without ADHD. However, when other child and parental factors were adjusted for, ADHD status was no longer significantly associated with weight status. Female sex, low levels of exercise, overweight/obese parents and maternal smoking during pregnancy consistently increased the odds of childhood overweight/obesity.
While ADHD and overweight/obesity co-occur in general populations, this relationship is largely explained by a variety of psychosocial factors.
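The unadjusted association described above is the kind of effect a simple 2×2 odds ratio captures before covariate adjustment. As a minimal illustrative sketch, with entirely hypothetical counts (not the Growing Up in Ireland data), the odds ratio and its Woolf 95% confidence interval can be computed as:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
        a = exposed & outcome,   b = exposed & no outcome
        c = unexposed & outcome, d = unexposed & no outcome
    (No zero-cell correction; a sketch, not a stats package.)"""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: ADHD at age 9 vs overweight/obese at age 13
or_, lo, hi = odds_ratio_ci(30, 70, 600, 2800)
```

With real data, covariate adjustment (as in the study's logistic regression) can attenuate such an unadjusted odds ratio toward 1, which is the pattern the abstract reports.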
In Leviathan, Hobbes uses his new theory of authorization to explain the nature of corporate persons. While On the Citizen lacks the theory of authorization, it includes several accounts of corporate persons. In On the Citizen, Hobbes suggests that a group forms a corporate person when its members accept obligations to support a sovereign, when the members are all compelled to act in concert, or when the members of the group adopt voting rules for making decisions. Hobbes also uses his analysis of the commonwealth as a corporate person to argue for the sovereign’s immunity in On the Citizen much as he does in Leviathan. Generally speaking, the Leviathan account of corporate persons is superior to the ones in On the Citizen. However, Hobbes needs the voting rules account from On the Citizen in order to explain how democratic and aristocratic assemblies can serve as sovereigns. His attempt in Leviathan to replace the voting rules account with the authorization account therefore raises a problem that he does not appreciate.
Horseweed is one of Kentucky’s most common and problematic weeds in no-till soybean production systems. Emergence in the fall and spring necessitates control at these times because horseweed is best managed when small. Control is typically achieved through herbicides or cover crops (CCs); integrating these practices can lead to more sustainable weed management. Two years of field experiments were conducted over 2016 to 2017 and 2017 to 2018 in Versailles, KY, to examine the use of fall herbicide (FH; namely, saflufenacil or none), spring herbicide (SH; namely, 2,4-D; dicamba; or none), and CC (namely, cereal rye or none) for horseweed management prior to soybean. Treatments were examined with a fully factorial design to assess potential interactions. The CC biomass in 2016 to 2017 was higher relative to 2017 to 2018, and both herbicide programs reduced winter weed biomass in that year. The CC reduced horseweed density while growing and after termination in 1 yr. The FH reduced horseweed density through mid-spring. The FH also killed winter weeds that may have suppressed horseweed emergence; this resulted in higher horseweed density by soybean planting unless the CC was present to suppress the additional spring emergence. If either FH or CC was used, SH typically did not result in additional horseweed control. The SH killed emerged plants but did not provide residual control of a late horseweed flush in 2017 to 2018. These results suggest CCs can help manage spring flushes of horseweed emergence when nonresidual herbicide products are used, though this effect was short-lived when less CC biomass was present.
Antimicrobial stewardship improves patient care and reduces antimicrobial resistance, inappropriate use, and adverse outcomes. Despite high-profile mandates for antimicrobial stewardship programs across the healthcare continuum, descriptive data and recommendations for dedicated resources, including appropriate physician, pharmacist, data analytics, and administrative staffing support, are not robust. This review summarizes the current literature on antimicrobial stewardship staffing and calls for the development of minimum staffing recommendations.
Use of the herbicide atrazine (ATR) is banned in the European Union; yet, it is still widely used in the USA and Australia. ATR is known to alter testosterone and oestrogen production and thus reproductive characteristics in numerous species. In this proof-of-concept study, we examined the effect of ATR exposure, at a supra-environmental dose (5 mg/kg bw/day), beginning on E9.5 in utero, prior to sexual differentiation of the reproductive tissues, until 26 weeks of age, on the development of the mouse penis. Notably, this is the first study to specifically investigate whether ATR can affect penis characteristics. We show that ATR exposure, beginning in utero, causes a shortening (demasculinisation) of penis structures and increases the incidence of hypospadias in mice. These data indicate the need for further studies of ATR on human reproductive development and fertility, especially considering its continued and widespread use.
Anecdotal evidence suggests the use of bolus tube feeding is increasing among long-term home enteral tube feeding (HETF) patients. A cross-sectional survey to assess the prevalence of bolus tube feeding and to characterise these patients was undertaken. Dietitians from ten centres across the UK collected data on all adult HETF patients on the dietetic caseload receiving bolus tube feeding (n 604, 60 % male, age 58 years). Demographic data, reasons for tube and bolus feeding, tube and equipment types, feeding method and patients’ complete tube feeding regimens were recorded. Over a third of patients receiving HETF used bolus feeding (37 %). Patients were long-term tube fed (4·1 years tube feeding, 3·5 years bolus tube feeding), living at home (71 %) and sedentary (70 %). The majority were head and neck cancer patients (22 %), who were significantly more active (79 %) and lived at home (97 %), while those with cerebral palsy (12 %) were typically younger (age 31 years) but sedentary (94 %). Most patients used bolus feeding as their sole feeding method (46 %), because it was quick and easy to use, as a top-up to oral diet or to mimic mealtimes. Importantly, oral nutritional supplements (ONS) were used for bolus feeding in 85 % of patients, with 51 % of these being compact-style ONS (2·4 kcal (10·0 kJ)/ml, 125 ml). This survey shows that bolus tube feeding is common among UK HETF patients, is used by a wide variety of patient groups and can be adapted to meet the needs of a variety of patients, clinical conditions, nutritional requirements and lifestyles.
Oxidative stress is implicated in the aetiology of schizophrenia, and the antioxidant defence system (AODS) may be protective in this illness. We examined the major antioxidant glutathione (GSH) in prefrontal brain and its correlates with clinical and demographic variables in schizophrenia.
GSH levels were measured in the dorsolateral prefrontal region of 28 patients with chronic schizophrenia using a magnetic resonance spectroscopy sequence specifically adapted for GSH. We examined correlations of GSH levels with age, age at onset of illness, duration of illness, and clinical symptoms.
We found a negative correlation between GSH levels and age at onset (r = −0.46, p = 0.015), and a trend-level positive relationship between GSH and duration of illness (r = 0.34, p = 0.076).
Our findings are consistent with a possible compensatory upregulation of the AODS with longer duration of illness and suggest that the AODS may play a role in schizophrenia.
The first case of evolved protoporphyrinogen oxidase (PPO)-inhibitor resistance was observed in 2001 in common waterhemp [Amaranthus tuberculatus (Moq.) Sauer var. rudis (Sauer) Costea and Tardif]. This resistance in A. tuberculatus is most commonly conferred by deletion of the amino acid glycine at the 210th position (ΔGly-210) of the PPO enzyme (PPO2) encoded by PPX2. In a field in Kentucky in 2015, inadequate control of Amaranthus plants was observed following application of a PPO inhibitor. Morphological observations indicated that survivors included both A. tuberculatus and Palmer amaranth (Amaranthus palmeri S. Watson). Research was conducted to confirm species identities and resistance and then to determine whether resistance evolved independently in the two species or via hybridization. Results from a quantitative PCR assay based on the ribosomal internal transcribed spacer confirmed that both A. tuberculatus and A. palmeri coexisted in the field. The mutation conferring ΔGly-210 in PPO2 was identified in both species; phylogenetic analysis of a region of PPX2, however, indicated that the mutation evolved independently in the two species. Genotyping of greenhouse-grown plants that survived lactofen indicated that all A. tuberculatus survivors, but only a third of A. palmeri survivors, contained the ΔGly-210 mutation. Consequently, A. palmeri plants were evaluated for the presence of an arginine to glycine or methionine substitution at position 128 of PPO2 (Arg-128-Gly and Arg-128-Met). The Arg-128-Gly substitution was found to account for resistance that was not accounted for by the ΔGly-210 mutation in plants from the A. palmeri population. Results from this study provide a modern-day example of both parallel and convergent evolution occurring within a single field.
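At its core, the ΔGly-210 genotyping described above detects a single-residue deletion relative to the wild-type PPO2 sequence. A toy sketch of that comparison follows, using hypothetical 10-residue fragments rather than the real PPX2 sequence; actual assays use allele-specific PCR or sequencing, not string diffs:

```python
def find_single_deletion(wild_type, sample):
    """Return the 1-based position of a single amino-acid deletion in
    `sample` relative to `wild_type`, or None if lengths do not differ
    by exactly one residue. Note: within a run of identical residues
    (e.g. several glycines), the reported position is the first point of
    divergence; the deletion is biologically equivalent anywhere in the run."""
    if len(sample) != len(wild_type) - 1:
        return None
    for i, (w, s) in enumerate(zip(wild_type, sample)):
        if w != s:
            return i + 1
    return len(wild_type)  # the final residue was deleted

wt = "MKTAYIGGGA"   # hypothetical wild-type fragment
mut = "MKTAYIGGA"   # one glycine removed from the run
pos = find_single_deletion(wt, mut)
```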
Populations of Critically Endangered White-rumped Gyps bengalensis and Slender-billed G. tenuirostris Vultures in Nepal declined rapidly during the 2000s, almost certainly because of the effects of the use in livestock of the non-steroidal anti-inflammatory drug (NSAID) diclofenac, which is nephrotoxic to Gyps vultures. In 2006, veterinary use of diclofenac was banned in Nepal and this was followed by the gradual implementation, over most of the geographical range of the two vulture species in Nepal, of a Vulture Safe Zone (VSZ) programme to advocate vulture conservation, raise awareness about diclofenac, provide vultures with NSAID-free food and encourage the veterinary use in livestock of a vulture-safe alternative NSAID (meloxicam). We report the results of long-term monitoring of vulture populations in Nepal before and after this programme was implemented, by means of road transects. Piecewise regression analysis of the count data indicated that a rapid decline of the White-rumped Vulture population from 2002 up to about 2013 gave way to a partial recovery between about 2013 and 2018. More limited data for the Slender-billed Vulture indicated that a rapid decline also gave way to partial recovery from about 2012 onwards. The rates at which populations were increasing in the 2010s exceeded the upper end of the range of increase rates expected in a closed population under optimal conditions. The possibility that immigration from India is contributing to the changes cannot be excluded. We present evidence from open and undercover pharmacy surveys that the VSZ programme had apparently become effective in reducing the availability of diclofenac in a large part of the range of these species in Nepal by about 2011. Hence, community-based advocacy and awareness-raising actions, and possibly also provisioning of safe food, may have made an important contribution to vulture conservation by augmenting the effects of changes in the regulation of toxic veterinary drugs.
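The piecewise regression described above amounts to fitting separate trends before and after a breakpoint. A minimal sketch with a fixed, assumed breakpoint and hypothetical log-scale counts (not the Nepal transect data), using only the standard library (Python ≥ 3.10 for `statistics.linear_regression`):

```python
from statistics import linear_regression

# Hypothetical log10 vulture counts per survey year (illustrative only)
years = [2002, 2004, 2006, 2008, 2010, 2012, 2013, 2014, 2015, 2016, 2017, 2018]
logN  = [2.60, 2.35, 2.10, 1.85, 1.65, 1.50, 1.45, 1.52, 1.58, 1.66, 1.73, 1.80]

bp = 2013  # assumed breakpoint between decline and recovery
pre  = [(y, n) for y, n in zip(years, logN) if y <= bp]
post = [(y, n) for y, n in zip(years, logN) if y >= bp]

# linear_regression returns (slope, intercept)
slope_pre, _  = linear_regression([p[0] for p in pre],  [p[1] for p in pre])
slope_post, _ = linear_regression([p[0] for p in post], [p[1] for p in post])
# slope_pre < 0 (decline); slope_post > 0 (partial recovery)
```

A full piecewise analysis would instead estimate the breakpoint itself, e.g. by searching over candidate years and minimising residual error, and would fit the two segments jointly.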
Advancements in computer technology have enabled three-dimensional (3D) reconstruction, data-stitching, and manipulation of 3D data obtained on X-ray imaging systems such as micro-computed tomography (μ-CT). Likewise, intuitive evaluation of these 3D datasets can be enhanced by recent advances in virtual reality (VR) hardware and software. Additionally, the generation, viewing, and manipulation of 3D X-ray diffraction datasets, such as pole figures employed for texture analysis, can also benefit from these advanced visualization techniques. We present newly developed protocols for porting 3D data (as TIFF-stacks) into a Unity gaming software platform so that data may be toured, manipulated, and evaluated within a more intuitive VR environment through the use of game-like controls and 3D headsets. We demonstrate this capability by rendering μ-CT data of a polymer dogbone test bar at various stages of in situ mechanical strain. An additional experiment is presented showing 3D XRD data collected on an aluminum test block with vias. These 3D XRD data for texture analysis (χ, ϕ, 2θ dimensions) enable the viewer to visually inspect 3D pole figures and detect the presence or absence of in-plane residual macrostrain. These two examples serve to illustrate the benefits of this new methodology for multidimensional analysis.
Introduction: Oxygen is commonly administered to prehospital patients presenting with acute myocardial infarction (AMI). We conducted a systematic review to determine if oxygen administration, in AMI, impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central, clinicaltrials.gov and ISRCTN for relevant randomized controlled trials and observational studies comparing oxygen administration and no oxygen administration. The outcomes of interest were: mortality (≤30 days, in-hospital, and intermediate 2-11 months), infarct size, and major adverse cardiac events (MACE). Risk of bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. A meta-analysis was performed using RevMan 5 software. Results: Our search yielded 1192 citations, of which 48 studies were reviewed as full texts and a total of 8 studies were included in the analysis. All evidence was considered low or very low quality. Five studies reported on mortality, finding low quality evidence of no benefit or harm from supplemental oxygen administration. Similarly, no benefit or harm was found for MACE or infarct size (very low quality). Normoxia was defined as oxygen saturation measured via pulse oximetry at ≥90% in one recent study and ≥94% in another. Conclusion: We found low and very low quality evidence that the administration of supplemental oxygen to normoxic patients experiencing AMI provides no clear harm or benefit for mortality or MACE. The evidence on infarct size was inconsistent and warrants further prospective examination.
Introduction: Opioids are routinely administered for analgesia to prehospital patients experiencing chest discomfort from acute myocardial infarction (AMI). We conducted a systematic review to determine if opioid administration impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central and Clinicaltrials.gov for relevant randomized controlled trials and observational studies comparing opioid administration in AMI patients from 1990 to 2017. The outcomes of interest were: all-cause short-term mortality (≤30 days), major adverse cardiac events (MACE), platelet activity and aggregation, immediate adverse events, infarct size, and analgesia. Included studies were hand searched for additional citations. Risk of bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. Results: Our search yielded 3001 citations, of which 19 studies were reviewed as full texts and a total of 9 studies were included in the analysis. The studies predominantly reported on morphine as the opioid. Five studies reported on mortality (≤30 days), seven on MACE, four on platelet activity and aggregation, two on immediate adverse events, two on infarct size and none on analgesic effect. We found low quality evidence suggesting no benefit or harm in terms of mortality or MACE. However, low quality evidence indicates that opioids increase infarct size. Low quality evidence also shows reduced serum active metabolite levels of P2Y12 inhibitors (e.g., clopidogrel and ticagrelor) and increased platelet reactivity in the first several hours post administration, along with an increase in vomiting.
Conclusion: We found low and very low quality evidence that the administration of opioids in STEMI may be adversely related to vomiting and some surrogate outcomes, including increased infarct size, reduced serum P2Y12 inhibitor metabolite levels, and increased platelet activity. We found no clear benefit or harm on patient-oriented clinical outcomes including mortality.
Introduction: Early and accurate diagnosis of critical conditions is essential in emergency medical services (EMS). Serum lactate testing may be used to identify patients with worse prognosis, including sepsis. Recently, the use of a point-of-care lactate (POCL) test has been evaluated in guiding treatment in patients with sepsis. Operating as part of the Prehospital Evidence Based Practice (PEP) Program, the authors sought to identify and describe the body of evidence for POCL use in EMS and the emergency department (ED) for patients with sepsis. Methods: Following PEP methodology, in May 2018, PubMed was searched in a systematic manner. Title and abstract screening were conducted by the program coordinator. These studies were collected, appraised and added to the existing body of literature contained within the PEP database. Evidence appraisal was conducted by two reviewers who assigned both a level of evidence (LOE) on a novel three-tier scale and a direction of evidence (supportive, neutral or opposing; based on primary outcome). Data on setting and study design were also extracted. Results: Eight studies were included in our analysis. Three of these studies were conducted in the ED setting, each investigating the POCL test's ability to predict severe sepsis, ICU admission or death. All three studies found supportive results for POCL. A systematic review on the use of POCL in the ED determined that this test can also improve time to treatment. Five of the 8 studies were conducted prehospitally. Two of these studies were supportive of POCL use in the prehospital setting, in terms of feasibility and the ability to predict sepsis. Both of these study sites used this early information as part of initiating a “sepsis alert” pathway. The other three prehospital studies provided neutral support for POCL. One study demonstrated moderate ability of POCL to predict severe illness. Two studies found poor agreement between prehospital POCL and serum lactate values.
Conclusion: Limited low and moderate quality evidence suggests POCL may be feasible and helpful in predicting sepsis in the prehospital setting. However, there is sparse and inconsistent support for specific important outcomes, including accuracy.
Introduction: The Prehospital Evidence-Based Practice (PEP) program is an online, freely accessible, continuously updated Emergency Medical Services (EMS) evidence repository. This summary describes the research evidence for the identification and management of adult patients suffering from sepsis syndrome or septic shock. Methods: PubMed was searched in a systematic manner. One author reviewed titles and abstracts for relevance and two authors appraised each study selected for inclusion. Primary outcomes were extracted. Studies were scored by trained appraisers on a three-point Level of Evidence (LOE) scale (based on study design and quality) and a three-point Direction of Evidence (DOE) scale (supportive, neutral, or opposing findings based on the studies’ primary outcome for each intervention). LOE and DOE of each intervention were plotted on an evidence matrix (DOE × LOE). Results: Eighty-eight studies were included for 15 interventions listed in PEP. The interventions with the most evidence were related to identification tools (ID) (n = 26, 30%) and early goal directed therapy (EGDT) (n = 21, 24%). ID tools included Systemic Inflammatory Response Syndrome (SIRS), quick Sequential Organ Failure Assessment (qSOFA) and other unique measures. The most common primary outcomes were related to diagnosis (n = 30, 34%), mortality (n = 40, 45%) and treatment goals (e.g. time to antibiotic) (n = 14, 16%). The evidence ranks for the supported interventions were: supportive-high quality (n = 1, 7%) for crystalloid infusion; supportive-moderate quality (n = 7, 47%) for identification tools, prenotification, point-of-care lactate, titrated oxygen, and temperature monitoring; and supportive-low quality (n = 1, 7%) for vasopressors. The benefit of prehospital antibiotics and EGDT remains inconclusive with a neutral DOE. There is moderate level evidence opposing use of high flow oxygen.
Conclusion: EMS sepsis interventions are informed primarily by moderate quality supportive evidence. Several standard treatments are well supported by moderate to high quality evidence, as are identification tools. However, some standard in-hospital therapies are not supported by evidence in the prehospital setting, such as antibiotics, and EGDT. Based on primary outcomes, no identification tool appears superior. This evidence analysis can guide selection of appropriate prehospital therapies.
Introduction: Long-term immobility has detrimental effects for critically ill patients admitted to the intensive care unit (ICU), including ICU-acquired weakness. Early mobilization of patients admitted to ICU has been demonstrated to be a safe, feasible and effective strategy to improve patient outcomes. The optimal mobilization of trauma ICU patients has not been extensively studied. Our objective was to determine the impact of an early mobilization protocol on outcomes among trauma patients admitted to the ICU. Methods: We analyzed all adult trauma patients (>18 years old) admitted to ICU over a 2-year period prior to and following implementation of an early mobilization protocol, allowing for a 1-year transition period. Data were collected from the Nova Scotia Trauma Registry. We compared patient characteristics and outcomes (mortality, length of stay [LOS], ventilator days) between the pre- and post-implementation groups. Associations between early mobilization and clinical outcomes were estimated using binary and linear regression models. Results: Overall, there were 526 patients included in the analysis (292 pre-implementation, 234 post-implementation). The study population ranged in age from 18 to 92 years (mean age 49.0 ± 20.4 years) and 74.3% of all patients were male. The pre- and post-implementation groups were similar in age, sex, and injury severity. In-hospital mortality was reduced in the post-implementation group (25.3% vs. 17.5%; p = 0.031). In addition, there was a reduction in ICU mortality in the post-implementation group (21.6% vs. 12.8%; p = 0.009). We did not observe any difference in overall hospital LOS, ICU LOS, or ventilator days between the two groups. Compared to the pre-implementation period, trauma patients admitted to the ICU following protocol implementation were less likely to die in-hospital (OR = 0.52, 95% CI 0.30-0.91; p = 0.021) or in the ICU (OR = 0.40, 95% CI 0.21-0.76, p = 0.005).
Results were similar following a sensitivity analysis limited to patients with blunt or penetrating injuries. There was no difference between the pre- and post-implementation groups with respect to in-hospital LOS, ICU LOS, or the number of ventilator days. Conclusion: We found that trauma patients admitted to ICU during the post-implementation period had decreased odds of in-hospital mortality and ICU mortality. Ours is the first study to demonstrate a significant reduction in trauma mortality following implementation of an ICU mobility protocol.
Introduction: Previous systematic reviews suggest early mobilization in the intensive care unit (ICU) population is feasible, safe, and may improve outcomes. Only one review investigated mobilization specifically in trauma ICU patients and failed to identify any relevant articles. The objective of the present systematic review was to conduct an up-to-date search of the literature to assess the effect of early mobilization in adult trauma ICU patients on mortality, length of stay (LOS) and duration of mechanical ventilation. Methods: We performed a systematic search of four electronic databases (Ovid MEDLINE, Embase, CINAHL, Cochrane Library) and the grey literature. To be included, studies must have compared early mobilization to delayed or no mobilization among trauma patients admitted to the ICU. Meta-analysis was performed to determine the effect of early mobilization on mortality, hospital LOS, ICU LOS, and duration of mechanical ventilation. Results: The search yielded 2,975 records from the 4 databases and 7 records from grey literature and bibliographic searches; of these, 9 articles met all eligibility criteria and were included in the analysis. There were 7 studies performed in the United States, 1 study from China and 1 study from Norway. Study populations included neurotrauma (3 studies), blunt abdominal trauma (2 studies), mixed injury types (2 studies) and burns (1 study). Cohorts ranged in size from 15 to 1,132 patients (median, 63) and varied in inclusion criteria. Most studies used some form of stepwise progressive mobility protocol. Two studies used simple ambulation as the mobilization measure, and 1 study employed upright sitting as their only intervention. Time to commencement of the intervention was variable across studies, and only 2 studies specified the timing of mobilization initiation. We did not detect a difference in mortality with early mobilization, although the pooled risk ratio (RR) was reduced (RR 0.90, 95% CI 0.74 to 1.09). 
Hospital LOS and ICU LOS were decreased with early mobilization, though this difference did not reach significance. Duration of mechanical ventilation was significantly shorter in the early mobilization group (mean difference −1.18, 95% CI −2.17 to −0.19). Conclusion: Our review identified few studies that examined mobilization of critically ill trauma patients in the ICU. On meta-analysis, early mobilization was found to reduce duration of mechanical ventilation, but the effects on mortality and LOS were not significant.
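Pooled estimates like the mean difference quoted above are typically obtained by inverse-variance weighting of per-study effects. A minimal sketch of fixed-effect inverse-variance pooling, with hypothetical per-study mean differences and standard errors (not the review's actual data):

```python
import math

def pool_fixed(effects, ses, z=1.96):
    """Fixed-effect inverse-variance pooling of study effect sizes
    (here: mean differences, e.g. in ventilator days). Each study is
    weighted by 1/SE^2; a sketch of the standard formula, not RevMan."""
    w = [1 / se**2 for se in ses]
    est = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    se_pooled = math.sqrt(1 / sum(w))
    return est, est - z * se_pooled, est + z * se_pooled

# Hypothetical per-study mean differences and standard errors
est, lo, hi = pool_fixed([-1.5, -0.8, -1.2], [0.9, 0.7, 1.1])
```

A random-effects model (e.g. DerSimonian-Laird) would additionally estimate between-study heterogeneity and widen the interval accordingly.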
Influenza and respiratory syncytial virus (RSV) are common causes of respiratory tract infections and place a burden on health services each winter. Systems to describe the timing and intensity of such activity will improve the public health response and deployment of interventions to these pressures. Here we develop early warning and activity intensity thresholds for monitoring influenza and RSV using two novel data sources: general practitioner out-of-hours consultations (GP OOH) and telehealth calls (NHS 111). Moving Epidemic Method (MEM) thresholds were developed for winter 2017–2018. The NHS 111 cold/flu threshold was breached several weeks in advance of other systems. The NHS 111 RSV epidemic threshold was breached in week 41, in advance of RSV laboratory reporting. Combining the use of MEM thresholds with daily monitoring of NHS 111 and GP OOH syndromic surveillance systems provides the potential to alert to threshold breaches in real-time. An advantage of using thresholds across different health systems is the ability to capture a range of healthcare-seeking behaviour, which may reflect differences in disease severity. This study also provides a quantifiable measure of seasonal RSV activity, which contributes to our understanding of RSV activity in advance of the potential introduction of new RSV vaccines.
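The MEM thresholds described above are, in essence, upper confidence limits on historical pre-epidemic activity. A deliberately simplified sketch follows; the real MEM splits each historical season and models the highest pre-epidemic values, whereas here a single prediction-interval bound is taken over hypothetical seasonal peak rates:

```python
from statistics import mean, stdev
import math

def mem_style_threshold(pre_epidemic_values, z=1.96):
    """Simplified MEM-style epidemic threshold: an upper prediction bound
    on historical pre-epidemic activity. Illustrative sketch only; the
    Moving Epidemic Method proper is more elaborate."""
    m = mean(pre_epidemic_values)
    s = stdev(pre_epidemic_values)
    n = len(pre_epidemic_values)
    return m + z * s * math.sqrt(1 + 1 / n)

# Hypothetical weekly NHS 111 cold/flu call rates from past seasons (per 100,000)
threshold = mem_style_threshold([12.0, 14.5, 11.2, 15.8, 13.1])
week_rate = 18.0
alert = week_rate > threshold  # a breach would trigger an early-warning alert
```

In daily surveillance, each incoming weekly rate is compared against the pre-computed threshold, which is how breaches can be flagged in real time.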
OBJECTIVES/SPECIFIC AIMS: (1) To review the community’s recommendations on how to rebuild trust in the Flint community; (2) to review effective community engagement strategies utilized with the Flint Special Projects for project conceptualization, participant recruitment, data analysis, project oversight, and dissemination. METHODS/STUDY POPULATION: The study population includes nearly two hundred residents representing seniors, youth and diverse ethnicities recruited to participate in eleven focus group meetings. The population also represents the general public who attended informational meetings in Flint, Michigan, held to explain the crisis and allow residents to voice their opinions and concerns during its onset. The project is a mixed-methods community-based participatory research effort that utilized community decision making in all phases of the effort (pre-conception, implementation, dissemination and advocacy) to ensure that the community’s recommendations are adopted at the policy and institutional-responsiveness levels. It includes three community-engaged research efforts: (project 1) a qualitative analysis of community sentiment provided during 17 recorded legislative, media and community events, and (projects 2-3) two mixed-methods efforts utilizing purposive sampling of stakeholders whose voices may not have been heard. RESULTS/ANTICIPATED RESULTS: The project presents a qualitative analysis of the community’s voice during the onset of the man-made disaster, when the community first became aware of the emergency manager’s plans to switch the water source. It also reflects current perspectives of community voice, since the projects are scheduled to end in late February 2019.
Findings from a trust measure administered to nearly two hundred residents will be presented, along with a qualitative analysis of focus group findings among segments of the population (seniors, youth, and diverse ethnicities) who may have been left out of narratives on the water crisis. Finally, the project will compare empowerment and resiliency approaches being utilized in Flint, Michigan to recover from the disaster with other approaches grounded in literature and theory. DISCUSSION/SIGNIFICANCE OF IMPACT: Communities of color often experience social determinants of health which negatively impact their health, well-being and human rights. Some Flint citizens are experiencing negative health consequences (e.g., rashes, brain and behavioral sequelae, and fertility problems) as a result of the disaster, and are uncertain of health outcomes in the future. This is the first project to rigorously document and analyze levels of trust and mistrust in the city of Flint since the water disaster occurred. The qualitative research will guide future clinical research that will benefit this traumatized community experiencing high levels of mistrust (e.g., of government and elected officials). The community engaged methodology involved residents and study participants in all phases of the project including project oversight, validating and analyzing data, and dissemination. This methodology will contribute to existing literature and theory on community based participatory research, community engaged research, team science and citizen science. The approaches empowered a call to action among residents; for example, seniors who attended two senior focus group sessions shared that “they are hopeful and have a purpose,” resulting in the creation of a council (with officers) at their housing complex to advocate for the well-being of seniors during the recovery process.
Recruitment methodologies were extremely successful due to resident level trust in community leaders and community partner organizations. Finally, the project’s examination of approaches encouraging empowerment and resiliency will provide lessons learned for other communities challenged with crisis.
Mismatch negativity (MMN) is an event-related potential (ERP) component reflecting auditory predictive coding. Repeated standard tones evoke increasing positivity (‘repetition positivity’; RP), reflecting strengthening of the standard's memory trace and the prediction that it will recur. Likewise, deviant tones preceded by more standard repetitions evoke greater negativity (‘deviant negativity’; DN), reflecting stronger prediction error signaling. These memory trace effects are also evident in the MMN difference wave. Here, we assess group differences and test-retest reliability of these indices in schizophrenia patients (SZ) and healthy controls (HC).
Electroencephalography was recorded twice, 2 weeks apart, from 43 SZ and 30 HC, during a roving standard paradigm. We examined ERPs to the third, eighth, and 33rd standards (RP), immediately subsequent deviants (DN), and the corresponding MMN. Memory trace effects were assessed by comparing amplitudes associated with the three standard repetition trains.
Compared with controls, SZ showed reduced MMNs and DNs, but normal RPs. Both groups showed memory trace effects for RP, MMN, and DN, with a trend for attenuated DNs in SZ. Intraclass correlations obtained via this paradigm indicated good-to-moderate reliabilities for overall MMN, DN and RP, but moderate-to-poor reliabilities for components associated with short, intermediate, and long standard trains, and poor reliability of their memory trace effects.
MMN deficits in SZ reflected attenuated prediction error signaling (DN), with relatively intact predictive code formation (RP) and memory trace effects. This roving standard MMN paradigm requires additional development/validation to obtain suitable levels of reliability for use in clinical trials.
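The test-retest reliabilities reported above are intraclass correlations. A compact sketch of a two-session consistency ICC (the Shrout-Fleiss ICC(3,1) form is assumed here; the abstract does not specify which ICC model the study used) built from the two-way ANOVA mean squares:

```python
from statistics import mean

def icc_consistency(session1, session2):
    """ICC(3,1) consistency for two-session test-retest data, computed
    from two-way mean squares. Illustrative sketch, not the study's
    exact analysis pipeline."""
    n, k = len(session1), 2
    row_means = [mean(r) for r in zip(session1, session2)]  # per-subject means
    col_means = [mean(session1), mean(session2)]            # per-session means
    grand = mean(session1 + session2)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for x in session1 + session2)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Identical amplitudes across sessions give perfect reliability (ICC = 1)
icc_perfect = icc_consistency([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0])
```

In practice, subject-by-session ERP amplitudes (e.g. MMN at a given electrode) would be fed in, and values below roughly 0.7 are conventionally regarded as inadequate for clinical-trial endpoints.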