OBJECTIVES/GOALS: Our overall goal is to identify the processes used by the human visual system to encode visual stimuli into perceptual representations. In this project, our objectives are (i) to collect a dataset of human neural activity in response to 1000 naturalistic color images and (ii) to determine how image parameters drive different parts of the human brain.

METHODS/STUDY POPULATION: We recorded iEEG data in 4 human subjects who had been implanted for epilepsy monitoring. Each subject was presented with 10 sets of 100 naturalistic stimuli, taken from the Natural Scenes Dataset (Allen et al., 2021), shown on a screen for 1 second each with 1-second rest intervals between stimuli. The subjects were instructed to fixate on a red dot at the center of the screen and, to encourage attentiveness, were prompted at the end of each set to recall whether they had seen 3 additional test stimuli. We identified significant neural responses at each electrode by comparing evoked potentials and high-frequency power changes during each stimulus vs. rest. Electrodes with significant responses were then mapped to anatomic locations in each subject's brain and then collectively to a standard brain.

RESULTS/ANTICIPATED RESULTS: The natural image set elicited significant evoked potentials and high-frequency responses at electrodes in each subject. Response latencies, from 80 to 300 ms after stimulus onset, traced the evolution of visual processing along the visual pathways, through key sites such as the early visual cortex, ventral temporal cortex, intraparietal sulcus, and frontal eye field. These responses differed significantly from those elicited by simple patterns, which drove the early visual cortex but engaged later regions much less.

DISCUSSION/SIGNIFICANCE: These data show that the human brain responds differently to more complex images. Determining the human brain's response to naturalistic images is essential for encoding models that describe processing in the human visual system. Such models may advance future efforts in electrical neurostimulation therapies, such as restoring vision.
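As a rough illustration of the response test described above, the sketch below compares per-epoch high-frequency broadband power between stimulus and rest at a single electrode. The 70-170 Hz band, the 1 kHz sampling rate, and the rank-sum test are assumptions for illustration, not the authors' exact pipeline.

    # Minimal sketch, not the authors' pipeline: test one electrode for a
    # significant high-frequency (70-170 Hz, assumed band) power increase
    # during stimulus epochs relative to rest epochs.
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert
    from scipy.stats import ranksums

    FS = 1000  # sampling rate in Hz (assumed)

    def broadband_power(x, lo=70.0, hi=170.0):
        """Mean analytic-signal power of x within the [lo, hi] Hz band."""
        b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        envelope = np.abs(hilbert(filtfilt(b, a, x)))
        return np.mean(envelope ** 2)

    def electrode_response_pvalue(stim_epochs, rest_epochs):
        """Rank-sum test of per-epoch broadband power, stimulus vs. rest.
        Both inputs are arrays of shape (n_epochs, n_samples)."""
        stim = [broadband_power(e) for e in stim_epochs]
        rest = [broadband_power(e) for e in rest_epochs]
        return ranksums(stim, rest).pvalue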
ABSTRACT IMPACT: This study characterizes interactions between human limbic circuitry and ventral temporal cortex using single-pulse electrical stimulation, which may inform emerging stimulation therapies for epilepsy.

OBJECTIVES/GOALS: The goal of electrical brain stimulation treatment is to modulate brain network function. However, stimulation inputs to different brain sites alter the network in a variety of ways. This study examines that variability by characterizing responses in a target region while stimulating multiple other brain sites.

METHODS/STUDY POPULATION: We measured voltages in intracranial EEG in 6 patients who had electrodes implanted for epilepsy monitoring. We stimulated pairs of electrodes at multiple sites in the brain with a single pulse every 5 to 7 s and measured the resulting corticocortical evoked potential (CCEP) responses in the ventral temporal cortex (VTC). Using a novel clustering method, we uncovered sets of distinct canonical response shapes in the 20 to 500 ms post-stimulation period, which allowed us to group stimulation sites that evoked similar responses. We then related each group to high-frequency broadband changes in spectral power as a reflection of local neuronal activity.

RESULTS/ANTICIPATED RESULTS: We found that the VTC receives strong inputs specifically from the amygdala and hippocampus, in terms of both amplitude and broadband spectral power change. However, inputs from the hippocampus produced a different canonical shape than those from the amygdala. We also observed that VTC responses to inputs from the insula clustered in shape with those from the amygdala. These clustering patterns were consistent across subjects, although the actual shapes of the clusters showed variability. We further observed that some shapes were more associated with increases in overall neuronal activity than others, as reflected by broadband spectral power change.

DISCUSSION/SIGNIFICANCE OF FINDINGS: Stimulation of connected sites may drive excitability at the target region in ways that are described by sets of full-time-course responses. By capturing their shapes, we can begin to decipher canonical input types at the circuit level. This approach might identify how stimulation inputs can be tailored to therapy while mitigating adverse effects.
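The shape-grouping step might look something like the sketch below, with plain k-means on unit-energy waveforms standing in for the paper's novel clustering method; the window bounds, sampling rate, and cluster count are placeholders.

    # Illustrative sketch only: substitutes k-means for the authors' method.
    import numpy as np
    from sklearn.cluster import KMeans

    FS = 1000                                   # sampling rate in Hz (assumed)
    T0, T1 = int(0.020 * FS), int(0.500 * FS)   # 20-500 ms post-stimulation window

    def cluster_ccep_shapes(ccep_by_site, n_clusters=4):
        """ccep_by_site: (n_sites, n_samples) mean CCEP per stimulation site,
        time-locked to the pulse. Returns one cluster label per site."""
        window = ccep_by_site[:, T0:T1]
        # Normalize each response to unit energy so clusters reflect
        # shape rather than amplitude.
        shapes = window / (np.linalg.norm(window, axis=1, keepdims=True) + 1e-12)
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(shapes)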
In the midwestern United States, biotypes of giant ragweed resistant to multiple herbicide biochemical sites of action have been identified. Weeds with resistance to multiple herbicides reduce the utility of existing herbicides and necessitate the development of alternative weed control strategies. In two experiments in southeastern Minnesota, we determined the effect of six 3-yr crop-rotation systems containing corn, soybean, wheat, and alfalfa on giant ragweed seedbank depletion and emergence patterns. The six crop-rotation systems included continuous corn, soybean–corn–corn, corn–soybean–corn, soybean–wheat–corn, soybean–alfalfa–corn, and alfalfa–alfalfa–corn. The crop-rotation system had no effect on the amount of seedbank depletion when a zero-weed threshold was maintained, with an average of 96% of the giant ragweed seedbank being depleted within 2 yr. Seedbank depletion occurred primarily through seedling emergence in all crop-rotation systems. However, seedling emergence tended to account for more of the seedbank depletion in rotations containing only corn or soybean compared with rotations with wheat or alfalfa. Giant ragweed emerged early across all treatments, with on average 90% emergence occurring by June 4. Duration of emergence was slightly longer in established alfalfa compared with other cropping systems. These results indicate that corn and soybean rotations are more conducive to giant ragweed emergence than rotations including wheat and alfalfa, and that adopting a zero-weed threshold is a viable approach to depleting the weed seedbank in all crop-rotation systems.
Boarding of admitted patients decreases emergency department (ED) capacity to accommodate daily patient surge. Boarding in regional hospitals may decrease the ability to meet community needs during a public health emergency. This study examined differences in regional patient boarding times across the United States and in regions at risk for public health emergencies.
A retrospective cross-sectional analysis was performed by using 2012 ED visit data from the American Hospital Association (AHA) database and 2012 hospital ED boarding data from the Centers for Medicare and Medicaid Services Hospital Compare database. Hospitals were grouped into hospital referral regions (HRRs). The primary outcome was mean ED boarding time per HRR. Spatial hot spot analysis examined boarding time spatial clustering.
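The primary-outcome aggregation reduces to a group-by over hospitals; the sketch below assumes a merged table with hypothetical column names ('hrr', 'boarding_minutes'), since the actual AHA and Hospital Compare field names differ, and omits the spatial hot spot step (e.g., Getis-Ord Gi*), which would additionally require spatial weights.

    # Minimal sketch of the primary outcome: mean ED boarding time per HRR.
    import pandas as pd

    def mean_boarding_by_hrr(hospitals: pd.DataFrame) -> pd.Series:
        """hospitals: one row per hospital, with its HRR assignment and its
        Hospital Compare boarding time. Returns mean minutes per HRR."""
        return (hospitals.groupby("hrr")["boarding_minutes"]
                         .mean()
                         .sort_values(ascending=False))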
A total of 3317 of 4671 (71%) hospitals were included in the study cohort. Forty-five high-boarding-time HRRs clustered along the East and West coasts, and 67 low-boarding-time HRRs clustered in the Midwest/Northern Plains regions. Among HRRs at risk for a terrorist event, 86% had high boarding times, as did 36% of HRRs with frequent natural disasters.
Urban, coastal areas have the longest boarding times and are clustered with other high-boarding-time HRRs. Longer boarding times suggest a heightened level of vulnerability and a need to enhance surge capacity, because these regions have difficulty meeting daily emergency care demands and are at increased risk for disasters.
As herbicide-resistant weed populations become increasingly problematic in crop production, alternative strategies of weed control are necessary. Giant ragweed, one of the most competitive agricultural weeds in row crops, has evolved resistance to multiple herbicide biochemical sites of action within the plant, necessitating the development of new and integrated methods of weed control. This study assessed the quantity and duration of seed retention of giant ragweed grown in soybean fields and adjacent field margins. Seed retention of giant ragweed was monitored weekly during the 2012 to 2014 harvest seasons using seed collection traps. Giant ragweed produced an average of 1,818 seeds per plant, with 66% being potentially viable. On average, giant ragweed began shattering hard (potentially viable) and soft (nonviable) seeds on September 12 and continued through October, at average rates of 0.75% and 0.44% of total seeds per day during September and October, respectively. Giant ragweed seeds remained on the plants well into the Minnesota soybean harvest season, with an average of 80% of the total seeds retained on October 11, when Minnesota soybean harvest was approximately 75% completed in the years of the study. These results suggest that there is sufficient time to remove escaped giant ragweed from production fields and field margins before the seeds shatter by managing weed seed dispersal before or at crop harvest. Controlling weed seed dispersal has potential to manage herbicide-resistant giant ragweed by limiting replenishment of the weed seed bank.
Influenza A (H1N1) pdm09 became the predominant circulating strain in the United States during the 2013–2014 influenza season. Little is known about the epidemiology of severe influenza during this season.
A retrospective cohort study of severely ill patients with influenza infection in intensive care units in 33 US hospitals from September 1, 2013, through April 1, 2014, was conducted to determine risk factors for mortality present on intensive care unit admission and to describe patient characteristics, spectrum of disease, management, and outcomes.
A total of 444 adults and 63 children were admitted to an intensive care unit in a study hospital; 93 adults (20.9%) and 4 children (6.3%) died. By logistic regression analysis, the following factors were significantly associated with mortality among adult patients: older age (>65 years, odds ratio, 3.1 [95% CI, 1.4–6.9], P=.006 and 50–64 years, 2.5 [1.3–4.9], P=.007; reference age 18–49 years), male sex (1.9 [1.1–3.3], P=.031), history of malignant tumor with chemotherapy administered within the prior 6 months (12.1 [3.9–37.0], P<.001), and a higher Sequential Organ Failure Assessment score (for each increase by 1 in score, 1.3 [1.2–1.4], P<.001).
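For readers parsing the statistics above: the odds ratios and confidence intervals follow from the fitted logistic coefficients in the standard way (a generic relation, not something taken from the paper's methods):

    \[
      \mathrm{OR}_j = e^{\beta_j}, \qquad
      95\%\ \mathrm{CI} = \left( e^{\beta_j - 1.96\,\mathrm{SE}_j},\; e^{\beta_j + 1.96\,\mathrm{SE}_j} \right)
    \]

For example, the odds ratio of 3.1 for age >65 years corresponds to a coefficient of roughly ln 3.1 ≈ 1.13 on the log-odds scale.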
During the 2013–2014 season, the first postpandemic season in which influenza A (H1N1) pdm09 again predominated, risk factors for death among US patients with severe influenza shifted toward those of a more typical epidemic influenza season.
Although rates of anxiety tend to decrease across late life, anxiety increases among a subset of older adults: those with mild cognitive impairment (MCI) or dementia. Our understanding of anxiety in dementia is limited, in part, by a lack of anxiety measures designed for use with this population. This study sought to address limitations of the literature by developing a new measure of anxiety for cognitively impaired individuals, the Anxiety in Cognitive Impairment and Dementia (ACID) Scales, which include both proxy (ACID-PR) and self-report (ACID-SR) versions.
The ACID-SR and ACID-PR were administered to 45 residents, aged 60 years and older, of three long-term care (LTC) facilities, and to 38 professional caregivers at these facilities. Other measures of anxiety, as well as measures of depression, functional ability, cognition, and general physical and mental health, were also administered.
Initial evaluation of its psychometric properties revealed adequate to good internal consistency for the ACID-PR and ACID-SR. Evidence for convergent validity of measures obtained with the ACID-SR and ACID-PR was demonstrated by moderate-to-strong associations with measures of worry, depressive symptoms, and general mental health. Discriminant validity of measures obtained with the ACID-SR and ACID-PR was demonstrated by weak correlations with measures of cognition, functional ability, and general physical well-being.
The preliminary results suggest that the ACID-SR and ACID-PR yield reliable and valid measures of anxiety among individuals with cognitive impairment. Given the subjective nature of anxiety, it may be prudent to collect self-report of anxiety symptoms even among those with moderate cognitive impairment.
Cannabis use is high amongst young people who have recently had their first episode of psychosis, and is associated with worse outcomes. To date, interventions to reduce cannabis consumption have been largely ineffective, and it has been suggested that longer treatment periods are required.
In a pragmatic single-blind randomized controlled trial 110 participants were randomly allocated to one of three conditions: a brief motivational interviewing and cognitive behavioural therapy (MI-CBT) intervention (up to 12 sessions over 4.5 months) with standard care from an early intervention service; a long MI-CBT intervention (up to 24 sessions over 9 months) with standard care; or standard care alone. The primary outcome was change in cannabis use as measured by Timeline Followback.
Neither the extended nor the brief intervention conferred benefit over standard care in terms of reductions in the frequency or amount of cannabis use. The interventions also did not improve the assessed clinical outcomes, including symptoms, functioning, hospital admissions, or relapse.
Integrated MI and CBT for people with cannabis use and recent-onset psychosis does not reduce cannabis use or improve clinical outcomes. These findings are consistent with those in the published literature, and additionally demonstrate that offering a more extended intervention does not confer any advantage. Many participants were not at an action stage for change and for those not ready to reduce or quit cannabis, targeting associated problems rather than the cannabis use per se may be the best current strategy for mental health services to adopt.
The Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) project will test the overarching hypothesis that an active hydrological system exists beneath a West Antarctic ice stream that exerts a major control on ice dynamics and on the metabolic and phylogenetic diversity of the microbial community in subglacial water and sediment. WISSARD will explore Subglacial Lake Whillans (SLW, unofficial name) and its outflow toward the grounding line, where it is thought to enter the Ross Ice Shelf seawater cavity. Introducing microbial contamination to the subglacial environment during drilling operations could compromise environmental stewardship and the science objectives of the project; consequently, we developed a set of tools and procedures to directly address these issues. WISSARD hot water drilling efforts will include a custom water treatment system designed to remove micron- and submicron-sized particles (biotic and abiotic), irradiate the drilling water with germicidal ultraviolet (UV) radiation, and pasteurize the water to reduce the viability of persisting microbial contamination. Our clean access protocols also include methods to reduce microbial contamination on the surfaces of cables/hoses and down-borehole equipment using germicidal UV exposure and chemical disinfection. This paper presents experimental data showing that our protocols will meet expectations established by international agreement between participating Antarctic nations.
We describe a population of compact objects in the centre of the Fornax Cluster which were discovered as part of our 2dF Fornax Spectroscopic Survey. These objects have spectra typical of old stellar systems, but are unresolved on photographic sky survey plates. They have absolute magnitudes −13 < MB < −11, so they are 10 times more luminous than any Galactic globular clusters, but fainter than any known compact dwarf galaxies. These objects are all within 30 arcminutes of the central galaxy of the cluster, NGC 1399, but are distributed over larger radii than the globular cluster system of that galaxy. We suggest that these objects are either super-massive star clusters (intra-cluster globular clusters or tidally stripped nuclei of dwarf galaxies) or a new type of low-luminosity, compact elliptical dwarf (‘M32-type’) galaxy. The best way to test these hypotheses will be to obtain high-resolution imaging and high-dispersion spectroscopy to determine their structures and mass-to-light ratios. This will allow us to compare them to known compact objects and establish whether they represent a new class of hitherto unknown stellar system.
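The factor of 10 follows directly from the magnitude-luminosity relation, taking MB ≈ −10.5 as an approximate value for the brightest Galactic globular clusters (an illustrative figure, not one from the paper):

    \[
      \frac{L_1}{L_2} = 10^{\,0.4\,(M_2 - M_1)}, \qquad
      10^{\,0.4\,\left(-10.5 - (-13)\right)} = 10^{1.0} = 10
    \]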
Mass casualty triage is the process of prioritizing multiple victims when resources are not sufficient to treat everyone immediately. No national guideline for mass casualty triage exists in the United States. The lack of a national guideline has resulted in variability in triage processes, tags, and nomenclature. This variability has the potential to inject confusion and miscommunication into the disaster incident, particularly when multiple jurisdictions are involved. The Model Uniform Core Criteria for Mass Casualty Triage were developed to be a national guideline for mass casualty triage to ensure interoperability and standardization when responding to a mass casualty incident. The Core Criteria consist of 4 categories: general considerations, global sorting, lifesaving interventions, and individual assessment of triage category. The criteria within each of these categories were developed by a workgroup of experts representing national stakeholder organizations who used the best available science and, when necessary, consensus opinion. This article describes how the Model Uniform Core Criteria for Mass Casualty Triage were developed.
Objectives: In late June 2006, Ethiopia's Oromiya Region was affected by an outbreak of acute watery diarrhea, subsequently confirmed to be caused by Vibrio cholerae O1, a pathogen not known to be endemic to this area. Despite initial control efforts, the outbreak quickly spread to neighboring zones and regions. The Oromiya Health Bureau required public health assistance to investigate the outbreak, determine potential causes, and assess the adequacy of the response, particularly given the concern that the number of cases being reported by health care personnel might represent only a fraction of what actually existed in the community.
Methods: A physician-epidemiologist–led team assessed the Guji, Bale, and East Shewa zones from September 15 to October 9, 2006. Using a purposive sample, we surveyed health bureau staff, cholera treatment center (CTC) staff, and community members; assessed CTC sites; and interviewed key personnel of the various organizations responding to the outbreak.
Results: The cholera cases mapped along the Ganale River. The individual attack rates were low (ranging from ~0.03% to ~4.12%), as was the overall attack rate for all 3 zones (almost 0.50%). The individual CTC case fatality rates ranged from 0% to 6.4%, and the overall case fatality rate was 1.11%. There was a trend toward men being disproportionately affected. This outbreak resulted primarily from poor sanitation and insufficient access to clean water. In Oromiya, the outbreak was addressed by a prompt and effective response, which included village chairmen at the community level. The use of community-based workers was successful and likely contributed significantly to control of the outbreak.
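For reference, the two rates quoted above carry their standard epidemiological definitions:

    \[
      \text{attack rate} = \frac{\text{cases}}{\text{population at risk}} \times 100\%, \qquad
      \text{case fatality rate} = \frac{\text{deaths}}{\text{cases}} \times 100\%
    \]

so the overall case fatality rate of 1.11% corresponds to roughly 11 deaths per 1,000 treated cases.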
Conclusion: Future epidemics will undoubtedly occur unless basic water and sanitation deficiencies are properly addressed. This outbreak prompts the need for increased local public health capacity to apply prevention strategies and establish ongoing surveillance. Signatories to the World Health Organization International Health Regulations must report outbreaks of nonendemic diseases.
Poor oral health influences the dietary quality of older individuals. The objective of the present study was to relate the number of teeth to adherence to the 2005 Dietary Guidelines for Americans among an ethnically diverse sample of older adults.
A block cluster design was used to obtain a sample of older adults. Data were weighted to census data for ethnicity and gender. Dietary intakes were assessed using an FFQ and converted into Healthy Eating Index-2005 (HEI-2005) scores.
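Conversion from FFQ intakes to HEI-2005 component scores is essentially a capped density ratio; the sketch below is a minimal illustration with made-up standards, not the official 12-component scoring algorithm (which also includes reverse-scored moderation components).

    # Illustrative only: one density-based HEI-2005 component score.
    # Standards and point values here are placeholders, not the official ones.
    def component_score(intake_per_1000kcal, standard_per_1000kcal, max_points):
        """Score rises linearly with intake density, capped at max_points."""
        ratio = intake_per_1000kcal / standard_per_1000kcal
        return max_points * min(ratio, 1.0)

    # e.g. 0.4 cup-equivalents of fruit per 1,000 kcal against an assumed
    # standard of 0.8 earns half of a 5-point component:
    half = component_score(0.4, 0.8, 5)   # -> 2.5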
Two counties in North Carolina, USA, with large African-American and American Indian populations.
Community-dwelling older adults (n = 635).
Three hundred and twenty-six participants had severe tooth loss (0–10 teeth remaining), compared with 305 participants with 11+ teeth. After controlling for socio-economic factors, those with 0–10 teeth had lower total HEI-2005 scores and consumed less Total Fruit, Meat and Beans, and Oils, and more energy from Solid Fat, Alcohol and Added Sugar, compared with those with 11+ teeth. Less than 1% of those with 0–10 teeth and 4% of those with 11+ teeth met overall HEI-2005 recommendations. Those with 0–10 teeth were less likely to meet recommendations for Total Vegetables, Dark Green and Orange Vegetables, and energy from Solid Fat, Alcohol and Added Sugar.
Older adults with severe tooth loss are less likely than those with moderate to low tooth loss to meet current dietary recommendations. Nutrition interventions for older adults should take oral health status into consideration and include strategies that specifically address this as a barrier to healthful eating.
The role of viruses and M. pneumoniae in episodes of acute respiratory illness in childhood has been studied in a London general practice. The total isolation rate was 31.7%, but the rate varied from 32.6% in upper respiratory infections to 64.0% in pneumonia. The clinical features associated with infection were influenced not only by the type of agent but also by age and other host factors in the infected children. Rhinoviruses were more commonly isolated than any other agent and were frequently associated with wheezy bronchitis.
The potential for outbreaks of epidemic disease among displaced residents was a significant public health concern in the aftermath of Hurricane Katrina. In response, the Mississippi Department of Health (MDH) and the American Red Cross (ARC) implemented a novel infectious disease surveillance system, in the form of a telephone “hotline”, to detect and rapidly respond to health threats in shelters.
All ARC-managed shelters in Mississippi were included in the surveillance system. A symptom-based, case reporting method was developed and distributed to shelter staff, who were linked with MDH and ARC professionals by a toll-free telephone service. Hotline staff investigated potential infectious disease outbreaks, provided assistance to shelter staff regarding optimal patient care, and helped facilitate the evaluation of ill evacuees by local medical personnel.
Forty-three shelters housing 3,520 evacuees participated in the program. Seventeen shelters made 29 calls notifying the hotline of the following cases: (1) fever (6 cases); (2) respiratory infections (37 cases); (3) bloody diarrhea (2 cases); (4) watery diarrhea (15 cases); and (5) other, including rashes (33 cases). Thirty-four of these patients were referred to a local physician or hospital for further diagnosis and disease management. Three cases of chickenpox were identified. No significant infectious disease outbreaks occurred and no deaths were reported.
The surveillance system used direct verbal communication between shelter staff and hotline managers to enable more rapid reporting, mapping, investigation, and intervention, far beyond the capabilities of a more passive or paper-based system. It also allowed for immediate feedback and education for staff unfamiliar with the diseases and reporting process. Replication of this program should be considered during future disasters when health surveillance of a large, disseminated shelter population is necessary.
The Magnetism in Massive Stars (MiMeS) Project is a consensus collaboration among the foremost international researchers of the physics of hot, massive stars, with the basic aim of understanding the origin, evolution, and impact of magnetic fields in these objects. The cornerstone of the project is the MiMeS Large Program at the Canada-France-Hawaii Telescope, a dedication of 640 hours of telescope time from 2008 to 2012. The MiMeS Large Program will exploit the unique capabilities of the ESPaDOnS spectropolarimeter to obtain critical missing information about the poorly studied magnetic properties of these important stars, to confront current models, and to guide theory.
Trilayer concentric metallic-piezoelectric-metallic microtubes are fabricated by infiltrating porous Si templates with sol precursors. LaNiO3 (LNO) is used as the inner and outer electrode material, and PbZrTiO3 (PZT) forms the middle piezoelectric layer. The structure of the microtubes is characterized in detail using scanning and transmission electron microscopy equipped with energy-dispersive X-ray spectroscopy for elemental mapping. Hysteresis of a trilayered LNO-PZT-LNO thin-film structure is shown. These trilayered tubes might find applications in inkjet printing.
Strontium titanate (STO) substrates were studied by electron paramagnetic resonance (EPR) spectroscopy to assess possible changes incurred by deposition of multiferroic thin films. To this effect, STO was vacuum annealed at a pressure of 10^-5 Torr for one hour at temperatures in the range of 200–500 °C. EPR spectra, measured before and after each anneal, revealed changes in the concentrations of three different defects: Cr3+, Fe3+, and an iron-oxygen vacancy complex, Fe3+-VO. The latter was used to monitor the diffusion of oxygen. EPR analysis showed that Fe3+-VO increases from its as-grown value, suggesting that a charged oxygen species is mobile in the substrate under film deposition conditions. Coupled with a subsequent O2 anneal showing minimal change in the Fe3+-VO signal, the data indicate a loss of oxygen from the sample during vacuum annealing. As charged oxygen vacancies may affect the substrate as well as the substrate/thin-film interface, these results are important for understanding the behavior of multiferroic devices built on STO substrates.