The peoples of southern Mesoamerica, including the Classic period Maya, are often claimed to exhibit a distinct type of spatial organization relative to contemporary urban systems. Here, we use the settlement scaling framework and properties of settlements recorded in systematic, full-coverage surveys to examine ways in which southern Mesoamerican settlement systems were both similar to and different from contemporary systems. We find that the population-area relationship in these settlements differs greatly from that reported for other agrarian settlement systems, but that more typical patterns emerge when one considers a site epicenter as the relevant social interaction area, and the population administered from a given center as the relevant interacting population. Our results imply that southern Mesoamerican populations mixed socially at a slower temporal rhythm than is typical of contemporary systems. Residential locations reflected the need to balance energetic and transport costs of farming with lower-frequency costs of commuting to central places. Nevertheless, increasing returns in activities such as civic construction were still realized through lower-frequency social mixing. These findings suggest that the primary difference between low-density urbanism and contemporary urban systems lies in the spatial and temporal rhythms of social mixing.
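As background for readers unfamiliar with the settlement scaling framework referenced above (this formulation comes from the general scaling literature, not from this study): the population–area relationship is typically modeled as a power law, and the analysis asks whether the fitted exponent matches that reported for other agrarian systems.

```latex
% Generic settlement scaling relation (assumed standard form):
%   A = a N^{\alpha}
% A: settled (or social interaction) area, N: interacting population,
% a: baseline constant, \alpha: scaling exponent.
% Exponents \alpha < 1 imply higher population density at larger N,
% i.e., increasing returns to social mixing.
A = a N^{\alpha}
\quad\Longleftrightarrow\quad
\log A = \log a + \alpha \log N
```

Fitting the log-linear form to survey data yields the exponent whose departure from typical agrarian values motivates the epicenter-based reinterpretation described above.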
There is mounting evidence for the potential for the natural dietary antioxidant and anti-inflammatory amino acid l-Ergothioneine (ERGO) to prevent or mitigate chronic diseases of aging. This has led to the suggestion that it could be considered a ‘longevity vitamin.’ ERGO is produced in nature only by certain fungi and a few other microbes. Mushrooms are, by far, the leading dietary source of ERGO, but it is found in small amounts throughout the food chain, most likely due to soil-borne fungi passing it on to plants. Because some common agricultural practices can disrupt beneficial fungus–plant root relationships, ERGO levels in foods grown under those conditions could be compromised. Thus, research is needed to further analyse the role agricultural practices play in the availability of ERGO in the human diet and its potential to improve our long-term health.
Background: Including infection preventionists (IPs) in hospital design, construction, and renovation projects is important. According to the Joint Commission, “Infection control oversights during building design or renovations commonly result in regulatory problems, millions lost and even patient deaths.” We evaluated the number of active major construction projects at our 800-bed hospital with 6.0 IP FTEs and the IP time required for oversight. Methods: We reviewed construction records from October 2018 through October 2019. We classified projects as active if any construction occurred during the study period. We describe the types of projects (inpatient, outpatient, and non–patient care) and the potential impact on patient health through infection control risk assessments (ICRAs). ICRAs were classified as class I (non–patient-care area and minimal construction activity), class II (patients are not likely to be in the area and work is small scale), class III (patient care area and work requires demolition that generates dust), and class IV (any area requiring environmental precautions). We calculated the time spent visiting construction sites and in design meetings. Results: During October 2018–October 2019, there were 51 active construction projects, with an average of 15 active sites per week. These projects ranged from a new bone marrow transplant unit, a labor and delivery expansion and renovation, and conversion of space to an inpatient unit, to multiple air-handler replacements. All 51 projects were classified as class III or class IV. We visited, on average, 4 construction sites each week for 30 minutes per site, leaving 11 sites unobserved due to time constraints. We spent an average of 120 minutes weekly, but 450 minutes would have been required to observe all 15 sites. Observing all active construction sites once weekly would have required 390 hours per year.
In addition to the observational hours, 124 hours were spent in design meetings alone, not considering the preparation time and follow-up required for these meetings. Conclusions: In a large academic medical center, IPs had time available to visit only a quarter of active projects on an ongoing basis. Increasing dedicated IP time in construction projects is essential to mitigating infection control risks in large hospitals.
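The reported time figures can be reproduced with simple arithmetic (assuming 52 observation weeks per year, a figure not stated in the abstract):

```python
# Weekly and yearly infection-preventionist (IP) observation time,
# using the figures reported in the abstract above.
sites_per_week = 15          # average active construction sites per week
minutes_per_site = 30        # observation time per site visit
sites_visited = 4            # sites actually visited each week

actual_weekly_minutes = sites_visited * minutes_per_site      # 120 min
required_weekly_minutes = sites_per_week * minutes_per_site   # 450 min

# Assumes 52 observation weeks per year (not stated in the abstract).
required_yearly_hours = required_weekly_minutes * 52 / 60     # 390.0 h
print(actual_weekly_minutes, required_weekly_minutes, required_yearly_hours)
```

The 390-hour figure in the abstract is consistent with weekly visits to all 15 sites at 30 minutes each over a full year.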
Background: Measles is a highly contagious virus that reemerged in 2019 with the highest number of reported cases in the United States since 1992. Beginning in March 2019, The Johns Hopkins Hospital (JHH) responded to an influx of patients with concern for measles as a result of outbreaks in Maryland and the surrounding states. We report the JHH Department of Infection Control and Hospital Epidemiology (HEIC) response to this measles outbreak using a multidisciplinary measles incident command system (ICS). Methods: The JHH HEIC and the Johns Hopkins Office of Emergency Management established the HEIC Clinical Incident Command Center and coordinated a multipronged response to the measles outbreak with partners from occupational health services, microbiology, the adult and pediatric emergency departments, marketing and communications, and local and state public health departments. The multidisciplinary structure rapidly developed, approved, and disseminated tools to improve the ability of frontline providers to quickly identify, isolate, and determine testing needs for patients suspected to have measles infection and to reduce the risk of secondary transmission. The tools included a triage algorithm, visitor signage, staff and patient vaccination guidance and clinics, and standard operating procedures for measles evaluation and testing. The triage algorithms were developed for phone or in-person use; they assessed measles exposure history, immune status, and symptoms, and provided guidance regarding isolation and the need for testing. The algorithms were distributed to frontline providers in clinics and emergency rooms across the Johns Hopkins Health System. The incident command team also distributed resources to community providers to reduce patient influx to JHH and staged an outdoor measles evaluation and testing site in the event of a case influx that would exceed emergency department resources.
Results: From March 2019 through June 2019, 37 patients presented with symptoms or concern for measles. Using the ICS tools and algorithms, JHH rapidly identified, isolated, and tested 11 patients with high suspicion for measles, 4 of whom were confirmed positive. Of the other 26 patients not tested, none developed measles infection. Exposures were minimized, and there were no secondary measles transmissions among patients. Conclusions: Using the ICS and development of tools and resources to prevent measles transmission, including a patient triage algorithm, the JHH team successfully identified, isolated, and evaluated patients with high suspicion for measles while minimizing exposures and secondary transmission. These strategies may be useful to other institutions and locales in the event of an emerging or reemerging infectious disease outbreak.
Disclosures: Aaron Milstone reports consulting for Becton Dickinson.
Background: During a 2017–2019 intervention in Chicago-area vSNFs to control carbapenem-resistant Enterobacteriaceae, healthcare worker adherence to hand hygiene and personal protective equipment use remained stubbornly inadequate (hand hygiene adherence, ~16% on room entry and 56% on room exit), despite educational and monitoring efforts. Little is known about vSNF staff understanding of multidrug-resistant organism (MDRO) transmission. We conducted a qualitative analysis of staff members at a vSNF that included assessment of staff perceptions of personal MDRO acquisition risk and associated personal hygiene routines when transitioning from work to home. Methods: Between September 2018 and November 2018, a PhD-candidate medical anthropologist conducted semistructured interviews with management (N = 5), nursing staff (N = 6), and certified nursing assistants (N = 6) at a vSNF in the Chicago region (Illinois) who had already received 1 year of MDRO staff education and hand hygiene adherence monitoring. More than 11 hours of semistructured interviews were collected and transcribed. Data collection and analysis included identifying how staff members related to their own risk of MDRO acquisition/infection and what personal hygiene routines they followed. Transcriptions of the data were analyzed using thematic coding aided by MAXQDA qualitative analysis software. Results: Staff members at all levels were able to describe their perceptions of MDRO acquisition risk and personal hygiene in great detail. The risk of acquiring an MDRO was perceived as a constant threat by staff members, who described germs as bad and everywhere (Table 1). The perceived threat of MDRO acquisition was connected to individual personal hygiene routines (eg, changing shoes before leaving work), which were considered important by staff members (Table 2).
Nursing staff and certified nursing assistants noted that personal hygiene was a critical factor in keeping their residents, themselves, and their families free from MDROs. Conclusions: In the context of a quality improvement campaign, vSNF healthcare workers were aware of the transmissibility of MDROs and highly motivated to prevent transmission of MDROs to themselves. Such perceptions may explain why workers are differentially adherent with infection control interventions (eg, more likely to perform hand hygiene when leaving a room than when entering one, or less likely to change gowns between residents in multibed rooms if they believe they are already personally protected by a gown). Our findings suggest that interventions to improve staff adherence to infection control measures may need to address factors beyond knowledge deficit (eg, understaffing) and may need to acknowledge self-protection as a driving motivator for staff adherence.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are endemic in the Chicago region. We assessed the regional impact of a CRE control intervention targeting high-prevalence facilities, that is, long-term acute-care hospitals (LTACHs) and ventilator-capable skilled nursing facilities (vSNFs). Methods: In July 2017, an academic–public health partnership launched a regional CRE prevention bundle: (1) identifying patient CRE status by querying Illinois’ XDRO registry and by periodic point-prevalence surveys reported to public health; (2) cohorting or private rooms with contact precautions for CRE patients; (3) hand hygiene adherence monitoring combined with general infection control education and guidance from project coordinators and public health; and (4) daily chlorhexidine gluconate (CHG) bathing. Informed by epidemiology and modeling, we targeted LTACHs and vSNFs within a 13-mile radius of the coordinating center. Illinois mandates CRE reporting to the XDRO registry, which can also be manually queried or generate automated alerts to facilitate interfacility communication. The regional intervention promoted increased automation of alerts to hospitals. The prespecified primary outcome was incident clinical CRE culture reported to the XDRO registry in Cook County by month, analyzed by segmented regression modeling. A secondary outcome was colonization prevalence measured by serial point-prevalence surveys for carbapenemase-producing organism colonization in LTACHs and vSNFs. Results: All eligible LTACHs (n = 6) and vSNFs (n = 9) participated in the intervention. One vSNF declined CHG bathing. vSNFs that implemented CHG bathing typically bathed residents 2–3 times per week instead of daily. Overall, there were significant gaps in infection control practices, especially in vSNFs. Also, 75 Illinois hospitals adopted automated alerts (56 during the intervention period).
Mean CRE incidence in Cook County decreased from 59.0 cases per month during baseline to 40.6 cases per month during the intervention (P < .001). In a segmented regression model, there was an average reduction of 10.56 cases per month during the 24-month intervention period (P = .02) (Fig. 1), and an estimated 253 incident CRE cases were averted. Mean CRE incidence also decreased among the stratum of vSNF/LTACH intervention facilities (P = .03). However, evidence of ongoing CRE transmission, particularly in vSNFs, persisted, and CRE colonization prevalence remained high at intervention facilities (Table 1). Conclusions: A resource-intensive public health regional CRE intervention was implemented that included enhanced interfacility communication and targeted infection prevention. There was a significant decline in incident CRE clinical cases in Cook County, despite persistently high CRE colonization prevalence in intervention facilities. vSNFs, where understaffing and underresourcing were common and lengths of stay ranged from months to years, faced a major prevalence challenge, underscoring the need for aggressive infection control improvements in these facilities.
Funding: The Centers for Disease Control and Prevention (SHEPheRD Contract No. 200-2011-42037)
Disclosures: M.Y.L. has received research support in the form of contributed product from OpGen and Sage Products (now part of Stryker Corporation), and has received an investigator-initiated grant from CareFusion Foundation (now part of BD).
Hand hygiene (HH) is critical to preventing hospital-acquired infections. Running a successful HH program requires valid and accurate HH data to monitor the status and progress of HH improvement efforts. HH data are frequently subject to various forms of bias, which must be accounted for to enhance the validity of HH data. Objective: We assessed the extent to which observers may be prone to report more favorable HH rates when observing healthcare workers from the same professional group versus members of other job categories. Methods: We analyzed HH data from 48,543 electronically collected observations conducted by frontline healthcare workers in a 793-bed acute-care hospital from January 1, 2019, through July 31, 2019. All auditors received training on HH observations and proper use of the data collection application. Compliance data were sorted into peer versus nonpeer observations by profession. We compared HH compliance rates for members of each professional group when monitoring peers versus nonpeers. We further stratified results by ancillary professions (central transport, unit associates, food services, pharmacy, phlebotomy, rehabilitation services, and respiratory therapy) versus nonancillary professions (doctors, nurses, physician assistants, patient care assistants). Results: Of 12,488 ancillary observations, 7,184 (57.5%) were peer observations; of 36,055 nonancillary observations, 15,942 (44.2%) were peer observations. The percentage of peer versus nonpeer observations varied by profession, ranging from 96% of central transport observations and 91% of environmental services observations to 21% of patient care assistant observations and 34% of physician assistant observations. Average compliance rates for peer versus nonpeer observations in ancillary groups were 99% (95% CI, 98.7%–99.2%) versus 83% (95% CI, 82.5%–84.5%).
Average compliance rates for nonancillary groups were 92% (95% CI, 92.0%–92.8%) for peer versus 88% (95% CI, 87.8%–88.7%) for nonpeer observations (Table 1). Conclusions: We documented a propensity for some categories of healthcare workers to record discrepant rates of HH compliance when observing members of the same peer group versus others. This effect was more pronounced among ancillary than nonancillary services. This study adds to the literature on potential sources of bias in HH monitoring programs. Operational changes in HH program data collection may be warranted to mitigate these biases, such as increasing the frequency of validation exercises conducted by nonaffiliated observers, weighting peer versus nonpeer observations differently, or switching to automated electronic monitoring systems.
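The peer-observation proportions reported above can be reproduced from the raw counts (a quick arithmetic check, not part of the original analysis):

```python
# Sanity check of the peer-observation proportions reported in the abstract.
ancillary_total, ancillary_peer = 12_488, 7_184
nonancillary_total, nonancillary_peer = 36_055, 15_942

# Percentage of observations that were peer-to-peer, by stratum.
pct_ancillary = round(100 * ancillary_peer / ancillary_total, 1)           # 57.5
pct_nonancillary = round(100 * nonancillary_peer / nonancillary_total, 1)  # 44.2
print(pct_ancillary, pct_nonancillary)
```

Both proportions match the figures stated in the abstract, and the two strata sum to the 48,543 total observations analyzed.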
Background: Previous work suggests an intermingling of community and hospital transmission networks driving the MRSA epidemic, but how those with CO-HA infections fit into the network remains unclear. We integrated epidemiologic data and whole-genome sequencing (WGS) from existing MRSA clinical isolates to determine whether there were distinguishable features of CO-HA MRSA infections that could guide interventions. Methods: We examined 955 existing clinical MRSA isolates from 2011 to 2013 from patients at Cook County Health, the major public healthcare network in Chicago, Illinois. We performed electronic and manual chart review to ascertain community exposures (eg, illicit drug use, incarceration history), healthcare exposures, and comorbidities. WGS was performed on all isolates, and sequences were typed with multilocus sequence typing (MLST). We assessed the distribution of epidemiological factors and sequence type (ST) across onset type. Results: Infections were more frequent in males (70%); 61% of individuals with infection were African American and 21% were Hispanic. Overall, wound infections were the most common (81%), followed by blood (7%) and respiratory (6%) infections. In total, 82% of infections were ST8 (most USA300), 8% were ST5 (USA100), and 10% were other STs (Fig. 1a). Using standard epidemiologic definitions, we identified 523 CO, 295 CO-HA, and 137 HO infections. USA300 infections were common across CO, CO-HA, and HO categories, whereas USA100 was more frequently observed among CO-HA and HO. Current illicit drug use and history of incarceration—factors typically associated with CO-MRSA—were observed among both CO-HA and HO infections. Also, 38% of CO-HA and 36% of HO cases had a history of MRSA infection or nasal colonization in the prior 6 months. As expected, 73% of CO-HA cases had a history of recent hospitalization, but this was also true for 44% of HO cases; points for intervention for both groups, especially CO-HA patients, include outpatient, inpatient, and ER care.
Diabetes was common across categories, and HIV was more commonly observed among CO-HA cases (Fig. 1b). Conclusions: We characterized the genomic and epidemiologic features of CO-HA MRSA infections relative to CO and HO. By MLST and epidemiological analysis, CO-HA infections share similarities to both CO and HO. Although USA300 infections were the most common strain type, our findings highlight the need for WGS to discern relationships between individuals to understand the intermixing of healthcare and community networks for CO-HA infections. Higher resolution genomic analysis may help guide whether interventions need to be at hospital discharge or in the community to have the most impact on decreasing CO-HA MRSA infections.
Funding: CDC Broad Agency Announcement: Genomic Epidemiology of Community-Onset Invasive USA300 MRSA Infections (contract no. 75D30118C02923)
Background: During 2017–2019 in the Chicago region, several ventilator-capable skilled nursing facilities (vSNFs) participated in a quality improvement project to control the spread of highly prevalent carbapenem-resistant Enterobacteriaceae (CRE). With guidance from regional project coordinators and public health departments that involved education, assistance with implementation, and adherence monitoring, the facilities implemented a CRE prevention bundle that included a hand hygiene campaign that promoted alcohol-based hand rub, contact precautions (personal protective equipment with glove/gown) for care of CRE-colonized residents, and 2% chlorhexidine gluconate (CHG) wipes for routine resident bathing. We conducted a qualitative study to better understand the ways that vSNF employees engage with the implementation of such infection control measures. Methods: A PhD-candidate medical anthropologist conducted semistructured interviews with management (N = 5), nursing staff (N = 6), and certified nursing assistants (N = 6) at a vSNF in the Chicago region (Illinois) between September 2018 and November 2018. More than 11 hours of semistructured interviews were collected and transcribed. Data collection and analysis focused on identifying healthcare worker experiences during an infection control intervention. Transcriptions of the data were analyzed using thematic coding aided by MAXQDA qualitative analysis software. Results: Healthcare workers described the facility using language associated with a family environment (Table 1). Furthermore, healthcare workers demonstrated motivation to implement infection control policies (Table 2). However, healthcare workers expressed cultural and structural challenges encountered during implementation, such as their belief that some infection control measures discouraged maintenance of a home-like environment, lack of time, and understaffing. 
Some healthcare workers perceived that alcohol-based hand rub lost effectiveness over time and left unpleasant textures on the skin. Additionally, some workers did not trust the gowns and gloves available to prevent transmission. Lastly, healthcare workers generally did not prefer 2% CHG wipes over soap and water, citing residual smell on residents after bathing as one indicator of CHG ineffectiveness. Conclusions: In a vSNF, we found both considerable support for and challenges to implementing a CRE prevention bundle from the healthcare worker perspective. Healthcare workers were dedicated to recreating a home-like environment for their residents, which sometimes felt at odds with infection control interventions. Residual misconceptions (eg, that alcohol-based hand rub is not effective) and negative worker perceptions (eg, of the permeability of contact precaution gowns or residue from alcohol-based hand rub) suggest that ongoing education and healthcare worker participation in evaluating infection control products for interventions are critical.
Background: Delayed or in vitro inactive empiric antibiotic therapy may be detrimental to survival in patients with bloodstream infections (BSIs). Understanding the landscape of delayed or discordant empiric antibiotic therapy (DDEAT) across different patient, pathogen, and hospital types, as well as by their baseline resistance milieu, may enable providers, antimicrobial stewardship programs, and policy makers to optimize empiric prescribing. Methods: Inpatients with clinically suspected serious infection (based on sampling of blood cultures and receiving systemic antibiotic therapy on the same or next day) found to have BSI were identified in the Cerner Healthfacts EHR database. Patients were considered to have received DDEAT when, on culture sampling day, they received either no antibiotic(s) or none that displayed in vitro activity against the pathogenic bloodstream isolate. Antibiotic-resistant phenotypes were defined by in vitro resistance to taxon-specific prototype antibiotics (eg, methicillin/oxacillin resistance in S. aureus) and were used to estimate baseline resistance prevalence encountered by the hospital. The probability of DDEAT was examined by bacterial taxon, by time of BSI onset, and by presence versus absence of antibiotic-resistance phenotypes, sepsis or septic shock, hospital type, and baseline resistance. Results: Of 26,036 assessable patients with a BSI at 131 US hospitals between 2005 and 2014, 14,658 (56%) had sepsis, 3,623 (14%) had septic shock, 5,084 (20%) had antibiotic-resistant phenotypes, and 8,593 (33%) received DDEAT. Also, 4,428 (52%) recipients of DDEAT received no antibiotics on culture sampling day, whereas the remaining 4,165 (48%) received in vitro discordant therapy. DDEAT occurred most often in S. maltophilia (87%) and E. faecium (80%) BSIs; however, 75% of DDEAT cases and 76% of deaths among recipients of DDEAT collectively occurred among patients with S. aureus and Enterobacteriales BSIs. 
For every 8 bacteremic patients presenting with septic shock, 1 patient did not receive any antibiotics on culture day (Fig. 1A). Patients with BSIs of hospital (vs community) onset were twice as likely to receive no antibiotics on culture day, whereas those with bloodstream pathogens displaying antibiotic-resistant (vs susceptible) phenotypes were 3 times as likely to receive in vitro discordant therapy (Fig. 1B). The median proportion of DDEAT ranged between 25% (14%–37%) in eight <300-bed teaching hospitals in the lowest baseline resistance quartile and 40% (31%–50%) in five ≥300-bed teaching hospitals in the third baseline resistance quartile (Fig. 2). Conclusions: Delayed or in vitro discordant empiric antibiotic therapy is common among patients with BSI in US hospitals, regardless of hospital size, teaching status, or local resistance patterns. Prompt empiric antibiotic therapy in septic shock and hospital-onset BSI needs more support. Earlier, reliable detection of S. aureus and Enterobacteriales bloodstream pathogens and their resistance patterns with rapid point-of-care diagnostics may mitigate the population-level impact of DDEAT in BSI.
Funding: This study was funded in part by the National Institutes of Health Clinical Center, National Institutes of Allergy and Infectious Diseases, National Cancer Institute (NCI contract no. HHSN261200800001E) and the Agency for Healthcare Research and Quality.
Background: Long-term acute-care hospitals (LTACHs) are disproportionately burdened by multidrug-resistant organisms (MDROs) like KPC-Kp. Although cohorting KPC-Kp+ patients into rooms with other carriers can be an outbreak-control strategy and may protect negative patients from colonization, it is unclear whether cohorted patients are at unintended increased risk of cross colonization with additional KPC-Kp strains. Methods: Cohorting KPC-Kp+ patients at admission into rooms with other positive patients was part of a bundled intervention that reduced transmission in a high-prevalence LTACH. Rectal surveillance culturing for KPC-Kp was performed at the start of the study, upon admission, and biweekly thereafter, capturing 94% of patients. We evaluated whole-genome sequencing (WGS) evidence of acquisition of distinct KPC-Kp strains in a convenience sample of patients positive for KPC-Kp at study start or admission to identify plausible secondary KPC-Kp acquisitions. Results: WGS multilocus sequence type (MLST) strain variability was observed among the 452 isolates from the 254 patients colonized by KPC-Kp (Fig. 1). Among the 32 patients who were positive at the beginning of the study or at admission and had a secondary isolate collected at a later date (median, 89 days apart; range, 2–310 days), 17 (53%) had secondary isolates differing by MLST from their admission isolate. Although 60% of the KPC-Kp isolates in the study were ST258, there was substantial genomic variation within ST258 isolates from the same patient (range, 0–102 genetic variants), suggesting multiple acquisitions of distinct ST258 isolates. Among the 17 patients who imported ST258 and had ST258 isolated again later, 11 (65%) carried secondary isolates genetically closer to isolates from other importing patients than to their own ST258 (Fig. 2).
Examination of spatiotemporal exposures among patients with evidence of multiple acquisitions revealed that 11 (65%) patients with multiple MLSTs shared a room with a patient who was colonized with an isolate matching the secondary MLST, and 6 (35%) patients who carried multiple distinct ST258 isolates shared a room with a patient who imported these closely related isolates prior to secondary acquisition. Conclusions: Half of patients who imported KPC-Kp and had multiple isolates available had genomically supported secondary acquisitions linked to roommates who carried the acquired strains. Although cohorting is intended to protect negative patients from acquiring MDROs, this practice may promote multiple strain acquisitions by colonized patients in the cohort, potentially prolonging the period of MDRO carriage and increasing time at risk of infection. Our findings add to the debate about single-patient rooms, which may be preferred to cohorts to minimize potential harms by reducing MDRO transmission.
Many studies document cognitive decline following specific types of acute illness hospitalization (AIH), such as surgery, critical care, or hospitalizations complicated by delirium. However, cognitive decline may be a complication following all types of AIH. This systematic review summarizes longitudinal observational studies documenting cognitive changes following AIH in general admitted populations and uses meta-analysis (MA) to assess the quantitative effect of AIH on post-hospitalization cognitive decline (PHCD).
We followed Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Selection criteria were defined to identify studies of older adults exposed to AIH that included cognitive measures. In total, 6,566 titles were screened; 46 reports were reviewed qualitatively, of which seven contributed data to the MA. Risk of bias was assessed using the Newcastle–Ottawa Scale.
The qualitative review suggested increased cognitive decline following AIH, but several reports were particularly vulnerable to bias. Domain-specific outcomes following AIH included declines in memory and processing speed. Increasing age and severity of illness were the most consistent risk factors for PHCD. PHCD was supported by MA of seven eligible studies with 41,453 participants (Cohen’s d = −0.25; 95% CI, −0.49 to −0.02; I² = 35%).
There is preliminary evidence that AIH exposure accelerates or triggers cognitive decline in older patients. PHCD reported in specific contexts could represent subsets of a larger phenomenon caused by overlapping mechanisms. Future research must clarify the trajectory, clinical significance, and etiology of PHCD: a priority in the face of an aging population with increasing rates of both cognitive impairment and hospitalization.
Historically, common law countries took a restrictive approach to transactions involving trademarks. This restrictive approach was said to flow from the reasons for granting protection for trademarks in the first place. If a trademark communicates information to consumers as to the origin and quality of a particular trader’s goods or services, it was thought that any dealing with a trademark, such as an assignment or the grant of a licence to a third party, would disrupt the source and quality guarantee functions of the mark and potentially cause confusion among consumers. In other words, the very reasons that a trademark receives legal protection were thought to justify constraining an owner’s ability to deal with the mark (in contrast with other personal property, such as an unencumbered chattel or a patent). Initially, these sorts of concerns were highly influential, and the law either proscribed or imposed strict limitations on the exploitation of trademarks. However, over the course of the last century there was a gradual liberalisation of these rules. Consequently, in most common law countries, we have now reached a position where the law recognises registered trademarks to be personal property, which can be exploited with fewer restrictions than in the past. This liberalisation has to a large extent reflected changes in business practices, as brands have come to be recognised as valuable commodities in their own right and as trademark licensing, merchandising and franchising have become large and lucrative industries. Notwithstanding this, the tension between the idea of the mark as “property” and the mark as a badge of origin remains. This tension is reflected in the fact that the law retains restrictions on trademark transactions in cases where marks have been or might be used in such a way as to deceive consumers. 
Working out when a badge of origin can be transferred to an unrelated third party whilst not falling into the category of a “deceptive transaction” remains difficult.
Tobacco smoking remains one of the leading causes of preventable illness and death and is heritable with complex underpinnings. Converging evidence suggests a contribution of the polygenic risk for smoking to the use of tobacco and other substances. Yet, the brain mechanisms underlying the relationship between this genetic risk and tobacco smoking remain poorly understood.
Genomic, neuroimaging, and self-report data were acquired from a large cohort of adolescents from the IMAGEN study (a European multicenter study). Polygenic risk scores (PGRS) for smoking were calculated based on a genome-wide association study meta-analysis conducted by the Tobacco and Genetics Consortium. We examined the interrelationships among the genetic risk for smoking initiation, brain structure, and the number of occasions of tobacco use.
A higher smoking PGRS was significantly associated with both an increased number of occasions of tobacco use and smaller cortical volume of the right orbitofrontal cortex (OFC). Furthermore, reduced cortical volume within this cluster correlated with greater tobacco use. A subsequent path analysis suggested that the cortical volume within this cluster partially mediated the association between the genetic risk for smoking and the number of occasions of tobacco use.
Our data provide the first evidence for the involvement of the OFC in the relationship between the smoking PGRS and tobacco use. Future studies of the molecular mechanisms underlying tobacco smoking should consider the mediating effect of related neural structures.
Cyclonic storms (often called hurricanes, typhoons, or cyclones) often cause population declines in vulnerable bird species, and the intensity of these storms appears to be increasing due to climate change. Prior studies have reported short-term impacts of hurricanes on avifauna, but few have examined long-term impacts. Over two decades (1993–2018), we periodically surveyed a subspecies of West Indian Woodpecker Melanerpes superciliaris nyeanus on San Salvador, a small island in The Bahamas, to determine its distribution on the island, habitat use, and effects of hurricanes on abundance and population size. We conducted passive and playback surveys, supplemented with mist-netting. Woodpeckers were found only in the northern part of San Salvador, despite extensive surveys throughout other accessible areas of the island. Birds occupied areas with taller coppice adjacent to sabal palm Sabal palmetto groves, which were used for nesting. After hurricanes with >160 kph winds passed over San Salvador, woodpecker densities declined to 35–40% of pre-hurricane densities, but generally recovered to pre-hurricane levels within 2–3 years. Based on an estimated density of woodpeckers within a ~1,400 ha occupied area, we calculated a population size of approximately 240 individuals (CI = 68–408). However, the population declined to far lower numbers immediately following hurricanes. Under IUCN Red List criteria, M. s. nyeanus qualifies as ‘Critically Endangered’, and could be especially sensitive to future hurricanes if they occur at a high enough frequency or intensity to prevent the population from rebounding. Given the small size, isolation, and vulnerability of this population, we recommend preservation of the core habitat, continued monitoring, and further research.
Our study shows that small, threatened bird populations can be resilient to the effects of hurricanes, but increased intensity of hurricanes, in combination with other threats, may limit this resilience in the future.
Studies suggest that around 25% of the European population receives treatment for a chronic condition. As the population ages, the prevalence of chronic diseases increases, with an average of two chronic conditions per person by their mid-60s and three for those surviving into their mid-70s (Barnett et al., 2012). People with chronic diseases now account for a sizeable proportion of all hospital admissions, both elective and emergency. Once admitted to hospital, people with multiple complex conditions may require long lengths of stay and place significant demand on acute hospital services.
Whether unintentional or by design, built, social, and perceived environments influence the human experience. Behavior is not solely the product of a rational motivated actor, operating independently from his or her environment; rather, it is also a function of edifices, neighborhoods, and public spaces, as well as their inhabitants, community norms, and the social capital they generate. Likewise, addictive behaviors have as much to do with the environmental contexts surrounding individuals as with their unique biological factors, specific brain mechanisms, and psychogenic causes. Any attempt to address addiction at either the individual or population level would benefit from careful consideration of the social and contextual influences on cognitions, opportunities, motivations, and behaviors. Interventions informed by this understanding are more likely to be efficacious than those solely targeted toward individual biology, motivations, or attitudes. In this chapter, we discuss the relationship between physical and social environments (PSE), health, and human behavior. We then focus on the influential role of the PSE on the consumption of alcohol, tobacco, and other substances; food, eating behaviors, and addictions contributing to the current obesity epidemic; and a selection of other behavioral addictions. The chapter closes by discussing methodological considerations and implications for professional practice.