The literature on the nineteenth-century Newcastle city region is a narrative of industrial progress premised upon technological prowess. But there is another story to be told about the transformation of a relatively small northern town into a conurbation with the attributes of a modern city. This second process of ‘rounding out’ the city with social, cultural and political institutions to accompany its economic success is relatively under-reported. In this study, we follow 1,621 individuals and compare their record of being mentioned in the literature to their participation in 343 local institutions. The focus is directed towards those who are much more visible in the literature than in institutional membership – ‘narrative heroes’ – and those with the reverse pattern, found much more in institutions than in the literature – ‘civic builders’. The two sets of individuals are discussed and reasons for their contrasting positions are suggested.
Multiple herbicide-resistant populations of horseweed [Conyza canadensis (L.) Cronquist] continue to spread rapidly throughout Ontario, notably in areas where no-till soybean [Glycine max (L.) Merr.] is grown. The occurrence of multiple herbicide resistance within these populations suggests that the future role of herbicide tank mixtures as a means of control will be limited. An integrated weed management strategy utilizing complementary selection pressures is needed to reduce the selection intensity of relying solely on herbicides for control. Field studies were conducted in 2018 and 2019 to test the hypothesis: if fall-seeded cereal rye (Secale cereale L.) can reduce C. canadensis seedling density and suppress seedling growth, then the interaction(s) of complementary selection pressures of tillage, cereal rye, and herbicides would improve the level of C. canadensis control. Laboratory studies were conducted to determine whether the allelopathic compound 2-benzoxazolinone (BOA) affected the root development of C. canadensis seedlings. The interactions observed among multiple selection pressures of tillage, cereal rye, and herbicides were inconsistent between the 2 yr of study. A monoculture of cereal rye seeded in the fall, however, did reduce seedling height and biomass of C. canadensis consistently, but not density. This reduction in seedling height and biomass was likely caused by the allelopathic compound BOA, which reduced seedling root development. Control of C. canadensis seedlings in the spring required the higher registered rates of dicamba or saflufenacil. The addition of shallow fall tillage and the presence of cereal rye did not reduce the variability in control observed, most notably with 2,4-D or the lower rates of saflufenacil or dicamba. With the implementation of complementary weed management strategies, environmental variables in any given year will likely have a direct influence on whether these interactions are additive or synergistic.
Year-round monitoring of Erebus volcano (Ross Island) has proved challenging due to the difficulties of maintaining continuous power for scientific instruments, especially through the Antarctic winter. We sought a potential solution involving the harvesting of thermal energy dissipated close to the summit crater of the volcano in a zone of diffuse hot gas emissions. We designed, constructed and tested a power generator based on the Seebeck effect, converting thermal energy to electrical power, which could, in principle, be used to run monitoring devices year round. We report here on the design of the generator and the results of an 11-day trial deployment on Erebus volcano in December 2014. The generator produced a mean output power of 270 mW, although we identified some technical issues that had impaired its efficiency. Nevertheless, this is already sufficient power for some monitoring equipment and, with design improvements, such a generator could provide a viable solution to powering a larger suite of instrumentation.
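As a rough back-of-the-envelope illustration (the parameter values below are assumptions, not figures reported in the study), the maximum power a thermoelectric generator can deliver to a matched load follows from the Seebeck relation:

\[
P_{\max} \approx \frac{(S\,\Delta T)^{2}}{4R_{\mathrm{int}}}
\]

where \(S\) is the effective Seebeck coefficient of the module stack, \(\Delta T\) the temperature difference maintained across it, and \(R_{\mathrm{int}}\) its internal resistance. With illustrative values of \(S \approx 0.05\ \mathrm{V/K}\), \(\Delta T \approx 20\ \mathrm{K}\) and \(R_{\mathrm{int}} \approx 1\ \Omega\), this gives roughly 250 mW, the same order of magnitude as the 270 mW mean output reported above.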
Viruses are more common than bacteria in patients hospitalized with community-acquired pneumonia. Little is known, however, about the frequency of respiratory viral testing and its associations with antimicrobial utilization.
Design:
Retrospective cohort study.
Setting:
The study included 179 US hospitals.
Patients:
Adults admitted with pneumonia between July 2010 and June 2015.
Methods:
We assessed the frequency of respiratory virus testing and compared antimicrobial utilization, mortality, length of stay, and costs between tested versus untested patients, and between virus-positive versus virus-negative patients.
Results:
Among 166,273 patients with pneumonia on admission, 40,787 (24.5%) were tested for respiratory viruses; of those tested, 94.8% were tested for influenza and 20.7% for other viruses. Viral assays were positive in 5,133 of 40,787 tested patients (12.6%), typically for influenza and rhinovirus. Tested patients were younger and had fewer comorbidities than untested patients, but patients with positive viral assays were older and had more comorbidities than those with negative assays. Blood cultures were positive for bacterial pathogens in 2.7% of patients with positive viral assays versus 5.3% of patients with negative viral tests (P < .001). Antibacterial courses were shorter for virus-positive versus -negative patients overall (mean 5.5 vs 6.4 days; P < .001) but varied by bacterial testing: 8.1 versus 8.0 days (P = .60) if bacterial tests were positive; 5.3 versus 6.1 days (P < .001) if bacterial tests were negative; and 3.3 versus 5.2 days (P < .001) if bacterial tests were not obtained (interaction P < .001).
Conclusions:
A minority of patients hospitalized with pneumonia were tested for respiratory viruses; only a fraction of potential viral pathogens were assayed; and patients with positive viral tests often received long antibacterial courses.
In Australia, the gap between Indigenous and non-Indigenous mental health and well-being is a major human rights issue, and escalating suicide rates represent a national emergency. This chapter describes the Australian human rights context and developments within the discipline and profession of psychology to address these inequities, with the reconciliation action plan developed by the Australian Psychological Society (APS) as one commitment to change. The focus on respectful relationships, cultural safety, and promoting self-determination is part of the background leading to the APS apology to Aboriginal and Torres Strait Islander people. The apology highlighted the importance of a commitment by all psychologists to reconciliation and to modifying their attitudes and work practices to ensure a culturally appropriate, responsive, and safe workforce. The Australian Indigenous Psychology Education Project (AIPEP) represents a focus on the education and employment of the psychology workforce and illustrates collaboration with key stakeholders in psychology education to provide frameworks and guidelines for embedding cultural awareness, responsiveness, and competence throughout all psychology education.
Observations of teleseismic earthquakes using broadband seismometers on the Ross Ice Shelf (RIS) must contend with environmental and structural processes that do not exist for land-sited seismometers. Important considerations are: (1) a broadband, multi-mode ambient wavefield excited by ocean gravity wave interactions with the ice shelf; (2) body wave reverberations produced by seismic impedance contrasts at the ice/water and water/seafloor interfaces and (3) decoupling of the solid Earth horizontal wavefield by the sub-shelf water column. We analyze seasonal and geographic variations in signal-to-noise ratios for teleseismic P-wave (0.5–2.0 s), S-wave (10–15 s) and surface wave (13–25 s) arrivals relative to the RIS noise field. We use ice and water layer reverberations generated by teleseismic P-waves to accurately estimate the sub-station thicknesses of these layers. We present observations consistent with the theoretically predicted transition of the water column from compressible to incompressible mechanics, relevant for vertically incident solid Earth waves with periods longer than 3 s. Finally, we observe symmetric-mode Lamb waves generated by teleseismic S-waves incident on the grounding zones. Despite their complexity, we conclude that teleseismic coda can be utilized for passive imaging of sub-shelf Earth structure, although longer deployments relative to conventional land-sited seismometers will be necessary to acquire adequate data.
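As a hedged sketch of the thickness estimate mentioned above: for near-vertical incidence, the delay of a reverberation multiple within a layer is its two-way travel time, so the layer thickness follows directly from an assumed wave speed (the numbers below are illustrative, not station values from the study):

\[
h \approx \frac{v\,t_{\mathrm{rev}}}{2}
\]

where \(t_{\mathrm{rev}}\) is the delay between the direct P arrival and its reverberation and \(v\) the P-wave speed in the layer. For ice with \(v \approx 3.8\ \mathrm{km\,s^{-1}}\), a 0.2 s reverberation delay would imply roughly 380 m of ice; the water-layer thickness follows analogously using the sound speed in sea water (\(\approx 1.45\ \mathrm{km\,s^{-1}}\)).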
Fifty to ninety percent of individuals with Major Neurocognitive Disorder (MNCD) have Neuropsychiatric Symptoms (NPS) [1]. Agitation and aggression are amongst the most persistent and treatment-refractory symptom clusters. These NPS are associated with increased risk of institutionalization, psychotropic medication use, caregiver burden, and mortality [2].
Safe and effective treatments for NPS are lacking. Consensus guidelines emphasize the initial use of non-pharmacologic approaches, though supportive evidence is limited [3].
Extensive research has established the safety and efficacy of ECT in elderly patients with depression and other psychiatric conditions [6]. Clinical experience suggests that ECT is a valuable treatment option in cases of treatment-refractory NPS related to MNCD [7-10]. However, data supporting the efficacy and safety of this practice are scant.
Materials and Methods:
Patients admitted to the geriatric psychiatry inpatient units who met the inclusion criteria were recruited from 2 Vancouver sites and 3 units at Ontario Shores. These patients had an anesthesia consultation to evaluate the safety of their undergoing ECT. Consent was obtained from their substitute decision makers. All patients enrolled were already on psychotropic medications.
Background: Contaminated surfaces within patient rooms and on shared equipment are a major driver of healthcare-acquired infections (HAIs). The emergence in the New York City metropolitan area of Candida auris, a multidrug-resistant fungus with extended environmental viability, has made a standardized assessment of cleaning protocols even more urgent for our multihospital academic health system. We therefore sought to create an environmental surveillance protocol to detect C. auris and to assess patient room contamination after discharge cleaning by different chemicals and methods, including touch-free application using an electrostatic sprayer. Surfaces disinfected using touch-free methods may not appear disinfected when assessed by fluorescent tracer dye or ATP bioluminescent assay. Methods: We focused on surfaces within the patient zone that are touched by the patient or healthcare personnel prior to contact with the patient. Our protocol sampled the over-bed table, call button, oxygen meter, privacy curtain, and bed frame using nylon-flocked swabs dipped in nonbacteriostatic sterile saline. We swabbed a 36-cm² surface area at each sample location shortly after the room was disinfected, immediately inoculated the swab on a blood agar 5% TSA plate, and then incubated the plate for 24 hours at 36°C. Contamination with common environmental bacteria was calculated as CFU per plate over the swabbed surface area, and a cutoff of 2.5 CFU/cm² was used to determine whether a surface passed inspection. Limited data exist on acceptable microbial limits for healthcare settings, but the aforementioned cutoff has been used in food preparation. Results: Over a year-long period, terminal cleaning had an overall failure rate of 6.5% for 413 surfaces swabbed. We used the protocol to compare the normal application of either peracetic acid/hydrogen peroxide or bleach using microfiber cloths to a new method using sodium dichloroisocyanurate (NaDCC) applied with microfiber cloths and electrostatic sprayers. The normal protocol had a failure rate of 9%, and NaDCC had a failure rate of 2.5%. The oxygen meter had the highest normal-method failure rate (18.2%), whereas the curtain had the highest NaDCC-method failure rate (11%). In addition, we swabbed 7 rooms previously occupied by C. auris–colonized patients for C. auris contamination of environmental surfaces, including the mobile medical equipment of the 4 patient care units that contained these rooms. We did not find any C. auris, and we continue data collection. Conclusions: A systematic environmental surveillance system is critical for healthcare systems to assess touch-free disinfection and identify MDRO contamination of surfaces.
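The pass/fail rule described in the methods reduces to a simple plate-count calculation. A minimal sketch follows, assuming the 36-cm² swab area and 2.5 CFU/cm² cutoff stated above; the surface names and counts are hypothetical examples, not study data.

```python
# Minimal sketch of the surface pass/fail rule described above.
# The swab area and cutoff come from the abstract; the plate counts are hypothetical.
SWABBED_AREA_CM2 = 36.0       # surface area swabbed at each sample location (cm^2)
CUTOFF_CFU_PER_CM2 = 2.5      # threshold borrowed from food-preparation guidance

def surface_passes(cfu_per_plate: float, area_cm2: float = SWABBED_AREA_CM2) -> bool:
    """Return True if contamination (CFU/cm^2) is at or below the cutoff."""
    return (cfu_per_plate / area_cm2) <= CUTOFF_CFU_PER_CM2

# Hypothetical plate counts for a few sample locations in one room
plate_counts = {"over-bed table": 12, "call button": 150, "bed frame": 40}
for surface, cfu in plate_counts.items():
    density = cfu / SWABBED_AREA_CM2
    verdict = "pass" if surface_passes(cfu) else "fail"
    print(f"{surface}: {density:.2f} CFU/cm^2 -> {verdict}")
```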
Background: Approximately two-thirds of children aged <5 years receive out-of-home child care. Childcare attendees have an increased risk of infections compared to children not in childcare settings, possibly due to their close contact in a shared environment. As multidrug-resistant organisms (MDROs) increasingly move from healthcare-associated to community settings, childcare can provide a venue for further transmission of these pathogens. Our objective was to evaluate the bioburden of pathogens present on fomites in childcare centers and how surface contamination changes over time. Methods: The study was conducted in the single-room play area of an Ypsilanti, Michigan, childcare center caring for children aged 3–5 years. Polyester swabs were used to collect surface samples from 16 locations in the room, including (1) laminate, wood and plastic tabletops and furniture; (2) a stainless steel sink and adjacent plastic trash bin; and (3) wood, metal and plastic toys. A water sample was also collected at a 17th site. Samples were collected twice weekly for 5 of 6 weeks, followed by 1 additional collection (September–October 2019). Tryptic soy agar was used for standard plate counts, and selective media were used to identify methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), and extended-spectrum β-lactamase (ESBL)–producing Enterobacteriaceae. Single-plex RT-PCR was used to detect norovirus and adenovirus. Results: Among 175 samples collected on 11 days, MRSA and ESBL-producing Enterobacteriaceae were detected in 10.3% (18 of 175) and 8.0% (14 of 175) of environmental specimens, respectively. No specimens were positive for VRE or norovirus. Adenovirus was detected in 20 of 175 specimens (11.4%). Median bioburden by site ranged from 85 CFU/mL to 2,510 CFU/mL. The highest median bioburden was observed at the sink (2,510 CFU/mL), followed by the plastic building block table (1,620 CFU/mL), the small wood blocks (1,565 CFU/mL), and the water from a water play area and an adjacent tabletop (1,260 and 1,100 CFU/mL, respectively). The highest single-day bioburden was 273,000 CFU/mL at the sink. Conclusion: The presence of MDROs on childcare center fomites raises concern for exposure to these pathogens among vulnerable populations. More study is needed to determine the degree to which these contaminated fomites drive transmission between children. We found the highest bioburdens at sites where children played or washed with water, identifying potential targets for more frequent cleaning.
Funding: None
Disclosures: Emily T. Martin reports consulting fees from Pfizer.
Cognitive behavior therapy (CBT) is effective for most patients with social anxiety disorder (SAD), but a substantial proportion fail to remit. Experimental and clinical research suggests that enhancing CBT using imagery-based techniques could improve outcomes. It was hypothesized that imagery-enhanced CBT (IE-CBT) would be superior to verbally-based CBT (VB-CBT) on pre-registered outcomes.
Methods
A randomized controlled trial of IE-CBT v. VB-CBT for social anxiety was completed in a community mental health clinic setting. Participants were randomized to IE (n = 53) or VB (n = 54) CBT, with 1-month (primary end point) and 6-month follow-up assessments. Participants completed 12 weekly 2-hour sessions of IE-CBT or VB-CBT plus a 1-month follow-up session.
Results
Intention-to-treat analyses showed very large within-treatment effect sizes on social interaction anxiety at all time points (ds = 2.09–2.62), with no between-treatment differences on this outcome or on clinician-rated severity [1-month OR = 1.45 (0.45, 4.62), p = 0.53; 6-month OR = 1.31 (0.42, 4.08), p = 0.65], SAD remission (1-month: IE = 61.04%, VB = 55.09%, p = 0.59; 6-month: IE = 58.73%, VB = 61.89%, p = 0.77), or secondary outcomes. Three adverse events were noted (substance abuse, n = 1 in IE-CBT; temporary increase in suicide risk, n = 1 in each condition, with one participant withdrawn at 1-month follow-up).
Conclusions
Group IE-CBT and VB-CBT were safe and there were no significant differences in outcomes. Both treatments were associated with very large within-group effect sizes and the majority of patients remitted following treatment.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
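To make the calibration step concrete: a measured 14C age is mapped onto the calendar scale through the curve rather than read directly. The sketch below does this with simple linear interpolation against a made-up curve fragment; real work would use the published IntCal20 data and a full probabilistic treatment (e.g. in OxCal or the rcarbon package), and the numbers here are placeholders, not IntCal20 values.

```python
# Illustrative only: real calibration convolves the measurement and curve
# uncertainties to produce a calendar-age probability distribution.
import numpy as np

# Hypothetical excerpt of a calibration curve: calendar age (cal BP) and the
# atmospheric 14C age (BP) a sample of that calendar age would yield.
cal_bp = np.array([3000.0, 3100.0, 3200.0, 3300.0, 3400.0])
c14_bp = np.array([2850.0, 2930.0, 3010.0, 3105.0, 3200.0])

def calibrate(measured_c14_bp: float) -> float:
    """Map a measured 14C age onto the calendar scale by linear interpolation."""
    # np.interp expects increasing x-values, which holds for this fragment.
    return float(np.interp(measured_c14_bp, c14_bp, cal_bp))

print(calibrate(3000.0))  # about 3188 cal BP on this hypothetical curve fragment
```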
We undertook a strengths, weaknesses, opportunities, and threats (SWOT) analysis of Northern Hemisphere tree-ring datasets included in IntCal20 in order to evaluate their strategic fit with the demands of archaeological users. Case studies on wiggle-matching single tree rings from timbers in historic buildings and Bayesian modeling of series of results on archaeological samples from Neolithic long barrows in central-southern England exemplify the archaeological implications that arise when using IntCal20. The SWOT analysis provides an opportunity to think strategically about future radiocarbon (14C) calibration so as to maximize the utility of 14C dating in archaeology and safeguard its reputation in the discipline.
We analyse the distribution of vowel laxness and stress alternations in Slovenian nouns (for example in the nominative and genitive forms of the masculine noun [ˈjɛzik ~ jeˈzika] ‘tongue’), showing that stress shifts away from mid lax vowels in initial syllables. A stress shift of this sort is predicted by positional faithfulness (Beckman 1997). We show that this prediction is correct, contra McCarthy (2007, 2010) and Jesney (2011). The productivity of the pattern is confirmed in a large-scale nonce-word task. Stress shift in Slovenian is a result of the markedness of mid lax vowels and, perhaps counterintuitively, faithfulness to laxness in initial stressed position.
We evaluated the safety and feasibility of high-intensity interval training via a novel telemedicine ergometer (MedBIKE™) in children with Fontan physiology.
Methods:
The MedBIKE™ is a custom telemedicine ergometer, incorporating a video game platform and a live feed of patient video/audio, electrocardiography, pulse oximetry, and power output, for remote medical supervision and modulation of work. There were three study phases: (I) an exercise workload comparison between the MedBIKE™ and a standard cardiopulmonary exercise ergometer in 10 healthy adults; (II) an in-hospital assessment of the safety, feasibility, and user experience (via questionnaire) of a MedBIKE™ high-intensity interval training protocol in children with Fontan physiology; and (III) an 8-week home-based high-intensity interval training programme in two participants with Fontan physiology.
Results:
There was good agreement in oxygen consumption during graded exercise at matched work rates between the cardiopulmonary exercise ergometer and MedBIKE™ (1.1 ± 0.5 L/minute versus 1.1 ± 0.5 L/minute, p = 0.44). Ten youth with Fontan physiology (11.5 ± 1.8 years old) completed a MedBIKE™ high-intensity interval training session with no adverse events. The participants found the MedBIKE™ to be enjoyable and easy to navigate. In two participants, the 8-week home-based protocol was tolerated well with completion of 23/24 (96%) and 24/24 (100%) of sessions, respectively, and no adverse events across the 47 sessions in total.
Conclusion:
The MedBIKE™ resulted in similar physiological responses as compared to a cardiopulmonary exercise test ergometer and the high-intensity interval training protocol was safe, feasible, and enjoyable in youth with Fontan physiology. A randomised-controlled trial of a home-based high-intensity interval training exercise intervention using the MedBIKE™ will next be undertaken.
Chemical weapons attacks during the recent conflict in Syria and Iraq highlight the need to better understand the changing epidemiology of chemical weapons use, especially among non-state actors. Public health professionals and policy-makers require this data to prioritize funding, training, chemical weapons preparedness, disaster response, and recovery. The purpose of this investigation is to provide descriptive data that can be used by policy-makers and public safety officials to better prepare for these potential attacks.
Methods:
A five-decade descriptive retrospective review of The Global Terrorism Database, maintained by the National Consortium for the Study of Terrorism and Responses to Terrorism, was conducted to understand trends in chemical agents, targets, and routes of exposure. We reviewed and analyzed data specific to these documented chemical attacks between 1970 and 2017.
Results:
A total of 383 terror attacks involved chemical weapons over the study period. A specific agent was named in 154 incidents, while 124 incidents could be classified into traditional chemical weapons categories (eg, vesicant, choking agents). A route of exposure was identified in 242 attacks, with the most common routes of exposure being dermal-mucosal and inhalational. Caustic agents were used in the highest proportion of attacks (25%) where the route of exposure was known. Explosive devices were used in 21% of attacks to deliver these chemical agents. Of particular note, private citizens and educational facilities were targeted in 25% and 12% of attacks, respectively. The average number of attacks increased from 6 per year between 1970 and 2011 to 24.9 per year between 2011 and 2017 (coinciding with the start of the Syria conflict). The most commonly utilized chemicals were chlorine (26.0%), tear gas (20.8%), and cyanide (15.6%). Blood agent incidents declined from 32.6% before the September 11, 2001 attacks to 13.6% after 2001, while nerve agent attacks fell from 9.3% to 1.2%. In contrast, choking (namely chlorine) and vesicant (mustard) agent use increased from 7% to 48.1% and from 2.3% to 6.2% of attacks, respectively.
Conclusions:
Chemical weapon use in global terrorism remains an increasingly common occurrence that requires better characterization. The average number of chemical terrorist attacks per year is increasing, with a large proportion resulting from the conflicts in Iraq and Syria. Choking (chlorine) and vesicant (mustard) agents have become the predominant chemical terror agent since 2001, with a decreased incidence of blood (cyanogenic) and nerve (sarin) agents. Future preparedness initiatives should focus on vulnerable targets such as private citizens and educational institutions. Improving blast injury response is essential, along with prioritizing disaster training focused on choking agents, vesicants, and caustics.
Systematic monitoring of exanthema is largely absent from public health surveillance despite emerging diseases and threats of bioterrorism. The Michigan Child Care Related Infections Surveillance Program (MCRISP) is the first online program in child care centers to report pediatric exanthema.
Methods:
MCRISP aggregated daily counts of children sick, absent, or reported ill by parents. We extracted all MCRISP exanthema cases from October 1, 2014 through June 30, 2019. Cases were assessed with descriptive statistics and counts were used to construct epidemic curves.
Results:
A total of 360 exanthema cases were reported from 12,233 illnesses over 4.5 seasons. Children ages 13-35 months had the highest rash occurrence (45%, n = 162), followed by 36-59 months (41.7%, n = 150), 0-12 months (12.5%, n = 45), and kindergarten (0.8%, n = 3). Centers reported rashes of hand-foot-mouth disease (50%, n = 180), nonspecific rash without fever (15.3%, n = 55), hives (8.1%, n = 29), fever with nonspecific rash (6.9%, n = 25), roseola (3.3%, n = 12), scabies (2.5%, n = 9), scarlet fever (2.5%, n = 9), impetigo (2.2%, n = 8), abscess (1.9%, n = 7), viral exanthema without fever (1.7%, n = 6), varicella (1.7%, n = 6), pinworms (0.8%, n = 3), molluscum (0.6%, n = 2), cellulitis (0.6%, n = 2), ringworm (0.6%, n = 2), and shingles (0.2%, n = 1).
Conclusion:
Child care surveillance networks have the potential to act as sentinel public health tools for surveillance of pediatric exanthema outbreaks.
There are vexing puzzles about one of the most comprehensive, far-reaching, deeply penetrating and punitive of TLOs: anti-money laundering (AML). Despite its seemingly successful institutionalization, the AML TLO exhibits many deficiencies, imposes extensive costs on the private and public sectors, and inflicts harms upon the public. Given these failings, what explains its persistence? Could it also be the case that the pervasiveness and penetration of the AML TLO indicates it may constitute a particular species of “disciplinary” TLOs? Drawing on an intensive study at a moment when the TLO’s governing norms and methodologies of implementation were undergoing revision and expansion, as well as on observation and participation in AML/CFT activities over three decades, the chapter brings rich empirical evidence to address these questions: first, by briefly sketching the thirty-year development and workings of the AML TLO; second, by considering its benefits, costs, deficiencies and harms; third, by appraising explanations for its persistence, including the facts that (1) it works to some degree, (2) its harms are felt most by weak domestic actors, (3) its costs are largely hidden from the public, (4) the TLO has surface plausibility, (5) it is difficult to critique a TLO that purports to control terrorism, and (6) it is sustained by geopolitics; and, fourth, by arguing that the AML TLO may be distinctive insofar as it is a disciplinary TLO. Those singular properties may in fact be shared substantially by other TLOs directed at crime. The site of criminal justice thereby encourages a more differentiated understanding of TLOs in 21st-century settings.
Hydrogen lithography has been used to template phosphine-based surface chemistry to fabricate atomic-scale devices, a process we abbreviate as atomic precision advanced manufacturing (APAM). Here, we use mid-infrared variable angle spectroscopic ellipsometry (IR-VASE) to characterize single-nanometer thickness phosphorus dopant layers (δ-layers) in silicon made using APAM compatible processes. A large Drude response is directly attributable to the δ-layer and can be used for nondestructive monitoring of the condition of the APAM layer when integrating additional processing steps. The carrier density and mobility extracted from our room temperature IR-VASE measurements are consistent with cryogenic magneto-transport measurements, showing that APAM δ-layers function at room temperature. Finally, the permittivity extracted from these measurements shows that the doping in the APAM δ-layers is so large that their low-frequency in-plane response is reminiscent of a silicide. However, there is no indication of a plasma resonance, likely due to reduced dimensionality and/or low scattering lifetime.
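For context, the Drude free-carrier contribution mentioned above is conventionally modeled with a permittivity of the form below; this is the standard textbook parameterization, not necessarily the exact model the authors fit:

\[
\varepsilon(\omega) = \varepsilon_{\infty} - \frac{\omega_{p}^{2}}{\omega^{2} + i\gamma\omega},
\qquad \omega_{p}^{2} = \frac{n e^{2}}{\varepsilon_{0} m^{*}}
\]

Here \(n\) is the carrier density, \(m^{*}\) the effective mass, and \(\gamma\) the scattering rate; fitting \(\omega_{p}\) and \(\gamma\) to the ellipsometric spectra is what allows the carrier density and mobility (\(\mu = e/(m^{*}\gamma)\)) to be extracted, as described in the abstract.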
It is not clear to what extent associations between schizophrenia, cannabis use and cigarette use are due to a shared genetic etiology. We, therefore, examined whether schizophrenia genetic risk associates with longitudinal patterns of cigarette and cannabis use in adolescence and mediating pathways for any association to inform potential reduction strategies.
Methods
Associations between schizophrenia polygenic scores and longitudinal latent classes of cigarette and cannabis use from ages 14 to 19 years were investigated in up to 3925 individuals in the Avon Longitudinal Study of Parents and Children. Mediation models were estimated to assess the potential mediating effects of a range of cognitive, emotional, and behavioral phenotypes.
Results
The schizophrenia polygenic score, based on single nucleotide polymorphisms meeting a training-set p threshold of 0.05, was associated with late-onset cannabis use (OR = 1.23; 95% CI = 1.08–1.41), but not with cigarette or early-onset cannabis use classes. This association was not mediated through lower IQ, victimization, emotional difficulties, antisocial behavior, impulsivity, or poorer social relationships during childhood. Sensitivity analyses adjusting for genetic liability to cannabis or cigarette use, using polygenic scores excluding the CHRNA5-A3-B4 gene cluster, or basing scores on a 0.5 training-set p threshold, provided results consistent with our main analyses.
Conclusions
Our study provides evidence that genetic risk for schizophrenia is associated with patterns of cannabis use during adolescence. Investigation of pathways other than the cognitive, emotional, and behavioral phenotypes examined here is required to identify modifiable targets to reduce the public health burden of cannabis use in the population.