Barrett’s oesophagus (BE) is the precursor of oesophageal adenocarcinoma, which has become the most common type of oesophageal cancer in many Western populations. Existing evidence on diet and risk of BE comes predominantly from case–control studies, which are subject to recall bias in the measurement of diet. We aimed to investigate the potential effect of diet, including macronutrients, carotenoids, food groups, specific food items, beverages and dietary scores, on risk of BE in over 20 000 participants of the Melbourne Collaborative Cohort Study. Diet at baseline (1990–1994) was measured using a food frequency questionnaire. The outcome was BE diagnosed between baseline and follow-up (2007–2010). Logistic regression models were used to estimate OR and 95 % CI for diet in relation to risk of BE. Intakes of leafy vegetables and fruit were inversely associated with risk of BE (highest v. lowest quartile: OR = 0·59; CI: 0·38, 0·94; P-trend = 0·02 and OR = 0·58; CI: 0·37, 0·93; P-trend = 0·02, respectively), as were dietary fibre and carotenoids. Stronger associations were observed for foods than for the nutrients found in them. Positive associations were observed for discretionary food (OR = 1·54; CI: 0·97, 2·44; P-trend = 0·04) and total fat intake (OR per 10 g/d = 1·11; CI: 1·00, 1·23), although the association for fat was less robust in sensitivity analyses. No association was observed for meat, protein, dairy products or diet scores. Diet is a potentially modifiable risk factor for BE. Public health and clinical guidelines that incorporate dietary recommendations could contribute to a reduction in risk of BE and, thereby, of oesophageal adenocarcinoma.
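A minimal sketch of the kind of quartile analysis described above, assuming hypothetical column names (be_case, fruit_q, age, sex) rather than the study's actual variables or model specification:

```python
# Hypothetical sketch, not the study's actual model or data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # placeholder: one row per participant
# be_case: 1/0 outcome; fruit_q: intake quartile (1-4); age, sex: covariates

# Quartile ORs relative to the lowest quartile, with 95% CIs
model = smf.logit("be_case ~ C(fruit_q) + age + C(sex)", data=df).fit()
print(np.exp(model.params))
print(np.exp(model.conf_int()))

# Trend test: model the quartile as a continuous score
trend = smf.logit("be_case ~ fruit_q + age + C(sex)", data=df).fit()
print("P-trend:", trend.pvalues["fruit_q"])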
To evaluate the effectiveness of ultraviolet-C (UV-C) disinfection as an adjunct to standard chlorine-based disinfectant terminal room cleaning in reducing transmission of hospital-acquired multidrug-resistant organisms (MDROs) from a prior room occupant.
Design:
A retrospective cohort study was conducted to compare rates of MDRO transmission by UV-C status from January 1, 2016, through December 31, 2018.
Setting:
Acute-care, single-patient hospital rooms at 6 hospitals within an academic healthcare system in Pennsylvania.
Methods:
Transmission of hospital-acquired MDRO infection was assessed in patients subsequently assigned to a single-patient room whose source occupant carried 1 or more MDROs on or during admission. Acquisition of 5 pathogens was compared between exposed patients in rooms that received standard-of-care chlorine-based disinfectant terminal cleaning with or without adjunct UV-C disinfection. Logistic regression analysis was used to estimate the adjusted risk of pathogen transfer with adjunctive use of UV-C disinfection.
Results:
In total, 33,771 exposed patient admissions were evaluated; the source occupants carried 46,688 unique pathogens. Prior to the 33,771 patient admissions, 5,802 rooms (17.2%) were treated with adjunct UV-C disinfection. After adjustment for covariates, exposed patients in rooms treated with adjunct UV-C were at comparable risk of transfer of any pathogen (odds ratio, 1.06; 95% CI, 0.84–1.32; P = .64).
Conclusion:
Our analysis does not support the use of UV-C in addition to post-discharge cleaning with chlorine-based disinfectant to lower the risk of prior room occupant pathogen transfer.
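As a rough illustration of the comparison reported above (the study's odds ratio was covariate-adjusted; the counts below are placeholders, not the study's data), a crude odds ratio can be computed from a 2x2 table:

```python
# Rough illustration with placeholder counts, not the study's data.
import numpy as np
import statsmodels.api as sm

# Rows: adjunct UV-C vs standard cleaning only
# Columns: pathogen acquired vs not acquired
table = np.array([[40, 5762],      # hypothetical counts
                  [220, 27749]])   # hypothetical counts

t22 = sm.stats.Table2x2(table)
lo, hi = t22.oddsratio_confint()
print(f"Crude OR = {t22.oddsratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")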
To define conditions in which contact precautions can be safely discontinued for methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE).
Design:
Interrupted time series.
Setting:
15 acute-care hospitals.
Participants:
Inpatients.
Intervention:
Contact precautions for endemic MRSA and VRE were discontinued in 12 intervention hospitals and continued at 3 nonintervention hospitals. Rates of MRSA and VRE healthcare-associated infections (HAIs) were collected for the 12 months before and after discontinuation. Trends in HAI rates were analyzed using Poisson regression. To predict conditions under which contact precautions may be safely discontinued, selected baseline hospital characteristics and infection prevention practices were correlated with changes in HAI rates, stratified by hospital.
Results:
Aggregated HAI rates from intervention hospitals before and after discontinuation of contact precautions were 0.14 and 0.15 MRSA HAI per 1,000 patient days (P = .74), 0.05 and 0.05 VRE HAI per 1,000 patient days (P = .96), and 0.04 and 0.04 MRSA laboratory-identified (LabID) events per 100 admissions (P = .57). No statistically significant rate changes occurred between intervention and nonintervention hospitals. All successful hospitals had low baseline MRSA and VRE HAI rates and high hand hygiene adherence. We observed no correlations between rate changes after discontinuation and the assessed hospital characteristics and infection prevention factors, although rate improvement was associated with a higher proportion of semiprivate rooms (P = .04).
Conclusions:
Discontinuing contact precautions for MRSA/VRE did not result in increased HAI rates, suggesting that contact precautions can be safely removed from diverse hospitals, including community hospitals and those with lower proportions of private rooms. Good hand hygiene and low baseline HAI rates may be conditions permissive of safe removal of contact precautions.
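A minimal sketch of the kind of Poisson trend analysis described in the intervention paragraph above, assuming illustrative column names (mrsa_hai, post, month, patient_days) rather than the study's actual data layout:

```python
# Hypothetical sketch: Poisson regression of monthly HAI counts with
# patient-days as the exposure offset. Column names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("hai_monthly.csv")  # placeholder: one row per hospital-month
# post = 1 after contact precautions were discontinued, 0 before;
# month = numeric study month capturing the underlying time trend
model = smf.glm(
    "mrsa_hai ~ post + month",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
).fit()
print("Rate ratio (after vs before):", np.exp(model.params["post"]))
print("P value:", model.pvalues["post"])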
To examine associations between diet and risk of developing gastro-oesophageal reflux disease (GERD).
Design:
Prospective cohort with a median follow-up of 15·8 years. Baseline diet was measured using a FFQ. GERD was defined as self-reported current or history of daily heartburn or acid regurgitation beginning at least 2 years after baseline. Sex-specific logistic regressions were performed to estimate OR for GERD associated with diet quality scores and intakes of nutrients, food groups and individual foods and beverages. The effect of substituting saturated fat for monounsaturated or polyunsaturated fat on GERD risk was examined.
Setting:
Melbourne, Australia.
Participants:
A cohort of 20 926 participants (62 % women) aged 40–59 years at recruitment between 1990 and 1994.
Results:
For men, total fat intake was associated with increased risk of GERD (OR 1·05 per 5 g/d; 95 % CI 1·01, 1·09; P = 0·016), whereas total carbohydrate (OR 0·89 per 30 g/d; 95 % CI 0·82, 0·98; P = 0·010) and starch intakes (OR 0·84 per 30 g/d; 95 % CI 0·75, 0·94; P = 0·005) were associated with reduced risk. Nutrients were not associated with risk for women. For both sexes, substituting saturated fat for polyunsaturated or monounsaturated fat did not change risk. For both sexes, fish, chicken, cruciferous vegetables and carbonated beverages were associated with increased risk, whereas total fruit and citrus were associated with reduced risk. No association was observed with diet quality scores.
Conclusions:
Diet is a possible risk factor for GERD, but food considered as triggers of GERD symptoms might not necessarily contribute to disease development. Potential differential associations for men and women warrant further investigation.
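One common way to estimate such a fat-substitution effect (a sketch under assumed column names, not necessarily the authors' exact specification) is to include the substitute fat subtypes together with total fat and omit the fat being replaced; with total fat held constant, the coefficient on an included subtype then reflects swapping it for the omitted one:

```python
# Hypothetical substitution-model sketch: replacing saturated fat with
# polyunsaturated fat while holding total fat and energy constant.
# All column names (gerd, poly_fat, mono_fat, total_fat, energy, age)
# are illustrative, not from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # placeholder data source
# sat_fat is omitted; because total_fat is in the model, the poly_fat
# coefficient estimates substituting polyunsaturated for saturated fat.
model = smf.logit(
    "gerd ~ poly_fat + mono_fat + total_fat + energy + age", data=df
).fit()
print(np.exp(model.params["poly_fat"]))          # OR per gram substituted
print(np.exp(model.conf_int().loc["poly_fat"]))  # 95% CI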
In Chapter 19, the authors deal with the practices of good teachers of listening. They describe an interventionist study aimed at developing teachers’ understanding of listening pedagogy and practice and present the implications of the study for models of teacher development.
We examined whether and when English-learning 17-month-olds would accommodate Japanese forms as labels for novel objects. In Experiment 1, infants (n = 22) who were habituated to Japanese word–object pairs looked longer at switched test pairs than at familiar test pairs, suggesting that they had mapped Japanese word forms to objects. In Experiments 2 (n = 44) and 3 (n = 22), infants were presented with a spoken passage prior to habituation to assess whether experience with a different language would shift their perception of Japanese word forms. Here, infants did not demonstrate learning of the Japanese word–object pairs. These findings offer insight into the flexibility of the developing perceptual system. That is, when there is no evidence to the contrary, 17-month-olds will accommodate forms that vary from their typical input, but will efficiently constrain their perception when cued to the fact that they are not listening to their native language.
This paper considers research and practice relating to listening in instructed classroom settings, limiting itself to what might be called unidirectional listening (Macaro, Graham & Vanderplank 2007) – in other words, where learners listen to a recording, a TV or radio clip or lecture, but where there is no communication back to the speaker(s). A review of the literature relating to such listening reveals a tendency for papers to highlight two features in their introductory lines: first, the acknowledged importance of listening as a skill underpinning second language (L2) acquisition more broadly, and second, the relative paucity of research into listening compared with the skills of speaking, reading or writing. In the last ten years or so, however, there has been a growth in the number of studies conducted in the field, as evidenced in Vandergrift's review in 2007 and Vanderplank's more recent overview (2013). Consequently, my view is that it is possible to identify from that research certain key principles in relation to listening within instructed settings, particularly regarding listening strategies.
We investigated 16- and 20-month-olds’ flexibility in mapping phonotactically illegal words to objects. In an associative word-learning task, infants were presented with a training phase that either highlighted or did not highlight the referential status of a novel label. Infants were then habituated to two novel objects, each paired with a phonotactically illegal Czech word. When referential cues were provided, 16-month-olds, but not 20-month-olds, formed word–object mappings. In the absence of referential cues, infants of both ages failed to map the novel words. These findings illustrate the complex interplay between infants’ developing sound system and their word-learning abilities.
Research suggests that the way in which cognitive therapy is delivered is an important factor in determining outcomes. We test the hypothesis that the development of a shared problem list, use of case formulation, homework tasks and active intervention strategies act as process variables.
Method
Presence of these components during therapy is taken from therapist notes. The direct and indirect effect of the intervention is estimated by an instrumental variable analysis.
Results
A significant decrease in the symptom score is found for case formulation (coefficient = –23, 95% CI –44 to –1.7, P = 0.036) and for homework (coefficient = –0.26, 95% CI –0.51 to –0.001, P = 0.049). Improvement with the inclusion of active change strategies is of borderline significance (coefficient = –0.23, 95% CI –0.47 to 0.005, P = 0.056).
Conclusions
There is a greater treatment effect if formulation and homework are involved in therapy. However, high correlation between components means that these may be indicators of overall treatment fidelity.
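A minimal sketch of an instrumental-variable (two-stage least squares) analysis of the kind described in the Method, using randomised treatment allocation as the instrument; the linearmodels package and all column names are assumptions for illustration, not the authors' implementation:

```python
# Hypothetical IV sketch: effect of a therapy component (case formulation)
# on symptom score, instrumented by randomised treatment allocation.
# Column names are illustrative.
import pandas as pd
from linearmodels.iv import IV2SLS

df = pd.read_csv("trial.csv")  # placeholder data source
# symptom_score: outcome; formulation: 1 if case formulation was used
# (endogenous); randomised: treatment-arm allocation (instrument)
model = IV2SLS.from_formula(
    "symptom_score ~ 1 + baseline_score + [formulation ~ randomised]", data=df
).fit()
print(model.params["formulation"])
print(model.conf_int().loc["formulation"])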
The difficulties in conducting palliative care research have been widely acknowledged. In order to generate the evidence needed to underpin palliative care provision, collaborative research is considered essential. Prior to formalizing the development of a research network for the state of Victoria, Australia, a preliminary study was undertaken to ascertain interest and recommendations for the design of such a collaboration.
Method:
Three data-collection strategies were used: a cross-sectional questionnaire, interviews, and workshops. The questionnaire was completed by multidisciplinary palliative care specialists from across the state (n = 61); interviews were conducted with senior clinicians and academics (n = 21), followed by two stakeholder workshops (n = 29). The questionnaire was constructed specifically for this study, measuring involvement in and perceptions of palliative care research.
Results:
Both the interview and the questionnaire data demonstrated strong support for a palliative care research network and aided in establishing a research agenda. The stakeholder workshops assisted with strategies for the formation of the Palliative Care Research Network Victoria (PCRNV) and guided the development of the mission and strategic plan.
Significance of results:
The research and efforts to date to establish the PCRNV are encouraging and provide optimism for the evolution of palliative care research in Australia. The international implications are highlighted.
Background: Substantial epidemiological research has shown that psychotic experiences are more common in densely populated areas. Many patients with persecutory delusions find it difficult to enter busy social urban settings. The stress and anxiety caused by being outside lead many patients to remain indoors. We therefore developed a brief CBT intervention, based upon a formulation of the way urban environments cause stress and anxiety, to help patients with paranoid thoughts to feel less distressed when outside in busy streets. Aims: The aim was to pilot the new intervention for feasibility and acceptability and gather preliminary outcome data. Method: Fifteen patients with persecutory delusions in the context of a schizophrenia diagnosis took part. All patients first went outside to test their reactions, received the intervention, and then went outside again. Results: The intervention was considered useful by the patients. There was evidence that going outside after the intervention led to less paranoid responses than the initial exposure, but this was only statistically significant for levels of distress. Conclusions: Initial evidence was obtained that a brief CBT module specifically focused on helping patients with paranoia go outside is feasible, acceptable, and may have clinical benefits. However, it could not be determined from this small feasibility study that any observed improvements were due to the CBT intervention. Challenges in this area and future work required are outlined.
Background: Research suggests that core schemas are important in both the development and maintenance of psychosis. Aims: The aim of the study was to investigate and compare core schemas in four groups along the continuum of psychosis and examine the relationships between schemas and positive psychotic symptomatology. Method: A measure of core schemas was distributed to 20 individuals experiencing first-episode psychosis (FEP), 113 individuals with “at risk mental states” (ARMS), 28 participants forming a help-seeking clinical group (HSC), and 30 non-help-seeking individuals who endorse some psychotic-like experiences (NH). Results: The clinical groups scored significantly higher than the NH group for negative beliefs about self and about others. No significant effects of group on positive beliefs about others were found. For positive beliefs about the self, the NH group scored significantly higher than the clinical groups. Furthermore, negative beliefs about self and others were related to positive psychotic symptomatology and to distress related to those experiences. Conclusions: Negative evaluations of the self and others appear to be characteristic of the appraisals of people seeking help for psychosis and psychosis-like experiences. The results support the literature that suggests that self-esteem should be a target for intervention. Future research would benefit from including comparison groups of people experiencing chronic psychosis and people who do not have any psychotic-like experiences.
Delusions are a key symptom of psychosis and they are frequently distressing and disabling. Existing treatments, both pharmacological and psychological, are only partially effective. It is important to develop new treatment approaches based on theoretically derived and empirically tested processes. Delusions are associated with a reasoning bias: the jumping to conclusions (JTC) bias involves gathering limited information to reach decisions. It is proposed that this bias influences appraisals of psychotic experiences leading to the formation and persistence of delusions. Existing treatments do not influence JTC. A new intensive treatment approach – ‘reasoning training’ – is described. It aims to encourage participants to gather information, consider alternative explanations for events and review the evidence before reaching a decision. Preliminary data suggest that it is possible to change the JTC bias and that this improves belief flexibility and may reduce delusional conviction. The concepts and methods of this new approach have implications for clinical practice.
Background: Dementia research often requires the participation of people with dementia. Obtaining informed consent is problematic when potential participants lack the capacity to provide it. We investigated comfort with proxy consent to research involving older adults deemed incapable of this decision, and examined if comfort varies with the type of proxy and the study's risk-benefit profile.
Methods: We surveyed random samples of five relevant groups (older adults, informal caregivers, physicians, researchers in aging, and Research Ethics Board members) from four Canadian provinces. Respondents were presented with scenarios involving four types of proxies (non-assigned, designated in a healthcare advance directive with or without instructions specific to research participation, and court-appointed). Given a series of risk-benefit profiles, respondents indicated whether they were comfortable with proxy consent to research for each scenario.
Results: Two percent of the respondents felt proxy consent should never be allowed. In all groups, comfort depended far more on the risk-benefit profile associated with the research scenario than on the type of proxy. For research involving little or no risk and potential personal benefits, over 90% of the respondents felt comfortable with substitute consent by a designated or court-appointed proxy, while 80% were at ease with a non-assigned proxy. For studies involving serious risks with potentially greater personal benefits, older adults and informal caregivers were less comfortable with proxy consent.
Conclusions: A large majority of Canadians are comfortable with proxy consent for low-risk research. Further work is needed to establish what kinds of research are considered to be low risk.
The Kepler satellite provides a unique opportunity to study the detailed optical photometric variability of late-type stars, with unprecedentedly long (several-year) continuous monitoring and sensitivity to very small-scale variations. We are studying a sample of over two hundred cool (mid-A to late-K spectral type) stars using Kepler long-cadence (30-minute sampling) observations. These stars show a remarkable range of photometric variability, but in this paper we concentrate on rotational modulation due to starspots and flaring. Modulation at the 0.1% level is readily discernible. We highlight the rapid timescales of starspot evolution seen on solar-like stars with rotational periods between 2 and 7 days.
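A minimal sketch of how rotational modulation like this is commonly recovered from a light curve, using astropy's Lomb-Scargle periodogram; the file name, column layout and period search range are illustrative assumptions, not the paper's pipeline:

```python
# Hypothetical sketch: starspot rotation period from a Kepler-style
# long-cadence light curve via a Lomb-Scargle periodogram.
import numpy as np
from astropy.timeseries import LombScargle

t, flux = np.loadtxt("lightcurve.txt", unpack=True)  # days, raw flux
flux = flux / np.median(flux) - 1.0  # relative variation (0.001 = 0.1%)

# Search periods of roughly 1-20 days, bracketing the 2-7 day rotators
frequency, power = LombScargle(t, flux).autopower(
    minimum_frequency=1 / 20.0, maximum_frequency=1 / 1.0
)
best_period = 1 / frequency[np.argmax(power)]
print(f"Candidate rotation period: {best_period:.2f} d")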
Using the British ‘Index to Theses’, we found forty-seven Ph.D.s relating to second and foreign language learning and/or teaching defended in English universities in 2006. Objective criteria led us to fourteen theses that had investigated both teaching and learning. Over half of these adopted a process–product research design with the aim of finding causal relationships between teaching and learning. Six theses focused on individual differences (motivation, strategies, attitudes), with three adopting an ‘effectiveness-of-intervention’ approach and three following more descriptive, exploratory designs. The designs of the ‘effectiveness-of-intervention’ studies varied greatly, ranging from naturalistic evaluations to highly controlled randomised experiments. They covered a range of pedagogical concerns, including the use of computers, error correction, language portfolios, learner strategies and communicative-style activities. In addition to our own comments on the quality of the studies and reports, we present considerable methodological detail to enable the reader to evaluate the validity of the findings and claims made in each study. We argue that Ph.D. theses need to demonstrate fully that the implications drawn from the study are supported by the data collection and analyses described, which was not always the case in the theses reviewed. Finally, we make suggestions for future areas of investigation by postgraduate researchers.
Ischemic cardiovascular disease is the leading cause of death in Canada. In ST elevation myocardial infarction (STEMI), time to reperfusion is a key determinant in reducing morbidity and mortality with percutaneous coronary intervention (PCI) being the preferred reperfusion strategy. Where PCI is available, delays to definitive care include times to electrocardiogram (ECG) diagnosis and cardiovascular laboratory access. In 2004, the Cardiac Care Network of Ontario recommended implementation of an emergency department (ED) protocol to reduce reperfusion time by transporting patients with STEMI directly to the nearest catheterization laboratory. The model was implemented in Frontenac County in April 2005. The objective of this study was to assess the effectiveness of a protocol for rapid access to PCI in reducing door-to-balloon times in STEMI.
Methods:
Two 1-year periods before and after implementation of a rapid access to PCI protocol (ending March 2005 and June 2006, respectively) were studied. Administrative databases were used to identify all subjects with STEMI who were transported by regional emergency medical services (EMS) and received emergent PCI. The primary outcome measure was time from ED arrival to first balloon inflation (door-to-balloon time). Times are presented as medians and interquartile ranges (IQRs). Statistical comparisons were made using the Mann–Whitney U test and presented graphically with Kaplan–Meier curves.
Results:
Patients transported under the rapid access protocol (n = 39) were compared with historical controls (n = 42). Median door-to-balloon time was reduced from 87 minutes (IQR 67–108) preprotocol to 62 minutes (IQR 40–80) postprotocol (p < 0.001).
Conclusion:
In our region, implementation of an EMS protocol for rapid access to PCI significantly reduced time to reperfusion for patients with STEMI.
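A minimal sketch of the pre/post comparison described in the Methods, using a Mann–Whitney U test on door-to-balloon times; the arrays below are placeholders for illustration, not the study's data:

```python
# Hypothetical sketch of the before/after comparison: Mann-Whitney U test
# on door-to-balloon times. Values are placeholders, not the study's data.
import numpy as np
from scipy.stats import mannwhitneyu

pre = np.array([95, 87, 110, 72, 88])   # minutes, illustrative only
post = np.array([60, 55, 70, 62, 48])   # minutes, illustrative only

stat, p = mannwhitneyu(pre, post, alternative="two-sided")
print(f"Median pre: {np.median(pre):.0f} min, "
      f"post: {np.median(post):.0f} min, p = {p:.3f}")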