Nosocomial outbreaks leading to healthcare worker (HCW) infection and death have been increasingly reported during the coronavirus disease 2019 (COVID-19) pandemic.
We implemented a strategy to reduce nosocomial acquisition.
We summarized our experience in implementing a multipronged infection control strategy in the first 300 days (December 31, 2019, to October 25, 2020) of the COVID-19 pandemic under the governance of the Hospital Authority in Hong Kong.
Of 5,296 COVID-19 patients, 4,808 (90.8%) were diagnosed in the first pandemic wave (142 cases), second wave (896 cases), and third wave (3,770 cases) in Hong Kong. With the exception of 1 patient who died before admission, all COVID-19 patients were admitted to the public healthcare system for a total of 78,834 COVID-19 patient days. The median length of stay was 13 days (range, 1–128 days). Of 81,955 HCWs, 38 (0.05%; 2 doctors, 11 nurses, and 25 nonprofessional staff) acquired COVID-19. With the exception of 5 of the 38 HCWs (13.2%) infected by HCW-to-HCW transmission in nonclinical settings, no HCW had documented transmission from COVID-19 patients in the hospitals. The incidence of COVID-19 among HCWs was significantly lower than that of the general population (0.46 per 1,000 HCWs vs 0.71 per 1,000 population; P = .008). The incidence of COVID-19 among professional staff was significantly lower than that of nonprofessional staff (0.30 vs 0.66 per 1,000 full-time equivalents; P = .022).
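The incidence comparison above (0.46 vs 0.71 per 1,000) can be checked with a pooled two-proportion z-test. This is a minimal sketch, not the authors' stated method, and the Hong Kong population denominator (~7.5 million) is an assumption, since the abstract reports only the rate:

```python
from math import sqrt, erfc

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal survival function
    return z, erfc(abs(z) / sqrt(2))

# 38 infected HCWs of 81,955 vs 5,296 community cases; the population
# size of ~7.5 million is an assumption, not stated in the abstract.
z, p = two_proportion_z(38, 81_955, 5_296, 7_500_000)
print(round(38 / 81_955 * 1000, 2))       # 0.46 per 1,000 HCWs
print(round(5_296 / 7_500_000 * 1000, 2)) # 0.71 per 1,000 population
```

With these assumed denominators the test yields a p-value close to the reported P = .008; the small discrepancy would come from the exact population figure used.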
A hospital-based approach spared our healthcare service from being overloaded. With our multipronged infection control strategy, no nosocomial COVID-19 was identified among HCWs in the first 300 days of the COVID-19 pandemic in Hong Kong.
Extensive environmental contamination by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has been reported in hospitals during the coronavirus disease 2019 (COVID-19) pandemic. We report our experience with the practice of directly observed environmental disinfection (DOED) in a community isolation facility (CIF) and a community treatment facility (CTF) in Hong Kong.
The CIF, with 250 single-room bungalows in a holiday camp, opened on July 24, 2020, to receive step-down patients from hospitals. The CTF, with 500 beds in open cubicles inside a convention hall, was activated on August 1, 2020, to admit newly diagnosed COVID-19 patients from the community. Healthcare workers (HCWs) and cleaning staff received infection control training to reinforce donning and doffing of personal protective equipment and to understand the practice of DOED, in which the cleaning staff observed patient and staff activities and then performed environmental disinfection immediately thereafter. Supervisors also observed cleaning staff to ensure the quality of work. In the CTF, air and environmental samples were collected on days 7, 14, 21, and 28 for SARS-CoV-2 detection by RT-PCR. Patient compliance with mask wearing was also recorded.
Of 291 HCWs and 54 cleaning staff who managed 243 patients in the CIF and 674 patients in the CTF from July 24 to August 29, 2020, no one acquired COVID-19. All 24 air samples and 520 environmental samples collected in the patient area of the CTF were negative for SARS-CoV-2. Patient compliance with mask wearing was 100%.
With appropriate infection control measures, zero environmental contamination and nosocomial transmission of SARS-CoV-2 to HCWs and cleaning staff was achieved.
This study examined how language knowledge and item properties (i.e., semantic relatedness and position) influenced Chinese missing logographeme effects. Eighty-four Chinese readers and 53 English readers were asked to search for the Chinese logographeme 口 while reading a Chinese prose passage. The target 口 appeared in five different positions (i.e., left, right, top, bottom, or inside), varying its degree of semantic relatedness to its embedded characters. The generalized linear mixed-effect model revealed a significant interaction between semantic relatedness and position in Chinese, but not in English, readers when visual complexity and frequency were controlled. For Chinese readers, a higher omission rate occurred when 口 appeared in the top and inside positions and exhibited low semantic relatedness with its embedded characters, whereas 口 was omitted more when it was positioned on the right and exhibited high semantic relatedness to its embedded characters. English readers exhibited a different omission pattern: 口 was omitted more when it appeared in the left or right position irrespective of semantic relatedness. In addition, 口 was omitted more in the inside, rather than the bottom, position. These findings suggest that the omission rate of the logographeme is determined by item properties at the sublexical level and the reader’s language knowledge.
Cortical spreading depolarization (CSD) is recognized as a cause of transient neurological symptoms (TNS) in various clinical entities. Although the scientific literature on CSD has been flourishing, it remains an underrecognized pathophysiology in clinical practice. The literature evoking CSD in relation to subdural hematoma (SDH) is particularly scarce. Patients with SDH frequently suffer from TNS, most of which are attributed to seizures despite an atypical semiology, evolution, and therapeutic response. Recent literature has suggested that a significant proportion of these patients' TNS represent the clinical manifestations of underlying CSD. Recently, the term Non-Epileptic Stereotypical Intermittent Symptoms (NESIS) has been proposed to describe a subgroup of patients presenting with TNS in the context of SDH. Indirect evidence and recent research suggest that the pathophysiology of NESIS could represent the clinical manifestation of CSD. This article provides a concise yet thorough review of the current state of the literature on the pathophysiology of CSD, with a particular focus on recent research and knowledge regarding the presence of CSD in the context of subdural hematoma. Although many questions remain, the evolution of knowledge in this field would likely have significant diagnostic, therapeutic, and prognostic implications.
Taʿzia khani, also known as shabih khani, is a Shi˓ite form of devotional theatre. The performances, ongoing in the present day, are fundamentally rituals of lament, and centre around commemoration of the martyrdom of the Third Imam of Shi˓i Islam and grandson of the Prophet Mohammad, Hosayn ibn ˓Ali ibn Abi Taleb, born c.AH 6 (AD 627). The taʿzia repertoire includes a largely anonymous cycle of plays, the central corpus of which portray the siege and slaughter of Hosayn and seventy-two of his companions on the plain of Karbala in the Islamic month of Moharram in the year AH 61 (AD 680). As a genre of drama, taʿzia has undergone its maximum development in the Iranian context, although the same events are commemorated amongst Shi˓i communities internationally through a wide variety of performance forms.
The aims of this article are twofold. First, to introduce this tradition in its Persian context, in the hope of encouraging a comparative discourse with current scholars of European varieties of early theatre (religious and otherwise). Secondly, I aim to contribute to the existing understanding of the process by which this devotional theatre form emerged. Indeed, large annual Moharram commemorations for Hosayn have been documented on the Iranian plateau from the tenth century AH (sixteenth century AD), when Shi˓ism became the official religion of state under the Safavid dynasty, AH 907–1135 (AD 1501–1722). During this period public lamentations took a variety of forms, yet in urban contexts they appear to have been largely split between processional rituals and rousing public recitations of the Karbala narrative and life of Hosayn, a practice known as rawza khani. Despite some debate, the prevailing scholarly theory has been that the first taʿzia plays were generated around the mid-eighteenth century AD through the fusing of the visual elements of the ambulatory lament rituals for Hosayn with the narratives recounted in the rawza khani recitals. In light of new evidence suggesting that the tradition began somewhat earlier, I propose an alternative theory of how it emerged.
As suggested by the existing theory, the processional rituals of the late eleventh century AH (seventeenth century AD) and early twelfth century AH (eighteenth century AD) became increasingly theatrical in nature. Likewise, the influence of the literature used in the Safavid-era rawza khani recitals (for which see below) on the content of the main episodes of the taʿzia cycle is clear.
Exposure therapy is consistently indicated as the first-line treatment for anxiety-related disorders. Unfortunately, therapists often deliver exposure therapy in an overly cautious, less effective manner, characterized by using their own ‘therapist safety behaviours’. Cognitive behavioural models postulate that beliefs about therapist safety behaviours are related to their use; however, little is known about the beliefs therapists hold regarding therapist safety behaviour use. The present study aimed to identify the beliefs exposure therapists have regarding the necessity of therapist safety behaviours and to examine the relationship between this construct and therapist safety behaviour use. Australian psychologists (n = 98) completed an online survey that included existing measures of therapist safety behaviour use, therapist negative beliefs about exposure therapy, likelihood to exclude anxious clients from exposure therapy, and use of intensifying exposure techniques. Participants also completed the Exposure Implementation Beliefs Scale (EIBS), a measure created for the present study which assesses beliefs regarding the necessity of therapist safety behaviours. Beliefs about the necessity of therapist safety behaviours – particularly in protecting the client – significantly predicted therapist safety behaviour use. Findings suggest that exposure therapy training media should aim to decrease therapist safety behaviour use by addressing beliefs about the necessity of therapist safety behaviours, especially in protecting the client.
Key learning aims
(1) To understand what therapist safety behaviours are in the context of exposure therapy.
(2) To identify common beliefs about therapist safety behaviours.
(3) To understand how beliefs about therapist safety behaviours relate to therapist safety behaviour use.
(4) To consider how exposure therapy delivery may be improved by modifying beliefs about therapist safety behaviours.
(5) To explore how beliefs about therapist safety behaviours may be modified to reduce therapist safety behaviour use.
Automated surveillance of healthcare-associated infections reduces workload and improves standardization, but it has not yet been adopted widely. In this study, we assessed the performance and feasibility of an easy implementable framework to develop algorithms for semiautomated surveillance of deep incisional and organ-space surgical site infections (SSIs) after orthopedic, cardiac, and colon surgeries.
Retrospective cohort study in multiple countries.
European hospitals were recruited and selected based on the availability of manual SSI surveillance data from 2012 onward (reference standard) and on the ability to extract relevant data from electronic health records. A questionnaire on local manual surveillance and clinical practices was administered to participating hospitals, and the information collected was used to pre-emptively design semiautomated surveillance algorithms standardized for multiple hospitals and for center-specific application. Algorithm sensitivity, positive predictive value, and reduction of manual charts requiring review were calculated. Reasons for misclassification were explored using discrepancy analyses.
The study included 3 hospitals, in the Netherlands, France, and Spain. Classification algorithms were developed to indicate procedures with a high probability of SSI. Components concerned microbiology, prolonged length of stay or readmission, and reinterventions. Antibiotics and radiology ordering were optional. In total, 4,770 orthopedic procedures, 5,047 cardiac procedures, and 3,906 colon procedures were analyzed. Across hospitals, standardized algorithm sensitivity ranged between 82% and 100% for orthopedic surgery, between 67% and 100% for cardiac surgery, and between 84% and 100% for colon surgery, with 72%–98% workload reduction. Center-specific algorithms had lower sensitivity.
Using this framework, algorithms for semiautomated surveillance of SSI can be successfully developed. The high performance of standardized algorithms holds promise for large-scale standardization.
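The reported performance figures follow directly from an algorithm's confusion counts against the manual reference standard. A minimal sketch with invented counts (the abstract reports only ranges, not the underlying numbers):

```python
def surveillance_metrics(tp, fp, fn, total_procedures):
    """Performance of a semiautomated SSI surveillance algorithm.

    tp, fp, fn: procedures flagged (or missed) by the algorithm,
    judged against the manual reference standard. Workload reduction
    is the share of charts the algorithm removes from manual review.
    """
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    workload_reduction = 1 - (tp + fp) / total_procedures
    return sensitivity, ppv, workload_reduction

# Illustrative counts only -- not taken from the study.
sens, ppv, wr = surveillance_metrics(tp=45, fp=105, fn=5,
                                     total_procedures=4_770)
print(f"sensitivity={sens:.0%}, PPV={ppv:.0%}, "
      f"workload reduction={wr:.0%}")
```

With these made-up counts the algorithm would catch 90% of reference-standard SSIs while sending only the flagged 150 of 4,770 charts to manual review, illustrating how high sensitivity and large workload reduction can coexist even at modest positive predictive value.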
To evaluate the effectiveness and tolerability of brivaracetam (BRV) in a refractory epilepsy population in an outpatient clinical setting.
Retrospective medical information system review and self-report questionnaire for all patients treated with BRV until the end of 2017.
Thirty-eight patients were included, 73.7% of whom were female, with a mean age of 36.2 years. The mean number of antiepileptic drugs (AEDs) previously used was 8.9, and of those currently used, 2.5. Mean seizure frequency in the last 3 months was 12 per month. At 3, 6, 12, and 15 months, the 50% responder rates were 36.1%, 32%, 41.2%, and 45.5%, respectively. Patients took BRV for a median duration of 8.25 months, ranging from 7 days to 60 months. Retention rates were 75.0%, 72.0%, 59.2%, and 47.9% at 3, 6, 12, and 15 months, respectively. Overall, the main reasons for discontinuation were adverse events (AEs) (52.3%), lack of efficacy (35.3%), or both (11.8%). The rate of total AEs was 60.5% according to medical records and 85.7% according to the questionnaire, consisting mostly of tiredness and psychiatric and memory complaints. Psychiatric side effects occurred in 31.6% according to medical records and 47.4% according to questionnaire results, which is higher than previously reported, and persisted throughout the study period.
BRV appears to be a useful and safe add-on treatment, even in a very refractory group of patients. In this real-life clinical setting, psychiatric AEs were found at a higher rate than previously published.
This chapter examines the theoretical and empirical foundations of the roles of morphological and semantic skills in developmental dyslexia. Morphemes are the minimal units of meaning by which we create new words in any given language (e.g., “magic”+“ian” = “magician”). Semantics is the study of meaning, broadly speaking. In this chapter, we review data on children’s access to meaning at the word and sentence level in tasks, primarily in the oral modality. This review is important because of two common assumptions. The first is of the dominant role of phonological skills in dyslexia, an assumption that has limited the scope of empirical exploration into other potentially implicated factors. The second is that people with dyslexia have a strength in morphology and semantics, a speculation with surprisingly little empirical foundation. We first review the theoretical background for these speculations. We then present the available research evidence, focusing specifically on children with dyslexia, for alphabetic, morphosyllabic, and abjad writing systems.
We examined whether morphological awareness made a significant contribution to word-level reading across Grades 1 to 4. We tested these relations specifically with a task measuring awareness of past-tense forms. A total of 375 children from Grades 1 to 4 completed tasks assessing past-tense morphological awareness along with real word and pseudoword reading. Children also completed control measures assessing phonological awareness, phonological short-term memory, sentence-level language skills, and nonverbal cognitive ability. After these controls, past-tense morphological awareness was a significant predictor of real word reading in Grades 1 and 2, but not in Grades 3 and 4. Further, with all controls in place, past-tense morphological awareness was a consistent predictor of pseudoword reading across Grades 1 to 4. Morphological awareness, at least as measured with past-tense verbs, appears to have a role in word reading across the early to middle elementary school grades; for young readers, there are relations to reading of both known and novel words, and for older readers, relations are significant specifically in reading novel words. These findings are discussed within the context of theories of word reading development.
We examined morphological awareness and reading achievement in university students in two ways. First, students with and without a self-reported history of reading difficulties were compared on word reading and text reading achievement, and on the reading-related skills of morphological awareness, orthographic processing, and phonological processing. Second, the unique contribution of morphological awareness to reading achievement was examined for a larger sample of first-year university students. Students with a self-reported history of reading difficulties (n = 54) showed moderate to large gaps in each area of reading achievement, and timed reading comprehension appeared more severely impaired than word-reading efficiency. These students had a deficit in morphological awareness that persisted even when (a) phonological awareness and orthographic processing skills, or (b) word-reading accuracy were statistically controlled. In the larger first-year sample (N = 211), morphological awareness contributed to variance in word reading beyond that accounted for by phonological awareness and orthographic processing. Furthermore, of the reading-related skills, only morphological awareness made a unique contribution to reading comprehension beyond variance accounted for by word reading. Taken together, these results demonstrate that morphological awareness makes unique contributions to university students’ reading achievement and is an additional difficulty for students with a self-reported history of reading difficulties.
Background: Safety behaviours are ubiquitous across anxiety disorders and are associated with the aetiology, maintenance and exacerbation of anxiety. Cognitive behavioural models posit that beliefs about safety behaviours directly influence their use. Therefore, beliefs about safety behaviours may be an important component in decreasing safety behaviour use. Unfortunately, little empirical research has evaluated this theorized relationship.
Aims: The present study aimed to examine the predictive relationship between beliefs about safety behaviours and safety behaviour use while controlling for anxiety severity.
Method: Adults with clinically elevated levels of social anxiety (n = 145) and anxiety sensitivity (n = 109) completed an online survey that included established measures of safety behaviour use, quality of life, and anxiety severity. Participants also completed the Safety Behaviour Scale (SBS), a measure created for the current study which includes a transdiagnostic checklist of safety behaviours, as well as questions related to safety behaviour use and beliefs about safety behaviours.
Results: Within both the social anxiety and anxiety sensitivity groups, positive beliefs about safety behaviours predicted greater safety behaviour use, even when controlling for anxiety severity. Certain beliefs were particularly relevant in predicting safety behaviour use within each of the clinical analogue groups.
Conclusions: Findings suggest that efforts to decrease safety behaviour use during anxiety treatment may benefit from identifying and modifying positive beliefs about safety behaviours.
We agree with Brette's assessment that the coding metaphor has become more problematic than helpful for theories of brain and cognitive functioning. In an effort to aid in constructing an alternative, we argue that joining the insights from the dynamical systems approach with the semiotic framework of C. S. Peirce can provide a fruitful perspective.
Spelling is a key, and telling, component of children’s literacy development. An important aspect of spelling development lies in children’s sensitivity to morphological root constancy. This is the sensitivity to the fact that the spelling of roots typically remains constant across related words (e.g., sing in singing and singer). The present investigation examined the extent to which children with dyslexia and younger typically developing children are sensitive to this feature of the orthography. We did so with a spelling-level matched design (e.g., Bourassa & Treiman, 2008) and by further contrasting results with those for a sample of children of the same chronological age as the dyslexic group. Analyses revealed that the dyslexic children and their spelling-ability matched peers used the root constancy principle to a similar degree. However, neither group used this principle to its maximum extent; maximal use of root constancy did emerge for age matched peers. Overall, the findings support the idea that sensitivity to root constancy in children with dyslexia is characterized by delayed rather than atypical development.
Theories of reading development generally agree that, in addition to phonological decoding, some kind of orthographic processing skill underlies the ability to learn to read words. However, there is a lack of clarity as to which aspect(s) of orthographic processing are key in reading development. We test here whether this is orthographic knowledge and/or orthographic learning. Whereas orthographic knowledge has been argued to reflect a child’s existing store of orthographic representations, orthographic learning is concerned with the ability to form these representations. In a longitudinal study of second- and third-grade students, we evaluate the relations between these two aspects of orthographic processing and word-reading outcomes. The results of our analyses show that variance captured by orthographic knowledge overlaps with that of word reading, to the point that they form a single latent word-reading factor. In contrast, orthographic learning is distinctive from this factor. Further, structural equation modeling demonstrates that early orthographic learning was related to gains in word reading skills. We discuss the implications of these findings for theories of word-reading development.
“Traditional” foodways are represented as an important part of cultural heritage in Europe. Two legal instruments aim to play a role in safeguarding them—namely, the Traditional Specialties Guaranteed (TSG) scheme and the 2003 UNESCO Convention for the Safeguarding of the Intangible Cultural Heritage. These instruments are sometimes used in parallel—for example, in the TSG registration for “Pizza Napoletana” and the nomination of “the art of Neapolitan ‘pizzaiuolo’” to one of the lists of the Convention. While recognizing the important role of state actors in this process, this article proposes going beyond a simple “misappropriation” thesis to look at the possible economic effects of registration and inscription.
Background: While exposure therapy effectively reduces anxiety associated with specific phobias, not all individuals respond to treatment and some will experience a return of fear after treatment ceases. Aims: This study aimed to test the potential benefit of increasing the intensity of exposure therapy by adding an extra step that challenged uncontrollability (Step 15: allowing a spider to walk freely over one's body) to the standard fear hierarchy. Method: Fifty-one participants who had a severe fear of spiders completed two 60-min exposure sessions 1 week apart in a context that was either the same or different from the baseline and follow-up assessment context. Participants were categorized into groups based on the last hierarchy step they completed during treatment (Step 14 or fewer, or Step 15). Results: Those who completed Step 15 had greater reductions in fear and beliefs about the probability of harm from baseline to post-treatment than those who completed fewer steps. Although completing Step 15 did not prevent fear from returning after a context change, it allowed people to maintain their ability to tolerate their fear, which earlier steps did not. Despite some fear returning after a context change, individuals who completed Step 15 tended to report greater reductions in fear from baseline to the follow-up assessment than participants who completed 14 or fewer steps. Conclusions: Overall, these results suggest that more intensive exposure that directly challenges harm beliefs may lead to greater changes in fear and fear beliefs than less intensive exposure.
Background: Transient neurological symptoms in patients with subdural hematoma (SDH) are often attributed to secondary epilepsy despite a negative workup. We believe a significant proportion of these patients could rather suffer from cortical spreading depolarization (CSD). Methods: We performed a retrospective case-control study of patients with transient neurological symptoms post-SDH evacuation between 1996 and 2017. The clinical features of patients with negative EEG were compared to those with positive EEG (ictal or interictal abnormalities) and a clinical scoring system was created. Results: 59 patients were included, 20 (34%) with a positive EEG. Speech-related symptoms (OR 4.8, p=0.018) and prolonged episodes (OR 23.1, p=0.001) were associated with a negative EEG. Clonic movements (OR 0.014, p<0.0005), impaired awareness (OR 0.013, p<0.0005), positive symptoms (OR 0.05, p<0.0001), complete response to standard antiepileptic drugs (OR 0.06, p=0.007) and mortality (OR 0.021, p=0.003) were associated with a positive EEG. We built a clinical score based on these features, which showed a 90% sensitivity and 100% specificity. Conclusions: We believe that the differences observed between both groups were driven by the presence of CSD rather than seizure in the case group. Our proposed scoring system can help predict EEG results and may be useful to identify CSD in future trials.
The aim of the study was to assess whether a simple substitution of carbohydrate in the conventionally recommended diet with protein and fat would result in a clinically meaningful reduction in postprandial hyperglycaemia in subjects with type 2 diabetes mellitus (T2DM). In all, sixteen subjects with T2DM treated with metformin only, fourteen male, with a median age of 65 (43–70) years, HbA1c of 6·5 % (47 mmol/l) (5·5–8·3 % (37–67 mmol/l)) and a BMI of 30 (sd 4·4) kg/m2 participated in the randomised, cross-over study. A carbohydrate-reduced high-protein (CRHP) diet was compared with an iso-energetic conventional diabetes (CD) diet. Macronutrient contents of the CRHP/CD diets consisted of 31/54 % energy from carbohydrate, 29/16 % energy from protein and 40/30 % energy from fat, respectively. Each diet was consumed on 2 consecutive days in a randomised order. Postprandial glycaemia, pancreatic and gut hormones, as well as satiety, were evaluated at breakfast and lunch. Compared with the CD diet, the CRHP diet reduced postprandial AUC of glucose by 14 %, insulin by 22 % and glucose-dependent insulinotropic polypeptide by 17 % (all P<0·001), respectively. Correspondingly, glucagon AUC increased by 33 % (P<0·001), cholecystokinin by 24 % (P=0·004) and satiety scores by 7 % (P=0·035), respectively. A moderate reduction in carbohydrate with an increase in fat and protein in the diet, compared with an energy-matched CD diet, greatly reduced postprandial glucose excursions and resulted in increased satiety in patients with well-controlled T2DM.
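Postprandial AUC comparisons of the kind reported above are conventionally computed with the trapezoidal rule over the sampled glucose curve. A minimal sketch with invented glucose values (the abstract does not give the raw time points or the exact AUC method, so both are assumptions):

```python
def trapezoid_auc(times, values):
    """Area under the curve by the trapezoidal rule."""
    return sum((t1 - t0) * (v0 + v1) / 2
               for (t0, v0), (t1, v1) in zip(zip(times, values),
                                             zip(times[1:], values[1:])))

# Hypothetical glucose curves (mmol/l) at 0-240 min after a meal;
# the values are illustrative, not study data.
times = [0, 30, 60, 90, 120, 180, 240]
cd    = [6.5, 9.8, 11.2, 10.1, 9.0, 7.6, 6.8]  # conventional diabetes diet
crhp  = [6.5, 8.4, 9.3, 8.6, 7.8, 7.0, 6.6]    # carbohydrate-reduced diet

reduction = 1 - trapezoid_auc(times, crhp) / trapezoid_auc(times, cd)
print(f"postprandial glucose AUC reduction: {reduction:.0%}")
```

The same routine applies to the insulin, glucagon, and gut-hormone AUCs reported in the study; only the sampled concentration series changes.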
Across Africa the majority of giraffe species and subspecies are in decline, whereas the South African giraffe Giraffa camelopardalis giraffa remains numerous and widespread throughout southern Africa. By 2013 the number of giraffes in South Africa's Kruger National Park had increased by c. 150% compared to 1979 estimates. An even greater increase occurred on many of the estimated 12,000 privately owned game ranches, indicating that private ownership can help to conserve this subspecies. The estimated total population size in South Africa is 21,053–26,919. The challenge now is to implement monitoring and surveillance of G. camelopardalis giraffa as a conservation priority and to introduce sustainable practices among private owners to increase numbers and genetic variation within in-country subspecies.