Differentiation between post-operative inflammation and bacterial infection remains an important issue in infants following congenital heart surgery. We primarily assessed the kinetics and predictive value of C-reactive protein (CRP) for bacterial infection in the early (days 0–4) and late (days 5–28) periods after cardiopulmonary bypass (CPBP) surgery. Secondary objectives were the frequency, type, and timing of post-operative infection in relation to the risk adjustment for congenital heart surgery score.
This 3-year single-centre retrospective cohort study in a paediatric cardiac ICU analysed 191 infants accounting for 235 episodes of CPBP surgery. The primary outcome was the kinetics of CRP in the first 28 days after CPBP surgery in infected and non-infected patients.
We observed 22 infectious episodes in the early and 34 in the late post-operative period. CRP kinetics in the early post-operative period did not accurately differentiate between infected and non-infected patients. In the late post-operative period, infected infants displayed significantly higher CRP values with a median of 7.91 (1.64–22.02) and 6.92 mg/dl (1.92–19.65) on days 2 and 3 compared to 4.02 (1.99–15.9) and 3.72 mg/dl (1.08–9.72) in the non-infection group. Combining CRP on days 2 and 3 after suspicion of infection revealed a cut-off of 9.47 mg/L with an acceptable predictive accuracy of 76%.
In neonates and infants, CRP kinetics is not useful to predict infection in the first 72 hours after CPBP surgery due to the inflammatory response. However, in the late post-operative period, CRP is a valuable adjunctive diagnostic test in conjunction with clinical presentation and microbiological diagnostics.
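The reported 76% predictive accuracy at a single CRP cut-off can be illustrated with a minimal sketch. All values and labels below are hypothetical and are not the study's data; `accuracy_at_cutoff` is an illustrative helper, not a function from the study.

```python
# Sketch: overall accuracy of a single-threshold classifier on CRP values.
# All data here are hypothetical, for illustration only.

def accuracy_at_cutoff(crp_values, infected, cutoff):
    """Classify CRP >= cutoff as 'infected' and return overall accuracy."""
    correct = sum(
        (v >= cutoff) == label for v, label in zip(crp_values, infected)
    )
    return correct / len(crp_values)

crp = [12.1, 3.4, 9.8, 5.0, 15.2, 2.2, 8.9, 11.0]       # hypothetical CRP values
labels = [True, False, True, False, True, False, True, True]  # hypothetical infection status
print(accuracy_at_cutoff(crp, labels, 9.47))  # → 0.875
```

In practice a cut-off is chosen by scanning candidate thresholds (e.g. via a ROC curve) rather than testing a single value, but the accuracy computation itself reduces to the comparison above.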
We present a δ13Ccarb chemostratigraphy for the Late Ordovician Hirnantian Stage based on 208 whole-rock samples from six outcrops in the Oslo–Asker district, southern Norway. Our data include the Norwegian type section for the Hirnantian Stage and the Ordovician–Silurian boundary at Hovedøya Island. The most complete record of the Hirnantian carbon isotope excursion (HICE) is identified in a coastal exposure at the Konglungø locality, where the preserved part of the anomaly spans a c. 24 m thick, mixed carbonate–siliciclastic succession belonging to the upper Husbergøya, Langåra and Langøyene formations, and where δ13Ccarb peak values reach c. +6 ‰. Almost the entire HICE occurs above beds containing the Hirnantia Fauna, suggesting a latest Hirnantian age for the peak of the excursion. The temporal development of the HICE in southern Norway is associated with substantial shallowing of depositional environments. Sedimentary facies and erosional unconformities suggest four inferred fourth-order, glacio-eustatically controlled sea-level lowstands that subjected the succession to successively greater exposure and erosion. The youngest erosional unconformity is related to the development of incised valleys and cut out at least the falling limb of the HICE throughout most of the Oslo–Asker district. The fill of the valleys contains the falling limb of the HICE, and the postglacial transgression can therefore be assigned to the latest part of the Hirnantian Age. We address the recent findings of the chitinozoan Belonechitina gamachiana in the study area and its relationship to the first occurrence of the Hirnantia Fauna in the studied sections, which challenges identification of the base of the Hirnantian Stage.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
The Shahrizor Prehistory Project has targeted prehistoric levels of the Late Ubaid and Late Chalcolithic 4 (LC4; Late Middle Uruk) periods at Gurga Chiya (Shahrizor, Kurdistan region of northern Iraq), along with the Halaf period at the adjacent site of Tepe Marani. Excavations at the latter have produced new dietary and environmental data for the sixth millennium B.C. in the region, while at Gurga Chiya part of a burned Late Ubaid tripartite house was excavated. This has yielded a promising archaeobotanical assemblage and established a benchmark ceramic assemblage for the Shahrizor Plain, which is closely comparable to material known from Tell Madhhur in the Hamrin valley. The related series of radiocarbon dates gives significant new insights into the divergent timing of the Late Ubaid and early LC in northern and southern Mesopotamia. In the following occupation horizon, a ceramic assemblage closely aligned to the southern Middle Uruk indicates convergence of material culture with central and southern Iraq as early as the LC4 period. Combined with data for the appearance of Early Uruk elements at sites in the adjacent Qara Dagh region, this hints at long-term co-development of material culture during the fourth millennium B.C. in southeastern Iraqi Kurdistan and central and southern Iraq, potentially questioning the model of expansion or colonialism from the south.
Domestic dogs can function as either paratenic or definitive hosts for the zoonotic raccoon roundworm Baylisascaris procyonis. However, the factors leading to the development of patent infections in dogs are under-studied. Here we compared the infection dynamics of B. procyonis in dogs with those in the natural raccoon host. Dogs and raccoons were inoculated with 5000 or 500 B. procyonis eggs (n = 3 per dose) or were fed B. procyonis-infected laboratory mice (n = 3 per dose; mice inoculated with 1000 or 250 eggs). Fecal samples were analysed via flotation and a commercial coproantigen ELISA designed for detection of Toxocara spp. Two of 12 dogs (both of which received the low larval dose) developed patent infections; all 12 raccoons became infected, with 10 developing patent infections. Compared with dogs, prepatent periods were shorter in raccoons and maximum egg outputs were much greater. Baylisascaris procyonis coproantigens were detectable via ELISA in all raccoons and in the patently infected dogs. Finally, dogs spontaneously lost their infections, whereas all patently infected raccoons shed eggs until the conclusion of the study. Our results demonstrate that dogs are clearly suboptimal hosts, showing limited parasite establishment and fecundity compared with raccoons. Despite this low competence, patently infected dogs still pose a risk for human exposure, emphasizing the importance of control measures.
Challenges related to product structures, which accompany megatrends such as individualization, can be met by modularizing those structures. Various modularization methods create modular product structures oriented toward different goals. The literature contains many references to the effects of modular product structures on life phases and economic targets. In previous research, these effects were collected in a generic impact model. Because such models capture a great deal of information about the effects, they become very comprehensive and therefore difficult to handle. For this reason, the impact model is generated consistently using SysML. Adaptation to company scenarios is possible through simulations in which, for example, company-related and product-related boundary conditions can be controlled by means of a user interface.
Antimicrobial stewardship programs (ASPs) are effective in developed countries. In this study, we assessed the effectiveness of an infectious disease (ID) physician–driven post-prescription review and feedback as an ASP strategy in India, a lower-middle-income country (LMIC).
Design and setting:
This prospective cohort study was carried out over 18 months in 2 intensive care units of a tertiary-care hospital and consisted of 3 phases: baseline, intervention, and follow-up. Each phase spanned 6 months.
Patients aged ≥15 years receiving study antibiotics for 48 hours were recruited for the study.
During the intervention phase, an ID physician reviewed the included cases and gave alternate recommendations if the antibiotic use was inappropriate. Acceptance of the recommendations was measured after 48 hours. The primary outcome of the study was days of therapy (DOT) per 1,000 study patient days (PD).
Overall, 401 patients were recruited in the baseline phase, 381 patients were recruited in the intervention phase, and 379 patients were recruited in the follow-up phase. Antimicrobial use decreased from 831.5 during the baseline phase to 717 DOT per 1,000 PD in the intervention phase (P < .0001). The effect was sustained in the follow-up phase (713.6 DOT per 1,000 PD). De-escalation according to culture susceptibility improved significantly in the intervention phase versus the baseline phase (42.7% vs 23.6%; P < .0001). Overall, 73.3% of antibiotic prescriptions were inappropriate. Recommendations by the ID team were accepted in 60.7% of the cases.
The ID physician–driven implementation of an ASP was successful in reducing antibiotic utilization in an acute-care setting in India.
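The primary outcome metric, days of therapy (DOT) per 1,000 patient-days (PD), is a simple normalized rate. The sketch below shows the arithmetic; the numbers passed in are hypothetical and chosen only to reproduce a rate of the same magnitude as the baseline figure, not the study's underlying counts.

```python
# Sketch: days of therapy (DOT) per 1,000 patient-days (PD).
# Each antibiotic a patient receives on a given day counts as one DOT;
# the rate normalizes total DOT by total patient-days. Inputs are hypothetical.

def dot_per_1000_pd(days_of_therapy: int, patient_days: int) -> float:
    """Return the antibiotic-use rate as DOT per 1,000 patient-days."""
    return days_of_therapy / patient_days * 1000

# Hypothetical counts yielding a rate near the reported baseline of 831.5:
rate = dot_per_1000_pd(days_of_therapy=4990, patient_days=6001)
print(round(rate, 1))  # → 831.5
```

Because the denominator is patient-days rather than admissions, the metric remains comparable across phases with different lengths of stay, which is why stewardship studies favour it.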
“Theory of Mind” (ToM) is the capacity to deduce other persons’ cognitive and emotional states. Studies investigating affective ToM in healthy older adults and in persons with Alzheimer’s disease have reported contradictory results, although evidence indicates that advanced age (Ruffman, Henry, Livingstone, & Phillips, 2008) and Alzheimer’s disease (Elferink, van Tilborg, & Kessels, 2015) do not affect the ability to identify or infer different emotions to the same extent. To evaluate affective ToM abilities in these populations, we asked 63 individuals (17 with Alzheimer’s disease) to infer the emotional states of characters presented without facial details in emotional situations. We observed similar results in healthy younger and older adults, but poorer performance in persons with Alzheimer’s disease for disgust, sadness, and surprise, but not for anger, fear, and joy. Results suggest that persons with Alzheimer’s disease have difficulties in inferring several emotional states from contextual information without facial cues.
Hospitals are meant to be places for respite and healing; however, technological advances and reliance on monitoring alarms have led to the environment becoming increasingly noisy. The coronary care unit (CCU), like the emergency department, provides care to ill patients while being vulnerable to noise pollution. The World Health Organization (WHO; Geneva, Switzerland) recommends that for optimum rest and healing, sound levels should average approximately 30 decibels (dB) with maximum readings less than 40 dB.
The purpose of this study was to measure and analyze sound levels in three different locations in the CCU, and to review alarm reports in relation to sound levels.
Over a one-month period, sound recorders (Extech SDL600; Extech Instruments; Nashua, New Hampshire USA) were placed in three separate locations in the CCU at the West Roxbury Veterans’ Administration (VA) Hospital (Roxbury, Massachusetts USA). Sound samples were recorded once per second, stored in Comma Separated Values format for Excel (Microsoft Corporation; Redmond, Washington USA), and then exported to Microsoft Excel. Averages were determined, plotted per hour, and alarm histories were reviewed to determine alarm noise effect on total noise for each location, as well as common alarm occurrences.
Patient Room 1 consistently had the lowest average recordings, though all averages were >40 dB, despite decreases between 10:00 pm and 7:00 am. During daytime hours, recordings maintained levels >50 dB. Overnight noise remained above recommended levels for 55.25% of the period in Patient Room 1 and 99.61% of the same period in Patient Room 7. The nurses’ station remained the loudest of the three locations. Alarms per hour ranged from 20 to 26 during the day. Alarms per day averaged 57.17 in Patient Room 1, 122.03 in Patient Room 7, and 562.26 at the nurses’ station. Oxygen saturation alarms accounted for 33.59% of activity, and heart-related alarms (including ST-segment and pacemaker alarms) accounted for 49.24%.
The CCU cares for ill patients requiring constant monitoring. Despite advances in technology, measured noise levels for the hospital studied exceeded WHO standards of 40 dB and peaks of 45 dB, even during night hours when patients require rest. Further work is required to reduce noise levels and examine effects on patient satisfaction, clinical outcomes, and length of stay.
Ryan KM, Gagnon M, Hanna T, Mello B, Fofana M, Ciottone G, Molloy M. Noise Pollution: Do We Need a Solution? An Analysis of Noise in a Cardiac Care Unit. Prehosp Disaster Med. 2016;31(4):432–435.
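The overnight figures above (e.g. 55.25% of the period above recommended levels) reduce to counting one-per-second samples that exceed a decibel threshold. The sketch below shows that computation; the readings in `night` are hypothetical, not data from the study.

```python
# Sketch: percentage of one-per-second sound samples exceeding a dB threshold,
# as in the overnight analysis (e.g. the 40 dB WHO maximum). Hypothetical data.

def percent_above(samples_db, threshold_db):
    """Return the percentage of samples strictly above the threshold."""
    over = sum(1 for s in samples_db if s > threshold_db)
    return 100.0 * over / len(samples_db)

night = [38.2, 41.5, 44.0, 39.9, 42.3, 45.1, 37.8, 43.6]  # hypothetical dB readings
print(percent_above(night, 40.0))  # → 62.5
```

With real recorder output the same function would be applied to the full CSV column of samples for the chosen time window (e.g. 10:00 pm to 7:00 am).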
Teachers want their lessons to be enjoyable, immersive, productive and full of learning. In this regard, digital games have everything they want. Successful digital games maintain players’ attention, require them to solve problems, acquire new knowledge and learn new skills. Moreover, despite the considerable amount of learning, emotional investment (including frustration) and often monotonous labour (for example, working back through levels each time you ‘die’), players will not only persist but also call it ‘fun’. It is not surprising then that the idea of incorporating digital games into the classroom has taken hold of teachers for decades. More recently, educators have realised that they can also learn from the success of digital games and use game principles to ‘gamify’ learning activities. However, digital games are not a ‘magic bullet’ for education. Giving students a digital game does not ensure that they will be learning in the classroom; the teacher will still have to resolve behaviour management and motivation issues. This chapter aims to explain how digital games and gamification can be used in education, while also pointing out some related concerns.
Digital games and gamification
It is important not to confuse digital games and gamification; they are not synonyms. Gamification is the use of game design (mechanics and dynamics) in what are typically considered non-game environments, such as the classroom. Some of the elements we might use when applying gamification to curriculum activities are levels, badges, points, competition and status. There are many more but, importantly, gamification is more than simply changing the age-old ‘gold star’ reward in a classroom to a ‘badge’ or changing the name of a lesson to ‘level 1’. Time and, ultimately, iterative design need to be invested in the mechanics (for example, levels) and the dynamics (for example, when those levels are unlocked). In addition, deeper considerations of game play need to be imbued into the instructional design, including notions of ‘permission to fail’ – in games, students ‘die’ all the time. Students need to have choices and strategies for success – in games, they have immediate feedback on their success or failure and can hypothesise about how to succeed the next time. Another consideration is how to encourage curiosity, imagination and a state of flow (a state of full immersion in a feeling of energised focus).
The aim of the present study was to examine the associations between the maternal intake of fatty acids during pregnancy and the risk of preclinical and clinical type 1 diabetes in the offspring. The study included 4887 children with human leucocyte antigen (HLA)-conferred type 1 diabetes susceptibility born during the years 1997–2004 from the Finnish Type 1 Diabetes Prediction and Prevention Study. Maternal diet was assessed with a validated FFQ. The offspring were observed at 3- to 12-month intervals for the appearance of type 1 diabetes-associated autoantibodies and development of clinical type 1 diabetes (average follow-up period: 4·6 years (range 0·5–11·5 years)). Altogether, 240 children developed preclinical type 1 diabetes and 112 children developed clinical type 1 diabetes. A piecewise linear log-hazard survival model and Cox proportional-hazards regression were used for the statistical analyses. The maternal intake of palmitic acid (hazard ratio (HR) 0·82, 95 % CI 0·67, 0·99) and high consumption of cheese during pregnancy (highest quarter v. intermediate half HR 0·52, 95 % CI 0·31, 0·87) were associated with a decreased risk of clinical type 1 diabetes. The consumption of sour milk products (HR 1·14, 95 % CI 1·02, 1·28), intake of protein from sour milk (HR 1·15, 95 % CI 1·02, 1·29) and intake of fat from fresh milk (HR 1·43, 95 % CI 1·04, 1·96) were associated with an increased risk of preclinical type 1 diabetes, and the intake of low-fat margarines (HR 0·67, 95 % CI 0·49, 0·92) was associated with a decreased risk. No conclusive associations between maternal fatty acid intake or food consumption during pregnancy and the development of type 1 diabetes in the offspring were detected.
To examine the use of vitamin D supplements during infancy among the participants in an international infant feeding trial.
Information about vitamin D supplementation was collected through a validated FFQ at the age of 2 weeks and monthly between the ages of 1 month and 6 months.
Infants (n 2159) with a biological family member affected by type 1 diabetes and with increased human leucocyte antigen-conferred susceptibility to type 1 diabetes from twelve European countries, the USA, Canada and Australia.
Daily use of vitamin D supplements was common during the first 6 months of life in Northern and Central Europe (>80 % of the infants), with somewhat lower rates observed in Southern Europe (>60 %). In Canada, vitamin D supplementation was more common among exclusively breast-fed infants than among other infants (e.g. 71 % v. 44 % at 6 months of age). Less than 2 % of infants in the USA and Australia received any vitamin D supplementation. Higher gestational age, older maternal age and longer maternal education were associated study-wide with greater use of vitamin D supplements.
Most infants in the European countries received vitamin D supplements during the first 6 months of life, whereas in Canada only half did, and in the USA and Australia very few were given supplementation.
The current classification of epileptic seizures, epilepsies, and epilepsy syndromes is considered first. The presence of progressive neurological signs is a cause for concern and suggests a degenerative disorder. Investigations may include biochemical investigation, EEG, video telemetry, cranial imaging, and DNA diagnostics. Affected males with fragile-X have an increased frequency of epilepsy; estimates of its prevalence vary from 28% to 45%. Seizures may be generalized tonic-clonic, partial with or without secondary generalization, or of multiple types. Advances in human molecular genetic techniques have allowed positional cloning strategies to be applied to the identification of the defective genes and their protein products. A number of studies have been performed on the incidence of epilepsy in the offspring of epileptic parents, and these provide an empiric risk of 1.7%–7.3%, with a median of 4.2% for all types of seizures, including febrile convulsions and single seizures.