The design community can contribute significantly to the success of the United Nations Sustainable Development Goals in Africa. However, how the design research community's work currently aligns with the sustainable development goals in Africa is not well understood. In this paper, we review the relevant literature and identify trends in the research topics studied and in patterns of collaboration between researchers. We find differences in topic representation and collaboration trends between African-based and non-African-based researchers. Understanding these differences better will be important for future research.
Introduction: Medicine demands a sacrifice of physicians' personal life, but the culture has slowly shifted towards valuing a balanced work life. Parental leave is linked to better physical and mental health, but the policies and culture surrounding parental leave are largely unstudied in the Canadian Emergency Medicine landscape. Anecdotally, experiences vary widely. This study was designed to determine what proportion of Canadian Emergency Departments have formal parental leave policies (maternity, paternity, and other, e.g. adoption) and what proportion of Canadian EM physicians are satisfied with their department's parental leave policies. Methods: Two surveys were generated: one to assess the attitudes and experiences of emergency physicians, and a second, for department chiefs, to assess the policies and their features. These were approved by the UBC REB and distributed through the CAEP Research Committee. Primary outcomes were physician satisfaction with their department's parental leave policy (4-5/5 on a Likert scale) and the proportion of departments with a formal parental leave policy (Y/N). Results: 38% (8/21) of department chiefs reported having a formal policy for maternity leave, 29% (6/21) for paternity leave, and 24% (5/21) for other parental leave. The survey of emergency physicians revealed similar rates: 48% (90/187) maternity, 40% (70/184) paternity, and 29% (53/181) other. Among physicians who were aware of them, 69% (62/90) were somewhat or very satisfied with the maternity leave policies, 58% (51/88) with the paternity leave policies, and 48% (39/81) with other parental leave. Less than 10% were somewhat or very dissatisfied with any of these. Several department chiefs commented that they had never refused anyone parental leave but had no formal policy. However, 87% (147/187) of physicians reported that a formal maternity leave policy was somewhat or very important to them; similarly, 80% (134/187) for paternity leave. Less than 15% felt each was somewhat or extremely unimportant.
Conclusion: The presence and type of parental leave policy varies across the country. Most physicians were satisfied with the support available to them, but the vast majority felt that a formal maternity and paternity leave policy was important in itself. These findings suggest that introducing a formal parental leave policy is of value even where it does not change existing practice. Our research group will use these data to collaborate on a template parental leave policy to be made available for this purpose.
Introduction: Overcrowding in the Emergency Department (ED) results in delays in care and increased patient morbidity and mortality. Innovative departmental approaches have the potential to make patient flow through the ED more efficient and reduce overcrowding by improving patient throughput. The Calgary zone ED recently piloted a new physician role, the Emergency Physician Lead (EPL), a senior physician working closely with the charge nurse and consulting services to provide physician leadership, and to troubleshoot flow issues and safety breaches such as EMS offload delays and long emergency inpatient (EIP) stays. The objective of this study was to evaluate the efficacy of the EPL by determining its effect on key metrics of patient flow, and by identifying which specific EPL interventions were most effective at improving patient throughput. Methods: A retrospective cohort design was used to compare Foothills Medical Centre (FMC) ED patients seen by the EPL from March-June 2019 (n = 1343 patients) with a control group from the same period in 2018 (n = 5530). An EMR search was used to collect patient data and generate descriptive statistics, which were compared between groups by Mann-Whitney U-test. Patient handover notes left by the EPL were also collected and analyzed by two independent assessors to develop a list of actions taken by the EPL. Each patient was then coded based on the actions in the handover note, and means for each coded group were compared to control to find correlations between action and changes in key flow metrics. Results: Patients whose care involved the EPL had a 40% shorter average ED length of stay (ELOS) compared to control (515 vs 865 min, p < 0.001). The EPL was especially effective for patients with ELOS above the 90th percentile, with a 58% relative reduction.
EPL patients also had lower average times from first contact with the department to first order placed (79 vs 143 min, p < 0.001), and spent less time as EIPs after being admitted (390 vs 515 min, p < 0.001). EPL actions aimed at early ordering of investigations or early management showed the largest relative reductions in ELOS, followed by actions related to resolving issues with consulting services (56% and 48%, respectively; p < 0.001). Conclusion: The EPL role appears to be associated with improvements in several key metrics of patient flow. Specific EPL actions were correlated with marked decreases in length of stay. The EPL may be an effective strategy to improve patient throughput and combat ED overcrowding.
Evidence from previous small trials has suggested the effectiveness of early social communication interventions for autism.
The Preschool Autism Communication Trial (PACT) investigated the efficacy of such an intervention in the largest psychosocial autism trial to date.
To provide a stringent test of a pre-school communication intervention for autism.
152 children with core autism, aged 2 years to 4 years 11 months, were enrolled in a three-site, two-arm, single-blind (assessor-blinded) randomised controlled trial of the parent-mediated, communication-focused intervention added to treatment as usual (TAU) versus TAU alone. Primary outcome: severity of autism symptoms (modified social communication algorithm from the Autism Diagnostic Observation Schedule-Generic, ADOS-G). Secondary outcomes: blinded measures of parent-child interaction, child language, and adaptation in school.
At the 13-month endpoint the treatment resulted in strong improvement in parental synchronous response to the child (adjusted between-group effect size 1.22, 95% CI 0.85 to 1.59) and in child initiations with the parent (ES 0.41, 95% CI 0.08 to 0.74), but only a small, non-significant effect on autism symptomatology (ADOS-G ES -0.24, 95% CI -0.59 to 0.11). Parents (not blind to allocation) reported strong treatment effects on child language and social adaptation, but effects on blinded, research-assessed language and school adaptation were small.
Addition of the PACT intervention showed clear benefit in improving parent-child dyadic social communication but no substantive benefit over TAU in modifying objectively rated autism symptoms. This attenuation on generalisation from ‘proximal’ intervention effects to wider symptom change in other contexts remains a significant challenge for autism treatment and measurement methodology.
22q11.2 deletion syndrome (22q11.2DS) and Williams syndrome (WS) are common neurogenetic microdeletion syndromes. The aim of the present study was to compare the neuropsychiatric and neurocognitive phenotypes of 22q11.2DS and WS.
Forty-five individuals with 22q11.2DS, 24 with WS, 22 with idiopathic developmental disability (DD) and 22 typically developing (TD) controls were compared for the rates of psychiatric disorders as well as cognitive executive and visuospatial functions.
We found that while anxiety, mood and disruptive disorders had an equally high prevalence among individuals with 22q11.2DS, WS and DDs, the 22q11.2DS group had the highest rates of psychotic disorders and the WS group had the highest rates of specific phobia. We also found that the WS group demonstrated more severe impairments in both executive and visuospatial functions than the other groups. WS and 22q11.2DS subjects had worse Performance-IQ than Verbal-IQ, a feature typical of non-verbal learning disorders.
These findings offer a wide perspective on unique versus common phenotypes in 22q11.2DS and WS.
Alice Dunbar Nelson’s early short stories about New Orleans’s downriver, working-class neighborhoods focus, in particular, on the way the men in this environment can suffer forms of alienation sufficiently extreme to constitute social death. In two of these stories, a literal death comes to highlight the ways the main characters are already, from the standpoint of social relations, dead, and as such highlight the problems faced by the working poor in the distinctive environment of the final years of the nineteenth century in New Orleans. These stories, however, gave Dunbar Nelson a means of escaping this world, as, soon after they were published, she left for New York and became a prominent figure in the Harlem Renaissance.
Acute confusional state (ACS) is very common in the intensive care unit (ICU) setting, and it is often one of the main reasons a neurology consult is requested in the surgical or medical ICU. The term ‘acute confusional state’ is often used interchangeably with metabolic encephalopathy, delirium, ICU psychosis, and septic encephalopathy. Encephalopathy is defined as a subacute global brain dysfunction that is gradual in onset with very broad clinical symptoms, whereas delirium is often described as an acute process. The list of potential causes of ACS (Table 32.1) can be summarized using “Vitamin E” as a mnemonic. In this chapter, we will focus only on the management of delirium and toxic-metabolic encephalopathy.
Antibody-associated disorders of the central nervous system (CNS) are divided into two broad categories: classic paraneoplastic disorders and autoimmune disorders (i.e. autoimmune encephalitis). Autoimmune encephalitis is associated with antibodies that bind to cell-surface determinants of membrane-associated proteins on neuronal cells (neuronal surface antibody syndromes, NSAbs), whereas paraneoplastic syndromes are associated with intracellular neuronal antigens. It can be challenging at times to differentiate between the two syndromes. Patients with NSAbs usually present with acute or subacute symptom onset, a short duration to nadir, and a very good response to immunotherapy. Table 18.1 summarizes some of the characteristics of each. In this chapter, we will focus on the diagnosis and management of autoimmune encephalitis (AE).
Systems engineering and design thinking have been widely seen as distinctly different processes, systems engineering being more data-driven and analytical, and design thinking being more human-centred and creative. We use the term ‘design thinking’ to encompass the plurality of human-centred design processes that seek to unpack the core values behind design decisions. With the increased awareness that both systems engineering and design thinking need each other, the effects of a possibly persisting distinction on engineers’ attitudes toward these two processes are not well understood. In this paper, we describe the development and validation of a scale for measuring individual attitudes about systems engineering and design thinking. Thematic analysis of engineering and design literature is used to derive a Likert scale reflecting these attitudes. We use exploratory and confirmatory factor analysis to test and confirm this two-factor thematic representation, resulting in a 9-item Systems Engineering and Design Thinking Scale measure of attitudes.
Advancements in computer technology have enabled three-dimensional (3D) reconstruction, data-stitching, and manipulation of 3D data obtained on X-ray imaging systems such as micro-computed tomography (μ-CT). Likewise, intuitive evaluation of these 3D datasets can be enhanced by recent advances in virtual reality (VR) hardware and software. Additionally, the generation, viewing, and manipulation of 3D X-ray diffraction datasets, such as pole figures employed for texture analysis, can also benefit from these advanced visualization techniques. We present newly developed protocols for porting 3D data (as TIFF-stacks) into the Unity gaming software platform so that data may be toured, manipulated, and evaluated within a more-intuitive VR environment through the use of game-like controls and 3D headsets. We demonstrate this capability by rendering μ-CT data of a polymer dogbone test bar at various stages of in situ mechanical strain. An additional experiment is presented showing 3D XRD data collected on an aluminum test block with vias. These 3D XRD data for texture analysis (χ, ϕ, 2θ dimensions) enable the viewer to visually inspect 3D pole figures and detect the presence or absence of in-plane residual macrostrain. These two examples serve to illustrate the benefits of this new methodology for multidimensional analysis.
The South Caucasus occupies the divide between ancient Mesopotamia and prehistoric Europe, and was thus crucial in the development of Old World societies. Chronologies for the region, however, have lacked the definition achieved in surrounding areas. Concentrating on the Tsaghkahovit Plain of north-western Armenia, Project ArAGATS's multi-site radiocarbon dataset has now been subjected to Bayesian modelling, which provides tight chronometric support for tracing the transmission of technology, population movement and social developments that shaped the Eurasian Bronze and Iron Ages.
To describe the process by which the 12 community-based primary health care (CBPHC) research teams worked together and fostered cross-jurisdictional collaboration, including collection of common indicators with the goal of using the same measures and data sources.
A pan-Canadian mechanism for common measurement of the impact of primary care innovations across Canada is lacking. The Canadian Institutes for Health Research and its partners funded 12 teams to conduct research and collaborate on development of a set of commonly collected indicators.
A working group representing the 12 teams was established. They undertook an iterative process to consider existing primary care indicators identified from the literature and by stakeholders. Indicators were agreed upon with the intention of addressing three objectives across the 12 teams: (1) describing the impact of improving access to CBPHC; (2) examining the impact of alternative models of chronic disease prevention and management in CBPHC; and (3) describing the structures and context that influence the implementation, delivery, cost, and potential for scale-up of CBPHC innovations.
Nineteen common indicators within the core dimensions of primary care were identified: access, comprehensiveness, coordination, effectiveness, and equity. We also agreed to collect data on health care costs and utilization within each team. Data sources include surveys, health administrative data, interviews, focus groups, and case studies. Collaboration across these teams sets the foundation for a unique opportunity for new knowledge generation, over and above any knowledge developed by any one team. Keys to success are each team’s willingness to engage and commitment to working across teams, funding to support this collaboration, and distributed leadership across the working group. Reaching consensus on collection of common indicators is challenging but achievable.
There is an ongoing debate about whether human rights standards have changed over the last 30 years. The evidence for or against this shift relies upon indicators created by human coders reading the texts of human rights reports. To help resolve this debate, we suggest translating the question of changing standards into a supervised learning problem. From this perspective, the application of consistent standards over time implies a time-constant mapping from the textual features in reports to the human-coded scores. Alternatively, if the meaning of abuses has evolved over time, then the same textual features will be labeled with different numerical scores at distinct times. Of course, while the mapping from natural language to numerical human rights scores is a highly complicated function, we show that these two distinct data generation processes imply divergent overall patterns of accuracy when we train a wide variety of algorithms on older versus newer sets of observations to learn how to automatically label texts with scores. Our results are consistent with the expectation that standards of human rights have changed over time.
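The evaluation design described above can be illustrated with a deliberately tiny sketch. Everything below is invented for illustration: the report texts, the scores, and the 1-nearest-neighbour "model" (the study itself trains a wide variety of algorithms on real report corpora). The idea is the same: train a text-to-score model on one era's reports, then compare its accuracy within that era against its accuracy on a later era. Under time-constant standards the two accuracies should be similar; a gap suggests the mapping from language to scores has drifted.

```python
def tokens(text):
    return set(text.lower().split())

def jaccard(a, b):
    # set-overlap similarity between two token sets
    return len(a & b) / len(a | b)

def predict(train, text):
    # score of the most similar training report (1-nearest neighbour)
    return max(train, key=lambda ex: jaccard(tokens(ex[0]), tokens(text)))[1]

def accuracy(train, test):
    return sum(predict(train, t) == y for t, y in test) / len(test)

# Invented example: in era 2 the same kinds of descriptions are coded
# more harshly than in era 1, i.e. the standard has changed.
era1 = [("widespread torture and arbitrary detention", 2),
        ("isolated reports of police beatings", 3),
        ("generally free press and fair trials", 4)]
era2 = [("widespread torture and arbitrary detention", 1),
        ("isolated reports of police beatings", 2),
        ("generally free press and fair trials", 4)]

within_era = accuracy(era1, era1)   # evaluate on the training era
across_era = accuracy(era1, era2)   # train on era 1, score era 2
```

Here `within_era` is perfect while `across_era` drops to one in three, which is the divergence pattern treated as evidence of changing standards; with real data one would of course evaluate on held-out reports rather than the training set itself.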
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of an RI threshold of ⩾100 days minimised time-to-detection while maximising positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
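The day-by-day scoring used in this kind of evaluation is simple to state: each surveillance day yields a true/false alarm signal, which is compared against whether that day fell within a simulated outbreak. A minimal sketch (not the study's code; the signal and outbreak vectors are hypothetical) of the four metrics:

```python
def daily_performance(signals, outbreak_days):
    """signals / outbreak_days: parallel booleans, one per surveillance day.
    Assumes the data contain at least one alarm day, one quiet day,
    one outbreak day and one non-outbreak day, so no denominator is zero."""
    pairs = list(zip(signals, outbreak_days))
    tp = sum(s and o for s, o in pairs)        # alarm on an outbreak day
    fp = sum(s and not o for s, o in pairs)    # alarm on a quiet day
    fn = sum(o and not s for s, o in pairs)    # missed outbreak day
    tn = sum(not s and not o for s, o in pairs)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn)}
```

Raising the RI threshold suppresses weaker signals, which is exactly the trade-off the abstract reports: fewer alarm days lower sensitivity and negative predictive value while raising specificity and positive predictive value.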
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
In 2016, imported Zika virus (ZIKV) infections and the presence of a potentially competent mosquito vector (Aedes albopictus) implied that ZIKV transmission in New York City (NYC) was possible. The NYC Department of Health and Mental Hygiene developed contingency plans for a urosurvey to rule out ongoing local transmission as quickly as possible if a locally acquired case of confirmed ZIKV infection was suspected. We identified tools to (1) rapidly estimate the population living in any given 150-m radius (i.e. within the typical flight distance of an Aedes mosquito) and (2) calculate the sample size needed to test and rule out further local transmission. As we expected near-zero ZIKV prevalence, methods relying on the normal approximation to the binomial distribution were inappropriate. Instead, we assumed a hypergeometric distribution, 10 missed cases at maximum, a urine assay sensitivity of 92.6% and 100% specificity. Three suspected example risk areas were evaluated with estimated population sizes of 479–4453, corresponding to a minimum of 133–1244 urine samples. This planning exercise improved our capacity for ruling out local transmission of an emerging infection in a dense, urban environment where all residents in a suspected risk area cannot be feasibly sampled.
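Under the stated assumptions (hypergeometric sampling without replacement, at most 10 missed cases, 92.6% assay sensitivity, 100% specificity) the sample-size calculation can be sketched as follows. The 95% confidence target below is our assumption, not stated in the abstract; with it, this sketch reproduces the abstract's lower bound of 133 samples for the smallest risk-area population of 479.

```python
from math import comb

def min_sample_size(N, K=10, sens=0.926, conf=0.95):
    """Smallest n such that, if at least K of the N residents are
    infected, at least one sampled resident tests positive with
    probability >= conf. Sampling is without replacement
    (hypergeometric); the assay misses a true case with probability
    1 - sens and, at 100% specificity, never false-positives."""
    for n in range(1, N + 1):
        # P(no positive) = sum over k infected people in the sample of
        # P(exactly k sampled) * P(the assay misses all k of them)
        p_miss = sum(
            comb(K, k) * comb(N - K, n - k) / comb(N, n) * (1 - sens) ** k
            for k in range(min(K, n) + 1)
        )
        if 1 - p_miss >= conf:
            return n
    return N
```

One design note: with K fixed, the miss probability depends essentially on the sampled fraction n/N, which is consistent with both ends of the abstract's range requiring roughly 28% of residents (133/479 and 1244/4453).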
Lichens are among the common dominant biota in biological soil crusts (biocrusts), a community that is one of the most extensive in the world. Here we present a summary of the main features of the lifestyle of soil crust lichens, emphasizing their habitat, ecophysiology and versatility. The soil crust is exposed to full light, often to high temperatures, and has an additional water source, the soil beneath the lichens. However, despite the open nature of the habitat, the lichens are active under shady and cooler conditions and avoid climate extremes of high temperature and light. In temperate and alpine habitats they can also be active for long periods, several months in some cases. They show a mixture of physiological constancy (e.g. similar activity periods and net photosynthetic rates) but also adaptations to the habitat (e.g. the response of net photosynthesis to thallus water content can differ for the same lichen species in Europe and the USA, and some species show extensive rhizomorph development). Despite recently increased research, aspects of soil crust ecology, for example under snow, remain little understood.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.