Recent research has shown the potential of speleothem δ13C to record a range of environmental processes. Here, we report on 230Th-dated stalagmite δ13C records for southwest Sulawesi, Indonesia, over the last 40,000 yr to investigate the relationship between tropical vegetation productivity and atmospheric methane concentrations. We demonstrate that the Sulawesi stalagmite δ13C record is driven by changes in vegetation productivity and soil respiration and explore the link between soil respiration and tropical methane emissions using HadCM3 and the Sheffield Dynamic Global Vegetation Model. The model indicates that changes in soil respiration are primarily driven by changes in temperature and CO2, in line with our interpretation of stalagmite δ13C. In turn, modelled methane emissions are driven by soil respiration, providing a mechanism that links methane to stalagmite δ13C. This relationship is particularly strong during the last glaciation, indicating a key role for the tropics in controlling atmospheric methane when emissions from high-latitude boreal wetlands were suppressed. With further investigation, the link between δ13C in stalagmites and tropical methane could provide a low-latitude proxy complementary to polar ice core records to improve our understanding of the glacial–interglacial methane budget.
Pooling of samples for virus detection is an effective and efficient strategy for screening carriers in a large population with a low infection rate, reducing both cost and time. A number of pooling test methods exist, ranging from simple to complicated. In such pooling tests, the most important parameter to decide is the pool (group) size, which can be optimised mathematically. Two relatively simple pooling methods are considered here; the minimum numbers of tests required for a population with a known infection rate are derived and compared. The results are useful for identifying asymptomatic carriers quickly and for implementing health code systems.
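As one concrete instance of the optimisation described here (an illustration only; the abstract does not specify which two methods the paper compares), the classic Dorfman two-stage scheme tests each pool of size n once and retests individuals only when the pool is positive, giving an expected 1/n + 1 − (1 − p)^n tests per person:

```python
def expected_tests_per_person(n, p):
    """Dorfman two-stage pooling: one test per pool of size n, plus n
    individual retests whenever the pool is positive, which happens
    with probability 1 - (1 - p)**n for infection rate p."""
    return 1.0 / n + 1.0 - (1.0 - p) ** n

def optimal_pool_size(p, n_max=100):
    """Pool size minimising the expected number of tests per person."""
    return min(range(2, n_max + 1),
               key=lambda n: expected_tests_per_person(n, p))
```

For an infection rate of 1%, this gives an optimal pool size of 11 at roughly 0.196 tests per person, about a five-fold saving over individual testing.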
Phytase has long been used to decrease the inorganic phosphorus (Pi) input in poultry diets. The current study investigated the effects of Pi supplementation on laying performance, egg quality and phosphate–calcium metabolism in Hy-Line Brown laying hens fed phytase. Layers (n = 504, 29 weeks old) were randomly assigned to seven treatments with six replicates of 12 birds. The corn–soybean meal-based diet contained 0.12% non-phytate phosphorus (nPP), 3.8% calcium, 2415 IU/kg vitamin D3 and 2000 FTU/kg phytase. Inorganic phosphorus (in the form of mono-dicalcium phosphate) was added to the basal diet to construct seven experimental diets; the final dietary nPP levels were 0.12%, 0.17%, 0.22%, 0.27%, 0.32%, 0.37% and 0.42%. The feeding trial lasted 12 weeks (hens from 29 to 40 weeks of age). Laying performance (housed laying rate, egg weight, egg mass, daily feed intake and feed conversion ratio) was calculated weekly. Egg quality (egg shape index, shell strength, shell thickness, albumen height, yolk colour and Haugh units), serum parameters (calcium, phosphorus, parathyroid hormone, calcitonin and 1,25-dihydroxyvitamin D), tibia quality (breaking strength, and calcium, phosphorus and ash contents), intestinal gene expression (type IIb sodium-dependent phosphate cotransporter, NaPi-IIb) and phosphorus excretion were determined at the end of the trial. No differences were observed in laying performance, egg quality, serum parameters or tibia quality. Hens fed 0.17% nPP had increased (P < 0.01) duodenal NaPi-IIb expression compared to all other treatments. Phosphorus excretion increased linearly with dietary nPP (phosphorus excretion = 1.7916 × nPP + 0.2157; R2 = 0.9609, P = 0.001). In conclusion, a corn–soybean meal-based diet containing 0.12% nPP, 3.8% calcium, 2415 IU/kg vitamin D3 and 2000 FTU/kg phytase would meet the requirements for egg production in Hy-Line Brown laying hens (29 to 40 weeks of age).
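The reported linear fit can be applied directly to predict excretion at other dietary levels (a sketch using the abstract's coefficients; units are as reported in the abstract, with nPP expressed in %):

```python
def predicted_p_excretion(npp_percent):
    """Linear regression reported in the abstract (R2 = 0.9609, P = 0.001):
    phosphorus excretion = 1.7916 * nPP + 0.2157."""
    return 1.7916 * npp_percent + 0.2157
```

At the lowest and highest diets, this predicts excretion of about 0.43 and 0.97 at 0.12% and 0.42% nPP respectively, illustrating the roughly linear cost of supplemental Pi.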
Introduction: Several recent observational studies have presented concerning data regarding the safety of cardioversion (CV) for acute atrial fibrillation and flutter (AAFF). We conducted this systematic review to determine whether it is safe to cardiovert AAFF patients without prescribing oral anticoagulation (OAC) post-CV for those who are CHADS-65 negative. Methods: We conducted a librarian-assisted search of MEDLINE, Embase, and Cochrane from inception through November 23, 2019. We included observational studies and randomized trials reporting thromboembolic (TE) events (i.e. stroke, transient ischemic attack, or systemic thromboembolism) within 30 days following CV in patients with AAFF, where onset of symptoms was <48 hours. Two reviewers independently screened studies and extracted data. The main outcome was risk of TE events within 30 days post-CV, stratified by OAC use. Risk of bias was assessed with the Quality in Prognostic Studies (QUIPS) tool. The primary analysis was based on prospective studies and the secondary analysis on retrospective studies. We performed meta-analyses for TE events where 2 or more studies were available, applying the DerSimonian-Laird random-effects model. Analyses stratified by study design were implemented in Open MetaAnalyst, which generated the forest plots. Results: Our search yielded 969 titles; 74 were selected for full-text review and 20 studies were included in the review. The primary meta-analysis of 6 prospective studies, including two randomized trials, found a TE event rate of 0.15% (2 TE events/1,314 CVs). Within this prospective group, OAC use was not associated with a reduced risk of TE events (RR = 2.15, where RR >1 indicates increased risk of TE events with OAC compared to no OAC; 95% CI 0.50 to 9.31; I2 = 0%). Five of the 6 prospective studies had a low or moderate risk of bias in all QUIPS domains.
Secondary meta-analysis of 6 retrospective studies revealed a TE event rate of 0.53% (56 TE events/10,521 CVs). In this subgroup, OAC use was associated with a decreased risk of TE events (RR = 0.34, where RR <1 indicates decreased risk of TE events with OAC; 95% CI 0.17 to 0.72; I2 = 0%). Conclusion: In the primary analysis of prospective studies, we found a low TE event rate following CV of AAFF, irrespective of OAC use. This contradicts previous analyses of retrospective studies. Our study supports the longstanding practice of not necessarily prescribing OAC post-CV in the ED for AAFF patients who are CHADS-65 negative.
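The DerSimonian-Laird random-effects pooling used here has a simple closed form. As an illustrative sketch (not the authors' Open MetaAnalyst computation), pooling study-level log relative risks with their variances might look like:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effects (e.g. log relative risks) with the
    DerSimonian-Laird random-effects model; returns the pooled
    estimate, a 95% CI, and the between-study variance tau^2."""
    w = [1.0 / v for v in variances]                  # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0   # DL moment estimator
    w_star = [1.0 / (v + tau2) for v in variances]    # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2
```

Exponentiating the pooled log RR and its interval recovers the RR scale reported in the abstract.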
Background: Increasing Emergency Department (ED) stretcher occupancy with admitted patients at our tertiary care hospital has contributed to long Physician Initial Assessment (PIA) times. As of Oct 2019, median PIA was 2.3 hours and 90th percentile PIA was 5.3 hours, with a consequent 71/74 PIA ranking compared to all Ontario EDs. Ambulatory zone (AZ) models are more commonly used in community EDs compared to tertiary level EDs. An interdisciplinary team trialled an AZ model for five days in our ED to improve PIA times. Aim Statement: We sought to decrease the median PIA for patients in our ED during the AZ trial period as compared to days with similar occupancy and volume. Measures & Design: The AZ was reserved for patients who could walk from a chair to stretcher. In this zone, ED rooms with stretchers were for patient assessment only; when waiting for results or receiving treatment, patients were moved into chairs. We removed nursing assignment ratios to increase patient flow. Our outcome measure was the median PIA for all patients in our ED. Our balancing measure was the 90th percentile PIA, which could increase if we negatively impacted patients who require stretchers. The median and 90th percentile PIA during the AZ trial were compared to similar occupancy and volume days without the AZ. Additional measures included ED Length of Stay (LOS) for non-admitted patients, and patients who leave without being seen (LWBS). Clinicians and patients provided qualitative feedback through surveys. Evaluation/Results: The median PIA during the AZ trial was 1.5 hours, compared to 2.1 hours during control days. Our balancing measure, the 90th percentile PIA, was 3.7 hours, compared to 5.0 hours during control days. A run chart revealed both median and 90th percentile PIA during the trial were at their lowest points over the past 18 months. The number of LWBS patients decreased during the trial; EDLOS did not change.
The majority of patients, nurses, and physicians felt the trial could be implemented permanently. Discussion/Impact: Although our highly specialized tertiary care hospital faces unique challenges and high occupancy pressures, a community-hospital style AZ model was successful in improving PIA. Shorter PIA times can improve other quality metrics, such as timeliness of analgesia and antibiotics. We are working to optimize the model based on feedback before we cycle another trial. Our findings suggest that other tertiary care EDs should consider similar AZ models.
To describe the infection control preparedness measures undertaken in Hong Kong for coronavirus disease (COVID-19), caused by SARS-CoV-2 (previously known as the 2019 novel coronavirus), during the first 42 days after the announcement of a cluster of pneumonia cases in China on December 31, 2019 (day 1).
Methods:
A bundled approach of active and enhanced laboratory surveillance, early airborne infection isolation, rapid molecular diagnostic testing, and contact tracing for healthcare workers (HCWs) with unprotected exposure in the hospitals was implemented. Epidemiological characteristics of confirmed cases, environmental samples, and air samples were collected and analyzed.
Results:
From day 1 to day 42, 42 of 1,275 patients (3.3%) fulfilling the criteria for active (n = 29) and enhanced laboratory surveillance (n = 13) were confirmed to have SARS-CoV-2 infection. The number of locally acquired cases increased significantly, from 1 of 13 confirmed cases (7.7%, day 22 to day 32) to 27 of 29 confirmed cases (93.1%, day 33 to day 42; P < .001). Among them, 28 patients (66.6%) came from 8 family clusters. Of 413 HCWs caring for these confirmed cases, 11 (2.7%) had unprotected exposure requiring quarantine for 14 days. None of them was infected, and nosocomial transmission of SARS-CoV-2 was not observed. Environmental surveillance was performed in the room of a patient with viral loads of 3.3 × 10⁶ copies/mL (pooled nasopharyngeal and throat swabs) and 5.9 × 10⁶ copies/mL (saliva). SARS-CoV-2 was identified in 1 of 13 environmental samples (7.7%) but not in 8 air samples collected at a distance of 10 cm from the patient’s chin, with or without a surgical mask.
Conclusion:
Appropriate hospital infection control measures were able to prevent nosocomial transmission of SARS-CoV-2.
Introduction: Elevated intracranial pressure (ICP) is a devastating complication of brain injury, such as traumatic brain injury, subarachnoid hemorrhage, intracerebral hemorrhage, ischemic stroke, and other conditions. Delays in diagnosis and treatment are associated with increased morbidity and mortality. For Emergency Department (ED) physicians, invasive ICP measurement is typically not available. We sought to summarize and compare the accuracy of physical examination, imaging, and ultrasonography of the optic nerve sheath diameter (ONSD) for diagnosis of elevated ICP. Methods: We searched Medline, EMBASE and 4 other databases from inception through August 2018. We included only English-language studies (randomized controlled trials, cohort and case-control studies). The gold standard was ICP ≥20 mmHg on invasive ICP monitoring. Two reviewers independently screened studies and extracted data. We assessed risk of bias using the Quality Assessment of Diagnostic Accuracy Studies 2 criteria. A Hierarchical Summary Receiver Operating Characteristic model generated summary diagnostic accuracy estimates. Results: We included 37 studies (n = 4,768, kappa = 0.96). Of exam signs, pooled sensitivity and specificity for increased ICP were: mydriasis (28.2% [95% CI: 16.0-44.8], 85.9% [95% CI: 74.9-92.5]), motor posturing (54.3% [95% CI: 36.6-71.0], 63.6% [95% CI: 46.5-77.8]) and Glasgow Coma Scale (GCS) ≤8 (75.8% [95% CI: 62.4-85.5], 39.9% [95% CI: 26.9-54.5]). Computed tomography findings: compression of basal cisterns had 85.9% [95% CI: 58.0-96.4] sensitivity and 61.0% [95% CI: 29.1-85.6] specificity; any midline shift had 80.9% [95% CI: 64.3-90.9] sensitivity and 42.7% [95% CI: 24.0-63.7] specificity; midline shift ≥1 cm had 20.7% [95% CI: 13.0-31.3] sensitivity and 89.2% [95% CI: 77.5-95.2] specificity. Finally, the pooled area under the ROC curve describing the accuracy of ONSD sonography for ICP was 0.94 (95% CI: 0.91-0.96). Conclusion: The absence of any one physical exam feature (e.g.
mydriasis, posturing, or decreased GCS) is not sufficient to rule-out elevated ICP. Significant midline shift is highly suggestive of elevated ICP, but absence of shift does not rule it out. ONSD sonography may be useful in diagnosing elevated ICP. High suspicion of elevated ICP may necessitate treatment and transfer to a centre capable of invasive ICP monitoring.
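Pooled estimates like those above originate from per-study 2×2 tables. A minimal sketch (hypothetical counts, simple Wald intervals rather than the hierarchical model used in the review):

```python
import math

def sens_spec(tp, fn, tn, fp, z=1.96):
    """Sensitivity, specificity and Wald 95% CIs from a 2x2 table of a
    diagnostic sign against the reference standard (here, invasive ICP).
    tp/fn/tn/fp are the usual true/false positive/negative counts."""
    def prop_ci(k, n):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, (max(0.0, p - half), min(1.0, p + half))
    return {"sensitivity": prop_ci(tp, tp + fn),
            "specificity": prop_ci(tn, tn + fp)}
```

A sign with high specificity but low sensitivity (like mydriasis here) helps rule in elevated ICP when present but cannot rule it out when absent.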
The response of soil microbial communities to soil quality changes is a sensitive indicator of soil ecosystem health. The current work investigated soil microbial communities under different fertilization treatments in a 31-year experiment using the phospholipid fatty acid (PLFA) profile method. The experiment consisted of five fertilization treatments: without fertilizer input (CK), chemical fertilizer alone (MF), rice (Oryza sativa L.) straw residue and chemical fertilizer (RF), low manure rate and chemical fertilizer (LOM), and high manure rate and chemical fertilizer (HOM). Soil samples were collected from the plough layer, and the results indicated that the contents of PLFAs increased in all fertilization treatments compared with the control. The iC15:0 fatty acids increased significantly in the MF treatment but decreased in RF, LOM and HOM, while aC15:0 fatty acids increased in these three treatments. Principal component (PC) analysis was conducted to determine factors defining soil microbial community structure using the 21 PLFAs detected in all treatments: the first and second PCs explained 89.8% of the total variance. All unsaturated and cyclopropyl PLFAs except C12:0 and C15:0 were highly weighted on the first PC. The first and second PC also explained 87.1% of the total variance among all fertilization treatments. There was no difference in the first and second PC between the RF and HOM treatments. The results indicated that long-term combined application of straw residue or organic manure with chemical fertilizer improved soil microbial community structure more than the mineral fertilizer treatment in double-cropped paddy fields in Southern China.
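The variance-explained figures above come from standard principal component analysis. A minimal sketch (illustrative only, not the study's software) of computing the variance explained per PC from a treatments-by-PLFA data matrix via SVD:

```python
import numpy as np

def pca_variance_explained(x):
    """Fraction of total variance captured by each principal component,
    for a samples-by-variables matrix (e.g. treatments x PLFAs).
    Columns are mean-centred; singular values give component variances."""
    centred = x - x.mean(axis=0)
    _, s, _ = np.linalg.svd(centred, full_matrices=False)
    var = s ** 2
    return var / var.sum()
```

Summing the first two entries of the result corresponds to the "first and second PCs explained 89.8% of the total variance" style of statement in the abstract.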
Quantitative gas chromatographic analyses supplemented by X-ray diffraction studies of the adsorption of ethanol and acetone (as model polar organic compounds) on homoionic montmorillonite revealed marked variation in the number of molecules associated with each exchange cation. The results show increasing association in the order K+ < Na+ < Ba2+ < Ca2+. K+ and Na+ associate with two and three molecules, respectively, of either ethanol or acetone, and the resulting complexes expand to form a monolayer (∼13 Å). Ba2+ and Ca2+ form both monolayer and double-layer complexes. In the single-layer complexes, Ba2+ associates with four molecules of either ethanol or acetone, while Ca2+ associates with five molecules of ethanol or four molecules of acetone. In the double-layer complexes the observed cation-molecule ratios are 1:8 for both Ba2+-ethanol and Ba2+-acetone, 1:10 for Ca2+-ethanol, and 1:8 for Ca2+-acetone.
The striking dependence of ethanol and acetone adsorption on the nature of the exchangeable cation suggests that cation-dipole interactions play an important role in the adsorption process. Structural models of the organic complexes are presented.
Recent studies indicate that the early postnatal period is a critical window for gut microbiota manipulation to optimise immunity and body growth. This study investigated the effects of maternal faecal microbiota orally administered to neonatal piglets after birth on growth performance, selected microbial populations, intestinal permeability and the development of the intestinal mucosal immune system. In total, 12 litters of crossbred newborn piglets were selected in this study. Litter size was standardised to 10 piglets. On day 1, 10 piglets in each litter were randomly allotted to the faecal microbiota transplantation (FMT) and control groups. Piglets in the FMT group were orally administered 2 ml of faecal suspension of their nursing sow per day from the age of 1 to 3 days; piglets in the control group were treated with the same dose of a placebo (0.1 M potassium phosphate buffer containing 10% glycerol (vol/vol)) inoculant. The experiment lasted 21 days. On days 7, 14 and 21, plasma and faecal samples were collected for the analysis of growth-related hormones and cytokines in plasma and lipocalin-2, secretory immunoglobulin A (sIgA), selected microbiota and short-chain fatty acids (SCFAs) in faeces. Faecal microbiota transplantation increased the average daily gain of piglets during week 3 and the whole experiment period. Compared with the control group, the FMT group had increased concentrations of plasma growth hormone and IGF-1 on days 14 and 21. Faecal microbiota transplantation also reduced the incidence of diarrhoea during weeks 1 and 3 and plasma concentrations of zonulin, endotoxin and diamine oxidase activities in piglets on days 7 and 14. The populations of Lactobacillus spp. and Faecalibacterium prausnitzii and the concentrations of faecal and plasma acetate, butyrate and total SCFAs in the FMT group were higher than those in the control group on day 21.
Moreover, the FMT piglets had higher concentrations of plasma transforming growth factor-β and immunoglobulin G, and faecal sIgA, than the control piglets on day 21. These findings indicate that early intervention with maternal faecal microbiota improves growth performance, decreases intestinal permeability, stimulates sIgA secretion, and modulates gut microbiota composition and metabolism in suckling piglets.
Introduction: Necrotizing soft tissue infection (NSTI), a potentially life-threatening diagnosis, is often not immediately recognized by clinicians. Delays in diagnosis are associated with increased morbidity and mortality. We sought to summarize and compare the accuracy of physical exam, imaging, and the Laboratory Risk Indicator for Necrotizing Fasciitis (LRINEC) Score used to confirm suspected NSTI in adult patients with skin and soft tissue infections. Methods: We searched Medline, Embase and 4 other databases from inception through November 2017. We included only English-language studies (randomized controlled trials, cohort and case-control studies) that reported the diagnostic accuracy of testing or the LRINEC Score. The outcome was NSTI confirmed by surgery or histopathology. Two reviewers independently screened studies and extracted data. We assessed risk of bias using the Quality Assessment of Diagnostic Accuracy Studies 2 criteria. Diagnostic accuracy summary estimates were obtained from the Hierarchical Summary Receiver Operating Characteristic model. Results: We included 21 studies (n=6,044) in the meta-analysis. Of physical exam signs, pooled sensitivity and specificity were generated for fever (49.4% [95% CI: 41.4-57.5], 78.0% [95% CI: 52.2-92.0]), hemorrhagic bullae (30.8% [95% CI: 16.2-50.6], 94.2% [95% CI: 82.9-98.2]) and hypotension (20.8% [95% CI: 7.7-45.2], 97.9% [95% CI: 89.1-99.6]). Computed tomography (CT) had 88.5% [95% CI: 55.5-97.9] sensitivity and 93.3% [95% CI: 80.8-97.9] specificity, while plain radiography had 48.9% [95% CI: 24.9-73.4] sensitivity and 94.0% [95% CI: 63.8-99.3] specificity. Finally, LRINEC ≥6 (the traditional threshold) had 67.5% [95% CI: 48.3-82.3] sensitivity and 86.7% [95% CI: 77.6-92.5] specificity, while LRINEC ≥8 had 94.9% [95% CI: 89.4-97.6] specificity but 40.8% [95% CI: 28.6-54.2] sensitivity. Conclusion: The absence of any one physical exam feature (e.g. fever or hypotension) is not sufficient to rule-out NSTI.
CT is superior to plain radiography. The LRINEC Score had poor sensitivity, suggesting that a low score is not sufficient to rule-out NSTI. For patients with suspected NSTI, further evaluation is warranted. While no single test is sensitive, patients with high-risk features should receive early surgical consultation for definitive diagnosis and management.
Chilling injury is an important natural stress that can threaten cotton production, especially at the sowing and seedling stages in early spring. Improving chilling tolerance at these stages is therefore important for cotton production. The current work examines the potential for glycine betaine (GB) treatment of seeds to increase the chilling tolerance of cotton at the seedling stage. Germination under cold stress was increased significantly by GB treatment. Under low temperature, the leaves of seedlings from treated seeds exhibited a higher net photosynthetic rate (PN), higher antioxidant enzyme activity (including superoxide dismutase, ascorbate peroxidase and catalase), lower hydrogen peroxide (H2O2) content and less damage to the cell membrane. Enzyme activity was correlated negatively with H2O2 content and degree of damage to the cell membrane but correlated positively with GB content. These results suggest that, although GB was applied only to the seed, its beneficial effect during germination persisted to at least the four-leaf seedling stage. Soaking seeds in GB is therefore a practical method for improving chilling resistance at the seedling stage in agricultural production.
Multidrug-resistant organisms (MDROs) are increasingly reported in residential care homes for the elderly (RCHEs). We assessed whether implementation of directly observed hand hygiene (DOHH) by hand hygiene ambassadors can reduce environmental contamination with MDROs.
METHODS
From July to August 2017, a cluster-randomized controlled study was conducted at 10 RCHEs (5 intervention versus 5 nonintervention controls), where DOHH was performed at two-hourly intervals during daytime, before meals and medication rounds, by one trained nurse in each intervention RCHE. Environmental contamination by MDROs, such as methicillin-resistant Staphylococcus aureus (MRSA), carbapenem-resistant Acinetobacter species (CRA), and extended-spectrum β-lactamase (ESBL)–producing Enterobacteriaceae, was evaluated using specimens collected from communal areas at baseline, then twice weekly. The volume of alcohol-based hand rub (ABHR) consumed per resident per week was measured.
RESULTS
The overall environmental contamination of communal areas was culture-positive for MRSA in 33 of 100 specimens (33%), CRA in 26 of 100 specimens (26%), and ESBL-producing Enterobacteriaceae in 3 of 100 specimens (3%) in intervention and nonintervention RCHEs at baseline. Serial monitoring of environmental specimens revealed a significant reduction in MRSA (79 of 600 [13.2%] vs 197 of 600 [32.8%]; P<.001) and CRA (56 of 600 [9.3%] vs 94 of 600 [15.7%]; P=.001) contamination in the intervention arm compared with the nonintervention arm during the study period. The volume of ABHR consumed per resident per week was 3 times higher in the intervention arm compared with the baseline (59.3±12.9 mL vs 19.7±12.6 mL; P<.001) and was significantly higher than the nonintervention arm (59.3±12.9 mL vs 23.3±17.2 mL; P=.006).
CONCLUSIONS
Directly observed hand hygiene of residents can reduce environmental contamination by MDROs in RCHEs.
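Comparisons such as 79/600 vs 197/600 MRSA-positive specimens reduce to a standard two-proportion test. A minimal sketch (illustrative only, not the authors' analysis code):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for comparing two proportions x1/n1 and x2/n2,
    using the pooled estimate for the standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se
```

For the MRSA counts above, |z| is far beyond 1.96, consistent with the reported P<.001.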
Bacillary dysentery continues to be a major health issue in developing countries, and ambient temperature is a possible environmental determinant. However, evidence about the risk of bacillary dysentery attributable to ambient temperature under climate change scenarios is scarce. We examined the attributable fraction (AF) of temperature-related bacillary dysentery in urban and rural Hefei, China during 2006–2012 and projected its shifting pattern under climate change scenarios using a distributed lag non-linear model. The risk of bacillary dysentery increased with the temperature rise above a threshold (18·4 °C), and the temperature effects appeared to be acute. The proportion of bacillary dysentery attributable to hot temperatures was 18·74% (95% empirical confidence interval (eCI): 8·36–27·44%). An apparent difference in AF was observed between urban and rural areas, with AF varying from 26·87% (95% eCI 16·21–36·68%) in the urban area to −1·90% (95% eCI −25·03 to 16·05%) in the rural area. Under the climate change scenarios alone (1–4 °C rise), the AF from extreme hot temperatures (>31·2 °C) would rise greatly, accompanied by a relatively stable AF from moderate hot temperatures (18·4–31·2 °C). If climate change proceeds, urban areas may be more likely to suffer from a rapidly increasing burden of disease from extreme hot temperatures in the absence of effective mitigation and adaptation strategies.
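The attributable fraction in such analyses aggregates daily excess risk. A simplified sketch with hypothetical daily counts and fitted relative risks (the real DLNM computation additionally handles lag structure and uncertainty via empirical CIs):

```python
def attributable_fraction(cases, rrs):
    """Fraction of total cases attributable to exposure: each day with
    fitted relative risk RR > 1 contributes cases * (RR - 1)/RR excess
    cases; days at or below RR = 1 contribute nothing."""
    excess = sum(n * (rr - 1.0) / rr
                 for n, rr in zip(cases, rrs) if rr > 1.0)
    return excess / sum(cases)
```

A negative AF, as reported for the rural area, arises in the full method when the exposure-response curve dips below 1 over the observed temperatures.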
To study the association between gastrointestinal colonization of carbapenemase-producing Enterobacteriaceae (CPE) and proton pump inhibitors (PPIs).
METHODS
We analyzed 31,526 patients with prospective collection of fecal specimens for CPE screening: upon admission (targeted screening) and during hospitalization (opportunistic screening, safety net screening, and extensive contact tracing), in our healthcare network with 3,200 beds from July 1, 2011, through December 31, 2015. Specimens were collected at least once weekly during hospitalization for CPE carriers and subjected to broth enrichment culture and multiplex polymerase chain reaction.
RESULTS
Of 66,672 fecal specimens collected, 345 specimens (0.5%) from 100 patients (0.3%) had CPE. The number and prevalence (per 100,000 patient-days) of CPE increased from 2 (0.3) in 2012 to 63 (8.0) in 2015 (P<.001). Male sex (odds ratio, 1.91 [95% CI, 1.15–3.18], P=.013), presence of wound or drain (3.12 [1.70–5.71], P<.001), and use of cephalosporins (3.06 [1.42–6.59], P=.004), carbapenems (2.21 [1.10–4.48], P=.027), and PPIs (2.84 [1.72–4.71], P<.001) in the preceding 6 months were significant risk factors by multivariable analysis. Of 79 patients with serial fecal specimens, spontaneous clearance of CPE was noted in 57 (72.2%), with a median (range) of 30 (3–411) days. Compared with patients who used neither antibiotics nor PPIs, consumption of both antibiotics and PPIs after CPE identification was associated with later clearance of CPE (hazard ratio, 0.35 [95% CI, 0.17–0.73], P=.005).
CONCLUSIONS
Concomitant use of antibiotics and PPIs prolonged duration of gastrointestinal colonization by CPE.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. BETA is the first aperture synthesis radio telescope to use phased array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations made to traditional calibration and imaging procedures to allow BETA to function as a multi-beam aperture synthesis telescope. We describe the commissioning of the instrument and present details of BETA's performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final ASKAP telescope.