Innovation Concept: Video has been proven to be an effective educational tool that is valued by learners and objectively improves knowledge and testing scores. It can simplify complex concepts and is more efficient and effective than audio or reading in tests of 3-day material recall. Our objective in this project was to develop a series of instructional videos geared towards emergency and family physicians on proper application of casts and splints in the emergency department. Methods: We created two procedural videos, each 5-6 minutes long. They each reviewed the process, indications, and precise steps for application for each of two splints: the ulnar gutter and the thumb spica. After finalizing the videos, we created a survey to assess feedback, asking questions about the applicability of the videos to the viewer's clinical practice, how interesting they found the content of the videos, what they liked and disliked, and how willing they would be to access future procedural videos if we were to make them. We also had respondents provide suggestions for topics of future videos. We then sent the videos and accompanying survey to a group of McMaster University medical students, residents, and attending physicians in family medicine and emergency medicine. Upon reviewing the results, we observed a large difference in perceived utility of the videos between attending physicians and trainees, and so we proceeded with subgroup analysis of trainees and staff. Curriculum, Tool, or Material: Orthopedic procedural videos as described above. Conclusion: Using a 5-point Likert scale, we found that overall trainees (4.3, SD 0.76, CI 0.41) found the videos more useful and interesting than did attending physicians (3.4, SD 0.68, CI 0.37), with respondents commenting that they were very clear and easy to follow for junior trainees.
Most respondents also indicated that they would access future videos we made (4.2, SD 0.74, CI 0.39 for trainees; 3.2, SD 0.65, CI 0.34 for attendings). Future directions include making the videos more concise, adding more visual summaries to improve viewership, and targeting videos to specific learner levels. We hope to implement these videos into future curriculum development for our learners and, if successful, into other Emergency Medicine residency programs across Canada.
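The bracketed values read as mean, standard deviation, and 95% confidence half-width on a 5-point Likert scale. As a minimal sketch of that arithmetic (assuming a normal approximation; the respondent count below is back-solved from the reported numbers, not stated in the abstract):

```python
import math

def ci_half_width(sd, n, z=1.96):
    """95% confidence half-width for a sample mean (normal approximation)."""
    return z * sd / math.sqrt(n)

# Trainee ratings were reported as mean 4.3, SD 0.76, CI 0.41.
# Back-solving n = (z * sd / half_width)^2 suggests roughly 13 trainees.
n_est = round((1.96 * 0.76 / 0.41) ** 2)
print(n_est, round(ci_half_width(0.76, n_est), 2))  # → 13 0.41
```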
Background: Unintentional opioid overdoses in and around acute care hospitals, including in the ED, are of increasing concern. In April 2018, the Addiction Recovery and Community Health (ARCH) Team at the Royal Alexandra Hospital opened the first acute care Supervised Consumption Service (SCS) in North America available to inpatients. In the SCS, patients can consume substances by injection, oral or intranasal routes under nursing supervision; immediate assistance is provided if an overdose occurs. After a quality assurance review, work began to expand SCS access to ED patients as well. Aim Statement: By expanding SCS access to ED patients, we aim to reduce unintentional and unwitnessed opioid overdoses in registered ED patients to 0 per month by the end of 2020. Measures & Design: Between June 13 and July 15, 2019, ARCH ED Registered Nurses were asked to identify ED patients with a history of active substance use who may potentially require SCS access. Nurses identified 69 patients over 43 8-hour shifts (range 0-4 patients per shift); thus, we anticipated an average of 5 ED patients per 24-hour period to potentially require SCS access. Based on this evidence of need, ARCH leadership worked with a) the hospital legal team and Health Canada to expand SCS access to ED patients; b) ED leadership to develop a procedure and flowchart for ED SCS access. ED patients were able to access the SCS effective October 1, 2019. Evaluation/Results: From October 1 to December 1, 2019, the SCS had 35 visits by 23 unique ED patients. The median time spent in the SCS was 42.5 minutes (range 14.0-140.0 minutes). Methamphetamine was the most commonly used substance (19, 45.2%), followed by fentanyl (10, 23.8%); substances were all injected (91.4% into a vein and 8.6% into an existing IV). In this time period, there were zero unintentional, unwitnessed opioid poisonings in registered ED patients.
Data collection is ongoing and will expand to include chief complaint, ED length of stay and discharge status. Discussion/Impact: Being able to reduce unintentional overdoses and unwitnessed injection drug use in the ED has the potential to improve both patient and staff safety. Next steps include a case series designed to examine the impact of SCS access on emergency care, retention in treatment and uptake into addiction treatment.
This study used meta-analysis to comprehensively examine the factor analysis of the Children's Depression Inventory (CDI). Twenty-five studies (N = 18,897) consisting of 36 independent samples were identified. Generally, the CDI comprises five factors: Self-Depreciation, Somatic Concerns, Externalizing, Lack of Personal and Social Interest, and Dysphoric Mood. When reviewing individual items, the results of this meta-analysis suggest that self-depreciation had salient loadings on factors similar to Self-Depreciation, Externalizing, and Somatic Concerns. The variability in this item makes self-depreciation a poor marker for symptoms of Self-Depreciation, Externalizing, and Somatic Concerns, and hence suggests that it should be revised or excluded in future revisions of the CDI. The equivalence of factor structure is a prerequisite to comparing mean scores across groups. Hence, the factor structure of the CDI was examined for subgroups of studies. The 5-factor structure of the CDI was generally appropriate except in studies assessing depression of at-risk/clinical participants and participants using non-English versions of the CDI. For studies assessing depression among at-risk/clinical participants and participants using non-English versions of the CDI, factors similar to Self-Depreciation, Lack of Personal and Social Interest, and Externalizing were identified. The at-risk/clinical samples had an independent factor of Depressive Mood and Loneliness, while studies using non-English versions of the CDI had an independent factor of Sadness and Somatic Concerns. Notably, the factor of Somatic Concerns was not identified in at-risk/clinical samples, and items of sleep disturbance, fatigue, and reduced appetite had no salient loadings.
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band (…), the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
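The abstract does not spell it out, but the quoted system temperature sets the sensitivity through the standard radiometer equation, ΔT = Tsys / √(np B τ). A minimal sketch under assumed parameters (the full 3328 MHz band, two polarisations, 1 s integration; these choices are illustrative, not from the paper):

```python
import math

def radiometer_noise(t_sys_k, bandwidth_hz, t_int_s, n_pol=2):
    """Ideal radiometer equation: RMS noise of a total-power measurement."""
    return t_sys_k / math.sqrt(n_pol * bandwidth_hz * t_int_s)

# Assumed example: Tsys = 22 K over the full 704-4032 MHz band, 1 s integration.
dT = radiometer_noise(22.0, (4032 - 704) * 1e6, 1.0)
print(f"{dT * 1e3:.3f} mK")
```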
Studies of Internet gaming disorder (IGD) suggest an imbalanced relationship between cognitive control and reward processing in people with IGD. However, it remains unclear how these two systems interact with each other, and whether they could serve as neurobiological markers for IGD.
Fifty IGD subjects and matched individuals with recreational game use (RGU) were selected and compared while performing a cue-craving task. Regions of interest [anterior cingulate cortex (ACC), lentiform nucleus] were selected based on the comparison between brain responses to gaming-related cues and neutral cues. Directional connectivities among these brain regions were determined using Bayesian estimation. We additionally examined the posterior cingulate cortex (PCC) in a separate analysis based on data implicating the PCC in craving in addiction.
During fixed-connectivity analyses, IGD subjects showed blunted ACC-to-lentiform and lentiform-to-ACC connectivity relative to RGU subjects, especially in the left hemisphere. When facing gaming cues, IGD subjects trended toward lower left-hemispheric modulatory effects in ACC-to-lentiform connectivity than RGU subjects. Self-reported cue-related craving prior to scanning correlated inversely with left-hemispheric modulatory effects in ACC-to-lentiform connectivity.
The results, suggesting that prefrontal-to-lentiform connectivity is impaired in IGD, provide a possible neurobiological mechanism for difficulties in controlling gaming-cue-elicited cravings. Reduced ACC-lentiform connectivity may be a useful neurobiological marker for IGD.
Rumen-protected betaine (RPB) can enhance betaine absorption in the small intestine of ruminants, while betaine can alter fat distribution and has the potential to affect the meat quality of livestock. Hence, we hypothesized that RPB might also affect the meat quality of lambs. Sixty male Hu sheep of similar weight (30.47 ± 2.04 kg) were selected and randomly subjected to five different treatments. The sheep were fed a control diet (control treatment, CTL); 1.1 g/day unprotected-betaine supplemented diet (UPB); or doses of 1.1 g/day (low RPB treatment; L-PB), 2.2 g/day (middle RPB treatment; M-PB) or 3.3 g/day (high RPB treatment; H-PB) RPB-supplemented diet for 70 days. Slaughter performance, meat quality, fatty acid and amino acid content in the longissimus dorsi (LD) muscle, shoulder muscle (SM) and gluteus muscle (GM) were measured. Compared with CTL, betaine (including UPB and RPB) supplementation increased the average daily weight gain (ADG) (P < 0.05) and average daily feed intake (P < 0.01) of lambs. Rumen-protected betaine increased ADG (P < 0.05) compared with UPB. With increasing RPB doses, the eye muscle area of the lambs linearly increased (P < 0.05). Compared with CTL, betaine supplementation decreased water loss (P < 0.05) in SM and increased pH24 in the SM (P < 0.05) and GM (P < 0.05). Compared with UPB, RPB decreased water loss in the GM (P < 0.01), decreased shear force (P < 0.05) in the LD and SM and increased the pH of the meat 24 h after slaughter (pH24). With increasing RPB doses, the shear force and b* value in the LD linearly decreased (P < 0.05), and the pH24 of the meat quadratically increased (P < 0.05). Compared with CTL, betaine supplementation increased the polyunsaturated fatty acid in the GM (P < 0.05). Compared with UPB, RPB supplementation decreased the saturated fatty acid (SFA) content in the LD (P < 0.05) and increased the unsaturated fatty acids (UFA), mono-unsaturated fatty acids and UFA/SFA ratio in the LD (P < 0.05). 
Compared with CTL, the content of histidine in the LD increased with betaine supplementation. Compared with UPB, RPB supplementation increased the content of total free amino acids and flavor amino acids in the LD of lambs (P < 0.05). With increasing RPB, the isoleucine and phenylalanine contents in the LD linearly increased (P < 0.05). Overall, the data collected indicated that the meat quality of lambs (especially in the LD) improved as a result of betaine supplementation, and RPB showed better effects than those of UPB.
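The linear and quadratic dose effects reported above are typically tested with orthogonal polynomial contrasts across the equally spaced doses (0, 1.1, 2.2 and 3.3 g/day RPB). A sketch with hypothetical group means, since the actual analysis details and values are not given in the abstract:

```python
# Orthogonal polynomial contrast coefficients for four equally spaced doses
# (0, 1.1, 2.2, 3.3 g/day), as typically used to test linear and quadratic
# dose trends (assumed analysis, not stated in the abstract).
LINEAR = [-3, -1, 1, 3]
QUADRATIC = [1, -1, -1, 1]

def contrast(means, coeffs):
    """Contrast estimate: weighted sum of the treatment-group means."""
    return sum(m * c for m, c in zip(means, coeffs))

# Hypothetical group means of, e.g., shear force across CTL/L-PB/M-PB/H-PB:
means = [60.0, 57.0, 54.0, 51.0]
print(contrast(means, LINEAR))     # negative → decreasing linear trend
print(contrast(means, QUADRATIC))  # zero here → no quadratic curvature
```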
The search for life in the Universe is a fundamental problem of astrobiology and modern science. The current progress in the detection of terrestrial-type exoplanets has opened a new avenue in the characterization of exoplanetary atmospheres and in the search for biosignatures of life with the upcoming ground-based and space missions. To specify the conditions favourable for the origin, development and sustainment of life as we know it in other worlds, we need to understand the nature of the global (astrospheric) and local (atmospheric and surface) environments of exoplanets in the habitable zones (HZs) around G-K-M dwarf stars, including our young Sun. The global environment is formed by disturbances propagating from the planet-hosting star in the form of stellar flares, coronal mass ejections, energetic particles and winds, collectively known as astrospheric space weather. Its characterization will help in understanding how an exoplanetary ecosystem interacts with its host star, as well as in the specification of the physical, chemical and biochemical conditions that can create favourable and/or detrimental conditions for planetary climate and habitability, along with the evolution of planetary internal dynamics over geological timescales. A key linkage of (astro)physical, chemical and geological processes can only be understood in the framework of interdisciplinary studies with the incorporation of progress in heliophysics, astrophysics, planetary and Earth sciences. The assessment of the impacts of host stars on the climate and habitability of terrestrial (exo)planets will significantly expand the current definition of the HZ to the biogenic zone and provide new observational strategies for searching for signatures of life.
The major goal of this paper is to describe and discuss the current status and recent progress in this interdisciplinary field in light of presentations and discussions during the NASA Nexus for Exoplanetary System Science funded workshop ‘Exoplanetary Space Weather, Climate and Habitability’ and to provide a new roadmap for the future development of the emerging field of exoplanetary science and astrobiology.
The authors demonstrate that gold-binding peptides displayed on the outer membrane of Escherichia coli enhance bioelectrochemical charge transfer by binding gold nanoparticles. Microbial fuel cells were run with different gold-binding peptides displayed and with different nanoparticle sizes, and the results were correlated with transmission electron microscopy (TEM) imaging of nanoparticle binding. When a gold-binding peptide is displayed and 5 nm gold nanoparticles are present, up to 4× the power generation of E. coli not displaying a gold-binding peptide is observed. While an enhanced current is observed using the previously published M6G9, the largest enhancement is observed when a new, longer peptide named M9G18 is used.
Introduction: Given the current opioid crisis, caregivers have mounting fears regarding use of opioid medication in their children. Since caregivers are often the gatekeepers to their children's pain management, understanding their perspectives on analgesics is essential. For caregivers of children with acute injury presenting to the pediatric emergency department (PED), we aimed to determine caregivers’: a) willingness to accept opioids from emergency care providers, b) reasons for refusing opioids, and c) past experiences with opioids. Methods: A novel 31-item electronic survey was offered, via tablet device, to caregivers of children aged 4-16 years who had a musculoskeletal injury <7 days old and presented to one of two Canadian PEDs between March and November 2017. Primary outcome was caregiver willingness to accept opioids for moderate pain for their children. Results: 517 caregivers completed the survey; mean age was 40.9 +/−7 years with 70.0% (362/517) being mothers. Children included 62.2% (321/516) males with an overall mean age of 10 +/−3.6 years. 49.6% of caregivers (254/512) reported willingness to accept opioids for moderate pain that persisted after non-opioid analgesia, while 37.1% (190/512) were unsure what they would do. Only 33.2% (170/512) of caregivers stated they would accept opioid analgesia upon discharge while 45.5% (233/512) were unsure about at-home use. Caregivers were primarily concerned about side effects, overdose, addiction, and masking of diagnosis. Caregiver fear of addiction (OR 1.12, 95% CI 1.01-1.25) and side effects (OR 1.25, 95% CI 1.11-1.42) increased the odds of rejecting opioids in the emergency department, while fears of addiction (OR 1.19, 95% CI 1.07-1.32) and overdose (OR 1.15, 95% CI 1.04-1.27) increased the odds of rejecting opioids for at-home use. Conclusion: Only half of caregivers reported that they would accept opioids for moderate pain, despite ongoing pain following non-opioid analgesics. 
Caregiver fears of addiction, side effects, overdose, and masking their child's diagnosis influence their behaviours. These findings are a first step in understanding caregiver decision-making and can guide healthcare providers in their conversations about acute pain treatment with families.
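The odds ratios above come from a logistic regression: each is the exponential of a fitted coefficient, and the 95% CI is obtained by exponentiating the coefficient's normal-theory interval. A sketch with a hypothetical coefficient and standard error chosen to land near the reported OR 1.12 (95% CI 1.01-1.25); these are not the study's actual fitted values:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient/SE (illustrative, not the study's fit):
or_, lo, hi = odds_ratio_ci(0.113, 0.054)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 1.12 1.01 1.24
```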
Introduction: Emergency Department (ED) visits related to substance use are rapidly increasing. Despite this, few Canadian EDs have immediate access to addiction medicine specialists or on-site addiction medicine clinics. This study characterized substance-related ED presentations to an urban tertiary care ED and assessed need for an on-site rapid-access addiction clinic (RAAC). Methods: This prospective enrollment, retrospective chart review was conducted from June to August 2018. Adult patients presenting to the ED with a known or suspected substance use disorder were enrolled by any member of their ED care team using a 1-page form. Retrospective chart review of the index ED visit was conducted and the Emergency Department Information System was used to extract information related to the visit. A multivariable logistic regression model was fit to examine factors associated with recommendation for referral to a hypothetical on-site RAAC. Results: Of the 557 enrolment forms received, 458 were included in the analysis. 64% of included patients were male and 36% were female, with a median age of 35.0 years. Polysubstance use was seen in 23% of patients, and alcohol was the most common substance indicated (60%), followed by stimulants (32%) and opioids (16%). The median ED length of stay for included patients was 483 minutes, compared to 354 minutes for all-comers discharged from the ED during the study period.
28% of patients had a previous ED visit within 7 days of the index visit, and an additional 17% had a visit in the preceding 30 days. The ED care team indicated ‘Yes’ for RAAC referral from the ED for 66% of patients, for a mean of 4.3 patients referred per day during the study period. Multivariable analysis showed that all substances (except cannabis) correlated to a statistically significant increase in likelihood for indicating ‘Yes’ for RAAC referral from the ED (alcohol, stimulants, opioids, polysubstance; p < 0.05). Patients presenting to the ED with a chief complaint related to substance use were also more likely to be referred (p = 0.01). Conclusion: This retrospective chart review characterized substance-related presentations at a Canadian urban tertiary care ED. Approximately four patients per day would have been referred to an on-site RAAC had one been available. The RAAC model has been implemented in other Canadian hospitals, and collaborating with these sites to begin developing this service would be an important next step.
Three-dimensional (3D) printing technology is a promising method for bone tissue engineering applications. For enhanced bone regeneration, it is important to have printable ink materials with appealing properties such as construct interconnectivity, mechanical strength, controlled degradation rates, and the presence of bioactive materials. In this respect, we develop a composite ink composed of polycaprolactone (PCL), poly(D,L-lactide-co-glycolide) (PLGA), and hydroxyapatite particles (HAps) and 3D print it into porous constructs. An in vitro study revealed that the composite constructs had superior mechanical properties, greater surface roughness, a quicker degradation profile, and enhanced cellular behaviors compared with PCL-only counterparts. Furthermore, in vivo results showed that 3D-printed composite constructs had a positive influence on bone regeneration due to the presence of newly formed mineralized bone tissue and blood vessel formation. Therefore, 3D printable ink made of PCL/PLGA/HAp can be a highly useful material for 3D printing of bone tissue constructs.
Introduction: Inadequate pain management in children is ubiquitous in the emergency department (ED). As the current national opioid crisis has highlighted, physicians are caught between balancing pain management and the risk of long-term opioid dependence. This study aimed to describe pediatric emergency physicians' (PEPs) willingness to prescribe opioids to children in the ED and at discharge. Methods: A unique survey tool was created using published methodology guidelines. Information regarding practices, knowledge, attitudes, perceived barriers, facilitators and demographics was collected. The survey was distributed to all physician members of Pediatric Emergency Research Canada (PERC), using a modified Dillman's Tailored Design method, from October to December 2017. Results: The response rate was 49.7% (124/242); 53% (57/107) were female, mean age was 43.6 years (+/− 8.7), and 58% (72/124) had pediatric emergency subspecialty training. The most common first-line ED pain medication was ibuprofen for mild, moderate and severe musculoskeletal injury (MSK-I)-related pain (94.4% (117/124), 89.5% (111/124), and 62.9% (78/124), respectively). For moderate and severe MSK-I, intranasal fentanyl was the most common opioid for first-line (35.5% (44/124) and 61.3% (76/124), respectively) and second-line pain management (41.1% (51/124) and 20.2% (25/124), respectively). 74.8% (89/119) of PEPs reported that an opioid protocol would be helpful, specifically for morphine, fentanyl, and hydromorphone. Using a 0-100 scale, physicians minimally worried about physical dependence (13.3 +/−19.3), addiction (16.6 +/−19.8), and diversion of opioids (32.8 +/−26.4) when prescribing short-term opioids to children. They reported that the current opioid crisis minimally influenced their willingness to prescribe opioids (30.0 +/−26.2). Physicians reported rarely (36%; 45/125) or never (28%; 35/125) completing a screening risk assessment prior to prescribing opioids.
Conclusion: Ibuprofen remains the most common medication recommended for MSK-I pain in the ED and at discharge. Intranasal fentanyl was the top opioid for all pain intensities. PEPs are minimally concerned regarding dependence, addiction, and the current opioid crisis when prescribing short-term opioids to children. There is an urgent need for robust evidence regarding the dependence and addiction risk for children receiving short term opioids in order to create knowledge translation tools for ED physicians. Opioid specific protocols for both in the ED and at discharge would likely improve physician comfort in responsible and adequate pain management for children.
Association mapping based on linkage disequilibrium is an effective approach for dissecting the inheritance of complex multi-gene traits. In the present study, association mapping was performed for yield traits based on 172 popular Upland cotton (Gossypium hirsutum L.) cultivars in China and 331 polymorphic simple sequence repeat (SSR) markers. The gene diversity index of 331 markers ranged from 0·0387 to 0·7799 with an average of 0·4002, and the polymorphism information content ranged from 0·0379 to 0·7473 with an average of 0·3375. A total of 93 significantly associated markers for seven yield traits were identified across more than one environment, among which 11 were for seed cotton yield, 12 for lint yield, 11 for boll number per plant, 13 for boll weight, 21 for lint percentage, 14 for lint index and 11 for seed index. The corresponding ranges in phenotypic variation explained by markers across four environments for these seven traits were 1·75–10·49, 1·75–9·34, 2·84–11·80, 2·59–9·89, 2·38–13·97, 2·73–14·82 and 2·50–11·88%, respectively. Some of the yield-associated markers detected were found to be linked to or associated with the same traits identified in previous studies. Furthermore, elite alleles for yield traits were also mined. The present study can provide useful information for further understanding the genetic basis of yield traits, and facilitate high-yield breeding by molecular design in Upland cotton.
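For context, the polymorphism information content quoted for each SSR marker follows the standard Botstein formula over allele frequencies. A minimal sketch (the frequencies shown are illustrative, not from the study):

```python
def pic(freqs):
    """Polymorphism information content of a marker (Botstein et al. 1980):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2"""
    s1 = sum(p * p for p in freqs)
    s2 = sum(2 * (freqs[i] ** 2) * (freqs[j] ** 2)
             for i in range(len(freqs))
             for j in range(i + 1, len(freqs)))
    return 1 - s1 - s2

# Illustration with assumed allele frequencies (not from the study):
print(round(pic([0.5, 0.5]), 4))  # biallelic, equal frequencies: 0.375
```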
Maize in Canada is grown mainly in the south-eastern part of the country. No comprehensive studies on Canadian maize yield levels have been done so far to analyse the barriers to obtaining optimal yields associated with cultivar, environmental stress and agronomic management practices. The objective of the current study was to use a modelling approach to analyse the gaps between actual and potential (determined by cultivar, solar radiation and temperature without any other stresses) maize yields in Eastern Canada. The CSM–CERES–Maize model in DSSAT v4·6 was calibrated and evaluated with measured data of seven cultivars under different nitrogen (N) rates across four sites. The model was then used to simulate grain yield levels defined as: yield potential (YP), water-limited yield (YW, rainfed), and water- and N-limited yields with N rates of 80 kg/ha (YW,N-80N) and 160 kg/ha (YW,N-160N). The options were assessed to further increase grain yield by analysing the yield gaps related to water and N deficiencies. The CSM–CERES–Maize model simulated the grain yields in the experiments well, with normalized root-mean-squared errors <0·20. The model was able to capture yield variations associated with varying N rates, cultivar, soil type and inter-annual climate variability. The seven calibrated cultivars used in the experiments were divided into three grades according to their simulated YP: low, medium and high. The simulation results for the 30-year period from 1981 to 2010 showed that the average YP was 15 000 kg/ha for cultivars with high yield potential. The YP is generally about 6000 kg/ha greater than the actual yield (YA) at each experimental site in Eastern Canada. Two-thirds of this gap between YP and YA is probably associated with water stress, as a gap of approximately 4000 kg/ha between the YW and the YP was simulated.
This gap may be reduced through crop management, such as introducing irrigation to improve the distribution of available water during the growing season. The simulated yields indicated a gap of about 3000 and 1000 kg/ha between YW and YW,N-80N for cultivars with high YP and low YP, respectively. The gap between YW and YW,N-160N decreased to <2000 kg/ha for high YP cultivars, with little difference for the low YP cultivars. The different yield gaps among cultivars suggest that cultivars with high YP require high N rates but cultivars with low YP may need only low N rates.
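The reported gaps reduce to simple arithmetic on the simulated yield levels; a sketch using the figures quoted for high-YP cultivars:

```python
# Yield levels (kg/ha) as reported for high-YP cultivars (30-year simulation):
YP = 15000          # potential yield
YA = YP - 6000      # actual yield, given the ~6000 kg/ha total gap
YW = YP - 4000      # water-limited yield, given the ~4000 kg/ha water gap

water_gap = YP - YW
total_gap = YP - YA
print(water_gap / total_gap)  # water stress accounts for ~two-thirds of the gap
```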
Supervision is a widely recognised component of counsellor training, yet little is known about the clinical supervision training of rehabilitation counsellor educators during their doctoral education. Using syllabi from doctoral rehabilitation counselling programmes, this article discusses the state of clinical supervision in doctoral-level training, and its teaching and clinical implications. Sixteen of the 25 Ph.D. programmes in rehabilitation responded to contact, and 11 programmes reported offering a course in supervision. Eight of these programmes shared the syllabus for their doctoral-level supervision course(s). The syllabi were analysed to find common themes related to content, learning objectives, assignments and readings. These themes are discussed, and are followed by five recommendations on the manner in which clinical supervision should be provided in rehabilitation doctoral programmes.
Carbon nanotube (CNT)-reinforced aluminum composite powders were synthesized by cryogenic milling. The effects of different milling parameters and CNT contents on the structural characteristics and mechanical properties of the resulting composite powders were studied. Detailed information on powder morphology and the dispersion and structural integrity of the CNTs is crucial for many powder consolidation methods, particularly cold spray, which is increasingly utilized to fabricate metal-based nanocomposites. While all of the produced composite powders exhibited particle sizes suitable for spray applications, it was found that with increasing CNT content, the average particle size decreased and the size distribution became narrower. The dispersion of CNTs improved with milling time and helped to maintain a small Al grain size during cryogenic milling. Although extensive milling allowed for substantial grain size reduction, the process caused notable CNT degradation, leading to a deterioration of the mechanical properties of the resulting composite.
A multipath mechanism similar to that used in Australia sixty years ago by the Sea-cliff Interferometer is shown to generate correlations between the periods of oscillations observed by two distant radio telescopes pointed at the Sun. The oscillations are the result of interference between the direct wave detected in the main antenna lobe and its reflection off the ground detected in a side lobe. A model is made of such oscillations in the case of two observatories located at equal longitudes and opposite tropical latitudes, respectively in Ha Noi (Viet Nam) and Learmonth (Australia), where similar radio telescopes are operated at 1.4 GHz. Simple specular reflection from the ground is found to give a good description of the observed oscillations and to explain correlations that had been previously observed and for which no satisfactory interpretation, instrumental or other, had been found.
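A minimal sketch of the specular-reflection (sea-cliff) fringe model described above, with an assumed antenna height and ground reflection coefficient (neither value is given in the abstract):

```python
import math

WAVELENGTH_M = 0.214     # ~1.4 GHz observing wavelength
ANTENNA_HEIGHT_M = 10.0  # height above the reflecting ground (assumed)

def fringe_power(elevation_deg, r=0.3):
    """Relative received power from interference between the direct wave
    and a specular ground reflection of amplitude ratio r; the extra
    path length is 2 * h * sin(elevation)."""
    phase = (2 * math.pi / WAVELENGTH_M) * \
        2 * ANTENNA_HEIGHT_M * math.sin(math.radians(elevation_deg))
    return 1 + r * r + 2 * r * math.cos(phase)

# As the Sun rises, the phase sweeps through many cycles, producing the
# slow oscillations seen in the total-power records at both sites.
print(round(fringe_power(10.0), 3))
```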
Objective: To evaluate the accuracy of real-time polymerase chain reaction (PCR) for Clostridium difficile–associated disease (CDAD) detection, after hospital CDAD rates significantly increased following real-time PCR initiation for CDAD diagnosis.
Design: Hospital-wide surveillance study following examination of CDAD incidence density rates by interrupted time series design.
Setting: Large university-based hospital.
Patients: Hospitalized adult patients.
Methods: CDAD rates were compared before and after real-time PCR implementation in a university hospital and in the absence of physician and infection control practice changes. After real-time PCR introduction, all hospitalized adult patients were screened for C. difficile by testing a fecal specimen by real-time PCR, toxin enzyme-linked immunosorbent assay, and toxigenic culture.
Results: CDAD hospital rates significantly increased after changing from cell culture cytotoxicity assay to a real-time PCR assay. One hundred ninety-nine hospitalized subjects were enrolled, and 101 fecal specimens were collected. C. difficile was detected in 18 subjects (18%), including 5 subjects (28%) with either definite or probable CDAD and 13 patients (72%) with asymptomatic C. difficile colonization.
Conclusions: The majority of healthcare-associated diarrhea is not attributable to CDAD, and the prevalence of asymptomatic C. difficile colonization exceeds CDAD rates in healthcare facilities. PCR detection of asymptomatic C. difficile colonization among patients with non-CDAD diarrhea may be contributing to rising CDAD rates and a significant number of CDAD false positives. PCR may be useful for CDAD screening, but further study is needed to guide interpretation of PCR detection of C. difficile and the value of confirmatory tests. A gold standard CDAD diagnostic assay is needed.
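The 28% figure is, in effect, the positive predictive value of PCR for CDAD in this cohort; a sketch of that calculation from the reported counts:

```python
def ppv(n_true_disease, n_test_positive):
    """Positive predictive value: fraction of test positives with true disease."""
    return n_true_disease / n_test_positive

# Reported results: 18 PCR-positive subjects, of whom 5 had definite or
# probable CDAD and 13 were asymptomatic carriers.
print(round(ppv(5, 18) * 100))  # → 28 (percent)
```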
Cotton plant architecture is an important agronomic trait affecting yield and quality. In the present study, two F2:3 upland cotton (Gossypium hirsutum L.) populations were developed from Baimian2/TM-1 and Baimian2/CIR12 to map quantitative trait loci (QTL) for cotton plant architecture traits using simple sequence repeat (SSR) markers. A total of 73 QTL (37 significant and 36 suggestive) affecting plant architecture traits were detected in both populations. Four common QTL, qTFN-17 for total fruit nodes, qFBN-17 for fruit branch nodes, qFBL-17 for fruit branch length and qTFB-17a/qTFB-17b (qTFB-17) for total fruit branches, were found across the two populations. These common QTL should have high reliability and could be used for marker-assisted selection (MAS) to improve cotton plant architecture. The two common QTL, qTFN-17 and qFBL-17, were especially significant in both populations, and moreover, they explained >0·100 of the phenotypic variation in at least one population. These two QTL should be considered preferentially for MAS. The synergistic alleles and the negative alleles could be utilized in cotton plant architecture breeding programmes according to specific breeding objectives.