Optical tracking systems typically trade off astrometric precision against field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers with relatively low astrometric precision, applied to the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 Sep 2017. As part of a trajectory designed to take OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation. Each observatory in this network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we obtained 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the impressive capability of a networked approach to Space Surveillance and Tracking.
A substantial proportion of patients drop out of treatment before receiving minimally adequate care, and they tend to have worse health outcomes than those who complete treatment. Our main goal is to describe the frequency and determinants of dropout from treatment for mental disorders in low-, middle-, and high-income countries.
Respondents from 13 low- or middle-income countries (N = 60 224) and 15 high-income countries (N = 77 303) were screened for mental and substance use disorders. Cross-tabulations were used to examine the distribution of treatment and dropout rates for those who screened positive. The timing of dropout was examined using Kaplan–Meier curves. Predictors of dropout were examined with survival analysis using a logistic link function.
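The Kaplan–Meier approach mentioned above estimates the probability of remaining in treatment past each visit from censored dropout data. A minimal sketch of the estimator follows; the visit counts and censoring flags are hypothetical, not taken from the study.

```python
# Minimal Kaplan-Meier estimator sketch for dropout timing.
# times: visit number at dropout or censoring; events: 1 = dropout, 0 = censored.

def kaplan_meier(times, events):
    """Return (time, survival_probability) pairs at each dropout time."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(e for tt, e in pairs if tt == t)    # dropouts at time t
        c = sum(1 for tt, _ in pairs if tt == t)    # everyone leaving the risk set at t
        if d:
            surv *= 1 - d / n_at_risk               # product-limit update
            curve.append((t, surv))
        n_at_risk -= c
        while i < len(pairs) and pairs[i][0] == t:  # advance past all records at t
            i += 1
    return curve

# Hypothetical cohort of 6 patients: dropout (1) or completed/censored (0) at the given visit
times = [1, 1, 2, 3, 3, 5]
events = [1, 0, 1, 1, 0, 0]
print(kaplan_meier(times, events))
```

In practice this would be fitted per country-income group and plotted, with the logistic-link survival model layered on top for covariate effects.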
Dropout rates are high in both high-income (30%) and low/middle-income (45%) countries. Dropout mostly occurs during the first two visits. It is higher in general medical than in specialist settings (nearly 60% v. 20% in lower-income settings), and higher for mild and moderate than for severe presentations. The lack of financial protection for mental health services is associated with overall increased dropout from care.
Extending financial protection and coverage for mental disorders may reduce dropout. Efficiency can be improved by managing the milder clinical presentations at the entry point to the mental health system, providing adequate training, support and specialist supervision for non-specialists, and streamlining referral to psychiatrists for more severe cases.
Public health practitioners face challenging, potentially high-consequence, problems that require computational support. Available computational tools may not adequately fit these problems, thus forcing practitioners to rely on qualitative estimates when making critical decisions. Scientists at the Center for Computational Epidemiology and Response Analysis and practitioners from the Texas Department of State Health Services (TXDSHS) have established a participatory development cycle where public health practitioners work closely with academia to foster the development of data-driven solutions for specific public health problems and to translate these solutions to practice. Tools developed through this cycle have been deployed at TXDSHS offices where they have been used to refine and enhance the region’s medical countermeasure distribution and dispensing capabilities. Consequently, TXDSHS practitioners planning for a 49-county region in North Texas have achieved a 29% reduction in the number of points of dispensing required to complete dispensing to the region within time limitations. Further, an entire receiving, staging, and storing site has been removed from regional plans, thus freeing limited resources (eg, personnel, security, and infrastructure) for other uses. In 2018, planners from Southeast Texas began using these tools to plan for a multi-county, full-scale exercise which was scheduled to be conducted in October 2019.
To assess potential transmission of antibiotic-resistant organisms (AROs) using surrogate markers and bacterial cultures.
A 1,260-bed tertiary-care academic medical center.
The study included 25 patients (17 of whom were on contact precautions for AROs) and 77 healthcare personnel (HCP).
Fluorescent powder (FP) and MS2 bacteriophage were applied in patient rooms. HCP visits to each room were observed for 2–4 hours; hand hygiene (HH) compliance was recorded. Surfaces inside and outside the room and HCP skin and clothing were assessed for fluorescence, and swabs were collected for MS2 detection by polymerase chain reaction (PCR) and selective bacterial cultures.
Transfer of FP was observed for 20 rooms (80%) and 26 HCP (34%). Transfer of MS2 was detected for 10 rooms (40%) and 15 HCP (19%). Bacterial cultures were positive for 1 room and 8 HCP (10%). Interactions with patients on contact precautions resulted in fewer FP detections than interactions with patients not on precautions (P < .001); MS2 detections did not differ by patient isolation status. Fluorescent powder detections did not differ by HCP type, but MS2 was recovered more frequently from physicians than from nurses (P = .03). Overall, HH compliance was better among HCP caring for patients on contact precautions than among HCP caring for patients not on precautions (P = .003), among nurses than among other nonphysician HCP at room entry (P = .002), and among nurses than among physicians at room exit (P = .03). Moreover, HCP who performed HH prior to assessment had fewer fluorescence detections (P = .008).
Contact precautions were associated with greater HCP HH compliance and reduced detection of FP and MS2.
Gut cell losses contribute to overall feed efficiency due to the energy requirement for cell replenishment. Intestinal epithelial cells are sloughed into the intestinal lumen as digesta passes through the gastrointestinal tract, where cells are degraded by endonucleases. This leads to fragmented DNA being present in faeces, which may be an indicator of gut cell loss. Therefore, measuring host faecal DNA content could have potential as a non-invasive marker of gut cell loss and result in a novel technique for the assessment of how different feed ingredients impact upon gut health. Faecal calprotectin (CALP) is a marker of intestinal inflammation. This was a pilot study designed to test a methodology for extracting and quantifying DNA from pig faeces, and to assess whether any differences in host faecal DNA and CALP could be detected. An additional aim was to determine whether any differences in the above measures were related to the pig performance response to dietary yeast-enriched protein concentrate (YPC). Newly weaned (∼26.5 days of age) Large White × Landrace × Pietrain piglets (8.37 ± 1.10 kg, n = 180) were assigned to one of four treatment groups (nine replicates of five pigs), differing in dietary YPC content: 0% (control), 2.5%, 5% and 7.5% (w/w). Pooled faecal samples were collected on days 14 and 28 of the 36-day trial. Deoxyribonucleic acid was extracted and quantitative PCR was used to assess DNA composition. Pig genomic DNA was detected using primers specific for the pig cytochrome b (CYTB) gene, and bacterial DNA was detected using universal 16S primers. A pig CALP ELISA was used to assess gut inflammation. Dietary YPC significantly reduced feed conversion ratio (FCR) from weaning to day 14 (P<0.001), but not from day 14 to day 28 (P = 0.220). Pig faecal CYTB DNA content was significantly (P = 0.008) reduced in YPC-treated pigs, with no effect of time, whereas total faecal bacterial DNA content was unaffected by diet or time (P>0.05).
Faecal CALP levels were significantly higher at day 14 compared with day 28, but there was no effect of YPC inclusion and no relationship with FCR. In conclusion, YPC reduced faecal CYTB DNA content and this correlated positively with FCR, but was unrelated to gut inflammation, suggesting that it could be a non-invasive marker of gut cell loss. However, further validation experiments by an independent method are required to verify the origin of pig faecal CYTB DNA as being from sloughed intestinal epithelial cells.
Ebola is a high consequence infectious disease—a disease with the potential to cause outbreaks, epidemics, or pandemics with deadly possibilities, highly infectious, pathogenic, and virulent. Ebola’s first reported cases in the United States in September 2014 led to the development of preparedness capabilities for the mitigation of possible rapid outbreaks, with the Centers for Disease Control and Prevention (CDC) providing guidelines to assist public health officials in infectious disease response planning. These guidelines include broad goals for state and local agencies and detailed information concerning the types of resources needed at health care facilities. However, the spatial configuration of populations and existing health care facilities is neglected. An incomplete understanding of the demand landscape may result in an inefficient and inequitable allocation of resources to populations. Hence, this paper examines challenges in implementing CDC’s guidance for Ebola preparedness and mitigation in the context of geospatial allocation of health resources and discusses possible strategies for addressing such challenges. (Disaster Med Public Health Preparedness. 2018;12:563–566)
A total of 432 one-day-old broiler chickens were randomly assigned in a 2 × 4 factorial arrangement (pellet or mash form and 0, 25, 50, and 75% whole sorghum levels) in a completely randomised experiment, with six replicates of nine birds per treatment. Body weight and feed intake were measured on a pen basis at 10, 25, and 35 days of age, and feed conversion ratio was calculated. Pelleting diets significantly improved (P<0.05) feed intake, body weight and carcass yield of broiler chickens at 10 and 24 days of age. Heavier relative gizzard weights with lower pH (P<0.05) were recorded for broiler chickens offered mash diets at 35 days old. Feed conversion ratio at 35 days of age increased (P<0.035, quadratic effect) with higher levels of whole sorghum and levelled off at the 75% inclusion rate. Relative gizzard weight at 35 days was marginally increased (P<0.033, linear effect) in line with rising sorghum levels. Similarly, relative bursa and liver weights at 35 days increased (P<0.037, quadratic effect and P<0.033, linear effect, respectively) with sorghum inclusion. The results showed that pelleted diets gave superior performance compared to mash diets. Although higher levels of sorghum inclusion in mash diets enhanced gizzard development, performance parameters of birds at 35 days of age were poorer, with 125 g less body weight and an increase in FCR from 1.51 to 1.62 for the 0% and 75% sorghum levels respectively.
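Feed conversion ratio, the key performance measure above, is simply total feed intake divided by total body-weight gain over the period, computed on a pen basis. A small sketch with hypothetical pen figures (loosely echoing the reported 0% sorghum FCR of 1.51):

```python
# Feed conversion ratio (FCR): feed consumed per unit of body-weight gain.
# Lower FCR means better feed efficiency. Pen figures below are illustrative only.

def fcr(feed_intake_g, start_weight_g, end_weight_g):
    """FCR for one pen over one period, all masses in grams."""
    gain = end_weight_g - start_weight_g
    if gain <= 0:
        raise ValueError("no weight gain over the period")
    return feed_intake_g / gain

# Hypothetical pen: 3020 g of feed consumed for 2000 g of total gain
print(round(fcr(3020, 1000, 3000), 2))  # 1.51
```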
The Molonglo Observatory Synthesis Telescope (MOST) is an 18 000 m² radio telescope located 40 km from Canberra, Australia. Its operating band (820–851 MHz) is partly allocated to telecommunications, making radio astronomy challenging. We describe how the deployment of new digital receivers, Field Programmable Gate Array-based filterbanks, and server-class computers equipped with 43 Graphics Processing Units, has transformed the telescope into a versatile new instrument (UTMOST) for studying the radio sky on millisecond timescales. UTMOST has 10 times the bandwidth and double the field of view compared to the MOST, and voltage record and playback capability has facilitated rapid implementation of many new observing modes, most of which operate commensally. UTMOST can simultaneously excise interference, make maps, coherently dedisperse pulsars, and perform real-time searches of coherent fan-beams for dispersed single pulses. UTMOST operates as a robotic facility, deciding how to efficiently target pulsars and how long to stay on source via real-time pulsar folding, while searching for single pulse events. Regular timing of over 300 pulsars has yielded seven pulsar glitches and three Fast Radio Bursts during commissioning. UTMOST demonstrates that if sufficient signal processing is applied to voltage streams, innovative science remains possible even in hostile radio frequency environments.
This study estimates the symptomatology of attention deficit–hyperactivity disorder (ADHD) in adult mental health services (AMHS) outpatient clinics.
All consecutive patients attending any of the outpatients’ clinics in Sligo/Leitrim AMHS were invited to participate. Participants completed the Adult ADHD Self-Report Scale (ASRS) and the Wender Utah Rating Scale (WURS) self-report. Clinical notes were reviewed to identify those with a pre-existing ADHD diagnosis.
Of the 822 patients attending the clinics, 62 did not meet inclusion criteria, 97 declined to participate, and 29 had incomplete data on either of the screening scales, leaving 634 (77%) eligible for full study analysis. Mean age was 40.38 years (s.d.: 12.85), and 326 (51.4%) were female. In total, 215 (33.9%) screened positive on the WURS for childhood-onset ADHD and 219 (34.5%) participants scored positive on the ASRS. Applying the more stringent criterion of scoring above the cut-offs on both scales, 131 (20.7%) screened positive on both. Only three (2.3%) had a prior clinical diagnosis.
This preliminary study suggests that rates of ADHD in a general AMHS population may be higher than previously thought. However, given the possible overlap of symptoms with other major psychiatric disorders in adulthood and the risk of recall bias, further research is needed before firm conclusions can be drawn.
Research on post-traumatic stress disorder (PTSD) course finds a substantial proportion of cases remit within 6 months, a majority within 2 years, and a substantial minority persists for many years. Results are inconsistent about pre-trauma predictors.
The WHO World Mental Health surveys assessed lifetime DSM-IV PTSD presence-course after one randomly-selected trauma, allowing retrospective estimates of PTSD duration. Prior traumas, childhood adversities (CAs), and other lifetime DSM-IV mental disorders were examined as predictors using discrete-time person-month survival analysis among the 1575 respondents with lifetime PTSD.
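The distinctive step in discrete-time person-month survival analysis is expanding each respondent into one record per month at risk, with the outcome coded 1 only in the month of recovery; a logistic model on these records then estimates the monthly recovery hazard. A minimal sketch of that expansion, with hypothetical respondents:

```python
# Person-period ("person-month") expansion for discrete-time survival analysis.
# Each respondent contributes one (month, event) record per month at risk;
# event is 1 only in the month recovery occurred, 0 otherwise (censored cases
# never record a 1). Durations below are illustrative, not study data.

def person_months(duration, recovered):
    """Expand one respondent into per-month risk records."""
    records = [(m, 0) for m in range(1, duration)]  # months survived without the event
    records.append((duration, 1 if recovered else 0))  # final observed month
    return records

# Respondent whose PTSD remitted in month 3:
print(person_months(3, True))   # [(1, 0), (2, 0), (3, 1)]
# Respondent still symptomatic when observation ended at month 2 (censored):
print(person_months(2, False))  # [(1, 0), (2, 0)]
```

Pooling these records across respondents, with predictors such as prior traumas and CAs attached to each row, yields the dataset on which the logistic survival model is fitted.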
20%, 27%, and 50% of cases recovered within 3, 6, and 24 months and 77% within 10 years (the longest duration allowing stable estimates). Time-related recall bias was found largely for recoveries after 24 months. Recovery was weakly related to most trauma types other than very low [odds-ratio (OR) 0.2–0.3] early-recovery (within 24 months) associated with purposefully injuring/torturing/killing and witnessing atrocities and very low later-recovery (25+ months) associated with being kidnapped. The significant ORs for prior traumas, CAs, and mental disorders were generally inconsistent between early- and later-recovery models. Cross-validated versions of final models nonetheless discriminated significantly between the 50% of respondents with highest and lowest predicted probabilities of both early-recovery (66–55% v. 43%) and later-recovery (75–68% v. 39%).
We found PTSD recovery trajectories similar to those in previous studies. The weak associations of pre-trauma factors with recovery, also consistent with previous studies, presumably are due to stronger influences of post-trauma factors.
Although mental disorders are significant predictors of educational attainment throughout the entire educational career, most research on mental disorders among students has focused on the primary and secondary school years.
The World Health Organization World Mental Health Surveys were used to examine the associations of mental disorders with college entry and attrition by comparing college students (n = 1572) and non-students in the same age range (18–22 years; n = 4178), including non-students who recently left college without graduating (n = 702) based on surveys in 21 countries (four low/lower-middle income, five upper-middle-income, one lower-middle or upper-middle at the times of two different surveys, and 11 high income). Lifetime and 12-month prevalence and age-of-onset of DSM-IV anxiety, mood, behavioral and substance disorders were assessed with the Composite International Diagnostic Interview (CIDI).
One-fifth (20.3%) of college students had 12-month DSM-IV/CIDI disorders; 83.1% of these cases had pre-matriculation onsets. Disorders with pre-matriculation onsets were more important than those with post-matriculation onsets in predicting subsequent college attrition, with substance disorders and, among women, major depression the most important such disorders. Only 16.4% of students with 12-month disorders received any 12-month healthcare treatment for their mental disorders.
Mental disorders are common among college students, mostly have onsets prior to college entry, are associated with college attrition in the case of pre-matriculation disorders, and are typically untreated. Detection and effective treatment of these disorders early in the college career might reduce attrition and improve educational and psychosocial functioning.
Healthcare provider hands are an important source of intraoperative bacterial transmission events associated with postoperative infection development.
To explore the efficacy of a novel hand hygiene improvement system leveraging provider proximity and individual and group performance feedback in reducing 30-day postoperative healthcare-associated infections via increased provider hourly hand decontamination events.
Randomized, prospective study.
Dartmouth-Hitchcock Medical Center in New Hampshire and UMass Memorial Medical Center in Massachusetts.
Patients undergoing surgery.
Operating room environments were randomly assigned to usual intraoperative hand hygiene or to a personalized, body-worn hand hygiene system. Anesthesia and circulating nurse provider hourly hand decontamination events were continuously monitored and reported. All patients were followed prospectively for the development of 30-day postoperative healthcare-associated infections.
A total of 3,256 operating room environments and patients (1,620 control and 1,636 treatment) were enrolled. The mean (SD) provider hand decontamination event rate achieved was 4.3 (2.9) events per hour, an approximate 8-fold increase in hand decontamination events above that of conventional wall-mounted devices (0.57 events/hour); P<.001. Use of the hand hygiene system was not associated with a reduction in healthcare-associated infections (odds ratio, 1.07 [95% CI, 0.82–1.40], P=.626).
The hand hygiene system evaluated in this study increased the frequency of hand decontamination events without reducing 30-day postoperative healthcare-associated infections. Future work is indicated to optimize the efficacy of this hand hygiene improvement strategy.
Following implementation of automatic end dates for antimicrobial orders to facilitate antimicrobial stewardship at a large, academic children’s hospital, no differences were observed in patient mortality, length of stay, or readmission rates, even among patients with documented bacteremia.
Parotid gland tumours are complex neoplasms with a broad histological range. The parotid gland is also a common site of face and scalp skin cancer metastases.
Parotidectomies performed by the ENT department in the Gold Coast health district from 2006 to 2013 were reviewed.
A total of 158 specimens were examined. Of these, 53.80 per cent were benign and 46.20 per cent were malignant. Pleomorphic adenoma was the most common tumour (29.11 per cent), followed by cutaneous squamous cell carcinoma (23.42 per cent) and Warthin's tumour (12.03 per cent).
Metastatic squamous cell carcinoma accounted for a large proportion of parotid masses in our case series, reflecting the high prevalence of non-melanoma skin cancer in Australia. Primary parotid neoplasms had similar incidence rates to other studies.
Convincing evidence has identified inflammation as an initiator of atherosclerosis, underpinning CVD. We investigated (i) whether dietary inflammation, as measured by the ‘dietary inflammatory index (DII)’, was predictive of 5-year CVD in men and (ii) its predictive ability compared with that of SFA intake alone. The sample consisted of 1363 men enrolled in the Geelong Osteoporosis Study who completed an FFQ at baseline (2001–2006) (excluding participants who were identified as having previous CVD). DII scores were computed from participants’ reported intakes of carbohydrate, micronutrients and glycaemic load. DII scores were dichotomised into a pro-inflammatory diet (positive values) or an anti-inflammatory diet (negative values). The primary outcome was a formal diagnosis of CVD resulting in hospitalisation over the 5-year study period. In total, seventy-six events were observed during the 5-year follow-up period. Men with a pro-inflammatory diet at baseline were twice as likely to experience a CVD event over the study period (OR 2·07; 95 % CI 1·20, 3·55). This association held following adjustment for traditional CVD risk factors and total energy intake (adjusted OR 2·00; 95 % CI 1·03, 3·96). This effect appeared to be stronger with the inclusion of an age-by-DII score interaction. In contrast, SFA intake alone did not predict 5-year CVD events after adjustment for covariates (adjusted OR 1·40; 95 % CI 0·73, 2·70). We conclude that an association exists between a pro-inflammatory diet and CVD in Australian men. CVD clinical guidelines and public health recommendations may have to expand to include dietary patterns in the context of vascular inflammation.
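The odds ratios reported above come from comparing CVD events between the pro- and anti-inflammatory diet groups. A sketch of an unadjusted odds ratio with a Wald 95% confidence interval from a 2 × 2 table follows; the cell counts are hypothetical, since the study reports only the resulting ORs and CIs, not the underlying table.

```python
import math

# Unadjusted odds ratio from a 2x2 table of diet group vs 5-year CVD events.
# a = exposed cases, b = exposed non-cases, c = unexposed cases, d = unexposed non-cases.

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

def or_with_ci95(a, b, c, d):
    """Odds ratio and Wald 95% CI on the log-odds scale."""
    or_ = odds_ratio(a, b, c, d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 50/650 events in the pro-inflammatory group,
# 26/713 in the anti-inflammatory group
print(or_with_ci95(50, 600, 26, 687))
```

The study's adjusted estimates would instead come from a logistic regression with CVD risk factors and total energy intake as covariates; this sketch shows only the crude calculation.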
To compare neuropsychological test performance of Veterans with and without mild traumatic brain injury (MTBI), blast exposure, and posttraumatic stress disorder (PTSD) symptoms. We compared the neuropsychological test performance of 49 Operation Enduring Freedom/Operation Iraqi Freedom (OEF/OIF) Veterans diagnosed with MTBI resulting from combat blast-exposure to that of 20 blast-exposed OEF/OIF Veterans without history of MTBI, 23 OEF/OIF Veterans with no blast exposure or MTBI history, and 40 matched civilian controls. Comparison of neuropsychological test performance across all four participant groups showed a complex pattern of mixed significant and mostly nonsignificant results, with omnibus tests significant for measures of attention, spatial abilities, and executive function. The most consistent pattern was the absence of significant differences between blast-exposed Veterans with MTBI history and blast-exposed Veterans without MTBI history. When blast-exposed Veteran groups with and without MTBI history were aggregated and compared to non–blast-exposed Veterans, there were significant differences for some measures of learning and memory, spatial abilities, and executive function. However, covariation for severity of PTSD symptoms eliminated all significant omnibus neuropsychological differences between Veteran groups. Our results suggest that, although some mild neurocognitive effects were associated with blast exposure, these neurocognitive effects might be better explained by PTSD symptom severity rather than blast exposure or MTBI history alone. (JINS, 2015, 21, 353–363)