Backyard chickens are increasingly popular, and their husbandry varies widely. How backyard chickens are housed may influence the accessibility of chicken feed and water to wild birds, and thus the contact rates between the two groups. Increased contact has implications for pathogen transmission; for instance, Newcastle disease virus or avian influenza virus may be transmitted between wild birds and backyard chickens via contaminated water or feed. Given this potentially increased pathogen risk to wild birds and backyard chickens, we examined which wild bird species are likely to encounter backyard chickens and their resources. We performed a supplemental feeding experiment followed by observations at three sites associated with backyard chickens in North Georgia, USA. At each site, we identified the species of wild birds that: (a) shared habitat with the chickens, (b) had a higher frequency of detection relative to other species and (c) encountered the coops. We identified 14 wild bird species that entered the coops to consume supplemental feed and were considered high-risk for pathogen transmission. Our results provide evidence that contact between wild birds and backyard chickens is frequent and more common than previously believed, which has crucial epidemiological implications for wildlife managers and backyard chicken owners.
The coronavirus disease 2019 (COVID-19) pandemic has challenged the ability of Emergency Medical Services (EMS) providers to maintain personal safety during the treatment and transport of potentially infected patients. Increased rates of COVID-19 infection in EMS providers after patient care exposure, and notably after performing aerosol-generating procedures (AGPs), have been reported. With an already strained workforce seeing rising call volumes and increased risk of AGP-requiring patient presentations, the development of novel devices for the protection of EMS providers is of great importance.
Based on the concept of a negative pressure room, the AerosolVE BioDome is designed to encapsulate the patient and contain aerosolized infectious particles produced during AGPs, making the cabin of an EMS vehicle safer for providers. The objective of this study was to determine the efficacy and safety of the tent in mitigating simulated infectious particle spread in varied EMS transport platforms during AGP utilization.
Methods:
Fifteen healthy volunteers were enrolled and distributed amongst three EMS vehicles: a ground ambulance, an aeromedical-configured helicopter, and an aeromedical-configured jet. Sodium chloride particles were used to simulate infectious particles and particle counts were obtained in numerous locations close to the tent and around the patient compartment. Counts near the tent were compared to ambient air with and without use of AGPs (non-rebreather mask, continuous positive airway pressure [CPAP] mask, and high-flow nasal cannula [HFNC]).
Results:
For all transport platforms with the tent fan off, the particle generator alone and each of the AGPs produced particle counts inside the tent that were significantly higher than ambient particle counts (P <.0001). With the tent fan powered on, particle counts near the tent, where EMS providers are expected to be located, showed no significant elevation over baseline ambient particle counts during use of the particle generator alone or with any of the AGPs, across all transport platforms.
Conclusion:
Development of devices to improve safety for EMS providers to allow for use of all available therapies to treat patients while reducing risk of communicable respiratory disease transmission is of paramount importance. The AerosolVE BioDome demonstrated efficacy in creating a negative pressure environment and workspace around the patient and provided significant filtration of simulated respiratory droplets, thus making the confined space of transport vehicles potentially safer for EMS personnel.
The coronavirus disease 2019 (COVID-19) pandemic has created challenges in maintaining the safety of prehospital providers caring for patients. Reports have shown increased rates of Emergency Medical Services (EMS) provider infection with COVID-19 after patient care exposure, especially while utilizing aerosol-generating procedures (AGPs). Given the increased risk and rising call volumes for AGP-necessitating complaints, development of novel devices for the protection of EMS clinicians is of great importance.
Drawn from the concept of the powered air purifying respirator (PAPR), the AerosolVE helmet creates a personal negative pressure space to contain aerosolized infectious particles produced by patients, making the cabin of an EMS vehicle safer for providers. The helmet was developed initially for use in hospitals and could be of significant use in the prehospital setting. The objective of this study was to determine the efficacy and safety of the helmet in mitigating simulated infectious particle spread in varied EMS transport platforms during AGP utilization.
Methods:
Fifteen healthy volunteers were enrolled and distributed amongst three EMS vehicles: a ground ambulance, a medical helicopter, and a medical jet. Sodium chloride particles were used to simulate infectious particles, and particle counts were obtained in numerous locations close to the helmet and around the patient compartment. Counts near the helmet were compared to ambient air with and without use of AGPs (non-rebreather mask [NRB], continuous positive airway pressure mask [CPAP], and high-flow nasal cannula [HFNC]).
Results:
With the helmet fan off, the particle generator alone and each of the AGPs produced particle counts inside the helmet that were significantly higher than ambient particle counts. With the fan on, there was no significant difference in particle counts around the helmet compared to baseline ambient particle counts. Particle counts at the filter exit averaged less than one despite markedly higher particle counts inside the helmet.
Conclusion:
Given the risk to EMS providers by communicable respiratory diseases, development of devices to improve safety while still enabling use of respiratory therapies is of paramount importance. The AerosolVE helmet demonstrated efficacy in creating a negative pressure environment and provided significant filtration of simulated respiratory droplets, thus making the confined space of transport vehicles potentially safer for EMS personnel.
Delineating the proximal urethra can be critical for radiotherapy planning but is challenging on computerised tomography (CT) imaging.
Materials and methods:
We trialed a novel non-invasive technique to allow visualisation of the proximal urethra using a rapid sequence magnetic resonance imaging (MRI) protocol to visualise the urinary flow in patients voiding during the simulation scan.
Results:
Of the seven patients enrolled, four were able to void during the MRI scan. For these four patients, direct visualisation of urinary flow through the proximal urethra was achieved. The average volume of the proximal urethra contoured on voiding MRI was significantly higher than the proximal urethra contoured on CT, 4·07 and 1·60 cc, respectively (p = 0·02). The proximal urethra location also differed; the Dice coefficient average was 0·28 (range 0–0·62).
Findings:
In this small, proof-of-concept prospective clinical trial, the volume and location of the proximal urethra differed significantly when contoured on a voiding MRI scan compared to that determined by a conventional CT simulation. The shape of the proximal urethra on voiding MRI may be more anatomically correct compared to the proximal urethra shape determined with a semi-rigid catheter in place.
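The spatial agreement between the MRI and CT contours above is summarized with the Dice coefficient. As a minimal sketch of how it is computed (the voxel sets below are hypothetical, not the study's imaging data):

```python
def dice_coefficient(a, b):
    """Dice similarity: 2*|A intersect B| / (|A| + |B|) for two voxel sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty contours agree perfectly by convention
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical voxel index sets for two contours of the same structure
mri_voxels = {(0, 0), (0, 1), (1, 0), (1, 1)}
ct_voxels = {(1, 1), (1, 2)}
print(round(dice_coefficient(mri_voxels, ct_voxels), 2))  # → 0.33
```

A Dice value of 0 means no overlap and 1 means identical contours, so the study's average of 0.28 indicates substantial disagreement in urethra location between the two modalities.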
Mental disorders are common in people living with HIV (PLWH) but often remain untreated. This study aimed to explore the treatment gap for mental disorders in adults followed-up in antiretroviral therapy (ART) programmes in South Africa and disparities between ART programmes regarding the provision of mental health services.
Methods
We conducted a cohort study using ART programme data and linked pharmacy and hospitalisation data to examine the 12-month prevalence of treatment for mental disorders and factors associated with the rate of treatment for mental disorders among adults, aged 15–49 years, followed-up from 1 January 2012 to 31 December 2017 at one private care, one public tertiary care and two public primary care ART programmes in South Africa. We calculated the treatment gap for mental disorders as the discrepancy between the 12-month prevalence of mental disorders in PLWH (aged 15–49 years) in South Africa (estimated based on data from the Global Burden of Disease study) and the 12-month prevalence of treatment for mental disorders in ART programmes. We calculated adjusted rate ratios (aRRs) for factors associated with the treatment rate of mental disorders using Poisson regression.
Results
In total, 182 285 ART patients were followed-up over 405 153 person-years. In 2017, the estimated treatment gap for mental disorders was 40.5% (95% confidence interval [CI] 19.5–52.9) for patients followed-up in private care, 96.5% (95% CI 95.0–97.5) for patients followed-up in public primary care and 65.0% (95% CI 36.5–85.1) for patients followed-up in public tertiary care ART programmes. Rates of treatment with antidepressants, anxiolytics and antipsychotics were 17 (aRR 0.06, 95% CI 0.06–0.07), 50 (aRR 0.02, 95% CI 0.01–0.03) and 2.6 (aRR 0.39, 95% CI 0.35–0.43) times lower in public primary care programmes than in the private sector programmes.
Conclusions
There is a large treatment gap for mental disorders in PLWH in South Africa and substantial disparities in access to mental health services between patients receiving ART in the public vs the private sector. In the public sector and especially in public primary care, PLWH with common mental disorders remain mostly untreated.
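The treatment-gap measure defined in the Methods is a simple discrepancy between two prevalences. A sketch with hypothetical inputs (the abstract does not report the underlying prevalence estimates, so the numbers below are illustrative only):

```python
def treatment_gap(prev_disorder, prev_treated):
    """Treatment gap (%): share of expected prevalent cases receiving no
    treatment, i.e. (P_disorder - P_treated) / P_disorder * 100."""
    return 100 * (prev_disorder - prev_treated) / prev_disorder

# Hypothetical 12-month prevalences among PLWH in one programme:
# 20% expected to have a mental disorder, 0.7% treated for one.
print(round(treatment_gap(0.20, 0.007), 1))  # → 96.5
```

The same formula applied per programme type yields the sector-specific gaps quoted above; the confidence intervals come from the uncertainty in both prevalence estimates.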
Cognitive impairments, which contribute to the profound functional deficits observed in psychotic disorders, have been found to be associated with abnormalities in trial-level cognitive control. However, cognitive tasks operate within the context of sustained cognitive states, which can be assessed with ‘background connectivity’ following the removal of task effects. To date, little is known about the integrity of brain processes supporting the maintenance of a cognitive state in individuals with psychotic disorders. Thus, here we examine background connectivity during executive processing in a cohort of participants with first-episode psychosis (FEP).
Methods
The following fMRI study examined background connectivity of the dorsolateral prefrontal cortex (DLPFC), during working memory engagement in a group of 43 patients with FEP, relative to 35 healthy controls (HC). Findings were also examined in relation to measures of executive function.
Results
The FEP group relative to HC showed significantly lower background DLPFC connectivity with bilateral superior parietal lobule (SPL) and left inferior parietal lobule. Background connectivity between DLPFC and SPL was also positively associated with overall cognition across all subjects and in our FEP group. In comparison, resting-state frontoparietal connectivity did not differ between groups and was not significantly associated with overall cognition, suggesting that psychosis-related alterations in executive networks only emerged during states of goal-oriented behavior.
Conclusions
These results provide novel evidence that while frontoparietal connectivity at rest appears intact in psychosis, it is impaired when engaged during a cognitive state, possibly undermining cognitive control capacities in FEP.
Dietary fibre fermentation in humans and monogastric animals is considered to occur in the hindgut, but it may also occur in the lower small intestine. This study aimed to compare ileal and hindgut fermentation in the growing pig fed a human-type diet using a combined in vivo/in vitro methodology. Five pigs (23 (sd 1·6) kg body weight) were fed a human-type diet. On day 15, pigs were euthanised. Digesta from terminal jejunum and terminal ileum were collected as substrates for fermentation. Ileal and caecal digesta were collected for preparing microbial inocula. Terminal jejunal digesta were fermented in vitro with a pooled ileal digesta inoculum for 2 h, whereas terminal ileal digesta were fermented in vitro with a pooled caecal digesta inoculum for 24 h. The ileal organic matter fermentability (28 %) was not different from hindgut fermentation (35 %). However, the organic matter fermented was 66 % greater for ileal fermentation than hindgut fermentation (P = 0·04). Total numbers of bacteria in ileal and caecal digesta did not differ (P = 0·09). Differences (P < 0·05) were observed in the taxonomic composition. For instance, ileal digesta contained 32-fold greater number of the genus Enterococcus, whereas caecal digesta had a 227-fold greater number of the genus Ruminococcus. Acetate synthesis and iso-valerate synthesis were greater (P < 0·05) for ileal fermentation than hindgut fermentation, but propionate, butyrate and valerate synthesis was lower. SCFA were absorbed in the gastrointestinal tract location where they were synthesised. In conclusion, a quantitatively important degree of fermentation occurs in the ileum of the growing pig fed a human-type diet.
Previous research demonstrates various associations between depression, cardiovascular disease (CVD) incidence and mortality. Differences between studies may occur as a result of different methodologies.
Objectives:
This work investigated the impact of using two different methods to measure depression and two different methods of analysis to establish relationships.
Aims:
The work investigated the association between depression, CVD incidence (CVDI) and mortality from coronary heart disease (MCHD), smoking related conditions (MSRC), and all causes (MALL), in a major population study using depression measured from a validated scale and a depression measure derived by factor analysis, and analyses based on continuous data and grouped data.
Methods:
Data from the PRIME Study (N=9,798 men) on depression and ten year CVD incidence and mortality were analysed using Cox proportional hazards models.
Results:
Using continuous data, no relationships with CVDI were found, but both measures of depression resulted in the emergence of positive associations between depression and mortality (MCHD, MSRC, MALL). Using grouped data, no associations with CVDI or MCHD were found, and associations between the measure derived from factor analysis and MSRC and MALL were also lost. Positive associations were found only between depression measured using the validated items, MSRC and MALL.
Conclusions:
These data demonstrate a possible association between depression and mortality but detecting this association is dependent on the methodology used. Different findings based on methodology present clear problems for the determination of relationships. The differences here suggest the preferential use of validated scales and suggest against over-reduction via factor analysis and grouping.
As unemployment rises and farming declines at the present time, men in the Northern Region face great difficulty in their livelihoods and in finding sources of income, which bears directly on their sense of manhood. Focusing on the traditional dances associated with the Dagbamba Sapashin (warriors), this paper examines the relationship between masculinity and traditional dance. Gun Gon is a powerful traditional performance that displays the masculine duties of the Sapashin within Dagbon, and it expresses and reinforces what masculinity means for the men who perform it.
Correction of tetralogy of Fallot during infancy usually eliminates the anaesthetic risks associated with the defect. In the rare cases where an uncorrected defect persists into adulthood, anaesthetic management during non-cardiac surgery can be challenging. We describe the use of continuous spinal anaesthesia to successfully circumvent the operative risk of major abdominal surgery in an adult patient with uncorrected tetralogy of Fallot.
Diverse theoretical perspectives suggest that place plays an important role in human behavior. One recent perspective proposes that habitual and recursive use of places among humans may be an emergent property of obligate tool use by our species. In this view, the costs of tool use are reduced by preferential occupation of previously occupied places where cultural materials have been discarded. Here we use the model to generate five predictions for ethnographic mobility patterns. We then test the predictions against observations made during one month of coresidence with a residentially mobile Dukha family in the Mongolian Taiga. We show that (1) there is a strong tendency to occupy previously used camps, (2) previously deposited materials are habitually recycled, (3) reoccupation of places transcends kinship, (4) occupational hiatuses can span decades or longer, and (5) the distribution of occupation intensity among camps is highly skewed such that most camps are not intensively reoccupied whereas a few camps experience extremely high reoccupation intensity. These findings complement previous archaeological findings and support the conclusion that the constructed dimensions of human habitats exert a strong influence on mobility patterns in mobile societies.
To evaluate changes in outpatient fluoroquinolone (FQ) and nitrofurantoin (NFT) use and resistance among E. coli isolates after a change in institutional guidance to use NFT over FQs for acute uncomplicated cystitis.
We compared 2 time periods: January 2003–June 2007 when FQs were recommended as first-line therapy, and July 2007–December 2012, when NFT was recommended. The main outcomes were changes in FQ and NFT use and FQ- and NFT-resistant E. coli by time-series analysis.
RESULTS
Overall, 5,714 adults treated for acute cystitis and 11,367 outpatient E. coli isolates were included in the analysis. After the change in prescribing guidance, there was an immediate 26% (95% CI, 20%–32%) decrease in FQ use (P<.001), and a nonsignificant 6% (95% CI, −2% to 15%) increase in NFT use (P=.12); these changes were sustained over the postintervention period. Oral cephalosporin use also increased during the postintervention period. There was a significant decrease in FQ-resistant E. coli of −0.4% per quarter (95% CI, −0.6% to −0.1%; P=.004) between the pre- and postintervention periods; however, a change in the trend of NFT-resistant E. coli was not observed.
CONCLUSIONS
In an integrated healthcare system, a change in institutional guidance for acute uncomplicated cystitis was associated with a reduction in FQ use, which may have contributed to a stabilization in FQ-resistant E. coli. Increased nitrofurantoin use was not associated with a change in NFT resistance.
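The "immediate 26% decrease" above is the kind of level change a segmented (interrupted time-series) regression estimates. A self-contained sketch on synthetic quarterly data; the series, the 13-unit drop, and the variable names are all illustrative, not the study's data:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ols(X, y):
    """Ordinary least squares via the normal equations X'X b = X'y."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Hypothetical quarterly prescribing series with a 13-unit level drop at quarter 10
t0 = 10
ts = list(range(20))
y = [50 - 0.5 * t if t < t0 else 50 - 0.5 * t - 13 for t in ts]
# Design: intercept, pre-existing trend, level change, post-change trend change
X = [[1.0, t, float(t >= t0), (t - t0) * float(t >= t0)] for t in ts]
b0, b1, b2, b3 = ols(X, y)
print(round(b2, 2))  # → -13.0, the immediate level change at the guidance switch
```

In practice the study would also model autocorrelation and report confidence intervals, but the level-change coefficient (b2) and trend-change coefficient (b3) are the quantities behind the percentages reported above.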
Climate change is a growing international concern, and it is well established that the release of greenhouse gases (GHG) is a contributing factor. So far, within animal production there has been little concerted effort on long-term breeding strategies to mitigate GHG emissions from ruminants. In recent years, several consortia have been formed to collect and combine data for genetic evaluation. The discussions of these consortia focus on: (1) what are the methane-determining factors; (2) what are the genetic parameters for methane emissions; (3) what proxies can be used, and what is their association with methane emission; and (4) how to move on with breeding for lower-emitting animals? The methane-determining factors can be divided into four groups: (1) rumen microbial population, (2) feed intake and diet composition, (3) host physiology and (4) host genetics. The genetic parameters show that enteric methane is a heritable trait and that it is highly genetically correlated with dry matter intake. So far, the most useful proxies relate to feed intake, milk mid-infrared (mid-IR) spectral data and fatty acids in the milk. To move on with genetic evaluation and ranking of animals for methane emission, it is crucial to make measurements on commercial farms. To make that possible, it will be necessary to develop phenotypes that the farmer can use to optimise production at the farm level. It is also crucial to develop equipment that allows measurements without interfering with everyday routines, or to identify proxies that are highly related to methane and could easily be measured on a large scale. International collaboration is essential to make progress in this area, both in terms of sharing ideas, experiences and phenotypes, and in terms of reaching a consensus on which phenotype to collect and select for.
Original studies published over the last decade regarding time trends in dementia report mixed results. The aims of the present study were to use linked administrative health data for the province of Saskatchewan for the period 2005/2006 to 2012/2013 to: (1) examine simultaneous temporal trends in annual age- and sex-specific dementia incidence and prevalence among individuals aged 45 and older, and (2) stratify the changes in incidence over time by database of identification.
Methods:
Using a population-based retrospective cohort study design, data were extracted from seven provincial administrative health databases linked by a unique anonymized identification number. Individuals 45 years and older at first identification of dementia between April 1, 2005 and March 31, 2013 were included, based on case definition criteria met within any one of four administrative health databases (hospital, physician, prescription drug, and long-term care).
Results:
Between 2005/2006 and 2012/2013, the 12-month age-standardized incidence rate of dementia declined significantly by 11.07% and the 12-month age-standardized prevalence increased significantly by 30.54%. The number of incident cases decreased from 3,389 to 3,270 and the number of prevalent cases increased from 8,795 to 13,012. Incidence rate reductions were observed in every database of identification.
Conclusions:
We observed a simultaneous trend of decreasing incidence and increasing prevalence of dementia over a relatively short 8-year time period from 2005/2006 to 2012/2013. These trends suggest that the average survival time with dementia is lengthening. Continued observation of these time trends is warranted given the short study period.
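The age-standardized rates above are typically obtained by direct standardization: each age-specific rate is weighted by a fixed standard population's share, so rates from different years are comparable despite population ageing. A sketch with hypothetical strata (not Saskatchewan's actual counts):

```python
def age_standardized_rate(cases, person_years, std_weights):
    """Direct standardization: weighted sum of age-specific rates using a
    fixed standard population's weights (weights sum to 1)."""
    return sum(w * (c / py) for c, py, w in zip(cases, person_years, std_weights))

# Hypothetical three age strata (45-64, 65-79, 80+): cases and person-years
cases = [100, 400, 600]
pyears = [200_000, 100_000, 50_000]
weights = [0.6, 0.3, 0.1]  # standard population's shares per stratum
rate = age_standardized_rate(cases, pyears, weights)
print(round(rate * 1000, 2))  # → 2.7 per 1,000 person-years
```

Because the weights are held fixed across study years, a decline in this quantity reflects a genuine fall in age-specific risk rather than a shift in the age structure.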
In August 2013, a nationwide vaccination campaign with bivalent oral polio vaccine (bOPV) was initiated after isolation of wild-type poliovirus type 1 (WPV-1) in routine sewage surveillance in Israel. The campaign started in the Southern district and later extended to the entire country. This study examined the association between socioeconomic status (SES), and compliance with bOPV vaccine during the campaign. Nationwide data relating to SES by geographical cluster were correlated with vaccine coverage rates in the same areas. All analyses were conducted separately for Jews and Arabs. Coverage with the bOPV vaccination campaign in the Arab population (92·4%) was higher than in the Jewish population (59·2%). This difference was consistently present in all SES clusters. In the Jewish population there was an inverse correlation between SES and vaccination coverage rates (R = −0·93, P < 0·001). Lower vaccination coverage with supplemental vaccine activities in higher SES groups is a challenge that needs to be addressed in future public health events and emergencies in order to achieve satisfactory protection rates for the public.
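The inverse SES–coverage relationship is summarized with a Pearson correlation coefficient. A stdlib sketch on invented cluster data (the R = −0·93 reported above comes from the study's own SES clusters, not these numbers):

```python
def pearson_r(x, y):
    """Pearson correlation between paired observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical SES cluster ranks vs. vaccination coverage (%):
# higher SES cluster, lower coverage, as the abstract describes
ses = [1, 2, 3, 4, 5]
coverage = [72, 65, 63, 55, 50]
print(round(pearson_r(ses, coverage), 2))  # → -0.99
```

A value near −1 indicates a near-linear inverse relationship between SES cluster and coverage, which is what motivates targeting higher-SES groups in future campaigns.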
Camera-based systems for dairy cattle have been studied intensively over recent years. In contrast to the present study, previous work presented single-camera systems with a limited range of applications, mostly using 2D cameras. This study presents current steps in the development of a camera system comprising multiple 3D cameras (six Microsoft Kinect cameras) for monitoring purposes in dairy cows. An early prototype was constructed, and alpha versions of software for recording, synchronizing, sorting and segmenting images and for transforming the 3D data into a joint coordinate system have already been implemented. This study introduced two-dimensional wavelet transforms as a method for object recognition and surface analysis. The method is explained in detail, and four differently shaped wavelets were tested with respect to their reconstruction error on Kinect-recorded depth maps from different camera positions. The images’ high-frequency parts, reconstructed from wavelet decompositions using the Haar and the biorthogonal 1.5 wavelets, were statistically analyzed with regard to the effects of image fore- or background and of cows’ or persons’ surfaces. Furthermore, binary classifiers based on the local high frequencies were implemented to decide whether a pixel belongs to the image foreground and whether it is located on a cow or a person. Classifiers distinguishing between image regions showed high (⩾0.8) values of area under the receiver operating characteristic curve (AUC). Classification by species showed maximal AUC values of 0.69.
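The classifier performance above is reported as AUC; one way to compute it without sweeping a ROC curve is the rank-based (Mann–Whitney) formulation. A sketch on hypothetical per-pixel scores (not the study's wavelet features):

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney statistic: the probability that a randomly
    chosen positive scores higher than a randomly chosen negative (ties count half)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical local high-frequency magnitudes and foreground labels per pixel
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0]
print(round(auc(scores, labels), 2))  # → 0.89
```

An AUC of 0.5 is chance performance, which puts the study's region classifiers (⩾0.8) well above chance and the species classifier (0.69) only modestly so.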
Measuring and mitigating methane (CH4) emissions from livestock is of increasing importance for the environment and for policy making. Potentially, the most sustainable way of reducing enteric CH4 emission from ruminants is through the estimation of genomic breeding values to facilitate genetic selection. There is potential for adopting genetic selection, and in the future genomic selection, for reduced CH4 emissions from ruminants. From this review it has been observed that both CH4 emissions and production (g/day) are heritable and repeatable traits. CH4 emissions are strongly related to feed intake both in the short term (minutes to several hours) and over the medium term (days). When measured over the medium term, CH4 yield (MY, g CH4/kg dry matter intake) is a heritable and repeatable trait, albeit with less genetic variation than for CH4 emissions. CH4 emissions of individual animals are moderately repeatable across diets, and across feeding levels, when measured in respiration chambers. Repeatability is lower when short-term measurements are used, possibly due to variation in the time and amount of feed ingested prior to the measurement. However, while repeated measurements add value, it is preferable that the measures be separated by at least 3 to 14 days. This temporal separation of measurements needs to be investigated further. Provided the above issue can be resolved, short-term (over minutes to hours) measurements of CH4 emissions show promise, especially in systems where animals are fed ad libitum and the frequency of meals is high. However, we believe that for short-term measurements to be useful for genetic evaluation, a number (between 3 and 20) of measurements will be required over an extended period of time (weeks to months). There are opportunities for using short-term measurements in standardised feeding situations, such as breath ‘sniffers’ attached to milking parlours or total mixed ration feeding bins, to measure CH4.
Genomic selection has the potential to reduce both CH4 emissions and MY, but measurements on thousands of individuals will be required. This includes the need for combined resources across countries in an international effort, emphasising the need to acknowledge the impact of animal and production systems on measurement of the CH4 trait during design of experiments.
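Repeatability as used above is the intraclass correlation: the share of total variance attributable to between-animal variance, estimable from a one-way ANOVA on repeated measurements. A sketch with invented CH4 records, assuming a balanced design (equal records per animal):

```python
def repeatability(groups):
    """One-way ANOVA estimate of repeatability (intraclass correlation):
    between-animal variance / (between-animal + within-animal variance),
    for k animals each measured n times."""
    k = len(groups)            # number of animals
    n = len(groups[0])         # repeated measurements per animal
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)       # between-animal mean square
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (k * (n - 1))
    s2_between = (msb - msw) / n
    return s2_between / (s2_between + msw)

# Hypothetical daily CH4 measurements (g/day) for three animals
animals = [[350, 360, 355], [400, 410, 405], [300, 310, 305]]
print(round(repeatability(animals), 2))  # → 0.99
```

High repeatability means few records per animal suffice to rank animals, whereas the lower repeatability of short-term measurements is why the 3 to 20 repeated records mentioned above become necessary.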
We conducted a time-series analysis to evaluate the impact of the antimicrobial stewardship program (ASP) over a 6.25-year period (July 1, 2008–September 30, 2014) while controlling for trends during a 3-year preintervention period (July 1, 2005–June 30, 2008). The primary outcome measures were total antibacterial and antipseudomonal use in days of therapy (DOT) per 1,000 patient-days (PD). Secondary outcomes included antimicrobial costs and resistance, hospital-onset Clostridium difficile infection, and other patient-centered measures.
RESULTS
During the preintervention period, total antibacterial and antipseudomonal use were declining (−9.2 and −5.5 DOT/1,000 PD per quarter, respectively). During the stewardship period, both continued to decline, although at lower rates (−3.7 and −2.2 DOT/1,000 PD, respectively), resulting in a slope change of 5.5 DOT/1,000 PD per quarter for total antibacterial use (P=.10) and 3.3 DOT/1,000 PD per quarter for antipseudomonal use (P=.01). Antibiotic expenditures declined markedly during the stewardship period (−$295.42/1,000 PD per quarter, P=.002). There were variable changes in antimicrobial resistance and few apparent changes in C. difficile infection and other patient-centered outcomes.
CONCLUSION
In a hospital with low baseline antibiotic use, implementation of an ASP was associated with sustained reductions in total antibacterial and antipseudomonal use and declining antibiotic expenditures. Common ASP outcome measures have limitations.