The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m²) and size at birth, and additionally to address other research questions such as the long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world with height and weight measures on twins and information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there are a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI changed systematically from infancy to old age. Remarkably, only minor differences in heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we examined associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration for addressing gene-by-exposure interactions that require large sample sizes and for examining the effects of different exposures across time, geographical regions and socioeconomic status.
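Twin-based heritability estimates such as those reported here are commonly approximated from the difference between monozygotic (MZ) and dizygotic (DZ) twin correlations via Falconer's formula. The sketch below illustrates this standard ACE decomposition with made-up correlations; it is not the CODATwins modelling itself, which uses formal variance-component models.

```python
def falconer_ace(r_mz, r_dz):
    """Partition trait variance with Falconer's twin-correlation
    approximation (illustrative only; r values are hypothetical).

    Returns (h2, c2, e2): additive genetic, shared-environment,
    and unique-environment variance proportions."""
    h2 = 2 * (r_mz - r_dz)   # additive genetic variance
    c2 = 2 * r_dz - r_mz     # shared (common) environment
    e2 = 1 - r_mz            # unique environment + measurement error
    return h2, c2, e2

# Hypothetical correlations: MZ twins r = 0.9, DZ twins r = 0.5
print(falconer_ace(0.9, 0.5))  # h2 = 0.8, c2 = 0.1, e2 = 0.1
```

The three components sum to 1 by construction, which is why only two correlations are needed.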
Norovirus, a major cause of gastroenteritis in people of all ages worldwide, was first reported in South Korea in 1999. The most common causal agents of pediatric acute gastroenteritis are norovirus and rotavirus. While vaccination has reduced the pediatric rotavirus infection rate, no norovirus vaccine has been developed; prediction and prevention of norovirus are therefore very important. Norovirus is divided into genogroups GI–GVII, with GII.4 being the most prevalent. However, in 2012–2013, GII.17 showed a higher incidence than GII.4, and a novel variant, GII.P17-GII.17, appeared. In this study, 204 stool samples collected in 2013–2014 were screened by reverse transcriptase-polymerase chain reaction; 11 GI (5.39%) and 45 GII (22.06%) noroviruses were identified. GI.4, GI.5, GII.4, GII.6 and GII.17 were detected. The whole genomes of the three GII.17 noroviruses were sequenced; each consists of three open reading frames of 5109, 1623 and 780 bp. Compared with 20 GII.17 strains isolated in other countries, we observed numerous changes in the protruding P2 domain of VP1 in the Korean GII.17 viruses. Our study provides genome information that might aid epidemic prevention, epidemiology studies and vaccine development.
This study evaluated tumour necrosis factor-α, interleukins 10 and 12, and interferon-γ levels, peripheral blood mononuclear cells, and cluster of differentiation 11c and 86 expression in unilateral sudden sensorineural hearing loss.
Twenty-four patients with unilateral sudden sensorineural hearing loss, and 24 individuals with normal hearing and no history of sudden sensorineural hearing loss (who were attending the clinic for other problems), were enrolled. Peripheral blood mononuclear cells were isolated, and cluster of differentiation 11c and 86 expression was analysed. Plasma and supernatant levels of tumour necrosis factor-α, interferon-γ, and interleukins 10 and 12 were measured.
There were no significant differences with respect to age and gender. Monocyte population, mean tumour necrosis factor-α level and cluster of differentiation 86 expression were significantly increased in the study group compared to the control group. However, interferon-γ and interleukin 12 levels were significantly decreased. The difference in mean interleukin 10 level was not significant.
Increases in tumour necrosis factor-α level and monocyte population might play critical roles in sudden sensorineural hearing loss. This warrants detailed investigation and further studies on the role of dendritic cells in sudden sensorineural hearing loss.
In this study, the combustion instability and emission characteristics of flames with different H2/CH4 compositions were investigated in a partially premixed model gas turbine combustor. A mode shift in the instability frequency occurred under varying experimental conditions, from the first to the seventh longitudinal mode of the combustor, and a parametric study was conducted to determine the reasons for this shift using the combustor length, a factor that determines the longitudinal mode frequency, as the main parameter. Furthermore, heat load and fuel composition (H2 ratio) were considered as parameters to compare the phenomenon under different conditions. Cantera simulations with the GRI-Mech 3.0 mechanism, OH chemiluminescence and the Abel inversion process were applied to analyse the frequency mode shift. NOx emissions, which arose through the thermal NOx mechanism, increased with increasing heat load and H2 ratio. The instability frequency shifted from the first to the seventh mode as the H2 ratio increased in the H2/CH4 mixture. However, 100% H2 fuel did not cause combustion instability because it has a higher burning velocity and extinction stretch rate than CH4. Furthermore, the laminar flame speed influenced the frequency mode shift. These phenomena were confirmed by the flame shapes. The Abel inversion process was applied to obtain flame cross sections from averaged OH chemiluminescence images, and stable and unstable flames were identified from the radial profile of OH concentration. The combustor length was found not to influence the frequency mode shift, whereas the H2 ratio significantly influenced both the mode shift and the flame shape. The results of this experimental study can help ensure the reliable operation of gas turbine systems in SNG (synthetic natural gas) plants.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
The white-backed planthopper, Sogatella furcifera (Horváth) (Hemiptera: Delphacidae), has emerged as a serious rice pest in Asia. In the present study, 12 microsatellite markers were employed to investigate the genetic structure, diversity and migration routes of 43 populations sampled from seven Asian countries (Bangladesh, China, Korea, Laos, Nepal, Thailand and Vietnam). Isolation-by-distance analysis revealed a significant positive correlation between genetic and geographic distances (Mantel test, r² = 0.4585, P = 0.01), indicating the role of geographic isolation in the genetic structure of S. furcifera. A population assignment test using the first-generation migrant detection method (threshold α = 0.01) revealed southern China and northern Vietnam as the main sources of S. furcifera in Korea. Nepal and Bangladesh might be additional potential sources via interconnection with Vietnamese populations. This paper provides useful data on the migration route and origin of S. furcifera in Korea and will contribute to planthopper resistance management.
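The Mantel test cited above correlates two distance matrices and assesses significance by permuting one matrix's rows and columns together, since individual distances are not independent. A stdlib-only sketch of the mechanics, with illustrative matrices rather than the authors' genetic and geographic data:

```python
import random

def pearson(x, y):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def mantel(d1, d2, n_perm=999, seed=1):
    """Permutation Mantel test between two symmetric distance matrices
    (lists of lists). Illustrative sketch, one-tailed for positive r."""
    n = len(d1)
    def upper(m, order):
        # flatten the upper triangle after reordering rows/columns
        return [m[order[i]][order[j]] for i in range(n) for j in range(i + 1, n)]
    ident = list(range(n))
    x = upper(d1, ident)
    r_obs = pearson(x, upper(d2, ident))
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        order = ident[:]
        rng.shuffle(order)  # permute rows and columns jointly
        if pearson(x, upper(d2, order)) >= r_obs:
            hits += 1
    p = (hits + 1) / (n_perm + 1)
    return r_obs, p
```

With a small 4×4 example where one matrix is an exact multiple of the other, r comes out as 1.0 and the permutation p-value is small, limited by the number of distinct orderings.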
Chlamydia trachomatis (CT) infections remain highly prevalent. CT reinfection occurs frequently within months after treatment, likely contributing to the sustained high prevalence of CT infection. A small number of studies have suggested that CT reinfection is associated with a lower organism load, but it is unclear whether CT load at the time of treatment influences CT reinfection risk. In this study, women presenting for treatment of a positive CT screening test were enrolled, treated and returned for 3- and 6-month follow-up visits. CT organism loads were quantified at each visit. We evaluated for an association of CT bacterial load at initial infection with reinfection risk and investigated factors influencing the CT load at baseline and follow-up in those with CT reinfection. We found no association of initial CT load with reinfection risk. We found a significant decrease in the median CT load from baseline to follow-up in those with reinfection (5.6 vs. 4.5 log10 CT/ml; P = 0.015). Upon stratification of reinfected subjects based upon the presence or absence of a history of CT infection prior to their infection at the baseline visit, we found a significant decline in CT load from baseline to follow-up (5.7 vs. 4.3 log10 CT/ml; P = 0.021) exclusively in patients with a history of CT infection prior to our study. Our findings suggest that repeated CT infections may lead to the development of partial immunity against CT.
We evaluated the utility of vancomycin-resistant Enterococcus (VRE) surveillance by varying 2 parameters: admission versus weekly surveillance and perirectal swabbing versus stool sampling.
Prospective, patient-level surveillance program of incident VRE colonization.
Liver transplant surgical intensive care unit (SICU) of a tertiary-care referral medical center with a high prevalence of VRE.
All patients admitted to the SICU from June to August 2015.
We conducted a point-prevalence estimate followed by admission and weekly surveillance by perirectal swabbing and/or stool sampling. Incident colonization was defined as a negative screen followed by positive surveillance. VRE was detected by culture on Remel Spectra VRE chromogenic agar. Microbiologically confirmed VRE bloodstream infections (BSIs) were tracked for 2 months. Statistical analyses were performed using the McNemar test, the Fisher exact test, the t test, and the χ2 test.
In total, 91 patients underwent VRE surveillance testing. The point prevalence of VRE colonization was 60.9%; VRE prevalence on admission was 30.1%. Weekly surveillance identified an additional 7 of 28 patients (25.0%) with incident colonization. VRE BSIs were more common in VRE-colonized patients than in noncolonized patients (8 of 43 vs 2 of 48; P=.028). In a direct comparison, perirectal swabs were more sensitive than stool samples in detecting VRE (64 of 67 vs 56 of 67; P=.023). Compliance with perirectal swabbing was 89% (201 of 226) compared to 56% (127 of 226) for stool collection (P≤0.001).
We recommend weekly VRE surveillance over admission-only screening in high-burden units such as liver transplant SICUs. Perirectal swabs had greater collection compliance and sensitivity than stool samples, making them the preferred methodology. Further work may have implications for antimicrobial stewardship and infection control.
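Because each patient contributed both a perirectal swab and a stool sample, the swab-versus-stool sensitivity comparison above is paired, which is why the McNemar test applies: only discordant pairs (positive by one method, negative by the other) carry information. A minimal exact version follows, using hypothetical discordant counts for illustration; the study's actual discordant counts are not reported in the abstract.

```python
from math import comb

def mcnemar_exact(b, c):
    """Two-sided exact McNemar test on discordant pair counts.
    b = positive by method 1 only, c = positive by method 2 only.
    Under H0, b ~ Binomial(b + c, 0.5)."""
    n = b + c
    k = min(b, c)
    # double the smaller binomial tail; cap at 1 for the b == c case
    p = 2 * sum(comb(n, i) for i in range(k + 1)) * 0.5 ** n
    return min(p, 1.0)

# Hypothetical: 9 swab-only positives vs. 1 stool-only positive
print(mcnemar_exact(9, 1))  # 0.021484375
```

Note how the concordant counts (both positive or both negative) never enter the calculation.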
Planning mental health carer services requires information about the number of carers, their characteristics, service use and unmet support needs. Available Australian estimates vary widely due to different definitions of mental illness and the types of carers included. This study aimed to provide a detailed profile of Australian mental health carers using a nationally representative household survey.
The number of mental health carers, characteristics of carers and their care recipients, caring hours and tasks provided, service use and unmet service needs were derived from the national 2012 Survey of Disability, Ageing and Carers. Co-resident carers of adults with a mental illness were compared with those caring for people with physical health and other cognitive/behavioural conditions (e.g., autism, intellectual disability, dementia) on measures of service use, service needs and aspects of their caring role.
In 2012, there were 225 421 co-resident carers of adults with mental illness in Australia, representing 1.0% of the population, and an estimated further 103 813 mental health carers not living with their care recipient. The majority of co-resident carers supported one person with mental illness, usually their partner or adult child. Mental health carers were more likely than physical health carers to provide emotional support (68.1% v. 19.7% of carers) and less likely to assist with practical tasks (64.1% v. 86.6%) and activities of daily living (31.9% v. 48.9%). Of co-resident mental health carers, 22.5% or 50 828 people were confirmed primary carers – the person providing the most support to their care recipient. Many primary mental health carers (37.8%) provided more than 40 h of care per week. Only 23.8% of primary mental health carers received government income support for carers and only 34.4% received formal service assistance in their caring role, while 49.0% wanted more support. Significantly more primary mental health than primary physical health carers were dissatisfied with received services (20.0% v. 3.2%), and 35.0% did not know what services were available to them.
Results reveal a sizable number of mental health carers with unmet needs in the Australian community, particularly with respect to financial assistance and respite care, and that these carers are poorly informed about available supports. The prominence of emotional support and their greater dissatisfaction with services indicate a need to better tailor carer services. If implemented carefully, recent Australian reforms including the Carer Gateway and National Disability Insurance Scheme hold promise for improving mental health carer supports.
Identifying factors that influence the functional outcome is an important goal in schizophrenia research. The 22q11.2 deletion syndrome (22q11DS) is a unique genetic model with high risk (20–25%) for schizophrenia. This study aimed to identify potentially targetable domains of neurocognitive functioning associated with functional outcome in adults with 22q11DS.
We used comprehensive neurocognitive test data available for 99 adults with 22q11DS (n = 43 with schizophrenia) and principal component analysis to derive four domains of neurocognition (Verbal Memory, Visual and Logical Memory, Motor Performance, and Executive Performance). We then investigated the association of these neurocognitive domains with adaptive functioning using Vineland Adaptive Behavior Scales data and a linear regression model that accounted for the effects of schizophrenia status and overall intellectual level.
The regression model explained 46.8% of the variance in functional outcome (p < 0.0001). Executive Performance was significantly associated with functional outcome (p = 0.048). Age and schizophrenia were also significant factors. The effects of Executive Performance on functioning did not significantly differ between those with and without psychotic illness.
The findings provide the impetus for further studies to examine the potential of directed (early) interventions targeting Executive Performance to improve long-term adaptive functional outcome in individuals with, or at high risk for, schizophrenia. Moreover, the neurocognitive test profiles may benefit caregivers and clinicians by providing insight into the relative strengths and weaknesses of individuals with 22q11DS, with and without psychotic illness.
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most patients with meningiomas, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies of adult patients were considered. The QOL tools used in the various studies were analyzed to identify prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools assessed social and physical functioning and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared with observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable within the social and physical functioning domains. All of these findings must be interpreted with great caution due to substantial clinical heterogeneity, limited generalizability and risk of bias. For meningioma patients, the ideal QOL questionnaire would present outcomes that can be easily measured, presented and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard and simple meningioma-specific survey can be prospectively developed and validated.
Brain tumor behavior is driven by aberrations in the genome and epigenome. Many of these changes, such as IDH mutations in diffuse low-grade glioma (DLGG), are common amongst the same class of tumour and can be incorporated into diagnostic criteria. However, any given tumor may have other, less common genomic aberrations that are essential for its biological behavior and may inform on underlying aberrant cellular pathways and potential therapeutic agents. Precision oncology is a genomics-based approach that profiles these alterations to better manage cancer patients; it has established itself within the practice of oncology and is slowly making its way into neuro-oncology. BC Cancer's Personalized OncoGenomics (POG) program has profiled 16 adult tumours originating from the central nervous system using whole genome and transcriptome analysis (WGTA), for the first time within a meaningful clinical timeframe and setting. As expected, primary genomic drivers were consistent with the respective diagnoses, whereas secondary drivers were found to be unique to each tumour. Although these analyses did not result in altered clinical management for these patients, primarily because of the limited availability of drugs or clinical trials, they highlight the heterogeneity of secondary drivers in cancers and provide clinicians with meaningful biological information. Lastly, the data generated by POG have highlighted the frequency and complexity of novel driver fusions, which are predicted to behave similarly to canonical driver events in their respective tumours. The information available to clinicians through POG has provided valuable insight into the biology of each unique tumour.
Background: With advancements in technology, the use of video as a pedagogical method in medical education has gained popularity and may aid in teaching clinical skills. In the UBC MD program, videos have been used to assist in teaching the neurological exam for several decades, but the currently available videos are outdated and not of contemporary quality. Methods: Drawing upon the cognitive theory of multimedia learning of Mayer and Moreno (2003), which describes methods to maximize learning by minimizing cognitive load, we developed a tool to systematically assess pedagogical videos. We inventoried twelve existing neurology videos and analyzed their use of methods such as weeding (removing extraneous information), signalling (visually highlighting important information) and chunking (grouping similar information together). Results: Generally, older videos had poor audiovisual quality that introduced extraneous load, while more current videos had higher production value, albeit inconsistent with the depth of their content. We therefore produced a new three-part neurological exam video series: we wrote storyboards, filmed with a focus on visually depicting the exam and findings, and edited to elucidate relevant physiological concepts. Conclusions: The end product has been adopted by the UBC MD program and can be shared with other programs that may wish to adopt it.
Childhood obesity rates are higher among Indigenous than non-Indigenous Australian children. It has been hypothesized that early-life influences, beginning with the intrauterine environment, predict the development of obesity in offspring. The aim of this paper was to assess, in 227 mother–child dyads from the Gomeroi gaaynggal cohort, associations between prematurity, Gestation-Related Optimal Weight (GROW) centiles, maternal adiposity (percentage body fat, visceral fat area), maternal non-fasting plasma glucose levels (measured at a mean gestational age of 23.1 weeks) and offspring BMI and adiposity (abdominal circumference, subscapular skinfold thickness) in early childhood (mean age 23.4 months). Maternal non-fasting plasma glucose concentrations were positively associated with infant birth weight (P=0.005) and GROW customized birth weight centiles (P=0.008). There were significant associations between maternal percentage body fat (P=0.02) and visceral fat area (P<0.005) and infant body weight in early childhood. Body mass index (BMI) in early childhood was significantly higher in offspring born preterm compared with those born at term (P=0.03). GROW customized birth weight centiles were significantly associated with body weight (P=0.01), BMI (P=0.007) and abdominal circumference (P=0.039) in early childhood. Our findings suggest that being born preterm, large for gestational age or exposed to an obesogenic intrauterine environment with higher maternal non-fasting plasma glucose concentrations is associated with increased obesity risk in early childhood. Future strategies should aim to reduce the prevalence of overweight/obesity in women of child-bearing age and emphasize the importance of optimal glycemia during pregnancy, particularly in Indigenous women.
Introduction: Gastroenteritis accounts for 1.7 million emergency department visits by children annually in the United States. We conducted a double-blind trial to determine whether twice-daily probiotic administration for 5 days improves outcomes. Methods: 886 children aged 3–48 months with gastroenteritis were enrolled in six Canadian pediatric emergency departments. Participants were randomly assigned to twice-daily Lactobacillus rhamnosus R0011 and Lactobacillus helveticus R0052 (4.0 × 10^9 CFU, in a 95:5 ratio) or placebo. The primary outcome was development of moderate-to-severe disease within 14 days of randomization, defined by a Modified Vesikari Scale score ≥9. Secondary outcomes included duration of diarrhea and vomiting, subsequent physician visits and adverse events. Results: Moderate-to-severe disease occurred in 108 (26.1%) participants administered probiotics and 102 (24.7%) participants allocated to placebo (OR 1.06; 95% CI 0.77–1.46; P=0.72). After adjustment for site, age, and frequency of vomiting and diarrhea, treatment assignment did not predict moderate-to-severe disease (OR 1.11; 95% CI 0.80–1.56; P=0.53). In the probiotic versus placebo groups, there were no differences in the median duration of diarrhea [52.5 (18.3, 95.8) vs. 55.5 (20.2, 102.3) hours; P=0.31], vomiting [17.7 (0, 58.6) vs. 18.7 (0, 51.6) hours; P=0.18], physician visits (30.2% vs. 26.6%; OR 1.19; 95% CI 0.87–1.62; P=0.27), or adverse events (32.9% vs. 36.8%; OR 0.83; 95% CI 0.62–1.11; P=0.21). Conclusion: In children presenting to an emergency department with gastroenteritis, twice-daily administration of 4.0 × 10^9 CFU of a Lactobacillus rhamnosus/helveticus probiotic did not prevent development of moderate-to-severe disease or improve any other measured outcome.
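The unadjusted odds ratio reported above can be approximated from the event counts via the standard 2×2 calculation with a Woolf (log-scale) confidence interval, sketched below. The denominators (414 and 413) are inferred from the reported percentages, so this recomputation approximates rather than exactly reproduces the published estimate.

```python
from math import log, exp, sqrt

def odds_ratio(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf 95% CI from a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Counts inferred from the abstract: 108/414 probiotic vs. 102/413 placebo
print(odds_ratio(108, 414 - 108, 102, 413 - 102))
```

The result lands close to the published OR 1.06 (95% CI 0.77–1.46); small differences reflect rounding in the reported percentages.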
Introduction: The purpose of this study was to determine whether the introduction of a pre-arrival and pre-departure Trauma Checklist as a cognitive aid, coupled with an educational session, would improve clinical performance in a simulated environment. The Trauma Checklist was developed in response to a quality assurance review of high-acuity trauma activations. It focuses on pre-arrival preparation and a pre-departure review prior to patient transfer to diagnostic imaging or the operating room. We conducted a pilot randomized controlled trial assessing the impact of the Trauma Checklist on time to critical interventions for a simulated pediatric patient managed by multidisciplinary teams. Methods: Emergency department teams composed of 2 physicians, 2 nurses and 2 confederate actors were enrolled in our study. In the intervention arm, participants watched a 10-minute educational video modelling the use of the Trauma Checklist prior to their simulation scenario and were provided a copy of the checklist. Teams participated in a standardized simulation scenario caring for a severely injured adolescent patient with hemorrhagic shock, respiratory failure and increased intracranial pressure. Our primary outcome of interest was time to initiation of key clinical interventions, including intubation, first blood product administration, massive transfusion protocol activation and initiation of hyperosmolar therapy, among others. Secondary outcome measures included a Trauma Task Performance score and checklist completion scores. Results: We enrolled 14 multidisciplinary teams (n=56 participants) into our study. There was a statistically significant decrease in median time to initiation of hyperosmolar therapy by teams in the intervention arm compared to the control arm (581 seconds [509–680] vs. 884 seconds [588–1144], p=0.03). Differences in time to initiation of the other clinical interventions were not statistically significant.
There was a trend toward higher Trauma Task Performance scores in the intervention group; however, this did not reach statistical significance (p=0.09). Pre-arrival and pre-departure checklist scores were higher in the intervention group (9.0 [9.0–10.0] vs. 7.0 [6.0–8.0], p=0.17, and 12.0 [11.5–12.0] vs. 7.5 [6.0–8.5], p=0.01). Conclusion: Teams using the Trauma Checklist did not have decreased time to initiation of key clinical interventions except in initiating hyperosmolar therapy. Teams in the intervention arm had higher pre-arrival and pre-departure checklist scores, with only the pre-departure difference reaching statistical significance, and a trend toward higher Trauma Task Performance scores. Our study was a pilot, and recruitment did not achieve the anticipated sample size; it was thus underpowered. The impact of this checklist should be studied outside tertiary trauma centres, particularly with trainees and community emergency providers, to assess benefit and further generalizability.
For this study, we adapted the Montgomery Borgatta Caregiver Burden Scale, used widely in the United States, to the Saudi Arabian context. To produce an Arabic, culturally sensitive version of the scale, we conducted semi-structured interviews with 20 Saudi family caregivers. The Arabic version of the scale was tested, and participants were asked to comment on the appropriateness of items for the construct of “caregiver burden” using the repertory grid technique and laddering procedure – two constructivist methods derived from personal construct theory. From interview findings, we examined the content of the items and the caregiver burden construct itself. Our findings suggest that the use of constructivist methods to refine constructs and quantitative instruments is highly informative. This strategy is feasible even when little is known about the investigated constructs in the target culture and further elucidates our understanding of cross-cultural variations or invariance of different versions of the scale.
Human bocaviruses (HBoVs) have been detected in human gastrointestinal infections worldwide. In 2005, HBoV was also discovered in infants and children with infections of the lower respiratory tract. Recently, several genotypes of this parvovirus, including HBoV genotype 2 (HBoV2), genotype 3 (HBoV3) and genotype 4 (HBoV4), were discovered and found to be closely related to HBoV. HBoV2 was first detected in stool samples from children in Pakistan, followed by detection in other countries. HBoV3 was detected in Australia, and HBoV4 was identified in stool samples from Nigeria, Tunisia and the USA. HBoV infection has been on the rise throughout the world, particularly in countries neighbouring South Korea; however, there have been very few studies on Korean strains. In this study, we characterised the whole genome and determined the phylogenetic position of CUK-BC20, a new clinical HBoV strain isolated in South Korea. The CUK-BC20 genome of 5184 nucleotides (nt) contains three open reading frames (ORFs). CUK-BC20 belongs to genotype HBoV2, and its nt sequence is 98.77% identical to that of the HBoV2 strain Rus-Nsc10-N386. Notably, the ORF3 amino acid sequences at positions 212–213 and 454, corresponding to variable regions VR1 and VR5, respectively, showed genotype-specific substitutions that distinguished the four HBoV genotypes. As the first whole-genome sequence analysis of an HBoV strain in South Korea, this work will provide a valuable reference for the detection of recombination, tracking of epidemics and development of diagnostic methods for HBoV.
Studies have consistently shown that subthreshold depression is associated with an increased risk of developing major depression. However, no study has yet calculated a pooled estimate that quantifies the magnitude of this risk across multiple studies.
We conducted a systematic review to identify longitudinal cohort studies containing data on the association between subthreshold depression and future major depression. A baseline meta-analysis was conducted using the inverse variance heterogeneity method to calculate the incidence rate ratio (IRR) of major depression among people with subthreshold depression relative to non-depressed controls. Subgroup analyses were conducted to investigate whether IRR estimates differed between studies categorised by age group or sample type. Sensitivity analyses were also conducted to test the robustness of baseline results to several sources of study heterogeneity, such as the case definition for subthreshold depression.
Data from 16 studies (n = 67 318) revealed that people with subthreshold depression had an increased risk of developing major depression (IRR = 1.95, 95% confidence interval 1.28–2.97). Subgroup analyses estimated similar IRRs for different age groups (youth, adults and the elderly) and sample types (community-based and primary care). Sensitivity analyses demonstrated that baseline results were robust to different sources of study heterogeneity.
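The pooled IRR above was obtained with the inverse variance heterogeneity (IVhet) method. A simpler fixed-effect inverse-variance pooling of log-ratios, sketched below with hypothetical study inputs, shows the core mechanics; IVhet keeps the same inverse-variance weights but widens the confidence interval to account for between-study heterogeneity.

```python
from math import log, exp, sqrt

def pooled_log_ratio(estimates, z=1.96):
    """Fixed-effect inverse-variance pooling of ratio estimates.
    `estimates` is a list of (ratio, SE of log-ratio) tuples.
    Simplified sketch; not the IVhet method the review used."""
    weights = [1 / se ** 2 for _, se in estimates]
    wsum = sum(weights)
    # pool on the log scale, then back-transform
    pooled = sum(w * log(r) for (r, _), w in zip(estimates, weights)) / wsum
    se_pooled = sqrt(1 / wsum)
    return (exp(pooled),
            exp(pooled - z * se_pooled),
            exp(pooled + z * se_pooled))

# Hypothetical IRRs and standard errors from three studies
print(pooled_log_ratio([(1.8, 0.20), (2.2, 0.25), (1.6, 0.30)]))
```

Pooling on the log scale is what makes ratio estimates approximately normal and the weights well defined.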
The results of this study support the scaling up of effective indicated prevention interventions for people with subthreshold depression, regardless of age group or setting.
Exercise and physical training are known to affect gastrointestinal function and digestibility in horses and can lead to inaccurate estimates of nutrient and energy digestibility when markers are used. The effect of exercise on apparent nutrient digestibility and on faecal recoveries of ADL and TiO2 was studied in six Welsh pony geldings subjected to either a low-intensity (LI) or high-intensity (HI) exercise regime according to a cross-over design. Ponies performing LI exercise were walked once per day for 45 min in a horse walker (5 km/h) for 47 consecutive days. Ponies submitted to HI exercise were gradually trained over the same 47 days according to a standardized protocol. Throughout the experiment, the ponies received a fixed level of feed, and the daily rations consisted of 4.7 kg DM of grass hay and 0.95 kg DM of concentrate. The diet was supplemented with minerals, vitamins and TiO2 (3.0 g Ti/day). Total tract digestibility of DM, organic matter (OM), CP, crude fat, NDF, ADF, starch, sugar and energy was determined with the total faeces collection (TFC) method. In addition, DM and OM digestibility was estimated using internal ADL and the externally supplemented Ti as markers. Urine was collected on the final 2 days of each experimental period. Exercise did not affect the apparent digestibility of CP, crude fat, starch and sugar. Digestibility of DM (DMD), OM (OMD), ADF and NDF tended to be lower, and DE was decreased, when ponies received the HI exercise regime. For all treatments combined, mean faecal recoveries of ADL and Ti were 87.8±1.7% and 99.3±1.7%, respectively. Ti was not detected in the urine, indicating that intestinal integrity was maintained with exercise. Dry matter digestibility estimates obtained with the TFC, ADL and Ti methods for ponies subjected to LI exercise were 66.3%, 60.3% and 64.8%, respectively, while those for HI ponies were 64.2%, 60.3% and 65.2%, respectively.
In conclusion, physical exercise influences the GE digestibility of feed in ponies provided with equivalent levels of feed intake. In addition, comparison of the two markers used for estimating apparent DMD and OMD indicates that externally supplemented Ti, unlike dietary ADL, is a suitable marker for determining nutrient digestibility in horses performing exercise.
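The marker-based digestibility estimates above rest on a simple ratio: if the marker is indigestible and fully recovered in faeces, its concentration rises from feed to faeces exactly in proportion to the fraction of feed dry matter that disappeared. A sketch with illustrative concentrations, not the study's measured values:

```python
def marker_dmd(marker_feed_pct, marker_faeces_pct):
    """Apparent dry-matter digestibility (%) from an indigestible marker,
    assuming 100% faecal marker recovery (illustrative values only):
        DMD = 100 * (1 - marker conc. in feed / marker conc. in faeces)
    Incomplete recovery (as seen here for ADL at ~88%) biases the
    estimate downward, since faecal marker concentration is too low."""
    return 100 * (1 - marker_feed_pct / marker_faeces_pct)

# e.g. 0.35% Ti in feed DM vs. 1.0% Ti in faecal DM -> 65% DMD
print(marker_dmd(0.35, 1.0))
```

This is why the near-complete Ti recovery (99.3%) makes it the more reliable marker, while the ~88% ADL recovery explains its consistently lower DMD estimates relative to total faeces collection.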