The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as the long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world with height and weight measures on twins and information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there are a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we examined associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and to examine the effects of different exposures across time, geographical regions and socioeconomic status.
The objective of the present study is to summarise trends in under- and over-nutrition in pregnant women on the Thailand–Myanmar border. Refugees contributed data from 1986 to 2016 and migrants from 1999 to 2016 for weight at first antenatal consultation. BMI and gestational weight gain (GWG) data were available during 2004–2016, when height was routinely measured. Risk factors for low and high BMI were analysed for <18·5 kg/m2 or ≥23 kg/m2, respectively. A total of 48 062 pregnancies over 30 years were available for weight analysis, and 14 646 pregnancies over 13 years (2004–2016) had BMI measured in the first trimester (<14 weeks’ gestational age). Mean weight at first antenatal consultation in any trimester increased over the 30-year period by 2·0 to 5·2 kg for all women. First-trimester BMI increased on average by 0·5 kg/m2 for refugees and 0·6 kg/m2 for migrants every 5 years. The proportion of women with low BMI in the first trimester decreased from 16·7 to 12·7 % for refugees and from 23·1 to 20·2 % for migrants, whereas high BMI increased markedly from 16·9 to 33·2 % for refugees and from 12·3 to 28·4 % for migrants. Multivariate analysis demonstrated that low BMI was positively associated with being Burman, Muslim, primigravid, having malaria during pregnancy and smoking, and negatively associated with refugee (as opposed to migrant) status. High BMI was positively associated with being Muslim and literate, and negatively associated with age, primigravidity, malaria, anaemia and smoking. Mean GWG was 10·0 (sd 3·4), 9·5 (sd 3·6) and 8·3 (sd 4·3) kg for the low, normal and high WHO BMI categories for Asians, respectively.
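The BMI cut-offs applied above (<18·5 kg/m² for low and ≥23 kg/m² for high, following the WHO categories for Asian populations) amount to a simple classification rule. A minimal sketch; the function names are illustrative and not taken from the study:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def classify_bmi_asian(bmi_value):
    """WHO BMI categories for Asian populations, as used in the study:
    <18.5 low, 18.5-22.9 normal, >=23 high."""
    if bmi_value < 18.5:
        return "low"
    elif bmi_value < 23.0:
        return "normal"
    return "high"

# Example: a 45 kg woman of height 1.60 m has BMI 45 / 1.60**2 ~= 17.6,
# which falls in the "low" category.
category = classify_bmi_asian(bmi(45, 1.60))
```

The same thresholds drive the risk-factor analysis described above, with the middle band treated as the reference category.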
Norovirus, a major cause of gastroenteritis in people of all ages worldwide, was first reported in South Korea in 1999. The most common causal agents of pediatric acute gastroenteritis are norovirus and rotavirus. While vaccination has reduced the pediatric rotavirus infection rate, no norovirus vaccine has yet been developed. Therefore, prediction and prevention of norovirus are very important. Norovirus is divided into genogroups GI–GVII, with GII.4 being the most prevalent. However, in 2012–2013, GII.17 showed a higher incidence than GII.4, and a novel variant, GII.P17-GII.17, appeared. In this study, 204 stool samples collected in 2013–2014 were screened by reverse transcriptase-polymerase chain reaction; 11 GI (5.39%) and 45 GII (22.06%) noroviruses were identified. GI.4, GI.5, GII.4, GII.6 and GII.17 were detected. The whole genomes of the three norovirus GII.17 strains were sequenced. The whole genome of GII.17 consists of three open reading frames of 5109, 1623 and 780 bp. Compared with 20 GII.17 strains isolated in other countries, we observed numerous changes in the protruding P2 domain of VP1 in the Korean GII.17 viruses. Our study provides genome information that might aid in epidemic prevention, epidemiological studies and vaccine development.
Introduction: Many drugs, including cannabis and alcohol, cause impairment and contribute to motor vehicle collisions (MVCs). Policy makers require knowledge of the prevalence of drug use in crash-involved drivers, and types of drugs used in order to develop effective prevention programs. This issue is particularly relevant with the recent legalization of cannabis. We aim to study the prevalence of alcohol, cannabis, sedating medications, and other drugs in injured drivers from 4 Canadian Provinces. Methods: This prospective cohort study obtained excess clinical blood samples from consecutive injured drivers who attended a participating Canadian trauma centre following a MVC. Blood samples were analyzed using a broad spectrum toxicology screen capable of detecting cannabinoids, cocaine, amphetamines (including their major analogues), and opioids as well as psychotropic pharmaceuticals (including antihistamines, benzodiazepines, other hypnotics, and sedating antidepressants). Alcohol and cannabinoids were quantified. Health records were reviewed to extract demographic, medical, and MVC information using a standardized data collection tool. Results: This study has been collecting data in 4 trauma centres in British Columbia (BC) since 2011 and was launched in 2 trauma centres in Alberta (AB), 1 in Saskatchewan (SK), and 2 in Ontario (ON) in 2018. In preliminary results from BC (n = 2412), 8% of injured drivers tested positive for THC and 13% for alcohol. Preliminary results from other provinces (n = 301) suggest a regional variation in prevalence of drivers testing positive for THC (10% - 27%), alcohol (17% - 29%), and other drugs. By May 2018, an estimated 4500 cases from BC, 600 from AB, 150 from SK, and 650 from ON will have been analyzed. We will report the prevalence of positive tests for alcohol, THC, other recreational drugs, and sedating medications, pre and post cannabis legalization. 
The number of cases with alcohol and/or THC levels above Canadian per se limits will also be reported. Results will be reported according to province, driver sex, age, single vs. multi vehicle crashes, and requirement for hospital admission. Conclusion: This will be among the largest international datasets on drug use by injured drivers. Our findings will provide patterns of drug and alcohol impairment in 4 Canadian provinces pre and post cannabis legalization. The significance of these findings and implication for impaired driving policy and prevention programs in Canada will be discussed.
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/− 10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLRs) with 95% CIs. Results: We prospectively enrolled 7,569 patients with a mean age of 68.4 +/− 14.7 years; 52.4% were female. Of these, 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to enrolled patients. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%, iLR 0.20 [95% CI 0.091-0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68-0.92]); High (probability 2.6%, iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Medium iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients’ risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
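The interval likelihood ratios reported above compare, for each risk stratum, the proportion of outcome-positive patients falling in that stratum with the proportion of outcome-negative patients doing so. A minimal sketch of the calculation, using hypothetical counts rather than the study's data:

```python
def interval_lr(stratum_pos, stratum_neg, total_pos, total_neg):
    """Interval likelihood ratio for one score stratum:
    P(score in stratum | outcome) / P(score in stratum | no outcome)."""
    return (stratum_pos / total_pos) / (stratum_neg / total_neg)

# Hypothetical example: 120 of 180 outcomes, but only 2000 of 7400
# non-outcomes, fall in the high-risk stratum.
high_risk_ilr = interval_lr(120, 2000, 180, 7400)  # ~= 2.47
```

An iLR above 1 shifts the post-test probability upward for that stratum, below 1 downward; this is how the Low, Moderate and High bands above translate a patient's score into a revised stroke/CEA risk.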
In this study, an improved fluid–structure interaction (FSI) analysis method is developed for a flapping wing. A co-rotational (CR) shell element is developed for its structural analysis. Further, a relevant non-linear dynamic formulation is developed based on the CR framework. Three-dimensional preconditioned Navier–Stokes equations are employed for its fluid analysis. An implicit coupling scheme is employed to combine the structural and fluid analyses. An explicit investigation of a 3D plunging wing is conducted using this FSI analysis method. A further investigation of this plunging wing is performed in relation to its operating condition. In addition, the relation between the wing’s aerodynamic performance and plunging motion is investigated.
A new ESCA (electron spectroscopy for chemical analysis) instrument has been developed to provide high sensitivity and efficient operation for laboratory analysis of composition and chemical bonding in very thin surface layers of solid samples. High sensitivity is achieved by means of the high-intensity, efficient X-ray source described by Davies and Herglotz at the 1968 Denver X-Ray Conference, in combination with the new electron energy analyzer described by Lee at the 1972 Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. A sample chamber designed for rapid introduction and replacement of samples has adequate facilities for various sample treatments and conditioning, followed immediately by ESCA analysis of the sample.
Examples of application are presented, demonstrating the sensitivity and resolution achievable with this instrument. Its usefulness in trace surface analysis is shown and some “chemical shifts” measured by the instrument are compared with those obtained by X-ray spectroscopy.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks by the second patient, except for one outbreak involving >1 transmission route, which was detected at the eighth patient. Up to 40 infections (78% of possible preventable infections) or 34 infections (66%) could have been prevented if data mining had been implemented in real time, assuming the initiation of an effective intervention within 7 or 14 days, respectively, of identification of the transmission route.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
This study evaluated tumour necrosis factor-α, interleukins 10 and 12, and interferon-γ levels, peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 expression in unilateral sudden sensorineural hearing loss.
Twenty-four patients with unilateral sudden sensorineural hearing loss, and 24 individuals with normal hearing and no history of sudden sensorineural hearing loss (who were attending the clinic for other problems), were enrolled. Peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 were isolated and analysed. Plasma and supernatant levels of tumour necrosis factor-α, interferon-γ, and interleukins 10 and 12 were measured.
There were no significant differences with respect to age and gender. Monocyte population, mean tumour necrosis factor-α level and cluster of differentiation 86 expression were significantly increased in the study group compared to the control group. However, interferon-γ and interleukin 12 levels were significantly decreased. The difference in mean interleukin 10 level was not significant.
Increases in tumour necrosis factor-α level and monocyte population might play critical roles in sudden sensorineural hearing loss. This warrants detailed investigation and further studies on the role of dendritic cells in sudden sensorineural hearing loss.
The white-backed planthopper, Sogatella furcifera (Horváth) (Hemiptera, Delphacidae), has emerged as a serious rice pest in Asia. In the present study, 12 microsatellite markers were employed to investigate the genetic structure, diversity and migration route of 43 populations sampled from seven Asian countries (Bangladesh, China, Korea, Laos, Nepal, Thailand, and Vietnam). In the isolation-by-distance analysis, a significant positive correlation was observed between genetic and geographic distances by the Mantel test (r2 = 0.4585, P = 0.01), indicating a role of geographic isolation in the genetic structure of S. furcifera. A population assignment test using the first-generation migrant detection method (threshold α = 0.01) revealed southern China and northern Vietnam as the main sources of S. furcifera in Korea. Nepal and Bangladesh might be additional potential sources via interconnection with Vietnamese populations. This paper provides useful data on the migration route and origin of S. furcifera in Korea and will contribute to planthopper resistance management.
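The Mantel test used above measures the correlation between two distance matrices (here, genetic and geographic), with significance obtained by jointly permuting the rows and columns of one matrix. A self-contained sketch of the idea; the matrices below are illustrative, not the S. furcifera data:

```python
import math
import random

def upper_triangle(m):
    """Flatten the strict upper triangle of a square matrix."""
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mantel(d1, d2, permutations=999, seed=0):
    """Mantel test: correlate the upper triangles of two distance
    matrices; p-value from joint row/column permutations of d2."""
    rng = random.Random(seed)
    n = len(d1)
    flat1 = upper_triangle(d1)
    r_obs = pearson(flat1, upper_triangle(d2))
    hits = 0
    for _ in range(permutations):
        perm = list(range(n))
        rng.shuffle(perm)
        permuted = [[d2[perm[i]][perm[j]] for j in range(n)] for i in range(n)]
        if pearson(flat1, upper_triangle(permuted)) >= r_obs:
            hits += 1
    # One-tailed permutation p-value (observed arrangement counted once)
    return r_obs, (hits + 1) / (permutations + 1)
```

With 999 permutations the smallest attainable p-value is 0.001, which is why Mantel results are typically reported at resolutions such as P = 0.01.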
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Planning mental health carer services requires information about the number of carers, their characteristics, service use and unmet support needs. Available Australian estimates vary widely due to different definitions of mental illness and the types of carers included. This study aimed to provide a detailed profile of Australian mental health carers using a nationally representative household survey.
The number of mental health carers, characteristics of carers and their care recipients, caring hours and tasks provided, service use and unmet service needs were derived from the national 2012 Survey of Disability, Ageing and Carers. Co-resident carers of adults with a mental illness were compared with those caring for people with physical health and other cognitive/behavioural conditions (e.g., autism, intellectual disability, dementia) on measures of service use, service needs and aspects of their caring role.
In 2012, there were 225 421 co-resident carers of adults with mental illness in Australia, representing 1.0% of the population, and an estimated further 103 813 mental health carers not living with their care recipient. The majority of co-resident carers supported one person with mental illness, usually their partner or adult child. Mental health carers were more likely than physical health carers to provide emotional support (68.1% v. 19.7% of carers) and less likely to assist with practical tasks (64.1% v. 86.6%) and activities of daily living (31.9% v. 48.9%). Of co-resident mental health carers, 22.5% or 50 828 people were confirmed primary carers – the person providing the most support to their care recipient. Many primary mental health carers (37.8%) provided more than 40 h of care per week. Only 23.8% of primary mental health carers received government income support for carers and only 34.4% received formal service assistance in their caring role, while 49.0% wanted more support. Significantly more primary mental health than primary physical health carers were dissatisfied with received services (20.0% v. 3.2%), and 35.0% did not know what services were available to them.
Results reveal a sizable number of mental health carers with unmet needs in the Australian community, particularly with respect to financial assistance and respite care, and that these carers are poorly informed about available supports. The prominence of emotional support and their greater dissatisfaction with services indicate a need to better tailor carer services. If implemented carefully, recent Australian reforms including the Carer Gateway and National Disability Insurance Scheme hold promise for improving mental health carer supports.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
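The adjusted odds ratios above come from a matched case-control analysis; as a simpler illustration of the underlying quantity, a crude (unadjusted) odds ratio with a Wald confidence interval can be computed from a 2 × 2 table. The counts below are hypothetical, not the study's:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald confidence interval from a 2x2 table:
    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) by the Woolf method
    log_se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * log_se)
    hi = math.exp(math.log(or_) + z * log_se)
    return or_, lo, hi

# Hypothetical: 30 of 199 cases vs 20 of 381 controls report an exposure.
or_, lo, hi = odds_ratio_ci(30, 169, 20, 361)  # OR ~= 3.2
```

A confidence interval excluding 1 (as with every aOR reported above except raw produce, which sat entirely below 1) indicates a statistically significant association; the adjusted estimates additionally control for confounding and the age/site matching.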
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are a priority in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized education training or complex motor skill learning where it has the potential to make a significant impact. The purpose of this study was to determine if a resuscitation course taught in a spaced format compared to the usual massed instruction results in improved retention of procedural skills. Methods: EMS providers (Paramedics and Emergency Medical Technicians (EMT)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after and 3-months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course. 
Three months following course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p=0.012), with no statistically significant difference in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p=0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p=0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p=0.831) or adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p=0.728). Conclusion: Procedural skills taught in a spaced format result in learning at least as good as the traditional massed format; more complex skills taught in a spaced format may result in better long-term retention compared with traditional massed training, as there was a clear difference in BVMV and a trend toward a difference in IO insertion.
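Treating the ± values above as standard errors of the group means, a rough two-sample z approximation reproduces the overall pattern (a significant BVMV difference, a non-significant trend for IO insertion). This is a sketch only and not necessarily the authors' exact test, so it will not reproduce the reported p-values exactly:

```python
import math

def approx_z_test(mean1, se1, mean2, se2):
    """Approximate two-sample z statistic from group means and standard
    errors, with a two-sided p-value from the normal distribution."""
    z = (mean1 - mean2) / math.sqrt(se1 ** 2 + se2 ** 2)
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
    return z, p

# Spaced vs massed BVMV scores at 3 months, assuming +/- values are SEMs.
z_bvmv, p_bvmv = approx_z_test(2.2, 0.13, 1.8, 0.14)
```

Under this approximation the BVMV comparison clears the conventional 0.05 threshold, consistent with the significant difference reported above.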
Introduction: The cricothyroid membrane is used as a landmark for emergent surgical airway access. Ultrasound identification of the cricothyroid membrane is more accurate than landmarking by palpation. The objective of this study was to determine whether head of bed elevation affects the position of the cricothyroid membrane as identified by ultrasound. Methods: This was a prospective, observational study on a convenience sample of adult patients presenting to the emergency department. Participants underwent ultrasound scans by trained physicians at 0, 30 and 90 degrees head of bed elevation to identify the cricothyroid membrane. The cricothyroid membrane position identified at 0 degrees was used as a reference, and the change in position of the external landmark of the cricothyroid membrane with the patient at 30 and 90 degrees was measured. Additionally, each patient’s gender, age, body mass index (BMI) and Mallampati score were recorded for comparison. Linear mixed effects models with 95% confidence intervals were used to determine the effect of head of bed elevation, age, BMI and Mallampati score on the differences between measured distances. Results: One hundred and two patients were enrolled in the study. The average change in position from reference was statistically significant at both 30 degrees [2.72 ± 0.77 mm (p<0.01)] and 90 degrees [4.23 ± 0.77 mm (p<0.01)] of head of bed elevation. The adjusted linear mixed effects model showed that age greater than 70, BMI over 30 and higher Mallampati score were associated with greater change in distance between cricothyroid membrane landmarks. Conclusion: There was a statistically significant difference in the position of the cricothyroid membrane comparing 0 degrees to 30 and 90 degrees head of bed elevation. However, the relatively small differences suggest that this finding is not clinically relevant. Further study is required to evaluate whether these differences affect the actual successful performance of cricothyrotomy.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when resource allocation on the farm is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performance of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in southwest United Kingdom, this paper proposes a novel, information-driven approach to carry out comprehensive assessments of economic-environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for developing scientifically sound and biologically informative metrics of agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.
The objective of this study was to assess determinants of poor sleep quality, an under-diagnosed and under-treated problem, in elderly patients with diabetes mellitus, hyperlipidemia and hypertension.
Poor sleep quality is linked to decreased quality of life, increased morbidity and mortality. Poor sleep quality is common in the elderly population with associated cardiometabolic risk factors such as diabetes, hyperlipidemia and hypertension.
This is a cross-sectional study undertaken in the primary healthcare setting (Singhealth Polyclinics-Outram) in Singapore. Singaporeans aged 65 years and above who had at least one of the three cardiometabolic risk factors (diabetes, hypertension and hyperlipidemia) were identified. Responders’ sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) questionnaire, and responders were divided into those with good quality sleep and those with poor quality sleep based on the PSQI score. Information on demographics, co-morbidities and lifestyle practices was collected. Descriptive and multivariate analyses of determinants of poor sleep were performed.
There were 199 responders (response rate 88.1%). Nocturia (adjusted prevalence rate ratio 1.54, 95% confidence interval 1.06–2.26) was found to be associated with an increased risk of poor sleep quality in elderly patients with diabetes mellitus, hypertension and hyperlipidaemia. Nocturia, a prevalent problem in the Asian elderly population, was found to be associated with poor sleep quality in our study. Hence, it is imperative to identify and treat nocturia in these patients to improve their sleep quality.
Human bocaviruses (HBoVs) have been detected in human gastrointestinal infections worldwide. In 2005, HBoV was also discovered in infants and children with infections of the lower respiratory tract. Recently, several genotypes of this parvovirus, including HBoV genotype 2 (HBoV2), genotype 3 (HBoV3) and genotype 4 (HBoV4), were discovered and found to be closely related to HBoV. HBoV2 was first detected in stool samples from children in Pakistan, followed by detection in other countries. HBoV3 was detected in Australia, and HBoV4 was identified in stool samples from Nigeria, Tunisia and the USA. Recently, HBoV infection has been on the rise throughout the world, particularly in countries neighbouring South Korea; however, there have been very few studies on Korean strains. In this study, we characterised the whole genome and determined the phylogenetic position of CUK-BC20, a new clinical HBoV strain isolated in South Korea. The CUK-BC20 genome of 5184 nucleotides (nt) contains three open reading frames (ORFs). The genotype of CUK-BC20 is HBoV2, and its nt sequence is 98.77% identical to that of another HBoV strain, Rus-Nsc10-N386. In particular, the ORF3 amino acid sequences at positions 212–213 and 454, corresponding to variable regions VR1 and VR5, respectively, showed genotype-specific substitutions that distinguished the four HBoV genotypes. As the first whole-genome sequence analysis of HBoV in South Korea, this information will provide a valuable reference for the detection of recombination, tracking of epidemics and development of diagnostic methods for HBoV.