In the present study, we investigated the influence of different mid-stage N compensation timings on agronomic and physiological traits associated with grain yield and quality in field experiments. Two japonica rice cultivars with good tasting quality (Nangeng 9108 and Nangeng 5055) were examined under eight N compensation timings (N1–N6: one-time N compensation at 7–2 weeks before heading; N7: split N compensation at 5 and 3 weeks before heading; N8: split N compensation at 4 and 2 weeks before heading) and a control with no N compensation. The highest yield was obtained with N7, followed by N3. The yield advantage was mainly attributable to an improved population structure (a higher productive tiller rate with a stable number of effective panicles), a higher total number of spikelets per unit area (large panicles with more grains per panicle), a larger leaf area index in the late period and higher photosynthetic production capacity (more dry matter accumulation and translocation in the middle and late periods). Delaying N compensation improved the processing and nutritional quality of rice but decreased appearance and cooking/eating quality. Our results suggest that, to achieve relative coordination between high yield and high quality in japonica rice, the optimal N compensation should be divided equally between 5 and 3 weeks before heading. However, if fewer field operations and eating quality are the priorities, a one-time N compensation should be applied at 5 weeks before heading.
Millions of people visit US national parks annually to engage in recreational wilderness activities, which can occasionally result in traumatic injuries that require timely, high-level care. However, no study to date has specifically examined timely access to trauma centers from national parks. This study aimed to examine the accessibility of trauma care from national parks by calculating the travel time by ground and air from each park to its nearest trauma center. Using these calculations, the percentage of parks by census region with timely access to a trauma center was determined.
Methods:
This was a cross-sectional study analyzing travel times by ground and air transport between national parks and their closest adult advanced trauma center (ATC) in 2018. A list of parks was compiled from the National Parks Service (NPS) website, and the location of trauma centers from the 2018 National Emergency Department Inventory (NEDI)-USA database. Ground and air transport times were calculated using Google Maps and ArcGIS, with medians and interquartile ranges reported by US census region. Percentage of parks by region with timely trauma center access—defined as access within 60 minutes of travel time—were determined based on these calculated travel times.
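The timely-access metric described above reduces to a simple threshold calculation over per-park travel times. The sketch below illustrates one way it could be computed; the park names and travel minutes are invented for illustration (the study derived real times from Google Maps and ArcGIS).

```python
# Hypothetical sketch: share of parks with "timely" (<= 60 min) access to the
# nearest adult trauma center, by transport mode. All values are invented.

THRESHOLD_MIN = 60  # the study's cutoff for timely trauma center access

parks = {
    # park: (ground_minutes, air_minutes) to nearest adult trauma center
    "Park A": (145, 38),
    "Park B": (52, 20),
    "Park C": (210, 95),
}

def timely_share(travel_times, mode_index):
    """Fraction of parks whose travel time for one mode is within the cutoff."""
    n_timely = sum(1 for t in travel_times.values() if t[mode_index] <= THRESHOLD_MIN)
    return n_timely / len(travel_times)

ground_share = timely_share(parks, mode_index=0)  # ground transport
air_share = timely_share(parks, mode_index=1)     # air transport
```

With real data, the same function would be applied within each census region to produce the regional percentages the study reports.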
Results:
In 2018, 83% of national parks had access to an adult ATC within 60 minutes by air, while only 26% had timely access by ground. Trauma center access varied by region, with median travel times highest in the West for both air and ground transport. Nationally, parks were unevenly distributed, with the West containing the most parks of any region.
Conclusion:
While most national parks had timely access to a trauma center by air, significant gaps remain for ground transport, and their extent varies greatly by region. To improve access to trauma center expertise from national parks, this study highlights the potential of wider implementation of trauma telehealth in emergency departments (EDs) to bridge these gaps.
Glutamatergic dysfunction has been implicated in sensory integration deficits in schizophrenia, yet how glutamatergic function contributes to behavioural impairments and neural activities of sensory integration remains unknown.
Methods
Fifty schizophrenia patients and 43 healthy controls completed behavioural assessments for sensory integration and underwent magnetic resonance spectroscopy (MRS) for measuring the anterior cingulate cortex (ACC) glutamate levels. The correlation between glutamate levels and behavioural sensory integration deficits was examined in each group. A subsample of 20 pairs of patients and controls further completed an audiovisual sensory integration functional magnetic resonance imaging (fMRI) task. Blood Oxygenation Level Dependent (BOLD) activation and task-dependent functional connectivity (FC) were assessed based on fMRI data. Full factorial analyses were performed to examine the Group-by-Glutamate Level interaction effects on fMRI measurements (group differences in correlation between glutamate levels and fMRI measurements) and the correlation between glutamate levels and fMRI measurements within each group.
Results
We found that schizophrenia patients exhibited impaired sensory integration, which was positively correlated with ACC glutamate levels. Multimodal analyses showed significant Group-by-Glutamate Level interaction effects on BOLD activation as well as on task-dependent FC in a ‘cortico-subcortical-cortical’ network (including the medial frontal gyrus, precuneus, ACC, middle cingulate gyrus, thalamus and caudate), with positive correlations in patients and negative correlations in controls.
Conclusions
Our findings indicate that ACC glutamate influences neural activity in a large-scale network during sensory integration, but with opposite directionality in schizophrenia patients and healthy people. This implicates a crucial role for the glutamatergic system in sensory integration processing in schizophrenia.
Ascaridia galli (Nematoda: Ascaridiidae) is the most common intestinal roundworm of chickens and other birds, with a worldwide distribution. Although A. galli has been extensively studied, detailed knowledge of the genetic variation of this parasite remains insufficient. The present study examined genetic variation in the mitochondrial cytochrome c oxidase subunit 1 (cox1) gene among A. galli isolates (n = 26) from domestic chickens in Hunan Province, China. A portion of the cox1 gene (pcox1) was amplified by polymerase chain reaction separately from individual adult A. galli, and the amplicons were sequenced from both directions. The pcox1 sequences were 441 bp long. Intra-specific sequence variation within A. galli was 0–7.7%, whereas inter-specific sequence differences among other members of the infraorder Ascaridomorpha were 11.4–18.9%. Phylogenetic analyses of the pcox1 sequences using the maximum likelihood method confirmed that all of the Ascaridia isolates were A. galli, and also resolved three distinct clades. Taken together, these findings suggest that A. galli may represent a complex of cryptic species. Our results provide an additional genetic marker for the management of A. galli in chickens and other birds.
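Divergence percentages like those reported for pcox1 are typically uncorrected pairwise p-distances over aligned sequences. A minimal sketch follows; the example sequences are invented 12-bp fragments (real analyses align full 441-bp amplicons and often apply model-corrected distances).

```python
def p_distance(seq1: str, seq2: str) -> float:
    """Uncorrected pairwise distance: the proportion of sites that differ
    between two aligned, equal-length nucleotide sequences."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to the same length")
    diffs = sum(1 for a, b in zip(seq1, seq2) if a != b)
    return diffs / len(seq1)

# Invented fragments purely for illustration.
intra = p_distance("ATGGCACTTTCA", "ATGGCACTTTCG")  # 1 of 12 sites differs
inter = p_distance("ATGGCACTTTCA", "ATGACTCTATCG")  # 4 of 12 sites differ
```

Intra-specific comparisons cluster at low p-distances, while inter-specific comparisons occupy a distinctly higher range, which is the gap the study uses to delimit species.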
Background: Medulloblastoma (MB) is the most common solid malignant pediatric brain neoplasm. Group 3 (G3) MB, particularly MYC amplified G3 MB, is the most aggressive subgroup with the highest frequency of children presenting with metastatic disease, and is associated with a poor prognosis. To further our understanding of the role of MSI1 in MYC amplified G3 MB, we performed an unbiased integrative analysis of eCLIP binding sites, with changes observed at the transcriptome, the translatome, and the proteome after shMSI1 inhibition. Methods: Primary human pediatric MBs, SU_MB002 and HD-MB03, were kind gifts from Dr. Yoon-Jae Cho (Harvard, MS) and Dr. Till Milde (Heidelberg) and were cultured for in vitro and in vivo experiments. eCLIP, RNA-seq, Polysome-seq, and TMT-MS were completed as previously described. Results: MSI1 is overexpressed in G3 MB. shRNA MSI1 interference resulted in a reduction in tumour burden, conferring a survival advantage to mice injected with shMSI1 G3 MB cells. Robust ranked multiomic analysis (RRA) identified an unconventional gene set directly perturbed by MSI1 in G3 MB. Conclusions: Our robust unbiased integrative analysis revealed a distinct role for MSI1 in the maintenance of the stem cell state in G3 MB through post-transcriptional modification of multiple pathways, including identification of unconventional targets such as HIPK1.
Background: The goal of the study was to assess responder rates at various times after initiating atogepant treatment. Methods: A 12-week phase 3 trial evaluated the safety, efficacy, and tolerability of atogepant for preventive treatment of migraine (ADVANCE; NCT03777059) in adult participants with a ≥1-year history of migraine, experiencing 4–14 migraine days/month. Participants were randomized to atogepant 10, 30, or 60 mg, or placebo once daily. These analyses evaluated ≥25%, ≥50%, ≥75%, and 100% reductions in mean monthly migraine days (MMDs) across 12 weeks and each 4-week interval. Adverse events (AEs) in ≥5% of participants are reported. Results: The efficacy analysis population included 873 participants: placebo: n=214; atogepant: 10 mg: n=214; 30 mg: n=223; 60 mg: n=222. Atogepant-treated participants were more likely to experience a ≥50% reduction in the 3-month mean MMDs (56-61% vs 29% with placebo; P<0.0001). The proportions of participants experiencing ≥25%, ≥50%, ≥75%, and 100% reductions in mean MMDs significantly increased during each 4-week interval (≥50% reduction: 48-71% vs 27-47% with placebo). The most common AEs for atogepant were constipation (6.9-7.7%) and nausea (4.4-6.1%). Conclusions: Once-daily atogepant 10, 30, and 60 mg significantly increased responder rates at all thresholds, with approximately 60% achieving a ≥50% reduction in mean MMDs at 12 weeks.
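A threshold responder analysis of this kind reduces to counting participants whose MMD reduction meets each cutoff. The sketch below uses invented data for three participants and ignores the trial's model-based estimation and multiplicity adjustments.

```python
def responder_rate(baseline_mmd, followup_mmd, threshold):
    """Share of participants whose mean monthly migraine days (MMDs) fell by
    at least `threshold`, expressed as a fraction (0.5 = a >=50% reduction)."""
    pairs = list(zip(baseline_mmd, followup_mmd))
    responders = sum(1 for b, f in pairs if (b - f) / b >= threshold)
    return responders / len(pairs)

# Invented MMD counts: baseline vs. mean over the 12-week treatment period.
baseline = [8.0, 10.0, 6.0]
followup = [3.0, 6.0, 5.0]

rate_50 = responder_rate(baseline, followup, 0.5)   # >=50% reduction
rate_25 = responder_rate(baseline, followup, 0.25)  # >=25% reduction
```

Evaluating the same function at 0.25, 0.5, 0.75, and 1.0, and per 4-week interval, yields the family of responder rates the abstract reports.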
This study aimed to investigate the association of nasal nitric oxide and olfactory function.
Method
A cross-sectional study was performed in 117 adults, including 91 patients with chronic rhinosinusitis and 26 healthy controls. Scores on the 22-item Sino-Nasal Outcomes Test, Lund-Mackay scale and Lund-Kennedy scale were recorded to assess severity of disease. All participants were screened for common inhaled and food allergens. Nasal nitric oxide and fractional exhaled nitric oxide testing, acoustic rhinometry and anterior rhinomanometry testing were performed to measure nasal function. The validated Sniffin’ Sticks test battery was used to assess olfactory function.
Results
Higher nasal nitric oxide was an independent protective factor for odour discrimination and odour threshold in participants with chronic rhinosinusitis, after adjusting for age, gender, drinking, smoking, 22-item Sino-Nasal Outcomes Test score, Lund-Mackay score, Lund-Kennedy score, immunoglobulin E and the second minimal cross-sectional area on acoustic rhinometry. Nasal nitric oxide also showed high discrimination in predicting impaired odour discrimination. In addition, nasal nitric oxide was lower in older participants and in those with higher Lund-Mackay or Lund-Kennedy scores, and higher in those with total serum immunoglobulin E concentrations above the 0.35 kU/l threshold.
Conclusion
Higher nasal nitric oxide is associated with better odour discrimination in chronic rhinosinusitis and is modulated by age, degree of allergy and severity of chronic rhinosinusitis.
Microfluidic systems consisting of a square microchannel with an orthogonal side branch are promising tools to enrich or sort suspensions of deformable capsules. To enable control of their operation, we numerically consider a train of initially spherical identical capsules, equally spaced along the axis of the feeding channel. The capsules have a strain-hardening membrane, an internal fluid viscosity identical to that of the external fluid and a size comparable to that of the channel. We study the influence of the interspacing on the capsule path selection at the channel bifurcation using a three-dimensional immersed boundary–lattice Boltzmann method. Our objectives are to establish a phase diagram and identify the critical interspacing above which hydrodynamic interaction between capsules no longer affects their path selection. We find two main regimes. At low interspacing, strong capsule interaction leads to an unsteady regime for which the capsule path selection follows either a periodic or a disordered state. Above a critical initial interspacing $d_{ct}$, a steady regime is achieved where interaction between capsules is too weak to affect their path selection. The capsules then follow an identical steady trajectory. We find that the dependence of the interspacing $d_{ct}$, normalised by the capsule radius, on the flow split ratio falls onto a universal curve regardless of the flow strength, capsule size and membrane shear elasticity. We also compare the path selection of a capsule train with that of a two-capsule system, and discuss applications of the present results in controlling capsule trains in microfluidic suspension enrichment devices.
The vitamin B group, including riboflavin, plays paramount roles in one-carbon metabolism (OCM), and disorders related to this pathway have been linked to cancer development. Variants of genes encoding OCM enzymes and an insufficiency of B vitamins could contribute to carcinogenesis. Very few observational studies have examined the relationship between riboflavin and gastric cancer (GC), especially under conditions of modifying genetic factors. We carried out a study examining the association of riboflavin intake, and its interaction with MTRR (rs1532268) genetic variants, with GC risk among 756 controls and 377 cases. ORs and 95 % CIs were estimated using unconditional logistic regression models. We observed protective effects of riboflavin intake against GC, particularly in the female subgroup (OR = 0·52, 95 % CI 0·28, 0·97, Ptrend = 0·031). In the MTRR (rs1532268) genotype analysis, the dominant model showed that the effects of riboflavin differed between the CC and CT + TT genotypes. Compared with CC carriers, low riboflavin intake in T+ carriers was significantly associated with a 93 % higher GC risk (OR = 1·93, 95 % CI 1·09, 3·42, Pinteraction = 0·037). In general, higher riboflavin intake might help reduce the risk of GC in both CC and CT + TT carriers, particularly T+ carriers, with marginal significance (OR = 0·54, 95 % CI 0·28, 1·02, Pinteraction = 0·037). Our study indicates a protective effect of riboflavin intake against GC. In the Korean population, carrying at least one minor allele together with low riboflavin intake modified this association, increasing GC risk.
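The study estimated ORs with unconditional logistic regression; for a single binary exposure without covariates, the OR and its 95 % CI can also be computed directly from a 2×2 table via the standard log-OR normal approximation (Woolf method), as this sketch shows. The counts are invented for illustration.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table: a = exposed cases,
    b = exposed controls, c = unexposed cases, d = unexposed controls.
    Uses the normal approximation on the log-OR scale (Woolf method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Invented counts purely for illustration.
or_est, ci_lo, ci_hi = odds_ratio_ci(10, 20, 5, 20)
```

Logistic regression generalizes this calculation to adjust for covariates and to test interaction terms such as genotype-by-intake.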
We performed secondary analyses of a postdischarge decolonization trial of MRSA carriers that reduced MRSA infection and hospitalization by 30%. Hospitalized MRSA infection was associated with 7.9 days of non-MRSA antibiotics and CDI in 3.9%. Preventing MRSA infection and associated hospitalization may reduce antibiotic use and CDI incidence.
Major depressive disorder (MDD) is a common, debilitating, phenotypically heterogeneous disorder with heritability ranging from 30% to 50%. Compared to other psychiatric disorders, its high prevalence, moderate heritability, and strong polygenicity have posed major challenges for gene-mapping in MDD. Studies of common genetic variation in MDD, driven by large international collaborations such as the Psychiatric Genomics Consortium, have confirmed the highly polygenic nature of the disorder and implicated over 100 genetic risk loci to date. Rare copy number variants associated with MDD risk were also recently identified. The goal of this review is to present a broad picture of our current understanding of the epidemiology, genetic epidemiology, molecular genetics, and gene–environment interplay in MDD. Insights into the impact of genetic factors on the aetiology of this complex disorder hold great promise for improving clinical care.
The coronavirus disease 2019 (COVID-19) pandemic represents an unprecedented threat to mental health. Herein, we assessed the impact of COVID-19 on subthreshold depressive symptoms and identified potential mitigating factors.
Methods
Participants were from the Depression Cohort in China (ChiCTR registry number 1900022145). Adults (n = 1722) with subthreshold depressive symptoms were enrolled between March and October 2019 in a 6-month, community-based interventional study that aimed to prevent clinical depression using psychoeducation. A total of 1506 participants completed the study in Shenzhen, China: 726 participants, who completed the study between March 2019 and January 2020 (i.e. before COVID-19), comprised the ‘wave 1’ group; 780 participants, who were enrolled before COVID-19 and completed the 6-month endpoint assessment during COVID-19, comprised ‘wave 2’. Symptoms of depression, anxiety and insomnia were assessed at baseline and endpoint (i.e. 6-month follow-up) using the Patient Health Questionnaire-9 (PHQ-9), Generalised Anxiety Disorder-7 (GAD-7) and Insomnia Severity Index (ISI), respectively. Measures of resilience and regular exercise were assessed at baseline. We compared the mental health outcomes between the wave 1 and wave 2 groups. We additionally investigated how mental health outcomes changed across disparate stages of the COVID-19 pandemic in China, i.e. peak (7–13 February), post-peak (14–27 February) and remission plateau (28 February–present).
Results
COVID-19 increased the risk for three mental health outcomes: (1) depression (odds ratio [OR] = 1.30, 95% confidence interval [CI]: 1.04–1.62); (2) anxiety (OR = 1.47, 95% CI: 1.16–1.88); and (3) insomnia (OR = 1.37, 95% CI: 1.07–1.77). The highest proportions of probable depression and anxiety were observed post-peak, at 52.9% and 41.4%, respectively. Greater baseline resilience scores had a protective effect on the three main outcomes (depression: OR = 0.26, 95% CI: 0.19–0.37; anxiety: OR = 0.22, 95% CI: 0.14–0.33; and insomnia: OR = 0.18, 95% CI: 0.11–0.28). Furthermore, regular physical activity mitigated the risk for depression (OR = 0.79, 95% CI: 0.79–0.99).
Conclusions
The COVID-19 pandemic exerted a highly significant and negative impact on symptoms of depression, anxiety and insomnia. Mental health outcomes fluctuated as a function of the duration of the pandemic and were alleviated to some extent with the observed decline in community-based transmission. Augmenting resiliency and regular exercise provide an opportunity to mitigate the risk for mental health symptoms during this severe public health crisis.
Late-life depression has substantial impacts on individuals, families and society. Knowledge gaps remain in estimating the economic impacts associated with late-life depression by symptom severity, which has implications for resource prioritisation and research design (such as in modelling). This study examined the incremental health and social care expenditure of depressive symptoms by severity.
Methods
We analysed data collected from 2707 older adults aged 60 years and over in Hong Kong. The Patient Health Questionnaire-9 (PHQ-9) and the Client Service Receipt Inventory were used, respectively, to measure depressive symptoms and service utilisation as a basis for calculating care expenditure. Two-part models were used to estimate the incremental expenditure associated with symptom severity over 1 year.
Results
The average PHQ-9 score was 6.3 (standard deviation, s.d. = 4.0). The percentages of respondents with mild, moderate and moderately severe symptoms, and of those who were non-depressed, were 51.8%, 13.5%, 3.7% and 31.0%, respectively. Overall, the moderately severe group generated the largest average incremental expenditure (US$5886; 95% CI 1126–10 647, or a 272% increase), followed by the mild group (US$3849; 95% CI 2520–5177, or a 176% increase) and the moderate group (US$1843; 95% CI 854–2831, or an 85% increase). Non-psychiatric healthcare was the main cost component in the mild symptom group, after controlling for other chronic conditions and covariates. The average incremental association between PHQ-9 score and overall care expenditure peaked at a PHQ-9 score of 4 (US$691; 95% CI 444–939), then gradually fell, turning negative between scores of 12 (US$−35; 95% CI −530 to 460) and 19 (US$−171; 95% CI −417 to 76), before rebounding to positive at a score of 23 (US$601; 95% CI −1652 to 2854).
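Setting aside the covariate adjustment that the study's two-part models perform, the incremental expenditure and percentage increase reported above are, in essence, group-mean differences relative to the non-depressed reference group. A simplified sketch with invented per-person costs:

```python
from statistics import mean

def incremental_expenditure(group_costs, reference_costs):
    """Mean annual care expenditure difference vs. the reference group, plus
    the percentage increase over the reference mean. A crude unadjusted
    version of what two-part regression models estimate with covariates."""
    ref_mean = mean(reference_costs)
    diff = mean(group_costs) - ref_mean
    return diff, 100.0 * diff / ref_mean

# Invented per-person annual costs (US$) purely for illustration.
mild_group = [300.0, 500.0]
non_depressed = [100.0, 300.0]

diff, pct_increase = incremental_expenditure(mild_group, non_depressed)
```

In practice, a two-part model first estimates the probability of any expenditure (e.g. logistic regression) and then the level of expenditure among users (e.g. a gamma GLM), which handles the many zero-cost observations typical of care-cost data.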
Conclusions
The association between depressive symptoms and care expenditure is stronger among older adults with mild and moderately severe symptoms. Older adults with the same symptom severity have different care utilisation and expenditure patterns. Non-psychiatric healthcare is the major cost element. These findings inform ways to optimise policy efforts to improve the financial sustainability of health and long-term care systems, including the involvement of primary care physicians and other geriatric healthcare providers in preventing and treating depression among older adults and related budgeting and accounting issues across services.
To describe epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled-nursing facility (SNF), and the strategies that controlled transmission.
Design, setting, and participants:
This cohort study was conducted during March 22–May 4, 2020, among all staff and residents at a 780-bed SNF in San Francisco, California.
Methods:
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPSs) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2, and whole-genome sequencing (WGS) was used to characterize viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and the use of recommended PPE (ie, isolation gown, gloves, N95 respirator and eye protection) for clinical interactions in units with confirmed cases.
Results:
Of 725 staff and residents tested through targeted testing and serial PPSs, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen cases (71%) were linked to a single unit. Targeted testing identified 17 cases (81%), and PPSs identified 4 cases (19%). Most cases (71%) were identified before IPC interventions could be implemented. WGS was performed on SARS-CoV-2 isolates from 4 staff and 4 residents: 5 were of Santa Clara County lineage and the 3 others were distinct lineages.
Conclusions:
Early implementation of targeted testing, serial PPSs, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
Background: Methicillin-resistant Staphylococcus aureus (MRSA) is responsible for the largest number of invasive infections due to a multidrug-resistant pathogen. Approximately 10% of hospitalized carriers will experience invasive MRSA disease in the year following discharge, incurring antibiotic therapy beyond focused treatment of MRSA. Objective: We aimed to quantify the extent of non-MRSA empiric antibiotics incurred by MRSA infections and to assess the risk of Clostridioides difficile infection (CDI) resulting from treatment of MRSA infection. Methods: The CLEAR Trial was a postdischarge randomized controlled trial of 2,121 MRSA carriers comparing MRSA education alone to education plus repeated decolonization that demonstrated a 30% reduction in MRSA infection and a 17% reduction in all-cause infection attributable to decolonization in the year following hospital discharge (Huang SS, NEJM 2019). We included all hospitalization outcomes due to MRSA infection in the CLEAR Trial with detailed medication administration records to quantify unintended consequences of MRSA infection related to empiric non-MRSA antibiotic use and resultant CDI. Full-text medical records were reviewed with a standardized abstraction form to collect inpatient administered antibiotics and hospital-associated CDI. Results: In total, 154 hospitalizations due to MRSA infection with a mean length-of-stay of 10.6 days were identified. During 25 hospitalizations (16.2%), patients received only anti-MRSA antibiotics. During the remaining 129 hospitalizations (83.8%), patients received a mean of 1.6 distinct non-MRSA antibiotics totaling a mean of 6.6 days of therapy (DOT). Empiric non-MRSA therapy was given for 3.2 DOT before MRSA culture results became available and was continued for an additional 3.4 DOT afterward. Among all 849 non-MRSA DOT, the most common were piperacillin-tazobactam (293 DOT, 34.5%), levofloxacin (105 DOT, 12.4%), and metronidazole (93 DOT, 11.0%).
Across all 154 hospitalizations, a mean of 5.5 non-MRSA DOT was incurred per MRSA hospitalization, with 6 CDI cases (3.9%) occurring as a direct sequela of empiric non-MRSA antibiotics given for MRSA infection. Conclusions: Hospitalization for MRSA infection results in extensive empiric non-MRSA antibiotic therapy both before and after MRSA culture results are known. This antibiotic use is associated with a 3.9% risk of CDI, roughly 12 times the national risk of acquiring CDI during any hospital stay (3.2 per 1,000 admissions; Barrett ML, AHRQ 2018). The CLEAR Trial findings that postdischarge decolonization reduces MRSA infection and hospitalization by 30% suggest that decolonization may also reduce non-MRSA antibiotic use and CDI in this population.
Over the past decades, anti-cancer treatments have evolved rapidly from cytotoxic chemotherapies to targeted therapies including oral targeted medications and injectable immunooncology and cell therapies. New anti-cancer medications come to markets at increasingly high prices, and health insurance coverage is crucial for patient access to these therapies. State laws are intended to facilitate insurance coverage of anti-cancer therapies.
Using Massachusetts as a case study, we identified five current cancer coverage state laws and interviewed experts on their perceptions of the relevance of the laws and how well they meet the current needs of cancer care given rapid changes in therapies. Interviewees emphasized that cancer therapies, as compared to many other therapeutic areas, are unique because insurance legislation targets their coverage. They identified the oral chemotherapy parity law as contributing to increasing treatment costs in commercial insurance. For commercial insurers, coverage mandates combined with the realities of new cancer medications — including high prices and often limited evidence of efficacy at approval — compound a difficult situation. Respondents recommended policy approaches to address this challenging coverage environment, including the implementation of closed formularies, the use of cost-effectiveness studies to guide coverage decisions, and the application of value-based pricing concepts. Given the evolution of cancer therapeutics, it may be time to evaluate the benefits and challenges of cancer coverage mandates.
Klebsiella pneumoniae is a common pathogen associated with nosocomial infections and is characterised serologically by its capsular polysaccharide (K) and lipopolysaccharide (O) antigens. We surveyed a total of 348 non-duplicate K. pneumoniae clinical isolates collected over a 1-year period in a tertiary care hospital, and determined their O and K serotypes by sequencing the wbbY and wzi gene loci, respectively. Isolates were also screened for antimicrobial resistance and hypervirulent phenotypes; 94 (27.0%) were identified as carbapenem-resistant (CRKP) and 110 (31.6%) as hypervirulent (hvKP). Isolates fell into 58 K types and six O types, with 92.0% and 94.2% typeability, respectively. The predominant K types were K14K64 (16.38%), K1 (14.66%), K2 (8.05%) and K57 (5.46%), while O1 (46%), O2a (27.9%) and O3 (11.8%) were the most common O types. CRKP and hvKP strains had different serotype distributions, with O2a:K14K64 (41.0%) being the most frequent among CRKP, and O1:K1 (26.4%) and O1:K2 (17.3%) among hvKP strains. Serotyping by gene sequencing proved to be a useful tool to inform the clinical epidemiology of K. pneumoniae infections and provides valuable data relevant to vaccine design.