Surveillance of non–ventilator-associated hospital-acquired pneumonia (NV-HAP) is complicated by subjectivity and variability in diagnosing pneumonia. We compared a fully automatable surveillance definition using routine electronic health record data to manual determinations of NV-HAP according to surveillance criteria and clinical diagnoses.
We retrospectively applied an electronic surveillance definition for NV-HAP to all adults admitted to Veterans’ Affairs (VA) hospitals from January 1, 2015, to November 30, 2020. We randomly selected 250 hospitalizations meeting NV-HAP surveillance criteria for independent review by 2 clinicians and calculated the percentage of hospitalizations with (1) clinical deterioration; (2) CDC National Healthcare Safety Network (CDC-NHSN) criteria; (3) NV-HAP according to a reviewer; (4) NV-HAP according to a treating clinician; (5) pneumonia diagnosis in the discharge summary; and (6) discharge diagnosis codes for HAP. We assessed interrater reliability by calculating simple agreement and the Cohen κ.
Among 3.1 million hospitalizations, 14,023 met NV-HAP electronic surveillance criteria. Among reviewed cases, 98% had a confirmed clinical deterioration; 67% met CDC-NHSN criteria; 71% had NV-HAP according to a reviewer; 60% had NV-HAP according to a treating clinician; 49% had a discharge summary diagnosis of pneumonia; and 82% met at least 1 NV-HAP definition according to at least 1 reviewer. Only 8% had diagnosis codes for HAP. Interrater agreement was 75% (κ = 0.50) for CDC-NHSN criteria and 78% (κ = 0.55) for reviewer diagnosis of NV-HAP.
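The simple agreement and Cohen κ reported above can be reproduced from a 2 × 2 table of the two reviewers' calls; a minimal sketch in Python (the counts below are illustrative, not the study's data):

```python
def cohen_kappa(table):
    """Simple agreement and Cohen's kappa for two raters.

    table[i][j] = number of cases where rater A gave category i
    and rater B gave category j (2x2 for yes/no NV-HAP calls).
    """
    n = sum(sum(row) for row in table)
    # observed agreement: proportion of cases on the diagonal
    p_o = sum(table[i][i] for i in range(len(table))) / n
    # expected agreement if the raters were independent
    p_e = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return p_o, (p_o - p_e) / (1 - p_e)

# hypothetical counts for 250 reviewed cases (not the study's data)
agreement, kappa = cohen_kappa([[120, 30], [25, 75]])
```

With these made-up counts the function returns agreement = 0.78 and κ ≈ 0.55, the same ballpark as the reviewer-diagnosis figures reported above.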
Electronic NV-HAP surveillance criteria correlated moderately with existing manual surveillance criteria. Reviewer variability for all manual assessments was high. Electronic surveillance using clinical data may therefore allow for more consistent and efficient surveillance with similar accuracy compared to manual assessments or diagnosis codes.
The mitochondrial genome provides important information for phylogenetic analysis and for understanding evolutionary origins. In this study, the mitochondrial genomes of Ilisha elongata and Setipinna tenuifilis were sequenced; both are typical circular vertebrate mitochondrial genomes of 16,770 and 16,805 bp, respectively. The mitogenomes of I. elongata and S. tenuifilis each include 13 protein-coding genes (PCGs), 22 transfer RNA (tRNA) genes, two ribosomal RNA (rRNA) genes and one control region (CR). Both species' genome compositions were highly A + T biased and exhibited positive AT-skews and negative GC-skews. Genetic distance and Ka/Ks ratio analyses indicated that the 13 PCGs were affected by purifying selection and that the selection pressures differed from those of certain deep-sea fishes, most likely because of differences in their living environments. Phylogenetic analysis based on the nucleotide and amino acid sequences of the 13 PCGs supports close relationships among Chirocentridae, Denticipitidae, Clupeidae, Engraulidae and Pristigasteridae. Within Clupeoidei, I. elongata and S. tenuifilis were most closely related to the families Pristigasteridae and Engraulidae, respectively. These results will help to better understand the evolutionary position of Clupeiformes and provide a reference for further phylogenetic research on Clupeiformes species.
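The AT- and GC-skews mentioned above follow the standard definitions AT-skew = (A − T)/(A + T) and GC-skew = (G − C)/(G + C); a minimal sketch on a toy sequence (not the actual mitogenomes):

```python
def skews(seq):
    """Return (AT-skew, GC-skew) for a nucleotide sequence.

    AT-skew = (A - T) / (A + T); GC-skew = (G - C) / (G + C).
    Positive AT-skew and negative GC-skew match the bias reported
    for these mitogenomes.
    """
    seq = seq.upper()
    a, t, g, c = (seq.count(base) for base in "ATGC")
    return (a - t) / (a + t), (g - c) / (g + c)

at_skew, gc_skew = skews("AATGCCA")  # toy sequence for illustration
```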
To determine whether a clinician-directed acute respiratory tract infection (ARI) intervention was associated with improved antibiotic prescribing and patient outcomes across a large US healthcare system.
Multicenter retrospective quasi-experimental analysis of outpatient visits with a diagnosis of uncomplicated ARI over a 7-year period.
Outpatients with ARI diagnoses: sinusitis, pharyngitis, bronchitis, and unspecified upper respiratory tract infection (URI-NOS). Outpatients with concurrent infection or select comorbid conditions were excluded.
Audit and feedback with peer comparison of antibiotic prescribing rates and academic detailing of clinicians with frequent ARI visits. Antimicrobial stewards and academic detailing personnel delivered the intervention; facility and clinician participation were voluntary.
We calculated the probability of receiving antibiotics for an ARI before and after implementation. Secondary outcomes included the probability of a return clinic visit or infection-related hospitalization before and after implementation. Intervention effects were assessed with logistic generalized estimating equation models. Facility participation was tracked, and results were stratified by quartile of facility intervention intensity.
We reviewed 1,003,509 and 323,023 uncomplicated ARI visits before and after the implementation of the intervention, respectively. The probability of receiving antibiotics for ARI decreased after implementation (odds ratio [OR], 0.82; 95% confidence interval [CI], 0.78–0.86). Facilities in the highest quartile of intervention intensity demonstrated larger reductions in antibiotic prescribing (OR, 0.69; 95% CI, 0.59–0.80) compared to nonparticipating facilities (OR, 0.89; 95% CI, 0.73–1.09). Return visits (OR, 1.00; 95% CI, 0.94–1.07) and infection-related hospitalizations (OR, 1.21; 95% CI, 0.92–1.59) were not different before and after implementation within facilities that performed intensive implementation.
Implementation of a nationwide ARI management intervention (ie, audit and feedback with academic detailing) was associated with improved ARI management in an intervention intensity–dependent manner. No impact on ARI-related clinical outcomes was observed.
The Core Elements of Outpatient Antibiotic Stewardship provides a framework to improve antibiotic use, but cost-effectiveness data on implementation of outpatient antibiotic stewardship interventions are limited. We evaluated the cost-effectiveness of Core Element implementation in the outpatient setting.
An economic simulation model from the health-system perspective was developed for patients presenting to outpatient settings with uncomplicated acute respiratory tract infections (ARI). Effectiveness was measured as quality-adjusted life years (QALYs). Cost and utility parameters for antibiotic treatment, adverse drug events (ADEs), and healthcare utilization were obtained from the literature. Probabilities for antibiotic treatment and appropriateness, ADEs, hospitalization, and return ARI visits were estimated from 16,712 and 51,275 patient visits in intervention and control sites during the pre- and post-implementation periods, respectively. Data for materials and labor to perform the stewardship activities were used to estimate intervention cost. We performed one-way and probabilistic sensitivity analyses (PSA) using 1,000,000 second-order Monte Carlo simulations on input parameters.
The proportion of ARI patient-visits with antibiotics prescribed in intervention sites was lower (62% vs 74%) and appropriate treatment higher (51% vs 41%) after implementation, compared to control sites. The estimated intervention cost over a 2-year period was $133,604 (2018 US dollars). The intervention had lower mean costs ($528 vs $565) and similar mean QALYs (0.869 vs 0.868) per patient compared to usual care. In the PSA, the intervention was dominant in 63% of iterations.
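The "dominant in 63% of iterations" figure comes from a probabilistic sensitivity analysis: inputs are redrawn each iteration, and the intervention counts as dominant when it is cheaper and no less effective. A minimal sketch, using the reported mean cost and QALY values but made-up spreads (the study's actual input distributions are not reproduced here):

```python
import random

def psa_dominance(n_iter, seed=0):
    """Fraction of PSA iterations in which the intervention dominates
    usual care (lower cost AND at least equal QALYs).

    Means are taken from the reported results ($528 vs $565 cost,
    0.869 vs 0.868 QALYs); the standard deviations are illustrative
    placeholders, not the study's fitted parameters.
    """
    rng = random.Random(seed)
    dominant = 0
    for _ in range(n_iter):
        cost_int = rng.gauss(528, 40)      # intervention cost per patient
        cost_uc = rng.gauss(565, 40)       # usual-care cost per patient
        qaly_int = rng.gauss(0.869, 0.005)
        qaly_uc = rng.gauss(0.868, 0.005)
        if cost_int < cost_uc and qaly_int >= qaly_uc:
            dominant += 1
    return dominant / n_iter

share = psa_dominance(100_000)  # fraction of iterations that are dominant
```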
Implementation of the CDC Core Elements in the outpatient setting was a cost-effective strategy.
EPA and DHA are essential for maternal and fetal health, but epidemiological data are sparse in China. We examined trends in concentrations of EPA alone and of EPA plus DHA in pregnant and lactating women in three distinct geographic regions of China and explored their potential influencing factors. A total of 1015 healthy women during mid-pregnancy, late pregnancy or lactation were recruited from Weihai (coastland), Yueyang (lakeland) and Baotou (inland) cities of China between May and July of 2014. Maternal EPA and DHA concentrations (percentage of total fatty acids) in plasma and erythrocytes were measured by capillary GC. Adjusted EPA plus DHA concentrations in both plasma and erythrocytes significantly declined from mid-pregnancy (2·92 %, 6·95 %) to late pregnancy (2·20 %, 6·42 %) and lactation (2·40 %, 6·29 %) (Ptrend < 0·001); both concentrations were highest in the coastland, followed by the lakeland, and lowest in the inland region (P < 0·001). Regarding EPA alone, concentrations were higher in women during lactation or late pregnancy and in women in the coastland and inland areas. Moreover, concentrations of EPA or EPA plus DHA were higher in women with older age, higher education, higher annual family income per capita and higher dietary intake of marine aquatic products and mutton. In lactating women, erythrocyte EPA concentration was higher in those breast-feeding partially v. exclusively. In conclusion, maternal plasma and erythrocyte concentrations of EPA plus DHA or EPA alone differed with geographic region, physiological period and maternal characteristics, indicating a need for population-specific health strategies to improve fatty acid status in pregnant and lactating women.
Schools offer an ideal setting for childhood obesity interventions due to their access to children and adolescents. This study aimed to systematically review the impact of school-based interventions for the treatment of childhood obesity.
Eight databases were searched from inception to 30 May 2020. A revised Cochrane risk-of-bias tool and the Grading of Recommendations, Assessment, Development and Evaluations criteria were used to evaluate the risk of bias and overall evidence. Meta-analysis and meta-regression were performed in Stata using the random effects model. Overall effect was evaluated using Hedges’ g, and heterogeneity was assessed using Cochran’s Q and I².
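Hedges' g, the pooled effect measure above, is Cohen's d scaled by the small-sample correction J = 1 − 3/(4·df − 1); a minimal sketch with illustrative group statistics (not data from the included trials):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: standardized mean difference with small-sample correction.

    Group 1 is the intervention arm, group 2 the control arm; the
    inputs here are illustrative, not values from the included trials.
    """
    df = n1 + n2 - 2
    # pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (mean1 - mean2) / sp          # Cohen's d
    j = 1 - 3 / (4 * df - 1)          # small-sample correction factor
    return d * j

# e.g. mean BMI in intervention vs control (hypothetical numbers)
g = hedges_g(24.1, 2.0, 50, 25.2, 2.2, 50)  # negative g favours intervention
```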
Cluster randomised controlled trials (cluster-RCT) delivered in school.
Children and adolescents (6–18 years of age) with overweight and obesity.
Twelve cluster-RCT from seven countries with 1755 participants were included in the meta-analysis. School-based interventions for the treatment of childhood obesity reduced BMI and BMI z-scores with a medium effect (g = 0·52). Subgroup analyses showed greater effectiveness of brief school-based interventions and of interventions conducted in lower-middle to upper-middle economies. Meta-regression was used to explore heterogeneity; the final model, with covariates for type of economy and trial duration, accounted for 41·2 % of the variability. The overall quality of evidence was rated low because of the high risk of bias and inconsistency.
School-based interventions are a possible approach to provide universal healthcare for the treatment of childhood obesity, and further well-designed cluster-RCT with longer follow-up are needed. This study is registered with PROSPERO (CRD42020160735).
The aim of this study was to assess the current status of disease-related knowledge and to analyze the relationship among the general condition, illness perception, and psychological status of patients with coronavirus disease 2019 (COVID-19).
A hospital-based cross-sectional study was conducted on 118 patients using convenience sampling. The general questionnaire, disease-related knowledge questionnaire of COVID-19, Illness Perception Questionnaire (IPQ), and Profile of Mood States (POMS) were used to measure the current status of participants.
The overall average score of disease-related knowledge among patients with COVID-19 was 79.19 ± 14.25. The self-care situation was positively correlated with knowledge of prevention and control (r = 0.265; P = 0.004) and with the total score of disease-related knowledge (r = 0.206; P = 0.025); the degree of anxiety was negatively correlated with knowledge of diagnosis and treatment (r = −0.182; P = 0.049). The score of disease-related knowledge was negatively correlated with negative cognition (volatility, consequences, emotional statements) and negative emotions (tension, fatigue, depression) (P < 0.05), and positively correlated with positive cognition (disease coherence) and positive emotion (self-esteem) (P < 0.05).
We recommend paying more attention to elderly and low-income groups and, in future health education for patients, increasing knowledge about the diagnosis and treatment of COVID-19 and about self-care.
The prevalence of central obesity in the total population has been reported in numerous studies. However, information on the prevalence of central obesity within normal-category BMI is scant. In the present study, we examined the profiles of central obesity among normal-weight children and adolescents. A total of 29 516 (14 226 boys and 15 290 girls) normal-weight children and adolescents (excluding underweight, overweight and obesity) aged 7–18 years were included in the final analysis. Central obesity was defined by the international age- and sex-specific cut-offs of waist circumference (WC) and threshold of waist:height ratio (WHtR ≥ 0·5). All subjects were classified into four groups (Q1–Q4) according to the age- and sex-specific quartiles of BMI, those in the upper fourth (Q4) were defined as ‘high-normal BMI’ and those in the lower fourth (Q1) were defined as ‘low-normal BMI’. The prevalence of central obesity as measured by WC was 9·90 (95 % CI 9·41, 10·39) % for boys and 8·11 (95 % CI 7·68, 8·54) % for girls; by WHtR was 2·97 (95 % CI 2·69, 3·25) % for boys and 2·44 (95 % CI 2·20, 2·68) % for girls. Subjects in the Q4 group had a much higher prevalence of central obesity than their counterparts in the Q1 group (P < 0·01). Our findings suggest that the health risks of children with normal-weight central obesity may be missed when BMI is used alone as a measure; it is meaningful to include WC in clinical practice and to include the simple message ‘Keep your waist to less than half your height’.
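The closing message encodes a one-line screening rule (WHtR ≥ 0·5); a minimal sketch (the measurements are hypothetical):

```python
def central_obesity_whtr(waist_cm, height_cm):
    """Central obesity by waist:height ratio, using the WHtR >= 0.5
    threshold cited in the study ('waist less than half your height')."""
    return waist_cm / height_cm >= 0.5

# hypothetical measurements for illustration
flagged = central_obesity_whtr(72, 140)      # waist more than half of height
not_flagged = central_obesity_whtr(60, 140)  # waist under half of height
```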
Folate status for women during early pregnancy has been investigated, but data for women during mid-pregnancy, late pregnancy or lactation are sparse or lacking. Between May and July 2014, we conducted a cross-sectional study in 1211 pregnant and lactating women from three representative regions in China. Approximately 135 women were enrolled in each stratum by physiological period (mid-pregnancy, late pregnancy or lactation) and region (south, central or north). Plasma folate concentrations were measured by microbiological assay. The adjusted medians of folate concentration decreased from 28·8 (interquartile range (IQR) 19·9, 38·2) nmol/l in mid-pregnancy to 18·6 (IQR 13·2, 26·4) nmol/l in late pregnancy, and to 17·0 (IQR 12·3, 22·5) nmol/l in lactation (Ptrend < 0·001). Overall, lower folate concentrations were more likely to be observed in women residing in the northern region, with younger age, higher pre-pregnancy BMI, lower education or multiparity, and in lactating women who had undergone a Caesarean delivery or who were breastfeeding exclusively. In total, 380 (31·4 %) women had a suboptimal folate status (folate concentration <13·5 nmol/l). Women in late pregnancy or lactation, residing in the northern region, with multiparity and a low education level had a higher risk of suboptimal folate status, while those with older age had a lower risk. In conclusion, maternal plasma folate concentrations decreased as pregnancy progressed, and were influenced by geographic region and maternal socio-demographic characteristics. Future studies are warranted to assess the necessity of folic acid supplementation during later pregnancy and lactation, especially for women at a higher risk of folate depletion.
Background: Acute respiratory infections (ARIs) are a key target for improving antibiotic use in the outpatient setting. The Core Elements of Outpatient Antibiotic Stewardship provide a framework for improving antibiotic use, but data on the safety and effectiveness of interventions to improve antibiotic use are limited. We report the impact of Core Elements implementation within Veterans’ Healthcare Administration clinics on antibiotic prescribing and patient outcomes. Methods: The intervention, targeting treatment of uncomplicated ARIs (sinusitis, pharyngitis, bronchitis, and viral upper respiratory infections [URIs]) in emergency department and primary care settings, was initiated within 10 sites between September 2017 and January 2018. The intervention was developed using the Core Elements and included local site champions, audit-and-feedback with peer comparison, and academic detailing. We evaluated the following outcomes: per-visit antibiotic prescribing rates overall and by diagnosis; appropriateness of treatment; 30-day ARI revisits; 30-day infectious complications (eg, pneumonia); 30-day adverse medication effects; 90-day Clostridium difficile infection (CDI); and 30-day hospitalizations. Multilevel logistic regression was used to calculate rate ratios (RR) with 95% CIs for each outcome in the postintervention period (12 months) compared to the preintervention period (39–42 months). Results: There were 14,020 uncomplicated ARI visits before the intervention and 4,866 after. The proportion of uncomplicated ARI visits with antibiotics prescribed was 59.17% before the intervention versus 44.34% after. A secular trend of reduced antibiotic prescribing for ARIs was evident across the entire observation period (RR, 0.92; 95% CI, 0.90–0.94); beyond this trend, a significant reduction in antibiotic prescribing after the intervention was identified (RR, 0.74; 95% CI, 0.59–0.93).
Per-visit antibiotic prescribing rates decreased significantly for bronchitis and URI (RR, 0.54; 95% CI, 0.44–0.65), pharyngitis (RR, 0.76; 95% CI, 0.67–0.86), and sinusitis (RR, 0.92; 95% CI, 0.85–1.0). Appropriate therapy for pharyngitis increased (RR, 1.43; 95% CI, 1.21–1.68), but appropriate therapy for sinusitis remained unchanged (RR, 0.92; 95% CI, 0.85–1.0) after the intervention. Complications associated with antibiotic undertreatment were no different after the intervention: ARI-related revisit rates (RR, 1.01; 95% CI, 0.98–1.05) and infectious complications (RR, 1.01; 95% CI, 0.79–1.28). A potential benefit of improved antibiotic use was a reduction in visits for adverse medication effects (RR, 0.82; 95% CI, 0.72–0.94). CDI events at 90 days were too sparse to model (preintervention incidence, 0.08%; postintervention incidence, 0.06%). Additionally, 30-day hospitalizations were significantly lower in the postintervention period (RR, 0.79; 95% CI, 0.72–0.87). Conclusions: Implementation of the Core Elements was safe and effective: it was associated with reduced antibiotic prescribing rates for uncomplicated ARIs, improvements in diagnosis-specific appropriate therapy, and reductions in visits for adverse medication effects and in 30-day hospitalization rates. No adverse effects were noted in ARI-related revisit rates or infectious complications, and CDI rates were low and unchanged.
Background: The Core Elements of Outpatient Antibiotic Stewardship provide a framework to improve antibiotic use, but cost-effectiveness data on interventions to improve antibiotic use are limited. Beginning in September 2017, an antibiotic stewardship intervention was launched within 10 outpatient Veterans Healthcare Administration clinics. The intervention was based on the Core Elements and used academic detailing (AD) and audit and feedback (AF) to encourage appropriate use of antibiotics. The objective of this analysis was to evaluate the cost-effectiveness of the intervention among patients with uncomplicated acute respiratory tract infections (ARIs). Methods: We developed an economic simulation model from the VA’s perspective for patients presenting for an index outpatient clinic visit with an ARI (Fig. 1). Effectiveness was measured as quality-adjusted life-years (QALYs). Cost and utility parameters for antibiotic treatment, adverse drug reactions (ADRs), and healthcare utilization were obtained from the published literature. Probability parameters for antibiotic treatment, appropriateness of treatment, antibiotic ADRs, hospitalization, and return ARI visits were estimated using VA Corporate Data Warehouse data from a total of 22,137 patients in the 10 clinics during 2014–2019, before and after the intervention. Detailed cost data on the development of the AD and AF materials, together with electronically captured time and effort for National AD Service activities by specific providers during a national ARI campaign, were used as a proxy for the cost of similar activities conducted in this intervention. We performed 1-way and probabilistic sensitivity analyses (PSAs) using 10,000 second-order Monte Carlo simulations on costs and utility values using their means and standard deviations.
Results: The proportion of uncomplicated ARI visits with antibiotics prescribed was lower (40% vs 59%) and the proportion with appropriate treatment was higher (32% vs 24%) after the intervention. The intervention was estimated to cost $110,846 (2018 USD) over a 2-year period. Compared to no intervention, the intervention had lower mean costs ($517 vs $880) and higher mean QALYs (0.863 vs 0.837) per patient because of reduced inappropriate treatment, ADRs, and subsequent healthcare utilization, including hospitalization. In threshold analyses, the antibiotic stewardship strategy was no longer dominant if the intervention cost was >$64,415,000 or the number of patients cared for was <3,672. In the PSA, the antibiotic stewardship intervention was dominant in 100% of the 10,000 Monte Carlo iterations (Fig. 2). Conclusions: In every scenario, the VA outpatient AD and AF antibiotic stewardship intervention was a dominant strategy compared to no intervention.
Predictors of compliance with aspirin in children following cardiac catheterisation have not been identified. The aim of this study was to identify caregivers’ knowledge of and compliance with aspirin medication, and predictors of compliance with aspirin, in children with congenital heart disease (CHD) post-percutaneous transcatheter occlusion.
A cross-sectional explorative design using a self-administered questionnaire was adopted, conducted between May 2017 and May 2018. A total of 220 caregivers of children with CHD post-percutaneous transcatheter occlusion were recruited. Questionnaires covered child and caregiver characteristics, a self-designed and tested knowledge-about-aspirin scale (scoring scale 0–2), and the 8-item Morisky Medication Adherence Scale (scoring scale 0–8). Data were analysed using multivariate binary logistic regression to identify predictors of compliance with aspirin.
Of the 220 eligible children and caregivers, 210 (95.5%) responded and 209 surveys were included in the analysis. The mean score of knowledge was 7.25 (standard deviation 2.27). The mean score of compliance was 5.65 (standard deviation 1.36). Child’s age, length of aspirin use, health insurance policies, relationship to child, monthly income, and knowledge about aspirin of caregivers were independent predictors of compliance with aspirin (p < 0.05).
Caregivers of children with CHD had an adequate level of knowledge about aspirin. Compliance with aspirin medication as reported by caregivers was low. Predictors of medium-to-high compliance with aspirin were related to the child’s age and socio-economic factors. Further studies are needed to identify effective strategies to improve knowledge, compliance with medication, and long-term outcomes of children with CHD.
A suite of Jurassic–Cretaceous migmatites was newly identified in the Liaodong Peninsula of the eastern North China Craton (NCC). Anatexis is commonly associated with crustal thickening; however, the newly identified migmatites formed during strong lithospheric thinning accompanied by voluminous magmatism and intense deformation. Field investigations show that the migmatites are spatially associated with low-angle detachment faults. Numerous leucosomes occur either as isolated lenses or as thin layers (dykes), parallel to or cross-cutting the foliation. Peritectic minerals such as titanite and sillimanite are distributed mainly along the boundaries of reactant minerals or are accumulated along the foliation. Most zircons show distinct core–rim structures, and the rims have low Th/U ratios (0.01–0.24). Zircon U–Pb dating indicates that the protoliths of the migmatites were either Late Triassic (224–221 Ma) diorites or metasedimentary rocks deposited sometime after c. 1857 Ma. The zircon overgrowth rims record crystallization ages of 173–161 Ma and 125 Ma, which represent the formation time of the leucosomes. These ages are consistent with those of reported magmatic events in the Liaodong Peninsula and surrounding areas. The leucosomes indicate a strong anatectic event during the Jurassic–Cretaceous period. Partial melting occurred through the breakdown of muscovite and biotite in the presence of a water-rich fluid under a thermal-anomaly regime. The mechanism that caused the 173–161 Ma and 125 Ma anatectic events was most likely regional crustal extension during the lithospheric thinning of the NCC. Meanwhile, the newly generated melts further weakened the rigidity of the crust and enhanced the extension.
In this work, a new reconfigurable discrete 1D beam-steering Fabry–Perot cavity antenna with enhanced radiation performance is presented. It consists of a probe-fed patch antenna printed on the ground plane and a reconfigurable metasurface acting as the upper partially reflective surface to realize beam steering. By utilizing 6 × 6 proposed reconfigurable unit cells on the superstrate, the beam-steering angle is effectively enhanced from ±7° to ±17° with fewer active elements and a much simpler biasing network. The proposed antenna was fabricated to validate its feasibility, and good agreement between the simulated and measured results is achieved. Moreover, the measured realized gains are over 11 dBi, with a gain variation from the boresight direction to the tilted direction of less than 0.2 dB.
Paediatric Mycoplasma pneumoniae pneumonia (MPP) is a major cause of community-acquired pneumonia in China, yet epidemiological data on paediatric MPP from China are scarce. This study retrospectively collected data from June 2006 to June 2016 at Beijing Children's Hospital, Capital Medical University, in North China, and aimed to explore the epidemiological features of paediatric MPP and severe MPP (SMPP) in North China over the past 10 years. A total of 27 498 paediatric patients with pneumonia were enrolled; among them, 37.5% had MPP. In this area, epidemics occurred every 2–3 years, and the positive rate of MPP increased during these peak years over time. The peak age of MPP was between 6 and 10 years, accounting for 75.2%, significantly more than other age groups (χ2 = 1384.1, P < 0.0001). The epidemics peaked in September, October and November (χ2 = 904.9, P < 0.0001). Additionally, 13.0% of paediatric MPP patients had SMPP, and over time the rate of SMPP increased, reaching 42.6% in 2016. The mean age of paediatric patients with SMPP (6.7 ± 3.0 years) was younger than that of patients with non-SMPP (7.4 ± 3.2 years) (t = 3.60, P = 0.0001). MPP and SMPP are common in China, especially in children 6 to 10 years old. Paediatric patients with SMPP tend to be younger than those with non-SMPP. MPP outbreaks occur every 2–3 years in North China, with September, October and November the peak months, unlike in South China. Understanding the epidemiological characteristics of paediatric MPP can contribute to timely diagnosis and treatment, and may improve the prognosis of children with SMPP.
There is emerging evidence that glycaemic variability (GV) plays an important role in the development of diabetic complications. The current study aimed to compare the effects of lifestyle intervention (LI) with and without partial meal replacement (MR) on GV. A total of 123 patients with newly diagnosed and untreated type 2 diabetes (T2D) were randomised to receive either LI together with breakfast replacement with a liquid formula (LI+MR) (n 62) or LI alone (n 61) for 4 weeks and completed the study. Each participant was instructed to have three main meals per d and underwent 72-h continuous glucose monitoring (CGM) both before and after intervention. Measures of GV assessed by CGM included the incremental AUC of postprandial blood glucose (AUCpp), standard deviation of blood glucose (SDBG), glucose CV and mean amplitude of glycaemic excursions (MAGE). After a 4-week intervention, the improvements in systolic blood pressure (P=0·046) and time in range (P=0·033) were more pronounced in the LI+MR group than in the LI group. Furthermore, LI+MR caused significantly greater improvements in all GV metrics including SDBG (P=0·005), CV (P=0·002), MAGE (P=0·016) and AUCpp (P<0·001) than did LI. LI+MR (v. LI) was independently associated with improvements in GV after adjustment of covariates (all P<0·05). Our study showed that LI+MR led to significantly greater improvements in GV compared with LI, suggesting that LI+MR could be an effective treatment to alleviate glucose excursions.
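Of the GV metrics above, SDBG and CV are plain summary statistics of the CGM trace (MAGE and AUCpp require the full excursion algorithm, omitted here); a minimal sketch on illustrative readings:

```python
import statistics

def sdbg_and_cv(glucose_mmol_l):
    """Standard deviation of blood glucose (SDBG) and glucose
    coefficient of variation (CV, %) from a list of CGM readings."""
    mean = statistics.fmean(glucose_mmol_l)
    sd = statistics.stdev(glucose_mmol_l)  # sample standard deviation
    return sd, 100 * sd / mean

# hypothetical subsample of a 72-h CGM trace (mmol/l), for illustration
readings = [5.2, 6.8, 9.1, 7.4, 5.9, 8.3]
sdbg, cv = sdbg_and_cv(readings)
```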
Identifying the relative importance of urban and non-urban land-use types for potential denitrification-derived N2O at a regional scale is critical for quantifying the impacts of human activities on nitrous oxide (N2O) emission under changing environments. In this study, we used a regional dataset from China comprising 197 soil samples across six land-use types to evaluate the main predictors (land use, heavy metals, soil pH, soil moisture, substrate availability, and functional and broad microbial abundances) of potential denitrification using multivariate and pathway analyses. Our results provide empirical evidence that farm soils have the greatest potential denitrifying ability (PDA) (10.92 ± 6.08 ng N2O-N·g–1 dry soil·min–1), followed by urban soils (6.80 ± 5.35 ng N2O-N·g–1 dry soil·min–1). Our models indicate that land use (low vs. high human activity) was the most important driver of PDA, followed by total nitrogen (TN) and heavy metals (Cu, Zn, Pb, Cd). In addition, our path analysis suggests that at least part of the impact of land use on potential denitrification was mediated via microbial abundance, soil pH and substrates, including TN, dissolved organic carbon and nitrate. This study identifies the main predictors of denitrification at a regional scale, which is needed to quantify the impact of human activities on ecosystem functionality under changing conditions.
We previously reported, for the first time, four heterozygous missense mutations of MYH7, KCNQ1, MYLK2, and TMEM70 in a single three-generation Chinese family with dual long QT and hypertrophic cardiomyopathy phenotypes. However, the clinical course varied among family members, and the potential myocardial dysfunction has not been investigated.
The objective of this study was to investigate the echocardiographic and electrocardiographic characteristics of a genotype-positive Chinese family with hypertrophic cardiomyopathy and, further, to explore the association of myocardial dysfunction and electrical activity with the identified mutations.
Comprehensive echocardiography (standard two-dimensional and Doppler echocardiography plus three-dimensional speckle-tracking echocardiography) and electrocardiography were performed for the members of this family.
As previously reported, four missense mutations (MYH7-H1717Q, KCNQ1-R190W, MYLK2-K324E, and TMEM70-I147T) were identified in this family. Compared with non-carriers, the MYH7-H1717Q carriers had significantly increased left ventricular mass indices, an elevated E/e' ratio, and reduced global longitudinal strain, but enhanced global circumferential and radial strain (all p < 0.05). The KCNQ1-R190W carriers showed significantly prolonged QTc intervals, and the MYLK2-K324E carriers showed inverted T-waves (both p < 0.05). The TMEM70-I147T carriers, however, had echocardiographic and electrocardiographic findings similar to those of non-carriers.
Three of the four identified mutations had potentially pathogenic effects in this family: MYH7-H1717Q was associated with increased left ventricular thickness, elevated left ventricular filling pressure, and altered myocardial deformation, whereas KCNQ1-R190W and MYLK2-K324E were correlated with electrocardiographic abnormalities, namely a long-QT phenotype and inverted T-waves, respectively.
Monosized spherical Cu–20% Sn (wt%) alloy particles with diameters ranging from 70.6 to 334.0 μm were prepared by the pulsated orifice ejection method (POEM). Cross-sectional micrographs show that the particles are fully dense, free of pores and bulk inclusions, with an even distribution of Cu and Sn. The particles exhibit good sphericity and a narrow size distribution, indicating that the POEM process can reliably break the balance between surface tension and hydrostatic pressure of the liquid Cu–Sn alloy at the crucible micro-orifice and thereby accurately control droplet volume. Furthermore, the cooling rate of the spherical Cu–20% Sn alloy particles was estimated using a Newtonian cooling model. The cooling rate decreases gradually with increasing particle diameter: smaller particles cool faster, and when the particle diameter is below 70 μm the cooling rate can exceed 3.3 × 104 K/s. The secondary dendrite arm spacing depends strongly on particle diameter, increasing gradually as the diameter increases. These results demonstrate that POEM is an effective route for fabricating high-quality monosized Cu–20% Sn alloy particles.
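The size dependence of the cooling rate can be sketched with a lumped-capacitance (Newtonian) model for a single spherical droplet. The property values below (density, heat capacity, gas conductivity, temperatures, Nu = 2 for a small droplet in still gas) are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of a Newtonian (lumped-capacitance) cooling estimate for a
# spherical droplet. All property values are illustrative assumptions.
def cooling_rate(d, rho=8800.0, cp=400.0, k_gas=0.15, T=1300.0, T_gas=300.0):
    """Initial cooling rate |dT/dt| (K/s) of a sphere of diameter d (m).

    Newtonian model: m*cp*dT/dt = -h*A*(T - T_gas), with h = Nu*k_gas/d
    and Nu = 2, which gives
        |dT/dt| = 6*Nu*k_gas*(T - T_gas) / (rho*cp*d**2),
    i.e. the rate scales as 1/d**2.
    """
    Nu = 2.0
    return 6.0 * Nu * k_gas * (T - T_gas) / (rho * cp * d**2)

small = cooling_rate(70.6e-6)   # smallest particle size in the study
large = cooling_rate(334.0e-6)  # largest particle size in the study
print(small > large)            # smaller particles cool faster
print(small / large)            # equals (334.0/70.6)**2 under this model
```

Under this simple model the ratio of cooling rates is fixed by the diameter ratio alone, which is why the smallest particles reach the highest cooling rates.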
OBJECTIVES/SPECIFIC AIMS: The objective of this research is to determine under what conditions endpoints based on estimated glomerular filtration rate (eGFR) slope or on relatively small declines in eGFR provide valid and useful surrogate endpoints for pivotal clinical trials in chronic kidney disease (CKD) patients. METHODS/STUDY POPULATION: We consider 2 classes of surrogate endpoints. The first class includes endpoints defined by the average rate of change in eGFR during defined portions of the follow-up period of the trial, following initiation of the randomized treatment interventions. The second class includes composite endpoints defined by the time from randomization until the occurrence of a designated decline in eGFR or kidney failure. The true clinical endpoint is considered to be the time from randomization until kidney failure, irrespective of the trajectory of eGFR measurements prior to kidney failure. We apply statistical simulation to determine conditions under which alternative endpoints within the 2 classes are (1) valid surrogate endpoints, in the sense of preserving a low probability of rejecting the null hypothesis of no treatment effect on the surrogate endpoint when there is no treatment effect on the clinical endpoint, and are also (2) useful surrogate endpoints, in the sense of providing increased statistical power that allows significant reductions in sample size and/or duration of follow-up. Input parameters for the simulations include (a) characteristics of the joint distribution of the longitudinal eGFR measurements and the time to occurrence of kidney failure, (b) characteristics of the short-term and long-term effects of the treatment, and (c) design parameters, including the duration of accrual and follow-up and the spacing of eGFR measurements during the follow-up period. We use joint analyses of 19 treatment comparisons across 13 previous clinical trials of CKD patients to guide the selection of input parameters for the simulations.
We apply longitudinal mixed-effects models for analysis of endpoints based on eGFR slope, and Cox regression for analyses of the composite time-to-event endpoints. RESULTS/ANTICIPATED RESULTS: We have previously shown that surrogate endpoints defined by eGFR declines of 30% or 40% can provide valid and useful alternative endpoints in CKD clinical trials for interventions that do not produce short-term effects on eGFR that differ from the longer-term effects of the interventions. Other factors influencing the validity and utility of these endpoints include the average baseline eGFR, the mean rate of change in eGFR, and the extent to which the size of the treatment effect depends on the patient's underlying rate of eGFR decline. We will extend these results by presenting preliminary results describing conditions under which outcomes based on eGFR slope provide valid and useful alternatives to the clinical endpoint of time until occurrence of kidney failure. DISCUSSION/SIGNIFICANCE OF IMPACT: The statistical simulation strategy described in this research can be used during the design of clinical trials of chronic kidney disease to assist in the selection of endpoints that maximize savings in sample size and duration of follow-up while retaining a low risk of producing a false positive conclusion in the absence of a true effect of the treatment on the time until kidney failure.
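The validity check described above, confirming that a surrogate endpoint keeps a low false-positive rate under a null treatment effect, can be illustrated with a minimal simulation. This is a sketch under invented assumptions, not the authors' simulation code: the slope distribution, noise level, and trial sizes are hypothetical, and a simple two-sample comparison of estimated slopes stands in for the full mixed-effects analysis.

```python
# Hypothetical sketch: empirical type I error of an eGFR-slope endpoint
# under a null treatment effect. All distributional parameters are invented.
import math
import numpy as np

rng = np.random.default_rng(42)

def one_trial_pvalue(n=100):
    # Estimated eGFR slopes (mL/min per year) per patient; both arms draw
    # from the same distribution, i.e. the treatment has no effect (null).
    a = rng.normal(-3.0, 2.0, n)
    b = rng.normal(-3.0, 2.0, n)
    # Two-sample z-test on mean slope difference (normal approximation).
    z = (a.mean() - b.mean()) / math.sqrt(a.var(ddof=1) / n + b.var(ddof=1) / n)
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

n_trials = 500
type1 = sum(one_trial_pvalue() < 0.05 for _ in range(n_trials)) / n_trials
print(type1)  # should be near the nominal 0.05 for a valid endpoint
```

In the actual research program, the same rejection-rate check is repeated across the input-parameter grid (baseline eGFR, short- vs long-term effects, measurement spacing) to map out where each candidate endpoint stays valid.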