We aimed to examine the association between the quantity and quality of dietary fat in early pregnancy and gestational diabetes mellitus (GDM) risk. In total, 1477 women with singleton pregnancies were included from Sichuan Provincial Hospital for Women and Children, Southwest China. Dietary information was collected by a 3-d 24-h dietary recall. GDM was diagnosed based on the results of a 75-g, 2-h oral glucose tolerance test at 24–28 gestational weeks. Log-binomial models were used to estimate relative risks (RR) and 95 % CI. The results showed that total fat intake was positively associated with GDM risk (Q4 v. Q1: RR = 1.40; 95 % CI 1.11, 1.76; Ptrend = 0.001). This association was also observed for the intakes of animal fat and vegetable fat. After stratification by total fat intake (<30 %E v. ≥30 %E), higher animal fat intake was associated with higher GDM risk in the high-fat group, whereas moderate animal fat intake was associated with reduced GDM risk (T2 v. T1: RR = 0.65; 95 % CI 0.45, 0.96) in the normal-fat group. Vegetable fat intake was positively associated with GDM risk in the high-fat group but not in the normal-fat group. No association between fatty acid intakes and GDM risk was found. In conclusion, total, animal and vegetable fat intakes were each positively associated with GDM risk. However, when total fat intake was not excessive, higher intakes of animal and vegetable fat were not associated with increased GDM risk, and moderate animal fat intake may even be linked to lower GDM risk.
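As a rough illustration of the log-binomial approach (a minimal sketch on simulated data, not the authors' analysis; all variable names and values are hypothetical), relative risks can be estimated with a binomial GLM and a log link, exponentiating the coefficients:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1477
fat_quartile = rng.integers(1, 5, size=n)    # hypothetical fat-intake quartile (1-4)
risk = 0.12 * 1.12 ** (fat_quartile - 1)     # toy risk rising across quartiles
gdm = rng.binomial(1, risk)                  # simulated GDM outcome (0/1)

# Indicator coding: Q1 is the reference category
X = sm.add_constant(np.column_stack([(fat_quartile == q).astype(float) for q in (2, 3, 4)]))

# Log-binomial model: binomial family with a log link, so exp(coef) is a relative risk.
# Note: log-binomial GLMs can fail to converge; Poisson regression with robust
# standard errors is a common fallback.
res = sm.GLM(gdm, X, family=sm.families.Binomial(link=sm.families.links.Log())).fit()
rr_q4 = np.exp(res.params[3])                # RR for Q4 v. Q1
ci_q4 = np.exp(res.conf_int()[3])            # 95% CI for that RR
print(f"RR (Q4 v. Q1) = {rr_q4:.2f}, 95% CI {ci_q4[0]:.2f}-{ci_q4[1]:.2f}")
```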
Background: Although small- and medium-sized hospitals comprise most healthcare providers in South Korea, data on antibiotic usage in these facilities are limited. We evaluated the pattern of antibiotic usage and its appropriateness in hospitals with <400 beds in South Korea. Methods: A multicenter retrospective study was conducted in 10 hospitals (6 long-term care hospitals, 3 acute-care hospitals, and 1 orthopedic hospital) with <400 beds in South Korea. We analyzed patterns of antibiotic prescription and their appropriateness in the participating hospitals. Data on monthly antibiotic prescriptions and patient days for hospitalized patients were collected from each hospital's electronic database. To avoid the effect of the COVID-19 pandemic, data were collected from January to December 2019. To evaluate the appropriateness of prescription, 25 patients under antibiotic therapy were randomly selected at each hospital over 2 separate periods. Due to the heterogeneity of its patient population, the orthopedic hospital was excluded from this analysis. The collected data were reviewed, and the appropriateness of antibiotic prescriptions was evaluated by 5 specialists in infectious diseases (adult and pediatric), with data from 2 hospitals assigned to each specialist. The appropriateness of antibiotic prescriptions was evaluated in 3 aspects: route of administration, dose, and class. If all 3 aspects were ‘optimal,’ the prescription was considered ‘optimal.’ If only the route was ‘optimal,’ and the dose and/or class was ‘suboptimal’ but not ‘inappropriate,’ it was considered ‘suboptimal.’ If even 1 aspect was ‘inappropriate,’ it was classified as ‘inappropriate.’ Results: The most commonly prescribed antibiotics in long-term care hospitals were fluoroquinolones, followed by antipseudomonal β-lactam/β-lactamase inhibitors. In acute-care hospitals, these were third-generation cephalosporins, followed by first- and second-generation cephalosporins. The antibiotics most commonly prescribed in the orthopedic hospital were first-generation cephalosporins. Only 2.3% of the antibiotics were administered by an inappropriate route, whereas 15.3% of patients were prescribed an inappropriate dose. Inappropriate antibiotic prescriptions accounted for 30.6% of all antibiotic prescriptions. Conclusions: Antibiotic usage patterns vary between small- and medium-sized hospitals in South Korea. The proportion of inappropriate prescriptions exceeded 30% of the total antibiotic prescriptions.
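The three-level appropriateness rule can be summarized in a short sketch (hypothetical code, not from the study; the abstract only spells out the route-optimal case for 'suboptimal', so treating any other mix without an 'inappropriate' rating the same way is an assumption):

```python
def classify_prescription(route: str, dose: str, drug_class: str) -> str:
    """Each aspect is rated 'optimal', 'suboptimal', or 'inappropriate'."""
    aspects = (route, dose, drug_class)
    if "inappropriate" in aspects:            # even one 'inappropriate' aspect
        return "inappropriate"
    if all(a == "optimal" for a in aspects):  # all three aspects 'optimal'
        return "optimal"
    return "suboptimal"                       # no 'inappropriate' rating, not all 'optimal'

print(classify_prescription("optimal", "suboptimal", "optimal"))  # -> suboptimal
```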
It is generally accepted that high-oleic crops contain at least 70% oleate. Compared with their normal-oleic counterparts, oil and food products made from high-oleic peanuts have better keeping quality and are healthier. Therefore, high-oleic peanut is well recognized by processors and consumers. However, owing to the limited availability of high-oleic donors, most present-day high-oleic peanut varietal releases carry only F435-type FAD2 mutations. High-oleic peanut mutants were identified by screening a mutagenized population of 15L46, a high-yielding peanut line with desirable large elliptical-oblong seeds, using a near-infrared model that predicts oleate content in individual seeds. Sequencing FAD2A and FAD2B of the mutants along with the wild type revealed that these mutants possessed G448A FAD2A (the F435-type FAD2A mutation) and G558A FAD2B (a non-F435-type FAD2B mutation). Expression of the wild-type and mutated FAD2B in yeast verified that the functional mutation contributed to the high-oleic phenotype in these mutants. The mutants provide additional high-oleic donors for peanut quality improvement.
For individual cultures, findings on regulating embryo density by changing the microdrop volume are contradictory. The aim of this study was to investigate the relationship between embryo density and the developmental outcome of day 3 embryos after adjusting for covariates. In total, 1196 embryos from 206 couples who had undergone in vitro fertilization treatment were analyzed retrospectively. Three embryo densities were used routinely: one embryo in a drop (30 μl/embryo), two embryos in a drop (15 μl/embryo) and three embryos in a drop (10 μl/embryo). Embryo quality on day 3 was evaluated; both the cell number of day 3 embryos and the proportion of successful implantations served as endpoints. Maternal age, paternal age, antral follicle count, level of anti-Müllerian hormone, type of infertility, controlled ovarian stimulation protocol, length of stimulation, number of retrieved oocytes, number of zygotes (two pronuclei) and insemination type were treated as covariates and adjusted for. After full adjustment for all covariates, the cell number of day 3 embryos was significantly increased by 0.40 (95% CI 0.00, 0.79; P = 0.048) and 0.78 (95% CI 0.02, 1.54; P = 0.044) in the 15 μl/embryo and 10 μl/embryo groups respectively, compared with the 30 μl/embryo group. The proportions of implanted embryos were 42.1%, 48.7% and 0.0% in the 30 μl/embryo, 15 μl/embryo and 10 μl/embryo groups respectively. The difference between the 30 μl/embryo and 15 μl/embryo groups was not statistically significant (P = 0.22). After adjusting for confounders that were significant in univariate analysis, embryo density was still not associated with day 3 embryo implantation potential (P > 0.05). In 30-μl microdrops, culturing embryos at a density of 15 or 10 μl/embryo increased the cell number of day 3 embryos but, compared with individual culture at 30 μl/embryo, did not improve implantation potential.
Nosocomial transmission of COVID-19 among immunocompromised hosts can have a serious impact on COVID-19 severity, underlying disease progression and SARS-CoV-2 transmission to other patients and healthcare workers within hospitals. We experienced a nosocomial outbreak of COVID-19 in a daycare unit for paediatric and young adult cancer patients. Between 9 and 18 November 2020, 473 individuals (181 patients, 247 caregivers/siblings and 45 staff members) were exposed to the index case, a member of the nursing staff. Among them, three patients and four caregivers were infected. Two 5-year-old cancer patients with COVID-19 were not severely ill, but a 25-year-old cancer patient showed prolonged shedding of SARS-CoV-2 RNA for at least 12 weeks and probably infected his mother at home approximately 7–8 weeks after the initial diagnosis. Except for this case, no secondary transmission was observed from the confirmed cases in either the hospital or the community. In conclusion, in a daycare setting for immunocompromised children and young adults, the rate of in-hospital transmission of SARS-CoV-2 was 1.6% under a stringent policy of infection prevention and control, including universal masking and rapid, extensive contact investigation. Severely immunocompromised children and young adults with COVID-19 should be managed carefully after the mandatory isolation period, keeping in mind the possibility of prolonged shedding of live virus.
The dissipation of ion-acoustic surface waves propagating in a semi-bounded, collisional plasma bounded by vacuum is theoretically investigated, and the result is used to analyse edge-relevant plasma simulated by the Divertor Plasma Simulator-2 (DiPS-2). The collisional damping of the surface wave is investigated for weakly ionized plasmas by comparing collisionless Landau damping with collisional damping as follows: (1) the ratio of ion temperature $T_i$ to electron temperature $T_e$ should be very small for weak collisionality $(T_i/T_e \ll 1)$; (2) collisionless Landau damping dominates for small parallel wavenumbers, with decay constant $\gamma \approx -\sqrt{\pi/2}\, k_\parallel \lambda_{De}\, \omega_{pi}^2/\omega_{pe}$; and (3) collisional damping dominates for large parallel wavenumbers, with decay constant $\gamma \approx -\nu_{in}/16$, where $\nu_{in}$ is the ion–neutral collision frequency. An experimental simulation of these theoretical predictions was performed in the argon plasma of DiPS-2, with the following parameters: plasma density $n_e = (2\text{--}9) \times 10^{11}\ \mathrm{cm}^{-3}$, $T_e = 3.7\text{--}3.8\ \mathrm{eV}$, $T_i = 0.2\text{--}0.3\ \mathrm{eV}$ and collision frequency $\nu_{in} = 23\text{--}127\ \mathrm{kHz}$. Although the wavelength should be specified with the given parameters of DiPS-2, the collisional damping is found to be $\gamma = (-0.9\ \text{to}\ {-5}) \times 10^4\ \mathrm{rad\ s}^{-1}$ for $k_\parallel \lambda_{De} = 10$, while the Landau damping is found to be $\gamma = (-4\ \text{to}\ {-9}) \times 10^4\ \mathrm{rad\ s}^{-1}$ for $k_\parallel \lambda_{De} = 0.1$.
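As a rough numerical check of the two regimes (a sketch evaluating the quoted decay-constant formulas at the quoted DiPS-2 parameters; treating the quoted ion-neutral collision frequency as an angular frequency is an assumption here):

```python
import numpy as np

e, eps0, m_e = 1.602e-19, 8.854e-12, 9.109e-31   # SI constants
m_i = 39.948 * 1.661e-27                         # argon ion mass (kg)

n_e = 2e17                                       # m^-3 (= 2 x 10^11 cm^-3, low end)
omega_pe = np.sqrt(n_e * e**2 / (eps0 * m_e))    # electron plasma frequency (rad/s)
omega_pi = np.sqrt(n_e * e**2 / (eps0 * m_i))    # ion plasma frequency (rad/s)

# Landau damping, dominant at small parallel wavenumber (k_par * lambda_De = 0.1)
gamma_landau = -np.sqrt(np.pi / 2) * 0.1 * omega_pi**2 / omega_pe
print(f"Landau decay constant: {gamma_landau:.1e} rad/s")      # ~ -4e4 rad/s

# Collisional damping, dominant at large parallel wavenumber
nu_in = 2 * np.pi * 127e3                        # 127 kHz, high end of quoted range
gamma_coll = -nu_in / 16
print(f"Collisional decay constant: {gamma_coll:.1e} rad/s")   # ~ -5e4 rad/s
```

Under these assumptions, both values fall within the decay-constant ranges reported above.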
As part of a long-term experiment to determine the impacts of composted manure and straw amendments (replacing 50% of chemical fertilizer with composted pig manure, wheat straw return combined with chemical fertilizer, and setting no fertilizer and chemical fertilizer-only as controls) on rice-associated weeds in a rice (Oryza sativa L.)–wheat (Triticum aestivum L.) rotation system, species richness, abundance, density, and biomass of weeds were assessed during years 8 and 9. Fertilization decreased the species richness and total density of rice-associated weeds but increased their total biomass. The species richness and densities of broadleaf and sedge weeds decreased with fertilization, while the species richness of grass weeds increased only with straw return and their density was not significantly affected. The shoot biomass per square meter of grass and broadleaf weeds was significantly higher with fertilization treatments than with the no-fertilizer control, while that of sedge weeds declined with fertilizer application. With fertilization, the densities of monarch redstem (Ammannia baccifera L.) and smallflower umbrella sedge (Cyperus difformis L.) decreased, that of Chinese sprangletop [Leptochloa chinensis (L.) Nees] increased, and those of barnyardgrass [Echinochloa crus-galli (L.) P. Beauv.] and monochoria [Monochoria vaginalis (Burm. f.) C. Presl ex Kunth] were not significantly affected. Ammannia baccifera was the most abundant weed species in all treatments. Whereas composted pig manure plus fertilizer resulted in higher density of A. baccifera and lower shoot biomass per plant than chemical fertilizer only, wheat straw return plus chemical fertilizer caused lower density and shoot biomass of A. baccifera. Therefore, fertilization strategies that suppress specific weeds could potentially be used as components of improved weed management programs in rice production systems.
Accurate prognostication is important for patients and their families to prepare for the end of life. The Objective Prognostic Score (OPS) is an easy-to-use tool that does not require the clinician’s prediction of survival (CPS), whereas the Palliative Prognostic Score (PaP) requires CPS; thus, inexperienced clinicians may hesitate to use the PaP. We aimed to evaluate the accuracy of the OPS compared with the PaP in inpatients in palliative care units (PCUs) in three East Asian countries.
Method
This study was a secondary analysis of a cross-cultural, multicenter cohort study. We enrolled inpatients with far-advanced cancer in PCUs in Japan, Korea, and Taiwan from 2017 to 2018. We calculated the area under the receiver operating characteristic (AUROC) curve to compare the accuracy of the OPS and the PaP.
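For illustration only (hypothetical toy data, not study data), an AUROC of this kind can be computed as follows:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Toy example: 1 = died within 3 weeks; higher prognostic score = higher predicted risk
died_3wk = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0])
score = np.array([6.5, 2.0, 5.5, 7.0, 3.0, 1.5, 4.5, 5.0, 3.5, 1.0])

print(f"AUROC: {roc_auc_score(died_3wk, score):.2f}")
```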
Results
A total of 1,628 inpatients in 33 PCUs in Japan and Korea were analyzed. OPS and PaP were calculated in 71.7% of the Japanese patients and 80.0% of the Korean patients. In Taiwan, PaP was calculated for 81.6% of the patients. The AUROC for 3-week survival was 0.74 for OPS in Japan, 0.68 for OPS in Korea, 0.80 for PaP in Japan, and 0.73 for PaP in Korea. The AUROC for 30-day survival was 0.70 for OPS in Japan, 0.71 for OPS in Korea, 0.79 for PaP in Japan, and 0.74 for PaP in Korea.
Significance of results
Both OPS and PaP showed good performance in Japan and Korea. Compared with PaP, OPS could be more useful for inexperienced physicians who hesitate to estimate CPS.
The effects of early thiamine use on clinical outcomes in critically ill patients with acute kidney injury (AKI) are unclear. The purpose of this study was to investigate the associations between early thiamine administration and clinical outcomes in critically ill patients with AKI. Data on critically ill patients with AKI within 48 h after ICU admission were extracted from the Medical Information Mart for Intensive Care III (MIMIC-III) database. Propensity score matching (PSM) was used to match patients who received early thiamine treatment to those who did not. The association between early thiamine use and in-hospital mortality due to AKI was determined using a logistic regression model. A total of 15 066 AKI patients were eligible for inclusion. After PSM, 734 pairs of patients who did and did not receive thiamine treatment in the early stage were established. Early thiamine use was associated with lower in-hospital mortality (OR 0·65; 95 % CI 0·49, 0·87; P < 0·001) and 90-d mortality (OR 0·58; 95 % CI 0·45, 0·74; P < 0·001), and it was also associated with recovery of renal function (OR 1·26; 95 % CI 1·17, 1·36; P < 0·001). In the subgroup analysis, early thiamine administration was associated with lower in-hospital mortality in patients with stage 1 to 2 AKI. Early thiamine use was associated with improved short-term survival in critically ill patients with AKI, and it may play a beneficial role in patients with stage 1 to 2 AKI according to the Kidney Disease: Improving Global Outcomes criteria.
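A minimal sketch of the PSM step on simulated data (confounder names and the 1:1 nearest-neighbour scheme with replacement are illustrative assumptions, not the study's exact procedure):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 3))                           # hypothetical confounders
logits = X @ np.array([0.5, 0.3, -0.2])
treated = rng.binomial(1, 1 / (1 + np.exp(-logits)))  # early thiamine yes/no

# Step 1: propensity score = P(early thiamine | confounders)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: 1:1 nearest-neighbour matching on the propensity score (with replacement;
# real analyses often add a caliper and match without replacement)
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
_, pos = nn.kneighbors(ps[t_idx].reshape(-1, 1))
matched_pairs = list(zip(t_idx, c_idx[pos.ravel()]))
print(f"{len(matched_pairs)} matched pairs")
```

Outcomes such as in-hospital mortality would then be compared within the matched cohort, for example via the logistic regression model the abstract describes.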
Normative data are essential for neuropsychological evaluations, but they are scarce for Mandarin-speaking populations, despite Mandarin being the language with the most native speakers. Several normative data studies have been reported in recent years for Mandarin speakers residing in different countries and regions (e.g., mainland China, Taiwan, and Singapore). This review aims to serve as a reference guide to appropriate norms when working with a Mandarin-speaking patient and to guide future endeavors in test validation and development in areas where studies to date fall short.
Method:
We conducted a systematic review utilizing the PsycInfo, PubMed, and China Knowledge Resource Integrated databases, as well as an additional literature search through citations. We evaluated the existing norms based on their test selection, cognitive domains covered, sample size, language, regions of participant recruitment, stratification by age/gender/education levels, and reporting of other psychometric properties. We focused on articles that included performance-based tests for adults but excluded those with purely clinical norms or those from commercial publishers.
Results:
We reviewed 1155 articles found through the literature search and identified 43 articles reporting normative data for this population that met our inclusion criteria. Sixty-five distinct tests and 127 versions were covered. The results are presented in two detailed tables organized by article and by test, respectively.
Conclusions:
We discuss the strengths and limitations of these normative reports. Practitioners are advised to utilize normative data that most closely approximate a test-taker’s cultural and demographic background. Limitations of the current review are also discussed.
The aim of the present study was to compare the rate of preterm birth (PTB) and growth from birth to 18 years between twins conceived by in vitro fertilization (IVF) and twins conceived by spontaneous conception (SC) in mainland China. The retrospective cohort study included 1164 twins resulting from IVF and 25,654 twins conceived spontaneously, of which 494 IVF twins and 6338 SC twins were opposite-sex pairs. PTB, low birth weight (LBW), and growth (length/height and weight) were compared between the two groups at five stages: infancy (0 years), toddler period (1–2 years), preschool (3–5 years), primary or elementary school (6–11 years), and adolescence (10–18 years). Few statistically significant differences were found in LBW and growth between the two groups after adjusting for PTB and other confounders. Twins born by IVF faced an increased risk of PTB compared with those born by SC (adjusted odds ratio [aOR] 8.21, 95% confidence interval [CI] [3.19, 21.13], p < .001 in all twins and aOR 10.12, 95% CI [2.32, 44.04], p = .002 in opposite-sex twins). Twins born by IVF experienced similar growth across the five stages (0–18 years) compared with those born by SC. PTB risk, however, is significantly higher for twins conceived by IVF than for those conceived by SC.
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
Methods:
263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) by study group were determined by Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
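A sketch of this kind of group comparison on simulated biomarker values (the distributions below are invented, sized to match the four study groups):

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(7)
# Hypothetical CSF MCP-1 concentrations (pg/ml) for the four groups
hiv_neg_nonuser  = rng.lognormal(6.2, 0.4, 65)
hiv_pos_nonuser  = rng.lognormal(6.4, 0.4, 105)
hiv_pos_moderate = rng.lognormal(6.3, 0.4, 62)
hiv_pos_daily    = rng.lognormal(6.1, 0.4, 31)

# Kruskal-Wallis: rank-based test for any distributional difference across groups
h, p = kruskal(hiv_neg_nonuser, hiv_pos_nonuser, hiv_pos_moderate, hiv_pos_daily)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")
```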
Results:
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Conclusions:
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis, and these chemokines were linked to the cognitive domain of learning, which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in mitigating the persistent inflammation and cognitive impacts of HIV.
Several studies have supported the usefulness of “the surprise question” for predicting the 1-year mortality of patients. “The surprise question” requires a “Yes” or “No” answer to the question “Would I be surprised if this patient died in [specific time frame]?” However, the 1-year time frame is often too long for advanced cancer patients seen by palliative care personnel. “The surprise question” with shorter time frames is needed for decision making. We examined the accuracy of “the surprise question” for 7-day, 21-day, and 42-day survival in hospitalized patients admitted to palliative care units (PCUs).
Method
This was a prospective multicenter cohort study of 130 adult patients with advanced cancer admitted to 7 hospital-based PCUs in South Korea. The accuracy of “the surprise question” was compared with that of the temporal question for the clinician’s prediction of survival.
Results
We analyzed 130 inpatients who died in PCUs during the study period. The median survival was 21.0 days. The sensitivity, specificity, and overall accuracy for the 7-day “the surprise question” were 46.7, 88.7, and 83.9%, respectively. The sensitivity, specificity, and overall accuracy for the 7-day temporal question were 6.7, 98.3, and 87.7%, respectively. The c-indices of the 7-day “the surprise question” and 7-day temporal question were 0.662 (95% CI: 0.539–0.785) and 0.521 (95% CI: 0.464–0.579), respectively. The c-indices of the 42-day “the surprise question” and 42-day temporal question were 0.554 (95% CI: 0.509–0.599) and 0.616 (95% CI: 0.569–0.663), respectively.
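These metrics follow directly from the 2 × 2 confusion table; a sketch with hypothetical counts (chosen here so that the toy example reproduces the reported 7-day “the surprise question” percentages, not taken from the paper):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity, and overall accuracy from confusion counts."""
    sensitivity = tp / (tp + fn)       # proportion of deaths correctly predicted
    specificity = tn / (tn + fp)       # proportion of survivors correctly predicted
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for a 7-day prediction question in a 130-patient cohort
sens, spec, acc = diagnostic_metrics(tp=7, fp=13, fn=8, tn=102)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, accuracy {acc:.1%}")
```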
Significance of results
Surprisingly, “the surprise questions” and temporal questions had similar accuracies. The high specificities for the 7-day “the surprise question” and 7- and 21-day temporal question suggest they may be useful to rule in death if positive.
Fruit intake may influence gestational diabetes mellitus (GDM) risk. However, prospective evidence remains controversial and limited. The current study aimed to investigate whether total fruit and specific fruit intake influence GDM risk.
Design:
A prospective cohort study was conducted. Dietary information was collected by a 3-d 24-h dietary recall. All participants underwent a standard 75-g oral glucose tolerance test at 24–28 gestational weeks. Log-binomial models were used to estimate the association between fruit intake and GDM risk, and the results are presented as relative risks (RR) and 95 % CI.
Setting:
Southwest China.
Participants:
In total, 1453 healthy pregnant women recruited in 2017.
Results:
Total fruit intake was not associated with lower GDM risk (RR = 1·03; 95 % CI 0·83, 1·27; Ptrend = 0·789). The RR of GDM risk was 0·73 for the highest anthocyanin-rich fruit intake quartile compared with the lowest quartile (95 % CI 0·56, 0·93; Ptrend = 0·015). A higher grape intake had a linear inverse association with GDM risk (Q4 v. Q1: RR = 0·65; 95 % CI 0·43, 0·98; Ptrend = 0·044), and after further adjustment for anthocyanin intake, the inverse association tended to be non-linear (Q4 v. Q1: RR = 0·65; 95 % CI 0·44, 0·98; Ptrend = 0·079). However, we did not find an association between glycaemic index-grouped fruit, glycaemic load-grouped fruit or other fruit subtype intake and GDM risk.
Conclusions:
In conclusion, specific fruit intake (particularly anthocyanin-rich fruit and grapes) but not total fruit intake was inversely associated with GDM risk.
To explore the relationship between parameters of Na and K excretion, measured using 24-h urine samples, and mild cognitive impairment (MCI) in a general population.
Design:
This is a cross-sectional study.
Setting:
Community-based general population in Emin, China.
Participants:
In total, 1147 subjects aged ≥18 years were selected using a multistage proportional random sampling method. Cognitive status was assessed with the Mini-Mental State Examination (MMSE) questionnaire, and timed 24-h urine specimens were collected. Finally, 561 participants aged ≥35 years with complete urine samples and MMSE data were included in the current analysis and divided into groups by tertiles of the 24-h urinary sodium-to-potassium ratio (24-h UNa/K): lowest (T1), middle (T2) and highest (T3).
Results:
The MMSE score was significantly lower in the T3 than in the T1 group (25·0 v. 26·0, P = 0·002), and the prevalence of MCI was significantly higher in T3 than in T1 (25·8 % v. 11·7 %, P < 0·001). In multiple linear regression, 24-h UNa/K (β: −0·184; 95 % CI −0·319, −0·050; P = 0·007) was negatively associated with MMSE score. In multivariable logistic regression, compared with the T1 group, the T2 and T3 groups showed 2·01-fold (95 % CI 1·03, 3·93; P = 0·041) and 3·38-fold (95 % CI 1·77, 6·44; P < 0·001) odds for the presence of MCI, even after adjustment for confounders. These associations were stronger in a sensitivity analysis excluding individuals taking anti-hypertensive agents.
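A sketch of the tertile-based logistic model on simulated data (the variable names and the single adjustment covariate are illustrative; the study adjusted for a fuller set of confounders):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 561
df = pd.DataFrame({
    "na_k": rng.gamma(4.0, 1.0, n),     # hypothetical 24-h UNa/K values
    "age": rng.normal(52, 10, n),
})
df["tertile"] = pd.qcut(df["na_k"], 3, labels=["T1", "T2", "T3"])
logit_p = -2.0 + 0.02 * (df["age"] - 52) + 0.9 * (df["tertile"] == "T3")
df["mci"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # simulated MCI status

X = pd.get_dummies(df["tertile"], drop_first=True).astype(float)  # T1 = reference
X["age"] = df["age"]
res = sm.Logit(df["mci"], sm.add_constant(X)).fit(disp=0)
print(np.exp(res.params))        # odds ratios for T2 and T3 v. T1, age-adjusted
print(np.exp(res.conf_int()))    # 95% CIs
```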
Conclusions:
Higher 24-h UNa/K is independently associated with prevalent MCI.
We aimed to examine the association between low-carbohydrate diet (LCD) scores during the first trimester and gestational diabetes mellitus (GDM) risk in a Chinese population. A total of 1455 women were included in 2017. Dietary information during the first trimester was collected by 24-h dietary recalls for 3 d. The overall, animal and plant LCD scores, which indicated adherence to different low-carbohydrate dietary patterns, were calculated. GDM was diagnosed based on the results of a 75-g, 2-h oral glucose tolerance test at 24–28 weeks gestation. Log-binomial models were used to estimate relative risks (RR) and 95 % CI. The results showed that the multivariable-adjusted RR of GDM from the lowest to the highest quartiles of the overall LCD score were 1·00 (reference), 1·15 (95 % CI 0·92, 1·42), 1·30 (95 % CI 1·06, 1·60) and 1·24 (95 % CI 1·01, 1·52) (P = 0·026 for trend). Multivariable-adjusted RR (95 % CI) of GDM from the lowest to the highest quartiles of the animal LCD score were 1·00 (reference), 1·20 (95 % CI 0·96, 1·50), 1·41 (95 % CI 1·14, 1·73) and 1·29 (95 % CI 1·04, 1·59) (P = 0·002 for trend). After additional adjustment for gestational weight gain before GDM diagnosis, the association of the overall LCD score with GDM risk was non-significant, while the association of animal LCD score with GDM risk remained significant. In conclusion, a low-carbohydrate dietary pattern characterised by high animal fat and protein during the first trimester is associated with an increased risk of GDM in Chinese women.
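For context, a common construction of such LCD scores is sketched below on simulated intakes; the 11-stratum scoring shown here follows the usual literature convention and may differ in detail from this study's method:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Hypothetical percentages of energy from each macronutrient in the first trimester
df = pd.DataFrame({
    "carb_pctE": rng.normal(55, 8, 1455),
    "fat_pctE": rng.normal(30, 6, 1455),
    "protein_pctE": rng.normal(15, 3, 1455),
})

def stratum_points(series: pd.Series, ascending: bool) -> pd.Series:
    """Rank participants into 11 strata scored 0-10."""
    ranks = pd.qcut(series.rank(method="first"), 11, labels=False)
    return ranks if ascending else 10 - ranks

# Lower carbohydrate and higher fat/protein earn more points (overall score 0-30);
# the animal and plant variants would score animal- or plant-source fat and protein only.
df["lcd_score"] = (
    stratum_points(df["carb_pctE"], ascending=False)
    + stratum_points(df["fat_pctE"], ascending=True)
    + stratum_points(df["protein_pctE"], ascending=True)
)
print(df["lcd_score"].describe())
```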
We quantitatively assessed the fit failure rate of N95 respirators according to the number of donnings/doffings and hours worn.
Design:
Experimental study.
Setting:
A tertiary-care referral center in South Korea.
Participants:
In total, 10 infection control practitioners participated in the fit test.
Methods:
The first experiment comprised 4 consecutive 1-hour donnings and fit tests between each donning. The second experiment comprised 2 consecutive 3-hour donnings and fit tests between each donning. The final experiment comprised fit tests after a 1-hour donning or a 2-hour donning.
Results:
For 1-hour donnings, 60%, 70%, and 90% of the participants had fit failures after 2, 3, and 4 consecutive donnings, respectively. For 3-hour donnings, 50% had fit failure after the first donning and 70% had failures after 2 consecutive donnings. All participants passed the fit test after refitting whenever fit failure occurred. The final experiment showed that 50% had fit failure after a single use of 1 hour, and 30% had fit failure after a single use of 2 hours.
Conclusions:
High fit-failure rates were recorded after repeated donning and extended use of N95 respirators. Caution is needed for reuse (≥1 time) and extended use (≥1 hour) of N95 respirators in high-risk settings such as those involving aerosol-generating procedures. Although adequate refitting may recover the fit factor, the use of clean gloves and strict hand hygiene afterward should be ensured when touching the outer surfaces of N95 respirators for refitting.
A study was conducted to identify whether composted manure and straw amendments (replacement of a portion of chemical fertilizer [50% of the total nitrogen application] with composted pig manure, and straw return [all straw from the previous rice crop] combined with chemical fertilizer) compared with no fertilization and chemical fertilizer only would change the dominant species of wheat-associated weeds as well as influence their growth and seed yield in a rice (Oryza sativa L.)–wheat (Triticum aestivum L.) rotation system. The study was initiated in 2010, and the treatment effects on the species, density, plant height, shoot biomass, seed yield of dominant weeds, and wheat yields were assessed in 2017 and 2018. Fertilization significantly increased the height, density, and yield of wheat, as well as the shoot biomass of wheat-associated weeds, but decreased the weed species number. A total of 17 and 14 weed species were recorded in the experimental wheat fields in 2017 and 2018, respectively. The most dominant weed species were American sloughgrass [Beckmannia syzigachne (Steud.) Fernald] and catchweed bedstraw (Galium aparine L.), which made up more than 64% of the weed community in all treatments. When the chemical fertilizer application was amended with pig manure compost and straw return, the relative abundance of B. syzigachne significantly decreased, while the relative abundance of G. aparine significantly increased. The application of the chemical fertilizer-only treatment resulted in increases in the density, shoot biomass, and seed yield of B. syzigachne, while the composted manure and straw amendments applied together with chemical fertilizer led to significant increases in the density, shoot biomass, and seed yield of G. aparine. Consequently, further research on ways to promote greater cropping system diversity will be needed to prevent the selection of weed species that are adapted to a limited suite of crop management practices.
Radiocarbon (14C) dating has been widely used to determine the age of deposits, but there have been frequent reports of age inconsistencies among different dating materials. In this study, we performed radiocarbon dating on a total of 33 samples from 8-m-long sediment cores recovered from the wetland of the Muljangori volcanic cone on Jeju Island, South Korea. Ten pairs of humic acid (HA) and plant fragment (PF) samples, and three pairs of HA and humin samples, from the same depths were compared in terms of age. The PF were consistently younger than the HA. Interestingly, the age difference between HA and PF samples showed a long-term change during the past 8000 years. To test whether this long-term age difference was associated with climate change, we compared it with the carbon/nitrogen (C/N) ratios and total organic carbon isotope (δ13CTOC) values of the sediments, as indicators of the relative abundance of terrestrial and aquatic plants; these parameters showed similar long-term trends. This suggests that the increasing (decreasing) trend in the age difference was influenced by long-term dry (wet) climate change.
A series of new synthetic armored cables were developed and tested to ensure that they were suitable for use with the RECoverable Autonomous Sonde (RECAS), a newly designed freezing-in thermal ice probe. The final version of the cable consists of two concentric conductors that serve as the power and signal lines. Two polyfluoroalkoxy jackets are used for electrical insulation (one between the conductors, the other over the outer conductor). The outer insulation layer is coated with a polyurethane jacket to seal the connections between the cable and electrical units. The 0.65-mm-thick strength member is made from woven aramid fibers, which are held in place by a sheathing layer of polyamide fabric cover net. The outer diameter of the final version of the cable is ~6.1 mm. The permissible bending radius is as low as 17–20 mm. The maximum breaking force under straight tension is ~12.2 kN. The cable weight is only ~0.061 kg m−1. The mechanical and electrical properties and environmental suitability of the cable were determined through laboratory testing and joint testing with the probe.