While past research suggested that living arrangements are associated with suicide death, no study has examined the impact of sustained living arrangements or of changes in living arrangements. In addition, previous survival analysis studies reported only a single hazard ratio (HR), whereas the actual HR may change over time. We aimed to address these limitations using causal inference approaches.
Methods
Multi-point data from a general Japanese population sample were used. Participants reported their living arrangements twice within a 5-year interval. After that, suicide death, non-suicide death and all-cause mortality were evaluated over 14 years. We used inverse probability weighted pooled logistic regression and cumulative incidence curves to evaluate the association of time-varying living arrangements with suicide death. We also studied non-suicide death and all-cause mortality to contextualize the association. Missing data for covariates were handled using random forest imputation.
Results
A total of 86,749 participants were analysed, with a mean age (standard deviation) of 51.7 (7.90) years at baseline. Of these, 306 died by suicide during the 14-year follow-up. Persistently living alone was associated with an increased risk of suicide death (risk difference [RD]: 1.1%, 95% confidence interval [CI]: 0.3–2.5%; risk ratio [RR]: 4.00, 95% CI: 1.83–7.41), non-suicide death (RD: 7.8%, 95% CI: 5.2–10.5%; RR: 1.56, 95% CI: 1.38–1.74) and all-cause mortality (RD: 8.7%, 95% CI: 6.2–11.3%; RR: 1.60, 95% CI: 1.42–1.79) at the end of follow-up. The cumulative incidence curves showed that these associations were consistent throughout the follow-up. Across all types of mortality, the increased risk was smaller for those who started to live with someone and for those who transitioned to living alone. The results remained robust in sensitivity analyses.
Conclusions
Individuals who persistently live alone have an increased risk of suicide death as well as non-suicide death and all-cause mortality, whereas this impact is weaker for those who change their living arrangements.
The home-field advantage (HFA) hypothesis posits that plant litter decomposes faster at ‘home’ sites than at ‘away’ sites because more specialized decomposers act at home sites. This hypothesis has predominantly been tested through ‘yes or no’ transplanting experiments, in which the litter decomposition of a focal species is quantified near and away from its conspecifics. Here, we evaluated the occurrence and magnitude of home-field effects on the leaf litter decomposition of Myrcia ramuliflora (O.Berg) N. Silveira (Myrtaceae) along a natural gradient of conspecific litterfall input, and whether home-field effects are affected by litter and soil traits. Litter decomposition of M. ramuliflora was assessed through litterbags placed in 39 plots in tropical heath vegetation over a period of 12 months. We also characterized abiotic factors, litter layer traits and litter diversity. Our results indicated the occurrence of positive (i.e. home-field advantage) and negative (i.e. home-field disadvantage) effects in more than half of the plots. Positive and negative effects occurred at a similar frequency and magnitude. Among all predictors tested, only the community-weighted mean C/N ratio of the litterfall input was associated with home-field effects. Our results reinforce the lack of generality for home-field effects reported in the literature and thus challenge our understanding of litter-decomposer interactions in tropical ecosystems.
To investigate factors that influence antibiotic prescribing decisions, we interviewed 49 antibiotic stewardship champions and stakeholders across 15 hospitals. We conducted thematic analysis and subcoding of decisional factors. We identified 31 factors that influence antibiotic prescribing decisions. These factors may help stewardship programs identify educational targets and design more effective interventions.
Red ear syndrome is a rare disorder in which the colour of the ear suddenly becomes red, with discomfort, pain and a burning sensation. This paper reports a case of primary red ear syndrome presenting with vestibular migraine.
Case report
A 39-year-old woman from Bangladesh reported dizziness and repeated headaches experienced since 18 years of age. She first attended our hospital with dizziness at 34 years of age. When she was dizzy, her right ear sometimes turned red. She was therefore diagnosed with red ear syndrome with vestibular migraine.
Conclusion
This patient experienced repeated episodes of a red ear with discomfort, leading to the diagnosis of red ear syndrome. In addition, she had repeated dizziness and headaches, and was also diagnosed with vestibular migraine. The diagnosis of red ear syndrome with vestibular migraine should be considered in cases of dizziness and headache with recurrent redness of the ear.
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
Design:
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
Setting:
The study included 6 acute-care hospitals within the Veterans’ Health Administration across the United States.
Participants:
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
Methods:
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
Results:
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not significantly decrease relative to the preintervention period (5.9% decrease; P = .80) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conclusions:
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.
Research has shown that ADHD symptoms and functional impairment often persist beyond childhood into adulthood. Thus, an effective therapy that can be tolerated over long-term use in adults is needed. This is the first long-term safety and tolerability study of an adult ADHD medication in Asia.
Objectives:
Assess long-term safety, tolerability, and efficacy of atomoxetine (ATX) in adult Japanese ADHD patients.
Aims:
Demonstrate the safety and tolerability of long-term ATX.
Methods:
ATX (40-120 mg/day) was evaluated based on integrated analyses of a 10-week double-blind (DB) study and a 48-week open-label long-term (LT) extension study. Long-term safety and tolerability were assessed by adverse events, discontinuation rate and vital signs. Efficacy measures included change from baseline in the Conners’ Adult ADHD Rating Scale-Investigator Rated (CAARS-Inv:SV) total symptom score, the Behavior Rating Inventory of Executive Function (BRIEF-A) and the Adult ADHD Quality of Life Measure (AAQoL).
Results:
A total of 233 patients took ATX (LT mean final prescribed dose: 108.3 mg/day). Adverse events leading to discontinuation were seen in 37 patients (15.9%), the most common being nausea in 10 patients (4.3%). Statistically significant baseline-to-endpoint reductions in mean CAARS-Inv:SV total symptom score in the DB study continued throughout the LT study. Similar reductions were seen in BRIEF-A Self-Report scores. These findings, along with the AAQoL results, indicated that patients perceived improvements in both QoL and executive function.
Conclusions:
Long-term ATX treatment was shown to be generally safe and well tolerated in Japanese adult ADHD patients. Results also suggested that ATX improved ADHD core symptoms, QoL and executive function.
Although most hospitals report very high levels of hand hygiene compliance (HHC), the accuracy of these overtly observed rates is questionable due to the Hawthorne effect and other sources of bias. In this study, we aimed (1) to compare HHC rates estimated using the standard audit method of overt observation by a known observer and a new audit method that employed a rapid (<15 minutes) “secret shopper” approach and (2) to pilot test a novel feedback tool.
Design
Quality improvement project using a quasi-experimental stepped-wedge design.
Setting
This study was conducted in 5 acute-care hospitals (17 wards, 5 intensive care units) in the Midwestern United States.
Methods
Sites recruited a hand hygiene observer from outside the acute-care units to rapidly and covertly observe entry and exit HHC during the study period, October 2016–September 2017. After 3 months of observations, sites received a monthly feedback tool that communicated HHC information from the new audit method.
Results
The absolute difference in HHC estimates between the standard and new audit methods was ~30%. No significant differences in HHC were detected between the baseline and feedback phases (OR, 0.92; 95% CI, 0.84–1.01), but the standard audit method had significantly higher estimates than the new audit method (OR, 9.83; 95% CI, 8.82–10.95).
Conclusions
HHC estimates obtained using the new audit method were substantially lower than estimates obtained using the standard audit method, suggesting that the rapid, secret-shopper method is less subject to bias. Providing feedback using HHC from the new audit method did not seem to impact HHC behaviors.
NUMO and JAEA have been conducting joint research since FY2011 aimed at enhancing the methodology of repository design and performance assessment in the preliminary investigation stage for the deep geological disposal of high-level radioactive waste. As part of this joint research, we have been developing glass dissolution models that include various processes arising from glass-overpack-bentonite buffer interaction, considering the precipitation of Fe-silicates associated with steel overpack corrosion and Si transport through the altered layer of the glass. The objectives of this modeling work are to comprehensively estimate the lifetime of the vitrified waste, i.e. the glass matrix dissolution timescale, through sensitivity analysis, to identify the feature/process that most strongly influences the lifetime, and to identify future R&D issues that would help to improve the confidence of the nuclide transport analysis and the safety case in future. The sensitivity analysis suggested that the duration of glass dissolution might range from 3.8×10³ to 1.9×10⁵ years. The results also indicated that the precipitation of Fe-silicate has the strongest influence on the long-term behavior of the vitrified waste.
Meiotic maturation of oocytes requires a variety of ATP-dependent reactions, such as germinal vesicle breakdown, spindle formation and rearrangement of plasma membrane structure, which is required for fertilization. Mitochondria are accordingly expected to be localized to subcellular sites of energy utilization. Although microtubule-dependent cellular traffic of mitochondria has been studied extensively in cultured neuronal (and some other somatic) cells, the molecular mechanism of their dynamics in mammalian oocytes at different stages of maturation remains obscure. The present work describes dynamic aspects of mitochondria in porcine oocytes at the germinal vesicle stage. After incubation of oocytes with MitoTracker Orange followed by centrifugation, mitochondria-enriched ooplasm was obtained using a glass needle and transferred into a recipient oocyte. The intracellular distribution of the fluorescent mitochondria was then observed over time using a laser scanning confocal microscope equipped with an incubator. Kinetic analysis revealed that fluorescent mitochondria moved from central to subcortical areas of oocytes and were dispersed along plasma membranes. Such movement of mitochondria was inhibited by either cytochalasin B or cytochalasin D but not by colcemid, suggesting the involvement of microfilaments. This method of visualizing mitochondrial dynamics in live cells permits study of the pathophysiology of cytoskeleton-dependent intracellular traffic of mitochondria and associated energy metabolism during meiotic maturation of oocytes.
The objective of this study was to determine the pattern of energy metabolite net flux across the portal-drained viscera (PDV) and total splanchnic tissues (TSP) in mature sheep fed varying levels of lucerne hay cubes. Four mature Suffolk sheep (61.4 ± 3.6 kg BW) surgically fitted with multi-catheters were fed four levels of dry matter intake (DMI) of lucerne hay cubes ranging from 0.4- to 1.6-fold the metabolizable energy (ME) requirements for maintenance. Six sets of blood samples were simultaneously collected from arterial and venous catheters at 30-min intervals. With increasing DMI, apparent total tract digestibility increased linearly and quadratically for dry matter (P < 0.05), quadratically (P < 0.05) with a linear tendency (P < 0.1) for organic matter, and tended to increase quadratically (P < 0.1) for NDF. PDV release of volatile fatty acids (VFA) and β-hydroxybutyric acid was relatively low at 0.4 M and then increased linearly (P < 0.05) with increasing DMI. Net PDV flux of non-esterified fatty acids showed a curvilinear decrease from 0.4 to 1.2 M and then increased at 1.6 M. The respective proportions of each VFA appearing in the portal blood differed (P < 0.05) with DMI, and this difference was more obvious from 0.4 to 0.8 M than from 0.8 to 1.6 M. Heat production, as a percentage of ME intake (MEI), decreased linearly (P < 0.05) with increasing DMI, accounting for 37%, 21%, 16% and 13% for PDV and 62%, 49%, 33% and 27% for TSP at 0.4, 0.8, 1.2 and 1.6 M, respectively. As a proportion of MEI, total energy recovery including heat production decreased linearly with increasing DMI (P < 0.05), accounting for 113%, 83%, 62% and 57% for PDV and 140%, 129%, 86% and 83% for TSP at 0.4, 0.8, 1.2 and 1.6 M, respectively. Regression analysis revealed a linear response between MEI (MJ/day per kg BW) and total energy release (MJ/day per kg BW) across the PDV and TSP, respectively.
However, the respective contributions of energy metabolites to net energy release across the PDV and TSP were highly variable among treatments and did not follow the same pattern of changes in DMI.
We successfully obtained the first optical spectra of the faint light echoes around Cassiopeia A and Tycho Brahe's supernova remnants (SNRs) with FOCAS on the Subaru Telescope. We conclude that Cas A and Tycho's SN 1572 belong to the Type IIb and normal Type Ia supernova classes, respectively. Light echo spectra are important for gaining further insight into the supernova explosion mechanism of Tycho's SN 1572: how the Type Ia explosion actually proceeds, and whether accretion occurs from a companion or through the merging of two white dwarfs. The proximity of the SN 1572 remnant has allowed detailed studies, such as the possible identification of the binary companion, and provides a unique opportunity to test theories of the explosion mechanism and the nature of the progenitor. Future light-echo spectra, obtained in different spatial directions of SN 1572, will enable the construction of a three-dimensional spectroscopic view of the explosion.
We present the first reported case of primary small cell carcinoma of the lacrimal sac.
Case report:
A 67-year-old Japanese woman was referred to our department with a two-month history of left medial canthal swelling, epiphora and occasional nasal bleeding. Nasal endoscopy revealed a readily bleeding tumour in the left inferior meatus. Computed tomography and magnetic resonance imaging scans demonstrated that the tumour was mainly located in the left lacrimal sac. Histopathological studies of a biopsy specimen revealed small cell carcinoma. The patient was treated with four cycles of chemotherapy consisting of cisplatin and etoposide, in combination with radiotherapy. There was no evidence of recurrence or metastasis for five years.
Conclusion:
Small cell carcinoma originating in the head and neck region has been reported to be highly aggressive and to have a poor prognosis. We report a case of primary small cell carcinoma of the lacrimal sac successfully treated with chemo-radiotherapy.
A total of 6346 swine sera collected at an abattoir in the city of Obihiro, Hokkaido during the years 1978–87 were tested for the presence of antibodies to swine and human influenza viruses. A high incidence of antibody to A/New Jersey/8/76 (swine type H1N1) virus was observed throughout the 10 years except for the occasional month and a single long period of 15 months. Antibodies to human H3N2 virus in swine appeared to be related to the epidemics of human influenza which occurred in the study area during the years 1980–3, but unrelated to the epidemics during the years 1984–7. A large number of swine were found to be antibody positive to a human H1N1 virus during the period April to June 1984, and a smaller number during the period November 1986 to June 1987. Both were in relation to human influenza epidemics. However, there were long periods in which human H1N1 antibodies in swine could not be found.
The first occurrence of swine influenza in Japan was recognized in 1977, when it was presumed that the disease was introduced via imported swine (Shibata et al. 1978). Further outbreaks of swine influenza and a high prevalence of antibody to the virus in Japanese swine populations have been reported by several workers (Yamane, Sukeno & Ishida, 1978; Sugimura et al. 1981; Ogawa et al. 1983). An outbreak of influenza virus infection due to an H3N2 strain was previously seen in a herd of swine in Osaka, Japan (Sugimura et al. 1975). Later the co-existence of swine (H1N1) and human (H3N2) influenza viruses was confirmed by serological and virological studies on Japanese swine populations (Onta et al. 1978; Sugimura et al. 1980; Arikawa et al. 1982). In a previous report (Miwa et al. 1986), we suggested that the swine became infected with a human H1N1 virus as piglets during an epidemic of influenza which occurred in the human population at the same time. The present study was undertaken to evaluate the changes in the prevalence of antibodies against swine and human influenza viruses in Japanese swine during the past 10 years.
There are few data on circulatory pro-inflammatory cytokine levels and cytokine gene polymorphisms in H. pylori-positive patients. A cross-sectional study was conducted to examine the effects of H. pylori infection, gastric atrophy, and the IL-8 T-251A polymorphism on plasma IL-8 levels in 98 Japanese adults. Seventy-one subjects were positive for H. pylori infection. The geometric mean of plasma IL-8 concentration was significantly higher in subjects with H. pylori infection than in those without (P=0·001). The development of atrophy was negatively associated with IL-8 levels in the H. pylori-positive subjects, although not significantly. Plasma IL-8 levels in the T/T genotype were associated with H. pylori infection and atrophy status (P=0·016). Our findings suggested that circulating IL-8 levels were associated with H. pylori infection. The effect of H. pylori infection on plasma IL-8 levels was not clearly modified by the IL-8 T-251A polymorphism.