Objectives: Medical devices and the hospital environment can easily be contaminated by multidrug-resistant bacteria. The effectiveness of cleaning practices is often suboptimal because environmental cleaning in hospitals is complex and depends on human factors, the physical and chemical characteristics of the environment, and the viability of the microorganisms. Ultraviolet-C (UV-C) lamps can be used to reduce the spread of microorganisms. We evaluated the effectiveness of a UV-C device for terminal room cleaning and disinfection. Methods: The study was conducted in an ICU of a medical center in Taiwan. We performed a 3-stage evaluation of the effectiveness of UV-C radiation: pre–UV-C radiation, UV-C radiation, and a bleaching procedure. The 3 stages of evaluation were implemented in ICU rooms from which a patient had been discharged or transferred. We collected data from adenosine triphosphate (ATP) bioluminescence testing, colonizing strains, and their corresponding colony counts by sampling environmental surfaces and air. We tested 8 high-touch surfaces: the 2 sides of the bed rails, headboards, footboards, bedside tables, monitors, pumping devices, IV stands, and oxygen flow meters. Results: In total, 1,696 environmental surface samples and 72 air samples were analyzed. The levels of ATP bioluminescence and the colony counts of isolated bacteria decreased significantly after UV-C radiation and bleaching disinfection for both environmental and air samples (P < .001). Resistant bacteria (vancomycin-resistant Enterococcus, VRE) were commonly isolated from the hard-to-clean surfaces of monitors, oxygen flow meters, and IV pumps; these organisms were also eradicated (P < .001). Conclusions: UV-C radiation can significantly reduce environmental contamination by multidrug-resistant microorganisms, and UV-C devices are an effective aid to staff in cleaning the hospital environment.
This study compares Chinese people’s trust and trustworthiness, risk attitude, and time preference before and after the onset of the COVID-19 pandemic in China. We compare the preferences of subjects in two online experiments with samples drawn from 31 provinces across mainland China, one conducted before and one after the onset of the pandemic. We test two competing hypotheses regarding trust and trustworthiness. On the one hand, the outbreak, as a collective threat, could enhance in-group cohesion and cooperation and thus increase trust and trustworthiness. On the other hand, to the extent that people expect their future income to decline, they may become more self-protective and self-controlled, and thus less trusting and trustworthy and more risk averse and patient. Comparing the samples before and after the onset, we found that trustworthiness increased. After the onset, trust and trustworthiness (as well as risk aversion and present bias) were positively correlated with the COVID-19 prevalence rate in the provinces. Subjects with more pessimistic expectations about income change showed more risk aversion and lower discount rates, supporting the speculation concerning self-control.
Bipolar disorder is a chronic mental disorder related to cognitive deficits. Low serum vitamin D levels are significantly associated with compromised cognition in neuropsychiatric disorders. Although patients with bipolar disorder frequently exhibit hypovitaminosis D, the association between vitamin D and cognition in bipolar disorder, and their neuroaxonal integrity, is unclear.
Aims
To investigate the interaction effects between vitamin D and neurofilament light chain (NfL) levels on cognitive domains in bipolar disorder.
Method
Serum vitamin D and NfL levels were determined in 100 euthymic patients with bipolar disorder in a cross-sectional study. Cognitive function was measured with the Brief Assessment of Cognition in Affective Disorders. We stratified by age groups and used general linear models to identify associations between vitamin D and NfL levels and their interaction effects on cognitive domains.
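As a rough illustration of the kind of stratified general linear model with an interaction term described above (a sketch under assumed variable names such as composite_score, vitd, nfl and age_group, not the authors' actual analysis code):

```python
# Hypothetical sketch: general linear model with a vitamin D x NfL
# interaction, fitted separately within each age stratum.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bipolar_cognition.csv")  # assumed columns: composite_score, vitd, nfl, age_group

for group, sub in df.groupby("age_group"):
    # The vitd:nfl term tests whether the NfL-cognition association
    # depends on the vitamin D level within each age stratum.
    fit = smf.ols("composite_score ~ vitd * nfl", data=sub).fit()
    print(group)
    print(fit.summary().tables[1])
```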
Results
The mean vitamin D and NfL levels were 16.46 ng/mL and 11.10 pg/mL, respectively; 72% of patients were vitamin D deficient. Among older patients, those with vitamin D deficiency had more frequent hospital admissions and lower physical activity than those without. The age-modified interaction effect of vitamin D and NfL was associated with composite neurocognitive scores and verbal fluency in both age groups, and with the processing speed domain in the younger group.
Conclusions
We observed a high prevalence of vitamin D deficiency in bipolar disorder and identified an interaction effect of vitamin D and NfL on cognitive domains that was modified by age. Longitudinal or randomised controlled studies enrolling patients with various illness durations and mood statuses are required to validate our findings.
Numerous studies of resting-state functional imaging and voxel-based morphometry (VBM) have revealed differences in specific brain regions of patients with bipolar disorder (BD), but the results have been inconsistent.
Methods
A whole-brain voxel-wise meta-analysis was conducted on resting-state functional imaging and VBM studies that compared differences between patients with BD and healthy controls using Seed-based d Mapping with Permutation of Subject Images software.
Results
A systematic literature search identified 51 functional imaging studies (1842 BD and 2190 controls) and 83 VBM studies (2790 BD and 3690 controls). Overall, patients with BD displayed increased resting-state functional activity in the left middle frontal gyrus, right inferior frontal gyrus (IFG) extending to the right insula, right superior frontal gyrus and bilateral striatum, as well as decreased resting-state functional activity in the left middle temporal gyrus extending to the left superior temporal gyrus and post-central gyrus, left cerebellum, and bilateral precuneus. The meta-analysis of VBM showed that patients with BD displayed decreased VBM in the right IFG extending to the right insula, temporal pole and superior temporal gyrus, left superior temporal gyrus extending to the left insula, temporal pole, and IFG, anterior cingulate cortex, left superior frontal gyrus (medial prefrontal cortex), left thalamus, and right fusiform gyrus.
Conclusions
The multimodal meta-analyses suggest that patients with BD show convergent patterns of aberrant brain activity and structure in the insula extending to the temporal cortex, in fronto-striatal-thalamic circuits, and in default-mode network regions, which provides useful insights into the underlying pathophysiology of BD.
Problematic internet use, especially in people with substance use disorder, may negatively affect their quality of life (QoL). However, it is unclear whether sleep quality is a key mediator in the association between problematic internet use and QoL among people with substance use disorder.
Aims
This study aimed to investigate the relationship between problematic internet use and QoL and how sleep quality may mediate the association between these two variables.
Method
Overall, 319 people (85% male) with substance use disorder (mean age 42.2 years, s.d. 8.9) participated in a cross-sectional study in Taiwan. The Smartphone Application-Based Addiction Scale, Bergen Social Media Addiction Scale, Internet Gaming Disorder Scale-Short Form, Pittsburgh Sleep Quality Index and World Health Organization Quality of Life Questionnaire Brief Version were used.
Results
The prevalence of sleep problems was 56%. There were significant and direct associations between sleep quality and two types of problematic internet use, and between sleep quality and different dimensions of QoL. All types of problematic internet use were significantly and negatively correlated with QoL. Mediated effects of sleep quality in relationships between the different types of problematic internet use and all dimensions of QoL were significant, except for problematic use of social media.
Conclusions
Different types of problematic internet use in people with substance use disorder may be directly associated with reduced QoL. Sleep quality as a significant mediator in this association may be an underlying mechanism to explain pathways between problematic internet use and QoL in this population.
To clarify the concept of disruptive technologies in health care, provide examples and consider implications of potentially disruptive technologies for health technology assessment (HTA).
Methods
We conducted a systematic review of conceptual and empirical papers on healthcare technologies that are described as “disruptive.” We searched MEDLINE and Embase from 2013 to April 2019 (updated in December 2021). Data extraction was done in duplicate by pairs of reviewers utilizing a data extraction form. A qualitative data analysis was undertaken based on an analytic framework for analysis of the concept and examples. Key arguments and a number of potential predictors of disruptive technologies were derived and implications for HTA organizations were discussed.
Results
Of 4,107 records, 28 were included in the review. Most of the included papers offered conceptual discussions and business models for disruptive technologies; only a few presented empirical evidence. The majority of the evidence relates to the US healthcare system. Key arguments for describing a technology as disruptive include improvement of outcomes for patients, improved access to health care, reduction of costs and better affordability, a shift in responsibilities between providers, and changes in the organization of health care. A number of possible predictors of disruption were identified to distinguish disruptive from “sustaining” innovations.
Conclusions
Since truly disruptive technologies could radically change technology uptake and may modify provision of care patterns or treatment paths, they require a thorough evaluation of the consequences of using these technologies, including economic and organizational impact assessment and careful monitoring.
Subthreshold depression (sD) negatively impacts well-being and psychosocial function and is more prevalent than major depressive disorder (MDD). However, because adults with sD are less likely to seek face-to-face intervention, internet-based cognitive-behavioral therapy (ICBT) may overcome barriers to accessing psychotherapy. Although several trials have explored the efficacy of ICBT for sD, the results remain inconsistent. This study evaluated whether ICBT is effective in reducing depressive symptoms among Chinese adults with sD.
Methods
A randomized controlled trial was performed. The participants were randomly assigned to 5 weeks of ICBT, group-based face-to-face cognitive-behavioral therapy (CBT), or a waiting list (WL). Assessments were conducted at baseline, post-intervention and at a 6-month follow-up. The primary outcome measured depressive symptoms using the Center for Epidemiological Studies Depression Scale (CES-D). Outcomes were analyzed using a mixed-effects model to assess the effects of ICBT.
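For readers unfamiliar with the analysis, a minimal mixed-effects model for the repeated CES-D measurements might look like the sketch below; the column names (subject_id, group, time, cesd) and the data file are assumptions, not the trial's actual code.

```python
# Hypothetical sketch: linear mixed-effects model with a random intercept
# per participant and a group-by-time interaction for CES-D scores.
import pandas as pd
import statsmodels.formula.api as smf

long_df = pd.read_csv("cesd_long.csv")  # assumed long format: subject_id, group, time, cesd

m = smf.mixedlm("cesd ~ C(group) * C(time)", data=long_df,
                groups=long_df["subject_id"]).fit()
print(m.summary())
```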
Results
ICBT participants reported greater reductions in all outcomes than the WL group at post-intervention. Compared with the CBT group, the ICBT group showed larger improvements on the Patient Health Questionnaire-9 (PHQ-9) at post-intervention (d = 0.12) and at follow-up (d = 0.10), and on the CES-D at post-intervention (d = 0.06).
Conclusions
ICBT is effective in reducing depressive symptoms among Chinese adults with sD, and improvements in outcomes were sustained at a 6-month follow-up. Considering the low rates of face-to-face psychotherapy, our findings highlight the considerable potential and implications for the Chinese government to promote the use of ICBT for sD in China.
The effects of non-invasive, non-convulsive electrical neuromodulation (NINCEN) on depression, anxiety and sleep disturbance are inconsistent across studies. Previous meta-analyses of transcranial direct current stimulation (tDCS) and cranial electrotherapy stimulation (CES) suggested that these methods are effective for depression. However, not all types of NINCEN were included; results on anxiety and sleep disturbance were lacking, and the influence of different populations and treatment parameters was not fully analyzed. We searched PubMed, Embase, PsycINFO, PsycARTICLES and CINAHL before March 2021 and included published randomized clinical trials of all types of NINCEN for symptoms of depression, anxiety and sleep in clinical and non-clinical populations. Data were pooled using a random-effects model. The main outcome was change in the severity of depressive symptoms after NINCEN treatment. A total of 58 studies of NINCEN were included in the meta-analysis. Active tDCS showed a significant effect on depressive symptoms (Hedges' g = 0.544), anxiety (Hedges' g = 0.667) and response rate (odds ratio = 1.9594) compared with sham control. CES also had a significant effect on depression (Hedges' g = 0.654) and anxiety (Hedges' g = 0.711). For all types of NINCEN, active stimulation was significantly effective for depression, anxiety, sleep efficiency, sleep latency, total sleep time, etc. Our results showed that tDCS has significant effects on both depression and anxiety and that these effects are robust across different populations and treatment parameters. The realistic expectation for the tDCS effect is ‘response’ rather than ‘remission’. CES is also effective for depression and anxiety, especially in patients with disorders of low severity.
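To illustrate the random-effects pooling step (a sketch with made-up effect sizes, not the meta-analysis data), per-study Hedges' g values can be combined with a DerSimonian-Laird estimator as follows:

```python
# Hypothetical sketch: DerSimonian-Laird random-effects pooling of Hedges' g.
import numpy as np

g = np.array([0.62, 0.41, 0.80, 0.35])      # per-study effect sizes (illustrative)
v = np.array([0.050, 0.040, 0.090, 0.030])  # per-study sampling variances (illustrative)

w = 1.0 / v
fixed_mean = np.sum(w * g) / np.sum(w)
q = np.sum(w * (g - fixed_mean) ** 2)                # Cochran's Q
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(g) - 1)) / c)              # between-study variance

w_star = 1.0 / (v + tau2)
pooled = np.sum(w_star * g) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled g = {pooled:.3f}, 95% CI ({pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f})")
```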
We aimed to investigate the coronavirus disease 2019 (COVID-19)-related knowledge and practices of cancer patients and to assess their anxiety- and depression-related to COVID-19 during the early surge phase of the pandemic.
Methods:
An online questionnaire survey of cancer patients was conducted from February 10 to 29, 2020. Knowledge and practices related to COVID-19 were assessed using a custom-made questionnaire. The Hospital Anxiety and Depression Scale was used to assess the presence of anxiety and depression, with scores above 7 indicating anxiety or depressive disorder. Univariate and multiple linear regression analyses were used to identify high-risk groups according to knowledge, practices, anxiety, and depression scores.
Results:
A total of 341 patients were included. The rates of a lower level of knowledge and of practices were 49.9% and 18.8%, respectively. An education level of junior high school or lower was significantly associated with lower knowledge scores (β: −3.503; P < 0.001) and lower practices scores (β: −2.210; P < 0.001) compared with college degree and above. The prevalence of anxiety and depression among the respondents was 17.6% and 23.2%, respectively. A higher depression score was associated with older age, being widowed, and lower levels of education, knowledge score, and practices score (P < 0.05).
Conclusions:
Targeted COVID-19-related education interventions are required for cancer patients with a lower level of knowledge to help improve their practices. Interventions are also required to address the anxiety and depression of cancer patients.
Neuroinflammation and brain structural abnormalities are found in bipolar disorder (BD). Elevated levels of cytokines and chemokines have been detected in the serum and cerebrospinal fluid of patients with BD. This study investigated the association between peripheral inflammatory markers and brain subregion volumes in BD patients.
Methods:
Euthymic patients with bipolar I disorder (BD-I) aged 20–45 years underwent whole-brain magnetic resonance imaging. Plasma levels of monocyte chemoattractant protein-1 (MCP-1), chitinase-3-like protein 1 (also known as YKL-40), fractalkine (FKN), soluble tumour necrosis factor receptor-1 (sTNF-R1), interleukin-1β, and transforming growth factor-β1 were measured on the day of neuroimaging. Clinical data were obtained from medical records and from interviews with patients and reliable informants.
Results:
We recruited 31 patients with a mean age of 29.5 years. In multivariate regression analysis, the plasma level of YKL-40, a chemokine, was the inflammatory marker most consistently showing a significant negative association with the volumes of various brain subareas across the frontal, temporal, and parietal lobes. Higher YKL-40 and sTNF-R1 levels were both significantly associated with lower volumes of the left anterior cingulum, left frontal lobe, right superior temporal gyrus, and supramarginal gyrus. A greater number of total lifetime mood episodes was also associated with smaller volumes of the right caudate nucleus and bilateral frontal lobes.
Conclusions:
The volumes of brain regions known to be relevant to BD-I may be diminished in relation to higher plasma levels of YKL-40 and sTNF-R1 and to more lifetime mood episodes. Macrophages and macrophage-like cells may be involved in brain volume reduction among patients with BD-I.
Exploring and developing effective treatments is crucial for patients with Alzheimer’s dementia (AD). Pathologically, the amyloid deposits of AD disrupt the balance between long-term potentiation (LTP) and long-term depression (LTD) of neuronal cells and synaptic plasticity. Transcranial direct current stimulation (tDCS) has been proposed to affect long-term synaptic plasticity through LTP and LTD, thereby improving cognitive ability. Although an increasing number of studies have concluded that tDCS has a positive therapeutic effect on cognition in AD, studies to date have rarely explored the duration of its efficacy. In this pilot study, we investigated the effects of tDCS in AD and examined whether its beneficial effects persist over a 3-month follow-up period after the end of stimulation.
Method:
Thirty-four participants with AD aged 55-90 years (mean age 75.9, range 66-86) were included in a double-blind, randomized, sham-controlled crossover study. All participants were randomly assigned to receive 10 consecutive daily sessions of active tDCS (or sham) and switched groups 3 months later. The anodal electrode was placed over the left dorsolateral prefrontal cortex and the cathodal electrode over the right supraorbital area. In each active session, a current of 2 mA was applied through 25 cm2 electrodes for 30 min. All subjects received a series of neuropsychological assessments, including the CDR, MMSE, CASI and WCST, at baseline and at 2, 4, and 12 weeks after the 10-session course of tDCS (or sham). Chi-square tests, Wilcoxon signed rank tests and Mann-Whitney U tests were used to assess differences in participant demographic characteristics and to compare test scores between groups.
Results:
The active tDCS group showed significant improvements in CASI total scores from baseline to 2 weeks, 1 month and 3 months after active stimulation, although the improvement declined over time. There were also differences in total correct items, conceptual-level responses and failure to maintain set on the WCST between the active tDCS and sham groups. There was no difference in MMSE, CASI or WCST scores in the sham group.
Conclusion:
These results suggest a long-term beneficial effect of tDCS in AD.
Dementia with Lewy bodies (DLB), the second most common form of degenerative dementia, causes more functional disability, more potentially fatal complications and more impaired quality of life than Alzheimer’s dementia. No FDA-approved medication can slow, stop or reverse the progression of cognitive decline in DLB, so identifying effective treatments is a critical issue. Neuropathologically, extracellular α-synuclein oligomers interfere with the expression of long-term potentiation (LTP) and influence memory and learning. Transcranial direct current stimulation (tDCS) has been proposed to affect long-term synaptic plasticity through LTP and LTD, thereby improving cognitive ability. To date, only two studies have assessed the effect of tDCS in DLB. In this pilot study, we investigated the effects of tDCS in DLB.
Method:
Using a double-blind, randomized, sham-controlled crossover trial design, 11 patients with DLB aged 55-90 years (mean age 77.8) were included in the study. DLB was diagnosed according to DSM-5 criteria, and the CDR ratings of participants ranged from 0.5 to 2. The active tDCS (or sham) treatment consisted of 10 consecutive daily sessions. The anodal electrode was placed over the left dorsolateral prefrontal cortex (DLPFC) and the cathodal electrode over the right supraorbital area. In each session, a current of 2 mA was applied through 25 cm2 electrodes for 30 min in the active group. All subjects received a series of neuropsychological tests, including the CDR, MMSE, CASI, NPI and WCST, before and after the treatment sessions. Chi-square tests, Wilcoxon signed rank tests and Mann-Whitney U tests were used to assess differences in participant demographic characteristics and to compare the groups.
Results:
On the CASI, MMSE, NPI and WCST, there were no statistically significant differences between pre- and post-treatment scores for the 10-session course in either the active or the sham group. No side effects were reported during or immediately after active tDCS stimulation.
Conclusion:
These results suggest that anodal tDCS over the left DLPFC with a right supraorbital cathode, as applied here, does not improve cognition or behavioural and psychological symptoms in DLB. Larger-scale trials are needed to confirm the effect of tDCS in DLB.
Noncompressible torso hemorrhage (NCTH) is a major challenge in prehospital bleeding control and is associated with high mortality. This study was performed to assess knowledge of NCTH and perceived barriers to information acquisition among health-care workers (HCWs) in China.
Methods:
A self-administered and validated questionnaire was distributed among 11 WeChat groups consisting of HCWs engaged in trauma, emergency, and disaster rescue.
Results:
A total of 575 HCWs participated in this study. In the knowledge section, the majority (87.1%) denied that successful hemostasis could be obtained by external compression. Regarding attitudes, the vast majority of HCWs exhibited positive attitudes toward the important role of NCTH management in reducing prehospital preventable death (90.4%) and enthusiasm for continuous learning (99.7%). Regarding practice, fewer than half of the HCWs (45.7%) had heard of NCTH beforehand, only a minority (14.3%) confirmed that they had attended relevant continuing education, and 16.3% of HCWs had no access to updated medical information. The most predominant barrier to information acquisition was the lack of continuing training (79.8%).
Conclusions:
Knowledge and practice deficiencies do exist among HCWs. Obstacles to updating medical information warrant further attention, and education programs also need to be redesigned.
A 1178 J near-diffraction-limited 527 nm laser is realized in a fully closed-loop adaptive optics (AO)-controlled, off-axis, multi-pass amplification laser system. Generated from a fiber laser and amplified by the pre-amplifier and the main amplifier, a 1053 nm beam with an energy of 1900 J is obtained and converted into a 527 nm beam by a KDP crystal with 62% conversion efficiency, yielding 1178 J with a beam quality of 7.93 times the diffraction limit (DL). Using the fully closed-loop AO configuration, the static and dynamic wavefront distortions of the laser system are measured and compensated. After correction, the diameter of the circle enclosing 80% of the energy improves markedly from 7.93 DL to 1.29 DL, the focal spot is highly concentrated, and the 1178 J, 527 nm near-diffraction-limited laser is achieved.
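As a quick consistency check on the quoted figures (not taken from the paper itself), the second-harmonic output energy follows from the fundamental energy and the stated conversion efficiency:

\[
E_{527\,\mathrm{nm}} \approx \eta\, E_{1053\,\mathrm{nm}} = 0.62 \times 1900\,\mathrm{J} \approx 1178\,\mathrm{J}.
\]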
The aim of this study was to present the clinical characteristics and dynamic changes in laboratory parameters of the coronavirus disease 2019 (COVID-19) in Guangzhou, and explore the probable early warning indicators of disease progression.
Method:
We enrolled all the patients diagnosed with COVID-19 in the Guangzhou No. 8 People’s Hospital. The patients’ demographic and epidemiologic data were collected, including chief complaints, lab results, and imaging examination findings.
Results:
The characteristics of the patients in Guangzhou differed from those in Wuhan: the Guangzhou patients were younger, predominantly female, and less often had comorbid diseases. A total of 75% of patients had fever on admission, and cough occurred in 62%. Comparing the mild/normal and severe/critical patients, male sex, older age, comorbid hypertension, abnormal routine blood test results, and raised creatine kinase, glutamic oxaloacetic transaminase, lactate dehydrogenase, C-reactive protein, procalcitonin, D-dimer, fibrinogen and activated partial thromboplastin time, as well as positive proteinuria, were early warning indicators of severe disease.
Conclusion:
The patients outside the epidemic area showed different characteristics from those in Wuhan. The abnormal laboratory parameters changed markedly 4 weeks after admission and also differed between the mild and severe patients. More evidence is needed to confirm highly specific and sensitive early warning indicators of severe disease.
The relationship between exposure to famine in early life and the risk of ascending aorta dilatation (AAD) in adulthood is still unclear; therefore, we aimed to examine the association in the Chinese population. We investigated the data of 2598 adults who were born between 1952 and 1964 in Guangdong, China. All enrolled subjects were categorised into five groups: not exposed to famine, exposed during the fetal period, and exposed during early, mid or late childhood. AAD was assessed by cardiac ultrasound. Multivariate logistic regression and interaction tests were performed to estimate the OR and CI for the association between famine exposure and AAD. In total, 2598 participants (943 male; mean age 58·3 ± 3·68 years) were enrolled, of whom 270 (10·4 %) had AAD. We found that famine exposure (OR = 2·266, 95 % CI 1·477, 3·477, P = 0·013) was associated with an elevated risk of AAD after adjusting for multiple confounders. In addition, compared with the non-exposed group, the adjusted ORs for famine exposure during the fetal period and during early, mid or late childhood were 1·374 (95 % CI 0·794, 2·364, P = 0·251), 1·976 (95 % CI 1·243, 3·181, P = 0·004), 1·929 (95 % CI 1·237, 3·058, P = 0·004) and 2·227 (95 % CI 1·433, 3·524, P < 0·001), respectively. Subgroup analysis showed that the association between famine exposure and AAD was more pronounced in females, current smokers, people with BMI ≥ 24 kg/m2 and hypertensive patients. We observed that exposure to famine during early life was linked to AAD in adulthood.
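A minimal sketch of the multivariate logistic regression described above, assuming illustrative column names (aad, exposure_group, sex, bmi, smoking) rather than the study's actual variables and confounder list:

```python
# Hypothetical sketch: adjusted odds ratios for AAD by famine-exposure group.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("famine_aad.csv")  # assumed columns: aad (0/1), exposure_group, sex, bmi, smoking

fit = smf.logit(
    "aad ~ C(exposure_group, Treatment(reference='not_exposed')) + sex + bmi + smoking",
    data=df,
).fit()

# Exponentiate coefficients and confidence limits to obtain adjusted ORs with 95% CIs.
ors = np.exp(fit.conf_int())
ors["OR"] = np.exp(fit.params)
print(ors)
```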
We investigated the effects of botulinum toxin on gait in Parkinson’s disease (PD) patients with foot dystonia. Six patients underwent onabotulinumtoxinA injection and were assessed with the Burke–Fahn–Marsden Dystonia Rating Scale (BFMDRS), a visual analog scale (VAS) for pain, the Timed Up and Go (TUG) test, the Berg Balance Test (BBT), and 3D gait analysis at baseline, 1 month, and 3 months. BFMDRS (p = 0.002), VAS (p = 0.024), TUG (p = 0.028), and BBT (p = 0.034) scores improved. Foot pressures at Toe 1 (p = 0.028) and the midfoot (p = 0.018) were reduced, indicating that botulinum toxin alleviated dystonia severity and pain and improved foot pressure distribution during walking in PD.
Up to now, the Chinese government has only made very general comments on the application of international humanitarian law to cyberspace. There are indeed Chinese academic papers concerning this issue, but the discussion of the principle of distinction is limited both in length and in academic depth. Compared with the West, research by Chinese scholars on this topic is still in a relatively preliminary stage. At present, there is no specific deconstruction or clarification of the application of the principle of distinction in cyberspace in Chinese academia. As the first paper written by Chinese scholars specifically devoted to this question, this piece provides a different perspective by injecting the positions of Chinese officials and the views of Chinese scholars. The authors aim to clarify whether the existing rules are still completely applicable in the cyber context, and if needed, to find out what kind of improvements and clarifications can be made. Weighing in on these debates, we argue that despite the potential technical challenges and uncertainties, the principle of distinction should be applied to cyberspace. It should also be carefully re-examined and clarified from the standpoint of preventing over-militarization and maximizing the protection of the interests of civilians. For human targets, the elements of combatant status identified in customary international law and relevant treaties are not well suited to the digital battlefield. Nevertheless, cyber combatants are still obligated to distinguish themselves from civilians. In applying the principle of distinction, we argue that it makes more sense to focus on substantive elements over formal elements such as carrying arms openly or having a fixed distinctive sign recognizable at a distance. In interpreting “direct participation in hostilities”, the threshold of harm requires an objective likelihood instead of mere subjective intention; the belligerent nexus should be confirmed, and the causal link should be proximate. Applying the “cyber kill chain” model by analogy helps us to grasp the whole process of direct participation in hostilities during cyber warfare. For non-human targets, all military objectives must cumulatively fulfil both the “effective contribution” and “definite military advantage” criteria, which are equally indispensable. The same requirements apply to dual-use objects. Furthermore, certain data should fall within the ambit of civilian objects.
Inflammation might play a role in bipolar disorder (BD), but the relationship between inflammation and brain structural and functional abnormalities in patients with BD remains unclear. In this study, we focused on alterations of functional connectivity (FC), peripheral pro-inflammatory cytokines and their correlations to investigate the role of inflammation in FC in BD depression.
Methods
In this study, 42 unmedicated patients with BD II depression and 62 healthy controls (HCs) were enrolled. Resting-state functional magnetic resonance imaging was performed in all participants, and independent component analysis was used. Serum levels of interleukin-6 (IL-6) and interleukin-8 (IL-8) were measured in all participants. Correlations between FC values and IL-6 and IL-8 levels were calculated in the BD group.
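A minimal sketch of the FC-cytokine correlation analysis, assuming hypothetical column names (fc_precentral, fc_ofc, il6, il8) rather than the study's data:

```python
# Hypothetical sketch: Pearson correlations between FC values and cytokine levels.
import pandas as pd
from scipy.stats import pearsonr

bd = pd.read_csv("bd2_fc_cytokines.csv")  # assumed columns: fc_precentral, fc_ofc, il6, il8

for fc_col in ("fc_precentral", "fc_ofc"):
    for cyto in ("il6", "il8"):
        r, p = pearsonr(bd[fc_col], bd[cyto])
        print(f"{fc_col} vs {cyto}: r = {r:.3f}, p = {p:.3f}")
```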
Results
Compared with the HCs, patients with BD II showed decreased FC in the left orbitofrontal cortex (OFC), implicating the limbic network, and in the right precentral gyrus, implicating the somatomotor network. Patients with BD II also showed increased IL-6 (p = 0.039) and IL-8 (p = 0.002) levels. Moreover, abnormal FC in the right precentral gyrus was inversely correlated with IL-8 levels (r = −0.458, p = 0.004) in BD II. No significant correlation was found between FC in the left OFC and cytokine levels.
Conclusions
Our findings that serum IL-8 levels are associated with impaired FC in the right precentral gyrus in BD II patients suggest that inflammation might play a crucial role in brain functional abnormalities in BD.