Trematodes of the genus Ogmocotyle are intestinal flukes that can infect a variety of definitive hosts, resulting in significant economic losses worldwide. However, molecular data for these trematodes remain scarce. In this study, the mitochondrial (mt) genome of Ogmocotyle ailuri isolated from the red panda (Ailurus fulgens) was determined and compared with those of other Pronocephalata to investigate mt genome content, genetic distance, gene rearrangements and phylogeny. The complete mt genome of O. ailuri is a typical closed circular molecule of 14 642 base pairs, comprising 12 protein-coding genes (PCGs), 22 transfer RNA genes, 2 ribosomal RNA genes and 2 non-coding regions. All genes are transcribed in the same direction. In addition, 23 intergenic spacers and 2 locations with gene overlaps were identified. Sequence identities and sliding window analysis indicated that cox1 is the most conserved of the 12 PCGs in the O. ailuri mt genome. The sequenced mt genomes of 48 Plagiorchiida trematodes showed 5 types of gene arrangement based on all mt genome genes, with the gene arrangement of O. ailuri being type I. Phylogenetic analysis using concatenated amino acid sequences of the 12 PCGs revealed that O. ailuri is more closely related to Ogmocotyle sikae than to Notocotylus intestinalis. These data enhance the Ogmocotyle mt genome database and provide molecular resources for further studies of Pronocephalata taxonomy, population genetics and systematics.
Slowed information processing speed (IPS) is the core contributor to cognitive impairment in patients with late-life depression (LLD). The hippocampus is an important link between depression and dementia, and it may be involved in IPS slowing in LLD. However, the relationship between a slowed IPS and the dynamic activity and connectivity of hippocampal subregions in patients with LLD remains unclear.
One hundred thirty-four patients with LLD and 89 healthy controls were recruited. Sliding-window analysis was used to assess whole-brain dynamic functional connectivity (dFC), dynamic fractional amplitude of low-frequency fluctuations (dfALFF) and dynamic regional homogeneity (dReHo) for each hippocampal subregion seed.
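Sliding-window functional connectivity of the kind described above can be sketched as a windowed Pearson correlation between a seed time series and a target time series. The window length, step size, and simulated data below are illustrative assumptions, not the study's actual acquisition or analysis parameters:

```python
import numpy as np

def sliding_window_fc(seed, target, win_len=30, step=1):
    """Windowed Pearson correlation between two BOLD-like time series.

    win_len and step are in TRs; both are illustrative choices,
    not the parameters used in the study."""
    corrs = []
    for start in range(0, len(seed) - win_len + 1, step):
        window_seed = seed[start:start + win_len]
        window_target = target[start:start + win_len]
        corrs.append(np.corrcoef(window_seed, window_target)[0, 1])
    return np.array(corrs)

# Simulated seed/target series; dFC variability is often summarised
# as the standard deviation of the windowed correlations.
rng = np.random.default_rng(0)
seed = rng.standard_normal(200)
target = 0.5 * seed + rng.standard_normal(200)
fc = sliding_window_fc(seed, target)
dfc_variability = fc.std()
```

In practice the same windowing is applied voxel- or region-wise for dfALFF and dReHo; this sketch shows only the seed-to-target connectivity case.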
Cognitive impairment (global cognition, verbal memory, language, visual–spatial skill, executive function and working memory) in patients with LLD was mediated by their slowed IPS. Compared with the controls, patients with LLD exhibited decreased dFC between various hippocampal subregions and the frontal cortex and decreased dReHo in the left rostral hippocampus. Additionally, most of the dFCs were negatively associated with the severity of depressive symptoms and positively associated with various domains of cognitive function. Moreover, the dFC between the left rostral hippocampus and the middle frontal gyrus partially mediated the relationship between depressive symptom scores and IPS.
Patients with LLD exhibited decreased dFC between the hippocampus and frontal cortex, and the decreased dFC between the left rostral hippocampus and right middle frontal gyrus was involved in the underlying neural substrate of the slowed IPS.
Deficits in visuospatial attention, known as neglect, are common following brain injury, but underdiagnosed and poorly treated, resulting in long-term cognitive disability. In clinical settings, neglect is often assessed using simple pen-and-paper tests. While convenient, these cannot characterise the full spectrum of neglect. This protocol reports a research programme that compares traditional neglect assessments with a novel virtual reality attention assessment platform: The Attention Atlas (AA).
The AA was codesigned by researchers and clinicians to meet the clinical need for improved neglect assessment. The AA uses a visual search paradigm to map the attended space in three dimensions and seeks to identify the optimal parameters that best distinguish neglect from non-neglect, and the spectrum of neglect, by providing near-time feedback to clinicians on system-level behavioural performance. A series of experiments will address procedural, scientific, patient, and clinical feasibility domains.
Analyses focus on descriptive measures of reaction time, accuracy data for target localisation, and histogram-based raycast attentional mapping analysis, which measures the individual’s orientation in space and inter- and intra-individual variation of visuospatial attention. We will compare neglect and control data using parametric between-subjects analyses. We present example individual-level results produced in near-time during visual search.
The development and validation of the AA is part of a new generation of translational neuroscience that exploits the latest advances in technology and brain science, including technology repurposed from the consumer gaming market. This approach to rehabilitation has the potential for highly accurate, highly engaging, personalised care.
Background: Catheter-associated urinary tract infection (CAUTI) is considered a preventable healthcare-associated infection. Many local and national interventions using multimodal prevention measures have targeted CAUTI incidence as the primary outcome. Other undesirable events related to urinary catheters and infections, such as overuse of urine culturing and antimicrobial prescribing for asymptomatic bacteriuria, are not captured by CAUTI surveillance and may not be the targets of such interventions. The aim of this study was to assess the impact of expanded national surveillance targeting various aspects of urinary tract infections, culturing and treatment practices, and catheter use in internal medicine wards. Methods: The Israeli National Center for Infection Control (NCIC) issued CAUTI prevention guidelines and, in 2016, initiated a urinary tract event surveillance system that targets the incidence of CAUTI, the urinary catheter utilization ratio, and the proportion of urine cultures sent and patients treated in the absence of symptoms. The surveillance is conducted for 1 month 3 times per year. Hospitals are required to report all positive urine cultures (>100,000 CFU) collected in internal medicine wards, along with the following data: admission date, symptoms of infection, dates of urinary catheter use, and antibiotic treatment. These data enable the NCIC to validate hospital classifications of each event. In addition, during each surveillance month, hospitals conduct point-prevalence surveys of compliance with CAUTI prevention measures. An electronic data collection form with built-in algorithms supports the local teams during the surveillance process. Results: Between 2016 and 2019, a total of 3,028 positive urine cultures not present on admission were reported by internal medicine wards in 30 hospitals.
A significant decrease was observed in the incidence of CAUTI (from 4.7 to 2.9; P < .001) and in the proportion of asymptomatic bacteriuria (ASB) treated with antibiotics (from 31% to 20%; P = .02) (Table 1). The catheter utilization ratio decreased from 0.25 to 0.23 (P < .001). The rate of cultures sent from asymptomatic patients decreased from 1.5 to 1.1 (P < .01). Point-prevalence surveys in internal medicine wards detected a significant increase in the use of closed urinary drainage systems (from 79% to 97% in 2018; P < .001) and in documentation of a daily nurse assessment of the need for a catheter (from 74% to 81%; P < .001). Conclusions: National surveillance of undesirable urinary tract events resulted in a significant reduction in CAUTI, antibiotic treatment for ASB, and the rate of cultures sent from asymptomatic patients. A small decrease was observed in the catheter utilization ratio. CAUTI surveillance programs should include other undesirable urinary tract events.
OBJECTIVES/GOALS: Our goal is to develop a silk fibroin scaffold-based neural tissue construct and characterize it in a rat model of cortical injury. We aim to optimize the construct for transplantation, test pharmacologic interventions that may enhance its survival, and evaluate its integration with the host brain. METHODS/STUDY POPULATION: To optimize cell density and health, silk fibroin scaffolds varying in porosity and stiffness were seeded with E18 GFP+ rat cortical neurons and imaged at DIV 5. Different seeding methods and loads were similarly tested. Constructs, loaded with an inhibitor of apoptosis (ROCK inhibitor Y-27632) or necroptosis (necrostatin-1) in a fibrin hydrogel, were transplanted into aspiration lesions created in the primary motor cortex of Sprague-Dawley rats, and graft survival was compared with a negative control at 2 weeks. Lastly, constructs were transplanted and evaluated via immunohistochemistry at 1-, 2-, and 4-month time points for survival, differentiation, inflammation, and anatomic integration. RESULTS/ANTICIPATED RESULTS: Scaffolds with smaller pore sizes retained more cells after seeding. Softer scaffolds, which enhance hemostasis at transplantation, did not compromise cell health on live/dead assay. We anticipate that seeding concentrated cell suspensions onto multiple surfaces of the construct will produce the most evenly seeded and cell-dense constructs. Based on a prior pilot study, we anticipate that necrostatin-1 will significantly improve intermediate-term construct survival. We have observed up to 15% cell survival at 1 month with retained neuronal identity and abundant axonal projections into the brain despite evidence of persistent inflammation; we anticipate similar outcomes at later time points. DISCUSSION/SIGNIFICANCE OF IMPACT: Our construct, due to its exceptional longevity in vitro, manipulability, and modularity, is an attractive platform for neural tissue engineering.
In the present work, we optimize and validate this technology for transplantation with the goal of addressing the morbidity burden of cortical injury.
Cognitive impairment in late-life depression is common and associated with a higher risk of all-cause dementia. Late-life depression patients with comorbid cardiovascular diseases (CVDs) or related risk factors may experience higher risks of cognitive deterioration in the short term. We aim to investigate the effect of CVDs and their related risk factors on the cognitive function of patients with late-life depression.
A total of 148 participants were recruited (67 individuals with late-life depression and 81 normal controls). The presence of hypertension, coronary heart disease, diabetes mellitus, or hyperlipidemia was defined as the presence of comorbid CVDs or related risk factors. Global cognitive functions were assessed at baseline and after a one-year follow-up by the Mini-Mental State Examination (MMSE). Global cognitive deterioration was defined by the reliable change index (RCI) of the MMSE.
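The reliable change index used to define global cognitive deterioration is conventionally computed in the Jacobson–Truax form, as the score change divided by the standard error of the difference. A minimal sketch, assuming illustrative values for the MMSE baseline standard deviation and test–retest reliability (the study would estimate these from its own control sample):

```python
import math

def reliable_change_index(pre, post, sd_baseline, reliability):
    """Jacobson-Truax RCI: (post - pre) / standard error of the
    difference. |RCI| > 1.96 is conventionally taken as reliable
    change at the 95% level. sd_baseline and reliability are
    illustrative inputs, not values from the study."""
    se_measurement = sd_baseline * math.sqrt(1 - reliability)
    s_diff = math.sqrt(2 * se_measurement ** 2)
    return (post - pre) / s_diff

# Hypothetical example: a 4-point MMSE drop over one year.
rci = reliable_change_index(pre=28, post=24, sd_baseline=2.5, reliability=0.9)
```

With these assumed parameters the 4-point drop yields an RCI below −1.96, i.e. it would be flagged as reliable deterioration.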
Late-life depression patients with CVDs or related risk factors had a 6.8-fold higher risk of global cognitive deterioration at one-year follow-up than those without any of these comorbidities. This result remained robust after adjusting for age, gender, and changes in Hamilton Depression Rating Scale (HAMD) scores.
This study suggests that late-life depression patients with comorbid CVDs or their related risk factors have a higher risk of cognitive deterioration in the short term (one-year follow-up). Given that CVDs and their related risk factors are currently modifiable, active treatment of these comorbidities may delay rapid cognitive deterioration in patients with late-life depression.
A liver transplant recipient developed hospital-acquired symptomatic hepatitis C virus (HCV) genotype 6a infection 14 months post transplant.
Standard outbreak investigation.
Patient chart review, interviews of patients and staff, observational study of patient care practices, environmental surveillance, blood collection simulation experiments, and phylogenetic study of HCV strains using partial envelope gene sequences (E1–E2) of HCV genotype 6a strains from the suspected source patient, the environment, and the index patient were performed.
Investigations and data review revealed no further cases of HCV genotype 6a infection in the transplant unit. However, a suspected source with a high HCV load was identified. HCV genotype 6a was found in a contaminated reusable blood-collection tube holder with barely visible blood and was identified as the only shared item posing risk of transmission to the index case patient. Also, 14 episodes of sequential blood collection from the source patient and the index case patient were noted on the computerized time log of the laboratory barcoding system during their 13 days of cohospitalization in the liver transplant ward. Disinfection of the tube holders was not performed after use between patients. Blood collection simulation experiments showed that HCV and technetium isotope contaminating the tip of the sleeve capping the sleeved-needle can reflux back from the vacuum-specimen tube side to the patient side.
A reusable blood-collection tube holder that is not disinfected between patients can cause nosocomial HCV infection. Single-use disposable tube holders should be used, in accordance with the recommendations of the Occupational Safety and Health Administration and the World Health Organization.
OBJECTIVES/SPECIFIC AIMS: Patients with locally advanced pancreatic cancer typically have poor outcomes, with a median survival of ~16 months. Novel methods to improve local control are needed. Nab-paclitaxel (abraxane) has shown efficacy in pancreatic cancer and is FDA approved for metastatic disease in combination with gemcitabine. Nab-paclitaxel is also a promising radiosensitizer based on laboratory studies, but it has never been clinically tested with definitive radiotherapy for locally advanced disease. METHODS/STUDY POPULATION: We performed a phase 1 study using a 3+3 dose-escalation strategy to determine the safety and tolerability of dose escalated nab-paclitaxel with fractionated radiotherapy for patients with unresectable or borderline resectable pancreatic cancer. Following induction chemotherapy with 2 cycles of nab-paclitaxel and gemcitabine, patients were treated with weekly nab-paclitaxel and daily radiotherapy to a dose of 52.5 Gy in 25 fractions. Final dose-limiting toxicity (DLT) determination was performed at day 65 after the start of radiotherapy. RESULTS/ANTICIPATED RESULTS: Nine patients received nab-paclitaxel at a dose level of either 100 mg/m2 (n=3) or 125 mg/m2 (n=6). One DLT (grade 3 neuropathy) was observed in a patient who received 125 mg/m2 of nab-paclitaxel. Other grade 3 toxicities included fatigue (11%), anemia (11%), and neutropenia (11%). No grade 4 toxicities were observed. With a median follow-up of 8 months (range 5–28 months), median survival was 19 months and median progression-free survival was 10 months. Following chemoradiation, 3 patients underwent surgical resection, all with negative margins and limited tumor viability. Of the 3 patients, 2 initially had borderline resectable tumors and 1 had an unresectable tumor. Tumor (SMAD-4, Caveolin-1) and peripheral (circulating tumor cells and microvesicles) biomarkers were collected and are being analyzed. 
DISCUSSION/SIGNIFICANCE OF IMPACT: The combination of fractionated radiation and weekly nab-paclitaxel was safe and well tolerated. This regimen represents a potentially promising therapy for patients with unresectable and borderline resectable pancreatic cancer and warrants further investigation.
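The 3+3 dose-escalation strategy named in the methods follows a standard cohort-based decision rule. A simplified sketch of that logic (real protocols additionally specify de-escalation rules and how the maximum tolerated dose is declared):

```python
def three_plus_three(dlt_counts):
    """Decision rule for the classic 3+3 dose-escalation design.

    dlt_counts: DLT counts per cohort of 3 patients at the current
    dose level (one or two cohorts). Returns the next action.
    A simplified illustration, not this trial's full protocol."""
    first = dlt_counts[0]
    if len(dlt_counts) == 1:
        if first == 0:
            return "escalate"          # 0/3 DLTs: go to next dose level
        if first == 1:
            return "expand to 6"       # 1/3 DLTs: enroll 3 more patients
        return "stop: MTD exceeded"    # >=2/3 DLTs
    total = sum(dlt_counts)            # out of 6 patients
    return "escalate" if total <= 1 else "stop: MTD exceeded"

three_plus_three([0])      # -> "escalate"
three_plus_three([1])      # -> "expand to 6"
three_plus_three([1, 0])   # -> "escalate"
```

In the study above, the single DLT at 125 mg/m² in a cohort of 6 corresponds to the "escalate"/tolerable branch of this rule.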
Babesiosis is an emerging zoonotic disease caused by intraerythrocytic protozoa and transmitted by ticks. The first well-documented case of human Babesia infection was reported in 1957 in a splenectomized resident of Yugoslavia, who died after an acute illness marked by anemia, fever, hemoglobinuria, and renal failure. Intraerythrocytic parasites were noted and tentatively identified as Babesia bovis. Since then, other Babesia species have been found to cause disease in humans: Babesia microti, Babesia duncani, Babesia duncani-type, and Babesia divergens-like in North America; B. divergens, B. microti, and Babesia venatorum in Europe; and B. microti-like and KO-1 in Asia. The clustering of cases of human B. microti infection in the United States contrasts with the sporadic occurrence of the disease in Europe, Africa, and Asia. Rarely, babesiosis may be transmitted through blood transfusion or transplacentally.
More than 90 species in the genus Babesia infect a wide variety of wild and domestic animals. Humans are an uncommon and terminal host for Babesia species, which depend on other species for their development and transmission. The most common cause of human babesiosis is B. microti, a babesia of rodents. The primary reservoir for B. microti in eastern North America is the white-footed mouse (Peromyscus leucopus). As many as two-thirds of P. leucopus have been found to be parasitemic in endemic areas. Babesia species are transmitted by hard-bodied (ixodid) ticks. The primary vector in eastern North America is Ixodes scapularis (also known as Ixodes dammini), which is the same tick that transmits Borrelia burgdorferi, the etiologic agent of Lyme disease, and Anaplasma phagocytophilum, the agent of human granulocytic anaplasmosis. Thus, simultaneous human infection with two or more of these pathogens may occur.
Objective: To examine clinical response and symptomatic remission in two studies of lisdexamfetamine dimesylate (LDX) in children with attention-deficit/hyperactivity disorder (ADHD).
Methods: In a 4-week, placebo-controlled, double-blind trial, children 6–12 years of age with ADHD received LDX (30–70 mg/day) or placebo. In an open-label trial, children from previous studies were titrated to optimal dose over 4 weeks and maintained up to 1 year. Primary and secondary efficacy assessments were the ADHD Rating Scale IV (ADHD-RS-IV) and Clinical Global Impressions-Improvement (CGI-I) scale, respectively. Clinical response was defined as ≥30% reduction in ADHD-RS-IV total score with a CGI-I rating of 1 or 2; symptomatic remission was defined by ADHD-RS-IV total score ≤18.
Results: In the 4-week study (N=285), at any postdose assessment, 79.3% achieved response (median 13 days) and 67.1% achieved remission (median 22 days) with LDX versus 29.2% and 23.6% with placebo. In the long-term study (N=251), at any postdose assessment, 96.0% responded and 62.7% maintained response; 88.8% achieved remission and 46.4% maintained remission.
Conclusion: Most children treated with LDX achieved clinical response and symptomatic remission at one time point; once achieved, almost half maintained remission.
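The trial's operational definitions of response and remission are simple threshold rules on the two scales. A hypothetical sketch (function name and example scores are illustrative, not trial data):

```python
def classify(baseline_total, post_total, cgi_i):
    """Apply the trial's operational definitions:
    clinical response = >=30% reduction in ADHD-RS-IV total score
    plus a CGI-I rating of 1 or 2; symptomatic remission =
    ADHD-RS-IV total score <= 18."""
    reduction = (baseline_total - post_total) / baseline_total
    response = reduction >= 0.30 and cgi_i in (1, 2)
    remission = post_total <= 18
    return response, remission

classify(baseline_total=40, post_total=16, cgi_i=2)  # -> (True, True)
```

Note that response depends on both scales while remission depends only on the absolute ADHD-RS-IV score, so a patient can remit without formally responding (e.g. a mild baseline) and vice versa.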
The present study aimed to compare the effects of a general dietary intervention and an intervention with low glycaemic load (GL) on glycaemic control, blood lipid metabolism and pregnancy outcomes in women with gestational diabetes mellitus.
Participants were randomly assigned to two groups, receiving either an individualized general dietary intervention (Control group) or an intensive low-GL intervention (Low-GL group) every two weeks, from 24–26 weeks of gestation to delivery.
The Center of Maternal Primary Care in Guangdong General Hospital, China.
Ninety-five women with gestational diabetes mellitus were enrolled from June 2008 to July 2009.
After the intervention, both groups significantly decreased their dietary intakes of energy, fat and carbohydrate. The Low-GL group had significantly lower values for GL (122 v. 136) and glycaemic index (50 v. 54) but greater dietary fibre intake (33 v. 29 g/d) than did the Control group (all P<0·01). Significantly greater decreases in fasting plasma glucose (−0·33 v. −0·02 mmol/l, P<0·01) and 2 h postprandial glucose (−2·98 v. −2·51 mmol/l, P<0·01), significantly lower increases in total cholesterol (0·12 v. 0·23 mmol/l) and TAG (0·41 v. 0·56 mmol/l) and a significantly lower decrease in HDL cholesterol (−0·01 v. −0·11 mmol/l) were also observed in the Low-GL group compared with the Control group (all P<0·05). There were no significant differences in body weight gain, birth weight or other maternal–fetal perinatal outcomes between the two groups.
The low-GL targeted dietary intervention outperformed the general dietary intervention in glycaemic control and the improvement of blood lipid levels in women with gestational diabetes mellitus.
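Glycaemic load itself is a simple derived quantity: GI × available carbohydrate (g) / 100, summed over food items. A minimal sketch with illustrative GI values (the foods and GIs below are assumptions, not taken from the study):

```python
def glycaemic_load(items):
    """Glycaemic load of a meal: sum of GI * available carbohydrate (g)
    / 100 over (gi, carbs_g) food items. Daily GL figures like the
    study's 122 vs. 136 would be computed the same way over a full
    day's intake."""
    return sum(gi * carbs_g / 100 for gi, carbs_g in items)

# Hypothetical meal: 50 g carbs from white rice (GI ~73)
# and 30 g carbs from lentils (GI ~32).
meal_gl = glycaemic_load([(73, 50), (32, 30)])  # 36.5 + 9.6 = 46.1
```

Swapping high-GI for low-GI carbohydrate sources at constant carbohydrate intake lowers GL, which is the mechanism behind the intervention arm.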
In this paper, a novel set of macros with line/space widths from 128nm/128nm and 64nm/64nm to 32nm/32nm was designed and implemented on 20nm technology-node hardware. Pitch-dependent pad erosion after Cu CMP was studied quantitatively on these macros by atomic-force microscopy (AFM), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Two methods were investigated to reduce the difference between pitch- and density-induced CMP non-uniformity. The first uses a new scheme of partial Cu plating followed by SiCNH insulator deposition and then CMP. The second works through the selection of slurries and pads. Both sets of results are discussed in this paper.
Innovative printing technology enables fine-feature deposition (below 10μm) of electronic materials onto low-temperature, non-planar substrates without masks. This could be a promising technology to meet the requirements of present and future microelectronic systems. Silver nanoparticle (NP) ink is widely used for printed electronics; however, its electrical conductivity is low compared to bulk materials. To improve the electrical conductivity of tracks printed with the aerosol printing technique, we developed a novel carbon nanotube (CNT)/silver NP ink by mechanical stirring and sonication. Sample inks with different concentrations of CNTs were printed with an Aerosol Jet® printing system. We found that the CNTs bridged defects in some printed silver lines, thereby lowering the electrical resistivity by 38%. However, no further improvement was observed at higher CNT concentrations in the silver NP ink samples. We hypothesize that CNT bridges connect the defects, decreasing the resistivity of printed silver lines, when the CNT concentration is below the percolation level; above a concentration threshold, the resistivity of printed silver lines stops decreasing and even increases because of the Schottky barrier effect.
The challenges associated with meeting 20nm technology requirements for better Cu CMP process uniformity and lower defectivity have been studied. Required improvements in uniformity were obtained through platen process optimization along with the evaluation and selection of specific Cu slurries and pads, and their performance is reported. The principal factors influencing defect formation, including Cu barrier metallurgy, interconnect pattern density and process queue times, were studied. Specific new post-CMP clean chemistries were evaluated to assess their capability to suppress defect formation, and their performance is reported. The trade-off between uniformity and defect suppression as a function of slurry, pad and post-Cu-CMP clean chemistry is described.
Polyimide (PI)-matrix composite films containing inorganic nanoparticles (nano-Al2O3 and nano-TiO2) have been fabricated. A proposed model is used to explain the different structures of the (Al2O3–TiO2)/PI (ATP) films synthesized by in situ polymerization. The dependence of the dielectric permittivity of the ATP films on frequency and temperature was studied. Results show that the breakdown strength of the films decreases as the corona aging time is prolonged. The incorporation of the nano-Al2O3 and nano-TiO2 particles significantly improves the corona resistance of the films. Corona aging also influences the infrared absorbance, the glass transition temperature (Tg), and the loss factor (tanδ) of the ATP films.
Simple sequence repeat (SSR) and random amplification of polymorphic DNA (RAPD) molecular markers were used to assess the genetic diversity of 80 isolates of Phytophthora infestans from potato (Solanum tuberosum) in Fujian, Heilongjiang, Hebei and Inner Mongolia Provinces in China. Polymorphism was identified by 13 SSR primers and 14 RAPD primers. A total of 76 bands were amplified by SSRs, with a percentage of polymorphic bands (PPB) of 78.9% and a similarity coefficient ranging between 0.00 and 0.42. A total of 189 bands were amplified by RAPDs, with a PPB of 95.2% and a similarity coefficient ranging between 0.04 and 0.66. Analysis of genetic diversity showed higher genetic variation in the Fujian population than in the populations of Heilongjiang, Hebei and Inner Mongolia. Nei's genetic identity analysis indicates that the genetic similarity between the Heilongjiang and Inner Mongolia populations is the highest and that between Fujian and Hebei is the lowest. A cluster analysis revealed that isolates from Fujian, in the south of China, are distantly related to those from Heilongjiang, Hebei and Inner Mongolia in the north, and the Fujian population is distributed among more groups than the other three, exhibiting higher genetic diversity.
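The percentage of polymorphic bands is a simple ratio of polymorphic to total amplified bands. The band counts below (about 60 of 76 SSR bands and 180 of 189 RAPD bands) are back-calculated from the reported percentages, not stated in the abstract:

```python
def percent_polymorphic(polymorphic_bands, total_bands):
    """PPB: share of amplified bands that are polymorphic, in percent."""
    return 100 * polymorphic_bands / total_bands

# Counts inferred from the reported figures, for illustration only.
ssr_ppb = round(percent_polymorphic(60, 76), 1)     # 78.9
rapd_ppb = round(percent_polymorphic(180, 189), 1)  # 95.2
```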
Reaction time (RT) variability is one of the strongest findings to emerge in cognitive-experimental research of attention deficit hyperactivity disorder (ADHD). We set out to confirm the association between ADHD and slow and variable RTs and investigate the degree to which RT performance improves under fast event rate and incentives. Using a group familial correlation approach, we tested the hypothesis that there are shared familial effects on RT performance and ADHD.
A total of 144 ADHD combined-type probands, 125 siblings of the ADHD probands and 60 control participants, ages 6–18, performed a four-choice RT task with baseline and fast-incentive conditions.
ADHD was associated with slow and variable RTs, and with greater improvement in speed and RT variability from baseline to fast-incentive condition. RT performance showed shared familial influences with ADHD. Under the assumption that the familial effects represent genetic influences, the proportion of the phenotypic correlation due to shared familial influences was estimated as 60–70%.
The data are inconsistent with models that consider RT variability as reflecting a stable cognitive deficit in ADHD, but instead emphasize the extent to which energetic or motivational factors can have a greater effect on RT performance in ADHD. The findings support the role of RT variability as an endophenotype mediating the link between genes and ADHD.