Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles. For the disease to develop, a second event that disrupts the expression of the other inherited Pkd1 allele must occur. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, mice with WD-specific deletion of one or both Pkd1 alleles were generated. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from controls in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of these data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
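To make the pathway result concrete: overrepresentation analysis typically asks whether more differentially expressed (DE) genes fall in a given pathway than chance alone would predict, often via a one-sided hypergeometric test. The sketch below, written in Python with SciPy, is a minimal illustration of that generic test; the function name and all counts are hypothetical and are not taken from this study.

```python
from scipy.stats import hypergeom

def pathway_overrepresentation_p(de_in_pathway, genome_size,
                                 pathway_size, de_total):
    """One-sided hypergeometric test: the probability of observing at
    least this many differentially expressed (DE) genes in a pathway
    by chance, given the pathway size and the total number of DE genes."""
    return hypergeom.sf(de_in_pathway - 1, genome_size,
                        pathway_size, de_total)

# Hypothetical counts: 18 of 400 DE genes land in a 150-gene pathway,
# out of 20,000 annotated genes (expected overlap by chance: 3).
print(f"p = {pathway_overrepresentation_p(18, 20_000, 150, 400):.2e}")
```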
In this study, the pull-in phenomenon of a nano-actuator is investigated using a nonlocal Bernoulli-Euler beam model with clamped-clamped boundary conditions. The model accounts for viscous damping, residual stresses, the van der Waals (vdW) force and electrostatic forces with nonlocal effects. The hybrid differential transformation/finite difference method (HDTFDM) is used to analyze the nonlocal effects on a graphene sheet nanobeam, which is electrostatically actuated under the influence of the coupling effect, the von Kármán nonlinear strains and the fringing field effect. The pull-in voltage calculated by the presented model deviates by no more than 0.29% from previous literature, verifying the validity of the HDTFDM. Furthermore, the nonlocal nonlinear behavior of the electrostatically actuated nanobeam is investigated, and the effects of viscous damping, residual stresses, and the length-gap ratio are examined in detail. Overall, the results reveal that small-scale effects significantly influence the characteristics of the graphene sheet nanobeam actuator.
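The essence of pull-in can be seen in the classical lumped parallel-plate model: beyond a critical voltage, the elastic restoring force can no longer balance the electrostatic attraction and the beam snaps to the electrode. The Python sketch below uses the textbook estimate V_PI = sqrt(8kg³/(27ε₀A)) with the point-load stiffness k = 192EI/L³ of a clamped-clamped beam; it deliberately omits the vdW, damping, nonlocal and fringing-field effects that the HDTFDM model captures, and every numerical value is an illustrative assumption rather than a parameter from this study.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def clamped_clamped_stiffness(E, w, t, L):
    """Effective stiffness of a clamped-clamped beam under a centered
    point load: k = 192*E*I/L^3, with rectangular I = w*t^3/12."""
    I = w * t**3 / 12.0
    return 192.0 * E * I / L**3

def pull_in_voltage(k, gap, area):
    """Classical lumped parallel-plate estimate:
    V_PI = sqrt(8*k*g^3 / (27*eps0*A))."""
    return math.sqrt(8.0 * k * gap**3 / (27.0 * EPS0 * area))

# Assumed geometry for illustration only (not the paper's parameters).
E = 1.0e12                    # Young's modulus, Pa
L, w, t = 10e-6, 1e-6, 10e-9  # length, width, thickness, m
gap = 100e-9                  # initial electrode gap, m

k = clamped_clamped_stiffness(E, w, t, L)
print(f"V_pull-in ~ {pull_in_voltage(k, gap, w * L):.2f} V")
```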
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m²) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there are a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we examined associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and to assess the effects of different exposures across time, geographical regions and socioeconomic status.
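As background on how twin cohorts yield heritability estimates, Falconer's classical formulas decompose trait variance from monozygotic (MZ) and dizygotic (DZ) co-twin correlations: h² = 2(r_MZ − r_DZ), c² = 2r_DZ − r_MZ, e² = 1 − r_MZ. The Python sketch below applies these formulas to simulated toy data; it is only a didactic shortcut, and the CODATwins analyses rely on full variance-component twin modelling rather than this two-line estimator.

```python
import numpy as np

def falconer_ace(r_mz, r_dz):
    """Falconer's decomposition into additive genetic (h2), shared
    environmental (c2) and unique environmental (e2) variance shares."""
    return 2.0 * (r_mz - r_dz), 2.0 * r_dz - r_mz, 1.0 - r_mz

# Simulate toy BMI-like data with true h2 = 0.9, e2 = 0.1.
rng = np.random.default_rng(0)
n, sd_g = 500, 3.0
g = rng.normal(0, sd_g, n)              # additive genetic value
mz1 = 25 + g + rng.normal(0, 1, n)      # MZ co-twins share all of g
mz2 = 25 + g + rng.normal(0, 1, n)
gs = np.sqrt(0.5) * g                   # DZ co-twins share half the
gu = sd_g * np.sqrt(0.5)                # genetic variance on average
dz1 = 25 + gs + rng.normal(0, gu, n) + rng.normal(0, 1, n)
dz2 = 25 + gs + rng.normal(0, gu, n) + rng.normal(0, 1, n)

r_mz = np.corrcoef(mz1, mz2)[0, 1]
r_dz = np.corrcoef(dz1, dz2)[0, 1]
h2, c2, e2 = falconer_ace(r_mz, r_dz)
print(f"r_MZ={r_mz:.2f} r_DZ={r_dz:.2f} -> h2={h2:.2f} c2={c2:.2f} e2={e2:.2f}")
```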
One of the main goals in biogeochemistry is to explore the global relationships between organisms and chemical elements in different ecosystems. A diversity of analytical techniques based on physicochemical or molecular biological procedures is available to explore different organisms and their abiotic and biotic interactions in a variety of ecosystems. Even though many of these modern analytical techniques are irreplaceable in today's research, most of them can only provide indirect results because they are built on a “black-box” approach, in which the biological species in an ecosystem or a geological environment are disrupted for extraction of nucleic acids, proteins, etc. Essential biological information, such as the morphology of specific species, their location, distribution, association with other organisms in their natural environment, and individual activities and functions, is therefore lost. Fluorescence in situ hybridization (FISH) helps retrieve this information without either cultivation or extraction of cell components, and can therefore provide a quick and useful complement to different “black-box”-based approaches. FISH is based on fluorescently labeled gene probes with a unique nucleotide composition designed to match specific genes in different cellular species. Thus, different biological species can be identified simultaneously in their natural environment with different gene probes labeled with different fluorochromes. The technique has undergone extensive development, with around 30 variations for different applications. FISH is evaluated either by microscopy (e.g., fluorescence microscopy, Raman microspectroscopy, NanoSIMS) or by non-microscope-based methods, such as flow cytometry, microarray technology, or molecular biological methods such as proteomics. This chapter will serve as a guide for sample preparation, selection of appropriate FISH protocols, evaluation and design of gene probes, and evaluation of FISH experiments.
Objective: To assess whether disparities in energy consumption and insufficient energy intake in India have changed over time across socio-economic status (SES).
Design: This cross-sectional, population-based survey study examines the relationship between several SES indicators (i.e. wealth, education, caste, occupation) and energy consumption in India at two time points almost 20 years apart. Household food intake in the last 30 d was assessed in 1993–94 and in 2011–12. Average dietary energy intake per person in the household (in kilocalories) and whether the household consumed less than 80 % of the recommended energy intake (i.e. insufficient energy intake) were calculated. Linear and relative risk regression models were used to estimate the relationship between SES and average energy consumed per day per person and the relative risk of consuming an insufficient amount of energy.
Setting: Rural and urban areas across India.
Participants: A nationally representative sample of households.
Results: Among rural households, there was a positive association between SES and energy intake across all four SES indicators during both survey years. Similar results were seen for energy insufficiency vis-à-vis recommended energy intake levels. Among urban households, wealth was associated with energy intake and insufficiency at both time points, but there was no educational patterning of energy insufficiency in 2011–12.
Conclusions: Results suggest little overall change in the SES patterning of energy consumption and percentage of households with insufficient energy intake from 1993–94 to 2011–12 in India. Policies in India need to improve energy intake among low-SES households, particularly in rural areas.
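To make the two outcome measures concrete, the minimal Python sketch below computes average per-person daily energy intake from 30-day household food intake and flags insufficiency at less than 80% of a recommended norm. The norms used here (2,400 kcal/day rural, 2,100 kcal/day urban, figures commonly used in Indian nutrition and poverty-line work) and the example household are assumptions for illustration, not values quoted from this study.

```python
# Assumed recommended intakes (kcal/person/day); the study's exact
# norms may differ.
RECOMMENDED_KCAL = {"rural": 2400, "urban": 2100}

def daily_kcal_per_person(household_kcal_30d, household_size):
    """Average dietary energy intake per person per day."""
    return household_kcal_30d / (30 * household_size)

def insufficient(kcal_per_person_day, sector, threshold=0.80):
    """Flag households consuming less than 80% of the recommended intake."""
    return kcal_per_person_day < threshold * RECOMMENDED_KCAL[sector]

# Hypothetical rural household: 5 members, 315,000 kcal over 30 days.
intake = daily_kcal_per_person(315_000, 5)
print(f"{intake:.0f} kcal/person/day, "
      f"insufficient={insufficient(intake, 'rural')}")
```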
The objectives of this paper are to: (1) identify contextual factors, such as policy, that impacted the implementation of community-based primary health care (CBPHC) innovations among 12 Canadian research teams and (2) describe strategies used by the teams to address contextual factors influencing implementation of CBPHC innovations. In primary care settings, consideration of contextual factors when implementing change has been recognized as critically important to success. However, contextual factors are rarely recorded, analyzed or considered when implementing change. The lack of consideration of contextual factors has negative implications not only for successfully implementing primary health care (PHC) innovations, but also for their sustainability and scalability. For this evaluation, data collection was conducted using self-administered questionnaires and follow-up telephone interviews with team representatives. We used a combination of directed and conventional content analysis approaches to analyze the questionnaire and interview data. Representatives from all 12 teams completed the questionnaire and 11 teams participated in the interviews; 40 individuals participated in this evaluation. Four themes representing contextual factors that impacted the implementation of CBPHC innovations were identified: (I) diversity of jurisdictions; (II) complexity of interactions and collaborations; (III) policy; and (IV) the multifaceted nature of PHC. The teams used six strategies to address these contextual factors: (1) conducting an environmental scan at the beginning; (2) maintaining engagement among partners and stakeholders by encouraging open and inclusive communication; (3) contextualizing the innovation for different settings; (4) anticipating and addressing changes, delays, and the need for additional resources; (5) fostering a culture of research and innovation among partners and stakeholders; and (6) ensuring information about the innovation is widely available. Implementing CBPHC innovations across jurisdictions is complex and involves navigating multiple contextual factors. The dynamic nature of context should be considered when implementing innovations.
We used multivariable analyses to assess whether meeting core elements was associated with antibiotic utilization. Compliance with 7 elements versus not doing so was associated with higher use of broad-spectrum agents for community-acquired infections [days of therapy per 1,000 patient days: 155 (39) vs 133 (29), P = .02] and anti-methicillin-resistant S. aureus agents [days of therapy per 1,000 patient days: 145 (37) vs 124 (30), P = .03].
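For readers less familiar with the utilization metric reported above: days of therapy (DOT) counts each antimicrobial agent a patient receives on a calendar day as one DOT, and the rate normalizes the total by hospital census. A minimal Python sketch with made-up numbers follows.

```python
def dot_per_1000_patient_days(days_of_therapy, patient_days):
    """Antimicrobial-use rate: total days of therapy (one per agent,
    per patient, per calendar day) per 1,000 patient-days."""
    return 1000.0 * days_of_therapy / patient_days

# Hypothetical month: 620 DOT of broad-spectrum agents across
# 4,000 patient-days gives a rate of 155, the scale of the
# values reported above.
print(f"{dot_per_1000_patient_days(620, 4000):.0f} DOT per 1,000 patient-days")
```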
Scholars and observers worry that Congress has lost its capacity to perform its functions in the American political system. Drawing on an array of data on Congress’s activities and processes along with in-depth interviews with long-serving lawmakers and high-level staffers, we take stock of how changes to internal processes have affected Congress’s institutional capacities. In doing so, we make two interrelated arguments. First, we argue that Congress can take transformative action whether the legislative process is centralized and leadership-led or whether it is decentralized and committee-led. Second, we argue that Congress is better able than in previous eras to engage in conflict-clarifying representation in order to express and educate the public on the positions of the parties. We conclude that changes to congressional processes in recent years should be viewed as adaptations to the challenges of contemporary lawmaking. These adaptations help preserve Congress’s institutional capacity, but they have undoubtedly had negative consequences for open deliberation and individual member input into legislation.
Background: There is an unmet need for blood-based biomarkers that can reliably detect MS disease activity. Serum biomarkers of interest include neurofilament light chain (NfL), glial fibrillary acidic protein (GFAP) and Tau. Bone marrow transplantation (BMT) is reserved for aggressive forms of MS and has been shown to halt detectable CNS inflammatory activity for prolonged periods. Significant pre-treatment tissue damage followed by inflammatory disease abeyance should therefore be reflected in longitudinal sera collected from these patients. Methods: Sera were collected from 23 MS patients pre-treatment and following BMT at 3, 6, 9 and 12 months, in addition to sera from 33 non-inflammatory neurological controls. Biomarker quantification was performed with SiMoA. Results: Pre-AHSCT levels of serum NfL and GFAP, but not Tau, were elevated compared to controls (p=0.0001), and NfL correlated with lesion-based disease activity (6-month relapse, MRI-T2 and gadolinium enhancement). At 3 months post-treatment, while NfL levels remained elevated, Tau/GFAP paradoxically increased (p=0.0023/0.0017). These increases at 3 months correlated with MRI ‘pseudoatrophy’ at 6 months. NfL/Tau levels dropped to those of controls by 6 months (p=0.0036/0.0159). GFAP levels dropped progressively after 6 months, although even at 12 months they remained higher than in controls (p=0.004). Conclusions: NfL was the closest correlate of MS disease activity and treatment response. Chemotherapy-related toxicity may account for transient increases in NfL, Tau and MRI brain atrophy post-BMT.
Background: Stimulation frequency has been considered a crucial determinant of efficacy in deep brain stimulation (DBS). DBS at frequencies over 250Hz is not currently employed and consensus in the field suggests that higher frequencies are not clinically effective. With the recent demonstration of clinically effective ultra-high frequency (UHF) spinal cord stimulation at 10kHz we tested whether UHF stimulation could also be clinically useful in movement disorder patients with DBS. Methods: We studied the effects of conventional (130Hz) and UHF stimulation in five patients with Parkinson’s disease (PD) with STN DBS and in one patient with essential tremor (ET) with VIM DBS. We compared the clinical benefit and adverse effects of stimulation at various amplitudes either intraoperatively or postoperatively with the electrodes externalized. Results: Motor performance improved in all six patients with UHF DBS. 10kHz stimulation at amplitudes ≥3.0mA appeared to be as effective as 130Hz in improving motor symptoms (46.2% vs 53.5% motor score reduction, p=0.110, N=90 trials). Interestingly, 10kHz stimulation resulted in fewer stimulation-induced paresthesiae and speech adverse effects than 130Hz stimulation. Conclusions: Our results indicate that DBS at 10kHz produces clinical benefits while possibly reducing stimulation-induced adverse effects in patients with movement disorders.
We present an account of why we decided to retract a paper. We discovered a lack of adherence to conventions of trial registration, execution, interpretation and reporting and, consequently, needed to correct the scientific record with the authors. We set out our responses in general terms to strengthen research integrity.
Declaration of interest
K.S.B. is Editor-in-Chief of the British Journal of Psychiatry. W.L., K.R.K. and S.M.L. are members of the senior editorial committee and the research integrity committee for the journal. In the past three years, S.M.L. has received research support from Janssen and Lundbeck, and personal support from Janssen, Otsuka and Sunovion.
Introduction: During the one-year CCFP-EM program, residents rotate through different teaching sites. The purpose of this project is to investigate differences in procedural skills acquisition between these sites, which will help identify the effectiveness of each setting for teaching procedural skills to EM trainees. Methods: Over a two-year period, residents enrolled in a CCFP-EM residency training program were asked to log their procedures and the sites where they were performed. The cumulative data were analyzed to show the number and types of procedures performed at each site. Results: A total of 477 procedures were logged over two years, with 198 procedures performed at urban tertiary emergency departments (EDs), 116 at community EDs, 87 at intensive care units (ICUs), 37 at urgent care centres, 24 in clinics, and 15 in other settings. Overall, 48 point-of-care ultrasounds, 75 vascular access procedures, 99 reductions/castings, 48 lumbar punctures, 29 procedural sedations, 125 minor surgical procedures, and 32 other procedures were performed. The majority of procedures were performed at the urban tertiary care ED, followed closely by the community ED setting. The only exception was vascular access, which was performed most commonly in ICU settings. Conclusion: Our urban tertiary care ED setting provided the most learning opportunities for procedural skill acquisition, suggesting that maximizing time allocated to this setting is essential for EM learners to acquire procedural skills. One exception is that EM learners gain more vascular access training in ICUs.
Introduction: It is recommended that seniors presenting to the Emergency Department (ED) undergo a comprehensive geriatric screening, which is difficult for most EDs. Patient self-assessment using an electronic tablet could be an interesting solution to this issue. However, the acceptability of self-assessment to older ED patients remains unknown. Assessing acceptability is a fundamental step in evaluating new interventions. The main objective of this project is to compare the acceptability of older patient self-assessment in the ED to that of a standard assessment made by a professional, according to seniors and their caregivers. Methods: Design: This randomized crossover cohort study took place between May and July 2018. Participants: 1) patients aged ≥65 years presenting to the ED, and 2) their caregiver, when present. Measurements: Patients performed self-assessment of their frailty and cognitive and functional status using an electronic tablet. Acceptability was measured using the Treatment Acceptability and Preferences (TAP) questionnaires. Analyses: Descriptive analyses were performed for sociodemographic variables. Scores were adjusted for confounding variables using multivariate linear regression. Thematic content analysis was performed by two independent analysts for qualitative data collected in the TAP's open-ended question. Results: A total of 67 patients were included in this study. Mean age was 75.5 ± 8.0 years, and 55.2% of participants were women. Adjusted mean TAP scores for RA evaluation and patient self-assessment were 2.36 and 2.20, respectively. We found no difference between the two types of evaluation (p = 0.0831). When patients were stratified by age group, patients aged 85 and over (n = 11) showed a difference between TAP scores: 2.27 for RA evaluation and 1.72 for patient self-assessment (p = 0.0053). Our qualitative data show that this might be attributed to the use of technology rather than to the self-assessment itself. Data from 9 caregivers showed a mean TAP score of 2.42 for RA evaluation and 2.44 for self-assessment; however, this relatively small sample size prevented us from performing statistical tests. Conclusion: Our results show that older patients find self-assessment in the ED using an electronic tablet just as acceptable as a standard evaluation by a professional.
Introduction: Although acute gastroenteritis is an extremely common childhood illness, there is a paucity of literature characterizing the associated pain and its management. Our primary objective was to quantify the pain experienced by children with acute gastroenteritis in the 24 hours prior to emergency department (ED) presentation. Secondary objectives included describing maximum pain, analgesic use, discharge recommendations, and factors that influenced analgesic use in the ED. Methods: Study participants were recruited into this prospective cohort study by the Alberta Provincial Pediatric EnTeric Infection TEam between January 2014 and September 2017. This study was conducted at two Canadian pediatric EDs: the Alberta Children's Hospital (Calgary) and the Stollery Children's Hospital (Edmonton). Eligibility criteria included age <18 years, acute gastroenteritis (≥3 episodes of diarrhea or vomiting in the previous 24 hours), and symptom duration ≤7 days. The primary study outcome, caregiver-reported maximum pain in the 24 hours prior to presentation, was assessed using the 11-point Verbal Numerical Rating Scale. Results: We recruited 2136 patients, median age 20.8 months (IQR 10.4, 47.4); 45.8% (979/2136) were female. In the 24 hours prior to enrolment, 28.6% (610/2136) of caregivers reported that their child experienced moderate (4-6) pain and 46.2% (986/2136) severe (7-10) pain. During the emergency visit, 31.1% (664/2136) described pain as moderate and 26.7% (571/2136) as severe. In the ED, analgesia was provided to 21.2% (452/2131) of children. The most commonly administered analgesics in the ED were ibuprofen (68.1%, 308/452) and acetaminophen (43.4%, 196/452); at home, acetaminophen was most commonly administered (77.7%, 700/901), followed by ibuprofen (37.5%, 338/901). Factors associated with analgesia use in the ED were greater pain scores during the visit, having a primary-care physician, shorter illness duration, fewer diarrheal episodes, presence of fever, and hospitalization. Conclusion: Although children presenting to the ED with acute gastroenteritis experience moderate to severe pain, both prior to and during their emergency visit, analgesic use is limited. Future research should focus on appropriate pain management through the development of effective and safe pain treatment plans.
Introduction: According to the WHO, one third of patients aged ≥65 fall every year. These falls account for 25% of all geriatric emergency department (ED) visits. Fear of falling (FOF) is common in older patients who have sustained a fall and is associated with a decline in mobility and with health issues. We hypothesized that there is an association between FOF and return to the ED (RTED) and future falls. Objective: To assess the relationship between FOF and both RTED and subsequent falls in older ED patients. Methods: This research was conducted as part of the Canadian Emergency Team Initiative in elderly (CETIe) multicenter prospective cohort study from 2011 to 2016. Participants: Patients 65 years or older who were assessed and discharged from the ED following a minor trauma. They had to be independent in all basic activities of daily living and able to communicate in English or French. Measures: The primary outcome was RTED and the secondary outcome was subsequent falls; both were self-reported at 3 and 6 months. Patients were stratified according to the Short Falls Efficacy Scale International (SFES-I) score, which assesses FOF in different situations; the total score classifies FOF as mild, moderate or severe. Previous falls and the TUG were used to evaluate patients' mobility. The OARS, ISAR and SOF were used to evaluate patient frailty. Descriptive statistics were computed and multiple regression analyses were performed to assess the association between the SFES-I score and the outcomes. Results: FOF was measured in 2899 participants, of whom 2214 completed the 3-month follow-up and 2009 the 6-month follow-up. The odds ratio (OR) of return to the ED at 3 months was 1.10 for moderate FOF and 1.52 for severe FOF (Type 3 test p = 0.11). At 6 months, the OR was 1.03 for moderate FOF and 1.25 for severe FOF (Type 3 test p = 0.63). The OR of a subsequent fall at 3 months was 1.80 for moderate FOF and 2.18 for severe FOF (Type 3 test p < 0.001). At 6 months, the OR of a subsequent fall was 1.63 for moderate FOF and 2.37 for severe FOF (Type 3 test p < 0.001). Conclusion: This multicenter cohort study showed that severe fear of falling is strongly associated with subsequent falls over the 6 months following ED discharge, but not significantly associated with return-to-ED episodes. Further research should analyze the association between severe FOF and RTED.
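For context on the stratification, the Short FES-I sums seven items, each scored 1-4, into a 7-28 total that is banded into levels of fear of falling. The Python sketch below uses cut-points from one commonly cited convention (7-8 low, 9-13 moderate, 14-28 high); the abstract does not state this study's exact thresholds, so both the cut-points and the mild/moderate/severe labels here are assumptions.

```python
def classify_fof(sfes_i_total):
    """Band a Short FES-I total score (7-28) into a fear-of-falling
    level, using assumed cut-points (low 7-8, moderate 9-13,
    high 14-28); the study's thresholds may differ."""
    if not 7 <= sfes_i_total <= 28:
        raise ValueError("Short FES-I total must be between 7 and 28")
    if sfes_i_total <= 8:
        return "mild"
    if sfes_i_total <= 13:
        return "moderate"
    return "severe"

for score in (7, 11, 20):
    print(score, classify_fof(score))
```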
Introduction: Individualizing the risk of stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke-Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLRs) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 ± 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the risk of stroke/CEA ≤7 days as: Low (probability <0.2%; iLR 0.20 [95% CI 0.091-0.44]); Moderate (probability 1.3%; iLR 0.79 [0.68-0.92]); High (probability 2.6%; iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke ≤7 days alone yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Medium iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4]. Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
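The interval likelihood ratios reported above compare, within each risk stratum, the proportion of outcome-positive patients who fall in that stratum with the proportion of outcome-negative patients who do. A hedged Python sketch follows, using the standard log-method (delta-method) confidence interval and made-up counts; it illustrates the generic calculation, not this study's analysis code.

```python
import math

def interval_lr(events_in_stratum, total_events,
                nonevents_in_stratum, total_nonevents, z=1.96):
    """Interval likelihood ratio for one risk stratum:
    iLR = (a / total events) / (b / total non-events),
    with a log-method approximate 95% CI."""
    a, b = events_in_stratum, nonevents_in_stratum
    ilr = (a / total_events) / (b / total_nonevents)
    se_log = math.sqrt(1/a - 1/total_events + 1/b - 1/total_nonevents)
    return ilr, ilr * math.exp(-z * se_log), ilr * math.exp(z * se_log)

# Hypothetical counts: 120 of 181 outcome-positive patients and
# 2,000 of 7,388 outcome-free patients classified as high risk.
ilr, lo, hi = interval_lr(120, 181, 2000, 7388)
print(f"high-risk iLR = {ilr:.2f} [95% CI {lo:.2f}-{hi:.2f}]")
```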