Objectives: This study aimed to evaluate the influence of lower limb loss (LL) on mental workload by assessing neurocognitive measures in individuals with unilateral transtibial (TT) versus transfemoral (TF) LL during dual-task walking under varying cognitive demand. Methods: Electroencephalography (EEG) was recorded as participants performed a task of varying cognitive demand while seated or walking (i.e., varying physical demand). Results: Both groups (TT LL vs. TF LL) exhibited a similar EEG theta synchrony response as either the cognitive or the physical demand increased. While individuals with TT LL maintained similar performance on the cognitive task across seated and walking conditions, those with TF LL exhibited performance decrements (slower response times) on the cognitive task in the walking condition compared with the seated condition. Furthermore, those with TF LL exhibited neither regional differences in EEG low-alpha power while walking, nor EEG high-alpha desynchrony as a function of cognitive task difficulty while walking. This lack of alpha modulation coincided with no elevation of theta/alpha ratio power as a function of cognitive task difficulty in the TF LL group. Conclusions: This work suggests that the two groups share some common neurocognitive features during dual-task walking but also differ in others. Although all participants were able to recruit neural mechanisms critical for maintaining cognitive-motor performance under elevated cognitive or physical demands, the observed differences indicate that walking with a prosthesis while concurrently performing a cognitive task imposes additional cognitive demand in individuals with more proximal levels of amputation.
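The theta/alpha ratio measure used in the abstract above can be illustrated with a minimal band-power computation. The sampling rate, band edges, and synthetic signal below are assumptions chosen for illustration, not the study's actual EEG pipeline.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Sum of DFT power in [f_lo, f_hi] Hz (naive DFT, stdlib only)."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            x_k = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                      for t in range(n))
            power += abs(x_k) ** 2
    return power

def theta_alpha_ratio(signal, fs):
    """Theta (4-7 Hz) over alpha (8-12 Hz) band power."""
    return band_power(signal, fs, 4.0, 7.0) / band_power(signal, fs, 8.0, 12.0)

# Synthetic 2-second 'EEG' trace: strong 6 Hz (theta) plus weak 10 Hz (alpha).
fs = 128
sig = [2.0 * math.sin(2 * math.pi * 6 * t / fs)
       + 0.5 * math.sin(2 * math.pi * 10 * t / fs) for t in range(2 * fs)]
print(theta_alpha_ratio(sig, fs))  # theta dominates: ratio close to 16
```

An elevated ratio of this kind is the quantity whose modulation with task difficulty was absent in the TF LL group.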
We provide an update on diagnostic methods for the detection of urogenital schistosomiasis (UGS) in men and highlight that satisfactory urine-antigen diagnostics for UGS lag well behind those for intestinal schistosomiasis, where application of a urine-based point-of-care strip assay, the circulating cathodic antigen (CCA) test, is now advocated. Making specific reference to male genital schistosomiasis (MGS), we place greater emphasis on parasitological detection methods and clinical assessment of internal genitalia with ultrasonography. Unlike the advances made in defining a clinical standard protocol for female genital schistosomiasis, MGS remains inadequately defined. Whilst urine filtration with microscopic examination for ova of Schistosoma haematobium is a convenient but error-prone proxy of MGS, we describe a novel low-cost sampling and direct visualization method for the enumeration of ova in semen. Using exemplar clinical cases of MGS from our longitudinal cohort study among fishermen along the shoreline of Lake Malawi, the portfolio of diagnostic needs is appraised, including: the use of symptomatology questionnaires, urine analysis (egg count and CCA measurement), semen analysis (egg count, circulating anodic antigen measurement and real-time polymerase chain reaction analysis), alongside clinical assessment with portable ultrasonography.
Several research teams have previously traced patterns of emerging conduct problems (CP) from early or middle childhood. The current study expands on this previous literature by using a genetically-informed, experimental, and long-term longitudinal design to examine trajectories of early-emerging conduct problems and early childhood discriminators of such patterns from the toddler period to adolescence. The sample represents a cohort of 731 toddlers and diverse families recruited based on socioeconomic, child, and family risk, varying in urbanicity and assessed on nine occasions between ages 2 and 14. In addition to examining child, family, and community level discriminators of patterns of emerging conduct problems, we were able to account for genetic susceptibility using polygenic scores and the study's experimental design to determine whether random assignment to the Family Check-Up (FCU) discriminated trajectory groups. In addition, in accord with differential susceptibility theory, we tested whether the effects of the FCU were stronger for those children with higher genetic susceptibility. Results augmented previous findings documenting the influence of child (inhibitory control [IC], gender) and family (harsh parenting, parental depression, and educational attainment) risk. In addition, children in the FCU were overrepresented in the persistent low versus persistent high CP group, but such direct effects were qualified by an interaction between the intervention and genetic susceptibility that was consistent with differential susceptibility. Implications are discussed for early identification and specifically, prevention efforts addressing early child and family risk.
This study investigates suicide risk in late childhood and early adolescence in relation to a family-centered intervention, the Family Check-Up, for problem behavior delivered in early childhood. At age 2, 731 low-income families receiving nutritional services from Women, Infants, and Children programs were randomized to the Family Check-Up intervention or to a control group. Trend-level main effects were observed on endorsement of suicide risk by parents or teachers from ages 7.5 to 14, with higher rates of suicide risk endorsement in youth in the control versus intervention condition. A significant indirect effect of intervention was also observed, with treatment-related improvements in inhibitory control across childhood predicting reductions in suicide-related risk both at age 10.5, assessed via diagnostic interviews with parents and youth, and at age 14, assessed via parent and teacher reports. Results add to the emerging body of work demonstrating long-term reductions in suicide risk related to family-focused preventive interventions, and highlight improvements in youth self-regulatory skills as an important mechanism of such reductions in risk.
Building on prior work using Tom Dishion's Family Check-Up, the current article examined intervention effects on dysregulated irritability in early childhood. Dysregulated irritability, defined as reactive and intense response to frustration and prolonged angry mood, is an ideal marker of neurodevelopmental vulnerability to later psychopathology because it is a transdiagnostic indicator of decrements in self-regulation that are measurable in the first years of life and that have lifelong implications for health and disease. This study is perhaps the first randomized trial to examine the direct effects of an evidence- and family-based intervention, the Family Check-Up (FCU), on irritability in early childhood and the effects of reductions in irritability on later risk of child internalizing and externalizing symptomatology. Data from the geographically and sociodemographically diverse multisite Early Steps randomized prevention trial were used. Path modeling revealed intervention effects on irritability at age 4, which predicted lower externalizing and internalizing symptoms at age 10.5. Results indicate that family-based programs initiated in early childhood can reduce early childhood irritability and later risk for psychopathology. This holds promise for earlier identification and prevention approaches that target transdiagnostic pathways. Implications for future basic and prevention research are discussed.
OBJECTIVES/SPECIFIC AIMS: 1) Describe strategies pediatric providers perceive to improve chlamydia screening of sexually active female adolescents (SA), and 2) describe barriers to regular screening of SA for chlamydia. METHODS/STUDY POPULATION: Using qualitative methods, 14 general pediatric providers across 7 clinical sites in Vermont were interviewed to ascertain best practices and remaining challenges. Semi-structured interviews lasting 30-45 minutes were audiotaped and transcribed. Chlamydia screening rates provided by BCBS-VT were used to categorize participant responses across three performance tiers; data were coded and themes identified within these tiers. RESULTS/ANTICIPATED RESULTS: Facilitators: When asked to describe facilitators of chlamydia screening, providers in the top performance tier emphasized the importance of adequate insurance to cover the cost of testing. Providers in the middle performance tier cited use of pre-visit questionnaires, and those in the bottom performance tier identified no best practices. Other strategies included improving physician confidence and awareness, establishing practice- and individual-level routines, and providing strong leadership and communication of local screening rates. Barriers: Across the 3 performance tiers, the most common challenges to consistent chlamydia screening were threats to patient confidentiality, cost of the screening test, and the requirement for patient disclosure of sexual activity. Less commonly, providers were concerned that adolescent patients could not reliably obtain screening off-site or fill treatment prescriptions without the help of a parent. DISCUSSION/SIGNIFICANCE OF IMPACT: The need for systematic, confidential, and inexpensive means of screening SA for chlamydia was highlighted in both the best practices and the challenges described by providers of pediatric care in the suburban practice setting.
Policy and practice interventions may target these needs to improve the reproductive health of female adolescents.
Objective: Post-stroke cognitive impairment is common, but mechanisms and risk factors are poorly understood. Frailty may be an important risk factor for cognitive impairment after stroke. We investigated the association between pre-stroke frailty and acute post-stroke cognition. Methods: We studied consecutively admitted acute stroke patients in a single urban teaching hospital during three recruitment waves between May 2016 and December 2017. Cognition was assessed using the mini-Montreal Cognitive Assessment (min=0; max=12). A Frailty Index was used to generate frailty scores for each patient (min=0; max=100). Clinical and demographic information was collected, including pre-stroke cognition, delirium, and stroke severity. We conducted univariate and multiple linear regression analyses with covariates forced in (age, sex, stroke severity, stroke type, pre-stroke cognitive impairment, delirium, previous stroke/transient ischemic attack) to investigate the association between pre-stroke frailty and post-stroke cognition. Results: Complete data were available for 154 stroke patients. Mean age was 68 years (SD=11; range=32–97); 93 (60%) were male. Median mini-Montreal Cognitive Assessment score was 8 (IQR=4–12). Mean Frailty Index score was 18 (SD=11). Pre-stroke cognitive impairment was apparent in 13/154 (8%) patients. Pre-stroke frailty was significantly associated with lower post-stroke cognition (Standardized-Beta=−0.40; p<0.001), and this association was independent of covariates (Unstandardized-Beta=−0.05; p=0.005). Additional significant variables in the multiple regression model were age (Unstandardized-Beta=−0.05; p=0.002), delirium (Unstandardized-Beta=−2.81; p<0.001), pre-stroke cognitive impairment (Unstandardized-Beta=−2.28; p=0.001), and stroke severity (Unstandardized-Beta=−0.20; p<0.001).
Conclusions: Pre-stroke frailty may be a moderator of post-stroke cognition, independent of other well-established post-stroke cognitive impairment risk factors. (JINS, 2019, 25, 501–506)
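For a single predictor, a standardized beta such as the −0.40 reported in the abstract above reduces to the Pearson correlation between predictor and outcome. The sketch below illustrates that computation on hypothetical data (not the study's), with variable names chosen to mirror the abstract.

```python
import math

def standardized_beta(x, y):
    """Standardized slope of a simple linear regression of y on x.

    With one predictor this equals the Pearson correlation:
    beta_std = cov(x, y) / (sd(x) * sd(y)).
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical example: higher Frailty Index, lower cognitive score.
frailty = [5, 10, 18, 25, 30, 40]   # Frailty Index (0-100)
cognition = [12, 11, 9, 7, 6, 4]    # mini-Montreal Cognitive Assessment (0-12)
print(round(standardized_beta(frailty, cognition), 3))
```

In the study itself the adjusted association came from a multiple regression with the listed covariates forced in; this sketch covers only the univariate case.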
Objective: The present review evaluated the effectiveness of environmental-based interventions aimed at improving the dietary and physical activity behaviours and body composition indices of adults in institutions.
Design: A systematic review was conducted. Electronic databases (MEDLINE, Embase, PsycINFO, CINAHL, The Cochrane Library, Web of Science, ProQuest Dissertations and Theses, Scopus and Athena) were searched for relevant articles published between database inception and October 2017. Searching, selecting and reporting were undertaken according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement.
Setting: Military establishments and maritime workplaces.
Participants: Adults in institutions, aged 18–45 years.
Results: A total of 27 842 articles were screened for eligibility; nine studies (reported in eleven articles) were included in the review. Five studies used multilevel strategies and four used environmental strategies only. Duration of follow-up ranged from 3 weeks to 10 years. Eight of the studies reported significant positive effects on dietary behaviours, but effect sizes varied. The study that targeted physical activity had no effect on activity levels but did have a significant positive effect on physical fitness. No evidence was identified that the studies resulted in improvements in body composition indices.
Conclusions: The evidence base appears to be in favour of implementing environmental interventions in institutions to improve the dietary behaviours of adults. However, owing to the small number of studies included in the review and the variable methodological quality of the studies and intervention reporting, further well-designed evaluation studies are required.
Soldier operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining injury/illness free. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data world-wide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment compared with late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05) and remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, where such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer-term.
Development involves synergistic interplay among genotypes and the physical and cultural environments, and integrating genetics into experimental designs that manipulate the environment can improve understanding of developmental psychopathology and intervention efficacy. Consistent with differential susceptibility theory, individuals can vary in their sensitivity to environmental conditions including intervention for reasons including their genotype. As a consequence, understanding genetic influences on intervention response is critical. Empirically, we tested an interaction between a genetic index representing sensitivity to the environment and the Family Check-Up intervention. Participants were drawn from the Early Steps Multisite randomized prevention trial that included a low-income and racially/ethnically diverse sample of children and their families followed longitudinally (n = 515). As hypothesized, polygenic sensitivity to the environment moderated the effects of the intervention on 10-year-old children's symptoms of internalizing psychopathology, such that children who were genetically sensitive and were randomly assigned to the intervention had fewer symptoms of child psychopathology than genetically sensitive children assigned to the control condition. A significant difference in internalizing symptoms assessed with a clinical interview emerged between the intervention and control groups for those 0.493 SD above the mean on polygenic sensitivity, or 25% of the sample. Similar to personalized medicine, it is time to understand individual and sociocultural differences in treatment response and individualize psychosocial interventions to reduce the burden of child psychopathology and maximize well-being for children growing up in a wide range of physical environments and cultures.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify decision maker needs to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. They and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
This study aimed to determine the prevalence and assemblages of Giardia duodenalis present in Scottish beef and dairy cattle at different ages, to try to ascertain if cattle could play a role in the spread of zoonotic assemblages of Giardia. A total of 388 fecal samples (128 beef and 253 dairy, seven of unknown breed) were collected from 19 farms in Scotland. Samples were sub-divided by host age: 1, 2, 3, 4, 5 and 6, 7–24 and ⩾25 weeks. DNA was extracted and tested by PCR to detect G. duodenalis DNA. Of the 388 samples, 126 tested positive, giving an overall prevalence of 32.5%, with positive samples being observed in all age groups tested. The prevalence in dairy cattle was 44.7% (113/253), which was significantly higher (P < 0.001) than the prevalence in beef cattle, 10.1% (13/128). Sequence analysis demonstrated the presence of assemblage E (77.2%, sequence types E-S1–E-S5), assemblage B (18.2%) and assemblage A (sub-assemblages AI-AII) (4.6%). These data demonstrate that G. duodenalis is found routinely in both dairy and beef cattle throughout Scotland; the presence of assemblages A and B also indicates that cattle may play a role in the spread of potentially zoonotic assemblages of Giardia.
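The prevalence comparison above can be recomputed from the reported counts. The chi-square test below is a plausible reconstruction (the abstract does not name its exact test), restricted to the 381 samples of known breed.

```python
def prevalence(pos, n):
    """Point prevalence as a percentage."""
    return 100.0 * pos / n

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]] (1 df)."""
    n = a + b + c + d
    rows = (a + b, c + d)
    cols = (a + c, b + d)
    chi2 = 0.0
    for i, obs in enumerate((a, b, c, d)):
        exp = rows[i // 2] * cols[i % 2] / n
        chi2 += (obs - exp) ** 2 / exp
    return chi2

# Counts from the abstract: 113/253 dairy and 13/128 beef samples positive.
print(round(prevalence(126, 388), 1))   # overall: 32.5
print(round(prevalence(113, 253), 1))   # dairy:   44.7
chi2 = chi_square_2x2(113, 253 - 113, 13, 128 - 13)
print(round(chi2, 1))  # far above 10.83, the 1-df critical value for P < 0.001
```

The resulting statistic is consistent with the P < 0.001 difference between dairy and beef prevalence reported in the abstract.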
This article looks at the ways in which the Panacea Society – a heterodox, millenarian group based in Bedford during the inter-war years – spread its ideas: through personal, familial and shared belief networks across the British empire; by building new modes of attracting adherents, in particular a global healing ministry; and by shipping its publications widely. It then examines how the society appealed to its (white) members in the empire in three ways: through its theology, which put Britain at the centre of the world; by presuming the necessity and existence of a ‘Greater Britain’ and the British empire, while in so many other quarters these entities were being questioned in the wake of World War I; and by a deliberately cultivated and nostalgic notion of ‘Englishness’. The Panacea Society continued and developed the idea of the British empire as providential at a time when the idea no longer held currency in most circles. The article draws on the rich resource of letters in the Panacea Society archive to contribute to an emerging area of scholarship on migrants’ experience in the early twentieth-century British empire (especially the dominions) and their sense of identity, in this case both religious and British.
The asymptotic phase θ of an initial point x in the stable manifold of a limit cycle (LC) identifies the phase of the point on the LC to which the flow φt(x) converges as t → ∞. The infinitesimal phase response curve (iPRC) quantifies the change in timing due to a small perturbation of a LC trajectory. For a stable LC in a smooth dynamical system, the iPRC is the gradient ∇x(θ) of the phase function, which can be obtained via the adjoint of the variational equation. For systems with discontinuous dynamics, the standard approach to obtaining the iPRC fails. We derive a formula for the iPRCs of LCs occurring in piecewise smooth (Filippov) dynamical systems of arbitrary dimension, subject to a transverse flow condition. Discontinuous jumps in the iPRC can occur at the boundaries separating subdomains, and are captured by a linear matching condition. The matching matrix, M, can be derived from the saltation matrix arising in the associated variational problem. For the special case of linear dynamics away from switching boundaries, we obtain an explicit expression for the iPRC. We present examples from cell biology (Glass networks) and neuroscience (central pattern generator models). We apply the iPRCs obtained to study synchronization and phase-locking in piecewise smooth LC systems in which synchronization arises solely due to the crossing of switching manifolds.
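The construction described above can be summarized schematically. The notation and normalization convention below are assumptions for illustration; the paper derives the matching matrix M from the saltation matrix of the associated variational problem.

```latex
% Away from switching boundaries, the iPRC z(t) = \nabla_x \theta satisfies
% the adjoint equation along the limit cycle \gamma(t), with phase
% normalized to advance by 2\pi per period T:
\dot{z}(t) = -\bigl(Df(\gamma(t))\bigr)^{\top} z(t),
\qquad z(t) \cdot f(\gamma(t)) = \frac{2\pi}{T}.
% At each transverse crossing of a switching boundary at time t_i, the iPRC
% jumps according to a linear matching condition with matching matrix M,
% derived from the saltation matrix S of the variational equation:
z(t_i^{-}) = M^{\top} z(t_i^{+}).
```

The discontinuous jumps in the iPRC at subdomain boundaries mentioned in the abstract are exactly the action of this matching condition.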
We present the first results of simultaneous INTEGRAL and RXTE observations of the microquasar GRS 1915+105. We focus on the analysis of the unique highly variable observation and show that we might have observed a new class of variability. We then study the energy dependence of a low-frequency QPO from our steady observations.
Accurate and reproducible patient positioning is a critical step in radiotherapy for breast cancer. This has seen the use of permanent skin markings become standard practice in many centres. Permanent skin markings may have a negative impact on long-term cosmetic outcome, which may, in turn, have psychological implications in terms of body image. The aim of this study was to investigate the feasibility of using a semi-permanent tattooing device for the administration of skin marks for breast radiotherapy set-up.
Materials and methods
This was designed as a phase II double-blinded randomised-controlled study comparing our standard permanent tattoos with the Precision Plus Micropigmentation (PPMS) device method. Patients referred for radical breast radiotherapy were eligible for the study. Each study participant had three marks applied using a randomised combination of the standard permanent and PPMS methods and was blinded to the type of each mark. Follow up was at routine appointments until 24 months post radiotherapy. Participants and a blind assessor were invited to score the visibility of each tattoo at each follow-up using a Visual Analogue Scale. Tattoo scores at each time point and change in tattoo scores at 24 months were analysed by a general linear model using the patient as a fixed effect and the type of tattoo (standard or research) as covariate. A simple questionnaire was used to assess radiographer feedback on using the PPMS.
Results
In total, 60 patients were recruited to the study, of whom 55 were available for follow-up at 24 months. Semi-permanent tattoos demonstrated a greater degree of fade than the permanent tattoos at 24 months (final time point) post completion of radiotherapy. This was not statistically significant, although it was more apparent for the patient scores (p=0·071) than the blind assessor scores (p=0·27). No semi-permanent tattoos required re-marking before the end of radiotherapy and no adverse skin reactions were observed.
Conclusion
The PPMS presents a safe and feasible alternative to our permanent tattooing method. An extended period of follow-up is required to fully assess the extent of semi-permanent tattoo fade.
You don't know that p unless it's on account of your cognitive abilities that you believe truly that p. Virtue epistemologists think there's some such ability constraint on knowledge. This looks to be in considerable tension, though, with putative faith-based knowledge. For at least on a popular Christian conception, when you believe something truly on the basis of faith this isn't because of anything you're naturally competent to do. Rather, faith-based beliefs are entirely a product of divine agency. Appearances to the contrary, I argue in this article that there's no deep tension between faith-based knowledge and virtue epistemology. Not if we learn to conceive of faith as a kind of extended knowledge.