The Order of the Holy Trinity for the Redemption of Captives (or Trinitarian Order) is one of the least studied continental religious groups to have expanded into thirteenth-century England. This article examines shifting notions of Trinitarian redemption in late medieval England through the prism of the order's writing about Yorkshire hermit St Robert of Knaresborough (d. 1218). Against the Weberian theory of the routinization of charisma, it demonstrates that Robert's inspirational sanctity was never bound too rigidly by his Trinitarian hagiographers, who rather co-opted his unstable charisma in distinct yet complementary ways to facilitate institutional reinvention and spiritual flourishing in the fourteenth and fifteenth centuries.
To answer the question of whether we need global health ethics, we set ourselves three goals in this chapter. First, we explore a number of different ways that we might understand the term global health ethics. Second, we consider the arguments that could be used either to support or dismiss what we call substantive accounts of global health ethics. Finally, we make some suggestions about what (if any) global obligations may bind us. Our discussion uses public health as a running example to illustrate our points. The reason for this focus is that, in our view, we ought to think of public health as providing systematic structural support for population health, with the key aim of fulfilling the basic requirements to protect health and prevent illness. This is not to suggest that other forms of healthcare are unimportant, just that public health will fulfill a primary role in any attempt to address questions of global justice in relation to existing health inequalities.
Prenatal choline is a key nutrient, like folic acid and vitamin D, for fetal brain development and subsequent mental function. We sought to determine whether effects of higher maternal plasma choline concentrations on childhood attention and social problems, found in an initial clinical trial of choline supplementation, are observed in a second cohort.
Of 183 mothers enrolled from an urban safety-net hospital clinic, 162 complied with gestational assessments and brought their newborns for study at 1 month of age; 83 continued assessments through 4 years of age. Effects of maternal plasma choline concentrations ⩾7.07 μM at 16 weeks of gestation, 1 s.d. below the mean level obtained with supplementation in the previous trial, were compared to effects of lower levels. The Attention Problems and Withdrawn Syndrome scales on the Child Behavior Checklist 1½–5 were the principal outcomes.
Higher maternal plasma choline was associated with lower mean Attention Problems percentiles in children and, for male children, with lower Withdrawn percentiles. Higher plasma choline concentrations were also associated with lower Attention Problems percentiles for children of mothers who used cannabis during gestation, as well as for children of mothers who had gestational infection.
Prenatal choline's positive associations with early childhood behaviors are found in a second, more diverse cohort. Increases in attention problems and social withdrawal in early childhood are associated with later mental illnesses, including attention deficit disorder and schizophrenia. Choline concentrations in the pregnant women in this study replicate other research findings suggesting that most pregnant women do not obtain adequate choline from their diets.
Partnership working has been a long-standing objective of health and social policy, in recognition of the reality that few policy puzzles are simple and the preserve of any single agency or government department. ‘Wicked issues’ that display multiple, complex causes and demand intersectoral solutions require joined-up approaches at both national and local levels. Despite this awareness, in practice effective joint working remains the exception rather than the rule, and governments and their agents struggle to achieve success while remaining trapped in their silos and protecting their narrow interests (Hunter and Perkins, 2014; Perkins and Hunter, 2014).
The desire to do partnership working differently lay behind the creation of health and wellbeing boards (HWBs) during the 2012 NHS changes. At the time, there was much enthusiasm for these new entities and expectations ran high. Resources were made available to support boards and to help members consider how to create effective partnerships across local government and the National Health Service (NHS).
Sadly, some eight years later, the shine has gone off HWBs, despite the fact that the issues they were set up to tackle remain as visible and deep-seated as ever. Indeed, after nearly a decade of austerity, which has contributed to a sharp rise in health inequalities and entrenched the North–South divide, and with the potential fallout from Brexit around the corner, carrying as yet unknown but almost certainly negative consequences, the need for powerful HWBs that can bring about real change to improve the lot of ravaged communities has never been greater.
A recent study (led by the author of this chapter) of HWBs in England established in 2012, when responsibility for public health transferred to local government, found that, with few exceptions, HWBs punched below their weight and were not the powerful system leaders that had been hoped for (Hunter et al, 2017). Similar findings were reported in a series of reviews conducted by Shared Intelligence in the early years of HWBs (Shared Intelligence, 2013, 2014, 2015). With the advent of sustainability and transformation programmes (STPs) and integrated care partnerships (ICPs) following the introduction in 2014 of a major programme of reform within the NHS (NHSE, 2014), an opportunity for HWBs to become key drivers for a new integrated approach with local government at its centre has largely been missed.
Background: In March 2012, the Veterans’ Health Administration (VHA) published the Guideline for the Prevention of Clostridium difficile infection (CDI) in VHA Inpatient Acute-Care Facilities, with a goal of a 30% reduction in cases within 2 years. In March 2011, this facility, along with 31 others, served as a pilot site to develop the guidelines. Methods: The CDI prevention bundle was implemented to prevent new-onset CDI cases in the facility with 4 core measures: (1) environmental cleaning (EMS), (2) hand hygiene, (3) contact precautions, and (4) cultural transformation. Education was provided to EMS staff, nursing, and care providers on the CDI case definition, criteria for testing, empiric isolation for patients with diarrhea, hand hygiene, and PPE to control spread. In 2014, antimicrobial stewardship was added, and within 5 years an algorithm for isolation and testing was published. Cases were reviewed weekly using TheraDoc software and were reported monthly to the national VHA Inpatient Evaluation Center (IPEC). Isolation was communicated using a ward roster/isolation list in TheraDoc for all unit champions to consult daily. CDI cases were classified using NHSN definitions for a laboratory-identified (LabID) event, recurrent cases, and community-onset cases. Real-time case review and weekly multidisciplinary case discussions identified opportunities for improved compliance with the core measures. Results: Over the 8-year period, healthcare-onset CDI LabID events decreased by 73%, from 149 to 40 cases. The infection rate decreased 70%, from 16.19 per 10,000 bed days of care in FY2011 (October 2010) to 4.88 in FY2019. The incidence of community-onset infections increased from 75 in FY2011 to a high of 146 in FY2018, corresponding to rates of 8.15 and 18.17 per 10,000 bed days, respectively. In FY2019, both LabID events and community-onset cases decreased, to lows of 40 and 102, respectively. Inappropriate testing decreased by 84%, from 50 in FY2011 to 8 in FY2019. Conclusions: A multidisciplinary team approach that included support from leadership and clinical providers, as well as front-line staff involvement, daily rounding, and case review by infection preventionists, reduced all CDI cases over the 8-year period using the modified VHA CDI bundle. TheraDoc enabled case review, correct isolation, changes to cleaning practices, and more appropriate lab testing. The antimicrobial stewardship program, which includes clinical pharmacists working daily with providers, was a strong driver for change.
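For readers who want to reproduce the rate arithmetic reported in this abstract, a minimal sketch in Python follows. The rate figures are taken directly from the abstract; the function name and the inferred bed-days total are illustrative assumptions, not data from the VHA study.

```python
def rate_per_10k_bed_days(cases: int, bed_days: float) -> float:
    """Infection rate expressed per 10,000 bed days of care."""
    return cases / bed_days * 10_000

# Figures from the abstract: 149 cases at 16.19 per 10,000 bed days (FY2011)
# imply roughly 149 / 16.19 * 10,000 ~ 92,000 bed days of care per year
# (an inference for illustration, not a reported number).
fy2011_rate, fy2019_rate = 16.19, 4.88
reduction = (fy2011_rate - fy2019_rate) / fy2011_rate
print(f"LabID rate reduction: {reduction:.0%}")  # ~70%, matching the abstract
```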
This chapter is premised on the belief that ethical medical and mental healthcare require a social justice framework, one that takes as its principal stance that healthcare is a human right. Utilizing Ruger’s health capability paradigm, we consider how, both within the United States itself and globally, the engagement of a perspective of positive rights regarding health and healthcare is required when approaching how best to address social disparities and access to medical and mental health treatment. We first consider the health capability paradigm and how it conceptually frames, within a social justice perspective, the right to health and the provision of healthcare for children and youth. We next explore how this model can be considered within a discussion of how dual medical and mental healthcare is currently practiced in a context where the patients referred and treated are from lower- to lower-middle-class socioeconomic communities and utilize Medicaid as their primary source of insurance for healthcare. We focus specifically on the provision of the forms of medical and mental healthcare that are often the least considered and consequently unreimbursed by Medicaid: neuropsychological assessment and consultation/liaison for mental health. We next present a case that addresses the application of the health capability paradigm within a child and adolescent psychiatric care setting, and then end with a discussion of how communities, both locally and within the United States, can better engage this model as a means for justifying more equitable and conscientious care for children and adolescents.
Leukoaraiosis, or white matter rarefaction, is a common imaging finding in aging and is presumed to reflect vascular disease. When the presentation is severe, potential congenital or acquired etiologies are investigated, prompting referral for neuropsychological evaluation in addition to neuroimaging. T2-weighted imaging is the most common magnetic resonance imaging (MRI) approach to identifying white matter disease. However, more advanced diffusion MRI techniques may provide additional insight into mechanisms that influence the abnormal T2 signal, especially when clinical presentations are discrepant with imaging findings.
We present a case of a 74-year-old woman with severe leukoaraiosis. She was examined by a neurologist, neuropsychologist, and rheumatologist, and completed conventional (T1, T2-FLAIR) MRI, diffusion tensor imaging (DTI), and advanced single-shell, high b-value diffusion MRI (i.e., fiber ball imaging [FBI]).
The patient was found to have few neurological signs, no significant cognitive impairment, a negative workup for leukoencephalopathy, and a positive antibody for Sjogren’s disease, for which her degree of leukoaraiosis would be highly atypical. Tractography results indicated intact axonal architecture that was better resolved using FBI than DTI.
This case illustrates exceptional cognitive resilience in the face of severe leukoaraiosis and the potential for advanced diffusion MRI to identify brain reserve.
Implementation of genome-scale sequencing in clinical care faces significant challenges: the technology is highly dimensional with many kinds of potential results, results interpretation and delivery require expertise and coordination across multiple medical specialties, clinical utility may be uncertain, and there may be broader familial or societal implications beyond the individual participant. Transdisciplinary consortia and collaborative team science are well poised to address these challenges. However, the complex web of organizational, institutional, physical, environmental, technologic, and other political and societal factors that influence the effectiveness of consortia remains understudied. We describe our experience working in the Clinical Sequencing Evidence-Generating Research (CSER) consortium, a multi-institutional translational genomics consortium.
A key aspect of the CSER consortium was the juxtaposition of site-specific measures with the need to identify consensus measures related to clinical utility and to create a core set of harmonized measures. During this harmonization process, we sought to minimize participant burden, accommodate project-specific choices, and use validated measures that allow data sharing.
Identifying platforms to ensure swift communication between teams and management of materials and data was essential to our harmonization efforts. Funding agencies can help consortia by clarifying key study design elements across projects during the proposal preparation phase and by providing a framework for sharing data across participating projects.
In summary, time and resources must be devoted to developing and implementing collaborative practices as preparatory work at the beginning of project timelines to improve the effectiveness of research consortia.
Osteoporosis was not considered a public health concern in black South African (SA) women until it was recently reported that the prevalence of vertebral fractures was 9.1% in black compared with 5.0% in white SA women. Accordingly, this study aimed to measure bone mineral density (BMD) in older black SA women and to investigate its association with risk factors for osteoporosis, including strength, muscle and fat mass, dietary intake and objectively measured physical activity (PA).
Methods and materials
Older black SA women (n = 122; age 68 (range 60–85) years) completed sociodemographic and quantitative food frequency questionnaires (QFFQ), provided fasting venous blood samples (25-hydroxycholecalciferol: vitamin D-25) and 24 h urine collections (to estimate protein intake), and underwent grip strength testing and PA monitoring (activPAL). Dual-energy x-ray absorptiometry (DXA) scans of the hip (femoral neck and total) and lumbar spine determined BMD, and whole-body scans measured fat and fat-free soft tissue mass (FFSTM). WHO classifications were used to define osteopenia (t-score -2.5 to -1) and osteoporosis (t-score < -2.5).
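As a point of reference for the WHO classification used in these methods, the sketch below encodes the stated t-score cut-offs in Python. The function is purely illustrative, and the handling of the exact boundary values is a convention rather than something specified in the abstract.

```python
def who_bmd_category(t_score: float) -> str:
    """Classify a DXA t-score using the WHO cut-offs cited in the methods
    (osteoporosis: t < -2.5; osteopenia: -2.5 to -1; otherwise normal)."""
    if t_score < -2.5:
        return "osteoporosis"
    elif t_score < -1.0:  # boundary handling at exactly -2.5 / -1 is a convention
        return "osteopenia"
    return "normal"

for t in (-0.4, -1.8, -3.1):
    print(f"t-score {t}: {who_bmd_category(t)}")  # normal, osteopenia, osteoporosis
```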
At the lumbar spine, 34.4% of the women (n = 42) had osteopenia and 19.7% (n = 24) had osteoporosis. At the left femoral neck, 32% (n = 40) had osteopenia and 13.1% (n = 16) had osteoporosis. Total left hip BMD indicated osteopenia in 27.9% (n = 34) and osteoporosis in 13.1% (n = 16) of participants. Multinomial regression revealed no differences in age (y) or frequency of falls in the past year between groups (p = 0.727). Compared to those with normal BMD, participants with osteoporosis at the femoral neck and lumbar spine were shorter, weighed less and had a lower body mass index (BMI) (all p < 0.05). When adjusted for height, the osteoporotic group (femoral neck and lumbar spine) had lower trunk fat (% whole body), FFSTM (kg) and grip strength (kg) compared to those with normal BMD (p < 0.05). Only protein intake (g; 24 h urine analyses) was lower in women with osteoporosis (all sites) compared to those with normal BMD. Fat, carbohydrate and micronutrient intakes (relative to total daily energy intake) and vitamin D concentrations were not associated with BMD (all sites). Daily step count and stepping time (min) were inversely associated with BMI (p < 0.05), but not with BMD (all sites; p > 0.05).
A high prevalence of osteopenia and osteoporosis was evident at the lumbar spine and hip in older black SA women. This study highlights the importance of strength, body composition, and protein intake in maintaining BMD and preventing the development of osteoporosis in older women.
As consumer-directed care programmes become increasingly common in aged care provision, there is a heightened requirement for literature summarising the experience and perspectives of recipients. We conducted rapid evidence reviews on two components of consumer experience of home- and community-based aged care: (a) drivers of choice when looking for a service (Question 1 (Q1)); and (b) perceptions of quality of services (Question 2 (Q2)). We systematically searched the MEDLINE and EMBASE databases, and conducted manual (non-systematic) searches of primary and grey literature (e.g. government reports) across the CINAHL, Scopus, PsycINFO, Web of Science, Trove and OpenGrey databases. Articles deemed eligible after abstract/full-text screening subsequently underwent risk-of-bias assessment to ensure their quality. The final included studies (Q1: N = 21; Q2: N = 19) comprised both quantitative and qualitative articles, which highlighted that consumer choices of services are driven by a combination of: desire for flexibility in service provision; optimising mobility; need for personal assistance, security and safety, interaction, and social/leisure activities; and targeting and addressing previously unmet needs. Similarly, consumer perspectives of quality include control and autonomy, interpersonal interactions, flexibility of choice, and safety and affordability. Our reviews suggest that future model development should take into account consumers’ freedom to choose services in a flexible manner, and the value they place on interpersonal relationships and social interaction.
Maternal inflammation in early pregnancy has been identified epidemiologically as a prenatal pathogenic factor for the offspring's later mental illness. Early newborn manifestations of the effects of maternal inflammation on human fetal brain development are largely unknown.
Maternal infection, depression, obesity, and other factors associated with inflammation were assessed at 16 weeks gestation, along with maternal C-reactive protein (CRP), cytokines, and serum choline. Cerebral inhibition was assessed by inhibitory P50 sensory gating at 1 month of age, and infant behavior was assessed by maternal ratings at 3 months of age.
Maternal CRP diminished the development of cerebral inhibition in newborn males but paradoxically increased inhibition in females. Similar sex-dependent effects were seen in mothers' assessment of their infant's self-regulatory behaviors at 3 months of age. Higher maternal choline levels partly mitigated the effect of CRP in male offspring.
The male fetal-placental unit appears to be more sensitive to maternal inflammation than the female. Effects are particularly marked on cerebral inhibition. Deficits in cerebral inhibition 1 month after birth, similar to those observed in several mental illnesses, including schizophrenia, indicate fetal developmental pathways that may lead to later mental illness; deficits in early infant behavior follow. Early intervention before birth, including prenatal vitamins, folate, and choline supplements, may help prevent the fetal development of pathophysiological deficits that can have life-long consequences for mental health.
In recent years, unmanned aerial vehicle (UAV) technology has expanded to include UAV sprayers capable of applying pesticides. Very little research has been conducted to optimize application parameters and to measure the potential for off-target movement from UAV-based pesticide applications. Field experiments were conducted in Raleigh, NC, during spring 2018 to characterize the effect of different application speeds and nozzle types on target area coverage and uniformity of UAV applications. The highest coverage was achieved with an application speed of 1 m s−1 and ranged from 30% to 60%, whereas applications at 7 m s−1 yielded 13% to 22% coverage. Coverage consistently decreased as application speed increased across all nozzles, with extended-range flat-spray nozzles declining at a faster rate than air-induction nozzles, likely due to higher drift. Experiments measuring the drift potential of UAV-applied pesticides, using extended-range flat-spray, air-induction flat-spray, turbo air-induction flat-spray, and hollow-cone nozzles under perpendicular winds of 0, 2, 4, 7, and 9 m s−1 generated in the immediate 1.75 m above the target, were conducted in the absence of natural wind. Off-target movement was observed under all perpendicular wind conditions with all nozzles tested but was nondetectable beyond 5 m from the target. Coverage from all nozzles exhibited a concave-shaped curve in response to increasing perpendicular wind speed due to turbulence. The maximum target coverage in the drift studies was observed at perpendicular wind speeds of 0 and 8.94 m s−1, but higher turbulence at the two highest perpendicular wind speeds (6.71 and 8.94 m s−1) increased coverage variability, whereas the lowest variability was observed at a wind speed of 2.24 m s−1. Results suggested that air-induction flat-spray and turbo air-induction flat-spray nozzles and an application speed of 3 m s−1 provide adequate coverage of target areas while minimizing off-target movement risk.
Tanzania is commonly cited as “a success story” where a cohesive society has been built in tandem with its nationhood. In this chapter, we offer an account of the interplay between ethnicity and social norms in the context of nation building in Tanzania and highlight the historical transformation of localized, ethnic-based mechanisms for self-protection, “trust networks”, into a national framework for trust enhancement and resolution of conflicts at local levels. This, we argue, was key to Tanzanians’ acceptance of national identity as a means of self-protection and, hence, to the transition from divided pasts to cohesive futures. The chapter traces nation-building efforts in Tanzania and explains why Tanzania is an exception to the patterns of violence and instability experienced in Sub-Saharan Africa. It is argued that, although conflicts are sometimes inevitable, cross-cutting identities such as occupation, and particularly the all-encompassing identity of nationality, can help to decrease the likelihood that conflicts will divide the nation. Diversity may present a challenge to national unity, but it is not insuperable if the political leadership is genuinely committed to deemphasizing ethnic group identities in the public sphere and pursues policies which consider the goal of equality.
Residual herbicides are routinely applied to control troublesome weeds in pumpkin production. Fluridone and acetochlor, Group 12 and Group 15 herbicides, respectively, provide broad-spectrum PRE weed control. Field research was conducted in Virginia and New Jersey to evaluate pumpkin tolerance and weed control with PRE herbicides. Treatments consisted of fomesafen at two rates, ethalfluralin, clomazone, halosulfuron, fluridone, S-metolachlor, acetochlor emulsifiable concentrate (EC), acetochlor microencapsulated (ME), and no herbicide. At one site, fluridone, acetochlor EC, acetochlor ME, and halosulfuron injured pumpkin 81%, 39%, 34%, and 35%, respectively, at 14 d after planting (DAP); crop injury at the second site was 40%, 8%, 19%, and 33%, respectively. Differences in injury between the two sites may have been due to the amount and timing of rainfall after herbicides were applied. Fluridone provided 91% control of ivyleaf morningglory and 100% control of common ragweed at 28 DAP. Acetochlor EC controlled redroot pigweed 100%. Pumpkin treated with S-metolachlor produced the highest yield (10,764 fruits ha–1) despite the herbicide being broadcast over the planted row; the label requires a directed application to row-middles. A separate study specifically evaluated fluridone applied PRE at 42, 84, 126, 168, 252, 336, and 672 g ai ha–1. Fluridone caused pumpkin injury ≥95% when applied at rates ≥168 g ai ha–1; significant yield loss was noted when the herbicide was applied at rates >42 g ai ha–1. We concluded that fluridone and acetochlor formulations are unacceptable candidates for pumpkin production.
Introduction: Although the use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP <100 or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550–2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458–1916 mL); cardiogenic control 768 mL (194–1341 mL) vs. cardiogenic PoCUS 981 mL (341–1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive. Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
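For context, the inclusion criterion above (SBP <100 or shock index > 1) can be expressed as a one-line check. Shock index is conventionally heart rate divided by systolic blood pressure; the abstract does not define it, so that reading, and the Python function below, are assumptions for illustration only.

```python
def meets_shoc_ed_inclusion(sbp_mmhg: float, heart_rate_bpm: float) -> bool:
    """Undifferentiated hypotension criterion: SBP < 100 mmHg or shock index > 1,
    taking shock index = heart rate / systolic BP (conventional definition)."""
    return sbp_mmhg < 100 or (heart_rate_bpm / sbp_mmhg) > 1

print(meets_shoc_ed_inclusion(sbp_mmhg=110, heart_rate_bpm=118))  # True (index ~ 1.07)
print(meets_shoc_ed_inclusion(sbp_mmhg=120, heart_rate_bpm=80))   # False
```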
Introduction: Point of care ultrasound (PoCUS) has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (no PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses, including shock category, were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity of 95.5% (90.0 to 98.1%), LR+ve of 17.9 (7.34 to 43.8), LR-ve of 0.21 (0.08 to 0.58), diagnostic OR of 85.6 (18.2 to 403.6) and accuracy of 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity of 93.8% (87.8 to 97.0%), LR+ve of 14.8 (7.1 to 30.9), LR-ve of 0.09 (0.01 to 0.58), diagnostic OR of 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)). Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
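The diagnostic statistics quoted above all derive from a standard 2 × 2 table. The sketch below shows the usual formulas in Python; the counts used are invented for illustration (chosen so the outputs land near the PoCUS-group figures), not the SHoC-ED data.

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard test-performance metrics from a 2x2 table."""
    sens = tp / (tp + fn)  # sensitivity
    spec = tn / (tn + fp)  # specificity
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+ve": sens / (1 - spec),              # positive likelihood ratio
        "LR-ve": (1 - sens) / spec,              # negative likelihood ratio
        "diagnostic_OR": (tp * tn) / (fp * fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Invented counts, NOT the study data: 12 of 15 cardiogenic cases detected,
# 9 false positives among 200 non-cardiogenic patients.
print(diagnostic_metrics(tp=12, fp=9, fn=3, tn=191))
# -> sensitivity 0.80, specificity ~0.955, LR+ve ~17.8, LR-ve ~0.21, DOR ~84.9
```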
This contribution discusses results obtained from 3-D neutron diffraction and 2-D fabric analyser in situ deformation experiments on laboratory-prepared polycrystalline deuterated ice and ice containing a second phase. The two-phase samples used in the experiments are composed of an ice matrix with (1) air bubbles, (2) rigid, rhombohedral-shaped calcite and (3) rheologically soft, platy graphite. Samples were tested at 10°C below the melting point of deuterated ice at ambient pressure, at two strain rates of 1 × 10−5 s−1 (fast) and 2.5 × 10−6 s−1 (medium). The nature and distribution of the second phase controlled the rheological behaviour of the ice by pinning grain boundary migration. Peak stresses increased with the presence of second-phase particles and during fast strain rate cycles. Ice-only samples exhibit well-developed crystallographic preferred orientations (CPOs) and dynamically recrystallized microstructures, typifying deformation via dislocation creep, where the CPO intensity is influenced in part by the strain rate. CPOs are characterized by a concentration of [c]-axes in cones about the compression axis, coinciding with increasing prismatic-<a> slip activity. Ice with second phases, deformed in a relatively slower strain rate regime, exhibits greater grain boundary migration and stronger CPO intensities than samples deformed at higher strain rates or in strain rate cycles.
People with dementia fall twice as often and sustain more serious fall-related injuries than healthy older adults. While gait impairment as a generic term is understood to be a fall risk factor in this population, a clear elaboration of the specific components of gait associated with fall risk is needed for knowledge translation to clinical practice and the development of fall prevention strategies for people with dementia.
To review gait parameters and characteristics associated with falls in people with dementia.
Electronic databases CINAHL, EMBASE, MedLine, PsycINFO, and PubMed were searched (from inception to April 2017) to identify prospective cohort studies evaluating the association between gait and falls in people with dementia.
Increased double support time variability, use of mobility aids, walking outdoors, higher scores on the Unified Parkinson’s Disease Rating Scale, and lower average walking bouts were associated with elevated risk of any fall. Increased double support time and step length variability were associated with recurrent falls. The reviewed articles do not support using the Performance Oriented Mobility Assessment and the Timed Up-and-Go tests to predict any fall in this population. There is limited research on the use of dual-task gait assessments for predicting falls in people with dementia.
This systematic review identifies the specific spatiotemporal gait parameters and features that are associated with falls in people with dementia. Future research should focus on developing specialized treatment methods for these specific gait impairments in this patient population.
Laser-based compact MeV X-ray sources are useful for a variety of applications such as radiography and active interrogation of nuclear materials. MeV X rays are typically generated by impinging an intense laser onto a ~mm-thick high-Z foil. Here, we have characterized such a MeV X-ray source from a 120 TW (80 J, 650 fs) laser interaction with a 1-mm-thick tantalum foil. Our measurements show an X-ray temperature of 2.5 MeV, a flux of 3 × 10¹² photons/sr/shot, a beam divergence of ~0.1 sr, a conversion efficiency of ~1% (that is, ~1 J of MeV X rays out of 80 J of incident laser energy), and a source size of 80 μm. Our measurements also show that MeV X-ray yield and temperature are largely insensitive to nanosecond laser contrasts up to 10−5. Also, preliminary measurements of a similar MeV X-ray source using a double-foil scheme, where laser-driven hot electrons from a thin foil undergoing relativistic transparency impinge onto a second high-Z converter foil separated by 50–400 μm, show MeV X-ray yield more than an order of magnitude lower compared with the single-foil results.