To identify phenotypes of type 1 diabetes based on glucose curves from continuous glucose monitoring (CGM) using functional data (FD) analysis to account for longitudinal glucose patterns. We present a reliable prediction model that accurately predicts glycemic levels from past CGM sensor data and estimates the real-time risk of hypo-/hyperglycemia for individuals with type 1 diabetes.
A longitudinal cohort study of 443 patients with type 1 diabetes with CGM data from a completed trial. The FD analysis approach of sparse functional principal component (FPC) analysis was used to identify phenotypes of glycemic variation in type 1 diabetes. We employed a nonstationary stochastic linear mixed-effects model (LME) that accommodates between-patient and within-patient heterogeneity to predict glycemic levels and the real-time risk of hypo-/hyperglycemia by creating specific target functions for these excursions.
The majority of the variation (73%) in glucose trajectories was explained by the first two FPCs. Higher-order variation in the CGM profiles occurred during weeknights, although overall variation was higher on weekends. The model has low prediction errors and yields accurate predictions for both glucose levels and the real-time risk of glycemic excursions.
By identifying these distinct longitudinal patterns as phenotypes, interventions can be targeted to optimize type 1 diabetes management for subgroups at the highest risk for compromised long-term outcomes such as cardiac disease or stroke. Further, the estimated change/variability in an individual’s glucose trajectory can be used to establish clinically meaningful and patient-specific thresholds that, when coupled with probabilistic predictive inference, provide a useful medical-monitoring tool.
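As a rough illustration of the dimension-reduction step described above, the following sketch approximates functional PCA by applying ordinary PCA to glucose curves resampled onto a common daily time grid. It is not the sparse-FPCA and mixed-effects pipeline used in the study; the data are simulated and all names (e.g. glucose_curves) are hypothetical.

```python
# Minimal sketch: approximate functional PCA of CGM curves by ordinary PCA on
# curves evaluated on a common time grid. Illustration only; not the study's
# sparse-FPCA/LME pipeline. All data below are simulated.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical data: 443 patients x 288 readings per day (5-minute CGM samples)
n_patients, n_times = 443, 288
t = np.linspace(0, 24, n_times)                            # hour of the day
daily_pattern = 7.0 + 1.5 * np.sin(2 * np.pi * t / 24)     # mmol/L baseline curve
glucose_curves = daily_pattern + rng.normal(0.0, 1.0, size=(n_patients, n_times))

# PCA centers each time point across patients and extracts the leading
# functional principal components (eigenfunctions) and per-patient scores.
pca = PCA(n_components=2)
scores = pca.fit_transform(glucose_curves)    # FPC scores, one row per patient
eigenfunctions = pca.components_              # modes of variation on the grid

print("Variance explained by the first two components:",
      round(float(pca.explained_variance_ratio_.sum()), 2))
```

The per-patient scores on the leading components can then be clustered or inspected to define glycemic phenotypes, analogous to the FPC-based phenotyping described above.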
This article studies Chinese central government policies in relation to food market building and food security between 1979 and 2008. It investigates major changes in the state's grain purchase pricing, urban subsidized food sales and the state monopoly over rural-to-urban food circulation that were effected in an attempt to ensure both food availability and accessibility under fiscal constraint. By observing the gradual transition from state monopoly to the market, this article traces the mechanisms which enabled the Chinese government to both establish a monopsony by generating artificial price signals for farmers to stimulate food output, and act as a monopolistic seller by providing subsidized low-priced food to urban consumers in order to fulfil its goal of low-cost industrialization. Thus, China's food security largely hinged on the government's budget to subsidize the price gap. The Chinese government balanced food security against fiscal affordability to formulate a food budget that would neither excessively compromise food security nor cause a crisis in government finance. China's food security puzzle was eventually worked out in the mid-2000s with the boosting of national income, which enhanced the population's access to food and eased the central government's food security concerns.
The impact of healthcare system integration on infection prevention programs is unknown. Using catheter-associated urinary tract infection (CAUTI) prevention as an example, we hypothesize that US Department of Veterans Affairs (VA) nursing homes have a more robust infection prevention infrastructure due to integration and centralization compared with non-VA nursing homes.
VA and non-VA nursing homes participating in the AHRQ Safety Program for Long-Term Care collaborative.
Nursing homes provided baseline information about their infection prevention programs to assess strengths and gaps related to CAUTI prevention via a needs assessment questionnaire.
A total of 353 of 494 nursing homes from 41 states (71%; 47 VA and 306 non-VA facilities) responded. VA nursing homes reported more hours per week devoted to infection prevention-related activities (31 vs 12 hours; P<.001) and were more likely to have committees that reviewed healthcare-associated infections. Compared with non-VA facilities, a higher percentage of VA nursing homes reported tracking CAUTI rates (94% vs 66%; P<.001), sharing CAUTI data with leadership (94% vs 70%; P=.014), and with nursing personnel (85% vs 56%; P=.003). However, fewer VA nursing homes reported having policies for appropriate catheter use (64% vs 81%; P=.004) and catheter insertion (83% vs 94%; P=.004).
Among nursing homes participating in an AHRQ-funded collaborative, VA and non-VA nursing homes differed in their approach to CAUTI prevention. Best practices from both settings should be applied universally to create an optimal infection prevention program within emerging integrated healthcare systems.
The elevated risk of suicide in prison and after release is a well-recognised and serious problem. Despite this, evidence concerning community-based offenders' suicide risk is sparse. We conducted a population-based nested case–control study of all people in a community justice pathway in England and Wales. Our data show that 13% of people who died by suicide in the general population had been in community justice pathways before death. Suicide risks were highest among individuals receiving police cautions and those with recent or impending prosecution for sexual offences. Findings have implications for the training and practice of clinicians identifying and assessing suicidality, and offering support to those at elevated risk.
To examine patterns and predictors of primary mental health care service use following 2 major Australian natural disaster events.
Utilizing data from a national minimum dataset, descriptive and regression analyses were conducted to identify levels and predictors of the use of the Access to Allied Psychological Services (ATAPS) program over a 2-year period following 2 major Australian bushfire and flood/cyclone disasters.
The bushfire disaster resulted in significantly greater and more enduring ATAPS service volume, while service delivery for both disasters peaked in the third quarter. Consumers affected by bushfires (IRR 1.51, 95% CI 1.20–1.89), diagnosed with depression (IRR 2.57, 95% CI 1.60–4.14), anxiety (IRR 2.06, 95% CI 1.21–3.49), or both disorders (IRR 2.15, 95% CI 1.35–3.42) utilized treatment at higher rates.
The substantial demand for primary mental health care services following major natural disasters can vary in magnitude and trajectory with disaster type. Disaster-specific ATAPS services provide a promising model to cater for this demand in primary care settings. Disaster type and need-based variables as drivers of ATAPS use intensity indicate an equitable level of service use in line with the program intention. Established service usage patterns can assist with estimating capacity requirements in similar disaster circumstances. (Disaster Med Public Health Preparedness. 2015;9:275-282)
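For readers unfamiliar with the incidence rate ratios (IRRs) reported above, the sketch below shows how such ratios are commonly obtained by exponentiating the coefficients of a Poisson regression of session counts on disaster exposure and diagnosis. The data frame and predictor names are invented for illustration and are not the ATAPS minimum dataset.

```python
# Illustrative only: IRRs from a Poisson regression of service-session counts.
# The simulated data and column names below are hypothetical, not ATAPS data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "sessions": rng.poisson(4, n),          # treatment sessions per consumer
    "bushfire": rng.integers(0, 2, n),      # 1 = bushfire-affected, 0 = flood/cyclone
    "depression": rng.integers(0, 2, n),    # diagnosis indicators
    "anxiety": rng.integers(0, 2, n),
})

model = smf.glm("sessions ~ bushfire + depression + anxiety",
                data=df, family=sm.families.Poisson()).fit()

# Exponentiated coefficients are the IRRs; exponentiated confidence limits
# give the 95% CIs reported alongside them.
irr = np.exp(model.params).rename("IRR")
irr_ci = np.exp(model.conf_int())
print(pd.concat([irr, irr_ci], axis=1))
```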
Archaeologists from the Center for American Archeology (CAA) in Kampsville, Illinois, are engaged in a program to test the potential for ground-penetrating radar (GPR) and electrical resistance tomography (ERT) to effectively document the internal structure of a variety of Middle (ca. 2200–1550 B.P.) and Late Woodland (ca. 1550–950 B.P.) mounds in the Lower Illinois River Valley (LIV). This project, embedded within ongoing CAA regional research efforts and the Arizona State University Kampsville Field School, demonstrates that both GPR and ERT permit the identification and measurement of significant internal mound structures. Key structural elements can be confidently identified in the geophysical data from the five test mounds, and the geophysical results can be conclusively linked with excavation results in the mounds that have been tested. This study opens the way for the development of a set of procedures for a regional research initiative in the LIV to understand structural variation between Middle and Late Woodland mounds using ground-based remote sensing methods as a primary source of data and thus minimizing invasive and destructive investigation techniques.
There is good evidence of the positive effects of person-centered care (PCC) on agitation in dementia. We hypothesized that a person-centered environment (PCE) would achieve similar outcomes by focusing on positive environmental stimuli, and that there would be enhanced outcomes by combining PCC and PCE.
Thirty-eight Australian residential aged care homes with scope for improvement in both PCC and PCE were stratified, then randomized to one of four intervention groups: (1) PCC; (2) PCE; (3) PCC+PCE; (4) no intervention. People with dementia who were over 60 years of age and had given consent were eligible. Co-outcomes, assessed pre-intervention, four months post-intervention, and at 8-month follow-up, were resident agitation, emotional responses in care, quality of life, depression, and care interaction quality.
From 38 homes randomized, 601 people with dementia were recruited. At follow-up the mean change for quality of life and agitation was significantly different for PCE (p = 0.02, p = 0.05, respectively) and PCC (p = 0.0003, p = 0.002 respectively), compared with the non-intervention group (p = 0.48, p = 0.93 respectively). Quality of life improved non-significantly for PCC+PCE (p = 0.08), but not for agitation (p = 0.37). Improvements in care interaction quality (p = 0.006) and in emotional responses to care (p = 0.01) in PCC+PCE were not observed in the other groups. Depression scores did not change in any of the groups. Intervention compliance for PCC was 59%, for PCE 54% and for PCC+PCE 66%.
The hypothesis that PCC+PCE would improve quality of life and agitation even further was not supported, even though there were improvements in the quality of care interactions and resident emotional responses to care for some of this group. The Australian New Zealand Clinical Trials Registry Number is ACTRN 12608000095369.
The issue of time remains a crucial one in Lower Illinois Valley archaeology, and key problems remain unresolved. In this paper, new radiocarbon assays and published dates are used to test hypotheses concerning intra-site bluff top mound chronologies, the timing and structure of valley settlement, and the emergence of regional symbolic communities during the Middle Woodland period (ca. 50 cal B.C.–cal A.D. 400). We show that, within sites, Middle Woodland mounds were constructed first on prominent, distal bluff ridges and subsequently in less-visible spaces, though additional dates are needed to fully understand intra-site chronology. Our analyses generally support previous studies suggesting a north-to-south settlement trajectory of the valley, though habitation site dates indicate a more complicated pattern of regional occupation that has yet to be fully explicated. Floodplain regional symbolic communities also emerged along a north-to-south pattern, though not as rapidly as bluff crest mounds. Importantly, the results indicate future areas of research necessary to elucidate regional chronology, resettlement of the valley, and community interactions.
Little is known about the effectiveness of advance care planning in the United Kingdom, although policy documents recommend that it should be available to all those with life-limiting illness.
An exploratory patient preference randomized controlled trial of advance care planning discussions with an independent mediator (maximum three sessions) was conducted in London outpatient oncology clinics and a nearby hospice. Seventy-seven patients (mean age 62 years, 39 male) with various forms of recurrent progressive cancer participated, and 68 (88%) completed follow-up at 8 weeks. Patients completed visual analogue scales assessing perceived ability to discuss end-of-life planning with healthcare professionals or family and friends (primary outcome), happiness with the level of communication, and satisfaction with care, as well as a standardized measure of anxiety and depression.
Thirty-eight patients (51%) expressed a preference for the intervention. Discussions with professionals or family and friends about the future increased in the intervention arms, whether randomized or preference, but happiness with communication was unchanged or worse, and satisfaction with services decreased. Trial participation did not cause significant anxiety or depression, and attrition was low.
Significance of results:
A randomized trial of advance care planning is possible. This study provides new evidence on its acceptability and effectiveness for patients with advanced cancer.
Advance care planning (ACP) provides patients with an opportunity to consider, discuss, and plan their future care with health professionals. Numerous policy documents recommend that ACP should be available to all with life-limiting illness.
Forty patients with recurrent progressive cancer completed one or more ACP discussions with a trained planning mediator using a standardized topic guide. Fifty-two interviews were transcribed verbatim and analyzed for qualitative thematic content.
Most patients had not spoken extensively to health professionals or close persons about the future. Their concerns related to experiencing distressing symptoms or worrying how family members would cope. Some patients wished for more accurate information and were unaware of their options for care. Many felt it was doctors' responsibility to initiate such discussions, but perceived that their doctors were reluctant to do so. However, some patients felt that the time was not yet right for these conversations.
Significance of results:
This article reports on the recorded content of ACP discussions. The extent to which patients want to engage in ACP is variable, and support and training are needed for health professionals to initiate such discussions. Our findings do not fully support the current United Kingdom policy of introducing ACP early in life-threatening disease.
Mental illness is common among prisoners, but little evidence exists regarding changes in symptoms in custody over time.
To investigate the prevalence and predictors of psychiatric symptoms among prisoners during early custody.
In a prospective cohort study, 3079 prisoners were screened for mental illness within 3 days of reception. To establish baseline diagnoses and symptoms, 980 prisoners were interviewed; all those remaining in custody were followed up 1 and 2 months later.
Symptom prevalence was highest during the first week of custody. Prevalence showed a linear decline among men and convicted prisoners, but not women or remand prisoners. It decreased among prisoners with depression, but not among prisoners with other mental illnesses.
Overall, imprisonment did not exacerbate psychiatric symptoms, although differences in group responses were observed. Continued discussion regarding non-custodial alternatives for vulnerable groups and increased support for all during early custody are recommended.
Multidisciplinary antimicrobial utilization teams (AUTs) have been proposed as a mechanism for improving antimicrobial use, but data on their efficacy remain limited.
To determine the impact of an AUT on antimicrobial use at a teaching hospital.
Randomized controlled intervention trial.
A 953-bed, public, university-affiliated, urban teaching hospital.
Patients who were given selected antimicrobial agents (piperacillin-tazobactam, levofloxacin, or vancomycin) by internal medicine ward teams.
Over a 10-month study period, 12 internal medicine teams were randomly assigned monthly: 6 teams to an intervention group (academic detailing by the AUT) and 6 teams to a control group given indication-based guidelines for prescribing broad-spectrum antimicrobials (standard of care).
Proportion of appropriate empirical, definitive (therapeutic), and end (overall) antimicrobial usage.
A total of 784 new prescriptions of piperacillin-tazobactam, levofloxacin, and vancomycin were reviewed. The proportion of prescriptions considered appropriate was significantly higher for the intervention teams than for the control teams: 82% versus 73% for empirical (risk ratio [RR], 1.14; 95% confidence interval [CI], 1.04-1.24), 82% versus 43% for definitive (RR, 1.89; 95% CI, 1.53-2.33), and 94% versus 70% for end antimicrobial usage (RR, 1.34; 95% CI, 1.25-1.43). In multivariate analysis, teams that received feedback from the AUT alone (adjusted RR, 1.37; 95% CI, 1.27-1.48) or from both the AUT and the infectious diseases consultation service (adjusted RR, 2.28; 95% CI, 1.64-3.19) were significantly more likely to prescribe end antimicrobial usage appropriately, compared with control teams.
A multidisciplinary AUT that provides feedback to prescribing physicians was effective in improving antimicrobial use.
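As a worked illustration of the risk ratios reported for this trial, the sketch below computes a risk ratio and its 95% Wald confidence interval from 2x2 counts using the standard log-transform formula. The counts in the example are hypothetical, not the trial data.

```python
# Risk ratio with a 95% Wald confidence interval from 2x2 counts.
# Example counts are hypothetical, chosen only to mirror the "82% vs 73%" scale.
import math

def risk_ratio(events_a, total_a, events_b, total_b):
    """Risk ratio of group A vs group B with a 95% Wald CI on the log scale."""
    rr = (events_a / total_a) / (events_b / total_b)
    se_log_rr = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
    upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lower, upper

# e.g. 320 of 390 intervention prescriptions appropriate vs 288 of 394 controls
print(risk_ratio(320, 390, 288, 394))
```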
Pulse crop management can increase pulse yields and N fixation, but the effects of previous pulse crop management on subsequent crop performance are poorly understood. Field studies were conducted at three locations in the Parkland region of Alberta, Canada, between 2004 and 2006. Tannin-free faba bean, narrowleaf lupin, and field pea were planted at 0.5, 1.0, 1.5, and 2.0 times the recommended pulse planting density (PPD), with or without barley as a model weed. Faba bean produced the highest seed yields in higher precipitation environments, whereas pea produced the highest seed yields in lower precipitation environments. Lupin seed yields were consistently low. In the absence of weed interference, faba bean, pea, and lupin N-fixation yields ranged from 70 to 223, 78 to 147, and 46 to 173 kg N ha−1, respectively. On average, faba bean produced the highest N-fixation yield. The absence of weed interference and a high PPD increased pulse seed and N-fixation yields. Quality wheat crops were grown on pulse stubble without additional N fertilizer in some site-years. Management practices that increased N fixation resulted in only marginal subsequent wheat yield increases. Subsequent wheat seed yield was primarily influenced by pulse species. Pea stubble produced 11% higher wheat yields than lupin stubble but only 2% higher wheat yields than faba bean stubble. Consistently high wheat yields on pea stubble may be attributed to synchronized N release from decomposing pea residues with subsequent crop N demand and superior non-N rotational benefits.
Greenhouse experiments were conducted to determine whether multiple applications of glyphosate and time of glyphosate application with regard to the crop's growth stage had a significant effect on the growth and development of glyphosate-resistant canola. Glyphosate was applied as single applications at the two-, four-, or six-leaf stage of canola; as sequential double applications at the two- and four-, two- and six-, and four- and six-leaf stages of canola; and as a triple application at all three stages. Of the plant growth parameters measured, single applications of glyphosate resulted in significant reductions to stem weight and shoot weight compared with nontreated plants, and multiple applications of glyphosate caused significant reductions to leaf area, leaf weight, stem weight, and shoot weight. Single applications of glyphosate were less injurious to glyphosate-resistant canola compared with multiple applications, and canola growth parameter reductions were greatest after earlier glyphosate applications.
We investigated knowledge, attitudes, and behaviors of prescribers concerning piperacillin-tazobactam use at 4 Emory University-affiliated hospitals. Discussions during focus groups indicated that the participants' perceived knowledge of clinical criteria for appropriate piperacillin-tazobactam use was inadequate. Retrospective review of medical records identified inappropriate practices. These findings have influenced ongoing interventions aimed at optimizing piperacillin-tazobactam use.
Faced with reduced numbers choosing to study foreign languages (as in England and Wales), educators need to explore strategies to create and maintain student interest. One such strategy is to create ‘taster’ courses in languages for potential university applicants. The findings presented arise from exploratory research undertaken to inform the design of a selection of web-based taster courses for less widely taught languages. A total of 687 school students, aged 14–18, were asked to identify a web site that they liked and to state their main reason for liking it. They were invited to include recreational sites and told that their answers could help with web design for the taster courses. To explore the reasons further, two focus groups were conducted and student feedback on the developing taster course site was collected. Students nominated search engines, academic sites, and sites dedicated to hobbies, enthusiasms, youth culture, and shopping. They liked them for their visual attributes, usability, interactivity, support for schoolwork, and cultural and heritage associations, as well as their content and functionality. They emerged as sensitive readers of web content, visually aware and with clear views on how text should be presented. These findings informed the design of the taster course site. They are broadly in line with existing design guidelines but add to our knowledge about school students’ use of the web and about designing web-based learning materials. They may also be relevant to web design at other levels, for example for undergraduates.
We evaluated gross motor function following botulinum toxin A (BTX-A) injections in the lower limbs of children with spastic cerebral palsy in a randomized clinical trial, using a cross-over design. Forty-nine children (24 males, 25 females, age range 22 to 80 months) were randomly allocated to two groups: group 1 received BTX-A and physiotherapy, and group 2 received physiotherapy alone for 6 months. At the end of this period, group 2 received BTX-A and physiotherapy and group 1 continued with physiotherapy alone. Assessment measures were the Gross Motor Function Measure (GMFM), the Vulpe Assessment Battery (VAB), joint range of movement, the Modified Ashworth Scale, and a parental questionnaire. Sustained gains in gross motor function were found in both groups of children, but the only additional benefit found in group 1 was a significant increase in fine motor rating on the VAB. By contrast, parents rated the benefit of treatment highly. It is likely that assessment at 3 and 6 months post-injection was too late to demonstrate peak gross motor function response and that changes in GMFM are not sustained over 6 months with a single dose. Further studies should investigate changes over shorter time periods and consider covariables such as BTX-A dosage, number of injection sites, and the role of repeated injections combined with other interventions such as casting.