A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity increased in the intervention phase overall (adjusted delta = .011; 95 percent CI .007, .014), except in the two highest risk groups, which showed a decrease in the number of days with recorded activity.
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
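For illustration, the costing step described above amounts to multiplying each patient's resource counts by standard unit costs. The sketch below shows this in Python; the cost values and resource categories are purely hypothetical placeholders, not the tariffs used in the study.

```python
# A minimal sketch of the unit-costing step described above; unit costs and
# usage counts are hypothetical placeholders, not the study's actual tariffs.
UNIT_COSTS_GBP = {
    "emergency_admission": 1500.0,
    "elective_admission": 1200.0,
    "ed_attendance": 120.0,
    "outpatient_visit": 110.0,
    "gp_contact": 35.0,
}

def patient_cost(usage: dict) -> float:
    """Total cost for one patient: resource count times standard unit cost."""
    return sum(UNIT_COSTS_GBP[item] * n for item, n in usage.items())

# Example: one emergency admission plus four GP contacts.
print(patient_cost({"emergency_admission": 1, "gp_contact": 4}))  # 1640.0
```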
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than control phase (adjusted δ = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent and generally increased with risk level.
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
Asian populations have a higher percentage body fat (%BF) and are at higher risk for CVD and related complications at a given BMI compared with those of European descent. We explored whether %BF was disproportionately elevated in rural Bangladeshi women with low BMI. Height, weight, mid-upper arm circumference, triceps and subscapular skinfolds and bioelectrical impedance analysis (BIA) were measured in 1555 women at 3 months postpartum. %BF was assessed by skinfolds and by BIA. BMI was calculated in adults and BMI Z-scores were calculated for females <20 years old. Receiver operating characteristic (ROC) curve analysis identified the BMI and BMI Z-score cut-offs that optimally classified women as having moderately excessive adipose tissue (defined as >30 % body fat). Linear regressions estimated the association between BMI (or, among adolescents, BMI Z-score) and %BF. Mean BMI was 19·2 (sd 2·2) kg/m², and mean %BF was 23·7 (sd 4·8) % by skinfolds and 23·3 (sd 4·9) % by BIA. ROC analyses indicated that a BMI value of approximately 21 kg/m² optimised sensitivity (83·6 %) and specificity (84·2 %) for classifying adults with >30 % body fat according to BIA. This BMI level is substantially lower than the WHO recommended standard cut-off point of BMI ≥ 25 kg/m². The equivalent cut-off among adolescents was a BMI Z-score of –0·36, with a sensitivity of 81·3 % and specificity of 80·9 %. These findings suggest that Bangladeshi women exhibit excess adipose tissue at substantially lower BMI compared with non-South Asian populations. This is important for the identification and prevention of obesity-related metabolic diseases.
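The ROC cut-off approach described above can be sketched as follows. This is an illustration, not the authors' code: it assumes synthetic data in place of the study measurements, and it uses Youden's J (sensitivity + specificity − 1), one common criterion, as the definition of the "optimal" threshold.

```python
# Illustrative sketch of finding an "optimal" BMI cut-off from a ROC curve.
# Data are synthetic placeholders; Youden's J is assumed as the criterion.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)

# Hypothetical data standing in for the 1555 women in the study.
bmi = rng.normal(19.2, 2.2, 1555)                    # mean/sd from the abstract
high_fat = (bmi + rng.normal(0, 1.5, 1555)) > 21.0   # placeholder >30 %BF label

fpr, tpr, thresholds = roc_curve(high_fat, bmi)
youden_j = tpr - fpr                                 # sensitivity + specificity - 1
best = np.argmax(youden_j)

print(f"optimal BMI cut-off: {thresholds[best]:.1f} kg/m^2")
print(f"sensitivity: {tpr[best]:.1%}, specificity: {1 - fpr[best]:.1%}")
```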
Hauora Pacific (HP), a research group for Māori and Pacific nursing students, explored the adaptation of a behaviour modification programme as a health promotion strategy and smoking cessation resource for Māori and Pacific people. Each of seven HP members, supported by a mentor from their family or church, conducted a focus group with about six participants of their own ethnicity. The focus groups met twice. Data were collected on smoking beliefs and on what might aid smoking cessation. In the second meeting, themes common to the seven focus groups from the first meetings were validated, and a draft behaviour modification workbook for “quitting in groups” was discussed and edited. The initial surprise for HP members was that their participants did not share the dominant discourse on health risks from smoking; nor did they want to be told they should quit. Participants framed smoking as a positive activity. Discussion highlighted the common belief that “quitting in groups” would not be a preferred way to stop smoking, linked to personal shame from an inability to stop smoking and to the potential for a group to be too judgmental or pressuring. Although some work on the adaptation of a behaviour modification resource for “quitting in groups” did occur, participants felt that much more Māori or Pacific input would be required to shift an essentially western approach to behaviour change into something another culture could feel ownership of. Addiction was seen as the issue that had been least well addressed in the past, and participants believed that having more trained and available people would be their preferred health resource.
Excavations at Tinney's Lane, Sherborne in 2002 uncovered extensive evidence for Late Bronze Age settlement and pottery production, dating from a short time period probably within the 12th or 11th century cal bc. Well-preserved deposits of burnt stone, broken vessels, and burnt sherds, together with resulting debris redeposited in associated pits, were accompanied by a series of post-hole structures interpreted as round-houses and four-post settings. Environmental evidence in the form of charcoal, charred plant remains, and molluscs has provided important information concerning sources of fuel and water for pottery production as well as allowing a reconstruction of the local vegetation. Finds of fired clay, metal, stone, shale, flint, and bone include items from distant sources, informing topics such as site status and exchange, and include many categories of tools and equipment that would have been used within the pottery-making processes. Analysis of the spatial distribution of these finds amongst the structures and surviving layers of burning has allowed the definition of a series of industrial activity areas, each comprising one or more round-houses, a four-post structure, bonfire bases or pits used for firing, and other pits with specific related functions. Altogether the site has provided some of the best evidence for pottery production within prehistoric Britain.
Spatial data sets can be analysed by counting the number of objects in equally sized bins. The bin counts are related to the Pólya urn process, where coloured balls (for example, white or black) are removed from the urn at random. If there are insufficient white or black balls for the prescribed number of trials, the Pólya urn process becomes untenable. In this case, we modify the Pólya urn process so that it continues to describe the removal of volume within a spatial distribution of objects. We determine when the standard formula for the variance of the standard Pólya distribution gives a good approximation to the true variance. The variance quantifies an index for assessing whether a spatial point data set is at its most randomly distributed state, called the complete spatial randomness (CSR) state. If the bin size is an order of magnitude larger than the size of the objects, then the standard formula for the CSR limit is indicative of when the CSR state has been attained. For the special case when the object size divides the bin size, the standard formula is in fact exact.
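As a minimal illustration of the bin-count diagnostic (not the authors' modified Pólya urn analysis), the sketch below scatters points uniformly, i.e. the CSR state, counts them in equally sized bins, and compares the index of dispersion (variance/mean) with its CSR expectation of approximately 1.

```python
# A minimal sketch of the bin-count diagnostic for complete spatial
# randomness (CSR): under CSR, bin counts are approximately Poisson, so
# the index of dispersion (variance / mean) is close to 1.
import numpy as np

rng = np.random.default_rng(1)
n_points, n_bins = 10_000, 100

# Uniformly random positions on [0, 1): the CSR benchmark.
x = rng.random(n_points)
counts, _ = np.histogram(x, bins=n_bins, range=(0.0, 1.0))

mean, var = counts.mean(), counts.var(ddof=1)
print(f"mean count per bin: {mean:.1f}")
print(f"index of dispersion: {var / mean:.3f}  (close to 1 under CSR)")
```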
25-Hydroxyvitamin D (25(OH)D) half-life is a potential biomarker for investigating vitamin D metabolism and requirements. We performed a pilot study to assess the approach and practical feasibility of measuring 25(OH)D half-life after an oral dose. A total of twelve healthy Gambian men aged 18–23 years were divided into two groups to investigate the rate and timing of (1) absorption and (2) plasma disappearance after an 80 nmol oral dose of 25(OH)D2. Fasting blood samples were collected at baseline and, in the first group, every 2 h post-dose for 12 h, at 24 h, 48 h and on day 15. In the second group, fasting blood samples were collected on days 3, 4, 5, 6, 9, 12, 15, 18 and 21. Urine was collected for 2 h after the first morning void at baseline and on day 15. 25(OH)D2 plasma concentration was measured by ultra-performance liquid chromatography–tandem mass spectrometry and corrected for baseline. Biomarkers of vitamin D, Ca and P metabolism were measured at baseline and on day 15. The peak plasma concentration of 25(OH)D2 was 9·6 (sd 0·9) nmol/l at 4·4 (sd 1·8) h. The terminal phase of 25(OH)D2 disappearance was identified as commencing on day 6. The terminal half-life of plasma 25(OH)D2 was 13·4 (sd 2·7) d. There were no significant differences in plasma 25(OH)D3, total 1,25(OH)2D, parathyroid hormone, P, Ca and ionised Ca, or in urinary Ca and P, between baseline and day 15 or between the two groups. The present study provides data on the plasma response to oral 25(OH)D2 that will underpin and contribute to the further development of studies to investigate 25(OH)D half-life.
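The terminal half-life calculation described is standard pharmacokinetics: fit ln(concentration) against time over the terminal phase and take t½ = ln 2 / k, where k is the elimination rate constant. The sketch below assumes hypothetical concentrations (the study's data are not reproduced here) sampled on the second group's later study days, with the terminal phase taken from day 6 as in the abstract.

```python
# A sketch of the standard terminal half-life calculation, assuming a
# mono-exponential terminal phase: fit ln(concentration) against time,
# then t_half = ln(2) / k. Concentrations below are hypothetical values,
# not the study data; per the abstract, the terminal phase starts on day 6.
import numpy as np

days = np.array([6, 9, 12, 15, 18, 21], dtype=float)  # sampling days (group 2)
conc = np.array([6.0, 5.1, 4.4, 3.7, 3.2, 2.7])       # hypothetical nmol/l

slope, intercept = np.polyfit(days, np.log(conc), 1)
k = -slope                                            # elimination rate (per day)
t_half = np.log(2) / k

print(f"terminal half-life: {t_half:.1f} days")
```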
Few studies have investigated the absorption of phylloquinone (vitamin K1). We recruited twelve healthy, non-obese adults. On each study day, fasted subjects took a capsule containing 20 μg of 13C-labelled phylloquinone with one of three meals, defined as convenience, cosmopolitan and animal-oriented, in a three-way crossover design. The meals were formulated from the characteristics of clusters identified in dietary pattern analysis of data from the National Diet and Nutrition Survey conducted in 2000–1. Plasma phylloquinone concentration and isotopic enrichment were measured over 8 h. Significantly more phylloquinone tracer was absorbed when consumed with the cosmopolitan and animal-oriented meals than with the convenience meal (P = 0·001 and 0·035, respectively). Estimates of the relative availability of phylloquinone from the meals were: convenience meal = 1·00; cosmopolitan meal = 0·31; animal-oriented meal = 0·23. Combining the tracer data with availability estimates for phylloquinone from the meals provides overall relative bioavailability values of convenience = 1·00, cosmopolitan = 0·46 and animal-oriented = 0·29. Stable isotopes provide a useful tool to investigate further the bioavailability of low doses of phylloquinone. Different meals can affect the absorption of free phylloquinone. The meal-based study design used in the present work provides an approach that reflects more closely the way foods are eaten in a free-living population.
Gyps vulture populations across the Indian subcontinent collapsed in the 1990s and continue to decline. Repeated population surveys showed that the rate of decline was so rapid that elevated mortality of adult birds must be a key demographic mechanism. Post mortem examination showed that the majority of dead vultures had visceral gout, due to kidney damage. The realisation that diclofenac, a non-steroidal anti-inflammatory drug potentially nephrotoxic to birds, had become a widely used veterinary medicine led to the identification of diclofenac poisoning as the cause of the decline. Surveys of diclofenac contamination of domestic ungulate carcasses, combined with vulture population modelling, show that the level of contamination is sufficient for it to be the sole cause of the decline. Testing on vultures of meloxicam, an alternative NSAID for livestock treatment, showed that it did not harm them at concentrations likely to be encountered by wild birds and would be a safe replacement for diclofenac. The manufacture of diclofenac for veterinary use has been banned, but its sale has not. Consequently, it may be some years before diclofenac is removed from the vultures' food supply. In the meantime, captive populations of three vulture species have been established to provide sources of birds for future reintroduction programmes.
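A toy stage-structured model can illustrate the demographic reasoning above, namely that depressed adult survival alone is enough to drive the annual population growth rate below 1 in a long-lived bird. All survival and fecundity values here are hypothetical placeholders, not parameters from the published vulture models.

```python
# A toy juvenile/adult stage model illustrating the demographic argument:
# with hypothetical rates, lowering adult survival pushes the annual
# growth rate (lambda) well below 1. These numbers are placeholders, not
# parameters from the published vulture population models.
import numpy as np

def growth_rate(adult_survival, juvenile_survival=0.8, fecundity=0.4):
    """Dominant eigenvalue of a 2-stage (juvenile, adult) projection matrix."""
    A = np.array([[0.0, fecundity * adult_survival],
                  [juvenile_survival, adult_survival]])
    return np.abs(np.linalg.eigvals(A)).max()

for s_adult in (0.95, 0.80, 0.50):
    print(f"adult survival {s_adult:.2f} -> lambda = {growth_rate(s_adult):.2f}")
```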
Previous studies of vitamin C absorption in man using stable isotope probes have given results which cannot easily be reconciled with those obtained using non-isotope measurement. In order to investigate some of the apparent paradoxes we have conducted a study using two consecutive doses of vitamin C, one labelled and one unlabelled, given 90 min apart. Compatibility of the experimental results with two feasible models was investigated. In Model 1, ingested vitamin C enters a pre-existing pool before absorption, which occurs only when a threshold is exceeded; in Model 2, ingested vitamin C is exchanged with a pre-existing flux before absorption. The key difference between these two models lies in the predicted profile of labelled material in plasma. Model 1 predicts that the second unlabelled dose will produce a secondary release of labelled vitamin C which will not be observed on the basis of Model 2. In all subjects Model 1 failed to predict the observed plasma concentration profiles for labelled and unlabelled vitamin C, but Model 2 fitted the experimental observations. We speculate on possible physiological explanations for this behaviour, but from the limited information available cannot unequivocally confirm the model structure by identifying the source of the supposed flux.
This study tested the hypothesis that carbon monoxide poisoning would produce a deficit of attentional control (the supervisory attention system), as indexed by attention switching and attentional scheduling, and that routine attentional orienting would be unaffected. Seventy-three cases of carbon monoxide poisoning were assessed at 3 days and 1 month post poisoning on tasks of attentional orienting and tasks of the supervisory attention system. The results were compared with those of a group of 53 healthy community participants. A deficit of the supervisory attention system was documented on a task of attention switching in survivors of both deliberate and accidental CO poisoning, while attentional scheduling remained intact. There was no deficit of attentional orienting. Alteration of consciousness predicted subsequent supervisory attention system impairment in correlation analyses, and the deficit persisted over the 1 month follow-up period. (JINS, 2004, 10, 843–850.)
Factors affecting absorption of physiological doses of vitamin C in man have not been widely studied, partly because few suitable tools exist to distinguish recently absorbed vitamin C from endogenous vitamin. Stable isotope-labelled vitamin C provides such a tool. Fifteen healthy non-smoking subjects aged 26–59 years were studied. Each received 30 mg L-[1-13C]ascorbic acid orally on two occasions, 3–4 weeks apart. The ascorbate was given alone or with Fe (100 mg as ferrous fumarate) or with red grape juice, which is rich in polyphenols. Blood was collected at frequent intervals for 1 h, and then each hour for a further 3 h. Total concentration of vitamin C was measured fluorometrically and its 13C-isotope enrichment was measured by GC–MS after conversion to volatile trimethylsilyl esters. Peak plasma enrichment occurred within 25–50 min. No kinetic variables were significantly altered by the ferrous fumarate supplement. Grape juice attenuated vitamin C absorption, reaching significance at the 20 min time point. There were weak correlations between isotope enrichment and body weight or endogenous ascorbate concentration. The increment in total plasma ascorbate was smaller if calculated from isotope enrichment than from vitamin C concentration increase. The dilution pool was much larger than the plasma ascorbate pool. Further studies are needed to resolve these paradoxes. Stable isotope-labelled ascorbate is potentially useful for measurement of vitamin C absorption by human subjects.
We screened a cDNA library generated from harvested and stored sporophores of Agaricus bisporus and identified 19 genes with higher transcript levels in stored sporophores than at the time of harvest. Five of these genes had no detectable mRNA levels prior to detachment from the mycelium. Sequence analysis of ten clones revealed significant similarities to known genes; these code for proteins involved in polymer breakdown and metabolism, cell wall synthesis, stress tolerance, cytochrome P450 activity and DNA binding. The diversity of functions of these genes suggests that the changes in the sporophore after harvest involve several different physiological processes.
To determine the effect of different methods of training on the ability of hospital workers to wear respirators and pass a qualitative fit test, and to compare the direct cost of the training.
179 hospital employees were recruited for the study and were stratified into three groups based on the type of training they received in the use of respirators. Employees in Group A received one-on-one training by the hospital's industrial hygienist and were fit tested as part of this training. Employees in Group B received classroom instruction and demonstration by infection control nurses in the proper use of respirators, but were not fit tested as part of training. Employees in Group C received no formal training. Each participant then underwent a qualitative fit test using irritant smoke to check the employee's ability to correctly adjust the fit and seal of the respirator. The direct cost of each method of training was determined by accounting for the cost of trainers and the cost of employee-hours lost during training.
775-bed Veterans' Affairs hospital.
94% of Group A participants (49 of 52) passed the qualitative fit test, compared to 91% of Group B participants (58 of 64) and 79% of Group C participants (50 of 63; P=.036, 2 × 3 chi-square). Group A had a significantly higher pass rate than Group C (P=.043), but Group B did not differ significantly from Group A or Group C. Location or professional status did not affect pass rate, but prior experience wearing respirators did. When the study groups were compared after stratifying for prior experience, we found no difference in pass rates, except when Groups A and B (those with any training) were combined and compared with Group C (107 of 116 versus 50 of 63, P=.05, Mantel-Haenszel chi-square).
We estimate that the method of training involving individual instruction followed by fit testing took 20 minutes per employee to complete, compared to 10 minutes per six-employee class for the method of classroom demonstration. The difference in direct cost between the two methods, applied to the training of 1,200 employees at our hospital, would be approximately $19,000 per year.
Our study indicates that training in the proper use of respirators is important, but the method of training may not be, as the two methods we evaluated were nearly equivalent in their pass rates on fit testing (94% versus 91%). Fit testing as part of training may have enhanced the performance of our participants marginally, but was more time consuming and accounted for most of the excess cost.
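The 2 × 3 chi-square reported in the results can be reproduced from the pass/fail counts given above. The sketch below is illustrative, not the authors' analysis code; it assumes scipy's standard contingency-table test.

```python
# Reproducing the 2 x 3 chi-square from the pass/fail counts in the
# abstract (Group A: 49/52, Group B: 58/64, Group C: 50/63); a sketch,
# not the authors' original analysis code.
from scipy.stats import chi2_contingency

#            Group A  Group B  Group C
table = [[49, 58, 50],   # passed fit test
         [ 3,  6, 13]]   # failed fit test

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, P = {p:.3f}")   # P comes out near .036
```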
Chinburg and Reid have recently constructed examples of hyperbolic 3-manifolds in which every closed geodesic is simple. These examples are constructed in a highly non-generic way, and it is of interest to understand, in the general case, the geometry and structure of the set of closed geodesics in hyperbolic 3-manifolds. Hyperbolic 3-manifolds which contain immersed totally geodesic surfaces always contain non-simple closed geodesics. Here we construct examples of manifolds with non-simple closed geodesics and no totally geodesic surfaces.
An algorithm is given for determining the presence or absence of tori that are injectively immersed (at the level of the fundamental group) in a branched cover of S^3, branched over the figure eight knot, with all branching indices greater than 2; the algorithm also constructs such tori when they are present. Such tori are important for understanding the topology of 3-manifolds in light of (for example) the Jaco–Shalen–Johannson torus decomposition theorem and the fact that the figure eight knot is universal, i.e., that all 3-manifolds are representable as branched covers of S^3, branched over the figure eight knot.
The algorithm is principally geometric in its derivation and graph-theoretic in its operation. It is applied to two examples, one of which has an incompressible torus and the other of which is atoroidal.