Hyperprolific sows rear more piglets than they have teats, and to accommodate this, milk replacers are often offered as a supplement. Milk replacers are based on bovine milk, yet components of vegetable origin are often added. This may reduce growth, but could also accelerate maturational changes. Therefore, we investigated the effect of feeding piglets a milk replacer with gradually increasing levels of wheat flour on growth, gut enzyme activity and immune function, compared with a diet based entirely on bovine milk. The hypothesis tested was that adding a starch component (wheat flour) induces maturation of the mucosa, as measured by higher digestive activity and improved integrity and immunity of the small intestine (SI). To test this hypothesis, piglets were removed from the sow at day 3 and fed either a pure milk replacer diet (MILK) or, from day 11, a milk replacer diet with increasing levels of wheat (WHEAT). The WHEAT piglets had increased maltase and sucrase activity in the proximal part of the SI compared with the MILK group. There were no differences in gut morphology, histopathology or gene expression between the groups. In conclusion, the pigs given a milk replacer with added wheat displayed immunological and gut mucosal enzyme maturational changes, indicative of adaptation towards a vegetable-based diet. This was not associated with any clinical complications, and future studies are needed to show whether this could improve responses in the subsequent weaning process.
Public health strategies have focused largely on physical health. However, there is increasing recognition that raising mental health awareness and tackling stigma is crucial to reduce disease burden. National campaigns have had some success but tackling issues locally is particularly important.
To assess the public's awareness and perception of the monthly BBC Cornwall mental health phone-in programmes that have run for 8.5 years in Cornwall, UK (population 530 000).
A consultation, review and feedback process involving a multiagency forum of mental and public health professionals, people with lived experience and the local National Health Service trust's media team was used to develop a brief questionnaire. This was offered to all attendees at two local pharmacies covering populations of 27 000 over a 2-week period.
In total, 14% (95% CI 11.9–16.5) were aware of the radio show, 11% (95% CI 9.0–13.1) had listened, and the majority (76%) of those who listened did so more than once. The estimated reach is 70 000 people in the local population, of whom approximately 60 000 listen regularly. The show is highly valued among respondents, with modal and median scores of 4 out of 5.
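The reach figures can be reproduced by scaling the survey proportions to the county population of 530 000; a minimal sketch of that arithmetic (the rounding down to 70 000 and up to 60 000 is the authors' own):

```python
# Scale the survey proportions to the Cornwall population (530 000)
population = 530_000

aware = 0.14 * population     # respondents aware of the show
listened = 0.11 * population  # respondents who had listened

print(round(aware))     # 74200, reported as an estimated reach of ~70 000
print(round(listened))  # 58300, reported as ~60 000 regular listeners
```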
Local radio is a successful, cost-effective and impactful way to reach a significant proportion of the population, and is likely to raise awareness, reduce stigma and be well received. The format has been adopted in other regions, demonstrating its easy transferability. It could form an essential part of a public health strategy to improve a population's mental well-being.
Declaration of interest
W.H. received support from the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) for the South West Peninsula UK. The views expressed in this publication are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health. L.R. and D.S. were involved in delivering the programmes but had no role in their evaluation.
Resource use measurement is known to be a challenging and time-consuming, but essential, step in economic evaluations of health care interventions. Measuring the true quantities of resources utilized is of major importance for generating valid costing estimates. In the absence of a gold standard and of acknowledged guidelines, the choice of a measurement method is often based on practicality rather than methodological evidence. An overview of resource use measurement issues is currently lacking. Such an overview could improve transparency about the quality of resource use measurement methods in economic evaluations and may facilitate the choice of evidence-based measurement methods in the future. This study aims to provide an overview of methodological evidence regarding resource use measurement issues in economic evaluations.
Literature was searched using three different methods. First, a search strategy was run in six different databases. Second, the Database of Instruments for Resource Use Measurement (DIRUM) was hand-searched. Third, experts in health economics from six different European Union countries were asked to provide relevant studies. Data were analyzed according to the Resource Use Measurement Issues (RUMI) framework, which was developed for this study.
Of the 3,478 articles retrieved in the initial search, 77 were fully analyzed. An overview of the evidence is provided for every resource use measurement issue. Most research focused on the issue of ‘how to measure’, in particular the effect of self-reported versus administrative data. In contrast, little to no research has been done on the issues of ‘what to measure’ and ‘for which purpose to measure’.
Results of this study provide insight into the effect of a chosen measurement method. The results stress the importance of measuring the true quantities of resources utilized for generating valid costing estimates. Furthermore, this article highlights the lack of evidence on appropriate resource use measurement methods.
Energy intake (EI) and energy expenditure (EE) should not be considered independent entities, but rather an inter-connected system. With increased physical activity and reduced snacking prevalent as public health measures, any changes to subsequent EI arising from these recommendations should be monitored. The aim of this study was to investigate changes in acute EI and appetite over four conditions: (1) a control condition with no snack and no exercise (CON); (2) a snack condition (+1 MJ; SK); (3) a moderate-intensity cycling exercise condition (−1 MJ; EX); and (4) a combined snack and exercise condition (+1 MJ, −1 MJ; EXSK). Acute changes in appetite (visual analogue scale) and lunchtime EI (ad libitum pizza meal) were recorded in twenty boys and eighteen girls (12–13 years). Lunch EI was not significantly different between conditions or sexes (P>0·05). Relative EI was calculated by adding the energy manipulation (+1 MJ from the snack or −1 MJ from the exercise) to lunchtime EI. Relative EI showed no significant differences between the sexes (P>0·05); however, in the EX condition, relative EI was significantly lower (P<0·001) than in all other conditions. Appetite increased significantly over time (P<0·001) and was significantly higher in the CON and EX conditions than in the SK and EXSK conditions. No significant sex differences were found between conditions. When aiming to evoke an acute energy deficit, increasing EE created a significantly larger relative energy deficit than removing the mid-morning snack, and sex did not influence EI or appetite in any condition.
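The relative EI calculation described above simply adds each condition's energy manipulation to the measured lunch EI. A minimal sketch, using a purely hypothetical lunch intake of 3 MJ (not a figure from the study), shows why EX yields the lowest relative EI when lunch intake itself does not differ between conditions:

```python
# Energy manipulation per condition, in MJ (as defined in the study design)
conditions = {
    "CON":  0.0,         # no snack, no exercise
    "SK":   +1.0,        # 1 MJ mid-morning snack added
    "EX":   -1.0,        # 1 MJ expended in cycling exercise
    "EXSK": +1.0 - 1.0,  # snack and exercise cancel out
}

lunch_ei_mj = 3.0  # hypothetical ad libitum lunch EI, illustrative only

# Relative EI = lunch EI + energy manipulation
relative_ei = {cond: lunch_ei_mj + delta for cond, delta in conditions.items()}
print(relative_ei)  # EX is lowest: the exercise deficit is not compensated at lunch
```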
Many mental health service users delay or avoid disclosing their condition to employers because of experience, or anticipation, of discrimination. However, non-disclosure precludes the ability to request ‘reasonable adjustments’. There have been no intervention studies to support decision-making about disclosure to an employer.
To determine whether a decision aid has an effect that is sustained beyond its immediate impact; to determine whether a large-scale trial is feasible; and to optimise the designs of a larger trial and of the decision aid.
In this exploratory randomised controlled trial (RCT) in London, participants were randomly assigned to use of a decision aid plus usual care or usual care alone. Follow-up was at 3 months. Primary outcomes were: (a) stage of decision-making; (b) decisional conflict; and (c) employment-related outcomes (trial registration number: NCT01379014).
We recruited 80 participants and interventions were completed for 36 out of 40 in the intervention group; in total 71 participants were followed up. Intention-to-treat analysis showed that reduction in decisional conflict was significantly greater in the intervention group than among controls (mean improvement −22.7 (s.d. = 15.2) v. −11.2 (s.d. = 18.1), P = 0.005). More of the intervention group than controls were in full-time employment at follow-up (P = 0.03).
The observed reduction in decisional conflict regarding disclosure has a number of potential benefits which next need to be tested in a definitive trial.
A series of editorials in this Journal have argued that psychiatry is in the midst of a crisis. The various solutions proposed would all involve a strengthening of psychiatry's identity as essentially ‘applied neuroscience’. Although not discounting the importance of the brain sciences and psychopharmacology, we argue that psychiatry needs to move beyond the dominance of the current, technological paradigm. This would be more in keeping with the evidence about how positive outcomes are achieved and could also serve to foster more meaningful collaboration with the growing service user movement.
Submicroscopic, rare chromosomal copy number variants (CNVs) contribute to neurodevelopmental disorders but it is not known whether they define atypical clinical cases.
To identify whether large, rare CNVs in attention-deficit hyperactivity disorder (ADHD) are confined to a distinct clinical subgroup.
A total of 567 children with ADHD aged 5–17 years were recruited from community clinics. Psychopathology was assessed using the Child and Adolescent Psychiatric Assessment. Large, rare CNVs (>500 kb, <1% frequency) were defined from single nucleotide polymorphism data.
Copy number variant carriers (13.6%) showed no differences from non-carriers in ADHD symptom severity, symptom type, comorbidity, developmental features, family history or pre-/perinatal markers. The only significant difference was a higher rate of intellectual disability (24% v. 9%, χ² = 15.5, P = 0.001). Most CNV carriers did not have intellectual disability.
Large, rare CNVs are not restricted to an atypical form of ADHD but may be more highly enriched in children with cognitive problems.
Recent reports estimate the prevalence of autism-spectrum conditions in the UK to be 1%.
To use different methods to estimate the prevalence of autism-spectrum conditions, including previously undiagnosed cases.
We carried out a survey of autism-spectrum conditions using the Special Educational Needs (SEN) register. A diagnosis survey was distributed to participating schools to be handed out to parents of all children aged 5–9 years. The mainstream primary school population was screened for unknown cases.
The prevalence estimates generated from the SEN register and diagnosis survey were 94 per 10 000 and 99 per 10 000 respectively. A total of 11 children received a research diagnosis of an autism-spectrum condition following screening and assessment. The ratio of known:unknown cases is about 3:2 (following statistical weighting procedures). Taken together, we estimate the prevalence to be 157 per 10 000, including previously undiagnosed cases.
This study has implications for planning diagnostic, social and health services.
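The combined estimate of 157 per 10 000 follows from applying the weighted known:unknown ratio of about 3:2 to the register-based figure; a minimal check of that arithmetic, taking the SEN-register estimate of 94 per 10 000 as the known cases:

```python
from fractions import Fraction

known_per_10k = 94                 # SEN-register estimate of diagnosed cases
known_to_unknown = Fraction(3, 2)  # weighted ratio of known to unknown cases

# total = known * (1 + unknown/known) = known * 5/3
total_per_10k = known_per_10k * (1 + 1 / known_to_unknown)
print(round(float(total_per_10k), 1))  # 156.7, reported as 157 per 10 000
```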
The narrow-striped mongoose Mungotictis decemlineata is a small, endemic carnivore currently known to occur only in the dry deciduous forests of the central and southern Menabe regions of western Madagascar. It is categorized as Endangered on the IUCN Red List and is threatened by rapid habitat loss from deforestation. From live-trapping and village surveys we found M. decemlineata to be distributed throughout the largest area of connected forest in central Menabe and most of the larger forest fragments in southern Menabe. We estimated there are a minimum of 2,000–3,400 adults in central Menabe and 6,400–8,650 adults in southern Menabe. Although this represents the total known population, the southern limits of the species' range are still unclear. Fifty-four individuals were live-trapped in central Menabe. M. decemlineata abundance was not correlated with forest structure or invertebrate abundance and diversity at the sampled sites. The building of access roads for logging may have a long-lasting effect by increasing the level of human disturbance, predation by domestic dogs, and illegal cutting within the surrounding area. Conservation management efforts to save M. decemlineata need immediate implementation, with emphasis on cooperative efforts with local villages to reduce the rate of slash-and-burn agriculture and logging of the remaining dry deciduous forest of the region. Research to determine population trends and status of M. decemlineata south of the Morondava and Mangoky rivers is required.
Previous research with an on-line processing task found that individuals without social anxiety generate benign inferences when ambiguous social information is encountered, but people with high social anxiety or social phobia do not (Hirsch and Mathews, 1997, 2000). In the present study, we tested whether it is possible to induce a benign (or less negative) inferential bias in people who report anxiety about interviews by requiring them to take the perspective of an interview-confident person, rather than their own. Highly interview-anxious volunteers were allocated to read descriptions of job interviews, taking either their own perspective in the described situation or that of a confident interviewee. At certain points during the text, a target letter string appeared and participants were asked to indicate whether it formed a word or a non-word (lexical decision). Some of the lexical decisions occurred in the context of ambiguous text that could be interpreted in either a threatening or a benign manner. In a baseline condition, decisions were made following text for which there was only one possible inference (either threat or benign). The results indicated that, compared with the self-referent condition, participants who adopted the perspective of a confident other person showed enhanced inhibition of threat inferences.
Although the clinical benefits of dietary supplementation with n-3 polyunsaturated fatty acids (PUFA) have been recognised for a number of years, the molecular mechanisms by which particular PUFA affect the metabolism of cells within the synovial joint tissues are not understood. This study set out to investigate how n-3 PUFA and other classes of fatty acids affect both degradative and inflammatory aspects of the metabolism of articular cartilage chondrocytes, using an in vitro model of cartilage degradation. Using well-established culture models, cartilage explants from normal bovine and human osteoarthritic cartilage were supplemented with either n-3 or n-6 PUFA, and cultures were subsequently treated with interleukin 1 to initiate catabolic processes that mimic cartilage degradation in arthritis. The results show that supplementation specifically with n-3 PUFA, but not n-6 PUFA, decreases both degradative and inflammatory aspects of chondrocyte metabolism, while having no effect on normal tissue homeostasis. Collectively, our data provide evidence supporting dietary supplementation with n-3 PUFA, which may have a beneficial effect in slowing degradation and reducing inflammation in the pathogenesis of degenerative joint diseases in man.
It is important to have a simple, accurate method for recording eye movements. Of the two approaches commonly adopted, electro-oculography (EOG) and infrared oculography (IROG), IROG is often accepted as the more accurate, and it is currently the method used most frequently to examine eye movements in schizophrenia. This study investigated whether the misclassification of blinks as saccades affects saccade rates when the presence of a blink is determined using only IROG recordings of eye position. Both vertical electro-oculography (VEOG), which can be used to objectively identify blinks, and IROG were recorded while 17 schizophrenia patients and 19 healthy controls were presented with sinusoidal stimuli. Of the blinks identified with the VEOG for the total group of participants, a substantial number (37%) were misclassified as catch-up and anticipatory saccades when only the IROG was used. Furthermore, in the schizophrenia group, but not in the healthy control group, the use of the IROG led to a significant misclassification of blinks as anticipatory saccades. Therefore, when IROG alone is used to identify blinks, the misclassification of blinks as saccades is likely to introduce measurement error into estimates of saccade rates, particularly estimates of anticipatory saccade rates in schizophrenia patients.
Research studies have found that smooth pursuit eye movement dysfunction may serve as an index of genetic liability to develop schizophrenia. The heritability of various measures of smooth pursuit eye tracking proficiency and the saccades that occur during smooth pursuit was examined in 64 monozygotic (MZ) and 48 dizygotic (DZ) twin pairs. Two age cohorts were assessed (11–12 and 17–18 years of age). Intraclass correlations indicated significant similarity in the MZ twins for almost all measures in both age cohorts, whereas few of the DZ twin correlations attained significance. Biometrical modeling indicated that genetic mechanisms influence performance on both global and specific eye tracking measures, accounting for about 40% to 60% of the variance. These findings suggest that the underlying brain systems responsible for smooth pursuit and saccade generation during pursuit are under partial genetic control.
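The biometrical modeling reported here partitions variance formally, but its logic can be illustrated with Falconer's classic approximation, which doubles the difference between MZ and DZ intraclass correlations. The correlation values below are illustrative only, chosen so the estimate lands in the reported 40% to 60% band; they are not data from the study:

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Falconer's approximation of heritability: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

# Illustrative intraclass correlations for a hypothetical eye tracking measure
print(round(falconer_h2(0.70, 0.45), 2))  # 0.5 -> ~50% of variance genetic
```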