Epidemiological studies have reported that the increased risk of developing psychosis in cannabis users is dose related. In addition, experimental research has shown that the active constituent of cannabis responsible for its psychotogenic effect is delta-9-tetrahydrocannabinol (THC) (Murray et al, 2007). Recent evidence has suggested an increase in the potency (% THC) of the cannabis seized in the UK (Potter et al, 2007).
We predicted that first-episode psychosis patients would be more likely than controls to use higher-potency cannabis, and to use it more frequently.
We collected information on socio-demographic and clinical characteristics and on cannabis use (age at first use, frequency, length of use, type of cannabis used) from a sample of 191 first-episode psychosis patients and 120 matched healthy volunteers. All were recruited as part of the Genetics and Psychosis (GAP) study, which recruited all patients who presented to the South London and Maudsley NHS Trust.
There was no significant difference in the lifetime prevalence of cannabis use or age at first use between cases and controls. However, cases were more likely to be regular users (p = 0.05), to be current users (p = 0.04) and to have smoked cannabis for longer (p = 0.01). Among cannabis users, 86.8% of first-episode psychosis patients preferentially used skunk/sinsemilla, compared with 27.7% of controls. Only 13.2% of first-episode psychosis patients chose to use resin/hash, compared with 76.3% of controls. The concentration of THC in the skunk/sinsemilla available in South East London ranges between 8.5 and 14% (Potter et al, 2007). Controls (47%) were more likely to use hash (resin), whose average THC concentration is 3.4% (Potter et al, 2007).
Patients with first-episode psychosis had smoked higher-potency cannabis, for longer and with greater frequency, than healthy controls.
Childhood maltreatment is one of the strongest predictors of adulthood depression, and alteration of circulating levels of inflammatory markers is one putative mechanism mediating risk or resilience.
To determine the effects of childhood maltreatment on circulating levels of 41 inflammatory markers in healthy individuals and those with a major depressive disorder (MDD) diagnosis.
We investigated the association of childhood maltreatment with levels of 41 inflammatory markers in two groups, 164 patients with MDD and 301 controls, using multiplex electrochemiluminescence methods applied to blood serum.
Childhood maltreatment was not associated with altered inflammatory markers in either group after multiple testing correction. Body mass index (BMI) exerted strong effects on interleukin-6 and C-reactive protein levels in those with MDD.
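The abstract does not name the multiple-testing procedure used. Purely as an illustration of correcting across 41 marker-wise tests, a Benjamini-Hochberg false-discovery-rate correction might look like the following sketch; the data, the per-marker test, and the correction method are all placeholder assumptions rather than the study's actual analysis.

```python
# Sketch: per-marker association tests with FDR correction across 41 markers.
# Hypothetical data; the study's actual models and correction method may differ.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n, n_markers = 301, 41
maltreated = rng.integers(0, 2, size=n)      # 0 = no maltreatment, 1 = maltreatment
markers = rng.normal(size=(n, n_markers))    # placeholder serum marker levels

# One two-sample test per marker (a real analysis would adjust for covariates such as BMI)
pvals = np.array([
    stats.ttest_ind(markers[maltreated == 1, j], markers[maltreated == 0, j]).pvalue
    for j in range(n_markers)
])

# Benjamini-Hochberg correction across the 41 tests
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} of {n_markers} markers significant after FDR correction")
```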
Childhood maltreatment did not exert effects on inflammatory marker levels in either the participants with MDD or the control group in our study. Our results instead highlight the more pertinent influence of BMI.
Declaration of interest
D.A.C. and H.W. work for Eli Lilly Inc. R.N. has received speaker fees from Sunovion, Janssen and Lundbeck. G.B. has received consultancy fees and funding from Eli Lilly. R.H.M.-W. has received consultancy fees or has a financial relationship with AstraZeneca, Bristol-Myers Squibb, Cyberonics, Eli Lilly, Ferrer, Janssen-Cilag, Lundbeck, MyTomorrows, Otsuka, Pfizer, Pulse, Roche, Servier, SPIMACO and Sunovion. I.M.A. has received consultancy fees or has a financial relationship with Alkermes, Lundbeck, Lundbeck/Otsuka, and Servier. S.W. has sat on an advisory board for Sunovion and Allergan, and has received speaker fees from AstraZeneca. A.H.Y. has received honoraria for speaking from AstraZeneca, Lundbeck, Eli Lilly and Sunovion; honoraria for consulting from Allergan, Livanova, Lundbeck, Sunovion and Janssen; and research grant support from Janssen. A.J.C. has received honoraria for speaking from AstraZeneca, honoraria for consulting from Allergan, Livanova and Lundbeck, and research grant support from Lundbeck.
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to depend on the formation time of quasi-static magnetic field structures throughout the target volume and on the extent of the rear-surface proton expansion over the same period, as observed in both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear-surface proton expansion and the magnetic field formation time can be varied, modifying the degree of filamentary structure present within the laser-driven proton beam.
Introduction: Emergency Department Overcrowding (EDOC) is a multifactorial issue that leads to Access Block for patients needing emergency care. Identified as a national problem, patients presenting to a Canadian Emergency Department (ED) at a time of overcrowding have higher rates of admission to hospital and increased seven-day mortality. Using the well-accepted input-throughput-output model to study EDOC, current research has focused on throughput as a measure of patient flow, reported as ED length of stay (LOS). In fact, ED LOS and ED beds occupied by inpatients are two “extremely important” indicators of EDOC identified by a 2005 survey of Canadian ED directors. One proposed solution to improve ED throughput is to utilize a physician at triage (PAT) to rapidly assess newly arriving patients. In 2017, a pilot PAT program was trialed at Kelowna General Hospital (KGH), a tertiary care hospital, as part of a PDSA cycle. The aim was to mitigate EDOC by improving ED throughput by the end of 2018, to meet the national targets for ED LOS suggested in the 2013 CAEP position statement. Methods: During fiscal periods 1-6 (April 1 to September 7, 2017), a PAT shift occurred daily from 1000-2200 over four long weekends. ED LOS, time to inpatient bed, time to physician initial assessment (PIA), number of British Columbia Ambulance Service (BCAS) offload delays, and number of patients who left without being seen (LWBS) were extracted from an administrative database. Results were retrospectively analyzed and compared with data from 1000-2200 on non-PAT days during the trial periods. Results: Median ED LOS decreased from 3.8 to 3.4 hours for high-acuity patients (CTAS 1-3), from 2.1 to 1.8 hours for low-acuity patients (CTAS 4-5), and from 9.3 to 8.0 hours for all admitted patients. During PAT trial weekends, there was a decrease in the average time to PIA of 65% (from 73 to 26 minutes for CTAS 2-5), in the average number of daily BCAS offload delays of 39% (from 2.3 to 1.4 delays per day), and in the proportion of patients who LWBS from 2.4% to 1.7%. Conclusion: The implementation of PAT was associated with improvements in all five measures of ED throughput, providing a potential solution for EDOC at KGH. ED LOS was reduced compared with non-PAT control days, successfully meeting the suggested national targets. PAT could improve efficiency, resulting in the ability to see more patients in the ED, and increase the quality and safety of ED practice. Next, we hope to prospectively evaluate PAT, continue to analyze these process measures, perform a cost-benefit analysis, and formally assess ED staff and patient perceptions of the program.
Analysing temporal patterns in foodborne illness is important for designing and implementing effective food safety measures. The reported incidence of illness due to Salmonella in the US Foodborne Diseases Active Surveillance Network (FoodNet) sites has exhibited no declining trend since 1996; however, there have been significant annual trends among principal Salmonella serotypes, which may exhibit complex seasonal patterns. Data from the original FoodNet sites and penalised cubic B-spline regression are used to estimate temporal patterns in the reported incidence of illness for the top three Salmonella serotypes during 1996–2014. Our results include 95% confidence bands around the estimated annual and monthly curves for each serotype. The results show that Salmonella serotype Typhimurium exhibits a statistically significant declining annual trend and seasonality (P < 0.001) marked by peaks in late summer and early winter. Serotype Enteritidis exhibits a significant annual trend with a higher incidence in later years and seasonality (P < 0.001) marked by a peak in late summer. Serotype Newport exhibits no significant annual trend with significant seasonality (P < 0.001) marked by a peak in late summer.
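As a rough illustration of penalised cubic B-spline regression (here in the Eilers-Marx P-spline form, with a second-difference penalty), the sketch below fits a smooth curve to a synthetic monthly incidence series. The knot placement, penalty weight, and data are illustrative assumptions; the paper's exact penalty and the construction of its confidence bands are not reproduced.

```python
# Sketch: penalised cubic B-spline (P-spline) fit to a monthly incidence series.
# Synthetic data; the paper's penalty choice and confidence bands are not reproduced.
import numpy as np
from scipy.interpolate import BSpline

def bspline_design(x, n_basis, degree=3):
    """Design matrix of clamped cubic B-spline basis functions on [min(x), max(x)]."""
    interior = np.linspace(x.min(), x.max(), n_basis - degree + 1)[1:-1]
    knots = np.r_[[x.min()] * (degree + 1), interior, [x.max()] * (degree + 1)]
    B = np.empty((len(x), n_basis))
    for j in range(n_basis):
        coefs = np.zeros(n_basis)
        coefs[j] = 1.0
        B[:, j] = BSpline(knots, coefs, degree)(x)
    return B

months = np.arange(1, 229, dtype=float)          # 19 years x 12 months (1996-2014)
y = 5 + 3 * np.sin(2 * np.pi * months / 12) \
      + np.random.default_rng(1).normal(0, 1, len(months))

B = bspline_design(months, n_basis=40)
D = np.diff(np.eye(B.shape[1]), n=2, axis=0)     # second-difference penalty matrix
lam = 10.0                                       # smoothing parameter (would be tuned in practice)
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
fitted = B @ coef                                # smooth annual/seasonal incidence curve
```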
A range of endophenotypes characterise psychosis; however, there has been limited work on whether and how they are interrelated.
This multi-centre study includes 8754 participants: 2212 people with a psychotic disorder, 1487 unaffected relatives of probands, and 5055 healthy controls. We investigated cognition [digit span (N = 3127), block design (N = 5491), and the Rey Auditory Verbal Learning Test (N = 3543)], electrophysiology [P300 amplitude and latency (N = 1102)], and neuroanatomy [lateral ventricular volume (N = 1721)]. We used linear regression to assess the interrelationships between endophenotypes.
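As a minimal sketch of this pairwise regression approach, the code below regresses one standardised endophenotype on another and reports the coefficient with its 95% confidence interval, in the format used in the results. The variable names and data are placeholders, and the study's covariate adjustments (e.g. for age, sex or site) are not specified here.

```python
# Sketch: pairwise association between two standardised endophenotypes via OLS.
# Placeholder data; the study's covariate adjustments are not reproduced.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1102
p300_latency = rng.normal(size=n)
p300_amplitude = 0.2 * p300_latency + rng.normal(size=n)

def standardise(v):
    return (v - v.mean()) / v.std()

X = sm.add_constant(standardise(p300_latency))
fit = sm.OLS(standardise(p300_amplitude), X).fit()
coef, (lo, hi) = fit.params[1], fit.conf_int()[1]   # regression coefficient and 95% CI
print(f"coef {coef:.2f}, 95% CI {lo:.2f} to {hi:.2f}, p = {fit.pvalues[1]:.3f}")
```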
The P300 amplitude and latency were not associated (regression coef. −0.06, 95% CI −0.12 to 0.01, p = 0.060), and P300 amplitude was positively associated with block design (coef. 0.19, 95% CI 0.10–0.28, p < 0.001). There was no evidence of associations between lateral ventricular volume and the other measures (all p > 0.38). All the cognitive endophenotypes were associated with each other in the expected directions (all p < 0.001). Lastly, the relationships between pairs of endophenotypes were consistent in all three participant groups, differing for some of the cognitive pairings only in the strengths of the relationships.
The P300 amplitude and latency are independent endophenotypes; the former indexes spatial visualisation and working memory, and the latter is hypothesised to index basic processing speed. Individuals with psychotic illnesses, their unaffected relatives, and healthy controls all show similar patterns of associations between endophenotypes, endorsing the theory of a continuum of psychosis liability across the population.
Academic health systems and their investigators are challenged to systematically assure clinical research regulatory compliance. This challenge is heightened in the emerging era of centralized single Institutional Review Boards for multicenter studies, which rely on monitoring programs at each participating site.
To describe the development, implementation, and outcome measurement of an institution-wide paired training curriculum and internal monitoring program for clinical research regulatory compliance.
Standard operating procedures (SOPs) were developed to facilitate investigator and research professional adherence to institutional policies, federal guidelines, and international standards. An SOP training curriculum was developed and implemented institution-wide. An internal monitoring program was launched, utilizing risk-based monitoring plans of pre-specified frequency and intensity, assessed upon Institutional Review Board approval of each prospective study. Monitoring plans were executed according to an additional SOP on internal monitoring, with monitoring findings captured in a REDCap database.
We observed few major violations across 3 key domains of clinical research conduct and demonstrated a meaningful decrease in the rates of nonmajor violations in each, over the course of 2 years.
The paired training curriculum and monitoring program is a successful institution-wide clinical research regulatory compliance model that will continue to be refined.
This paper reports three cases of severe post-stapedectomy granuloma, emphasising the variable presentation of this devastating complication and the challenges of its management.
A retrospective review was conducted of three cases of post-stapedectomy granuloma requiring surgical debulking between 2010 and 2015. Clinical symptoms, serial imaging, histopathology and post-operative outcomes were considered.
Intra-operatively, extensive granulation tissue with erosion of the otic capsule was found. There was spread along the VIIth and VIIIth cranial nerves to the cochlear nucleus in one patient. Post-operative clinical improvement was demonstrable, corroborated by diminution of contrast enhancement on serial magnetic resonance imaging. Facial nerve function recovered, tinnitus amelioration was variable and some otalgia persisted. Post-operative complications included grade IV facial weakness and late Pseudomonas aeruginosa meningitis, both of which resolved.
To the authors’ knowledge, this paper reports the only case of post-stapedectomy granuloma tracking to the brainstem. Otalgia was present in all our cases, and may be deemed a red flag symptom of progressive bony destruction and otic capsule involvement. Although granuloma remains rare, it should be considered in any patient with worsening otological symptoms following stapes surgery.
Bone-anchored hearing aids improve hearing for patients for whom conventional behind-the-ear aids are problematic. However, uptake of bone-anchored hearing aids is low and it is important to understand why this is the case.
A narrative review was conducted. Studies examining why people accept or decline bone-anchored hearing aids and satisfaction levels of people with bone-anchored hearing aids were reviewed.
Reasons for declining bone-anchored hearing aids included limited perceived benefits, concerns about surgery, aesthetic concerns and treatment cost. No studies providing in-depth analysis of the reasons for declining or accepting bone-anchored hearing aids were identified. Studies of patient satisfaction showed that most participants reported benefits with bone-anchored hearing aids. However, most studies used cross-sectional and/or retrospective designs and only included people with bone-anchored hearing aids.
Important avenues for further research are in-depth qualitative research designed to fully understand the decision-making process for bone-anchored hearing aids and rigorous quantitative research comparing satisfaction of people who receive bone-anchored hearing aids with those who receive alternative (or no) treatments.
This study aimed to compare the efficacy of diode laser, coblation and cold dissection tonsillectomy in paediatric patients.
A total of 120 patients aged 10–15 years with recurrent tonsillitis were recruited. Participants were prospectively randomised to diode laser, coblation or cold dissection tonsillectomy. Operative time and blood loss were recorded. Pain was recorded on a Wong–Baker FACES® pain scale.
The operative time (10 ± 0.99 minutes), blood loss (20 ± 0.85 ml) and pain were significantly lower with coblation tonsillectomy than with cold dissection tonsillectomy (20 ± 1.0 minutes and 30 ± 1.0 ml; p = 0.0001) and diode laser tonsillectomy (15 ± 0.83 minutes and 25 ± 0.83 ml; p = 0.0001). Diode laser tonsillectomy had a shorter operative time (p = 0.0001) and less blood loss (p = 0.001) compared with cold dissection tonsillectomy. However, at post-operative day seven, the diode laser tonsillectomy group had significantly higher pain scores compared with the cold dissection (p = 0.042) and coblation (p = 0.04) tonsillectomy groups.
Both coblation and diode laser tonsillectomy are associated with significantly reduced blood loss and shorter operative times compared with cold dissection tonsillectomy. However, we advocate coblation tonsillectomy because of the lower post-operative pain scores compared with diode laser and cold dissection tonsillectomy.
In recent practice, we have used tissue transfer (pedicled or free flap) to augment the pharyngeal circumference of the neopharynx following salvage total laryngectomy, even in patients who have sufficient pharyngeal mucosa for primary closure. In this study, the rates of pharyngocutaneous fistula were compared in soft tissue flap reconstructed patients versus patients who underwent primary closure.
A retrospective assessment was carried out of all patients who had undergone a salvage total laryngectomy between 2000 and 2010. The presence or absence of a pharyngocutaneous fistula was compared in those who received reconstruction closure versus those who received primary closure.
The reconstruction closure group (n = 7) had no incidence of pharyngocutaneous fistula, whereas the primary closure group (n = 38) had 10 fistulas, giving pharyngocutaneous fistula rates of 0 per cent versus 26 per cent, respectively.
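The abstract does not state a significance test for these counts. As a hedged check of the reported proportions (0/7 versus 10/38), a Fisher's exact test is one natural choice given the small cell sizes; the test choice here is ours for illustration, not necessarily the paper's analysis.

```python
# Sketch: comparing fistula proportions between closure groups (counts from the abstract).
# The choice of Fisher's exact test is ours for illustration; the paper's analysis may differ.
from scipy.stats import fisher_exact

#         fistula  no fistula
table = [[0,  7],    # reconstruction closure (0/7 = 0 per cent)
         [10, 28]]   # primary closure (10/38 = 26 per cent)
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact p = {p_value:.3f}")
```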
The findings revealed a lower rate of pharyngocutaneous fistula with tissue transfer compared with primary closure of the neopharynx.
The objective of this study was to estimate the sensitivity and specificity of a culture method and a polymerase chain reaction (PCR) method for detection of two Campylobacter species: C. jejuni and C. coli. Data were collected during a 3-year survey of UK broiler flocks, and consisted of parallel sampling of caeca from 436 batches of birds by both PCR and culture. Batches were stratified by season (summer/non-summer) and whether they were the first depopulation of the flock, resulting in four sub-populations. A Bayesian approach in the absence of a gold standard was adopted, and the sensitivity and specificity of the PCR and culture for each Campylobacter subtype were estimated, along with the true C. jejuni and C. coli prevalence in each sub-population. Results indicated that the sensitivity of the culture method was higher than that of PCR in detecting both species when the samples were derived from populations infected with at most one species of Campylobacter. However, from a mixed population, the sensitivity of culture for detecting both C. jejuni and C. coli was reduced while PCR was potentially able to detect both species, although the total probability of correctly identifying at least one species by PCR was similar to that of the culture method.
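The paper's full model (two Campylobacter species, mixed infections, four sub-populations) is not reproduced here. As a hedged sketch of the general no-gold-standard approach, the code below gives a minimal Gibbs sampler for a single-species, conditional-independence latent-class model with uniform Beta(1, 1) priors and simulated data; all parameter values are illustrative assumptions.

```python
# Sketch: Gibbs sampler for sensitivity/specificity of two tests without a gold
# standard (one species, one population, conditional independence; Beta(1,1) priors).
# Simulated data; the paper's two-species, multi-population model is more complex.
# NB: with one population and two tests this model is only weakly identified; the
# paper's four sub-populations (and priors) aid identifiability.
import numpy as np

rng = np.random.default_rng(3)
n = 436
true_z = rng.random(n) < 0.6                    # latent infection status
t_pcr = np.where(true_z, rng.random(n) < 0.80, rng.random(n) < 0.02)  # PCR result
t_cul = np.where(true_z, rng.random(n) < 0.90, rng.random(n) < 0.01)  # culture result

prev = 0.5
se, sp = np.array([0.8, 0.8]), np.array([0.9, 0.9])
draws = []
for it in range(5000):
    # 1) latent status given current parameters and both test results
    like1 = prev * se[0]**t_pcr * (1-se[0])**(1-t_pcr) * se[1]**t_cul * (1-se[1])**(1-t_cul)
    like0 = (1-prev) * (1-sp[0])**t_pcr * sp[0]**(1-t_pcr) * (1-sp[1])**t_cul * sp[1]**(1-t_cul)
    z = rng.random(n) < like1 / (like1 + like0)
    # 2) conjugate Beta updates for prevalence, sensitivities, specificities
    prev = rng.beta(1 + z.sum(), 1 + n - z.sum())
    for i, t in enumerate((t_pcr, t_cul)):
        se[i] = rng.beta(1 + (t & z).sum(), 1 + (~t & z).sum())
        sp[i] = rng.beta(1 + (~t & ~z).sum(), 1 + (t & ~z).sum())
    if it >= 1000:                              # discard burn-in
        draws.append((prev, se.copy(), sp.copy()))
```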
Little information is available about the influenza vaccination perceptions of parents with healthy children in daycare. Therefore, we systematically explored the relationship between parental risk perception and influenza vaccination in children attending daycare. We distributed a self-administered paper survey to parents of children aged 6–59 months attending licensed daycare centres in Tarrant County, Texas. We used conditional logistic regression with penalized conditional likelihood to estimate odds ratios (ORs) and 95% profile likelihood confidence limits (PL) for parental risk-perception factors and influenza vaccination. A high level of parental prevention behaviours (OR 9·1, 95% PL 3·2, 31) and physician recommendation (OR 8·2, 95% PL 2·7, 30) had the highest magnitudes of association with influenza vaccination of healthy children in daycare. Our results provide evidence about critical determinants of influenza vaccination of healthy children in daycare, which could help inform public health interventions aimed at increasing influenza vaccination coverage in this population.
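For the general approach, statsmodels offers plain (unpenalised) conditional logistic regression stratified on a matching variable. The sketch below uses placeholder data and omits the penalised (Firth-type) conditional likelihood the authors used, so it illustrates the model form rather than the study's exact estimator.

```python
# Sketch: conditional logistic regression of vaccination on risk-perception factors,
# stratified by daycare centre. Placeholder data; statsmodels' ConditionalLogit is
# unpenalised, unlike the penalised conditional likelihood used in the study.
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(4)
n = 400
centre = np.repeat(np.arange(40), 10)        # stratification variable (daycare centre)
x = rng.normal(size=(n, 2))                  # e.g. prevention behaviours, physician recommendation
logit = 1.5 * x[:, 0] + 1.2 * x[:, 1]
vaccinated = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

fit = ConditionalLogit(vaccinated, x, groups=centre).fit(disp=False)
print(np.exp(fit.params))                    # odds ratios per risk-perception factor
```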
Feed conversion into milk, nutrient excretion in manure and the subsequent environmental impacts of manure management are highly influenced by the diets that farmers feed their lactating cows (Bos taurus). On confinement-based dairy farms, determinations of diet composition are relatively straightforward because the types, amounts and nutrients contained in stored feeds are often well known. However, on grazing-based dairy farms, diet composition is more difficult to determine because forage intake during grazing must be estimated. The objectives of this study were to determine relationships (1) between feed N intake (NI), milk production, milk urea N (MUN), feed N use efficiency (FNUE) and excreted manure N (ExN); and (2) between feed P intake (PI), dung P concentrations (g/kg dry matter (DM)) and excreted manure P (ExP) for grazing-based lactating cows with a very wide range of diets and milk production. An additional objective was to evaluate how well these relationships compare with similar relationships based on more direct feed–milk–manure measurements on confinement-based dairy farms. Four dairy farms located in southeastern Australia were visited during autumn and spring, and data were collected on feed, milk and dung of 18 cows on each farm. Estimated dry matter intake (DMI) from pasture comprised 12% to 75% of total diet DMI, and the crude protein (CP) concentrations in the total diets ranged from 167 to 248 g/kg. During spring, as diet CP increased, FNUE declined. Total diet DMI and NI provided the best predictors of ExN, and PI provided the most accurate prediction of ExP. These results indicated accuracy in the study's indirect estimates of pasture DMI. Likely owing to the high levels of, and great variability in, dietary CP and P concentrations associated with the use of diet supplements, MUN did not appear to be a good indicator of dietary CP, and P in dung was not a good indicator of dietary P.