Research participants want to receive results from studies in which they participate. However, health researchers rarely share the results of their studies beyond scientific publication. Little is known about the barriers researchers face in returning study results to participants.
In a mixed-methods study, health researchers (N=414) from more than 40 U.S. universities were asked about barriers to providing results to participants. Respondents were recruited from universities with Clinical and Translational Science Award (CTSA) programs and Prevention Research Centers (PRCs).
Respondents reported the percentage of their research in which they experienced each of four barriers to disseminating results to participants: logistical/methodological, financial, systems, and regulatory. A fifth barrier, investigator capacity, emerged from data analysis. Solutions offered for overcoming these barriers included training for research faculty and staff, promotion and tenure incentives, and support from funding agencies for disseminating results to participants.
Study findings add to literature on research dissemination by documenting health researchers’ perceived barriers to sharing study results with participants. Implications for policy and practice suggest that additional resources and training could help reduce dissemination barriers and increase the return of results to participants.
For over a decade, a transdiagnostic clinical staging framework for youth with anxiety, mood and psychotic disorders (linked with measurement of multidimensional outcomes) has been utilised in over 8,000 young people presenting to the enhanced primary (headspace) and secondary care clinics of the Brain and Mind Centre of the University of Sydney. This framework has been evaluated alongside a broad range of other clinical, neurobiological, neuropsychological, brain imaging, circadian, metabolic, longitudinal cohort and controlled intervention studies. This has led to specific tests of its concurrent, discriminant and predictive validity. These extensive data provide strong preliminary evidence that: i) varying stages of illness are associated with predicted differences in a range of independent and objectively measured neuropsychological and other biomarkers (both cross-sectionally and longitudinally); and ii) earlier stages of illness progress at variable rates to later and more severe or persistent disorders. Importantly, approximately 15–20% of those young people classed as stage 1b or ‘attenuated’ syndromes at presentation progress to more severe or persistent disorders. Consequently, this cohort should be the focus of active secondary prevention trials. In clinical practice, we are moving to combine the staging framework with likely pathophysiological paths (e.g. neurodevelopmental-psychotic, anxiety-depression, circadian-bipolar) to underpin enhanced treatment selection.
In-patients in crisis report poor experiences of mental healthcare not conducive to recovery. Concerns include coercion by staff, fear of assault from other patients, lack of therapeutic opportunities and limited support. There is little high-quality evidence on what is important to patients to inform recovery-focused care.
To conduct a systematic review of published literature, identifying key themes for improving experiences of in-patient mental healthcare.
A systematic search of online databases (MEDLINE, PsycINFO and CINAHL) for primary research published between January 2000 and January 2016. All study designs from all countries were eligible. A qualitative analysis was undertaken and study quality was appraised. A patient and public reference group contributed to the review.
Seventy-two studies from 16 countries identified four dimensions that consistently influenced in-patients' experiences of crisis and recovery-focused care: the importance of high-quality relationships; averting negative experiences of coercion; a healthy, safe and enabling physical and social environment; and authentic experiences of patient-centred care. Critical elements for patients were trust, respect, safe wards, information and explanation about clinical decisions, therapeutic activities, and family inclusion in care.
A number of experiences hinder recovery-focused care and must be addressed with the involvement of staff to provide high-quality in-patient services. Future evaluations of service quality and development of practice guidance should embed these four dimensions.
Declaration of interest
K.B. is editor of British Journal of Psychiatry and leads a national programme (Synergi Collaborative Centre) on patient experiences driving change in services and inequalities.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, higher rates of employment, and better health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
A 3-yr watermelon experiment was established in fall 2013 to evaluate cover crop, polyethylene mulch, tillage, and herbicide application components for weed control, yield, and profitability. Conservation tillage, either with a cereal rye cover crop alone or integrated with polyethylene mulch, was compared to the standard industry practice of conventional tillage with bedded polyethylene mulch. The study also used a non-bedded conventional tillage system without polyethylene to determine polyethylene and cover crop residue effects. Within each of the four systems, herbicide treatments comprised halosulfuron applied (1) at 26.3 g ai ha–1 PRE, (2) at 26.3 g ai ha–1 POST, or (3) sequentially at 26.3 g ai ha–1 PRE and POST. Each system also had a nontreated control. In addition, clethodim was applied twice POST at 140 g ai ha–1 in all plots except the nontreated control in each system. In 2014, polyethylene or cereal rye cover crop effectively controlled tall morningglory, coffee senna, and carpetweed early season in nontreated plots, whereas the integration of the two was effective at controlling common purslane. Tall morningglory and purslane control was insufficient late season regardless of production system and herbicide application. In 2015, polyethylene effectively controlled cutleaf eveningprimrose, sicklepod, and arrowleaf sida early season in nontreated plots. Yellow nutsedge control was insufficient late season regardless of production system and herbicide application. Sequential halosulfuron applications did not increase weed control over PRE or POST alone in any year. Polyethylene use resulted in higher yields than systems without it in all years. Across all 3 yr, net returns were highest for polyethylene mulch systems. The results of this experiment underscore the need for more progress in developing integrated conservation systems for watermelon production.
Effective herbicides, low-disturbance cultivation, and/or hand weeding are most likely the key to success in conservation specialty crop systems.
Surgical site infections (SSIs) following colorectal surgery (CRS) are among the most common healthcare-associated infections (HAIs). Reduction in colorectal SSI rates is an important goal for surgical quality improvement.
To examine rates of SSI in patients with and without cancer and to identify potential predictors of SSI risk following CRS.
American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data files for 2011–2013 from a sample of 12 National Comprehensive Cancer Network (NCCN) member institutions were combined. Pooled SSI rates for colorectal procedures were calculated and risk was evaluated. The independent importance of potential risk factors was assessed using logistic regression.
Of 22 invited NCCN centers, 11 participated (50%). Colorectal procedures were selected by principal procedure Current Procedural Terminology (CPT) code. Cancer was defined by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes.
The primary outcome of interest was 30-day SSI rate.
A total of 652 SSIs (11.06%) were reported among 5,893 CRSs. Risk of SSI was similar for patients with and without cancer. Among CRS patients with underlying cancer, disseminated cancer (SSI rate, 17.5%; odds ratio [OR], 1.66; 95% confidence interval [CI], 1.23–2.26; P=.001), ASA score ≥3 (OR, 1.41; 95% CI, 1.09–1.83; P=.001), chronic obstructive pulmonary disease (COPD; OR, 1.6; 95% CI, 1.06–2.53; P=.02), and longer duration of procedure were associated with development of SSI.
Patients with disseminated cancer are at higher risk of developing SSI. ASA score ≥3, COPD, and longer duration of surgery predict SSI risk. Disseminated cancer should be further evaluated by the Centers for Disease Control and Prevention (CDC) in generating risk-adjusted outcomes.
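For readers unfamiliar with the odds ratios reported above, a minimal sketch of how an unadjusted odds ratio and its Wald 95% confidence interval are computed from a 2×2 table. The counts below are hypothetical, chosen only for illustration; they are not the study's data:

```python
import math

def odds_ratio_ci(exposed_events, exposed_total,
                  unexposed_events, unexposed_total, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table."""
    a = exposed_events                      # exposed, event
    b = exposed_total - exposed_events      # exposed, no event
    c = unexposed_events                    # unexposed, event
    d = unexposed_total - unexposed_events  # unexposed, no event
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) for the Wald interval
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts: 35 SSIs among 200 patients with disseminated
# cancer vs 420 SSIs among 4,000 patients without.
or_, (lo, hi) = odds_ratio_ci(35, 200, 420, 4000)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Note that the odds ratios in the abstract come from multivariable logistic regression, which adjusts each estimate for the other covariates; the 2×2 calculation above shows only the unadjusted analogue.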
Tett, Hundley, and Christiansen (2017) argue that the concept of validity generalization in meta-analysis is a myth, as the variability of the effect size appears to decrease with increasing moderator specificity such that the level of precision needed to deem an estimate “generalizable” is actually reached at levels of situational specificity that are so high as to (paradoxically) refute an inference of generalizability. This notion highlights the need to move away from claiming that effects are either “generalizable” or “situationally specific” and instead look more critically and less dichotomously at degrees of generalizability, or effect size variability.
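The core observation, that pooled effect sizes can look variable while moderator-specific subsets look homogeneous, can be illustrated numerically. The effect sizes below are invented solely for illustration and are not drawn from any meta-analysis:

```python
from statistics import mean, stdev

# Toy validity coefficients, tagged by a hypothetical moderator level
# (job type). Within each level the effects cluster tightly; pooled
# across levels they do not.
effects = {
    "managerial jobs": [0.30, 0.32, 0.28, 0.31],
    "sales jobs":      [0.12, 0.15, 0.10, 0.13],
}

pooled = [r for rs in effects.values() for r in rs]
print(f"pooled: mean={mean(pooled):.2f}, SD={stdev(pooled):.3f}")
for level, rs in effects.items():
    print(f"{level}: mean={mean(rs):.2f}, SD={stdev(rs):.3f}")
```

The pooled standard deviation exceeds either within-level one, so a claim that the pooled mean "generalizes" conceals two distinct subpopulations, which is the degrees-of-generalizability point made above.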
ALL MUSEUMS have limited resources, and all acquisitions use some of these resources, whether they be space, money, staff time or materials. When museums purchase collections objects, some “costs” are evident up front, and there are frequently mechanisms in place at a high level to monitor that use of museum resources. For example, in art museums this is often a committee of the board of directors empowered to approve or deny curatorial proposals for major (i.e., expensive) acquisitions. However, when the object is “free,” as in most invertebrate paleontology collections where material is either donated or collected in the field by museum staff, the cost of acquiring the object is essentially hidden. Some acquisition costs may be fixed regardless of the size or nature of the acquisition, and some vary depending on the amount of space the material will require, or the amount of cleaning or conservation needed.
AS RECENTLY as thirty years ago, deaccessioning was a dirty word in museums. It was considered an abrogation of an institution's fundamental responsibility to care for its collections in the public trust. Miller (1985) has chronicled some of the public controversy that attended deaccessioning in the early 1970s. However, in the intervening time, museums' approach to deaccessioning has changed as they grapple with burgeoning collections, decreasing funding bases, pressure to narrow and focus their missions, and rising standards of care. Now deaccessioning is regarded by the majority as a necessary and appropriate tool in collections management, albeit one that operates within strict ethical and legal constraints.
A museum's collections policy should address criteria for deaccessioning, levels of approval needed, methods of disposition, use of funds resulting from sales, and record keeping. In addition, staff must develop procedures for identifying material for deaccessioning. In the following sections we review the currently accepted standards in the museum field for these areas, and discuss how some of these issues apply particularly to paleontological collections. It may be appropriate for paleontology departments to develop guidelines and procedures that further refine the museum's overall collections policy to fit the needs of their collections and the conventions of their field.
To objectively evaluate voluntary nutrition and health claims and marketing techniques present on packaging of high-market-share ultra-processed foods (UPF) in Australia for their potential impact on public health.
Packaging information from five high-market-share food manufacturers and one retailer was obtained from supermarket and manufacturers’ websites.
Ingredients lists for 215 UPF were examined for the presence of added sugar. Packaging information was categorised using a taxonomy of nutrition and health information which included nutrition and health claims and five common food marketing techniques. Compliance of statements and claims with the Australia New Zealand Food Standards Code and with Health Star Ratings (HSR) was assessed for all products.
Almost all UPF (95 %) contained added sugars described in thirty-four different ways; 55 % of UPF displayed a HSR; 56 % had nutrition claims (18 % were compliant with regulations); 25 % had health claims (79 % were compliant); and 97 % employed common food marketing techniques. Packaging of 47 % of UPF was designed to appeal to children. UPF carried a mean of 1·5 health and nutrition claims (range 0–10) and 2·6 marketing techniques (range 0–5), and 45 % had HSR≤3·0/5·0.
Most UPF packaging featured nutrition and health statements or claims despite the high prevalence of added sugars and moderate HSR. The degree of inappropriate or inaccurate statements and claims present is concerning, particularly on packaging designed to appeal to children. Public policies to assist parents to select healthy family foods should address the quality and accuracy of information provided on UPF packaging.
With reference to theory published earlier, formulas are given for the estimation of (i) abundances of morphological types among field galaxies, (ii) selection probabilities, and (iii) ‘space luminosity functions’. Strictly, the theory applies to ‘homogeneous classes’ of galaxies. This term designates a category of galaxies, say C, so finely defined that the probability, say Φ(m | C), that a galaxy of category C will be included in the catalogue depends on its photographic apparent magnitude m and on nothing else. The practical use of the theory is illustrated on data in the HMS Catalogue. It appears that certain combinations of the Hubble morphological types satisfy the definition of a homogeneous class. Such, for example, is the case for combinations of ellipticals E0–E3 and, separately, of spirals Sc, Scp, SBc. However, the combination of these two categories is not a homogeneous class.
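The homogeneous-class condition described above can be written compactly. The notation beyond Φ(m | C) is assumed here for illustration, not quoted from the paper: a category C is a homogeneous class when the catalogue-inclusion probability of a galaxy g depends on its apparent magnitude alone,

```latex
P\bigl(g \text{ is catalogued} \mid g \in C,\; m_g = m\bigr) \;=\; \Phi(m \mid C),
```

so that, within C, inclusion is independent of morphology, distance, or any other property once m is fixed.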
In order to validate the theory empirically, calculations were performed to predict the abundances of eight combinations of morphological types among cluster galaxies listed in the HMS Catalogue, each combination being treated as a distinct homogeneous class. These calculations rest on the additional hypotheses that (a) the abundances of morphological types, (b) the luminosity functions of these types, and (c) the selection probabilities for cluster galaxies coincide with those for field galaxies. A comparison with the observations, reaching z = 0.07, is satisfactory. This tends to validate the combination of formulas (i), (ii), (iii) with the additional hypotheses (a), (b) and (c). Incidentally, the result tends to support the steady state cosmology.
Acute kidney injury after cardiac surgery is a frequent and serious complication among children with congenital heart disease (CHD) and adults with acquired heart disease; however, the significance of kidney injury in adults after congenital heart surgery is unknown. The primary objective of this study was to determine the incidence of acute kidney injury after surgery for adult CHD. Secondary objectives included determination of risk factors and associations with clinical outcomes.
This single-centre, retrospective cohort study was performed in a quaternary cardiovascular ICU in a paediatric hospital including all consecutive patients ⩾18 years between 2010 and 2013.
Data from 118 patients with a median age of 29 years undergoing cardiac surgery were analysed. Using Kidney Disease: Improving Global Outcomes (KDIGO) creatinine criteria, 36% of patients developed kidney injury, with 5% being moderate to severe (stage 2/3). Among higher-complexity surgeries, incidence was 59%. Age ⩾35 years, preoperative left ventricular dysfunction, preoperative arrhythmia, longer bypass time, higher Risk Adjustment for Congenital Heart Surgery-1 category, and perioperative vancomycin use were significant risk factors for kidney injury development. In multivariable analysis, age ⩾35 years and vancomycin use were significant predictors. Those with kidney injury were more likely to have prolonged duration of mechanical ventilation and cardiovascular ICU stay in the univariable regression analysis.
We demonstrated that acute kidney injury is a frequent complication in adults after surgery for CHD and is associated with poor outcomes. Risk factors for development were identified but largely not modifiable. Further investigation within this cohort is necessary to better understand the problem of kidney injury.