The Cognitive Battery of the National Institutes of Health Toolbox (NIH-TB) is a collection of assessments that have been adapted and normed for administration across the lifespan and is increasingly used in large-scale population-level research. However, despite increasing adoption in longitudinal investigations of neurocognitive development, and growing recommendations that the Toolbox be used in clinical applications, little is known about the long-term temporal stability of the NIH-TB, particularly in youth.
The present study examined the long-term temporal reliability of the NIH-TB in a large cohort of youth (9–15 years old) recruited across two data collection sites. Participants were invited to complete testing annually for 3 years.
Reliability was generally low-to-moderate, with intraclass correlation coefficients ranging between 0.31 and 0.76 for the full sample. There were multiple significant differences between sites, with one site generally exhibiting stronger temporal stability than the other.
Reliability of the NIH-TB Cognitive Battery was lower than expected given early work examining shorter test-retest intervals. Moreover, there were very few instances of tests meeting stability requirements for use in research; none of the tests exhibited adequate reliability for use in clinical applications. Reliability is paramount to establishing the validity of the tool, thus the constructs assessed by the NIH-TB may vary over time in youth. We recommend further refinement of the NIH-TB Cognitive Battery and its norming procedures for children before further adoption as a neuropsychological assessment. We also urge researchers who have already employed the NIH-TB in their studies to interpret their results with caution.
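Test-retest stability of the kind reported above is typically quantified with an intraclass correlation coefficient. As an illustration only (this is not the authors' scoring pipeline, and the toy inputs are hypothetical), a one-way random-effects ICC(1,1) can be computed directly from a subjects-by-sessions score matrix:

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_sessions) array.

    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB is the
    between-subject mean square and MSW the within-subject mean square.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    subj_means = scores.mean(axis=1)
    # Between-subject sum of squares: subject means vs the grand mean
    ssb = k * ((subj_means - scores.mean()) ** 2).sum()
    # Within-subject sum of squares: scores vs their own subject mean
    ssw = ((scores - subj_means[:, None]) ** 2).sum()
    msb = ssb / (n - 1)
    msw = ssw / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfectly stable scores across two sessions give ICC = 1
print(icc_oneway([[40, 40], [50, 50], [60, 60]]))  # → 1.0
```

Values near 1 indicate that between-person differences dwarf within-person fluctuation across sessions; the 0.31–0.76 range reported above falls well short of the ≥0.9 conventionally required for clinical use.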
Elective surgical patients routinely bathe with chlorhexidine gluconate (CHG) at home days prior to their procedures. However, the impact of home CHG bathing on surgical site CHG concentration is unclear. We examined 3 different methods of applying CHG and hypothesized that different application methods would impact resulting CHG skin concentration.
Trifludimoxazin, a new protoporphyrinogen oxidase–inhibiting herbicide, is being evaluated for possible use as a soil-residual active herbicide treatment in cotton for control of small-seeded annual broadleaf weeds. Laboratory and greenhouse studies were conducted to compare vertical mobility and cotton tolerance of trifludimoxazin to flumioxazin and saflufenacil, which are two currently registered protoporphyrinogen oxidase–inhibiting herbicides for use in cotton, in three West Texas soils. Vertical soil mobility of trifludimoxazin was similar to flumioxazin in Acuff loam and Olton loam soils, but was greater than flumioxazin in the Amarillo loamy sand soil. The depth of trifludimoxazin movement after a 2.5-cm irrigation event ranged from 2.5 to 5.0 cm in all soils, which would not allow for crop selectivity based on herbicide placement, because ideal cotton seeding depth is from 0.6 to 2.54 cm. Greenhouse studies indicated that PRE treatments were more injurious than the 14-d preplant treatment when summarized across soils for the three herbicides (43% and 14% injury, respectively). No differences in visual cotton response or dry weight were observed after trifludimoxazin applied preplant as compared with the nontreated control within each of the three West Texas soils, and the response was similar to that of the flumioxazin preplant treatment across soils. On the basis of these results, a use pattern for trifludimoxazin in cotton may be established with a preplant restriction of more than 14 d before cotton planting.
In a survey of hospitals and of patients with Clostridioides difficile infection (CDI), we found that most facilities had educational materials or protocols for education of CDI patients. However, approximately half of CDI patients did not recall receiving education during their admission, and knowledge deficits regarding CDI prevention were common.
A 65-year-old male with a history of hypertension presents to the emergency department (ED) with new onset of non-traumatic back pain. The patient is investigated for life-threatening diagnoses and screened for “red flag symptoms,” including fever, neurologic abnormalities, bowel/bladder symptoms, and a history of injection drug use (IVDU). The patient is treated symptomatically and discharged home but re-presents to the ED three additional times, each time with new and progressive symptoms. At the time of admission, he is unable to ambulate and has perineal anesthesia and 500 cc of urinary retention. Whole spine magnetic resonance imaging (MRI) confirms a thoracic spinal epidural abscess. This case, and many like it, prompts the questions: when should emergency physicians consider the diagnosis of a spinal epidural abscess, and what is the appropriate evaluation of these patients in the ED? (Figure 1).
Drumlins form at the ice/bed interface through subglacial processes that are not directly observable. The internal stratigraphy of drumlins provides insight into how they developed and associated subglacial processes, but traditional stratigraphic logging techniques are limited to natural exposures and excavations. Using ground-penetrating radar, we imaged the internal stratigraphy of seven drumlins from a recently exposed drumlin field in the forefield of Múlajökull, Iceland. Data were collected with 100 and 200 MHz antennas with maximum resolvable depths of 8 and 4 m, respectively. Longitudinal echograms contained coherent down-ice dipping reflectors over the lengths of the drumlins. Near the drumlin heads (i.e., stoss sides), down-glacier dipping beds lie at high angles to the surface, whereas on the lee sides, the down-glacier dipping beds lie at low angles, or conform, to drumlin surfaces. Transverse echograms exhibited unconformities along the flanks of drumlin heads and conformable bedding across the lee side widths of the drumlins. These observations were ground-truthed with stratigraphic logs from a subset of drumlins, and good agreement was found. The stratigraphic patterns support previous conclusions that drumlins at Múlajökull formed on a deformable bed through both depositional and erosional processes, which may alternate between the glacier's surge and quiescent phases.
A Gallery of Combustion and Fire is the first book to provide a graphical perspective of the extremely visual phenomenon of combustion in full color. It is designed primarily to be used in parallel with, and to supplement, existing combustion textbooks, which are usually in black and white, making it a challenge to visualize such a graphic phenomenon. Each image includes a description of how it was generated, which is detailed enough for the expert but simple enough for the novice. Processes range from small-scale academic flames up to full-scale industrial flames under a wide range of conditions, such as low and normal gravity, atmospheric to high pressures, actual and simulated flames, and controlled and uncontrolled flames. Containing over 500 color images, with over 230 contributors from over 75 organizations, this volume is a valuable asset for experts and novices alike.
OBJECTIVES/GOALS: Lung transplant (LTx) candidates benefit from use of non-ideal donor organs. Each organ procurement organization (OPO) defines “acceptable” donor organs, introducing unmeasured variation in donor pursuit. We characterized non-ideal donor pursuit among OPOs to identify drivers of risk aversion in LTx. METHODS/STUDY POPULATION: We queried the UNOS registry for adult donors who donated ≥1 organ for transplantation from 12/2007-12/2018. Non-ideal donors were those with any of age >50, smoking history ≥20 pack-years, PaO2/FiO2 (P/F) ratio <350, donation after cardiac death (DCD) status, or CDC increased risk (IRD) status. Non-ideal donor pursuit rate was defined as the proportion of non-ideal donors at each OPO from whom consent for lung donation was requested, with lower numbers indicating increased risk aversion. We estimated the correlation between non-ideal and overall donor pursuit using a Spearman correlation coefficient. Adjusted non-ideal donor pursuit rates were estimated using multivariable logistic regression. RESULTS/ANTICIPATED RESULTS: Overall, 18,333 deceased donors were included and classified as ideal or non-ideal. Among 58 OPOs, rates of non-ideal donor pursuit ranged from 0.24 to 1.00 (Figure). Of 5 non-ideal characteristics, DCD and IRD status were associated with the most and least risk aversion, respectively. Non-ideal donor pursuit was strongly correlated with overall donor pursuit (r = 0.99). On adjusted analysis, older age (OR 0.15, 95% CI 0.13-0.16), smoking history (OR 0.38, 95% CI 0.34-0.44), low P/F ratio (OR 0.12, 95% CI 0.11-0.14), and DCD status (OR 0.04, 95% CI 0.03-0.04) were all independently associated with significant risk aversion, corresponding to decreased rates of donor pursuit. DISCUSSION/SIGNIFICANCE OF IMPACT: OPOs differ in their levels of risk aversion in LTx, and risk aversion is not uniform across selected categories of non-ideal lung donor.
Consideration of new OPO performance metrics that encourage the pursuit of non-ideal lung donors is warranted.
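The reported Spearman correlation between non-ideal and overall pursuit rates can be sketched in outline as the Pearson correlation of rank vectors. The per-OPO rates below are hypothetical, and this minimal version does not average tied ranks:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors.

    Note: double-argsort ranking does not average tied ranks, so this
    sketch assumes tie-free data.
    """
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

# Hypothetical per-OPO pursuit rates (not study data)
nonideal = [0.24, 0.55, 0.71, 0.90, 1.00]
overall = [0.40, 0.62, 0.80, 0.93, 0.99]
print(spearman_rho(nonideal, overall))  # ~1.0 for a monotone pair
```

A rho near 1, as in the study (r = 0.99), indicates that OPOs cautious about non-ideal donors are cautious about donors overall.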
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential costs savings.
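The break-even intuition behind these thresholds can be sketched with a deliberately simplified comparison. All function names, parameters, and values below are hypothetical; the study itself used full Markov simulation models with one-way, two-way, and probabilistic sensitivity analyses:

```python
def registry_saves(n_elements, sec_per_field, coordinator_rate_hr, linkage_cost):
    """Return True if registry linkage is cheaper than manual abstraction.

    Hypothetical simplification: manual cost = elements x time per field
    x coordinator wage; registry cost = a flat linkage fee.
    """
    manual_cost = n_elements * sec_per_field / 3600 * coordinator_rate_hr
    return linkage_cost < manual_cost

# Toy scenario: 3600 fields at 10 s each, $36/hr coordinator, $300 linkage fee
print(registry_saves(3600, 10.0, 36.0, 300.0))  # → True (manual costs $360)
```

Sweeping `n_elements` or `sec_per_field` over plausible ranges reproduces, in miniature, the threshold-finding exercise the tool is built for.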
Introduction: Emergency care serves as an important health resource for First Nations (FN) persons. Previous reporting shows that FN persons visit emergency departments at almost double the rate of non-FN persons. Working collaboratively with FN partners, academic researchers and health authority staff, the objective of this study is to investigate FN emergency care patient visit statistics in Alberta over a five year period. Methods: Through a population-based retrospective cohort study for the period from April 1, 2012 to March 31, 2017, patient demographics and emergency care visit characteristics for status FN patients in Alberta were analyzed and compared to non-FN statistics. Frequencies and percentages (%) describe patients and visits by categorical variables (e.g., Canadian Triage Acuity Scale (CTAS)). Means and standard deviations (medians and interquartile ranges (IQR)) describe continuous variables (e.g., distances) as appropriate for the data distribution. These descriptions are repeated for the FN and non-FN populations, separately. Results: The data set contains 11,686,288 emergency facility visits by 3,024,491 unique persons. FN people make up 4.8% of unique patients and 9.4% of emergency care visits. FN persons live further from emergency facilities than their non-FN counterparts (FN median 6 km, IQR 1-24; vs. non-FN median 4 km, IQR 2-8). FN visits arrive more often by ground ambulance (15.3% vs. 10%). FN visits are more commonly triaged as less acute (59% CTAS levels 4 and 5, compared to non-FN 50.4%). More FN visits end in leaving without completing treatment (6.7% vs. 3.6%). FN visits are more often in the evening – 4:01pm to 12:00am (43.6% vs. 38.1%). Conclusion: In a collaborative validation session, FN Elders and health directors contextualized emergency care presentation in evenings and receiving less acute triage scores as related to difficulties accessing primary care. 
They explained presentation in evenings, arrival by ambulance, and leaving without completing treatment in terms of issues accessing transport to and from emergency facilities. Many factors interact to determine FN patients’ emergency care visit characteristics and outcomes. Further research needs to separate the impact of FN identity from factors such as reasons for visiting emergency facilities, distance traveled to care, and the size of facility where care is provided.
Attention Deficit Hyperactivity Disorder (ADHD) is a serious risk factor for co-occurring psychiatric disorders and negative psychosocial consequences in adulthood. Given this background, there is great need for an effective treatment of adult ADHD patients.
Therefore, our research group has conducted the first randomized controlled multicenter study evaluating a disorder-tailored DBT-based group program in adult ADHD compared with psychopharmacological treatment.
Between 2007 and 2010, in a four-arm design, 433 patients were randomized to a manualized dialectical behavioural therapy (DBT)-based group program plus methylphenidate or placebo, or to clinical management plus methylphenidate or placebo, with weekly sessions in the first twelve weeks and monthly sessions thereafter. Therapists were graduate psychologists or physicians. Treatment integrity was established by independent supervision. The primary endpoint (ADHD symptoms measured by the Conners Adult ADHD Rating Scale) was rated by interviewers blind to treatment allocation (Current Controlled Trials ISRCTN54096201). The trial is funded by the German Federal Ministry of Research and Education (01GV0606) and is part of the German network for the treatment of ADHD in children and adults (ADHD-NET). In this lecture, the first data from our interim analysis are presented (baseline data and results on treatment compliance and adherence).
Atrazine offers growers a reliable option to control a broad spectrum of weeds in grain sorghum production systems when applied PRE or POST. However, because of the extensive use of atrazine in grain sorghum and corn, it has been found in groundwater in the United States. Given this issue, field experiments were conducted in 2017 and 2018 in Fayetteville and Marianna, Arkansas, to explore the tolerance of grain sorghum to applications of assorted photosystem II (PSII)-inhibiting herbicides in combination with S-metolachlor (PRE and POST) or mesotrione (POST only) as atrazine replacements. All experiments were designed as a factorial, randomized complete block; the two factors were (1) PSII herbicide and (2) the herbicide added to create the mixture. The PSII herbicides were prometryn, ametryn, simazine, fluometuron, metribuzin, linuron, diuron, atrazine, and propazine. The second factor consisted of either no additional herbicide, S-metolachlor, or mesotrione; however, mesotrione was excluded in the PRE experiments. Crop injury estimates, height, and yield data were collected or calculated in both studies. In the PRE study, injury was less than 10% for all treatments except those containing simazine, which caused 11% injury 28 d after application (DAA). Averaged over PSII herbicide, S-metolachlor–containing treatments caused 7% injury at 14 and 28 DAA. Grain sorghum in atrazine-containing treatments yielded 97% of the nontreated. Grain sorghum receiving other herbicide treatments had significant yield loss due to crop injury, compared with atrazine-containing treatments. In the POST study, ametryn- and prometryn-containing treatments were more injurious than all other treatments 14 DAA. Grain sorghum yield in all POST treatments was comparable to atrazine, except prometryn plus mesotrione, which was 65% of the nontreated. More herbicides should be evaluated to find a comparable fit to atrazine when applied PRE in grain sorghum. 
However, when applied POST, diuron, fluometuron, linuron, metribuzin, propazine, and simazine have some potential to replace atrazine in terms of crop tolerance and should be further tested as part of a weed control program across a greater range of environments.
There is a lack of Cameroonian adult neuropsychological (NP) norms, limited knowledge concerning HIV-associated neurocognitive disorders in Sub-Saharan Africa, and evidence of differential inflammation and disease progression based on viral subtype. In this study, we developed demographically corrected norms and assessed the effects of HIV and viral genotype on attention/working memory (WM), learning, and memory.
We administered two tests of attention/WM [Paced Auditory Serial Addition Test (PASAT)-50, Wechsler Memory Scale (WMS)-III Spatial Span] and two tests of learning and memory [Brief Visuospatial Memory Test-Revised (BVMT-R), Hopkins Verbal Learning Test-Revised (HVLT-R)] to 347 HIV+ and 395 seronegative adult Cameroonians. We assessed the effects of viral factors on neurocognitive performance.
Compared to controls, people living with HIV (PLWH) had significantly lower T-scores on PASAT-50 and attention/WM summary scores, on HVLT-R total learning and learning summary scores, on HVLT-R delayed recall, BVMT-R delayed recall and memory summary scores. More PLWH had impairment in attention/WM, learning, and memory. Antiretroviral therapy (ART) and current immune status had no effect on T-scores. Compared to untreated cases with detectable viremia, untreated cases with undetectable viremia had significantly lower (worse) T-scores on BVMT-R total learning, BVMT-R delayed recall, and memory composite scores. Compared to PLWH infected with other subtypes (41.83%), those infected with HIV-1 CRF02_AG (58.17%) had higher (better) attention/WM T-scores.
PLWH in Cameroon have impaired attention/WM, learning, and memory, and those infected with CRF02_AG viruses showed milder deficits in attention/WM. The normative standards described here, the first for assessing attention/WM, learning, and memory in Cameroonian adults, together with equations for computing demographically adjusted T-scores, will facilitate future studies of diseases affecting cognitive function in Cameroonians.
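Demographically adjusted T-scores of the kind described are commonly built by regressing raw scores on demographic covariates in the normative sample and scaling the residuals so the norms have mean 50 and SD 10. The sketch below illustrates that general approach with hypothetical data; it is not the authors' published norming equations:

```python
import numpy as np

def demographic_t_scores(raw, covars, norm_raw, norm_covars):
    """Demographically adjusted T-scores (illustrative sketch only).

    Fit a linear model of raw score on demographic covariates (e.g. age)
    in the normative sample, then convert each residual to a T-score:
    T = 50 + 10 * residual / SD(normative residuals).
    """
    Xn = np.column_stack([np.ones(len(norm_covars)), norm_covars])
    beta, *_ = np.linalg.lstsq(Xn, norm_raw, rcond=None)
    sd = (norm_raw - Xn @ beta).std(ddof=1)
    X = np.column_stack([np.ones(len(covars)), covars])
    return 50 + 10 * (raw - X @ beta) / sd

# Hypothetical normative sample: raw score declines with age
rng = np.random.default_rng(1)
ages = rng.uniform(20, 70, 200)
norm_raw = 60 - 0.3 * ages + rng.normal(0, 5, 200)
t = demographic_t_scores(norm_raw, ages, norm_raw, ages)
print(round(t.mean(), 1), round(t.std(ddof=1), 1))  # → 50.0 10.0
```

By construction the normative sample scores T = 50 ± 10 regardless of age, so a PLWH participant's T-score reflects deviation from demographically similar controls.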
To determine the effect of an electronic medical record (EMR) nudge in reducing total and inappropriate orders for hospital-onset Clostridioides difficile infection (HO-CDI) testing.
An interrupted time series analysis of HO-CDI orders 2 years before and 2 years after the implementation of an EMR intervention designed to reduce inappropriate HO-CDI testing. Orders for C. difficile testing were considered inappropriate if the patient had received a laxative or stool softener in the previous 24 hours.
Four hospitals in an academic healthcare network.
All patients with a C. difficile order after hospital day 3.
Orders for C. difficile testing in patients administered a laxative or stool softener in <24 hours triggered an EMR alert defaulting to cancellation of the order (“nudge”).
Of the 17,694 HO-CDI orders, 7% were inappropriate (8% preintervention vs 6% postintervention; P < .001). Monthly HO-CDI orders decreased by 21% postintervention (level-change rate ratio [RR], 0.79; 95% confidence interval [CI], 0.73–0.86), and the rate continued to decrease (postintervention trend change RR, 0.99; 95% CI, 0.98–1.00). The intervention was not associated with a level change in inappropriate HO-CDI orders (RR, 0.80; 95% CI, 0.61–1.05), but the postintervention inappropriate order rate decreased over time (RR, 0.95; 95% CI, 0.93–0.97).
An EMR nudge to minimize inappropriate ordering for C. difficile was effective at reducing HO-CDI orders, and likely contributed to decreasing the inappropriate HO-CDI order rate after the intervention.
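The level-change rate ratios above come from segmented (interrupted time series) Poisson regression, which also adjusts for the underlying secular trend. As a simplified, hypothetical illustration, a crude pre/post rate ratio with a Wald confidence interval on the log scale can be computed as:

```python
import math

def rate_ratio(events_post, time_post, events_pre, time_pre, z=1.96):
    """Crude post/pre rate ratio with a Wald CI on the log scale.

    SE(log RR) ~ sqrt(1/events_post + 1/events_pre) for Poisson counts.
    This ignores the secular trend that segmented regression adjusts for.
    """
    rr = (events_post / time_post) / (events_pre / time_pre)
    se = math.sqrt(1 / events_post + 1 / events_pre)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Toy counts over equal 24-month windows (not study data)
print(rate_ratio(79, 24.0, 100, 24.0))
```

With equal observation windows the point estimate reduces to the ratio of counts; a CI excluding 1 indicates a statistically significant level change.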