In the past several years, transparency has emerged as one of the leading accountability mechanisms through which platform companies have attempted to regain the trust of the public, politicians, and regulatory authorities. This chapter contextualizes recent examples of transparency as implemented by platform companies within the relevant literature on transparency in theory and practice, considers the potential positive governance impacts of transparency as a form of accountability in the current political moment, and reflects on the shortfalls of transparency that legislators, academics, and funders should weigh when assessing policy or research in this area.
OBJECTIVES/GOALS: Physician-scientists play a vital role in biomedical research, but this career path has many challenges, such as long training periods and difficulty securing funding. The University of Rochester (UR) CTSI pipeline programs address this by enabling medical trainees to partake in enriched research experiences. METHODS/STUDY POPULATION: The UR CTSI TL1 is a training grant from the National Center for Advancing Translational Sciences (NCATS), which funds predoctoral trainees. The TL1-funded physician-scientist pipeline includes the Academic Research Track (ART) year-out program and the Medical Scientist Training Program (MSTP). We describe the characteristics and training outcomes of TL1-funded trainees. We also obtained testimonials from current and former trainees regarding their career decision-making and their perception of the programs, in order to identify how best to address the challenges of the physician-scientist workforce and to facilitate the transition between the clinic and the bench. RESULTS/ANTICIPATED RESULTS: From 2006 to 2019, 56 ART trainees and 17 MSTP trainees completed UR CTSI training; six trainees transitioned into the MSTP after completing the ART program. As of 2019, 63 of 67 graduated trainees (94%) have continued their engagement in clinical and translational science after graduation. Importantly, our programs have facilitated the careers of 31 women (39.7%) and 12 under-represented minorities (15.4%). We will present a breadth of qualitative data to inform which parts of the TL1-related programs have been successful and which could use programmatic improvement to aid the transition into the physician-scientist workforce. DISCUSSION/SIGNIFICANCE OF IMPACT: Physician-scientist training barriers in the US have resulted in a shortage of these professionals in the clinical and translational workforce. Our data show the UR CTSI has been successful in addressing several of these challenges via the TL1-funded ART, MSTP, and ART/MSTP dual-program pipeline.
Patients with distributive shock who are unresponsive to traditional vasopressors are commonly considered to have severe distributive shock and are at high mortality risk. Here, we assess the cost-effectiveness of adding angiotensin II to the standard of care (SOC) for severe distributive shock in the US critical care setting from a US payer perspective.
We analyzed patients with severe distributive shock from the ATHOS-3 clinical trial. Short-term mortality outcomes were based on the trial's 28-day survival rates. Long-term outcomes were extrapolated to lifetime survival using individually estimated life expectancies for survivors. Resource use and adverse event costs were drawn from the published literature. Health outcomes evaluated were lives saved, life-years gained, and quality-adjusted life-years (QALYs) gained, using utility estimates for the US adult population weighted for sepsis mortality. Deterministic and probabilistic sensitivity analyses assessed uncertainty around results.
The addition of angiotensin II to the SOC saved 0.08 lives at Day 28 compared to SOC alone. The cost per life saved was estimated to be $108,884. The addition of angiotensin II to the SOC was projected to result in a gain of 0.96 life-years and 0.66 QALYs. This resulted in an incremental cost-effectiveness ratio of $12,843 per QALY. The probability of angiotensin II being cost-effective at a threshold of $50,000 per QALY was 86 percent.
For treatment of severe distributive shock, angiotensin II is cost-effective at acceptable thresholds.
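The headline figures above follow from the standard incremental cost-effectiveness ratio (ICER): extra cost divided by extra health benefit. A minimal sketch; the incremental cost below is back-calculated from the reported 0.66 QALYs and ~$12,843/QALY, so it is an illustrative assumption rather than a cost input from the ATHOS-3 analysis.

```python
def icer(delta_cost: float, delta_effect: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per unit of extra effect."""
    return delta_cost / delta_effect

delta_qalys = 0.66        # incremental QALYs reported above
delta_cost = 8476.38      # assumed: back-calculated as 12,843 * 0.66

print(round(icer(delta_cost, delta_qalys)))  # → 12843
```

Comparing this ratio against a willingness-to-pay threshold (here, $50,000 per QALY) is what drives the cost-effectiveness conclusion.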
National guidance cautions against low-intensity interventions for people with personality disorder, but evidence from trials is lacking.
To test the feasibility of conducting a randomised trial of a low-intensity intervention for people with personality disorder.
Single-blind, feasibility trial (trial registration: ISRCTN14994755). We recruited people aged 18 or over with a clinical diagnosis of personality disorder from mental health services, excluding those with a coexisting organic or psychotic mental disorder. We randomly allocated participants via a remote system on a 1:1 ratio to six to ten sessions of Structured Psychological Support (SPS) or to treatment as usual. We assessed social functioning, mental health, health-related quality of life, satisfaction with care and resource use and costs at baseline and 24 weeks after randomisation.
A total of 63 participants were randomly assigned to either SPS (n = 33) or treatment as usual (n = 30). Twenty-nine (88%) of those in the active arm of the trial received one or more sessions (median 7). Among 46 (73%) who were followed up at 24 weeks, social dysfunction was lower (−6.3, 95% CI −12.0 to −0.6, P = 0.03) and satisfaction with care was higher (6.5, 95% CI 2.5 to 10.4; P = 0.002) in those allocated to SPS. Statistically significant differences were not found in other outcomes. The cost of the intervention was low and total costs over 24 weeks were similar in both groups.
SPS may provide an effective low-intensity intervention for people with personality disorder and should be tested in fully powered clinical trials.
Ergothioneine (ERG) is an unusual thio-histidine betaine amino acid that has potent antioxidant activities. It is synthesised by a variety of microbes, especially fungi (including in mushroom fruiting bodies) and actinobacteria, but is not synthesised by plants and animals, which acquire it via the soil and their diet, respectively. Animals have evolved a highly selective transporter for it, known as solute carrier family 22, member 4 (SLC22A4) in humans, signifying its importance, and ERG may even have the status of a vitamin. ERG accumulates differentially in various tissues, according to their expression of SLC22A4, favouring those such as erythrocytes that may be subject to oxidative stress. Mushroom or ERG consumption seems to provide significant protection against oxidative stress in a large variety of systems. ERG seems to have strong cytoprotective status, and its concentration is lowered in a number of chronic inflammatory diseases. It has been approved as safe by regulatory agencies, and may have value as a nutraceutical and antioxidant more generally.
Navajo Nation residents experience extreme rates of poverty, food insecurity and diet-related diseases. While many residents travel far to shop at grocery stores, there are small stores closer to home that could provide more healthy options, like fruits and vegetables (F&V). Little is known from the perspective of store owners and managers regarding the barriers and facilitators to offering F&V; the present study contributes to filling that gap.
Data were collected through structured interviews from a sampling frame of all store owners or managers in the setting (n 29).
Small stores in Navajo Nation, New Mexico, USA. Navajo Nation is predominantly rural and the largest federally recognized Native American tribe in the USA.
Sixteen managers and six owners at twenty-two stores.
When asked about the types of foods that were most commonly purchased at their stores, most participants reported snacks and drinks (82% and 68%, respectively). Many participants reported they would like to offer more fresh F&V. However, barriers included varying perceived customer demand, limited F&V choices from distributors and (for some managers) limited authority over product selection.
Findings contribute to the discussion on engaging store owners and managers in providing quality, healthy foods close to home in low-income, rural regions.
To detect modest associations of dietary intake with disease risk, observational studies need to be large and control for moderate measurement errors. The reproducibility of dietary intakes of macronutrients, food groups and dietary patterns (vegetarian and Mediterranean) was assessed in adults in the UK Biobank study on up to five occasions using a web-based 24-h dietary assessment (n 211 050), and using short FFQ recorded at baseline (n 502 655) and after 4 years (n 20 346). When the means of two 24-h assessments were used, the intra-class correlation coefficients (ICC) for macronutrients varied from 0·63 for alcohol to 0·36 for polyunsaturated fat. The ICC for food groups also varied from 0·68 for fruit to 0·18 for fish. The ICC for the FFQ varied from 0·66 for meat and fruit to 0·48 for bread and cereals. The reproducibility was higher for vegetarian status (κ > 0·80) than for the Mediterranean dietary pattern (ICC = 0·45). Overall, the reproducibility of pairs of 24-h dietary assessments and single FFQ used in the UK Biobank were comparable with results of previous prospective studies using conventional methods. Analyses of diet–disease relationships need to correct for both measurement error and within-person variability in dietary intake in order to reliably assess any such associations with disease in the UK Biobank.
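The intra-class correlation coefficients reported above can be computed from a one-way random-effects ANOVA decomposition: between-subject variance as a share of total variance across repeated assessments. A minimal sketch under that assumption; the function name and toy data are illustrative, not UK Biobank values.

```python
def icc_oneway(rows):
    """One-way random-effects ICC(1,1) for n subjects with k repeat
    measurements each (e.g. repeated 24-h dietary assessments)."""
    n, k = len(rows), len(rows[0])
    grand = sum(sum(r) for r in rows) / (n * k)
    means = [sum(r) / k for r in rows]
    # Mean squares between and within subjects
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((v - m) ** 2
                    for r, m in zip(rows, means) for v in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Identical repeats give perfect reproducibility (ICC = 1.0);
# within-person noise pulls the ICC toward 0.
print(icc_oneway([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]))  # → 1.0
```

The within-person variability this captures is exactly what diet–disease analyses must correct for, as the abstract notes.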
OBJECTIVES/SPECIFIC AIMS: Delirium, a form of acute brain dysfunction characterized by changes in attention and alertness, is a known independent predictor of mortality in the Intensive Care Unit (ICU). We sought to understand whether catatonia, a more recently recognized form of acute brain dysfunction, is associated with increased 30-day mortality in critically ill older adults. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush Francis Catatonia Rating Scale. Coma was defined as a Richmond Agitation-Sedation Scale score of −4 or −5. We used a Cox proportional hazards model predicting 30-day mortality after adjusting for delirium, coma and catatonia status. RESULTS/ANTICIPATED RESULTS: We enrolled 335 medical, surgical or trauma critically ill patients with 1103 matched delirium and catatonia assessments. Median age was 58 years (IQR: 48–67). Main indications for admission to the ICU included airway disease or protection (32%; N=100) and sepsis and/or shock (25%; N=79). In the unadjusted analysis, regardless of the presence of catatonia, non-delirious individuals had the highest median survival times, while delirious patients had the lowest. Comparing the absence and presence of catatonia, the presence of catatonia worsened survival (Figure 1). In a time-dependent Cox model, holding catatonia status constant, delirious individuals had 1.72 times the hazard of death (95% CI: 1.321, 2.231) relative to non-delirious individuals, while those with coma had 5.48 times the hazard of death (95% CI: 4.298, 6.984). For DSM-5 catatonia scores, a 1-unit increase in the score was associated with 1.18 times the hazard of in-hospital mortality.
Comparing two individuals with the same delirium status, each additional DSM-5 catatonia item present multiplies the hazard of death by 1.178 (95% CI: 1.086, 1.278), so an individual with 3 catatonia items present (catatonia) has 1.63 times the hazard of death of an individual with a score of 0 (no catatonia). DISCUSSION/SIGNIFICANCE OF IMPACT: Non-delirious individuals have the highest median survival times, while those who are comatose have the lowest median survival times after a critical illness, holding catatonia status constant. Comparing the absence and presence of catatonia, the presence of catatonia seems to worsen survival. Those individuals who are both comatose and catatonic have the lowest median survival time.
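In a Cox model, hazards multiply across units of a covariate, which is how the per-sign hazard ratio above yields the three-sign figure. A quick check of that arithmetic:

```python
# Each additional DSM-5 catatonia sign multiplies the hazard of death
# by the per-unit hazard ratio reported above (1.178).
hr_per_sign = 1.178
hr_three_signs = hr_per_sign ** 3  # three signs vs. none

print(round(hr_three_signs, 2))  # → 1.63
```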
Chemical bonding in native oxides of GaAs, before and after etching, is detected by X-Ray Photoelectron Spectroscopy (XPS). It is correlated with surface energy engineering (SEE), measured via Three Liquid Contact Angle Analysis (3LCAA), and oxygen coverage, measured by High Resolution Ion Beam Analysis (HR-IBA).
Before etching, GaAs native oxides are found to be hydrophobic with an average surface energy, γT, of 33 ± 1 mJ/m², as measured by 3LCAA. After dilute NH4OH etching, GaAs becomes highly hydrophilic and its surface energy, γT, increases by a factor of 2 to a reproducible value of 66 ± 1 mJ/m². Using HR-IBA, oxygen coverage on GaAs is found to decrease from 7.2 ± 0.5 monolayers (ML) to 3.6 ± 0.5 ML. The 1.17 ratio of Ga to As, measured by HR-IBA, remains constant after etching.
XPS is used to measure oxidation of Ga and As, as well as surface stoichiometry on two locations of several GaAs(100) wafers before and after etching. The relative proportions of Ga and As are unaffected by adventitious carbon contamination. The 1.16 Ga:As ratio, measured by XPS, matches HR-IBA analysis. The proportions of oxidized Ga and As do not change significantly after etching. However, the initial ratio of As2O5 to As2O3, within the oxidized As, significantly decreases after etching from approximately 3:1 to 3:2.
Absolute oxygen coverage, as a function of surface processing, is determined within 0.5 ML by HR-IBA. XPS offers insight into these modifications by detecting electronic states and phase composition changes of GaAs oxides. The changes in surface chemistry are correlated to changes in hydro-affinity and surface energies measured by 3LCAA.
As a result of EU Directive 2003/30/EC, both the biocomponent ratio in liquid fuels and the use of renewable resources for fuel in the transport sector need to be increased. Based on radiocarbon (14C) measurements, it should be relatively simple and fast to measure the weight percentages of fossil and biological sources by accelerator mass spectrometry (AMS), as recommended in the ASTM D 6866-12 and EN 16640 standards. In this study, a relatively easy and fast sample preparation and measurement method based on AMS was developed at the Hertelendi Laboratory of Environmental Studies (HEKAL) using reference samples from the Hungarian MOL Nyrt. oil company. Considering the recent EU regulation for mixing rates of liquid fuels in the transport sector (0.7–2% biofuel content) and the projected higher rates (2–10% biofuel content), the method is applicable to determine fatty acid methyl ester (FAME) and/or hydrotreated vegetable oil (HVO) derived proportions of fuel blends with a 1σ uncertainty better than ±0.3% m/m.
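The logic behind 14C-based biofuel quantification is that fossil carbon contains no 14C (0 pMC, percent modern carbon), while recently grown biomass sits near a modern reference value, so the biogenic fraction follows from a simple ratio. A minimal sketch of that calculation; the reference value of 102 pMC is an assumption (it shifts with the harvest year of the biomass due to bomb carbon), and the sample value is illustrative.

```python
def biogenic_fraction(pmc_sample: float, pmc_bio_ref: float = 102.0) -> float:
    """Fraction of carbon from biological sources: fossil carbon has
    0 pMC; fresh biomass is near the reference pMC value (assumed here)."""
    return pmc_sample / pmc_bio_ref

# An illustrative fuel blend measuring 2.04 pMC would contain
# roughly 2 % biogenic carbon, within the 0.7-2 % range above.
print(round(biogenic_fraction(2.04) * 100, 1))  # → 2.0
```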
Method of levels (MOL) is an innovative transdiagnostic cognitive therapy with potential advantages over existing psychological treatments for psychosis.
The Next Level study is a feasibility randomised controlled trial (RCT) of MOL for people experiencing first-episode psychosis. It aims to determine the suitability of MOL for further testing in a definitive trial (trial registration ISRCTN13359355).
The study uses a parallel group non-masked feasibility RCT design with two conditions: (a) treatment as usual (TAU) and (b) TAU plus MOL. Participants (n = 36) were recruited from early intervention in psychosis services. Outcome measures are completed at baseline, 10 and 14 months. The primary outcomes are recruitment and retention.
Participants’ demographic and clinical characteristics are presented along with baseline data.
Next Level has recruited to target, providing evidence that it is feasible to recruit to an RCT of MOL for first-episode psychosis.
Two radiocarbon (14C) excursions found in the Late Holocene (AD 774–775 and AD 993–994) were caused by short-term increases in incoming cosmic rays and are widely explained as the result of extreme solar proton events (SPEs). In addition, a larger event has been reported at 5480 BC (Miyake et al. 2017a), which is attributed to a special mode of a grand solar minimum, as well as another at 660 BC (Park et al. 2017). Clearly, other events must exist, but could have different causes. In order to detect more such possible events, we have identified periods when the 14C increase rate is rapid and large in the international radiocarbon calibration (IntCal) data (Reimer et al. 2013). In this paper, we follow on from previous studies and identify a possible excursion starting at 814–813 BC, which may be connected to the beginning of a grand solar minimum associated with the start of the Hallstatt period, characterized by relatively constant 14C ages from 800–400 BC. We compare results of annual 14C measurements from tree rings of sequoia (California) and cedar (Japan) with other identified excursions, as well as with geomagnetic data. We note that the structure of the increase from 813 BC is similar to that at 5480 BC, suggesting a related origin. We also assess whether there are different kinds of events that may be observed, consistent with different types of solar phenomena or other explanations.
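The screening step described above, flagging periods where the 14C increase is both rapid and large, can be sketched as a simple scan over a calibration series. The data and threshold below are synthetic illustrations, not IntCal values.

```python
def find_excursions(years, d14c, min_rise=3.0):
    """Return the years at which the 14C signal (e.g. Δ14C in per mil)
    rises by at least min_rise relative to the previous year."""
    return [years[i] for i in range(1, len(d14c))
            if d14c[i] - d14c[i - 1] >= min_rise]

# Synthetic series with a sharp single-year rise at -813 (i.e. 813 BC):
years = [-816, -815, -814, -813, -812]
d14c = [0.0, 0.2, 0.1, 4.1, 4.3]
print(find_excursions(years, d14c))  # → [-813]
```

In practice the screen would also require the rise to persist over several years to separate genuine excursions from measurement noise.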
A comparative study was undertaken to adopt and evaluate a radiocarbon (14C) preparation procedure for accelerator mass spectrometry (AMS) measurements of cremated bones at our laboratory, including different types of archaeological samples (cremated bone, bone, charcoal, charred grain). All 14C analyses were performed using the EnvironMICADAS AMS instrument at the Hertelendi Laboratory of Environmental Studies (HEKAL) and the ancillary analyses were also performed at the Institute for Nuclear Research (ATOMKI). After the physical and chemical cleaning of cremated bones, CO2 was extracted by acid hydrolysis followed by sealed-tube graphitization and 14C measurement. The supplementary δ13C measurements were also performed on CO2 gas while FTIR was measured on the powder fraction. Based on the FTIR and 14C analyses, our chemical pretreatment protocol was successful in removing contamination from the samples. Good reproducibility was obtained for the 0.2–0.3 mm fraction of blind-tested cremated samples and a maximum age difference of only 150 yr was found for the remaining case studies. This confirms the reliability of our procedure for 14C dating of cremated bones. However, in one case study, the age difference of 300 yr between two cremated fragments originating from the same urn shows that other processes affecting the cremated samples in the post-burial environment can substantially influence the 14C age, so caution must be exercised.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush Francis Catatonia Rating Scale. Measures of association (OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age.
Peak delirium risk was for patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium by age after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
OBJECTIVES/SPECIFIC AIMS: Drug development is a common research pursuit for basic and clinical scientists that interfaces diagnostic/therapeutic challenges with funding agencies, the pharmaceutical industry, regulatory systems, and education. The University at Buffalo Clinical and Translational Science Institute (CTSI) has implemented a Drug Development Core (DDC) with goals to foster team science and collaboration, optimize laboratory use, and network investigators. Our goals are to foster collaborations within the region and with other CTSAs. METHODS/STUDY POPULATION: The DDC met with 300 potential investigators from 14 departments and several local companies. There were 35 portal requests from 15 departments and 7 companies; 8 were from training programs. For 28 requests, a reviewer provided consultation, while 7 required discussions and review of data. The DDC assisted with 15 grant applications (outcomes pending), 10 industry-related new drug development requests and 1 regulatory review. Curriculum reviews noted overlap and gaps. Cross-institute opportunities for M.D.-Ph.D. research mentoring were identified. DISCUSSION/SIGNIFICANCE OF IMPACT: The CTSI DDC was well received by investigators.
The request process fosters collaboration among researchers with similar interests and identifies core laboratory resources that add innovation to ongoing research, funding applications, education, and interinstitutional planning.
Digital literacy has been cited as one of the primary challenges to ensuring data reuse and increasing the value placed on open science. Incorporating published data into classrooms and training is at the core of tackling this issue. This article presents case studies in teaching with different published data platforms, in three different countries (the Netherlands, Canada, and the United States), to students at different levels and with differing skill levels. In outlining their approaches, successes, and failures in teaching with open data, it is argued that collaboration with data publishers is critical to improving data reuse and education. Moreover, increased opportunities for digital skills training and scaffolding across program curriculum are necessary for managing the learning curve and teaching students the values of open science.
It is not clear how to effectively recruit healthy research volunteers.
We developed an electronic health record (EHR)-based algorithm to identify healthy subjects, who were randomly assigned to receive an invitation to join a research registry via the EHR’s patient portal, letters, or phone calls. A follow-up survey assessed contact preferences.
The EHR algorithm accurately identified 858 healthy subjects. Recruitment rates were low, but occurred more quickly via the EHR patient portal than letters or phone calls (2.7 vs. 19.3 or 10.4 d). Effort and costs per enrolled subject were lower for the EHR patient portal (3.0 vs. 17.3 or 13.6 h, $113 vs. $559 or $435). Most healthy subjects indicated a preference for contact via electronic methods.
Healthy subjects can be accurately identified from EHR data, and it is faster and more cost-effective to recruit healthy research volunteers using an EHR patient portal.