Non-medical opioid use (NMOU) is a growing crisis. Cancer patients at elevated risk of NMOU (+risk) are frequently underdiagnosed. The aim of this paper was to develop a nomogram to predict the probability of +risk among cancer patients receiving outpatient supportive care consultation at a comprehensive cancer center.
The records of 3,588 consecutive patients referred to a supportive care clinic were reviewed. All patients had a diagnosis of cancer and were on opioids for pain. All patients were assessed using the Edmonton Symptom Assessment Scale (ESAS), the Screener and Opioid Assessment for Patients with Pain (SOAPP-14), and the CAGE-AID (Cut Down-Annoyed-Guilty-Eye Opener) questionnaire. “+risk” was defined as an SOAPP-14 score of ≥7. A nomogram was devised from the risk factors retained by the multivariate logistic regression model to estimate the probability of +risk.
Of the 3,588 consults, 731 (20%) were +risk. +risk was significantly associated with gender, race, marital status, smoking status, depression, anxiety, financial distress, morphine equivalent daily dose (MEDD), and CAGE-AID score. The C-index was 0.8. A nomogram was developed and can be accessed at https://is.gd/soappnomogram. For example, for a married male Hispanic patient who never smoked, with ESAS scores for depression = 3, anxiety = 3, and financial distress = 7, a CAGE-AID score of 0, and an MEDD of 20, the total score is 9 + 9 + 0 + 0 + 6 + 10 + 23 + 0 + 1 = 58. A nomogram score of 58 corresponds to a +risk probability of 0.1.
Significance of results
We established a practical nomogram to assess +risk. The application of a nomogram based on routinely collected clinical data can help clinicians identify patients with +risk and positively impact care planning.
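The worked example in the abstract can be reproduced with a short script. This is a minimal sketch: the pairing of point values to risk factors follows the order given in the abstract's example, and the values themselves are read off the published nomogram, so they are illustrative only.

```python
# Hypothetical pairing of nomogram points to risk factors, following the
# order of the worked example (male, Hispanic, married, never smoked,
# ESAS depression = 3, anxiety = 3, financial distress = 7, CAGE-AID = 0,
# MEDD = 20). Point values are illustrative, not a substitute for the nomogram.
example_points = {
    "gender: male": 9,
    "race: Hispanic": 9,
    "marital status: married": 0,
    "smoking status: never": 0,
    "ESAS depression = 3": 6,
    "ESAS anxiety = 3": 10,
    "ESAS financial distress = 7": 23,
    "CAGE-AID = 0": 0,
    "MEDD = 20": 1,
}

total = sum(example_points.values())
print(total)  # 58; the published nomogram maps this total to a +risk probability of 0.1
```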
Alcohol use disorder (AUD) is common and associated with increased risk of suicide.
To examine healthcare utilisation prior to suicide in persons with AUD in a large population-based cohort, which may reveal opportunities for prevention.
A national cohort study was conducted of 6 947 191 adults in Sweden in 2002, including 256 647 (3.7%) with AUD, with follow-up for suicide through 2015. A nested case–control design examined healthcare utilisation among people with AUD who died by suicide and 10:1 age- and gender-matched controls.
In 86.7 million person-years of follow-up, 15 662 (0.2%) persons died by suicide, including 2601 (1.0%) with AUD. Unadjusted and adjusted relative risks for suicide associated with AUD were 8.15 (95% CI 7.86–8.46) and 2.22 (95% CI 2.11–2.34), respectively. Of the people with AUD who died by suicide, 39.7% and 75.6% had a healthcare encounter <2 weeks or <3 months before the index date, respectively, compared with 6.3% and 25.4% of controls (adjusted prevalence ratio (PR) and difference (PD), <2 weeks: PR = 3.86, 95% CI 3.50–4.25, PD = 26.4, 95% CI 24.2–28.6; <3 months: PR = 2.03, 95% CI 1.94–2.12, PD = 34.9, 95% CI 32.6–37.1). AUD accounted for more healthcare encounters within 2 weeks of suicide among men than among women (P = 0.01). Of last encounters, 48.1% were in primary care and 28.9% were in specialty out-patient clinics, mostly for non-psychiatric diagnoses.
Suicide among persons with AUD is often shortly preceded by healthcare encounters in primary care or specialty out-patient clinics. Encounters in these settings are important opportunities to identify active suicidality and intervene accordingly in patients with AUD.
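For illustration, the crude (unadjusted) prevalence ratio and difference for the <2-week window can be computed directly from the percentages reported above; a minimal sketch. Note that the paper's reported PR = 3.86 and PD = 26.4 are adjusted estimates, so they differ from these crude values.

```python
# Crude prevalence ratio (PR) and prevalence difference (PD) for a healthcare
# encounter <2 weeks before the index date, from the abstract's percentages.
p_aud_suicide = 39.7  # % of AUD suicide decedents with an encounter <2 weeks prior
p_controls = 6.3      # % of matched controls

crude_pr = p_aud_suicide / p_controls
crude_pd = p_aud_suicide - p_controls
print(round(crude_pr, 1), round(crude_pd, 1))  # 6.3 33.4
```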
OBJECTIVES/GOALS: We compared the validity of an International Classification of Diseases, Clinical Modification (ICD) algorithm for identifying high-grade cervical intraepithelial neoplasia and adenocarcinoma in situ (together referred to as CIN2+) from ICD 9th revision (ICD-9) and 10th revision (ICD-10) codes. METHODS/STUDY POPULATION: Using Tennessee Medicaid data, we identified cervical diagnostic procedures in 2008-2017 among females aged 18-39 years in Davidson County, TN. Gold-standard cases were pathology-confirmed CIN2+ diagnoses validated by HPV-IMPACT, a population-based surveillance project in catchment areas of five US states. Procedures in the ICD transition year (2015) were excluded to account for implementation lag. We pre-grouped diagnosis and procedure codes by theme. We performed feature selection using least absolute shrinkage and selection operator (LASSO) logistic regression with 10-fold cross-validation and validated models by ICD-9 era (2008-2014, N = 6594) and ICD-10 era (2016-2017, N = 1270). RESULTS/ANTICIPATED RESULTS: Of 7864 cervical diagnostic procedures, 880 (11%) were true CIN2+ cases. LASSO logistic regression selected the strongest features of case status: having codes for a CIN2+ tissue diagnosis, a non-specific CIN tissue diagnosis, a high-grade squamous intraepithelial lesion, receiving a cervical treatment procedure, and receiving a cervical/vaginal biopsy. Features of non-case status were codes for a CIN1 tissue diagnosis, Pap test, and HPV DNA test. The ICD-9 vs ICD-10 algorithms predicted case status with 68% vs 63% sensitivity, 95% vs 94% specificity, 63% vs 64% positive predictive value, 96% vs 94% negative predictive value, 92% vs 89% accuracy, and C-indices of 0.95 vs 0.92, respectively. DISCUSSION/SIGNIFICANCE OF IMPACT: Overall, the algorithm’s validity for identifying CIN2+ case status was similar between coding versions. ICD-9 had slightly better discriminative ability.
Results support a prior study concluding that ICD-10 implementation has not substantially improved the quality of administrative data from ICD-9.
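The validity measures reported above follow the standard confusion-matrix definitions. A minimal sketch, with hypothetical counts (not the study's data):

```python
# Standard diagnostic validity measures from a 2x2 confusion matrix.
def validity_metrics(tp, fp, tn, fn):
    """tp/fp/tn/fn = true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),   # true cases correctly flagged
        "specificity": tn / (tn + fp),   # true non-cases correctly cleared
        "ppv": tp / (tp + fp),           # flagged records that are true cases
        "npv": tn / (tn + fn),           # cleared records that are true non-cases
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts, chosen only to illustrate the calculation.
m = validity_metrics(tp=60, fp=35, tn=860, fn=28)
print({k: round(v, 2) for k, v in m.items()})
```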
Dietary fibre fermentation in humans and monogastric animals is considered to occur in the hindgut, but it may also occur in the lower small intestine. This study aimed to compare ileal and hindgut fermentation in the growing pig fed a human-type diet using a combined in vivo/in vitro methodology. Five pigs (23 (sd 1·6) kg body weight) were fed a human-type diet. On day 15, pigs were euthanised. Digesta from the terminal jejunum and terminal ileum were collected as substrates for fermentation. Ileal and caecal digesta were collected for preparing microbial inocula. Terminal jejunal digesta were fermented in vitro with a pooled ileal digesta inoculum for 2 h, whereas terminal ileal digesta were fermented in vitro with a pooled caecal digesta inoculum for 24 h. Ileal organic matter fermentability (28 %) did not differ from hindgut fermentability (35 %). However, the amount of organic matter fermented was 66 % greater for ileal fermentation than hindgut fermentation (P = 0·04). Total numbers of bacteria in ileal and caecal digesta did not differ (P = 0·09). Differences (P < 0·05) were observed in the taxonomic composition. For instance, ileal digesta contained a 32-fold greater number of the genus Enterococcus, whereas caecal digesta had a 227-fold greater number of the genus Ruminococcus. Acetate synthesis and iso-valerate synthesis were greater (P < 0·05) for ileal fermentation than hindgut fermentation, but propionate, butyrate and valerate synthesis was lower. SCFA were absorbed in the gastrointestinal tract location where they were synthesised. In conclusion, a quantitatively important degree of fermentation occurs in the ileum of the growing pig fed a human-type diet.
This project will work closely with existing service partners involved in street-level services and focus on testing and evaluating three approaches to street-level interventions for youth who are homeless and have moderate or severe mental illness. Youth will be asked to choose their preferred service approach:
Housing First-related initiatives focused on interventions designed to move youth into appropriate and available housing, with ongoing housing supports;
Treatment First initiatives to provide mental health/addiction supports and treatment solutions; and
Housing and Treatment Together, with simultaneous attention to both.
Our primary objective is to understand the service delivery preferences of homeless youth and the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be held to assist in understanding the nature of each service approach, changes that evolve within services, and facilitators of and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes of each choice will provide valuable information about the service options chosen by youth. This will assist in identifying weaknesses in the services offered and inform further development of treatment options that youth will accept.
Determining best practices for managing free farrowing systems is crucial for uptake. Cross-fostering, the exchange of piglets between litters, is routinely performed amongst crate-housed sows. However, cross-fostering can increase fighting amongst the litter and may be more challenging within free farrowing systems as sows have more freedom to respond to cross-fostered piglets. This study compared the effect of either cross-fostering (FOS), or a control of sham-fostering (CON), of four focal piglets per litter on Day 6 postpartum in crates (CRATE) and free farrowing pens (PEN). The post-treatment behavioural responses of sows were recorded (Day 6 = 60 min; Day 7 = 300 min; n = 48), as were the average daily gain (ADG; g/day), total weight gain (TWG; kg) and body lesion scores of focal piglets and their littermates throughout lactation (Day 6, Day 8, Day 11 and Day 26; n = 539) and the post-weaning period (Day 29, Day 32 and Day 60; n = 108). On Day 6, though post-reunion latency to nursing did not differ, latency to successful nursing was longer amongst FOS than CON litters (P < 0.001), more so amongst CRATE FOS than PEN FOS (P < 0.01). On Day 7, PEN FOS sows had fewer successful nursing bouts (P < 0.05) and exhibited decreased lateral (P < 0.01) and increased ventral lying frequencies (P < 0.01) compared to all other housing and treatment combinations. Focal piglet ADG was lower for FOS than CON in the CRATE during Day 6 to Day 8 (P < 0.01) and lower in the PEN during Day 6 to Day 8 (P < 0.001), Day 8 to Day 11 (P < 0.01) and Day 11 to Day 26 (P < 0.05). The TWG of pre-weaned focal piglets (Day 6 to Day 26) was higher amongst CON than FOS litters (P = 0.01). 
Post-weaning, piglet ADG was higher for PEN than CRATE during Day 26 to Day 29 (P < 0.01) and higher for FOS than CON during Day 26 to Day 29 (P < 0.05), Day 29 to Day 32 (P < 0.001) and Day 32 to Day 60 (P < 0.01); thus, TWG was higher for FOS than CON during the weaner period (P = 0.001) and the combined lactation and weaner periods (P = 0.09). In conclusion, sow behaviour was disrupted by cross-fostering in both crates and pens and continued to be disturbed on the following day amongst penned sows. FOS piglets exhibited reduced ADG after cross-fostering, which extended throughout lactation in the pens. However, the increased post-weaning weight gain of FOS piglets meant that their TWG was higher than that of CON piglets, irrespective of the farrowing system used.
In a study conducted in the database of a large commercial healthcare insurer, we previously demonstrated that use of a commercial pharmacogenetic assay for individuals with mood disorders was associated with decreased resource utilization and cost in the 6-month period following use compared to propensity-score-matched controls. We conducted a post hoc analysis to understand variables associated with high cost savings.
The results and methods of the initial study have been described previously. Cases were individuals with mood and anxiety disorders who received a commercial pharmacogenetic assay (Genomind, King of Prussia, PA) to inform pharmacotherapy. A total of 817 tested individuals (cases) with mood and/or anxiety disorders were matched to 2745 controls. Overall costs were estimated to be $1,948 lower in the tested group, a difference largely driven by lower emergency room and inpatient utilization among cases. In the present analysis, the cost difference for cases compared to their matched controls was rank-ordered by decile. High cost savers were arbitrarily defined a priori as the top 20% of savers. Using multivariable modeling techniques, an ordinal logistic regression model was generated in which baseline and follow-up variables were statistically tested for independent associations with high, low, and no cost savings.
Of the cases, 606 (74%) were net cost savers compared to their controls (cost difference <0). High cost savers (n=121) saved on average $10,690 compared to their matched controls. They were statistically more likely to have been diagnosed with bipolar disorder (n=33/121) than low cost savers (n=57/485) or non-savers (n=31/211), and had a lower Charlson Comorbidity Index. High cost savers had a lower mean number of antidepressants in the baseline period (mean=3.16) than non-savers (3.73) but a higher one than low cost savers (2.72) (p<0.05 across groups). In a multivariable model, bipolar disorder, antidepressant count, outpatient visits, and inpatient visits were statistically associated with being a high cost saver; antidepressant count and all-cause inpatient and outpatient visits in the baseline period were inversely associated with cost savings.
Use of a pharmacogenetic assay was associated with cost-savings in the database of a large commercial insurer. Patients with bipolar disorder were more likely to be high cost savers than individuals with other mood and anxiety disorders.
Fear and environmental stressors may negatively affect the welfare of farm animals such as pigs. The present study investigated the effects of music and positive handling on reproductive performance of sows (n = 1014; parity 1 to 8) from a commercial pig farm practicing a batch farrowing system. Every 2 weeks, 56 sows were moved from the gestation unit to conventional-crated farrowing houses 1 week prior to expected farrowing. Treated (T; n = 299) and control (C; n = 715) sows were included in the study. In the farrowing houses, auditory enrichment (music from a radio) was provided to sows of T groups daily from 0600 to 1800 h until the end of lactation. Until the day of farrowing, T sows were additionally subjected, for 15 s per day per sow, to continuous back scratching by one member of farm staff. Litter performance and piglet mortality were recorded and analysed between T and C sows using linear mixed regression models. The number of liveborn piglets (C 13.85 v. T 13.26) and liveborn corrected for fostering (C 13.85 v. T 13.43) was significantly higher (P < 0.05) in C groups compared to the T groups. The number of stillborn piglets was 0.60 and 0.72 in T and C groups, respectively (P > 0.05). With regard to piglet mortality, a linear mixed regression model showed a significant overall effect of treatment in reducing piglet mortality (P < 0.01). Yet, the effect of treatment varied according to litter size (number of liveborn piglets) with a diminishing treatment effect in sows with a high litter size (P < 0.01). Pre-weaning survival was improved in the current study by the combined effect of daily back scratching of sows prior to farrowing and providing music to sows and piglets during lactation. Further research is needed to assess the separate effects of both interventions.
Mycoprotein is a food high in both dietary fibre and non-animal-derived protein. Global mycoprotein consumption is increasing, although its effect on human health has not yet been systematically reviewed. This study aims to systematically review the effects of mycoprotein on glycaemic control and energy intake in humans. A literature search of randomised controlled trials was performed in PubMed, Embase, Web of Science and Google Scholar, supplemented by hand searching. A total of twenty-one studies were identified, of which only five, totalling 122 participants, met the inclusion criteria. All five were acute studies: one reported outcomes on glycaemia and insulinaemia, two reported on energy intake and two reported on all of these outcomes. Data were extracted, and a risk-of-bias assessment was conducted. The results did not show a clear effect of acute mycoprotein intake on blood glucose levels, but they did show a decrease in insulin levels. Acute mycoprotein intake was also shown to decrease energy intake at an ad libitum meal and post-24 h in healthy lean, overweight and obese humans. In conclusion, the acute ingestion of mycoprotein reduces energy intake and insulinaemia, whereas its impact on glycaemia is currently unclear. However, this evidence comes from a very limited number of heterogeneous studies. Further well-controlled studies are needed to elucidate the short- and long-term effects of mycoprotein intake on glycaemic control and energy intake, as well as the mechanisms underpinning these effects.
The prevalence of many diseases in pigs displays seasonal distributions. Despite growing concerns about the impacts of climate change, we do not yet have a good understanding of the role that weather factors play in explaining such seasonal patterns. In this study, national and county-level aggregated abattoir inspection data for England and Wales during 2010–2015 were assessed. Seasonally adjusted relationships between weekly ambient maximum temperature and the prevalence of both respiratory conditions and tail biting detected at slaughter were characterised. The prevalence of respiratory conditions showed cyclical annual patterns, with peaks in the summer months and troughs in the winter months each year. However, there were no obvious associations with either high or low temperatures. The prevalence of tail biting generally increased as temperatures decreased, but associations were not supported by statistical evidence: across all counties there was a relative risk of 1.028 (95% CI 0.776–1.363) for every 1 °C fall in temperature. Whilst the seasonal patterns observed in this study are similar to those reported in previous studies, the lack of statistical evidence for an explicit association with ambient temperature may possibly be explained by the lack of information on the date of disease onset. There is also the possibility that other time-varying factors not investigated here may be driving some of the seasonal patterns.
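If the underlying regression is log-linear in temperature (a common modelling choice, assumed here rather than stated in the abstract), the per-degree relative risk compounds multiplicatively over larger temperature changes. A minimal sketch:

```python
# Compounding a per-degree relative risk over a larger temperature fall,
# assuming a log-linear temperature-risk model (an assumption for illustration).
rr_per_degree_fall = 1.028  # point estimate from the abstract (95% CI 0.776-1.363)
rr_5_degree_fall = rr_per_degree_fall ** 5
print(round(rr_5_degree_fall, 3))  # 1.148
```

Since the confidence interval spans 1, the compounded estimate is equally uncertain; this only illustrates how a per-degree estimate scales.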
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, maximum drinks (R² = 0.47–0.68%, p = 2.0 × 10⁻⁸–1.0 × 10⁻¹⁰), and increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10⁻⁸); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R² = 0.96%, p = 4.8 × 10⁻⁶). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R² = 0.27%, p = 5.5 × 10⁻¹¹), while AUDIT-P PRS was more associated with problem drinking (R² = 0.40%, p = 9.0 × 10⁻⁷). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R² = 0.18%, p < 2.0 × 10⁻¹⁶).
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
In many materials development projects, scientists and research heads make decisions to guide the project direction. For example, scientists may decide which processing steps to use, what elements to include in their material selection, or from what suppliers to source their materials. Research heads may decide whether to invest development effort in reducing the environmental impact or production cost of a material. When making these decisions, it would be helpful to know how those decisions affect the achievable performance of the materials under consideration. Often, these decisions are complicated by trade-offs in performance between competing properties. This paper presents an approach for visualizing and evaluating design spaces, where a design space is defined as the set of possible materials under consideration given specified constraints. This design space visualization approach is applied to two case studies with environmental impact motivations: one in biodegradability for solvents, and the other in sustainable materials sourcing for Li-ion batteries. The results demonstrate how this visualization approach can enable data-driven, quantitative decisions for project direction.
The Supplemental Nutrition Assistance Program (SNAP) serves as the primary tool to alleviate food insecurity in the United States. Its effectiveness has been demonstrated in numerous studies, but the majority of SNAP recipients are still food insecure. One factor behind this is the difference in food prices across the country—SNAP benefits are not adjusted to reflect these differences. Using information from Feeding America's Map the Meal Gap (MMG) project, we compare the cost of a meal by county based on the Thrifty Food Plan (TFP)—which is used to set the maximum SNAP benefit—with the cost of the average meal for low-income food-secure households. We find that the cost of the latter meal is higher than the TFP meal for over 99 percent of the counties. We next consider the reduction in food insecurity if, by county, the maximum SNAP benefit level was set to the cost of the average meal for low-income food-secure households. We find that if this approach were implemented, there would be a decline of 50.9 percent in food insecurity among SNAP recipients at a cost of $23 billion.
Thermal infrared data collected by the Thermal Emission Spectrometer (TES) and Thermal Emission Imaging System (THEMIS) instruments have significantly impacted the understanding of martian surface mineralogy. Spatial/temporal variations in igneous lithologies; the discovery of quartz, carbonates, and chlorides; and the widespread identification of amorphous, silica-enriched materials reveal a planet that has experienced a diversity of primary and secondary geologic processes including igneous crustal evolution, regional sedimentation, aqueous alteration, and glacial/periglacial activity.
Describe common pathogens and antimicrobial resistance patterns for healthcare-associated infections (HAIs) that occurred during 2015–2017 and were reported to the Centers for Disease Control and Prevention’s (CDC’s) National Healthcare Safety Network (NHSN).
Data from central line-associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), ventilator-associated events (VAEs), and surgical site infections (SSIs) were reported from acute-care hospitals, long-term acute-care hospitals, and inpatient rehabilitation facilities. This analysis included device-associated HAIs reported from adult location types, and SSIs among patients ≥18 years old. Percentages of pathogens with nonsusceptibility (%NS) to selected antimicrobials were calculated for each HAI type, location type, surgical category, and surgical wound closure technique.
Overall, 5,626 facilities performed adult HAI surveillance during this period, most of which were general acute-care hospitals with <200 beds. Escherichia coli (18%), Staphylococcus aureus (12%), and Klebsiella spp. (9%) were the 3 most frequently reported pathogens. Pathogens varied by HAI and location type, with oncology units having a distinct pathogen distribution compared to other settings. The %NS for most pathogens was significantly higher among device-associated HAIs than SSIs. In addition, pathogens from long-term acute-care hospitals had a significantly higher %NS than those from general hospital wards.
This report provides an updated national summary of pathogen distributions and antimicrobial resistance among select HAIs and pathogens, stratified by several factors. These data underscore the importance of tracking antimicrobial resistance, particularly in vulnerable populations such as long-term acute-care hospitals and intensive care units.
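The %NS measure used in this report (and the pediatric report that follows) is a simple proportion per pathogen–antimicrobial combination. A minimal sketch with hypothetical counts; the convention that "nonsusceptible" covers intermediate and resistant isolates is an assumption based on common NHSN usage, not stated in the abstract:

```python
# Percent nonsusceptible (%NS) for one pathogen-antimicrobial combination.
# Counts are hypothetical; "nonsusceptible" is assumed to mean intermediate
# or resistant isolates among those tested.
def percent_ns(nonsusceptible: int, tested: int) -> float:
    return 100.0 * nonsusceptible / tested

print(round(percent_ns(nonsusceptible=45, tested=300), 1))  # 15.0
```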
To describe common pathogens and antimicrobial resistance patterns for healthcare-associated infections (HAIs) among pediatric patients that occurred in 2015–2017 and were reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN).
Antimicrobial resistance data were analyzed for pathogens implicated in central line-associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), ventilator-associated pneumonias (VAPs), and surgical site infections (SSIs). This analysis was restricted to device-associated HAIs reported from pediatric patient care locations and SSIs among patients <18 years old. Percentages of pathogens with nonsusceptibility (%NS) to selected antimicrobials were calculated by HAI type, location type, and surgical category.
Overall, 2,545 facilities performed surveillance of pediatric HAIs in the NHSN during this period. Staphylococcus aureus (15%), Escherichia coli (12%), and coagulase-negative staphylococci (12%) were the 3 most commonly reported pathogens associated with pediatric HAIs. Pathogens and the %NS varied by HAI type, location type, and/or surgical category. Among CLABSIs, the %NS was generally lowest in neonatal intensive care units and highest in pediatric oncology units. Staphylococcus spp. were particularly common among orthopedic, neurosurgical, and cardiac SSIs; however, E. coli was more common in abdominal SSIs. Overall, antimicrobial nonsusceptibility was less prevalent in pediatric HAIs than in adult HAIs.
This report provides an updated national summary of pathogen distributions and antimicrobial resistance patterns among pediatric HAIs. These data highlight the need for continued antimicrobial resistance tracking among pediatric patients and should encourage the pediatric healthcare community to use such data when establishing policies for infection prevention and antimicrobial stewardship.
Quaternary processes and environmental changes are often difficult to assess in remote subantarctic islands due to high surface erosion rates and overprinting of sedimentary products in locations that can be a challenge to access. We present a set of high-resolution, multichannel seismic lines and complementary multibeam bathymetry collected off the eastern (leeward) side of the subantarctic Auckland Islands, about 465 km south of New Zealand's South Island. These data constrain the erosive and depositional history of the island group, and they reveal an extensive system of sediment-filled valleys that extend offshore to depths that exceed glacial low-stand sea level. Although shallow, marine, U-shaped valleys and moraines are imaged, the rugged offshore geomorphology of the paleovalley floors and the stratigraphy of the infill sediments suggest that the valley floors were shaped by submarine fluvial erosion, and subsequently filled by lacustrine, fjord, and fluvial sedimentary processes.
A large and growing body of literature has studied consumer willingness to pay (WTP) for local foods in the United States. However, these studies implicitly assume that consumers perceive local foods to have quality superior to that of nonlocal foods. Little is known about WTP for local foods when taking into account differences in consumer perception of food quality between local and nonlocal foods. In this article, we conduct an economic experiment to assess the effect of locally grown information on consumer WTP and quality perceptions of three broccoli varieties (one commercial variety grown in California and two newly developed local varieties). Our results show that consumers rate both the appearance and the taste of the two local broccoli varieties lower than the California variety when evaluating food quality blindly. However, consumers’ evaluations of the two local varieties improve substantially after being told the two varieties are locally grown. Results also indicate that consumers are willing to pay a price premium for the two local varieties after being told that they are locally grown. Our results provide evidence that locally grown information has a positive effect on both consumer WTP and quality perception of local foods.
Autonomous exploration requires the use of movable platforms that carry a payload of instruments with a certain level of autonomy and communication with the operators. This is particularly challenging in subsurface environments, which may be more dangerous for human access and where communication with the surface is limited. Subsurface robotic exploration, which has to date been very limited, is interesting not only for science but also for cost-effective industrial exploitation of resources and safety assessments in mines. Furthermore, it has a direct application to exploration of extra-terrestrial subsurface environments of astrobiological and geological significance such as caves, lava tubes, impact or volcanic craters and subglacial conduits, for deriving in-situ mineralogical resources and establishing preliminary settlements. However, the technological solutions are generally tailor-made and are therefore considered costly, fragile and environment-specific, further hindering their extensive and effective application. To demonstrate the advantages of rover exploration for a broad community, we have developed KORE (KOmpact Rover for Exploration), a low-cost, reusable, multi-purpose rover platform. The rover platform has been developed as a technological demonstration for extra-terrestrial subsurface exploration and terrestrial mining operations pertaining to geomorphological mapping, environmental monitoring, gas leak detection and search and rescue operations in case of an accident. The present paper, the first part of a series of two, focuses on describing the development of a robust rover platform to perform dedicated geomorphological, astrobiological and mining tasks. KORE was further tested in the Mine Analogue Research 6 (MINAR6) campaign during September 2018 in the Boulby mine (UK), the second deepest potash mine in Europe at a subsurface depth of 1.1 km, the results of which will be presented in the second paper of this series.
KORE is a large, semi-autonomous rover weighing 160 kg, with L × W × H dimensions of 1.2 m × 0.8 m × 1 m, a payload capacity of 100 kg and 800 W of traction power enabling a maximum speed of 8.4 km h⁻¹. The rover can be easily dismantled into three parts, facilitating its transportation to any chosen site of exploration. Presently, the main scientific payloads on KORE are: (1) a three-dimensional mapping camera, (2) a methane detection system, (3) an environmental station capable of monitoring temperature, relative humidity, pressure and gases such as NO₂, SO₂, H₂S, formaldehyde, CO, CO₂, O₃, O₂, volatile organic compounds and particulates and (4) a robotic arm. Moreover, the design of the rover allows for the integration of additional sensors as scientific requirements dictate in future expeditions. At the MINAR6 campaign, the technical readiness of KORE was demonstrated during 6 days of scientific research in the mine, with a total of 22 h of operation.