There is controversy regarding whether the addition of cover gowns offers a substantial benefit over gloves alone in reducing personnel contamination and preventing pathogen transmission.
Objective:
To evaluate the efficacy of different types of barrier precautions and to identify routes of transmission.
Design:
Simulated patient care interactions.
Methods:
In a randomly ordered sequence, 30 personnel each performed 3 standardized examinations of mannequins contaminated with pathogen surrogate markers (cauliflower mosaic virus DNA, bacteriophage MS2, nontoxigenic Clostridioides difficile spores, and a fluorescent tracer) while wearing no barriers, gloves alone, or gloves plus gowns, followed by examination of a noncontaminated mannequin. We compared the frequency and routes of transfer of the surrogate markers to the second mannequin or the environment.
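To make the comparison concrete, each odds ratio reported below boils down to a 2×2 table of transfer events by barrier condition. The following Python sketch uses invented counts purely for illustration; the study's actual models, which handled repeated measures across simulations, would be more elaborate:

```python
# Hypothetical 2x2 table: rows are barrier conditions, columns are whether a
# surrogate marker was transferred by hands (yes, no). All counts are invented.
from scipy.stats import fisher_exact

table = [[1, 29],    # gloves alone: 1 of 30 simulations with hand transfer
         [25, 5]]    # no barriers: 25 of 30 simulations with hand transfer

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.3f}, P = {p_value:.2g}")
```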
Results:
For a composite of all surrogate markers, transfer by hands occurred at significantly lower rates in the gloves-alone group (OR, 0.02; P < .001) and the gloves-plus-gown group (OR, 0.06; P = .002) than in the no-barriers group. Transfer by stethoscope diaphragms was common in all groups and was reduced by wiping the stethoscope between simulations (OR, 0.06; P < .001). Compared with the no-barriers group, wearing a cover gown and gloves reduced contamination of clothing (OR, 0.15; P < .001), whereas wearing gloves alone did not.
Conclusions:
Wearing gloves alone or gloves plus gowns reduces hand transfer of pathogens but may not address transfer by devices such as stethoscopes. Cover gowns reduce the risk of contaminating the clothing of personnel.
Critical shortages of personal protective equipment, especially N95 respirators, during the coronavirus disease 2019 (COVID-19) pandemic continue to be a source of concern. Novel methods of N95 filtering facepiece respirator decontamination that can be scaled up for in-hospital use can help address this concern and keep healthcare workers (HCWs) safe.
Methods:
A multidisciplinary pragmatic study was conducted to evaluate the use of an ultrasonic room high-level disinfection system (HLDS) that generates aerosolized peracetic acid (PAA) and hydrogen peroxide for decontamination of large numbers of N95 respirators. A cycle duration that consistently achieved disinfection of N95 respirators (defined as a ≥6 log10 reduction in bacteriophage MS2 and Geobacillus stearothermophilus spores inoculated onto respirators) was identified. The treated masks were assessed for changes in their hydrophobicity, material structure, strap elasticity, and filtration efficiency. Off-gassing of PAA and hydrogen peroxide from treated masks was also assessed.
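For reference, the ≥6 log10 reduction criterion compares organisms inoculated onto a respirator with organisms recovered after treatment. A minimal sketch, using hypothetical counts and a detection-limit convention that may differ from the study's protocol:

```python
import math

def log10_reduction(recovered_before, recovered_after, detection_limit=1.0):
    """Log10 reduction in viable organisms; non-detects are set to the
    detection limit, a common (conservative) convention."""
    after = max(recovered_after, detection_limit)
    return math.log10(recovered_before / after)

# Hypothetical: 2e7 PFU of MS2 inoculated, none recovered after treatment.
print(f"{log10_reduction(2e7, 0):.1f} log10 reduction")  # 7.3 at this detection limit
```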
Results:
The PAA room HLDS was effective for disinfection of bacteriophage MS2 and G. stearothermophilus spores on respirators in a 2,447 cubic-foot (69.6 cubic-meter) room with an aerosol deployment time of 16 minutes and a dwell time of 32 minutes. The total cycle time was 1 hour and 16 minutes. After 5 treatment cycles, no adverse effects were detected on filtration efficiency, structural integrity, or strap elasticity. There was no detectable off-gassing of PAA and hydrogen peroxide from the treated masks at 20 and 60 minutes after the disinfection cycle, respectively.
Conclusion:
The PAA room disinfection system provides a rapidly scalable solution for in-hospital decontamination of large numbers of N95 respirators during the COVID-19 pandemic.
Background: Patients with methicillin-resistant Staphylococcus aureus (MRSA) colonization often shed MRSA, resulting in contamination of surfaces in their room. It is not known whether MRSA-colonized patients also frequently contaminate surfaces during medical appointments and other activities outside their room. Methods: We conducted an observational cohort study of MRSA-colonized long-term care facility (LTCF) residents to determine the frequency and mechanisms of contamination of surfaces outside patient rooms. Nares, skin, and clothing of patients in contact precautions for MRSA were cultured for MRSA, and high-touch surfaces in the residents’ rooms were contaminated with live bacteriophage MS2 and cauliflower mosaic virus DNA. The participants were observed during activities and medical appointments outside their rooms for 3 days, and sites that were contacted were sampled for recovery of MRSA, bacteriophage MS2, and cauliflower mosaic virus DNA. Results: As shown in Fig. 1, bacteriophage MS2 and cauliflower mosaic virus DNA were transferred to 1 or more surfaces outside the resident’s room by 5 of the 7 participants, and MRSA was recovered from surfaces touched by 6 (86%) participants. MRSA was recovered during 16 of 35 episodes (46%) in which sampling was performed, and recovery was similar for medical appointments (eg, hemodialysis, physical therapy) and nonmedical activities (eg, using the dining room or activity center). Moreover, MRSA, MS2, and the viral DNA marker were recovered both from sites contacted only by participants’ hands and from sites contacted only by clothing. Bacteriophage MS2 and the viral DNA marker were also recovered from portable equipment and from the nursing station. Conclusions: MRSA-colonized LTCF residents frequently disseminated MRSA and viral surrogate markers to surfaces outside their rooms through contact with contaminated hands and clothing. Efforts to reduce contamination of hands and clothing might reduce the risk for pathogen transmission.
The Cal-DSH Diversion Guidelines provide 10 general guidelines that jurisdictions should consider when developing diversion programs for individuals with a serious mental illness (SMI) who become involved in the criminal justice system. Screening for SMI in a jail setting is reviewed. In addition, important treatment interventions for SMI and substance use disorders are highlighted, along with the need to address criminogenic risk factors.
Background: Barrier precautions (eg, gloves and gowns) are often used in clinical settings to reduce the risk for transmission of healthcare-associated pathogens. However, uncertainty persists regarding the efficacy of different types of barrier precautions in preventing transmission. Methods: We used simulated patient care interactions to compare the effectiveness of different levels of barrier precautions in reducing transfer of pathogen surrogate markers. Overall, 30 personnel performed standardized examinations of contaminated mannequins while wearing either no barriers, gloves, or gloves plus cover gowns, followed by examination of a noncontaminated mannequin; the order of the barrier precautions was randomly assigned. Participants used their usual technique for hand hygiene, stethoscope cleaning, and protective equipment removal. The surrogate markers included cauliflower mosaic virus DNA, bacteriophage MS2, nontoxigenic Clostridium difficile spores, and a fluorescent tracer. We compared the frequency and route of transfer of each of the surrogate markers to the second mannequin or to the surrounding environment. Results: As shown in Fig. 1, wearing gloves alone or gloves plus gowns significantly reduced transfer of each of the surrogate markers by the hands of participants (P < .05 for each marker). However, wearing gloves or gloves plus gowns only modestly reduced transfer by stethoscopes, despite cleaning of stethoscopes between exams by approximately half of the participants. Contamination of the clothing of participants was significantly reduced in the gloves-plus-gown group versus the gloves-only and no-barriers groups (P < .05). Conclusion: Barrier precautions are effective in reducing hand transfer of pathogens from patient to patient, but transfer may still occur via devices such as stethoscopes. Cover gowns reduce the risk for contamination of the clothing of personnel.
Funding: Centers for Disease Control and Prevention.
Here we report the findings from excavations at the open-air Middle Palaeolithic site of Alapars-1 in central Armenia. Three stratified Palaeolithic artefact assemblages were found within a 6-m-thick alluvial-aeolian sequence, located on the flanks of an obsidian-bearing lava dome. Combined sedimentological and chronological analyses reveal three phases of sedimentation and soil development. During Marine Oxygen Isotope Stages 5–3, the manner of deposition changes from alluvial to aeolian, with a development of soil horizons. Techno-typological analysis and geochemical sourcing of the obsidian artefacts reveal differential discard patterns, source exploitation, and artefact densities within strata, suggesting variability in technological organization during the Middle Palaeolithic. Taken together, these results indicate changes in hominin occupation patterns from ephemeral to more persistent in relation to landscape dynamics during the last interglacial and glacial periods in central Armenia.
On coronavirus disease 2019 (COVID-19) wards, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) nucleic acid was frequently detected on high-touch surfaces, floors, and socks inside patient rooms. Contamination of floors and shoes was common outside patient rooms on the COVID-19 wards but decreased after improvements in floor cleaning and disinfection were implemented.
Impairments in physical activity and functional independence affect the early rehabilitation of stroke survivors. A good instrument for assessing activity limitation therefore enables accurate measurement of physical disability and assists in determining prognosis.
Objective:
To compare three assessment tools for physical activity in acute-phase stroke survivors.
Methods:
We conducted this prospective observational study at a university-affiliated hospital in Shanghai, China, from June 2018 to November 2019. We administered three instruments to all patients during post-stroke days 5–7: the Modified Barthel Index (MBI), Instrumental Activities of Daily Living (IADL), and modified Rankin Scale (mRS). We analyzed correlations among these scales and the National Institutes of Health Stroke Scale (NIHSS) using Spearman’s rank-order correlation. Univariate analyses were performed using the Mann–Whitney U test. We used a binary logistic regression model to assess the association between the day-30 NIHSS and patient-related variables. Finally, we used receiver operating characteristic (ROC) curves to assess the predictive value of the multivariate regression models.
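To make this pipeline concrete, the sketch below runs the same sequence of steps (Spearman correlation, Mann–Whitney U test, binary logistic regression, ROC analysis) on synthetic data; the variable names and effect sizes are invented and do not reflect the study’s dataset:

```python
import numpy as np
from scipy.stats import spearmanr, mannwhitneyu
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 120
nihss = rng.integers(0, 25, n).astype(float)                  # NIHSS at days 5-7 (synthetic)
mbi = np.clip(100 - 3 * nihss + rng.normal(0, 8, n), 0, 100)  # MBI score (synthetic)
severe_30d = (nihss > 12).astype(int)                         # stand-in for day-30 severity

rho, p_rho = spearmanr(mbi, nihss)                            # rank-order correlation
_, p_u = mannwhitneyu(mbi[severe_30d == 1], mbi[severe_30d == 0])  # univariate test
model = LogisticRegression().fit(mbi.reshape(-1, 1), severe_30d)   # binary logistic model
auc = roc_auc_score(severe_30d, model.predict_proba(mbi.reshape(-1, 1))[:, 1])
print(f"rho={rho:.2f} (p={p_rho:.3g}); Mann-Whitney p={p_u:.3g}; AUC={auc:.2f}")
```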
Results:
The three instruments were highly intercorrelated, and of the three, the MBI showed the strongest correlation with the NIHSS (days 5–7). The day-30 NIHSS was correlated with thrombolysis. ROC analysis revealed that the mRS-measured disability level had the highest predictive value for short-term (30-day) stroke severity.
Conclusion:
The MBI was the best scale for measuring disability in physical activity, whereas the mRS showed better accuracy in short-term prediction of stroke severity.
Adults with congenital heart disease (CHD) have reduced work participation rates compared with adults without CHD. We aimed to quantify the employment rate among adult CHD patients in a population-based registry and to describe factors and barriers associated with work participation.
Methods:
We retrospectively identified adults with employment information in the North Carolina Congenital Heart Defects Surveillance Network. Employment was defined as any paid work in a given year. Logistic regression was used to examine patients’ employment status during each year.
Results:
The registry included 1,208 adult CHD patients with a health care encounter between 2009 and 2013, of whom 1,078 had ≥1 year of data with known employment status. Overall, 401 patients (37%) were employed in their most recent registry year. On multivariable analysis, the odds of employment decreased with older age and were lower for Black as compared to White patients (odds ratio = 0.78; 95% confidence interval: 0.62, 0.98; p = 0.030), and single as compared to married patients (odds ratio = 0.50; 95% confidence interval: 0.39, 0.63; p < 0.001).
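For readers reproducing this kind of analysis, the sketch below shows how odds ratios and 95% confidence intervals like those above can be obtained from a logistic model; the data frame is a synthetic stand-in, not registry data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "employed": rng.integers(0, 2, n),   # 1 = any paid work in a given year
    "age": rng.integers(18, 65, n),
    "black": rng.integers(0, 2, n),      # 1 = Black, 0 = White (synthetic)
    "single": rng.integers(0, 2, n),     # 1 = single, 0 = married (synthetic)
})

fit = smf.logit("employed ~ age + black + single", data=df).fit(disp=0)
summary = pd.concat([np.exp(fit.params).rename("OR"),
                     np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                    axis=1)  # exponentiated coefficients and CIs on the OR scale
print(summary)
```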
Conclusion:
In a registry where employment status was routinely captured, only 37% of adult CHD patients aged 18–64 years were employed, with older patients, Black patients, and single patients being less likely to be employed. Further work is needed to consider how enhancing cardiology follow-up for adults with CHD can integrate support for employment.
We report two cases of respiratory toxigenic Corynebacterium diphtheriae infection in fully vaccinated, UK-born adults following travel to Tunisia in October 2019. Both patients were successfully treated with antibiotics, and neither received diphtheria antitoxin. Contact tracing was performed following a risk assessment, but no additional cases were identified. This report highlights the importance of maintaining a high index of suspicion for re-emerging infections in patients with a history of travel to high-risk areas outside Europe.
Project management expertise is employed across many professional sectors, including clinical research organizations, to ensure that efforts undertaken by the organization are completed on time and according to specifications and are capable of achieving the needed impact. Increasingly, project leaders (PLs) who possess this expertise are being employed in academic settings to support clinical and preclinical translational research team science. Duke University’s clinical and translational science enterprise has been an early adopter of project management to support clinical and preclinical programs. We review the history and evolution of project management and the PL role at Duke, examine case studies that illustrate their growing value to our academic research environment, and address challenges and solutions to employing project management in academia. Furthermore, we describe the critical role project leadership plays in accelerating and increasing the success of translational team science and team approaches frequently required for systems biology and “big data” scientific studies. Finally, we discuss perspectives from Duke project leadership professionals regarding the training needs and requirements for PLs working in academic clinical and translational science research settings.
Currently, 564,000 Canadians are living with dementia. This number will continue to rise as the population ages. Family physicians play an integral role in the diagnosis and management of patients with dementia. Although studies have examined family physician perspectives on dementia care in the urban setting, much less is known about challenges in rural areas. This study aimed to explore rural family physicians’ experiences in caring for patients with dementia in rural Alberta, Canada. We conducted three semi-structured focus groups with 16 family physicians to evaluate barriers and facilitators to providing care to persons with dementia in three rural communities. We developed focus group questions based on the theoretical domains framework (TDF) and analysed the focus group data using a framework approach. Physician capabilities, opportunities, and motivations appear to play important roles in caring for these patients. These research findings can be used to advance the quality of care for rural dementia patients.
Perinatal mood and anxiety disorders (PMADs) are the most common complication of pregnancy and have been found to have long-term implications for both mother and child. In vulnerable patient populations, such as those served at Denver Health, a federally qualified health center, the prevalence of PMADs is nearly double the nationally reported rate of 15–20%. Nearly 17% of women will be diagnosed with major depression at some point in their lives, and those numbers are twice as high among women who live in poverty. Women also appear to be at higher risk for depression during the child-bearing years. To better address these issues, an Integrated Perinatal Mental Health program was created to screen, assess, and treat PMADs in alignment with national recommendations to improve maternal–child health and wellness. This program was built upon a national model of Integrated Behavioral Health already in place at Denver Health.
Methods
A multidisciplinary team of physicians, behavioral health providers, public health professionals, and administrators was assembled at Denver Health, an integrated hospital and community health care system that serves as the safety-net hospital for the city and county of Denver, CO. This team was brought together to create a universal screen-to-treat process for PMADs in perinatal clinics and to adapt the existing Integrated Behavioral Health (IBH) model into a program better suited to the health system’s obstetric population. Universal prenatal and postnatal depression screening was implemented at the obstetric intake visit, a third-trimester prenatal care visit, and the postpartum visit across the clinical system. At the same time, IBH services were implemented across our health system’s perinatal care system in a stepwise fashion, including our women’s care clinics as well as the family medicine and pediatric clinics. These efforts occurred in tandem to support all patients and staff, enabling a specially trained behavioral health provider (psychologists and licensed clinical social workers) to respond immediately to any positive screen during or after pregnancy.
Results
In August 2014, behavioral health providers were integrated into the women’s care clinics. In January 2015, universal screening for PMADs was implemented throughout the perinatal care system. Screening improved from 0% of women screened at the obstetric care intake visit in August 2014 to >75% of women screened in August 2016. As of January 2016, IBH coverage by a licensed psychologist or licensed clinical social worker exists in 100% of perinatal clinics. In addition, to support sustainability, the ability to bill for same-day visits and to bill, and be reimbursed, for screening and assessment visits continues to improve, making the model self-sustaining for the future.
Conclusion
Implementation of a universal screening process for PMADs alongside the development of an IBH model in perinatal care has led to the creation of a program that is feasible and has the capacity to serve as a national model for improving perinatal mental health in vulnerable populations.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, and perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased, and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (ORs) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). Of the 136 patients, 58 (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age. Peak delirium risk occurred in patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
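The effect-modification test described above can be illustrated as a likelihood ratio test comparing logistic models with and without an age-by-catatonia interaction. A sketch on simulated data, not the enrolled cohort:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

rng = np.random.default_rng(2)
n = 452
df = pd.DataFrame({
    "delirium": rng.integers(0, 2, n),
    "catatonia_signs": rng.integers(0, 6, n),   # Bush-Francis sign count (synthetic)
    "age": rng.integers(40, 85, n),
})

reduced = smf.logit("delirium ~ catatonia_signs + age", data=df).fit(disp=0)
full = smf.logit("delirium ~ catatonia_signs * age", data=df).fit(disp=0)

lr = 2 * (full.llf - reduced.llf)               # likelihood ratio statistic
p = chi2.sf(lr, df=full.df_model - reduced.df_model)
print(f"LR = {lr:.2f}, p = {p:.3f}")
```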
To examine the relationship between parental work characteristics and diet quality among pre-school children in dual-parent households.
Design
Cross-sectional study. Parental work characteristics were measured by the types of combined parental work schedules and work hours. The main outcome variables included meal-eating habits as well as ‘health-conscious food’ and ‘unhealthy non-core food’ dietary patterns derived using principal component analysis. Sociodemographic covariates were adjusted for to reduce confounding and selection bias. A sketch of the pattern-derivation step follows below.
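As an illustration of the dietary-pattern derivation, the sketch below applies principal component analysis to a simulated food-frequency matrix; the dimensions and data are invented, and the real analysis would inspect the loadings to name the patterns:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
food_freq = rng.poisson(2.0, size=(500, 12)).astype(float)  # 500 children x 12 food groups

X = StandardScaler().fit_transform(food_freq)  # standardize food-group frequencies first
pca = PCA(n_components=2).fit(X)               # retain two components as "patterns"

loadings = pca.components_   # which food groups define each pattern
scores = pca.transform(X)    # per-child pattern scores, usable in later regressions
print(pca.explained_variance_ratio_)
```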
Setting
The Taiwan Birth Cohort Study, Taiwan.
Subjects
A population-based sample of 18 046 children.
Results
Multiple regression analyses indicated that compared with having both parents working standard schedules, having at least one parent who worked non-standard schedules was significantly associated with a lower likelihood of a child eating breakfast every day and a higher consumption of unhealthy non-core foods. If only one parent was employed and worked standard schedules, the children demonstrated greater odds of having home-prepared dinner most of the time. The mother’s working long hours was associated with lower odds of eating breakfast every day, more frequent consumption of unhealthy non-core foods and a lower frequency of healthy food consumption.
Conclusions
The findings raise concern that parents’ non-standard work schedules and mothers’ long working hours have negative effects on the diet quality of pre-school children. Policy implications include the need for a multifaceted approach to supporting working parents so as to create healthier food environments.
Considerable efforts have been dedicated to developing strategies to prevent and treat recurrent Clostridium difficile infection (rCDI); however, evidence of the impact of rCDI on patient healthcare utilization and outcomes is limited.
OBJECTIVE
To compare healthcare utilization and 1-year mortality among adults who had rCDI, nonrecurrent CDI, or no CDI.
METHODS
We performed a nested case-control study among adult Kaiser Foundation Health Plan members from September 1, 2001, through December 31, 2013. We identified CDI through the presence of a positive laboratory test result and divided patients into 3 groups: patients with rCDI, defined as CDI in the 14–57 days after initial CDI; patients with nonrecurrent CDI; and patients who never had CDI. We conducted 3 matched comparisons: (1) rCDI vs no CDI; (2) rCDI vs nonrecurrent CDI; (3) nonrecurrent CDI vs no CDI. We followed patients for 1 year and compared healthcare utilization between groups, after matching patients on age, sex, and comorbidity.
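To illustrate the matching step, the sketch below performs greedy 1:1 matching on exact sex and comorbidity with an age caliper; the records are invented, and the study’s actual matching procedure may have differed in its criteria and ratios:

```python
import pandas as pd

# Toy case and comparison records; real matching would run over full cohorts.
cases = pd.DataFrame({"pid": [1, 2], "age": [67, 54],
                      "sex": ["F", "M"], "comorbidity": [3, 1]})
pool = pd.DataFrame({"pid": [10, 11, 12, 13], "age": [66, 70, 55, 53],
                     "sex": ["F", "F", "M", "M"], "comorbidity": [3, 2, 1, 1]})

def match_one(case, pool, age_caliper=2):
    """Greedy 1:1 match on exact sex and comorbidity with an age caliper."""
    eligible = pool[(pool["sex"] == case["sex"])
                    & (pool["comorbidity"] == case["comorbidity"])
                    & ((pool["age"] - case["age"]).abs() <= age_caliper)]
    return None if eligible.empty else eligible.iloc[0]

matches = {}
for _, case in cases.iterrows():
    m = match_one(case, pool)
    if m is not None:
        matches[case["pid"]] = m["pid"]
        pool = pool[pool["pid"] != m["pid"]]   # match without replacement
print(matches)  # pairs case 1 with 10 and case 2 with 12 in this toy example
```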
RESULTS
We found that patients with rCDI consistently have substantially higher levels of healthcare utilization in various settings and greater 1-year mortality risk than both patients who had nonrecurrent CDI and patients who never had CDI.
CONCLUSIONS
Patients who develop an initial CDI are generally characterized by severe underlying illness and excess healthcare utilization. However, patients with rCDI experience even greater adverse consequences of their disease than patients who do not experience recurrence. Our results further support the need for continued emphasis on identifying and using novel approaches to prevent and treat rCDI.
Few decision aids are available for patients with a serious illness who face many treatment and end-of-life decisions. We evaluated the Looking Ahead: Choices for Medical Care When You're Seriously Ill® patient decision aid (PtDA), one component of an early palliative care clinical trial.
Method:
Our participants included individuals with advanced cancer and their caregivers who had participated in the ENABLE (Educate, Nurture, Advise, Before Life Ends) early palliative care telehealth randomized controlled trial (RCT) conducted in a National Cancer Institute-designated cancer center, a U.S. Department of Veterans Affairs medical center, and affiliated outreach clinics in rural New England. ENABLE included six weekly structured sessions for patients and three for family caregivers. Participants watched the Looking Ahead PtDA prior to session 3, which covered content on decision making and advance care planning. Nurse coaches conducted semistructured interviews to obtain feedback from consecutive patient and caregiver participants approximately one week after they viewed the Looking Ahead PtDA program (booklet and DVD).
Results:
Between April 1, 2011, and October 31, 2012, 57 patients (mean age = 64), 42% of whom had lung cancer and 23% gastrointestinal cancer, and 20 caregivers (mean age = 59), 80% of whom were spouses, completed the PtDA evaluation. Participants reported a high degree of satisfaction with the PtDA’s format, length, and clarity. They found the format of using patient interviews “validating.” The key themes were: (1) “the earlier the better” to view the PtDA; and (2) feeling empowered, becoming aware of different options, and sensing an urgency to participate in advance care planning.
Significance of results:
The Looking Ahead PtDA was well received and helped patients with a serious illness realize the importance of prospective decision making in guiding their treatment pathways. We found that this PtDA can help seriously ill patients prior to the end of life to understand and discuss future healthcare decision making. However, systems to routinely provide PtDAs to seriously ill patients are not yet well developed.
The aim of the study was to evaluate the trends in respiratory syncytial virus-related hospitalisations and associated outcomes in children with haemodynamically significant heart disease in the United States of America.
Study design
The Kids’ Inpatient Databases (1997–2012) were used to estimate the incidence of respiratory syncytial virus hospitalisation among children ⩽24 months with or without haemodynamically significant heart disease. Weighted multivariable logistic regression and chi-square tests were used to evaluate the trends over time and factors associated with hospitalisation, comparing eras before and after publication of the 2003 American Academy of Pediatrics palivizumab immunoprophylaxis guidelines. Secondary outcomes included in-hospital mortality, morbidity, length of stay, and cost.
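As an illustration of a weighted logistic analysis in this spirit, the sketch below fits a discharge-weighted model on simulated data; the variable names and weights are invented, and a full analysis of the Kids’ Inpatient Database would also account for its stratified, clustered sampling design:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "rsv_hosp": rng.integers(0, 2, n),        # RSV-related hospitalisation (synthetic)
    "hs_chd": rng.integers(0, 2, n),          # haemodynamically significant heart disease
    "post_guideline": rng.integers(0, 2, n),  # era after the 2003 AAP guidelines
    "dischwt": rng.integers(1, 4, n),         # discharge-level sampling weight (synthetic)
})

fit = smf.glm("rsv_hosp ~ hs_chd * post_guideline", data=df,
              family=sm.families.Binomial(),
              freq_weights=np.asarray(df["dischwt"])).fit()
print(np.exp(fit.params))  # odds ratios for era, disease status, and their interaction
```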
Results
Overall, 549,265 respiratory syncytial virus-related hospitalisations were evaluated, including 2518 (0.5%) in children with haemodynamically significant heart disease. The incidence of respiratory syncytial virus hospitalisation in children with haemodynamically significant heart disease decreased by 36% when comparing the pre- and post-palivizumab guideline eras, versus an 8% decline in children without haemodynamically significant heart disease (p<0.001). Children with haemodynamically significant heart disease had higher rates of respiratory syncytial virus-associated mortality (4.9% versus 0.1%, p<0.001) and morbidity (31.5% versus 3.5%, p<0.001) and longer hospital length of stay (17.9 versus 3.9 days, p<0.001) compared with children without haemodynamically significant heart disease. The mean cost of respiratory syncytial virus hospitalisation in 2009 was $58,166 (95% CI: $46,017–$70,315).
Conclusions
These data provide stakeholders with a means to evaluate the cost–utility of various immunoprophylaxis strategies.
The subsurface exploration of other planetary bodies can be used to unravel their geological history and assess their habitability. On Mars in particular, present-day habitable conditions may be restricted to the subsurface. Using a deep subsurface mine, we carried out a program of extraterrestrial analog research – MINe Analog Research (MINAR). MINAR aims to carry out the scientific study of the deep subsurface and test instrumentation designed for planetary surface exploration by investigating deep subsurface geology, whilst establishing the potential for this technology to be transferred to the mining industry. An integrated multi-instrument suite was used to investigate samples of representative evaporite minerals from a subsurface Permian evaporite sequence, in particular to assess the mineral and elemental variations that provide small-scale regions of enhanced habitability. The instruments used were the Panoramic Camera emulator, Close-Up Imager, Raman spectrometer, Small Planetary Linear Impulse Tool, Ultrasonic drill, and handheld X-ray diffraction (XRD). We present science results from the analog research and show that these instruments can be used to investigate in situ the geological context and mineralogical variations of a deep subsurface environment, and thus habitability, from millimetre to metre scales. We also show that these instruments are complementary. For example, the identification of primary evaporite minerals such as NaCl and KCl, which are difficult to detect by portable Raman spectrometers, can be accomplished with XRD. By contrast, Raman is highly effective at locating and detecting mineral inclusions in primary evaporite minerals. MINAR demonstrates the effective use of a deep subsurface environment for planetary instrument development, understanding the habitability of extreme deep subsurface environments on Earth and other planetary bodies, and advancing the use of space technology in economic mining.