A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including the rationale for setting an RNI (10 µg/day; 400 IU/day) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and a high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescent males/females). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. It is too early to establish whether population vitamin D status has altered since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose-response and dietary modelling studies indicate dairy products, bread, hens’ eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for choice of fortificant. Other considerations for successful fortification strategies include: i) the need for ‘real-world’ cost information for use in modelling work; ii) supportive food legislation; iii) improved consumer and health professional understanding of vitamin D’s importance; iv) improved awareness of the clinical consequences of inadequate vitamin D status; v) consistent communication of Government advice across health/social care professions and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.
Veganism has increased in popularity in the past decade and, despite being a characteristic protected by law, is often viewed negatively by the general population. Little is known about the attitudes of healthcare professionals, despite their potential influence on practice and the care of patients with eating disorders. This is one of the first studies to investigate attitudes toward veganism among specialist eating disorder, general mental health and other professionals.
A one-way ANOVA indicated that all professional groups held positive views toward veganism. General mental health professionals held significantly more positive attitudes toward veganism than specialist eating disorder and other professionals.
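As an illustration only, a one-way ANOVA of this kind can be run in a few lines; the group names and attitude scores below are hypothetical stand-ins, not study data.

```python
# Minimal sketch of a one-way ANOVA comparing mean attitude scores across
# three professional groups. All data here are hypothetical illustrations,
# not values from the study.
from scipy import stats

eating_disorder = [4.1, 3.8, 4.5, 3.9, 4.2]        # hypothetical attitude scores
general_mental_health = [4.9, 5.1, 4.7, 5.0, 4.8]
other_professionals = [4.0, 4.3, 3.7, 4.1, 4.4]

f_stat, p_value = stats.f_oneway(eating_disorder, general_mental_health, other_professionals)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")      # p < .05 would indicate group means differ
```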
As one of the first studies to suggest that eating disorder professionals are not biased against veganism, this work has important clinical practice implications, particularly when exploring motivations for adopting a vegan diet (health, weight loss, environmental or animal welfare concerns) in patients with eating disorders. Implications for further research are provided.
Nucleobases are nitrogenous bases that form a major constituent of the monomers of RNA and DNA, which are an essential part of any cellular life on Earth. The search for nucleobases in the interstellar medium remains a major challenge; however, the recent detection of nucleobases in meteorite samples and their laboratory synthesis in simulated analogue experiments have confirmed their abiotic origin and a possible route for their delivery to the Earth. Nevertheless, cellular life is based on an interacting network of complex structures, and there is a substantial lack of information on the possible routes by which such ordered structures may have formed in the prebiotic environment. In the current study, we present evidence for the synthesis of complex structures by shock processing of nucleobases. The nucleobases were subjected to reflected shock temperatures of 3500–7000 K (estimated) and pressures of about 15–34 bar for a timescale of ~2 ms. Under such extreme thermodynamic conditions, the nucleobase samples experience superheating and subsequent cooling. Electron microscopic studies of the shock-processed residue show that the nucleobases spontaneously form complex structures when subjected to extreme shock conditions. These results suggest that impact shock processes might have contributed to the self-assembly of biologically relevant structures and the origin of life.
Diet is a modifiable risk factor for chronic disease and a potential modulator of telomere length (TL). The study aim was to investigate associations between diet quality and TL in Australian adults after a 12-week dietary intervention with an almond-enriched diet (AED). Participants (overweight/obese, 50–80 years) were randomised to an AED (n 62) or isoenergetic nut-free diet (NFD, n 62) for 12 weeks. Diet quality was assessed using a Dietary Guideline Index (DGI), applied to weighed food records, that consists of ten components reflecting adequacy, variety and quality of core food components and discretionary choices within the diet. TL was measured by quantitative PCR in samples of lymphocytes, neutrophils, and whole blood. There were no significant associations between DGI scores and TL at baseline. Diet quality improved with the AED and decreased with the NFD after 12 weeks (change from baseline AED +9·8 %, NFD −14·3 %; P < 0·001). TL increased in neutrophils (+9·6 bp, P = 0·009), decreased to a trivial extent in whole blood (−12·1 bp, P = 0·001) and was unchanged in lymphocytes. Changes did not differ between intervention groups. There were no significant relationships between changes in diet quality scores and changes in lymphocyte, neutrophil or whole blood TL. The inclusion of almonds in the diet improved diet quality scores but had no impact on TL in mid-age to older Australian adults. Future studies should investigate the impact of more substantial dietary changes over longer periods of time.
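For readers wanting to reproduce this kind of change-score analysis, a minimal sketch follows; the simulated ΔDGI and ΔTL values are illustrative assumptions, not trial data.

```python
# Illustrative sketch: testing whether change in diet quality (dDGI) is
# associated with change in telomere length (dTL, base pairs). Data are
# simulated; the study reported no significant relationship.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
delta_dgi = rng.normal(0, 10, size=124)    # change in DGI score, n = 124 participants
delta_tl = rng.normal(0, 50, size=124)     # change in TL (bp), independent by construction

r, p = stats.pearsonr(delta_dgi, delta_tl)
print(f"r = {r:.3f}, p = {p:.3f}")         # expect a null result with these simulated data
```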
To determine whether cascade reporting is associated with a change in meropenem and fluoroquinolone consumption.
A quasi-experimental study was conducted using an interrupted time series to compare antimicrobial consumption before and after the implementation of cascade reporting.
A 399-bed, tertiary-care Veterans Affairs medical center.
Antimicrobial consumption data across 8 inpatient units were extracted from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) antimicrobial use (AU) module from April 2017 through March 2019, reported as antimicrobial days of therapy (DOT) per 1,000 days present (DP).
Cascade reporting is a strategy of reporting antimicrobial susceptibility test results in which secondary agents are only reported if an organism is resistant to primary, narrow-spectrum agents. A multidisciplinary team developed cascade reporting algorithms for gram-negative bacteria based on local antibiogram and infectious diseases practice guidelines, aimed at restricting the use of fluoroquinolones and carbapenems. The algorithms were implemented in March 2018.
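A minimal sketch of the segmented regression typically used for such an interrupted time series is shown below; the monthly DOT values, the month indexing of the implementation, and all effect sizes are simulated assumptions, not the study's data.

```python
# Sketch of a segmented (interrupted time series) regression for monthly
# antimicrobial consumption, in DOT per 1,000 days present. Only the model
# form mirrors the study design; the numbers are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

months = np.arange(24)                        # Apr 2017 (0) through Mar 2019 (23)
post = (months >= 11).astype(int)             # cascade reporting implemented Mar 2018
time_since = np.where(months >= 11, months - 10, 0)

rng = np.random.default_rng(1)
dot = 25 - 0.1 * months - 2.0 * post - 0.5 * time_since + rng.normal(0, 1, 24)

df = pd.DataFrame({"dot": dot, "time": months, "post": post, "time_since": time_since})
model = smf.ols("dot ~ time + post + time_since", data=df).fit()
print(model.summary().tables[1])              # 'post' = level change, 'time_since' = slope change
```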
Following the implementation of cascade reporting, mean monthly meropenem (P = .005) and piperacillin/tazobactam (P = .002) consumption decreased, and cefepime consumption increased (P < .001). Ciprofloxacin consumption decreased by 2.16 DOT per 1,000 DP per month (SE, 0.25; P < .001). Clostridioides difficile rates did not change significantly.
Ciprofloxacin consumption decreased significantly after the implementation of cascade reporting. Mean meropenem consumption decreased after cascade reporting was implemented, but we observed no significant change in the slope of consumption. Cascade reporting may be a useful strategy to optimize antimicrobial prescribing.
During the Randomized Assessment of Rapid Endovascular Treatment (EVT) of Ischemic Stroke (ESCAPE) trial, patient-level micro-costing data were collected. We report a cost-effectiveness analysis of EVT, using ESCAPE trial data and Markov simulation, from a universal, single-payer system using a societal perspective over a patient’s lifetime.
Primary data collection alongside the ESCAPE trial provided a 3-month, trial-specific, non-model-based cost per quality-adjusted life year (QALY). A Markov model utilizing ongoing lifetime costs and life expectancy from the literature was built to simulate the cost per QALY over a lifetime horizon. Health states were defined using modified Rankin Scale (mRS) scores. Uncertainty was explored using scenario analysis and probabilistic sensitivity analysis.
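A toy version of such a Markov cohort model is sketched below; the states collapse mRS scores into three groups, and every transition probability, cost, and utility is an invented placeholder rather than a value from the ESCAPE analysis.

```python
# Toy Markov cohort model for cost per QALY over a lifetime horizon.
# States collapse mRS into independent / dependent / dead. All inputs
# are invented for illustration only.
import numpy as np

P = np.array([[0.92, 0.03, 0.05],        # annual transition matrix (hypothetical)
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
cost = np.array([2_000.0, 40_000.0, 0.0])    # annual cost per state ($, hypothetical)
utility = np.array([0.85, 0.35, 0.0])        # QALY weight per state (hypothetical)
discount = 0.03

def run_cohort(start, years=40):
    """Accumulate discounted costs and QALYs for a starting mRS distribution."""
    dist, total_cost, total_qaly = start.copy(), 0.0, 0.0
    for t in range(years):
        d = 1.0 / (1.0 + discount) ** t
        total_cost += d * dist @ cost
        total_qaly += d * dist @ utility
        dist = dist @ P
    return total_cost, total_qaly

evt_cost, evt_qaly = run_cohort(np.array([0.55, 0.35, 0.10]))   # more independence after EVT
soc_cost, soc_qaly = run_cohort(np.array([0.35, 0.50, 0.15]))
evt_cost += 25_000                                   # hypothetical up-front procedure cost
icer = (evt_cost - soc_cost) / (evt_qaly - soc_qaly)
print(f"ICER: ${icer:,.0f} per QALY gained")         # negative (with more QALYs) would mean EVT dominates
```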
The 3-month trial-based analysis resulted in a cost per QALY of $201,243 for EVT compared to the best standard of care. In the model-based analysis, using a societal perspective and a lifetime horizon, EVT dominated the standard of care: EVT was both more effective and less costly than the standard of care (−$91). When the time horizon was shortened to 1 year, EVT remained cost-saving compared to standard of care (∼$15,376 per QALY gained with EVT). However, if the estimate of clinical effectiveness is 4% less than that demonstrated in ESCAPE, EVT is no longer cost-saving compared to standard of care.
Results support the adoption of EVT as a treatment option for acute ischemic stroke: the increase in costs associated with caring for EVT patients was recouped within the first year after stroke, and EVT continued to provide cost savings over a patient’s lifetime.
Ice shelves restrain flow from the Greenland and Antarctic ice sheets. Climate-ocean warming could force thinning or collapse of floating ice shelves and subsequently accelerate flow, increase ice discharge and raise global mean sea levels. Petermann Glacier (PG), northwest Greenland, recently lost large sections of its ice shelf, but its response to total ice shelf loss in the future remains uncertain. Here, we use the ice flow model Úa to assess the sensitivity of PG to changes in ice shelf extent, and to estimate the resultant loss of grounded ice and contribution to sea level rise. Our results show that, under several scenarios of ice shelf thinning and retreat, removal of the shelf will not contribute substantially to global mean sea level (<1 mm). We hypothesize that grounded ice loss was limited by stabilization of the grounding line at a topographic high ~12 km inland of its current position. Further inland, a narrow, seaward-sloping fjord suggests that PG is likely to remain insensitive to terminus changes in the near future.
The mental health of third-level students is of major societal concern, with the gap between demand for services and the supports offered now at crisis level. In Ireland, as elsewhere, colleges have responded to this need in vastly differing ways: student counselling services are available in all institutions, and student health departments and sessional psychiatry in some of the larger institutions, but none operates as a single multidisciplinary service. There is increasing recognition of the need for a more systematised approach, with the establishment of international networks, charters and frameworks. These advocate a whole-institution approach to student mental health, in addition to the development of an integrated system of supports with effective pathways to appropriate care. This paper, by members of the Youth and Student Special Interest Group of the College of Psychiatrists of Ireland, contextualises student mental health as it currently stands and describes future directions for this emerging field. It is a call to action to develop a structure that supports the needs of students with mental health problems across the full spectrum from mild to severe.
Hamilcar and Hannibal Barca embody a colossal father-son military legacy. Yet their family – the so-called ‘Barcid’ dynasty – has a murky history. Modern scholars have presumed that Hamilcar, the first notable historical figure to bear the name Barkas, received it as a ‘nickname’ meaning ‘lightning’. The rationale is that the name derives from the Phoenician word brq and is thus the equivalent of the Greek epithet Keraunos. There is, however, no evidence supporting this in the classical sources, to which we owe virtually all our knowledge of events. Furthermore, the name Barca was passed on to Hamilcar’s sons, suggesting an inherited family surname. This article submits an alternative to the widely endorsed ‘lightning’ theory. This new perspective explores the possibility that the Barcid dynasty had roots in the city of Barce in Cyrenaica and was a relatively new addition to the Carthaginian aristocracy in the third century BC. Using textual evidence from Polybius, Diodorus and others, this fresh take clarifies other aspects of the Barcid dynasty’s tumultuous history, such as their animosity towards the Carthaginian Council of Elders and their departure to Spain in the 220s.
Acute ischemic stroke may affect women and men differently. We aimed to evaluate sex differences in outcomes of endovascular treatment (EVT) for ischemic stroke due to large vessel occlusion in a population-based study in Alberta, Canada.
Methods and Results:
Over a 3-year period (April 2015–March 2018), 576 patients met the inclusion criteria of our study and constituted the EVT group of our analysis. The medical treatment group of the ESCAPE trial had 150 patients; thus, our total sample size was 726. We captured outcomes in clinical routine using administrative data and a linked database methodology. The primary outcome of our study was home-time: the number of days within 90 days of the index stroke event that the patient was back at their premorbid living situation without an increase in the level of care. In adjusted analysis, EVT was associated with an increase in 90-day home-time by an average of 6.08 days (95% CI −2.74–14.89, p-value 0.177) in women compared to an average of 11.20 days (95% CI 1.94–20.46, p-value 0.018) in men. Further analysis revealed that the association between EVT and 90-day home-time in women was confounded by age and onset-to-treatment time.
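A minimal sketch of this kind of adjusted model, with an EVT-by-sex interaction and age and onset-to-treatment time as covariates, is given below; the simulated data and coefficients are illustrative only.

```python
# Sketch of the adjusted analysis: 90-day home-time regressed on treatment,
# sex, and their interaction, with age and onset-to-treatment time as
# confounders. All data are simulated, not study records.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 726
df = pd.DataFrame({
    "evt": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "age": rng.normal(70, 12, n),
    "onset_to_tx": rng.normal(180, 60, n),   # minutes, no built-in effect here
})
df["home_time"] = (30 + 11 * df.evt - 5 * df.evt * df.female
                   - 0.3 * (df.age - 70) + rng.normal(0, 20, n)).clip(0, 90)

fit = smf.ols("home_time ~ evt * female + age + onset_to_tx", data=df).fit()
print(fit.params[["evt", "evt:female"]])     # EVT effect in men; additional difference for women
```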
We found a nonsignificant nominal reduction of 90-day home-time gain for women compared to men in this province-wide population-based study of EVT for large vessel occlusion, which was only partially explained by confounding.
Water-filled boreholes in cold ice refreeze in hours to days, and prior attempts to keep them open with antifreeze resulted in a plug of slush effectively freezing the hole even faster. Thus, antifreeze as a method to stabilize hot-water boreholes has largely been abandoned. In the hot-point drilling case, no external water is added to the hole during drilling, so earlier antifreeze injection is possible while the drill continues melting downward. Here, we use a cylindrical Stefan model to explore slush formation within the parameter space representative of hot-point drilling. We find that earlier injection timing creates an opportunity to avoid slush entirely by injecting sufficient antifreeze to dissolve the hole past the drilled radius. As in the case of hot-water drilling, the alternative is to force mixing in the hole after antifreeze injection to ensure that ice refreezes onto the borehole wall instead of within the solution as slush.
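To give a feel for the timescales involved, here is an order-of-magnitude, quasi-steady Stefan estimate of borehole refreezing; the radius and temperature are assumed example values, and the formula is the textbook quasi-stationary result rather than the cylindrical model used in the paper.

```python
# Order-of-magnitude, quasi-steady Stefan estimate of how fast a water-filled
# borehole of radius R refreezes in ice at temperature DT below melting.
# Classic result for inward cylindrical solidification: t_f ~ rho*L*R^2 / (4*k*DT).
RHO_ICE = 917.0      # kg/m^3, ice density
L_FUSION = 334e3     # J/kg, latent heat of fusion
K_ICE = 2.1          # W/(m K), thermal conductivity of ice
R = 0.05             # m, borehole radius (assumed example)
DT = 20.0            # K, melting point minus far-field ice temperature (assumed example)

t_freeze = RHO_ICE * L_FUSION * R**2 / (4.0 * K_ICE * DT)
print(f"Refreeze time: {t_freeze / 3600:.1f} hours")   # ~1.3 h for these values
```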
With human influences driving populations of apex predators into decline, more information is required on how factors affect species at national and global scales. However, camera-trap studies are seldom executed at a broad spatial scale. We demonstrate how uniting fine-scale studies and utilizing camera-trap data of non-target species is an effective approach for broadscale assessments through a case study of the brown hyaena Parahyaena brunnea. We collated camera-trap data from 25 protected and unprotected sites across South Africa into the largest detection/non-detection dataset collected on the brown hyaena, and investigated the influence of biological and anthropogenic factors on brown hyaena occupancy. Spatial autocorrelation had a significant effect on the data, and was corrected using a Bayesian Gibbs sampler. We show that brown hyaena occupancy is driven by specific co-occurring apex predator species and human disturbance. The relative abundance of spotted hyaenas Crocuta crocuta and people on foot had a negative effect on brown hyaena occupancy, whereas the relative abundance of leopards Panthera pardus and vehicles had a positive influence. We estimated that brown hyaenas occur across 66% of the surveyed camera-trap station sites. Occupancy varied geographically, with lower estimates in eastern and southern South Africa. Our findings suggest that brown hyaena conservation is dependent upon a multi-species approach focussed on implementing conservation policies that better facilitate coexistence between people and hyaenas. We also validate the conservation value of pooling fine-scale datasets and utilizing bycatch data to examine species trends at broad spatial scales.
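For orientation, a stripped-down single-season occupancy model (without the covariates or spatial correction used in the study) can be fitted by maximum likelihood as sketched below; all detection histories are simulated.

```python
# Minimal single-season occupancy model (MacKenzie-style) fitted by maximum
# likelihood -- a simplified stand-in for the study's Bayesian occupancy model
# with covariates and spatial autocorrelation. Detection data are simulated.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(7)
n_sites, n_visits = 25, 30
true_psi, true_p = 0.66, 0.2
z = rng.random(n_sites) < true_psi                       # latent occupancy state
y = (rng.random((n_sites, n_visits)) < true_p) & z[:, None]
detections = y.sum(axis=1)

def neg_log_lik(theta):
    psi, p = expit(theta)                                # keep parameters in (0, 1)
    lik_occ = psi * p**detections * (1 - p)**(n_visits - detections)
    lik = np.where(detections > 0, lik_occ, lik_occ + (1 - psi))
    return -np.log(lik).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = expit(fit.x)
print(f"occupancy = {psi_hat:.2f}, detection probability = {p_hat:.2f}")
```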
Background: Updated IDSA-SHEA guidelines recommend different diagnostic approaches to C. difficile depending on whether there are pre-agreed institutional criteria for patient stool submission. If stool submission criteria are in place, nucleic acid amplification testing (NAAT) alone may be used. If not, a multistep algorithm is suggested, incorporating various combinations of toxin enzyme immunoassay (EIA), glutamate dehydrogenase (GDH), and NAAT, with discordant results adjudicated by NAAT. At our institution, we developed a multistep algorithm leading with NAAT, with reflex to EIA toxin testing if the NAAT is positive. This algorithm resulted in a significant proportion of patients with discordant results (NAAT positive and toxin EIA negative) that some experts have categorized as possible carriers or C. difficile colonized. In this study, we describe the impact of a multistep algorithm on hospital-onset, community-onset, and healthcare-facility–associated C. difficile infection (HO-CDI, CO-CDI, and HCFA-CDI, respectively) rates and on the management of possible carriers.

Methods: The study setting was a 399-bed, tertiary-care VA medical center in Richmond, Virginia. A retrospective chart review was conducted. The multistep C. difficile testing algorithm was implemented June 4, 2019 (Fig. 1). C. difficile testing results and possible carriers were reviewed for the 5 months before and 4 months after implementation (January 2019 to September 2019).

Results: In total, 587 NAATs were performed in the inpatient and outpatient setting (mean, 58.7 per month). Overall, 123 NAATs (21%) were positive: 59 in the preintervention period and 63 in the postintervention period. In the postintervention period, 23 positive NAATs (26%) had a positive toxin EIA. Based on LabID events, the mean rate of HO+CO+HCFA CDI cases per 10,000 bed days of care (BDOC) decreased significantly, from 9.49 in the preintervention period to 1.15 in the postintervention period (P = .019) (Fig. 2). Also, 9 of the possible carriers (22%) were treated for CDI based on high clinical suspicion, and 6 of the possible carriers (14%) had a previous history of CDI; of these, 5 (83%) were treated for CDI. In addition, 1 patient (2%) converted from possible carrier to positive toxin EIA within 14 days. The infectious diseases team was consulted for 11 possible carriers (27%).

Conclusions: Implementation of a two-step C. difficile algorithm leading with NAAT was associated with a lower rate of HO+CO+HCFA CDI per 10,000 BDOC. A considerable proportion (22%) of possible carriers were treated for CDI but did not count as LabID events. Only 2% of the possible carriers in our study converted to a positive toxin EIA.
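The decision logic of the two-step algorithm can be summarized in a few lines of Python; the category labels are paraphrases, not the institution's exact reporting language.

```python
# Sketch of the two-step testing algorithm described above: lead with NAAT,
# reflex to toxin EIA only when NAAT is positive, and flag NAAT+/EIA-
# discordants as possible carriers. Labels are paraphrased.
def classify_cdiff(naat_positive: bool, toxin_eia_positive: bool | None = None) -> str:
    if not naat_positive:
        return "negative - no further testing"
    # NAAT positive: the reflex toxin EIA result decides the category
    if toxin_eia_positive:
        return "CDI (toxin-confirmed)"
    return "possible carrier / colonized (NAAT+, toxin EIA-)"

print(classify_cdiff(naat_positive=True, toxin_eia_positive=False))
```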
The review aimed to identify factors influencing opioid prescribing as regular pain-management medication for older people.
Chronic pain occurs in 45%–85% of older people but appears to be under-recognised and under-treated. Nevertheless, strong opioid prescribing is more prevalent among older people and is increasing at the fastest rate in this age group.
This review included all study types, published 1990–2017, which focused on opioid prescribing for pain management among older adults. Arksey and O’Malley’s framework was used to scope the literature. PubMed, EBSCO Host, the UK Drug Database, and Google Scholar were searched. Data extraction, carried out by two researchers, included factors explaining opioid prescribing patterns and prescribing trends.
A total of 613 papers were identified and 53 were included in the final review consisting of 35 research papers, 10 opinion pieces and 8 grey literature sources. Factors associated with prescribing patterns were categorised according to whether they were patient-related, prescriber-driven, or system-driven. Patient factors included age, gender, race, and cognition; prescriber factors included attitudes towards opioids and judgements about ‘normal’ pain; and policy/system factors related to the changing policy landscape over the last three decades, particularly in the USA.
A large number of context-dependent factors appeared to influence opioid prescribing for chronic pain management in older adults, but the findings were inconsistent. There is a gap in the literature relating to the UK healthcare system; the prescriber and the patient perspective; and within the context of multi-morbidity and treatment burden.
When it comes to electing the chief executive of the United States, the presidential debates play an important role in shaping public opinion and the choices facing voters. Having a fair process in place to determine who is eligible to participate in the debates and to guarantee that the debates are conducted neutrally is crucial to ensuring the integrity of the electoral process as a whole. In the past, controversies have arisen concerning which candidates should be invited to participate, which political parties should be represented, and whether the debates have been conducted in a way that is fair and neutral. Most of these controversies have never been resolved satisfactorily. Today, much more work needs to be done to ensure that our presidential primary and general election debates live up to their potential to provide truly diverse policy views to the public and are conducted in a manner that is wholly free from bias. Gender bias in terms of the questions asked of the candidates was evident in 2016, and other kinds of biases may appear in the future. Problematically, the eligibility rules for the general presidential debates have remained unchanged for decades. Meanwhile, government oversight of the debates remains virtually non-existent.
The Eating Assessment in Toddlers FFQ (EAT FFQ) has been shown to have good reliability and comparative validity for ranking nutrient intakes in young children. With the addition of food items (n 4), we aimed to re-assess the validity of the EAT FFQ and estimate calibration factors in a sub-sample of children (n 97) participating in the Growing Up Milk – Lite (GUMLi) randomised controlled trial (2015–2017). Participants completed the ninety-nine-item GUMLi EAT FFQ and record-assisted 24-h recalls (24HR) on two occasions. Energy and nutrient intakes were assessed at months 9 and 12 post-randomisation and calibration factors calculated to determine predicted estimates from the GUMLi EAT FFQ. Validity was assessed using Pearson correlation coefficients, weighted kappa (κ) and exact quartile categorisation. Calibration factors were calculated using linear regression models on the 24HR, adjusted for sex and treatment group. Nutrient intakes were significantly correlated between the GUMLi EAT FFQ and 24HR at both time points. Energy-adjusted, de-attenuated Pearson correlations ranged from 0·3 (fibre) to 0·8 (Fe) at 9 months and from 0·3 (Ca) to 0·7 (Fe) at 12 months. Weighted κ for the quartiles ranged from 0·2 (Zn) to 0·6 (Fe) at 9 months and from 0·1 (total fat) to 0·5 (Fe) at 12 months. Exact agreement ranged from 30 to 74 %. Calibration factors predicted up to 56 % of the variation in the 24HR at 9 months and 44 % at 12 months. The GUMLi EAT FFQ remained a useful tool for ranking nutrient intakes, with estimated validity similar to other FFQs used in children under 2 years.
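A minimal sketch of the calibration step follows; the nutrient, variable names and values are illustrative assumptions rather than GUMLi data.

```python
# Sketch of the calibration step: regress the 24-h recall intake on the FFQ
# intake, adjusting for sex and treatment group. The FFQ slope acts as a
# calibration factor; R^2 is the share of 24HR variation predicted. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 97
df = pd.DataFrame({
    "ffq_iron": rng.normal(8, 2, n),     # mg/day from the FFQ (hypothetical)
    "male": rng.integers(0, 2, n),
    "gumli": rng.integers(0, 2, n),      # treatment group indicator
})
df["recall_iron"] = 1.5 + 0.7 * df.ffq_iron + rng.normal(0, 1.5, n)

fit = smf.ols("recall_iron ~ ffq_iron + male + gumli", data=df).fit()
print(f"calibration slope = {fit.params['ffq_iron']:.2f}, R^2 = {fit.rsquared:.2f}")
```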
Decisions to treat large-vessel occlusion with endovascular therapy (EVT) or intravenous alteplase depend on how physicians weigh benefits against risks when considering patients’ comorbidities. We explored EVT/alteplase decision-making by stroke experts in the setting of comorbidity/disability.
In an international multi-disciplinary survey, experts chose treatment approaches under current resources and under assumed ideal conditions for 10 of 22 randomly assigned case scenarios. Five scenarios included comorbidities (cancer, cardiac/respiratory/renal disease, mild cognitive impairment [MCI], physical dependence). We examined scenario/respondent characteristics associated with EVT/alteplase decisions using multivariable logistic regressions.
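The form of such a multivariable logistic regression is sketched below with hypothetical covariates; the variables, data and coefficients are stand-ins, not the survey's actual model.

```python
# Sketch of a multivariable logistic regression relating an EVT decision
# (yes/no) to scenario and respondent characteristics. Everything here is
# a hypothetical stand-in for the survey's covariates and data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 6070                                     # ~607 respondents x 10 scenarios
df = pd.DataFrame({
    "comorbidity": rng.integers(0, 2, n),    # comorbidity-related scenario
    "older_patient": rng.integers(0, 2, n),
    "evt_cases_per_year": rng.poisson(30, n),
})
lin = -0.4 * df.comorbidity - 0.3 * df.older_patient + 0.02 * df.evt_cases_per_year + 1.0
df["chose_evt"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("chose_evt ~ comorbidity + older_patient + evt_cases_per_year", data=df).fit(disp=False)
print(np.exp(fit.params))                    # coefficients exponentiated to odds ratios
```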
Among 607 physicians (38 countries), EVT was chosen less often in comorbidity-related scenarios (79.6% under current resources, 82.7% assuming ideal conditions) versus six “level-1A” scenarios for which EVT/alteplase was clearly indicated by current guidelines (91.1% and 95.1%, respectively, odds ratio [OR] [current resources]: 0.38, 95% confidence interval 0.31–0.47). However, EVT was chosen more often in comorbidity-related scenarios compared to all other 17 scenarios (79.6% versus 74.4% under current resources, OR: 1.34, 1.17–1.54). Responses favoring alteplase for comorbidity-related scenarios (e.g. 75.0% under current resources) were comparable to level-1A scenarios (72.2%) and higher than all others (60.4%). No comorbidity independently diminished EVT odds when considering all scenarios. MCI and dependence carried higher alteplase odds; cancer and cardiac/respiratory/renal disease had lower odds. Being older/female carried lower EVT odds. Relevant respondent characteristics included performing more EVT cases/year (higher EVT-, lower alteplase odds), practicing in East Asia (higher EVT odds), and in interventional neuroradiology (lower alteplase odds vs neurology).
Moderate-to-severe comorbidities did not consistently deter experts from choosing EVT, suggesting equipoise about withholding EVT on the basis of comorbidities. However, alteplase was often forgone when respondents chose EVT. Differences in decision-making by patient age/sex merit further study.
Characterizing non-lethal damage within dry seeds may allow us to detect early signs of ageing and accurately predict longevity. We compared RNA degradation and viability loss in seeds exposed to stressful conditions to quantify relationships between degradation rates and stress intensity or duration. We subjected recently harvested (‘fresh’) ‘Williams 82’ soya bean seeds to moisture, temperature and oxidative stresses, and measured time to 50% viability (P50) and rate of RNA degradation, the former using standard germination assays and the latter using RNA Integrity Number (RIN). RIN values from fresh seeds were also compared with those from accessions of the same cultivar harvested in the 1980s and 1990s and stored in the refrigerator (5°C), freezer (−18°C) or in vapour above liquid nitrogen (−176°C). Rates of viability loss (P50^−1) and RNA degradation (RIN·d^−1) were highly correlated in soya bean seeds that were exposed to a broad range of temperatures [holding relative humidity (RH) constant at about 30%]. However, the correlation weakened when fresh seeds were maintained at high RH (holding temperature constant at 35°C) or exposed to oxidizing agents. Both P50^−1 and RIN·d^−1 parameters exhibited breaks in Arrhenius behaviour near 50°C, suggesting that constrained molecular mobility regulates degradation kinetics of dry systems. We conclude that the kinetics of ageing reactions at RH near 30% can be simulated by temperatures up to 50°C and that RNA degradation can indicate ageing prior to and independent of seed death.
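For reference, the standard Arrhenius form against which both rate parameters can be plotted is shown below; a 'break in Arrhenius behaviour' appears as a change in slope of ln k against 1/T. This is the textbook relation, not an equation quoted from the paper.

```latex
% Arrhenius form against which ageing-rate data are typically plotted:
% a break near 50 C appears as a change in the slope of ln k versus 1/T.
\[
  k(T) = A \, e^{-E_a / (R T)}
  \qquad\Longrightarrow\qquad
  \ln k = \ln A - \frac{E_a}{R}\cdot\frac{1}{T},
\]
% where k stands for either rate parameter (P50^{-1} or RIN loss per day),
% E_a is the activation energy, R the gas constant and T absolute temperature.
```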
OBJECTIVES/GOALS: Access to pediatric subspecialty care varies by sociodemographic factors. Providers for gender diverse youth (GDY) are rare, and GDY face health disparities, stigma, and discrimination. We examined the association between GDY access to medical and mental health care and rurality, race, parental education, and other GDY-specific factors.

METHODS/STUDY POPULATION: We surveyed parents of GDY (<18 years old) across the United States. Participants were recruited through online communities and listservs specific to parents of GDY. We determined associations between access to gender-specific medical or mental health providers and rurality, race, and parental education, as well as other GDY-specific factors including age, time since telling their parent their gender identity, parent-adolescent communication, parent stress, and gender identity, using chi-square or Fisher’s exact tests. We calculated adjusted odds ratios using logistic regression models.

RESULTS/ANTICIPATED RESULTS: We surveyed 166 parents and caregivers from 31 states. The majority (73.2%) identified as white, 66.5% had earned a bachelor’s degree or higher, and 7.6% lived in a zip code designated rural by the Federal Office of Rural Health Policy. We found no evidence of association between reported GDY access to medical or mental health care and race, parental education, or rurality. We did find a significant univariate association between access to mental health care and feminine (either female or transfeminine/transfemale) gender identity (p = 0.033, OR 2.60, 95% CI 1.06–6.36). After controlling for parent-adolescent communication in a backwards-elimination logistic regression model, this association was no longer significant (p = 0.137, OR 2.05, 95% CI 0.80–5.25).

DISCUSSION/SIGNIFICANCE OF IMPACT: Although rurality, race, and parental education affect access to pediatric subspecialty care in general, we did not find these associations among GDY accessing gender-specific care. There is a need to better understand structural and societal barriers to care for this population, including the impact of stigma and discrimination.
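The reported univariate test can be illustrated with a Fisher's exact test on a 2×2 table; the cell counts below are invented for illustration and do not reproduce the study's OR.

```python
# Sketch of the univariate test reported above: Fisher's exact test on a
# 2x2 table of gender identity (feminine vs other) by access to mental
# health care. Cell counts are invented, not study data.
from scipy import stats

#                access   no access
table = [[45, 20],        # feminine gender identity (hypothetical counts)
         [55, 46]]        # other gender identities

odds_ratio, p = stats.fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")
```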