Mica particles approximately 10 or 25 mm square and 0.5 mm thick were placed in NaCl-NaTPB solutions to make visual observations of the changes that occur in micas when the interlayer K is replaced by Na. Samples of muscovite, biotite, phlogopite, lepidolite, and lepidomelane were used, and the effects of different degradation periods were photographed.
An increase in the thickness of the particles due to basal planes splitting apart was observed with all micas. This exfoliation released interlayer K and in some cases caused the particles to cleave into separate flakes. Lepidomelane particles remained intact despite a 20-fold increase in thickness in 7 days. Even muscovite and lepidolite exfoliated and cleaved, but much longer degradation periods were needed.
There was a distinct change in the color of the dark biotite, phlogopite and lepidomelane particles when K was removed. Therefore, the initial stages of K depletion at holes, scratches, and edges of the particles were easily followed. As the degradation of the mica particles progressed, however, the color of the mica became a less reliable index of the stage of K depletion. Visual evidence of K depletion at the edges of particles was also obtained with muscovite, but not with lepidolite.
Transverse sections of 25-mm particles of K-depleted biotite were photographed to show the edge expansion that occurred when interlayer K was replaced by Na.
Interlayer K in muscovite, biotite, phlogopite, illite and vermiculite-hydrobiotite samples was replaced by cation exchange with Na. The rate and amount of exchange varied with the mineral and the level of K in solution.
Essentially all the K in muscovite, biotite, phlogopite and vermiculite was exchangeable when the mass-action effect of the replaced K was reduced by maintaining a very low level of K in solution. The time required for this exchange varied from < 10 hr with vermiculite to > 45 weeks with muscovite. Only 66% of the K in the illite was exchangeable under these conditions. When the replaced K was allowed to accumulate in the solution, the amount of exchange was determined by the level of K in solution required for equilibrium. These levels decreased with the degree of K depletion and with the selectivity of the mica for K. The order of selectivity was muscovite > illite > biotite > phlogopite > vermiculite. Decreasing the K in solution from 10 to 7 ppm increased the exchangeable K in biotite from 30 to 100%. A K level of only 0.1 ppm restricted the exchange of K in muscovite to 17%.
A decrease in layer charge was not required for K exchange, but a decrease did occur in K-depleted biotite and vermiculite. Muscovite with the highest layer charge (247 meq/100 g), least expansion with Na (12.3Å), and least sensitivity to solution pH had the highest selectivity for K and the slowest rate of exchange. The K in vermiculite was the most readily exchangeable.
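To make the mass-action reasoning concrete, the sketch below computes the equilibrium fraction of interlayer K retained for a given K level in solution; the selectivity coefficient and concentrations are hypothetical placeholders, not values measured in the study.

```python
# Toy illustration of the mass-action effect on K-Na exchange.
# For the exchange reaction  K-mica + Na+  <->  Na-mica + K+,
# a simple selectivity relation is
#   K_sel = (X_K * [Na+]) / (X_Na * [K+]),
# so the equilibrium fraction of exchange sites still holding K is
#   X_K = K_sel * r / (1 + K_sel * r),  where r = [K+]/[Na+].
# K_sel and the concentrations below are hypothetical placeholders.

def fraction_K_on_exchanger(k_sel: float, K_solution: float, Na_solution: float) -> float:
    """Equilibrium fraction of exchange sites occupied by K."""
    r = K_solution / Na_solution
    return k_sel * r / (1.0 + k_sel * r)

k_sel = 500.0       # hypothetical K-over-Na selectivity coefficient
Na_solution = 0.1   # mol/L NaCl (illustrative extraction-solution level)

for K_ppm in (10.0, 1.0, 0.1):
    K_mol = K_ppm / 39_100.0   # ppm K -> mol/L (39.1 g/mol, dilute solution)
    x_k = fraction_K_on_exchanger(k_sel, K_mol, Na_solution)
    print(f"K in solution = {K_ppm:>5} ppm -> interlayer K retained ~ {x_k:.0%}")
```

Lowering the solution K drives the equilibrium toward Na saturation, which is the rationale for the very low solution K maintained in the experiments above.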
Samples of several naturally fine-grained micaceous minerals were heated at 450°C for 24 hr (after the effects of other temperatures and heating periods were evaluated with the < 2 μm fraction of Grundite) and then characterized in terms of their release of K to NaCl-NaTPB (sodium tetraphenylboron) solutions and other potentially related properties.
This heat treatment produced a substantial increase in the amount of K that each mineral released when first placed in the NaCl-NaTPB solution (the greatest increase being 22 m-equiv K/100 g in Marblehead illite). Depending upon the mineral heated, the subsequent rate of K release was increased, decreased or unchanged. Also, all the minerals except glauconite exhibited an increase (ranging from 4 to 38 m-equiv K/100 g) in their maximum degree of K release if they were heated. Thus, it was established that the K release behavior of these minerals is not only subject to appreciable alteration by heat treatments but is altered in a manner that varies with the mineral. The nature of these alterations, however, did not clearly identify an involvement of the other mineral properties that were examined. An increase in NH4- and Cs-exchangeable K occurred when these minerals were heated—presumably as a result of exfoliation. With Morris illite samples, this increase was nearly 28 m-equiv/100 g. Thus, heated samples of these minerals may be useful sinks for the removal of NH4 and Cs in various wastes.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
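As an illustration of the kind of z-score comparison referred to above, the following sketch compares a risk factor's correlation with PTSD severity between two independent groups via Fisher's r-to-z transformation; the correlations and sample sizes are made up, not AURORA data.

```python
# Minimal sketch of comparing a risk factor's association with 3-month PTSD
# severity between women and men via Fisher's r-to-z transformation.
# The correlations and sample sizes below are illustrative placeholders.
import math
from scipy.stats import norm

def compare_correlations(r_women: float, n_women: int, r_men: float, n_men: int):
    """Two-sided z test for the difference between two independent correlations."""
    z_w = math.atanh(r_women)   # Fisher r-to-z
    z_m = math.atanh(r_men)
    se = math.sqrt(1.0 / (n_women - 3) + 1.0 / (n_men - 3))
    z = (z_w - z_m) / se
    p = 2.0 * norm.sf(abs(z))
    return z, p

# e.g. acute dissociation vs. 3-month PTSD severity (hypothetical numbers)
z, p = compare_correlations(r_women=0.25, n_women=1900, r_men=0.40, n_men=1000)
print(f"z = {z:.2f}, p = {p:.4f}")   # negative z -> stronger association in men
```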
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
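A minimal sketch of the geocoded-address-to-SVI linkage and quartile classification is shown below; the column names and SVI values are hypothetical, and the real CDC/ATSDR SVI file uses its own field names.

```python
# Sketch of linking case-HCP records to census-tract SVI and flagging
# high/low social vulnerability by quartile. Column names (e.g. "tract_fips",
# "svi_overall") and values are hypothetical placeholders.
import pandas as pd

hcp = pd.DataFrame({
    "case_id": [1, 2, 3, 4],
    "tract_fips": ["17031010100", "17031010200", "06037101110", "06037101122"],
})
svi = pd.DataFrame({
    "tract_fips": ["17031010100", "17031010200", "06037101110", "06037101122"],
    "svi_overall": [0.91, 0.15, 0.55, 0.78],   # illustrative tract-level SVI values
})

merged = hcp.merge(svi, on="tract_fips", how="left")

# Quartiles of the SVI distribution; top quartile = "high", bottom = "low" vulnerability
merged["svi_quartile"] = pd.qcut(merged["svi_overall"], 4, labels=[1, 2, 3, 4])
merged["vulnerability"] = merged["svi_quartile"].map(
    {1: "low", 2: "middle", 3: "middle", 4: "high"}
)
print(merged[["case_id", "svi_overall", "vulnerability"]])
```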
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.
Identifying long-term care facility (LTCF)-exposed inpatients is important for infection control research and practice, but ascertaining LTCF exposure is challenging. Across a large validation study, electronic health record data fields identified 76% of LTCF-exposed patients compared to manual chart review.
Cohort studies demonstrate that people who later develop schizophrenia, on average, present with mild cognitive deficits in childhood and endure a decline in adolescence and adulthood. Yet, tremendous heterogeneity exists during the course of psychotic disorders, including the prodromal period. Individuals identified as being in this period (at clinical high risk for psychosis, CHR-P) are at heightened risk for developing psychosis (~35%) and begin to exhibit cognitive deficits. Cognitive impairments in CHR-P individuals (as a singular group) appear to be relatively stable or to ameliorate over time, although a sizeable proportion has been reported to decline on measures related to processing speed or verbal learning. The purpose of this analysis is to use data-driven approaches to identify latent subgroups among CHR-P individuals based on cognitive trajectories, yielding a clearer understanding of the timing and presentation of both general and domain-specific deficits.
Participants and Methods:
Participants included 684 young people at CHR-P (ages 12–35) from the second cohort of the North American Prodrome Longitudinal Study. Performance on the MATRICS Consensus Cognitive Battery (MCCB) and the Wechsler Abbreviated Scale of Intelligence (WASI-I) was assessed at baseline and at 12 and 24 months. The MCCB domains tested included verbal learning, speed of processing, working memory, and reasoning & problem-solving. Sex- and age-based norms were utilized. The Oral Reading subtest of the Wide Range Achievement Test (WRAT4) indexed premorbid IQ at baseline. Latent class mixture models were used to identify distinct trajectories of cognitive performance across two years. One- to five-class solutions were compared to determine the best-fitting solution, based on goodness-of-fit metrics, interpretability of latent trajectories, and proportion of subgroup membership (>5%).
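The class-enumeration step can be approximated as in the sketch below, which fits 1- to 5-component Gaussian mixtures to simulated three-timepoint score vectors and compares BIC; this is only a simplified proxy for the latent class mixture models used in the study, and the data are simulated.

```python
# Rough stand-in for class enumeration: fit mixtures with 1-5 components to
# (baseline, 12-month, 24-month) score vectors and compare BIC. A Gaussian
# mixture on raw timepoint vectors is only a proxy for a true latent class
# (growth) mixture model; the data below are simulated, not NAPLS data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n = 300
# Simulate two trajectory types: stable-average and persistently impaired
stable = rng.normal(loc=[50, 51, 52], scale=8, size=(int(n * 0.75), 3))
impaired = rng.normal(loc=[30, 30, 31], scale=8, size=(n - stable.shape[0], 3))
scores = np.vstack([stable, impaired])   # rows: subjects, cols: timepoints

fits = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, covariance_type="full", random_state=0).fit(scores)
    fits[k] = gm
    print(f"{k}-class solution: BIC = {gm.bic(scores):.0f}")

best_k = min(fits, key=lambda k: fits[k].bic(scores))
labels = fits[best_k].predict(scores)
sizes = np.bincount(labels) / len(labels)
print(f"best by BIC: {best_k} classes, membership proportions: {np.round(sizes, 2)}")
```

The membership proportions printed at the end correspond to the >5% subgroup-size criterion mentioned above.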
Results:
A one-class solution was found for WASI-I Full-Scale IQ, as people at CHR-P predominantly demonstrated an average IQ that increased gradually over time. One-class solutions also best fit the trajectories for the speed of processing, verbal learning, and working memory domains. Two distinct subgroups were identified on one of the executive functioning measures, reasoning and problem-solving (NAB Mazes): a subgroup with unimpaired performance and mild improvement over time (Class I, 74%) and a subgroup with persistent performance two standard deviations below average (Class II, 26%). Between these classes, no significant differences were found for biological sex, age, years of education, or likelihood of conversion to psychosis (OR = 1.68, 95% CI 0.86 to 3.14). Individuals assigned to Class II did, however, demonstrate a lower WASI-I IQ at baseline (96.3 vs. 106.3) and a lower premorbid IQ (100.8 vs. 106.2).
Conclusions:
Youth at CHR-P demonstrate relatively homogeneous trajectories across time in terms of general cognition and most individual domains. In contrast, two distinct subgroups emerged on a measure of higher-order cognitive skills involving planning and foresight, and notably these subgroups existed independent of conversion outcome. Overall, these findings replicate and extend results from a recently published latent class analysis that examined 12-month trajectories among CHR-P individuals using a different cognitive battery (Allott et al., 2022). The findings inform which individuals at CHR-P may be most likely to benefit from cognitive remediation and, by establishing meaningful subtypes, can inform understanding of the substrates of cognitive deficits.
Late Life Major Depressive Disorder (LLD) and Hoarding Disorder (HD) are common in older adults with prevalence estimates up to 29% and 7%, respectively. Both LLD and HD are characterized by executive dysfunction and disability. There is evidence of overlapping neurobiological dysfunction in LLD and HD suggesting potential for compounded executive dysfunction and disability in the context of comorbid HD and LLD. Yet, prevalence of HD in primary presenting LLD has not been examined and potential compounded impact on executive functioning, disability, and treatment response remains unknown. Thus, the present study aimed to determine the prevalence of co-occurring HD in primary presenting LLD and examine hoarding symptom severity as a contributor to executive dysfunction, disability, and response to treatment for LLD.
Participants and Methods:
Eighty-three adults ages 65–90 participating in a psychotherapy study for LLD completed measures of hoarding symptom severity (Saving Inventory-Revised: SI-R), executive functioning (WAIS-IV Digit Span, Letter-Number Sequencing, Coding; Stroop Interference; Trail Making Test-Part B; Letter Fluency), functional ability (World Health Organization Disability Assessment Schedule-II-Short), and depression severity (Hamilton Depression Rating Scale) at post-treatment. Pearson's chi-squared tests evaluated group differences in cognitive and functional impairment rates and in depression treatment response between participants with (LLD+HD) and without (LLD-only) clinically significant hoarding symptoms. Linear regressions examined the associations of hoarding symptom severity with executive function performance and with functional ability, with participant age, years of education, gender, and concurrent depression severity included as covariates.
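A hedged sketch of the covariate-adjusted regressions is given below; the variable names and the simulated data frame are placeholders standing in for the study dataset.

```python
# Sketch of regressing an executive-function score on hoarding symptom
# severity with covariates. Variable names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 83
df = pd.DataFrame({
    "sir_total": rng.uniform(0, 80, n),          # hoarding severity (SI-R)
    "age": rng.integers(65, 91, n),
    "education_years": rng.integers(8, 21, n),
    "gender": rng.choice(["F", "M"], n),
    "hamd_total": rng.uniform(0, 30, n),         # depression severity
})
# Simulate a digit span score that declines with hoarding severity plus noise
df["digit_span"] = 11 - 0.05 * df["sir_total"] + rng.normal(0, 2, n)

model = smf.ols(
    "digit_span ~ sir_total + age + education_years + C(gender) + hamd_total",
    data=df,
).fit()
# The sir_total coefficient (and its p-value) is the quantity of interest
print(model.params["sir_total"], model.pvalues["sir_total"])
```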
Results:
At post-treatment, 24.1% (20/83) of participants with LLD met criteria for clinically significant hoarding symptoms (SI-R ≥ 41). Relative to LLD-only, the LLD+HD group demonstrated greater impairment rates in Letter-Number Sequencing (χ2(1)=4.0, p=.045) and Stroop Interference (χ2(1)=4.8, p=.028). Greater hoarding symptom severity was associated with poorer executive functioning performance on Digit Span (t(71)=-2.4, β=-0.07, p=.019), Letter-Number Sequencing (t(70)=-2.1, β=-0.05, p=.044), and Letter Fluency (t(71)=-2.8, β=-0.24, p=.006). Rates of functional impairment were significantly higher in the LLD+HD group (88.0%) than in the LLD-only group (62.3%) (χ2(1)=5.41, p=.020). Additionally, higher hoarding symptom severity was related to greater disability (t(72)=2.97, β=0.13, p=.004). Furthermore, the depression treatment response rate was significantly lower in the LLD+HD group, at 24.0% (6/25), than in the LLD-only group, at 48.3% (28/58) (χ2(1)=4.26, p=.039).
Conclusions:
The present study is among the first to report prevalence of clinically significant hoarding symptoms in primary presenting LLD. The findings of 24.1% co-occurrence of HD in primary presenting LLD and increased burden on executive functioning, disability, and depression treatment outcomes have important implications for intervention and prevention efforts. Hoarding symptoms are likely under-evaluated, and thus may be overlooked, in clinical settings where LLD is identified as the primary diagnosis. Taken together with results indicating poorer depression treatment response in LLD+HD, these findings underscore the need for increased screening of hoarding behaviors in LLD and tailored interventions for this LLD+HD group. Future work examining the course of hoarding symptomatology in LLD (e.g., onset age of hoarding behaviors) may provide insights into the mechanisms associated with greater executive dysfunction and disability.
The GINI project investigates the dynamics of inequality among populations over the long term by synthesising global archaeological housing data. This project brings archaeologists together from around the world to assess hypotheses concerning the causes and consequences of inequality that are of relevance to contemporary societies globally.
Emergency departments are high-risk settings for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) surface contamination. Environmental surface samples were obtained in rooms with patients suspected of having COVID-19 who did or did not undergo aerosol-generating procedures (AGPs). SARS-CoV-2 RNA surface contamination was most frequent in rooms occupied by coronavirus disease 2019 (COVID-19) patients who received no AGPs.
Clinical implementation of risk calculator models in the clinical high-risk for psychosis (CHR-P) population has been hindered by heterogeneous risk distributions across study cohorts, which could be attributed to pre-ascertainment illness progression. To examine this, we tested whether the duration of attenuated psychotic symptom (APS) worsening prior to baseline moderated the performance of the North American Prodrome Longitudinal Study 2 (NAPLS2) risk calculator. We also examined whether rates of cortical thinning, another marker of illness progression, bolstered clinical prediction models.
Methods
Participants from both the NAPLS2 and NAPLS3 samples were classified as having either ‘long’ or ‘short’ symptom duration based on the time since APS increase prior to baseline. The NAPLS2 risk calculator model was applied to each of these groups. In a subset of NAPLS3 participants who completed follow-up magnetic resonance imaging scans, change in cortical thickness was combined with the individual risk score to predict conversion to psychosis.
Results
The risk calculator models achieved similar performance across the combined NAPLS2/NAPLS3 sample [area under the curve (AUC) = 0.69], the long duration group (AUC = 0.71), and the short duration group (AUC = 0.71). The short duration group was younger and had higher baseline APS than the long duration group. The addition of cortical thinning improved the prediction of conversion significantly for the short duration group (AUC = 0.84), with a moderate improvement in prediction for the long duration group (AUC = 0.78).
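The subgroup AUC comparison can be sketched as follows; the risk scores, conversion outcomes, and duration flags are simulated placeholders rather than NAPLS data.

```python
# Sketch of evaluating calculator risk scores within duration subgroups
# (AUC for predicting conversion). All data are simulated placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 500
risk_score = rng.uniform(0, 1, n)                      # calculator output
converted = rng.binomial(1, 0.1 + 0.3 * risk_score)    # outcome loosely tied to score
short_duration = rng.random(n) < 0.5                   # subgroup flag

for name, mask in [("all", np.ones(n, bool)),
                   ("short duration", short_duration),
                   ("long duration", ~short_duration)]:
    auc = roc_auc_score(converted[mask], risk_score[mask])
    print(f"{name:>15}: AUC = {auc:.2f}")
```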
Conclusions
These results suggest that early illness progression differs among CHR-P patients, is detectable with both clinical and neuroimaging measures, and could play an essential role in the prediction of clinical outcomes.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via a mixed-model repeated-measures analysis of variance.
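One way to set up the repeated-measures comparison is sketched below as a mixed model with random intercepts per participant and a class-by-time interaction; this is an approximation of the analysis described above, and the data are simulated.

```python
# Sketch of comparing PTSD symptoms across substance-use trajectory classes
# over repeated assessments: random intercepts per participant plus a
# class-by-time interaction. Data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, weeks = 200, [0, 2, 8, 12]
rows = []
for pid in range(n):
    use_class = rng.choice(["low", "high", "increasing"])
    base = {"low": 20, "high": 32, "increasing": 28}[use_class] + rng.normal(0, 5)
    for w in weeks:
        rows.append({"pid": pid, "use_class": use_class, "week": w,
                     "ptsd": base - 0.4 * w + rng.normal(0, 4)})
df = pd.DataFrame(rows)

model = smf.mixedlm("ptsd ~ C(use_class) * week", data=df, groups=df["pid"]).fit()
print(model.summary())   # interaction terms index class-dependent symptom change
```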
Results
Three trajectory classes (low, high, and increasing use) provided the best model fit for alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes; these symptoms greatly increased at week 8 and declined at week 12. Participants already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
The spatial distribution of in situ sessile organisms, including those from the fossil record, provides information about life histories, such as possible dispersal and/or settlement mechanisms, and how taxa interact with one another and their local environments. At Nilpena Ediacara National Park (NENP), South Australia, the exquisite preservation and excavation of 33 fossiliferous bedding planes from the Ediacara Member of the Rawnsley Quartzite reveal in situ communities of the Ediacara Biota. Here, the spatial distributions of three relatively common taxa, Tribrachidium, Rugoconites, and Obamus, occurring on excavated surfaces were analyzed using spatial point pattern analysis. Tribrachidium displays a variable spatial distribution, implying that settlement or post-settlement conditions/preferences had an effect on populations. Rugoconites display aggregation, possibly related to their reproductive methods in combination with settlement location availability at the time of dispersal and/or settlement. Additionally, post-settlement environmental controls could have affected Rugoconites on other surfaces, resulting in lower populations and densities. Both Tribrachidium and Rugoconites also commonly occur as individuals or in low numbers on several beds, thus constraining possible reproductive strategies and environmental/substrate preferences. The distribution of Obamus is consistent with selective settlement, aggregating near conspecifics and on substrates of mature microbial mat. This is the first example of substrate-selective dispersal among the Ediacara Biota, making Obamus similar to numerous modern sessile invertebrates that employ comparable dispersal and settlement strategies.
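As a simple illustration of spatial point pattern reasoning, the sketch below computes the Clark-Evans nearest-neighbour ratio for a simulated clustered pattern; the study itself used fuller point-pattern methods, and the coordinates here are not fossil data.

```python
# Simple spatial aggregation diagnostic (Clark-Evans nearest-neighbour ratio)
# of the kind used to ask whether specimens on a bedding plane are clustered,
# random, or evenly spaced. Coordinates are simulated placeholders.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(4)
area = 100.0 * 100.0   # bedding-plane area (arbitrary units)
# Simulate a clustered pattern: offspring scattered around a few parent points
parents = rng.uniform(0, 100, size=(5, 2))
points = np.vstack([p + rng.normal(0, 3, size=(20, 2)) for p in parents])

tree = cKDTree(points)
d, _ = tree.query(points, k=2)   # k=2: nearest neighbour other than self
observed_mean_nn = d[:, 1].mean()
expected_mean_nn = 0.5 / np.sqrt(len(points) / area)   # expectation under complete spatial randomness

R = observed_mean_nn / expected_mean_nn
print(f"Clark-Evans R = {R:.2f}  (<1 clustered, ~1 random, >1 evenly spaced)")
```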
The transition from residency to paediatric cardiology fellowship is challenging due to the new knowledge and technical skills required. Online learning can be an effective didactic modality that can be widely accessed by trainees. We sought to evaluate the effectiveness of a paediatric cardiology Fellowship Online Preparatory Course prior to the start of fellowship.
Methods:
The Online Preparatory Course contained 18 online learning modules covering basic concepts in anatomy, auscultation, echocardiography, catheterisation, cardiovascular intensive care, electrophysiology, pulmonary hypertension, heart failure, and cardiac surgery. Each online learning module included an instructional video with pre- and post-video tests. Participants completed pre- and post-Online Preparatory Course knowledge-based exams and surveys. Pre- and post-Online Preparatory Course survey and knowledge-based examination results were compared via Wilcoxon signed-rank and paired t-tests.
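The pre/post comparisons can be sketched as follows with simulated scores; the effect sizes in the simulation are placeholders loosely echoing the reported means, not the actual data.

```python
# Sketch of the pre/post comparisons described above (paired t-test and
# Wilcoxon signed-rank test on pre- vs. post-course exam scores).
# Scores are simulated placeholders, not study data.
import numpy as np
from scipy.stats import ttest_rel, wilcoxon

rng = np.random.default_rng(5)
n = 151
pre = rng.normal(43.6, 11, n)          # percent correct, pre-course
post = pre + rng.normal(16.7, 8, n)    # simulated improvement after the course

t_stat, t_p = ttest_rel(post, pre)
w_stat, w_p = wilcoxon(post, pre)
print(f"paired t-test: t = {t_stat:.1f}, p = {t_p:.3g}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.3g}")
```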
Results:
A total of 151 incoming paediatric cardiology fellows from programmes across the USA participated in the 3 months prior to starting fellowship training between 2017 and 2019. There was significant improvement between pre- and post-video test scores for all 18 online learning modules. There was also significant improvement between pre- and post-Online Preparatory Course exam scores (PRE 43.6 ± 11% versus POST 60.3 ± 10%, p < 0.001). Comparing pre- and post-Online Preparatory Course surveys, there was a statistically significant improvement in the participants’ comfort level in 35 of 36 (97%) assessment areas. Nearly all participants (98%) agreed or strongly agreed that the Online Preparatory Course was a valuable learning experience and that it helped alleviate some anxieties (77% agreed or strongly agreed) related to starting fellowship.
Conclusion:
An Online Preparatory Course prior to starting fellowship can provide a foundation of knowledge, decrease anxiety, and serve as an effective educational springboard for paediatric cardiology fellows.
We present the Widefield ASKAP L-band Legacy All-sky Blind surveY (WALLABY) Pilot Phase I Hi kinematic models. This first data release consists of Hi observations of three fields in the direction of the Hydra and Norma clusters, and the NGC 4636 galaxy group. In this paper, we describe how we generate and publicly release flat-disk tilted-ring kinematic models for 109/592 unique Hi detections in these fields. The modelling method adopted here—which we call the WALLABY Kinematic Analysis Proto-Pipeline (WKAPP) and for which the corresponding scripts are also publicly available—consists of combining results from the homogeneous application of the FAT and 3DBarolo algorithms to the subset of 209 detections with sufficient resolution and $S/N$ in order to generate optimised model parameters and uncertainties. The 109 models presented here tend to be gas-rich detections resolved by at least 3–4 synthesised beams across their major axes, but there is no obvious environmental bias in the modelling. The data release described here is the first step towards the derivation of similar products for thousands of spatially resolved WALLABY detections via a dedicated kinematic pipeline. Such a large, publicly available, and homogeneously analysed dataset will be a powerful legacy product that will enable a wide range of scientific studies.
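Purely as an illustration of how two independent tilted-ring fits might be combined into a single rotation curve with an uncertainty, the sketch below averages made-up FAT-like and 3DBarolo-like outputs; it is not the actual WKAPP procedure or its scripts.

```python
# Illustrative sketch of combining two independent tilted-ring fits (e.g. one
# from FAT, one from 3DBarolo) into an averaged rotation curve with a simple
# spread-based uncertainty. Radii and velocities are made up; this is not the
# actual WKAPP procedure.
import numpy as np

radius_arcsec = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
vrot_fat = np.array([60.0, 95.0, 120.0, 130.0, 133.0])      # km/s
vrot_barolo = np.array([55.0, 100.0, 115.0, 128.0, 138.0])  # km/s

vrot_model = 0.5 * (vrot_fat + vrot_barolo)        # average of the two fits
vrot_err = 0.5 * np.abs(vrot_fat - vrot_barolo)    # half the spread as a rough uncertainty

for r, v, e in zip(radius_arcsec, vrot_model, vrot_err):
    print(f"R = {r:4.0f} arcsec: Vrot = {v:6.1f} +/- {e:4.1f} km/s")
```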
We investigated the efficacy and complication profile of intranasal dexmedetomidine for transthoracic echocardiography sedation in patients with single ventricle physiology and shunt-dependent pulmonary blood flow during the high-risk interstage period.
Methods:
A single-centre, retrospective review identified interstage infants who received dexmedetomidine for echocardiography sedation. Baseline and procedural vitals were reported. Significant adverse events related to sedation were defined as an escalation in care or need for any additional/increased inotropic support to maintain pre-procedural haemodynamics. Minor adverse events were defined as changes from baseline haemodynamics that resolved without intervention. To assess whether sedation was adequate, echocardiogram reports were reviewed for completeness.
Results:
From September to December 2020, five interstage patients (age 29–69 days) were sedated with 3 mcg/kg intranasal dexmedetomidine. The median sedation onset time and duration were 24 minutes (range 12–43 minutes) and 60 minutes (range 33–60 minutes), respectively. Sedation was deemed adequate in all patients, as complete echocardiograms were accomplished without a rescue dose. When compared to baseline, three (60%) patients had a >10% reduction in heart rate, one (20%) patient had a >10% reduction in oxygen saturation, and one (20%) patient had a >30% decrease in blood pressure. Amongst all patients, no significant complications occurred, and haemodynamic changes from baseline did not result in a need for intervention or interruption of the study.
Conclusions:
Intranasal dexmedetomidine may be a reasonable option for echocardiography sedation in infants with shunt-dependent single ventricle heart disease, and further investigation is warranted to ensure efficacy and safety in an outpatient setting.
Background: Despite a higher prevalence of traumatic spinal cord injury (TSCI) amongst Canadian Indigenous peoples, there is a paucity of studies focused on Indigenous TSCI. We present the first Canada-wide study comparing TSCI amongst Canadian Indigenous and non-Indigenous peoples. Methods: This study is a retrospective analysis of prospectively collected TSCI data from the Rick Hansen Spinal Cord Injury Registry (RHSCIR) from 2004 to 2019. We divided participants into Indigenous and non-Indigenous cohorts and compared them with respect to demographics, injury mechanism, level, severity, and outcomes. Results: Compared with non-Indigenous patients, Indigenous patients were younger, more likely to be female, less likely to have higher education, and less likely to be employed. The mechanism of injury was more likely to be assault or transportation-related trauma in the Indigenous group. The length of stay for Indigenous patients was longer. Indigenous patients were more likely to be discharged to a rural setting, less likely to be discharged home, and more likely to be unemployed following injury. Conclusions: Our results suggest that more resources need to be dedicated to transitioning Indigenous patients sustaining a TSCI to community living and to supporting these patients in their home communities. A focus on resources and infrastructure for Indigenous patients, developed through engagement with Indigenous communities, is needed.