Preconception, pregnancy, and infancy constitute a period of the life course where optimal nutrition and food security are crucial for the life-long health and wellbeing of women/birthing parents and infants.(1) It is estimated that nearly one in every four UK households with pre-school children (0-4 years) experiences food insecurity (FI).(2) Yet we lack an evidence base exploring experiences of FI at this life course stage.(3,4) This study aimed to explore women’s experiences of food insecurity during and after pregnancy, including its influence on infant feeding decisions.
This study was ethically approved (Ref No: LRS/DP-23/24-39437) and pre-registered on OSF Registries (https://osf.io/9hn6r). Semi-structured, mixed-format individual interviews were conducted between November 2023 and February 2024. Pregnant individuals and those who had given birth ≤12 months ago, aged ≥18 years, food insecure, residing in South London and with recourse to public funds were recruited through purposive sampling. The topic guide was informed by the FI, pregnancy and postpartum literature and piloted (n = 2). Interviews were audio-recorded and professionally transcribed. Demographic data were summarised using SPSS. Inductive thematic analysis of the data was completed using NVivo.
Eleven food insecure participants (2 pregnant, 9 new mothers; 2 White European, 9 Black African/Caribbean/British women) participated in the study. Six women were 0-6 months postpartum, and 3 women were between 6-12 months postpartum. The preliminary findings are represented by three themes: 1) A dichotomy: knowing vs affording, 2) Adaptive food coping strategies, and 3) Infant feeding practices. Participants shared detailed accounts of valuing a healthy diet and adapting food practices, yet they were still unable to meet their dietary needs and desires during and after pregnancy. Participants described worry about breastmilk supply, both its quality and quantity. Complementary feeding was also identified as a source of worry. “She is still breastfeeding fully. I don’t want to change to milk, which maybe, sometimes, I might not be able to afford it…I won’t stop until she is 1.” Meanwhile, the cost of formula feeding was a driver of a more severe experience of FI.
Policy and practice recommendations include enhancing local breastfeeding support to address FI specific concerns around breastmilk supply and at national level, advocating for greater support for adequate healthy food provision and for a price cap on infant formula. Future interventions must support maternal mental health given the high cognitive stress identified with living with FI during and after pregnancy. Further high-quality research is needed 1) amongst asylum seekers and refugees and non-English speakers who may also experience FI, and 2) exploring cultural influences on breastfeeding and the relationship with FI.
From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up CDI data were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. In neither preregistered analyses with North American and UK English, nor exploratory analyses with a larger sample did we find evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
This article examines the development, early operation and subsequent failure of the Tot-Kolowa Red Cross irrigation scheme in Kenya’s Kerio Valley. Initially conceived as a technical solution to address regional food insecurity, the scheme aimed to scale up food production through the implementation of a fixed pipe irrigation system and the provision of agricultural inputs for cash cropping. A series of unfolding circumstances, however, necessitated numerous modifications to the original design as the project became increasingly entangled with deep and complex histories of land use patterns, resource allocation and conflict. Failure to understand the complexity of these dynamics ultimately led to the project’s collapse as the region spiralled into a period of significant unrest. In tracing these events, we aim to foreground the lived realities of imposed development, including both positive and negative responses to the scheme’s participatory obligations and its wider impact on community resilience.
Both cortical and parasympathetic systems are believed to regulate emotional arousal in the service of healthy development. Systemic coordination, or coupling, between putative regulatory functions begins in early childhood. Yet the degree of coupling between cortical and parasympathetic systems in young children remains unclear, particularly in relation to the development of typical or atypical emotion function. We tested whether cortical (ERN) and parasympathetic (respiratory sinus arrhythmia [RSA]) markers of regulation were coupled during cognitive challenge in preschoolers (N = 121). We found no main effect of RSA predicting ERN. We then tested children’s typical and atypical emotion behavior (context-appropriate/context-inappropriate fear, anxiety symptoms, neuroendocrine reactivity) as moderators of early coupling in an effort to link patterns of coupling to adaptive emotional development. Negative coupling (i.e., smaller ERN, more RSA suppression or larger ERN, less RSA suppression) at age 3 was associated with greater atypical and less typical emotion behaviors, indicative of greater risk. Negative age 3 coupling was also visible for children who had greater Generalized Anxiety Disorder symptoms and blunted cortisol reactivity at age 5. Results suggest that negative coupling may reflect a maladaptive pattern across regulatory systems that is identifiable during the preschool years.
Soil amelioration via strategic deep tillage is occasionally utilized within conservation tillage systems to alleviate soil constraints, but its impact on weed seed burial and subsequent growth within the agronomic system is poorly understood. This study assessed the effects of different strategic deep-tillage practices, including soil loosening (deep ripping), soil mixing (rotary spading), or soil inversion (moldboard plow), on weed seed burial and subsequent weed growth, compared with a no-till control. The tillage practices were applied in 2019 at Yerecoin and Darkan, WA, and data on weed seed burial and growth were collected during the following 3-yr winter crop rotation (2019 to 2021). Soil inversion buried 89% of rigid ryegrass (Lolium rigidum Gaudin) and ripgut brome (Bromus diandrus Roth) seeds to a depth of 10 to 20 cm at both sites, while soil loosening and mixing left between 31% and 91% of the seeds in the top 0 to 10 cm of soil, with broad variation between sites. Few seeds were buried beyond 20 cm despite tillage working depths exceeding 30 cm at both sites. Soil inversion reduced the density of L. rigidum to <1 plant m−2 for 3 yr after strategic tillage. Bromus diandrus density was initially reduced to 0 to 1 plant m−2 by soil inversion, but increased to 4 plants m−2 at Yerecoin in 2020 and 147 plants m−2 at Darkan in 2021. Soil loosening or mixing did not consistently decrease weed density. The field data were used to parameterize a model that predicted weed density following strategic tillage with greater accuracy for soil inversion than for loosening or mixing. The findings provide important insights into the effects of strategic deep tillage on weed management in conservation agriculture systems and demonstrate the potential of models for optimizing weed management strategies.
Different fertilization strategies can be adopted to optimize the productive components of integrated crop–livestock systems. The current research evaluated how the application of P and K to soybean (Glycine max (L.) Merr.) or Urochloa brizantha (Hochst. ex A. Rich.) R. D. Webster cv. BRS Piatã, with or without nitrogen in the pasture phase, affects the accumulation and chemical composition of forage and animal productivity. The treatments were distributed in randomized blocks with three replications. Four fertilization strategies were tested: (1) conventional fertilization with P and K in the crop phase (CF–N); (2) conventional fertilization with nitrogen in the pasture phase (CF + N); (3) system fertilization with P and K in the pasture phase (SF–N); (4) system fertilization with nitrogen in the pasture phase (SF + N). System fertilization increased forage accumulation from 15 710 to 20 920 kg DM ha/year compared to conventional fertilization without nitrogen. Stocking rate (3.1 vs. 2.8 AU/ha; SEM = 0.12) and gain per area (458 vs. 413 kg BW/ha; SEM = 27.9) were higher in SF–N than CF–N, although average daily gain was lower (0.754 vs. 0.792 kg LW/day; SEM = 0.071). N application in the pasture phase, under both conventional and system fertilization, resulted in higher crude protein, stocking rate and gain per area. Applying nitrogen and relocating P and K from the crop to the pasture phase increases animal productivity and improves forage chemical composition in integrated crop–livestock systems.
Skin-based samples (leather, skin, and parchment) in archaeological, historic and museum settings are among the most challenging materials to radiocarbon (14C) date in terms of removing exogenous carbon sources—comparable to bone collagen in many respects but with much less empirical study to guide pretreatment approaches. In the case of leather, the 14C content of materials used in manufacturing the leather can vary greatly. The presence of leather manufacturing chemicals before pretreatment and their absence afterward is difficult to demonstrate, and the accuracy of dates depends upon isolating the original animal proteins and removing exogenous carbon. Parchments differ in production technique from leather but include similar unknowns. It is not clear that lessons learned in the treatment of one are always salient for treating the other. We measured the 14C content of variously pretreated leather, parchment, skin samples, and extracts, producing apparent ages that varied by hundreds or occasionally thousands of years depending upon sample pretreatment. Fourier Transform Infrared Spectroscopy (FTIR) and C:N ratios provided insight into the chemical composition of carbon reservoirs contributing to age differences. The results of these analyses demonstrated that XAD column chromatography resulted in the most accurate 14C dates for leather and samples of unknown tannage, and FTIR allowed for the detection of contamination that might have otherwise been overlooked.
Delirium is characterised by an acute, fluctuating change in cognition, attention and awareness (Wilson et al. Nature Reviews Disease Primers 2020; 6). This presentation can make the diagnosis of delirium extremely challenging for clinicians (Gofton. Canadian Journal of Neurological Sciences 2011; 38: 673-680). It is commonly reported in hospitalised patients, particularly those over the age of 65 (NICE. Delirium: prevention, diagnosis and management. 2010).
Objectives
Our aim is to identify which investigations and cognitive assessments are completed prior to a referral to the liaison psychiatry services in patients with symptoms of delirium.
Methods
Referrals (N = 6012) to the liaison psychiatry team at Croydon University Hospital made between April and September 2022 were screened. Search parameters used to identify referrals related to a potential diagnosis of delirium were selected by the authors. The terms used were confusion; delirium; agitation; aggression; cognitive decline or impairment; disorientation; challenging behaviour. Data was collected on the completion rates of investigations for delirium as advised by the NICE clinical knowledge summaries. Further data was gathered on neuroimaging (CT or MRI), cognitive assessment tools (MOCA/MMSE) and delirium screening tools (4AT/AMTS).
Results
The study sample identified 114 referrals (61 males and 53 females), with 82% over 65 years at the time of referral. In 96% of referrals, U&E and CRP were performed. Sputum culture (1%), urine toxin screen (4%) and free T3/4 (8%) were the tests utilised the least. Neuroimaging was completed in 41% of referrals (see Graph 1 for a full breakdown of results).
A formal cognitive assessment or delirium screening tool was completed in 32% of referrals. The AMTS and 4AT tools were documented for 65% and 24%, respectively. A total of 19 referrals explicitly stated that the patient was suspected to have dementia. A delirium screening tool was documented in 47% of these cases; however, a formal cognitive assessment was documented in only 5% of these patients.
Following psychiatric assessment, 47% of referrals were confirmed as delirium.
Conclusions
Our data highlight the low completion rate of the NICE-recommended delirium screen prior to referral to liaison psychiatry. Effective implementation of a delirium screen and cognitive assessment is paramount to reduce the number of inappropriate psychiatric referrals in hospital and to identify reversible organic causes of delirium. This in turn will ensure timely treatment of reversible causes and reduce the length of hospital admission.
Pregnancy is a time of increased vulnerability to psychopathology, yet limited work has investigated the extent to which variation in psychopathology during pregnancy is shared and unshared across syndromes and symptoms. Understanding the structure of psychopathology during pregnancy, including associations with childhood experiences, may elucidate risk and resilience factors that are transdiagnostic and/or specific to particular psychopathology phenotypes. Participants were 292 pregnant individuals assessed using multiple measures of psychopathology. Confirmatory factor analyses found evidence for a structure of psychopathology consistent with the Hierarchical Taxonomy of Psychopathology (HiTOP). A common transdiagnostic factor accounted for most variation in psychopathology, and both adverse and benevolent childhood experiences (ACEs and BCEs) were associated with this transdiagnostic factor. Furthermore, pregnancy-specific anxiety symptoms most closely reflected the dimension of Fear, which may suggest shared variation with manifestations of fear that are not pregnancy-specific. ACEs and BCEs also linked to specific prenatal psychopathology involving thought problems, detachment, and internalizing, externalizing, antagonistic, and antisocial behavior. These findings extend the dimensional and hierarchical HiTOP model to pregnant individuals and show how maternal childhood risk and resilience factors relate to common and specific forms of psychopathology during pregnancy as a period of enhanced vulnerability.
Consumers now demand evidence of welfare assurance at all stages of animal production, marketing, transport and slaughter. In response, retailers have increasingly adopted preferred supply chain relationships which preclude sourcing animals via livestock auction markets. One of the criteria dictating this action is a perceived improvement in animal welfare resulting from direct transport from farm to abattoir.
A survey of complete journey structures of 18 393 slaughterweight lambs from farm to abattoir was conducted between April and July 1997. Journeys were characterized in terms of distances travelled, duration and the number of discrete components within a whole journey which comprised: transport; trans-shipping (when animals were transferred from one vehicle to another); multiple pickups from a number of farms; and holding at either assembly points, lairages or auction markets. The results identified that journeys in the livestock distribution system are diverse and range in complexity, irrespective of marketing channel. Journey complexity was found to be positively related to distance travelled.
The study demonstrates that discussions concerning welfare of livestock in transit should consider the journey structure and not just the marketing channel per se. Furthermore, it also shows that changes taking place in the infrastructure of the marketing and meat processing sectors may result in a reduction in animal welfare.
While studies from the start of the COVID-19 pandemic have described initial negative effects on mental health and exacerbating mental health inequalities, longer-term studies are only now emerging.
Method
In total, 34 465 individuals in the UK completed online questionnaires and were re-contacted over the first 12 months of the pandemic. We used growth mixture modelling to identify trajectories of depression, anxiety and anhedonia symptoms using the 12-month data. We identified sociodemographic predictors of trajectory class membership using multinomial regression models.
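The multinomial regression step above relates predictors to trajectory-class membership via a softmax link. A minimal sketch of that link, with entirely hypothetical linear-predictor scores (not the study's fitted model or class labels):

```python
import math

def softmax(scores):
    """Map linear-predictor scores to class-membership probabilities --
    the link function underlying a multinomial regression."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical linear predictors for one participant across four
# illustrative trajectory classes (e.g. low-stable, high-stable,
# improving, worsening); values are for demonstration only.
probs = softmax([2.0, 0.5, 0.1, -0.3])
```

The probabilities sum to one, and the class with the largest linear predictor receives the highest probability, which is how predictor effects translate into class assignment.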
Results
Most participants had consistently low symptoms of depression or anxiety over the year of assessments (60% and 69%, respectively), and a minority had consistently high symptoms (10% and 15%). We also identified participants who appeared to show improvements in symptoms as the pandemic progressed, and others who showed the opposite pattern of marked symptom worsening until the second national lockdown. Unexpectedly, most participants showed stable low positive affect, indicating anhedonia, throughout the 12-month period. From regression analyses, younger age, a previous mental health diagnosis, non-binary or self-defined gender, and unemployed or student status were significantly associated with membership of the stable high symptom groups for depression and anxiety.
Conclusions
While most participants showed little change in their depression and anxiety symptoms across the first year of the pandemic, we highlight the divergent responses of subgroups of participants, who fared both better and worse around national lockdowns. We confirm that previously identified predictors of negative outcomes in the first months of the pandemic also predict negative outcomes over a 12-month period.
As part of surveillance of snail-borne trematodiasis in Knowsley Safari (KS), Prescot, United Kingdom, a collection was made in July 2021 of various planorbid (n = 173) and lymnaeid (n = 218) snails. These were taken from 15 purposely selected freshwater habitats. In the laboratory, emergent trematode cercariae, often from single snails, were identified by morphology, with a sub-set of those most accessible later characterized by cytochrome oxidase subunit 1 (cox1) DNA barcoding. Two schistosomatid cercariae were of special note in the context of human cercarial dermatitis (HCD): Bilharziella polonica emergent from Planorbarius corneus and Trichobilharzia spp. emergent from Ampullaceana balthica. The former schistosomatid was last reported in the United Kingdom over 50 years ago. From cox1 analyses, the latter likely consisted of two taxa: Trichobilharzia anseri, a first report in the United Kingdom, and a hitherto unnamed genetic lineage with some affiliation to Trichobilharzia longicauda. The chronobiology of emergent cercariae from P. corneus was assessed, and the vertical swimming rate of B. polonica measured. We provide a brief risk appraisal of HCD for public activities typically undertaken within KS educational and recreational programmes.
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to find and examine the effect of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety (Generalised Anxiety Disorder scale, 7 items; GAD-7: −0.33 points) and an increase in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
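The prospective change scores above are paired within-person differences between pre-pandemic and pandemic assessments. A minimal sketch with made-up PHQ-9 values (illustrative only, not the study's data):

```python
from statistics import mean

def mean_change(pre, post):
    """Mean within-person symptom change (positive = worsening)."""
    return mean(b - a for a, b in zip(pre, post))

# Made-up paired PHQ-9 scores for five participants, measured
# pre-pandemic (pre) and during the pandemic (post).
pre  = [10, 4, 15, 7, 9]
post = [ 9, 4, 14, 8, 8]
change = mean_change(pre, post)  # negative value = average improvement
```

A negative prospective mean alongside a large positive retrospective rating is the pattern the Conclusions attribute to recall bias.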
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
Prenatal glucocorticoid overexposure causes adult metabolic dysfunction in several species but its effects on adult mitochondrial function remain largely unknown. Using respirometry, this study examined mitochondrial substrate metabolism of fetal and adult ovine biceps femoris (BF) and semitendinosus (ST) muscles after cortisol infusion before birth. Physiological increases in fetal cortisol concentrations pre-term induced muscle- and substrate-specific changes in mitochondrial oxidative phosphorylation capacity in adulthood. These changes were accompanied by muscle-specific alterations in protein content, fibre composition and abundance of the mitochondrial electron transfer system (ETS) complexes. In adult ST, respiration using palmitoyl-carnitine and malate was increased after fetal cortisol treatment but not with other substrate combinations. There were also significant increases in protein content and reductions in the abundance of all four ETS complexes, but not ATP synthase, in the ST of adults receiving cortisol prenatally. In adult BF, intrauterine cortisol treatment had no effect on protein content, respiratory rates, ETS complex abundances or ATP synthase. Activity of citrate synthase, a marker of mitochondrial content, was unaffected by intrauterine treatment in both adult muscles. In the ST but not BF, respiratory rates using all substrate combinations were significantly lower in the adults than fetuses, predominantly in the saline-infused controls. The ontogenic and cortisol-induced changes in mitochondrial function were, therefore, more pronounced in the ST than BF muscle. Collectively, the results show that fetal cortisol overexposure programmes mitochondrial substrate metabolism in specific adult muscles with potential consequences for adult metabolism and energetics.
Contact with livestock and consumption of unpasteurised dairy products are associated with an increased risk of zoonotic and foodborne infection, particularly among populations with close animal contact, including pastoralists and semi-pastoralists. However, there are limited data on disease risk factors among pastoralists and other populations where livestock herding, particularly of dromedary camels, is common. This cross-sectional study used a previously validated survey instrument to identify risk factors for self-reported symptoms. Adults (n = 304) were randomly selected from households (n = 171) in the Somali Region of Ethiopia, a region characterised by chronic food insecurity, population displacement, recurrent droughts and large semi-pastoralist and pastoralist populations. Multivariable logistic regression assessed associations between self-reported symptoms and type of milk consumed, controlling for demographics and human-animal interaction. Consumption of days-old unrefrigerated raw camel milk was significantly associated with symptoms in the 30 days prior to the survey (AOR = 5.07; 95% CI 2.41–10.66), after controlling for age, refugee status, sanitation, camel ownership and source of drinking water and accounting for clustering. Consumption of days-old unrefrigerated raw ruminant milk was significantly associated with symptoms (AOR = 4.00, 95% CI 1.27–12.58). Source of drinking water and camel ownership, a proxy for camel contact, were significantly associated with the outcome in each model. There were no significant associations between self-reported symptoms and fresh or soured animal milk consumption. Research is needed to identify pathogens and major routes of transmission. Tailored communication campaigns to encourage safe food preparation should also be considered.
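The adjusted odds ratios reported above are exponentiated logistic-regression coefficients with Wald confidence intervals. A minimal sketch of that conversion; the coefficient and standard error below are back-calculated from the reported AOR = 5.07 (95% CI 2.41–10.66) for illustration and are not the study's fitted model:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its Wald
    interval to obtain an odds ratio with a 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative values back-calculated from the reported camel-milk
# association (AOR = 5.07; 95% CI 2.41-10.66).
aor, lo, hi = odds_ratio_ci(1.6233, 0.3793)
```

Because the interval is symmetric on the log-odds scale, the CI is asymmetric around the odds ratio itself, as seen in the reported values.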
We present Hubble Space Telescope Wide Field Camera 3 photometric and grism observations of the candidate ultra-high-redshift ($z>7$) radio galaxy, GLEAM J0917–0012. This radio source was selected due to the curvature in its 70–230 MHz, low-frequency Murchison Widefield Array radio spectrum and its faintness in K-band. Follow-up spectroscopic observations of this source with the Jansky Very Large Array and Atacama Large Millimetre Array were inconclusive as to its redshift. Our F105W and F098M imaging observations detect the host of GLEAM J0917–0012 and a companion galaxy, $\sim$ one arcsec away. The G102 grism observations reveal a single weak line in each of the spectra of the host and the companion. To help identify these lines we utilised several photometric redshift techniques including template fitting to the grism spectra, fitting the ultraviolet (UV)-to-radio photometry with galaxy templates plus a synchrotron model, fitting of the UV-to-near-infrared photometry with EAZY, and fitting the radio data alone with RAiSERed. For the host of GLEAM J0917–0012 we find a line at $1.12\,\mu$m and the UV-to-radio spectral energy distribution (SED) fitting favours solutions at $z\sim 2$ or $z\sim 8$. While this fitting shows a weak preference for the lower redshift solution, the models from the higher redshift solution are more consistent with the strength of the spectral line. The redshift constraint by RAiSERed of $>6.5$ also supports the interpretation that this line could be Lyman-$\alpha$ at $z=8.21$; however EAZY favours the $z\sim 2$ solution. We discuss the implications of both solutions. For the companion galaxy we find a line at $0.98\,\mu$m and the SED fitting favours solutions at $z<3$, implying that the line could be the [OII]$\lambda3727$ doublet at $z=1.63$ (although the EAZY solution is $z\sim 2.6\pm 0.5$). Further observations are still required to unambiguously determine the redshift of this intriguing candidate ultra-high-redshift radio galaxy.
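The line identifications above reduce to simple wavelength arithmetic, $z = \lambda_{\rm obs}/\lambda_{\rm rest} - 1$. A sketch using the standard rest wavelengths (Lyman-α at 121.567 nm, [OII] at 372.7 nm), which reproduces the quoted redshifts:

```python
def redshift(observed_nm, rest_nm):
    """Redshift implied by identifying an observed spectral line
    with a given rest-frame transition."""
    return observed_nm / rest_nm - 1

# Host line at 1.12 micron (1120 nm) identified as Lyman-alpha:
z_host = redshift(1120.0, 121.567)      # ~8.21
# Companion line at 0.98 micron (980 nm) identified as [OII] 3727:
z_companion = redshift(980.0, 372.7)    # ~1.63
```

The ambiguity discussed in the abstract arises because a single weak line admits multiple such identifications, each implying a different redshift.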
This study examined struggles to establish autonomy and relatedness with peers in adolescence and early adulthood as predictors of advanced epigenetic aging assessed at age 30. Participants (N = 154; 67 male and 87 female) were observed repeatedly, along with close friends and romantic partners, from ages 13 through 29. Observed difficulty establishing close friendships characterized by mutual autonomy and relatedness from ages 13 to 18, an interview-assessed attachment state of mind lacking autonomy and valuing of attachment at 24, and self-reported difficulties in social integration across adolescence and adulthood were all linked to greater epigenetic age at 30, after accounting for chronological age, gender, race, and income. Analyses assessing the unique and combined effects of these factors, along with lifetime history of cigarette smoking, indicated that each of these factors, except for adult social integration, contributed uniquely to explaining epigenetic age acceleration. Results are interpreted as evidence that the adolescent preoccupation with peer relationships may be highly functional given the relevance of such relationships to long-term physical outcomes.
Portable oxygen concentrators (POCs) are medical devices that use physical means to separate oxygen from the atmosphere to produce concentrated, medical-grade gas. Providing oxygen in low-resource environments, such as austere locations, military combat zones, rural Emergency Medical Services (EMS), and disasters, is expensive and logistically intensive. Recent advances in separation technology have promoted the development of POC systems ruggedized for austere use. This review provides a comprehensive summary of the available data regarding POCs in these challenging environments.
Methods:
PubMed, Google Scholar, and the Defense Technical Information Center were searched from inception to November 2021. Articles addressing the use of POCs in low-resource settings were selected. Three authors were independently involved in the search, review, and synthesis of the articles. Evidence was graded using Oxford Centre for Evidence-Based Medicine guidelines.
Results:
The initial search identified 349 articles, of which 40 articles were included in the review. A total of 724 study subjects were associated with the included articles. There were no Level I systematic reviews or randomized controlled trials.
Discussion:
Generally, POCs are a low-cost, light-weight tool that may fill gaps in austere, military, veterinary, EMS, and disaster medicine. They are cost-effective in low-resource areas, such as rural and high-altitude hospitals in developing nations, despite the relatively high capital costs of the initial equipment purchase. Implementation of POCs in low-resource locations is limited primarily by access to electricity, but the devices can otherwise operate for thousands of hours without maintenance. They provide a unique advantage in combat operations, as there is no risk of explosion if they are struck by high-velocity projectiles, unlike oxygen tanks. Despite their deployment throughout the battlespace, no manuscripts were identified during the review addressing the efficacy of POCs for combat casualties or clinical outcomes in combat. Veterinary medicine and animal studies have provided the most robust data on the physiological effectiveness of POCs. The success of POCs during the coronavirus disease 2019 (COVID-19) pandemic highlights their potential during future mass-casualty events. Emerging technology combines a larger oxygen concentrator with a compressor system capable of refilling small oxygen cylinders, which could transform the delivery of oxygen in austere environments if ruggedized and miniaturized. Future clinical research is needed to quantify the clinical efficacy of POCs in low-resource settings.