The ongoing coronavirus pandemic erupted with the first confirmed cases in Wuhan, China, in December 2019; it is caused by the novel coronavirus SARS-CoV-2, and the resulting disease is referred to as COVID-19. The World Health Organization (WHO) confirmed the outbreak and declared it a global pandemic. To date, the pandemic has infected nearly 100 million people and killed over 2 million. COVID-19 is breaking through every public health barrier, guardrail, and safety measure in underdeveloped and highly developed countries alike, with peaks and troughs over time. Regions experiencing conflict and war are especially hard hit. Morbidity and mortality rise sharply in at-risk communities that lack the capacity to implement basic preventive measures. States around the globe struggle to unify responses, improve preparedness, and identify and symptomatically treat positive cases, while laboratories frantically roll out vaccines and effective surveillance and therapeutic mechanisms. The incidence and prevalence of COVID-19 may continue to increase globally as long as no unified disaster response materializes and disinformation spreads. Amid this failure of response, virus variants are emerging at a dizzying pace. Ungoverned spaces where non-state actors predominate, as well as active war zones, may become the next epicenters of COVID-19 fatalities.
As incidence rates continue to rise, hospitals in North America and Europe exceed surge capacity, and post-infection immunity remains poorly characterized. If robust, previously high-quality healthcare systems in the most developed economies are failing the challenge posed by COVID-19, how will less developed economies, and healthcare infrastructures destroyed by war and conflict, cope until adequate vaccine penetration or adequate treatment is established in these communities? Ukraine and other states in the Black Sea region are under threat, exposed daily to armed Russian aggression against their territorial sovereignty. Ukraine, where Russia has been waging war since 2014, faces this dual threat directly: disaster response to violence and to a deadly infectious disease. To best serve biosurveillance, aid pandemic disaster response, and bolster health security across Europe, the North Atlantic Treaty Organization (NATO), and the Black Sea region, NATO integration across Ukraine's disaster response structures within the Ministries of Health, Defense, and Interior must be reinforced and expanded to mitigate the COVID-19 disaster.
Objective: To investigate the timing and routes of contamination of the rooms of patients newly admitted to the hospital.
Design: Observational cohort study and simulations of pathogen transfer.
Setting: A Veterans’ Affairs hospital.
Participants: Patients newly admitted to the hospital with no known carriage of healthcare-associated pathogens.
Methods: Interactions between the participants and personnel or portable equipment were observed, and cultures of high-touch surfaces, floors, bedding, and patients’ socks and skin were collected for up to 4 days. Cultures were processed for Clostridioides difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Simulations were conducted with bacteriophage MS2 to assess plausibility of transfer from contaminated floors to high-touch surfaces and to assess the effectiveness of wearing slippers in reducing transfer.
Results: Environmental cultures became positive for at least 1 pathogen in 10 (59%) of the 17 rooms, with cultures positive for MRSA, C. difficile, and VRE in the rooms of 10 (59%), 2 (12%), and 2 (12%) participants, respectively. For all 14 instances of pathogen detection, the initial site of recovery was the floor, followed in a subset of patients by detection on sock bottoms, bedding, and high-touch surfaces. In simulations, wearing slippers over hospital socks dramatically reduced transfer of bacteriophage MS2 from the floor to hands and to high-touch surfaces.
Conclusions: Floors may be an underappreciated source of pathogen dissemination in healthcare facilities. Simple interventions such as having patients wear slippers could potentially reduce the risk for transfer of pathogens from floors to hands and high-touch surfaces.
Background: Gloves and gowns are used during patient care to reduce contamination of personnel and prevent pathogen transmission.
Objective: To determine whether the use of gowns adds a substantial benefit over gloves alone in preventing patient-to-patient transfer of a viral DNA surrogate marker.
Methods: In total, 30 source patients had 1 cauliflower mosaic virus surrogate marker applied to their skin and clothing and a second to their bed rail and bedside table. Personnel caring for the source patients were randomized to wear gloves, gloves plus cover gowns, or no barrier. Interactions with up to 7 subsequent patients were observed, and the percentages of transfer of the DNA markers were compared among the 3 groups.
Results: In comparison to the no-barrier group (57.8% transfer of 1 or both markers), there were significant reductions in transfer of the DNA markers in the gloves group (31.1% transfer; odds ratio [OR], 0.16; 95% confidence interval [CI], 0.02–0.73) and the gloves-plus-gown group (25.9% transfer; OR, 0.11; 95% CI, 0.01–0.51). The addition of a cover gown to gloves during the interaction with the source patient did not significantly reduce the transfer of the DNA marker (P = .53). During subsequent patient interactions, transfer of the DNA markers was significantly reduced if gloves plus gowns were worn and if hand hygiene was performed (P < .05).
Conclusions: Wearing gloves or gloves plus gowns reduced the frequency of patient-to-patient transfer of a viral DNA surrogate marker. The use of gloves plus gowns during interactions with the source patient did not reduce transfer in comparison to gloves alone.
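As a hedged illustration of the statistics reported above, the following Python sketch computes an odds ratio with a Woolf (logit) 95% confidence interval from 2×2 counts; the counts are invented for demonstration and are not the study's data.

```python
# Minimal sketch: odds ratio and Woolf (logit) 95% CI from a 2x2 table.
# The counts below are hypothetical, not the study's data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for the table [[a, b], [c, d]]:
    a = barrier group with transfer, b = barrier group without,
    c = no-barrier group with transfer, d = no-barrier group without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(odds_ratio_ci(14, 31, 26, 19))  # e.g. OR ~ 0.33 with its 95% CI
```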
New Zealand has a long-running campylobacter infection (campylobacteriosis) epidemic with contaminated fresh chicken meat as the major source. This is both the highest-impact zoonosis and the largest food safety problem in the country. Adding to this burden is the recent rapid emergence of antibiotic resistance in these campylobacter infections acquired from locally-produced chicken. Campylobacteriosis rates halved in 2008, as compared with the previous 5 years, following the introduction of regulatory limits on allowable contamination levels in fresh chicken meat, with large health and economic benefits resulting. In the decade since, disease rates do not appear to have declined further. The cumulative impact over the last 10 years (2009–2018) would equate to an estimated 539 000 cases, 5480 hospitalisations, 284 deaths and economic costs of approximately US$380 million. Additional regulatory interventions that build on previously successful regulations in this country are urgently needed to control the source of this epidemic.
Background: There is controversy regarding whether the addition of cover gowns offers a substantial benefit over gloves alone in reducing personnel contamination and preventing pathogen transmission.
Setting: Simulated patient care interactions.
Objective: To evaluate the efficacy of different types of barrier precautions and to identify routes of transmission.
Methods: In randomly ordered sequence, 30 personnel each performed 3 standardized examinations of mannequins contaminated with pathogen surrogate markers (cauliflower mosaic virus DNA, bacteriophage MS2, nontoxigenic Clostridioides difficile spores, and fluorescent tracer) while wearing no barriers, gloves, or gloves plus gowns, followed by examination of a noncontaminated mannequin. We compared the frequency and routes of transfer of the surrogate markers to the second mannequin or the environment.
Results: For a composite of all surrogate markers, transfer by hands occurred at significantly lower rates in the gloves-alone group (OR, 0.02; P < .001) and the gloves-plus-gown group (OR, 0.06; P = .002). Transfer by stethoscope diaphragms was common in all groups and was reduced by wiping the stethoscope between simulations (OR, 0.06; P < .001). Compared to the no-barriers group, wearing a cover gown and gloves resulted in reduced contamination of clothing (OR, 0.15; P < .001), but wearing gloves alone did not.
Conclusions: Wearing gloves alone or gloves plus gowns reduces hand transfer of pathogens but may not address transfer by devices such as stethoscopes. Cover gowns reduce the risk of contaminating the clothing of personnel.
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care. There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of the criteria used. At best, only half of antibiotic treatments met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
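Purely as an illustration of tallying appropriateness against multiple criteria, the sketch below applies stand-in predicates to invented resident records; these rules are hypothetical and are not the published Loeb, UTI SBAR, Crnich-Drinka, or cystitis consensus definitions.

```python
# Invented example: score records against stand-in UTI criteria
# and report the percentage judged appropriate under each.
records = [
    {"fever": True, "dysuria": True, "catheter": False},
    {"fever": False, "dysuria": False, "catheter": True},
    {"fever": True, "dysuria": False, "catheter": False},
]

criteria = {
    "loeb_like": lambda r: r["fever"] and r["dysuria"],
    "catheter_rule": lambda r: r["catheter"] and r["fever"],
}

for name, rule in criteria.items():
    pct = 100 * sum(rule(r) for r in records) / len(records)
    print(f"{name}: {pct:.0f}% of treatment starts supported")
```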
Background: The hands of healthcare personnel are the most important source for transmission of healthcare-associated pathogens. The role of contaminated fomites such as portable equipment, stethoscopes, and clothing of personnel in pathogen transmission is unclear.
Objective: To study routes of transmission of cauliflower mosaic virus DNA markers from 31 source patients and from environmental surfaces in their rooms.
Design: A 3-month observational cohort study.
Setting: A Veterans’ Affairs hospital.
Methods: After providing care for source patients, healthcare personnel were observed during interactions with subsequent patients. Putative routes of transmission were identified based on recovery of DNA markers from sites of contact with the patient or environment. To assess the plausibility of fomite-mediated transmission, we measured the frequency of transfer of methicillin-resistant Staphylococcus aureus (MRSA) from the skin of 25 colonized patients via gloved hands versus fomites.
Results: Of 145 interactions involving contact with patients and/or the environment, 41 (28.3%) resulted in transfer of 1 or both DNA markers to the patient and/or the environment. The DNA marker applied to patients’ skin and clothing was transferred most frequently by stethoscopes, hands, and portable equipment, whereas the marker applied to environmental surfaces was transferred only by hands and clothing. The percentages of MRSA transfer from the skin of colonized patients via gloved hands, stethoscope diaphragms, and clothing were 52%, 40%, and 48%, respectively.
Conclusions: Fomites such as stethoscopes, clothing, and portable equipment may be underappreciated sources of pathogen transmission. Simple interventions such as decontamination of fomites between patients could reduce the risk for transmission.
This chapter examines the networks and connections within the early modern legal community of Aberdeen, Scotland. It reconstructs a particular master-apprentice network from the early seventeenth to the mid-eighteenth centuries, showing the importance of this educational mechanism both for entrance into the local legal profession and for establishing professional contacts. This chapter also reconstructs the networks which were focused on two of Aberdeen’s most important courts of the period: the sheriff and commissary courts. It shows the extent to which the men who held offices in these courts were interconnected, both personally and professionally, and reflects on what this discovery reveals about contemporaneous local court practice. Finally, this chapter concludes by reflecting on how men of law may have regarded their own networks, through an examination of their children’s god-parentage records.
There is a requirement in some beef markets to slaughter bulls at under 16 months of age. This requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences in longissimus thoracis (LT) muscle from conventionally reared 16-month bulls and 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to expand this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum and slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC) to examine the effect of age per se, and the cheaper alternative for 19-month bulls described above (19-GC). The results indicate that muscles from 19-CC were more red, had more intramuscular fat and higher cook loss than those from 16-C. No differences in muscle objective texture or sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of the effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction, based on meat acceptability, in commercial suckler bull production.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
Background: Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
Objective: To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
Methods: In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
Setting: Rooms of patients with CDI at University of Iowa Hospitals and Clinics.
Results: Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
Conclusions: Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
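As a sketch of the pre- versus post-flush comparison described in the methods above, the following Python snippet runs a Wilcoxon rank-sum test on fabricated concentration values; these numbers are not the study's measurements.

```python
# Wilcoxon rank-sum test on invented pre- and post-flush concentrations.
from scipy.stats import ranksums

preflush = [12, 8, 15, 10, 9, 14, 11, 13]     # particles per liter (made up)
postflush = [21, 18, 30, 25, 16, 28, 22, 26]  # made up as well

stat, p = ranksums(preflush, postflush)
print(f"rank-sum statistic = {stat:.2f}, p = {p:.4f}")
```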
We systematically reviewed implementation research targeting depression interventions in low- and middle-income countries (LMICs) to assess gaps in methodological coverage.
PubMed, CINAHL, PsycINFO, and EMBASE were searched for evaluations of depression interventions in LMICs, published through March 2019, that reported at least one implementation outcome.
A total of 8714 studies were screened, 759 were assessed for eligibility, and 79 studies met inclusion criteria. Common implementation outcomes reported were acceptability (n = 50; 63.3%), feasibility (n = 28; 35.4%), and fidelity (n = 18; 22.8%). Only four studies (5.1%) reported adoption or penetration, and three (3.8%) reported sustainability. The Sub-Saharan Africa region (n = 29; 36.7%) had the most studies. The majority of studies (n = 59; 74.7%) reported outcomes for a depression intervention implemented in pilot researcher-controlled settings. Studies commonly focused on Hybrid Type-1 effectiveness-implementation designs (n = 53; 67.1%), followed by Hybrid Type-3 (n = 16; 20.3%). Only 21 studies (26.6%) tested an implementation strategy, with the most common being revising professional roles (n = 10; 47.6%). The most common intervention modality was individual psychotherapy (n = 30; 38.0%). Common study designs were mixed methods (n = 27; 34.2%), quasi-experimental uncontrolled pre-post (n = 17; 21.5%), and individual randomized trials (n = 16; 20.3%).
Existing research has focused on early-stage implementation outcomes. Most studies have utilized Hybrid Type-1 designs, with the primary aim to test intervention effectiveness delivered in researcher-controlled settings. Future research should focus on testing and optimizing implementation strategies to promote scale-up of evidence-based depression interventions in routine care. These studies should use high-quality pragmatic designs and focus on later-stage implementation outcomes such as cost, penetration, and sustainability.
Background: Sink drainage systems are not amenable to standard methods of cleaning and disinfection. Disinfectants applied as a foam might enhance the efficacy of drain decontamination due to greater persistence and increased penetration into sites harboring microorganisms.
Objective: To examine the efficacy and persistence of foam-based products in reducing sink drain colonization with gram-negative bacilli.
Methods: During a 5-month period, different methods for sink drain disinfection in patient rooms were evaluated in a hospital and its affiliated long-term care facility. We compared the efficacy of a single treatment with 4 different foam products in reducing the burden of gram-negative bacilli in the sink drain to a depth of 2.4 cm (1 inch) below the strainer. For the most effective product, we evaluated the effectiveness of foam versus liquid-pouring application and of repeated foam treatments.
Results: A foam product containing 3.13% hydrogen peroxide and 0.05% peracetic acid was significantly more effective than the other 3 foam products. In comparison to pouring the hydrogen peroxide and peracetic acid disinfectant, the foam application resulted in significantly reduced recovery of gram-negative bacilli on days 1, 2, and 3 after treatment, with a return to baseline by day 7. With repeated treatments every 3 days, a progressive decrease in the bacterial load recovered from sink drains was achieved.
Conclusions: An easy-to-use foaming application of a hydrogen peroxide- and peracetic acid-based disinfectant suppressed sink-drain colonization for at least 3 days. Intermittent application of the foaming disinfectant could potentially reduce the risk for dissemination of pathogens from sink drains.
People with cerebral palsy (CP) are less physically active than the general population and, consequently, are at increased risk of preventable disease. Evidence indicates that low-moderate doses of physical activity can reduce disease risk and improve fitness and function in people with CP. Para athletes with CP typically engage in ‘performance-focused’ sports training, which is undertaken for the sole purpose of enhancing sports performance. Anecdotally, many Para athletes report that participation in performance-focused sports training confers meaningful clinical benefits which exceed those reported in the literature; however, supporting scientific evidence is lacking. The aim of this paper is to describe the protocol for an 18-month study evaluating the clinical effects of a performance-focused swimming training programme for people with CP who have high support needs.
This study will use a concurrent multiple-baseline, single-case experimental design across three participants with CP who have high support needs. Each participant will complete a five-phase trial comprising: baseline (A1); training phase 1 (B1); maintenance phase 1 (A2); training phase 2 (B2); and maintenance phase 2 (A3). For each participant, measurement of swim velocity, health-related quality of life and gross motor functioning will be carried out a minimum of five times in each of the five phases.
The study described will produce Level II evidence regarding the effects of performance-focused swimming training on clinical outcomes in people with CP who have high support needs. Findings are expected to provide an indication of the potential for sport to augment outcomes in neurological rehabilitation.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.
Content analysis is the process of turning text into data, with either automated or manual techniques, and it provides a feasible and attractive option for undergraduate students to develop and utilize original data. This article presents a cohesive framework for teaching computerized content analysis in undergraduate political science courses. The article discusses examples of how we have taught the techniques in our own classrooms and provides a framework for a content-analysis research assignment. We describe coding, sources of text data available to students, software recommendations appropriate for students, and write-up issues. In the process, we also discuss various learning opportunities that arise from both the strengths and weaknesses of computerized content analysis as a methodological strategy.
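To make the technique concrete, here is a toy Python example of dictionary-based computerized content analysis in the spirit described above; the category dictionary and sample text are invented for illustration, not drawn from the article's assignments.

```python
# Toy dictionary-based content analysis: code a text into category counts.
from collections import Counter
import re

CATEGORIES = {
    "economy": {"economy", "jobs", "taxes", "trade"},
    "security": {"war", "defense", "terrorism", "border"},
}

def code_text(text):
    """Count how often each category's keywords appear in a text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    return {cat: sum(counts[w] for w in words)
            for cat, words in CATEGORIES.items()}

speech = "Jobs and trade will grow the economy; defense spending secures the border."
print(code_text(speech))  # {'economy': 3, 'security': 2}
```

In a classroom setting, students would replace the toy dictionary with validated category lexicons and apply the coder to a corpus such as press releases or floor speeches.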
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or within 3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
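For intuition about the matched analysis above: in a 1:1 matched case-control design with a binary exposure, the conditional maximum-likelihood odds ratio reduces to the ratio of discordant pairs. The counts in this sketch are invented, not the study's data.

```python
# Matched-pair odds ratio from discordant pairs (1:1 matched design).
# Counts are hypothetical, not the study's data.
case_exposed_only = 30    # pairs where only the case had recent antibiotic use
control_exposed_only = 5  # pairs where only the control did

matched_or = case_exposed_only / control_exposed_only
print(f"matched odds ratio = {matched_or:.2f}")  # 6.00
```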
Background: Medical procedures and patient care activities may facilitate environmental dissemination of healthcare-associated pathogens such as methicillin-resistant Staphylococcus aureus (MRSA).
Methods: Observational cohort study of MRSA-colonized patients to determine the frequency of and risk factors for environmental shedding of MRSA during procedures and care activities in carriers with positive nares and/or wound cultures. Bivariate analyses were performed to identify factors associated with environmental shedding.
Setting: A Veterans Affairs hospital.
Participants: This study included 75 patients in contact precautions for MRSA colonization or infection.
Results: Of 75 patients in contact precautions for MRSA, 55 (73%) had MRSA in nares and/or wounds and 25 (33%) had positive skin cultures. For the 52 patients with MRSA in nares and/or wounds and at least 1 observed procedure, environmental shedding of MRSA occurred more frequently during procedures and care activities than in the absence of a procedure (59 of 138, 43% vs 8 of 83, 10%; P < .001). During procedures, increased shedding occurred ≤0.9 m versus >0.9 m from the patient (52 of 138, 38% vs 25 of 138, 18%; P = .0004). Contamination occurred frequently on surfaces touched by personnel (12 of 38, 32%) and on portable equipment used for procedures (25 of 101, 25%). By bivariate analysis, the presence of a wound with MRSA was associated with shedding (17 of 29, 59% versus 6 of 23, 26%; P = .04).
Conclusions: Environmental shedding of MRSA occurs frequently during medical procedures and patient care activities. There is a need for effective strategies to disinfect surfaces and equipment after procedures.
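The bivariate comparisons above can be reproduced in spirit with Fisher's exact test on the published proportions (59/138 vs 8/83); note that the choice of test here is an assumption, since the abstract does not state which test produced P < .001.

```python
# Fisher's exact test on the shedding proportions reported above.
from scipy.stats import fisher_exact

table = [[59, 138 - 59],  # shedding vs not, during procedures/care activities
         [8, 83 - 8]]     # shedding vs not, no procedure
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p:.2e}")
```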
Objective: To evaluate the efficacy of multiple ultraviolet (UV) light decontamination devices in a radiology procedure room.
Methods: We compared the efficacy of 8 UV decontamination devices with a 4-minute UV exposure time in reducing recovery of methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), and Clostridium difficile spores on steel disk carriers placed at 5 sites on a computed tomography patient table. Analysis of variance was used to compare reductions for the different devices. A spectrometer was used to obtain irradiance measurements for the devices.
Results: Four standard vertical tower low-pressure mercury devices achieved 2 log10 CFU or greater reductions in VRE and MRSA and ~1 log10 CFU reductions in C. difficile spores, whereas a pulsed-xenon device resulted in less reduction in the pathogens (P < .001). In comparison to the vertical tower low-pressure mercury devices, equal or greater reductions in the pathogens were achieved by 3 nonstandard low-pressure mercury devices that included either adjustable bulbs that could be oriented directly over the exam table, a robotic base allowing movement along the side of the table during operation, or 3 vertical towers operated simultaneously. The low-pressure mercury devices produced primarily UV-C light, whereas the pulsed-xenon device produced primarily UV-A and UV-B light. The time required to move the devices from the corner of the room and set up for operation varied from 18 to 59 seconds.
Conclusions: Many currently available UV devices could provide an effective and efficient adjunct to manual cleaning and disinfection in radiology procedure rooms.
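As a hedged sketch of the analysis described above, the snippet below computes the log10 CFU reduction metric and compares invented per-site reductions across devices with a one-way ANOVA; none of these numbers come from the study.

```python
# log10 CFU reduction metric plus one-way ANOVA across devices.
# All values are invented for illustration.
import numpy as np
from scipy.stats import f_oneway

def log10_reduction(cfu_before, cfu_after):
    """Reduction in colony-forming units on a log10 scale."""
    return np.log10(cfu_before) - np.log10(cfu_after)

device_a = [2.1, 2.3, 1.9, 2.4, 2.0]      # hypothetical per-site reductions
device_b = [2.2, 2.0, 2.5, 2.1, 2.3]
pulsed_xenon = [0.9, 1.1, 0.8, 1.2, 1.0]

stat, p = f_oneway(device_a, device_b, pulsed_xenon)
print(f"F = {stat:.1f}, p = {p:.4g}")
print(f"example: {log10_reduction(1e6, 1e4):.1f} log10 reduction")  # 2.0
```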
Soldiers’ operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining injury/illness free. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data world-wide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on the vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment compared with late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05) and remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, as such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer term.