World War II was a global war. It was a war of transport, mobility, and speed, fought in the far corners of the planet, across four continents and four oceans. It was also a gross national product war, fought with huge armies equipped with the fruits of modern industry – guns, artillery, tanks, trucks, airplanes, and thousands of other tools and supplies. To fight a war of men and materiel simultaneously on opposite sides of the earth, the United States needed weapons of speed and distance, such as the airplane, but also tools of mobile transport, such as heavy trucks – and it needed them in record numbers. The United States also required a transportation infrastructure at home (and abroad) that could deliver people and resources where needed for production and warfare.
Refusal of care or transport is a common issue brought to medical control physicians, and it is an especially challenging decision in the case of minors. Parents or guardians may refuse medical care for a minor if there is no imminent threat of harm to the minor. However, if a minor patient is presumed to need emergent medical care to prevent harm, medical personnel have the right to treat the minor, even if the parent or guardian objects. If the minor patient is a fetus or a neonate, it is not always clear when they are considered a separate patient. There appears to be no overriding general rule or law and, consequently, Emergency Medical Services (EMS) protocols vary greatly from state to state. This case report describes one patient encounter that involved some of these unclear legal areas and how it fit with local EMS protocols. The legal question arose when a pregnant patient delivered her baby, but the umbilical cord was not cut. Are the mother’s rights violated by cutting the umbilical cord if she objects to the procedure? How is the medical control physician to decide when to go beyond established EMS protocols to ensure that the safest and most ethical care is provided to a patient in the field? Does the care of the infant or the mother take precedence? Continued analysis of cases is required to ensure that protocols and guidelines protect both patients and providers.
Venegas A, Ann Maggiore W, Wells R, Baker R, Watts S. Medical control decisions: when does a neonate become a separate patient? Prehosp Disaster Med. 2019;34(2):224–225.
Technological advancements in medical devices developed for adults far outpace the development of technologies designed for pediatric patients in the USA and other countries. This technology lag was previously reflected in a lack of pediatric-specific innovation within our academic institution. To address the institutional deficit of device innovation around pediatric patients, we formed unique partnerships both within our university and extending to the medical device industry, and developed novel programmatic approaches. The Pediatric Device Innovation Consortium (PDIC) bridges the medical device community and the University of Minnesota. Since 2014, the PDIC has supported 22 pediatric medical technology innovation projects, provided funds totaling more than $500,000, licensed two technologies, and advanced two technologies to patient use. Here, we describe the PDIC model and method, the PDIC approach to common challenges that arise in the development of small-market medical technologies at an academic institution, and iterations to our collaborative, multidisciplinary approach that have matured throughout our experience. The PDIC model continues to evolve to reflect the special needs of innovation for smaller markets and the unique role of clinician innovators. Our approach serves as a successful model for other institutions interested in creating support mechanisms for pediatric or small-market technology development.
Salmonella spp. continue to be a leading cause of foodborne morbidity worldwide. To assess the risk of foodborne disease, current national regulatory schemes focus on prevalence estimates of Salmonella and other pathogens. The role of pathogen quantification as a risk management measure and its impact on public health is not well understood. To address this information gap, a quantitative risk assessment model was developed to evaluate the impact of pathogen enumeration strategies on public health after consumption of contaminated ground turkey in the USA. Public health impact was evaluated by using several dose–response models for high- and low-virulence strains to account for potential under- or overestimation of human health impacts. The model predicted 2705–21 099 illnesses that would result in 93–727 reported cases of salmonellosis. Sensitivity analysis identified cooking an unthawed product at home as the riskiest consumption scenario and microbial concentration as the most influential input on the incidence of human illnesses. Model results indicated that removing ground turkey lots exceeding contamination levels of 1 MPN/g and 1 MPN in 25 g would decrease the median number of illnesses by 86–94% and 99%, respectively. For a single production lot, contamination levels higher than 1 MPN/g would be needed to result in a reported case to public health officials. At contamination levels of 10 MPN/g, there would be a 13% chance of detecting an outbreak, and at 100 MPN/g, the likelihood of detecting an outbreak increases to 41%. Based on these model predictions, risk management strategies should incorporate pathogen enumeration; doing so would have a direct impact on illness incidence by linking public health outcomes with measurable food safety objectives.
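To illustrate the kind of dose–response calculation such a model rests on, here is a minimal Monte Carlo sketch in Python. The beta-Poisson parameters, serving size, and cooking log-reduction below are assumed illustrative values, not the inputs used in the study's model:

```python
import random

# Illustrative beta-Poisson dose-response parameters (assumed values,
# not those used in the study's risk assessment model).
ALPHA, BETA = 0.1324, 51.45

def p_illness(dose_mpn: float) -> float:
    """Beta-Poisson probability of illness for a given ingested dose."""
    return 1.0 - (1.0 + dose_mpn / BETA) ** (-ALPHA)

def fraction_ill(concentration_mpn_per_g: float, serving_g: float = 100.0,
                 mean_log10_reduction: float = 6.0, n: int = 100_000,
                 seed: int = 1) -> float:
    """Monte Carlo over servings: lot contamination -> cooking -> dose -> illness."""
    rng = random.Random(seed)
    ill = 0
    for _ in range(n):
        # Variable cooking effectiveness (assumed 1-log spread around the mean).
        log_reduction = rng.gauss(mean_log10_reduction, 1.0)
        dose = concentration_mpn_per_g * serving_g * 10 ** (-log_reduction)
        if rng.random() < p_illness(dose):
            ill += 1
    return ill / n
```

Raising the lot contamination level raises the per-serving dose and hence the predicted fraction of illnesses, which is the mechanism behind the enumeration thresholds discussed above.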
The ‘Digital Index of North American Archaeology’ (DINAA) project demonstrates how the aggregation and publication of government-held archaeological data can help to document human activity over millennia and at a continental scale. These data can provide a valuable link between specific categories of information available from publications, museum collections and online databases. Integration improves the discovery and retrieval of records of archaeological research currently held by multiple institutions within different information systems. It also aids in the preservation of those data and makes efforts to archive these research results more resilient to political turmoil. While DINAA focuses on North America, its methods have global applicability.
Patients with poorly controlled diabetes mellitus may have a sentinel emergency department (ED) visit for a precipitating condition prior to presenting for a hyperglycemic emergency, such as diabetic ketoacidosis (DKA) or hyperosmolar hyperglycemic state (HHS). This study’s objective was to describe the epidemiology and outcomes of patients with a sentinel ED visit prior to their hyperglycemic emergency visit.
This was a 1-year health records review of patients ≥18 years old presenting to one of four tertiary care EDs with a discharge diagnosis of hyperglycemia, DKA, or HHS. Trained research personnel collected data on patient characteristics, management, and disposition, and determined whether patients came to the ED within the 14 days prior to their hyperglycemia visit. Descriptive statistics were used to summarize the data.
Of 833 visits for hyperglycemia, 142 (17.0%; 95% CI: 14.5% to 19.6%) had a sentinel ED presentation within the preceding 14 days. Mean (SD) age was 50.5 (19.0) years and 54.4% were male; 104 (73.2%) were discharged from this initial visit, and 98/104 (94.2%) were discharged either without their glucose checked or with an elevated blood glucose (>11.0 mmol/L). Of the sentinel visits, 93 (65.5%) were for hyperglycemia and 22 (15.5%) for infection. Upon returning to the ED, 61/142 (43.0%) were admitted for severe hyperglycemia, DKA, or HHS.
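The reported proportion and 95% CI (142/833; 14.5% to 19.6%) correspond to a standard normal-approximation (Wald) interval, which can be reproduced as follows:

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1.0 - p) / n)
    return p - half_width, p + half_width

lo, hi = wald_ci(142, 833)  # sentinel visits out of all hyperglycemia visits
print(f"{142/833:.1%} (95% CI: {lo:.1%} to {hi:.1%})")
# → 17.0% (95% CI: 14.5% to 19.6%)
```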
In this unique ED-based study, diabetic patients with a sentinel ED visit often returned and required subsequent admission for hyperglycemia. Clinicians should be vigilant in checking blood glucose and provide clear discharge instructions for follow-up and glucose management to prevent further hyperglycemic emergencies from occurring.
Improving neurocognitive outcomes following treatment for brain metastases has become increasingly important. We propose that a brief telephone-based neurocognitive assessment may improve follow-up cognitive assessments in this palliative population. Aim: To prospectively assess the feasibility and reliability of a telephone-based brief neurocognitive assessment compared to the same tests delivered face-to-face. Methods: Patients with brain metastases to be treated with whole brain radiotherapy (WBRT) were assessed using a brief validated neurocognitive battery at baseline, and at 1 month and 3 months following WBRT (in person and over the phone). The primary outcome was feasibility and inter-procedural (in-person versus telephone) reliability. The secondary objective was to evaluate the change in neurocognitive function before and after WBRT. Results: Of the 39 patients enrolled, 82% completed the baseline in-person and telephone neurocognitive assessments. However, at 1 month, only 41% of enrolled patients completed the in-person and telephone cognitive assessments, and at 3 months, only 10% completed them. Results pertaining to reliability and change in neurocognitive function will be updated. Conclusion: The pre-defined definition of feasibility (at least 80% completion for face-to-face and telephone neurocognitive assessments) was met at baseline. However, a large proportion of participants did not complete either telephone or in-person neurocognitive follow-up at 1 month and at 3 months post-WBRT. Attrition remained a challenge for neurocognitive testing in this population even when a brief telephone-based assessment was used.
Following CO2 sequestration into a deep saline aquifer of finite vertical extent, the CO2 will tend to accumulate in structural highs such as those offered by an anticline. Over times of tens to thousands of years, some of the CO2 will dissolve into the underlying groundwater to produce a region of relatively dense, CO2-saturated water directly below the plume of CO2. Continued dissolution then requires the supply of unsaturated aquifer water. In an aquifer of finite vertical extent, this may be provided by a background hydrological flow, or a laterally spreading buoyancy-driven flow caused by the greater density of the CO2-saturated water relative to the original aquifer water.
We investigate long-time steady-state dissolution in the presence of a background hydrological flow. In steady state, the distribution of CO2 in the groundwater upstream of the aquifer involves a balance between three competing effects: (i) the buoyancy-driven flow of CO2-saturated water; (ii) the diffusion of CO2 from saturated to under-saturated water; and (iii) the advection associated with the oncoming background flow. This leads to three limiting regimes. In the limit of very slow diffusion, a nearly static intrusion of dense fluid may extend a finite distance upstream, balanced by the pressure gradient associated with the oncoming background flow. In the limit of fast diffusion relative to the flow, a gradient zone may become established in which the along-aquifer diffusive flux balances the advection associated with the background flow. However, if the buoyancy-driven flow speed exceeds the background hydrological flow speed, then a third, intermediate regime may become established. In this regime, a convective recirculation develops upstream of the anticline, involving the vertical diffusion of CO2 from an upstream-propagating flow of dense CO2-saturated water into the downstream-propagating flow of CO2-unsaturated water. For each limiting case, we find analytical solutions for the distribution of CO2 upstream of the anticline, and test our analysis with full numerical simulations. A key result is that, although there may be very different controls on the distribution and extent of CO2-bearing water upstream of the anticline, in each case the dissolution rate is given by the product of the background volume flux and the difference in concentration between the CO2-saturated water and the original aquifer water upstream.
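The three limiting regimes amount to a comparison of velocity scales. The sketch below is a hypothetical classifier; the variable names and comparison logic are illustrative only, not taken from the analysis above:

```python
def upstream_regime(u_buoyancy: float, u_background: float, u_diffusive: float) -> str:
    """Illustrative regime selection based on which velocity scale dominates.

    u_buoyancy   -- buoyancy-driven flow speed of the dense saturated water
    u_background -- background hydrological flow speed
    u_diffusive  -- velocity scale associated with diffusion
    """
    if u_diffusive >= max(u_buoyancy, u_background):
        # Fast diffusion: along-aquifer diffusive flux balances advection.
        return "gradient zone"
    if u_buoyancy > u_background:
        # Buoyancy-driven speed exceeds background flow: recirculation develops.
        return "convective recirculation"
    # Slow diffusion, weak buoyancy: dense intrusion held by the background flow.
    return "static intrusion"
```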
We consider the dynamics of actively entraining turbulent density currents on a conical sloping surface in a rotating fluid. A theoretical plume model is developed to describe both axisymmetric flow and single-stream currents of finite angular extent. An analytical solution is derived for flow dominated by the initial buoyancy flux and with a constant entrainment ratio, which serves as an attractor for solutions with alternative initial conditions where the initial fluxes of mass and momentum are non-negligible. The solutions indicate that the downslope propagation of the current halts at a critical level where there is purely azimuthal flow, and the boundary layer approximation breaks down. Observations from a set of laboratory experiments are consistent with the dynamics predicted by the model, with the flow approaching a critical level. Interpretation in terms of the theory yields an entrainment coefficient
$E\propto 1/\Omega$, where $\Omega$ is the rotation rate. We also derive a corresponding theory for density currents from a line source of buoyancy on a planar slope. Our theoretical models provide a framework for designing and interpreting laboratory studies of turbulent entrainment in rotating dense flows on slopes and understanding their implications in geophysical flows.
The emergence of invasive fungal wound infections (IFIs) in combat casualties led to development of a combat trauma-specific IFI case definition and classification. Prospective data were collected from 1133 US military personnel injured in Afghanistan (June 2009–August 2011). The IFI rates ranged from 0·2% to 11·7% among ward and intensive care unit admissions, respectively (6·8% overall). Seventy-seven IFI cases were classified as proven/probable (n = 54) and possible/unclassifiable (n = 23) and compared in a case-case analysis. There was no difference in clinical characteristics between the proven/probable and possible/unclassifiable cases. Possible IFI cases had shorter time to diagnosis (P = 0·02) and initiation of antifungal therapy (P = 0·05) and fewer operative visits (P = 0·002) compared to proven/probable cases, but clinical outcomes were similar between the groups. Although the trauma-related IFI classification scheme did not provide prognostic information, it is an effective tool for clinical and epidemiological surveillance and research.
Removal of an eye or the contents of an orbit may be indicated when the eye is affected by neoplasia or a severe infectious process, or when an end-stage ocular disease in a blind eye causes pain. These ophthalmic interventions are usually classified as:
Enucleation: the removal of the entire globe, including the sclera, intraocular contents, and the cornea. The stump of the optic nerve as well as the extraocular muscles are left behind.
Evisceration: the removal of intraocular contents including the lens, uvea, retina, vitreous humor, and in some cases the cornea. Only the sclera and extraocular muscles remain intact.
Exenteration: the removal of the globe and all of the orbital contents. This procedure may include removal of selective sections of orbital bone.
Following enucleations and eviscerations, an orbital implant is used to replace the globe and restore the lost orbital volume. The implant or sphere serves to maintain the structure of the orbit and to provide motility to the overlying prosthesis. For children, it additionally serves to maintain more normal growth of the surrounding orbital bones. In cases of exenteration, an osseointegrated prosthesis may be attached within the orbit, secured with metal support elements or magnets that are attached to bone.
Cover crop–based organic rotational no-till soybean production has attracted attention from farmers, researchers, and other agricultural professionals because of the ability of this new system to enhance soil conservation, reduce labor requirements, and decrease diesel fuel use compared to traditional organic production. This system is based on the use of cereal rye cover crops that are mechanically terminated with a roller-crimper to create in situ mulch that suppresses weeds and promotes soybean growth. In this paper, we report experiments that were conducted over the past decade in the eastern region of the United States on cover crop–based organic rotational no-till soybean production, and we outline current management strategies and future research needs. Our research has focused on maximizing cereal rye spring ground cover and biomass because of the crucial role this cover crop plays in weed suppression. Soil fertility and cereal rye sowing and termination timing affect biomass production, and these factors can be manipulated to achieve levels greater than 8,000 kg ha−1, a threshold identified for consistent suppression of annual weeds. Manipulating cereal rye seeding rate and seeding method also influences ground cover and weed suppression. In general, weed suppression is species-specific, with early emerging summer annual weeds (e.g., common ragweed), high weed seed bank densities (e.g. > 10,000 seeds m−2), and perennial weeds (e.g., yellow nutsedge) posing the greatest challenges. Due to the challenges with maximizing cereal rye weed suppression potential, we have also found high-residue cultivation to significantly improve weed control. In addition to cover crop and weed management, we have made progress with planting equipment and planting density for establishing soybean into a thick cover crop residue. 
Our current and future research will focus on integrated multitactic weed management, cultivar selection, insect pest suppression, and nitrogen management as part of a systems approach to advancing this new production system.
Wheat bran extract (WBE) is a food-grade soluble fibre preparation that is highly enriched in arabinoxylan oligosaccharides. In this placebo-controlled cross-over human intervention trial, tolerance and effects on colonic protein and carbohydrate fermentation were studied. After a 1-week run-in period, sixty-three healthy adult volunteers consumed 3, 10 and 0 g WBE/d for 3 weeks in a random order, with 2 weeks' washout between each treatment period. Fasting blood samples were collected at the end of the run-in period and at the end of each treatment period for analysis of haematological and clinical chemistry parameters. Additionally, subjects collected a stool sample for analysis of microbiota, SCFA and pH. A urine sample, collected over 48 h, was used for analysis of p-cresol and phenol content. Finally, the subjects completed questionnaires scoring occurrence frequency and distress severity of eighteen gastrointestinal symptoms. Urinary p-cresol excretion was significantly decreased after WBE consumption at 10 g/d. Faecal bifidobacteria levels were significantly increased after daily intake of 10 g WBE. Additionally, WBE intake at 10 g/d increased faecal SCFA concentrations and lowered faecal pH, indicating increased colonic fermentation of WBE into desired metabolites. At 10 g/d, WBE caused a mild increase in flatulence occurrence frequency and distress severity and a tendency for a mild decrease in constipation occurrence frequency. In conclusion, WBE is well tolerated at doses up to 10 g/d in healthy adult volunteers. Intake of 10 g WBE/d exerts beneficial effects on gut health parameters.
The intestinal microbiota are a complex ecosystem influencing the immunoregulation of the human host, providing protection from colonising pathogens and producing SCFA as the main energy source of colonocytes. Our objective was to investigate the effect of dietary fibre exclusion and supplementation on the intestinal microbiota and SCFA concentrations. Faecal samples were obtained from healthy volunteers before and after two 14 d periods of consuming formulated diets either devoid of or supplemented with fibre (14 g/l). The faecal microbiota were analysed using fluorescent in situ hybridisation and SCFA were measured using GLC. There were large and statistically significant reductions in the numbers of the Faecalibacterium prausnitzii (P ≤ 0·01) and Roseburia spp. (P ≤ 0·01) groups during both the fibre-free and fibre-supplemented diets. Significant and strong positive correlations between the proportion of F. prausnitzii and the proportion of butyrate during both baseline normal diets were found (pre-fibre free r 0·881, P = 0·001; pre-fibre supplemented r 0·844, P = 0·002). A significant correlation was also found between the proportional reduction in F. prausnitzii and the proportional reduction in faecal butyrate during both the fibre-free (r 0·806; P = 0·005) and the fibre-supplemented diet (r 0·749; P = 0·013). These findings may contribute to the understanding of the association between fibre, microbiota and fermentation in health, during enteral nutrition and in disease states such as Crohn's disease.
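The correlations reported above are Pearson product-moment coefficients; a minimal sketch of the computation (the data vectors here are hypothetical, not the study's measurements):

```python
import math

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Perfectly proportional series correlate at r = 1.0.
print(pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # → 1.0
```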