Public awareness of ‘red flag’ symptoms for head and neck cancer is low. There is a lack of evidence regarding patient concerns and expectations in consultations for cancer assessment.
This prospective questionnaire study examined the symptoms, concerns and expectations of 250 consecutive patients attending an ‘urgent suspicion of cancer’ clinic at a tertiary referral centre.
The patients’ most frequent responses regarding their concerns were: ‘no concerns’ (n = 72, 29 per cent); ‘all symptoms’ were a cause for concern (n = 65, 26 per cent); and ‘neck lump’ was a symptom causing concern (n = 37, 17 per cent). The most common expectation of patients attending the clinic was to find out what was wrong with them, followed by having no expectations at all. Overall, patient knowledge of red flag symptoms was lacking and expectations were low.
Patients with non-cancer symptoms are frequently referred with suspected cancer. Patients with red flag symptoms are often unaware of their significance, and they have low expectations of healthcare.
Lipid-based nutrient supplements (LNS) may be beneficial for malnourished HIV-infected patients starting antiretroviral therapy (ART). We assessed the effect of adding vitamins and minerals to LNS on body composition and handgrip strength during ART initiation. ART-eligible HIV-infected patients with BMI <18·5 kg/m2 were randomised to LNS or LNS with added high-dose vitamins and minerals (LNS-VM) from referral for ART to 6 weeks post-ART and followed up until 12 weeks. Body composition by bioelectrical impedance analysis (BIA), deuterium (2H) diluted water (D2O) and air displacement plethysmography (ADP), and handgrip strength were determined at baseline and at 6 and 12 weeks post-ART, and effects of LNS-VM v. LNS at 6 and 12 weeks investigated. BIA data were available for 1461, D2O data for 479, ADP data for 498 and handgrip strength data for 1752 patients. Fat mass tended to be lower, and fat-free mass correspondingly higher, by BIA than by ADP or D2O. At 6 weeks post-ART, LNS-VM led to a higher regain of BIA-assessed fat mass (0·4 (95 % CI 0·05, 0·8) kg), but not fat-free mass, and a borderline significant increase in handgrip strength (0·72 (95 % CI −0·03, 1·5) kg). These effects were not sustained at 12 weeks. Similar effects as for BIA were seen using ADP or D2O but no differences reached statistical significance. In conclusion, LNS-VM led to a higher regain of fat mass at 6 weeks and to a borderline significant beneficial effect on handgrip strength. Further research is needed to determine appropriate timing and supplement composition to optimise nutritional interventions in malnourished HIV patients.
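The treatment effects above are reported as mean differences with 95 % confidence intervals. A generic large-sample version of that calculation can be sketched as follows (toy data and a simple two-sample comparison; the trial's actual analysis model is not described in the abstract):

```python
import math
import statistics

def mean_diff_ci(x, y, z=1.96):
    """Approximate 95 % CI for a difference in means between two arms
    (a generic large-sample sketch, not the trial's analysis model)."""
    diff = statistics.fmean(x) - statistics.fmean(y)
    # Standard error of the difference, from the two sample variances.
    se = math.sqrt(statistics.variance(x) / len(x)
                   + statistics.variance(y) / len(y))
    return diff - z * se, diff, diff + z * se

# Toy data standing in for fat-mass regain (kg) in two arms.
low, diff, high = mean_diff_ci([1, 2, 3, 4, 5], [0, 1, 2, 3, 4])
```

A CI that excludes zero (as for the 6-week fat-mass effect above) indicates a statistically significant difference at the corresponding level; a CI straddling zero (as for handgrip strength) is the "borderline" case the abstract describes.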
Rayleigh–Bénard convection is one of the most well-studied models in fluid mechanics. Atmospheric convection, one of the most important components of the climate system, is by comparison complicated and poorly understood. A key attribute of atmospheric convection is the buoyancy source provided by the condensation of water vapour, but the presence of radiation, compressibility, liquid water and ice further complicate the system and our understanding of it. In this paper we present an idealized model of moist convection by taking the Boussinesq limit of the ideal-gas equations and adding a condensate that obeys a simplified Clausius–Clapeyron relation. The system allows moist convection to be explored at a fundamental level and reduces to the classical Rayleigh–Bénard model if the latent heat of condensation is taken to be zero. The model has an exact, Rayleigh-number-independent ‘drizzle’ solution in which the diffusion of water vapour from a saturated lower surface is balanced by condensation, with the temperature field (and so the saturation value of the moisture) determined self-consistently by the heat released in the condensation. This state is the moist analogue of the conductive solution in the classical problem. We numerically determine the linear stability properties of this solution as a function of Rayleigh number and a non-dimensional latent-heat parameter. We also present some two-dimensional, time-dependent, nonlinear solutions at various values of Rayleigh number and the non-dimensional condensational parameters. At sufficiently low Rayleigh number the system converges to the drizzle solution, and we find no evidence that two-dimensional self-sustained convection can occur when that solution is stable. The flow transitions from steady to turbulent as the Rayleigh number or the effects of condensation are increased, with plumes triggered by gravity waves emanating from other plumes. 
The interior dries as the level of turbulence increases, because the plumes entrain more dry air and because the saturated boundary layer at the top becomes thinner. The flow develops a broad relative humidity minimum in the domain interior, only weakly dependent on Rayleigh number when that is high.
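The ‘simplified Clausius–Clapeyron relation’ and the condensational heating it drives can be sketched in a common Boussinesq form (an assumed form for illustration; the abstract does not give the exact nondimensional equations):

```latex
% Saturation specific humidity as an exponential function of the
% Boussinesq temperature (simplified Clausius--Clapeyron relation):
q_s(T) = q_0 \, e^{\alpha T}.
% Moisture and temperature equations: condensation removes the excess
% q - q_s on a fast time scale \tau and releases latent heat:
\frac{Dq}{Dt} = \kappa_q \nabla^2 q
  - \frac{q - q_s(T)}{\tau}\,\mathcal{H}\!\left(q - q_s(T)\right),
\qquad
\frac{DT}{Dt} = \kappa_T \nabla^2 T
  + \frac{L}{c_p}\,\frac{q - q_s(T)}{\tau}\,\mathcal{H}\!\left(q - q_s(T)\right).
```

With $L = 0$ the moisture field decouples from the buoyancy, and the system reduces to the classical Rayleigh–Bénard problem, consistent with the limit stated above.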
A robust biomedical informatics infrastructure is essential for academic health centers engaged in translational research. There are no templates for what such an infrastructure encompasses or how it is funded. An informatics workgroup within the Clinical and Translational Science Awards network conducted an analysis to identify the scope, governance, and funding of this infrastructure. After we identified the essential components of an informatics infrastructure, we surveyed informatics leaders at network institutions about the governance and sustainability of the different components. Results from 42 survey respondents showed significant variations in governance and sustainability; however, some trends also emerged. Core informatics components such as electronic data capture systems, electronic health records data repositories, and related tools had mixed models of funding, including fee-for-service, extramural grants, and institutional support. Several key components such as regulatory systems (e.g., electronic Institutional Review Board [IRB] systems, grants, and contracts), security systems, data warehouses, and clinical trials management systems were overwhelmingly supported as institutional infrastructure. The findings highlighted in this report are worth noting for academic health centers and funding agencies involved in planning current and future informatics infrastructure, which provides the foundation for a robust, data-driven clinical and translational research program.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients at 21 North American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and Beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median time of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT and most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
The risk of cross infection in a busy emergency department (ED) is a serious public health concern, especially in times of pandemic threats. We simulated cross infections due to respiratory diseases spread by large droplets, using empirical data on contacts (i.e., close-proximity interactions at ≤1 m) in an ED, to quantify the risks associated with contact and to examine factors associated with differential risk.
Healthcare workers (HCWs) and patients.
A busy ED.
Data on contacts between participants were collected over 6 months by observing two 12-hour shifts per week using a radiofrequency identification proximity detection system. We simulated cross infection due to a novel agent across these contacts to determine risks associated with HCW role, chief complaint category, arrival mode, and ED disposition status.
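The simulation approach described above, seeding a novel agent and letting it spread across the recorded contacts, can be illustrated with a minimal sketch (the transmission probability, contact log and function name are illustrative placeholders, not the study's calibrated model):

```python
import random

def simulate_outbreak(contacts, index_case, p_transmit, seed=0):
    """Spread a hypothetical agent across a chronological contact log.
    p_transmit, the log and all names below are illustrative, not the
    study's calibrated parameters."""
    rng = random.Random(seed)
    infected = {index_case}
    for a, b in contacts:  # contacts processed in time order
        # Transmission only when exactly one side is already infected.
        if (a in infected) != (b in infected) and rng.random() < p_transmit:
            infected.update((a, b))
    return infected

# Toy close-proximity log of (person, person) pairs, earliest first.
log = [("nurse1", "patient1"), ("nurse1", "nurse2"),
       ("nurse2", "patient2"), ("patient2", "provider1")]
infected = simulate_outbreak(log, "patient1", p_transmit=1.0)
```

Because contacts are processed chronologically, infection can only propagate forward in time; seeding the same log from a late contact reaches far fewer people, which is the kind of role- and timing-dependent risk structure the study quantifies.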
Cross-infection risk between HCWs was substantially greater than between patients or between patients and HCWs. Providers had the lowest risk, followed by nurses; non-patient-care staff had the highest risk. There were no differences by patient chief complaint category. We detected differential risk patterns by arrival mode and by HCW role. Although no differential risk was associated with ED disposition status, 0.1 infections were expected per shift among patients admitted to the hospital.
These simulations demonstrate that, on average, 11 patients infected in the ED would be admitted to the hospital over the course of an 8-week local influenza outbreak. These patients represent a source of further cross-infection risk once in the hospital.
Palmer amaranth (Amaranthus palmeri S. Watson) is a problematic weed encountered in U.S. cotton (Gossypium hirsutum L.) and soybean [Glycine max (L.) Merr.] production, with infestations spreading northward. This research investigated the influence of planting date (early, mid-, and late season) and population (AR, IN, MO, MS, NE, and TN) on A. palmeri growth and reproduction at two locations. All populations planted early or midseason at Throckmorton Purdue Agricultural Center (TPAC) and Arkansas Agriculture Research and Extension Center (AAREC) measured 196 and 141 cm or more, respectively. Amaranthus palmeri height did not exceed 168 and 134 cm when planted late season at TPAC and AAREC, respectively. Early season planted A. palmeri from NE grew to 50% of maximum height 8 to 13 d earlier than all other populations under TPAC conditions. In addition, the NE population planted early, mid-, and late season achieved 50% inflorescence emergence 5, 4, and 6 d earlier than all other populations, respectively. All populations established at TPAC produced fewer than 100,000 seeds plant−1. No population planted at TPAC and AAREC produced more than 740 and 1,520 g plant−1 of biomass at 17 and 19 wk after planting, respectively. Planting date influenced the distribution of male and female plants at TPAC, but not at AAREC. Amaranthus palmeri from IN and MS planted late season had male-to-female plant ratios of 1.3:1 and 1.7:1, respectively. Amaranthus palmeri introduced to TPAC from NE can produce up to 7,500 seeds plant−1 if emergence occurs in mid-July. An NE A. palmeri population exhibited biological characteristics allowing it to be highly competitive if introduced to TPAC due to a similar latitudinal range, but was least competitive when introduced to AAREC. Although A. palmeri originating from different locations can vary biologically, plants exhibited environmental plasticity and could complete their life cycle and contribute to spreading populations.
Using high-precision photometry from the Kepler mission, we investigate patterns of spot activity on the K1-type subgiant component of KIC 11560447, a short-period late-type eclipsing binary. We tested the validity of maximum entropy reconstructions of starspots by numerical simulations. Our procedure successfully captures up to three large spot clusters migrating in longitude. We suggest a way to measure a lower limit for stellar differential rotation, using slopes of spot patterns in the reconstructed time-longitude diagram. We find solar-like differential rotation and recurrent spot activity with a long-term trend towards a dominant axisymmetric spot distribution during the period of observations.
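The proposed use of time-longitude slopes as a lower limit on differential rotation can be sketched with a generic kinematic relation (an illustration under assumed units; the paper's exact procedure is not given in the abstract):

```python
def diff_rotation_lower_limit(slopes_deg_per_day, period_days):
    """Lower limit on relative differential rotation from the drift
    slopes of spot clusters in a time-longitude diagram. A spot
    migrating at s deg/day in a frame rotating with period P has
    rotation period P / (1 + s * P / 360). Names and units here are
    assumptions for illustration."""
    periods = [period_days / (1.0 + s * period_days / 360.0)
               for s in slopes_deg_per_day]
    # Relative spread between slowest and fastest clusters: only a
    # lower limit, since spots need not sample the full latitudinal shear.
    return (max(periods) - min(periods)) / period_days
```

Two clusters drifting at different longitudinal rates therefore bound the shear from below, which is why the reconstructed time-longitude diagram yields a lower limit rather than a full differential-rotation profile.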
We describe the investigation of two temporally coincident illness clusters involving Salmonella and Staphylococcus aureus in two states. Cases were defined as gastrointestinal illness following two meal events. Investigators interviewed ill persons. Stool, food and environmental samples underwent pathogen testing. Alabama: Eighty cases were identified. Median time from meal to illness was 5·8 h. Salmonella Heidelberg was identified from 27 of 28 stool specimens tested, and coagulase-positive S. aureus was isolated from three of 16 ill persons. Environmental investigation indicated that food handling deficiencies occurred. Colorado: Seven cases were identified. Median time from meal to illness was 4·5 h. Five persons were hospitalised, four of whom were admitted to the intensive care unit. Salmonella Heidelberg was identified in six of seven stool specimens and coagulase-positive S. aureus in three of six tested. No single food item was implicated in either outbreak. These two outbreaks were linked to infection with Salmonella Heidelberg, but additional factors, such as dual aetiology that included S. aureus, or the dose of Salmonella ingested, may have contributed to the short incubation periods and high illness severity. The outbreaks underscore the importance of measures to prevent foodborne illness through appropriate washing, handling, preparation and storage of food.
Livestock health is economically important for agropastoral households whose wealth is held partly as livestock. Households can invest in disease prevention and treatment, but livestock disease risk is also affected by grazing practices that result in inter-herd contact and disease transmission in regions with endemic communicable diseases. This paper examines the relationships between communal grazing and antimicrobial use in Maasai, Chagga and Arusha households in northern Tanzania. We develop a theoretical model of the economic connection between communal grazing, disease transmission risk, risk perceptions, and antimicrobial use, and derive testable hypotheses about these connections. Regression results suggest that history of disease and communal grazing are associated with higher subjective disease risk and greater antimicrobial use. We discuss the implications of these results in light of the potential for relatively high inter-herd disease transmission rates among communal grazers and potential contributions to antimicrobial resistance due to antimicrobial use.
The Indiana Clinical and Translational Sciences Institute’s Community Engagement Partners-Purdue Extension collaborative model demonstrates tremendous potential for creating state-wide programmatic efforts and improvements in both the health culture and status of Indiana residents across the state. It can serve as a prototype not only for others interested in pursuing wide geographic health improvements through Clinical and Translational Sciences Award-Cooperative Extension partnerships but also for broader collaborations among United States Department of Agriculture, National Institutes of Health, Centers for Disease Control and Prevention, state and local health departments, and health foundation efforts to improve population health.
Interpretation of ice-core records in terms of changes in atmospheric concentrations requires understanding of the various parameters within air–snow transfer functions. The dry-deposition velocity is one of these parameters, dependent on local meteorological conditions and thereby also affected by climate changes. We have determined aerosol dry-deposition velocities by measurements of aerosol particle-number concentration and the vertical wind component with an eddy-covariance system close to the Swedish and Finnish research stations Wasa and Aboa in Dronning Maud Land, Antarctica. Measurements were performed over a smooth, snow-covered area and over moderately rough, rocky ground during 4 and 19 days, respectively, in January 2000. The median dry-deposition velocity determined 5.25 m above the surface was 0.33 and 0.80 cm s–1, respectively. The large difference between the two sites was mainly due to the stratification of the surface boundary layer, the surface albedo and the surface roughness height. The dry-deposition number fluxes were dominated by the particle-size modes defined as ultrafine and Aitken, with mean diameters around 14 and 42 nm, respectively. A larger dry-deposition velocity, owing to stronger Brownian diffusion, for the smaller ultrafine mode was verified by the measurements.
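The eddy-covariance estimate underlying these results, the turbulent particle flux divided by the mean concentration, can be sketched as follows (synthetic series; the actual processing chain involves detrending, coordinate rotation and quality control not shown here):

```python
import statistics

def deposition_velocity(w, c):
    """Eddy-covariance dry-deposition velocity v_d = -<w'c'> / <c>,
    with w the vertical wind (m/s) and c the particle number
    concentration. A synthetic-series sketch, not the instrument
    processing chain used in the study."""
    wbar = statistics.fmean(w)
    cbar = statistics.fmean(c)
    # Turbulent flux: mean covariance of the fluctuations about the means.
    flux = statistics.fmean((wi - wbar) * (ci - cbar) for wi, ci in zip(w, c))
    return -flux / cbar  # downward (negative) flux gives a positive v_d

# Synthetic anti-correlated series: updrafts carry lower concentration,
# so the net particle flux is toward the surface.
v_d = deposition_velocity([0.1, -0.1, 0.1, -0.1], [90.0, 110.0, 90.0, 110.0])
```

Here `v_d` comes out in m/s; multiplying by 100 gives the cm s–1 units quoted in the abstract.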
During 2000–07, five giant icebergs (B15A, B15J, B15K, C16 and C25) adrift in the southwestern Ross Sea, Antarctica, were instrumented with global positioning system (GPS) receivers and other instruments to monitor their behavior in the near-coastal environment. The measurements show that collision processes can strongly influence iceberg behavior and delay their progress in drifting to the open ocean. Collisions appear to have been a dominant control on the movement of B15A, the largest of the icebergs, during the 4-year period it gyrated within the limited confines of Ross Island, the fixed Ross Ice Shelf and grounded C16. Iceberg interactions in the near-coastal regime are largely driven by ocean tidal effects which determine the magnitude of forces generated during collision and break-up events. Estimates of forces derived from the observed drift trajectories during the iceberg-collision-induced calving of iceberg C19 from the Ross Ice Shelf, during the iceberg-induced break-off of the tip of the Drygalski Ice Tongue and the break-up of B15A provide a crude estimate of the stress scale involved in iceberg calving. Considering the total area of the vertical face of new rifts created in the calving or break-up process, and not accounting for local stress amplification near rift tips, this estimated stress scale is 104 Pa.
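The stress-scale arithmetic, force inferred from the observed drift trajectory spread over the new rift face, can be sketched with illustrative orders of magnitude (all numbers below are hypothetical placeholders, not the measured values):

```python
def stress_scale(mass_kg, accel_m_s2, rift_area_m2):
    """Crude stress estimate: force F = m * a from an observed change
    in drift velocity, spread over the vertical face area of the new
    rift. Inputs are hypothetical orders of magnitude only."""
    return mass_kg * accel_m_s2 / rift_area_m2

# Illustration only: a ~1e14 kg iceberg decelerating at ~1e-6 m/s^2
# against a rift face of ~1e4 m^2 implies a stress of order 1e4 Pa.
sigma = stress_scale(1e14, 1e-6, 1e4)
```

This neglects local stress amplification near rift tips, as the abstract notes, so it is a scale estimate rather than a fracture criterion.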
We use a finite-element model of coupled ice-stream/ice-shelf flow to study the sensitivity of Pine Island Glacier, West Antarctica, to changes in ice-shelf and basal conditions. By tuning a softening coefficient of the ice along the glacier margins, and a basal friction coefficient controlling the distribution of basal shear stress underneath the ice stream, we are able to match model velocity to that observed with interferometric synthetic aperture radar (InSAR). We use the model to investigate the effect of small perturbations on ice flow. We find that a 5.5–13% reduction in our initial ice-shelf area increases the glacier velocity by 3.5–10% at the grounding line. The removal of the entire ice shelf increases the grounding-line velocity by > 70%. The changes in velocity associated with ice-shelf reduction are felt several tens of km inland. Alternatively, a 5% reduction in basal shear stress increases the glacier velocity by 13% at the grounding line. By contrast, softening of the glacier side margins would have to be increased substantially more to produce a comparable change in ice velocity. Hence, both the ice-shelf buttressing and the basal shear stress contribute significant resistance to the flow of Pine Island Glacier.
A field study was conducted during the 2014 and 2015 growing seasons in Arkansas, Indiana, Illinois, Missouri, Ohio, and Tennessee to determine the effect of cereal rye and either oats, radish, or annual ryegrass on the control of Amaranthus spp. when integrated with comprehensive herbicide programs in glyphosate-resistant and glufosinate-resistant soybean. Amaranthus species included redroot pigweed, waterhemp, and Palmer amaranth. The two herbicide programs were: a PRE residual herbicide followed by a POST application of foliar and residual herbicide (PRE/POST); or a PRE residual herbicide followed by a POST application of foliar and residual herbicide, followed by another POST application of residual herbicide (PRE/POST/POST). Control was not affected by type of soybean resistance trait. At the end of the season, herbicides controlled 100 and 96% of the redroot pigweed and Palmer amaranth, respectively, versus 49 and 29% in the absence of herbicides, averaged over sites and other factors. The PRE/POST and PRE/POST/POST herbicide treatments controlled 83 and 90% of waterhemp at the end of the season, respectively, versus 14% without herbicide. Cover crop treatments affected control of waterhemp and Palmer amaranth and soybean yield only in the absence of herbicides. The rye cover crop consistently reduced Amaranthus spp. density in the absence of herbicides compared with the no-cover treatment.