We describe the investigation of two temporally coincident illness clusters involving salmonella and Staphylococcus aureus in two states. Cases were defined as gastrointestinal illness following two meal events. Investigators interviewed ill persons. Stool, food and environmental samples underwent pathogen testing. Alabama: Eighty cases were identified. Median time from meal to illness was 5·8 h. Salmonella Heidelberg was identified from 27 of 28 stool specimens tested, and coagulase-positive S. aureus was isolated from three of 16 ill persons. Environmental investigation indicated that food-handling deficiencies occurred. Colorado: Seven cases were identified. Median time from meal to illness was 4·5 h. Five persons were hospitalised, four of whom were admitted to the intensive care unit. Salmonella Heidelberg was identified in six of seven stool specimens and coagulase-positive S. aureus in three of six tested. No single food item was implicated in either outbreak. These two outbreaks were linked to infection with Salmonella Heidelberg, but additional factors, such as a dual aetiology that included S. aureus, or the dose of salmonella ingested, may have contributed to the short incubation periods and high illness severity. The outbreaks underscore the importance of measures to prevent foodborne illness through appropriate washing, handling, preparation and storage of food.
Kummerite, ideally Mn2+Fe3+Al(PO4)2(OH)2·8H2O, is a new secondary phosphate mineral belonging to the laueite group, from the Hagendorf-Süd pegmatite, Hagendorf, Oberpfalz, Bavaria, Germany. Kummerite occurs as sprays or rounded aggregates of very thin, typically deformed, amber yellow laths. Cleavage is good parallel to {010}. The mineral is associated closely with green Zn- and Al-bearing beraunite needles. Other associated minerals are jahnsite-(CaMnMn) and Al-bearing frondelite. The calculated density of kummerite is 2.34 g cm–3. It is optically biaxial (–), α = 1.565(5), β = 1.600(5) and γ = 1.630(5), with weak dispersion. Pleochroism is weak, with amber yellow tones. Electron microprobe analyses (average of 13 grains), with H2O and FeO/Fe2O3 calculated on structural grounds and normalized to 100%, gave Fe2O3 17.2, FeO 4.8, MnO 5.4, MgO 2.2, ZnO 0.5, Al2O3 9.8, P2O5 27.6, H2O 32.5, total 100 wt.%. The empirical formula, based on 3 metal apfu, is (Mn2+0.37Mg0.27Zn0.03Fe2+0.33)Σ1.00(Fe3+1.06Al0.94)Σ2.00(PO4)1.91(OH)2.27(H2O)7.73. Kummerite is triclinic, P1̄, with unit-cell parameters a = 5.316(1) Å, b = 10.620(3) Å, c = 7.118(1) Å, α = 107.33(3)°, β = 111.22(3)°, γ = 72.22(2)° and V = 348.4(2) Å3. The strongest lines in the powder X-ray diffraction pattern are [dobs in Å (I) (hkl)]: 9.885 (100) (010); 6.476 (20) (001); 4.942 (30) (020); 3.988 (9) (1̄10); 3.116 (18) (1̄20); 2.873 (11) (1̄21). Kummerite is isostructural with laueite, but differs in having Al and Fe3+ ordered into alternate octahedral sites in the 7.1 Å trans-connected octahedral chains.
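The reported cell volume can be cross-checked against the cell parameters with the standard triclinic formula V = abc·√(1 − cos²α − cos²β − cos²γ + 2·cosα·cosβ·cosγ); a quick sketch using the values from the abstract:

```python
from math import cos, radians, sqrt

def triclinic_volume(a, b, c, alpha, beta, gamma):
    """Unit-cell volume of a triclinic lattice; lengths in Å, angles in degrees."""
    ca, cb, cg = cos(radians(alpha)), cos(radians(beta)), cos(radians(gamma))
    return a * b * c * sqrt(1 - ca**2 - cb**2 - cg**2 + 2 * ca * cb * cg)

V = triclinic_volume(5.316, 10.620, 7.118, 107.33, 111.22, 72.22)
# Agrees with the reported 348.4(2) Å3 to within rounding of the cell parameters.
```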
Introduction: Some non-urgent/low-acuity Emergency Department (ED) presentations are considered convenience visits and potentially avoidable with improved access to primary care services. This study surveyed patients who presented to the ED and explored their self-reported reasons and barriers for not being connected to a primary care provider (PCP). Methods: Patients aged 17 years and older were randomly selected from electronic registration records at three urban EDs in Edmonton, Alberta (AB), Canada. Following initial triage, stabilization, and verbal informed consent, patients completed a 47-item questionnaire. Data from the survey were cross-referenced to a minimal patient dataset consisting of ED and demographic information. The questionnaire collected information on patient characteristics, their connection to a PCP, and patients' reasons for not having a PCP. Results: Of the 2144 eligible patients, 1408 (65.7%) surveys were returned and 1402 (65.4%) were completed. The majority of patients (74.4%) presenting to the ED reported having a family physician; however, the ‘closeness’ of the connection to their family physician varied greatly among ED patients, with the most recent family physician visit ranging from 1 hour before ED presentation to 45 years prior. Approximately 25% of low-acuity ED patients reported no connection with a family physician. Reasons for a lack of PCP connection included: a prior physician had retired, left, or died (19.8%); they had never tried to find one (19.2%); they had recently moved to Alberta (18.0%); and they were unable to find one (16.5%). Conclusion: A surprisingly high proportion of ED patients (25.6%) have no identified PCP. Patients had a variety of reasons for not having a family physician. These need to be understood and addressed in order for primary care access to successfully contribute to diverting non-urgent, low-acuity presentations from the ED.
Introduction: Some low-acuity Emergency Department (ED) presentations are considered non-urgent or convenience visits and potentially avoidable with improved access to primary care. This study explored self-reported reasons why non-urgent patients presented to the ED. Methods: Patients, 17 years and older, were randomly selected from electronic registration records at three urban EDs in Edmonton, Alberta (AB), Canada during weekdays (0700 to 1900). A 47-item questionnaire was completed by each consenting patient, which included items on whether the patient believed the ED was their best care option and the rationale supporting their response. A thematic content analysis was performed on the responses, using previous experience and review of the literature to identify themes. Results: Of the 2144 eligible patients, 1408 (65.7%) questionnaires were returned, and 1402 (65.4%) were analyzed. For patients who felt the ED was their best option (n = 1234, 89.3%), rationales included: safety concerns (n = 309), effectiveness of ED care (n = 284), patient-centeredness of ED care (n = 277), and access to health care professionals in the ED (n = 204). For patients who felt the ED was not their best care option (n = 148, 10.7%), rationales included a perception that: access to health professionals outside the ED was preferable (n = 39), patient-centeredness (particularly timeliness) was lacking in the ED (n = 26), and their health concern was not important enough to require ED care (n = 18). Conclusion: Even during times when alternative care options are available, the majority of non-urgent patients perceived the ED to be the most appropriate location for care. These results highlight that simple triage scores do not accurately reflect the appropriateness of care and that understanding the diverse and multi-faceted reasons for ED presentation is necessary to implement strategies to support non-urgent, low-acuity care needs.
Increasing evidence shows attachment security influences symptom expression and adaptation in people diagnosed with schizophrenia and other psychoses.
To describe the distribution of secure and insecure attachment in a cohort of individuals with first-episode psychosis, and to explore the relationship between attachment security and recovery from positive and negative symptoms in the first 12 months.
The study was a prospective 12-month cohort study. The role of attachment, duration of untreated psychosis (DUP), baseline symptoms and insight in predicting and mediating recovery from symptoms was investigated using multiple regression analysis and path analysis.
Of the 79 participants, 54 completed the Adult Attachment Interview (AAI): 37 (68.5%) were classified as insecure, of whom 26 (48.1%) were insecure/dismissing and 11 (20.4%) insecure/preoccupied. Both DUP and insight predicted recovery from positive symptoms at 12 months. Attachment security, DUP and insight predicted recovery from negative symptoms at 12 months.
Attachment is an important construct contributing to understanding and development of interventions promoting recovery following first-episode psychosis.
Increasing human occupation of the Brazilian Amazon has led to the intensification of deforestation over the last 50 years. The present study is aimed at analysing the impacts of the first year of slash-and-burn cultivation on soil physicochemical properties. Sampling was done in 26 small-scale farms of the Tapajós River basin. In August 2004, soil samples were collected from primary forest plots planned for slash-and-burn cultivation. In September 2005, 1 year after the initial burning and the beginning of cultivation, the same sites were re-sampled. The results indicated that soil fertility after burning was relatively moderate, as the increase of base cations was not particularly marked. Moreover, although an increase of some nutrients (such as exchangeable phosphorus) was observed at the soil surface, total carbon and nitrogen (N) pools did not change significantly. Nutrient leaching was also detected through the accumulation of both forms of available nitrogen (NO3 and NH4) as well as potassium in subsoil horizons. In addition, signs of erosion were seen, as a significant increase in surface density occurred, coupled with up to 25% fine-particle loss at the surface. The present study draws attention to the early impacts of slash-and-burn agriculture on soil properties within a year of cultivation. Furthermore, its regional dimension highlights the natural variability of undisturbed soils as well as their differentiated responses to deforestation according to soil texture.
The phase behaviour of aqueous suspensions of NAu1 nontronite was studied on size-selected particles by combining osmotic pressure measurements, visual observations under polarized light, rheological experiments and Small-Angle X-ray Scattering (SAXS). NAu1 suspensions display liquid-crystalline behaviour, exhibiting an Isotropic/Nematic (I/N) transition that occurs before the sol/gel transition for ionic strengths below 10–3 M. This I/N transition shifts towards lower volume fractions with increasing particle anisotropy, and its position in the phase diagram agrees well with theoretical predictions for platelets. SAXS measurements reveal the presence of characteristic interparticle distances in the isotropic, nematic and gel phases. In the gel phase a local lamellar order is observed, which shows that the “house of cards” model is not appropriate for describing the gel structure of swelling clay materials at low ionic strength. Furthermore, by combining results from osmotic pressure measurements and X-ray scattering, it appears that the pressure of the system can be well described using a simple Poisson-Boltzmann treatment based on the repulsion between charged infinite parallel planes. In terms of rheological properties, even if the thermodynamic status of the sol/gel transition remains partially unclear, the yield stress and elasticity of the gels can be easily renormalized for all particle sizes on the basis of particle volume. Furthermore, rheological modelling of the flow curves shows that, for all particles, an approach based on excluded-volume effects captures most features of nontronite suspensions.
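For reference, the Poisson-Boltzmann treatment invoked above reduces, for two charged infinite parallel planes in a symmetric 1:1 electrolyte, to the textbook mid-plane expression for the osmotic pressure (this is the standard theoretical result, not an equation reproduced from the paper; $n_0$ is the bulk ion concentration and $\psi_m$ the mid-plane potential):

```latex
\Pi = 2 n_0 k_B T \left[ \cosh\!\left( \frac{e\,\psi_m}{k_B T} \right) - 1 \right]
```

In the weak-overlap limit this pressure decays roughly exponentially with plate separation, which is why osmotic-pressure data can discriminate between electrostatic and other (e.g. excluded-volume) contributions.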
Few studies have prospectively investigated psychological morbidity in UK head and neck cancer patients. This study aimed to explore changes in psychological symptoms over time, and associations with patients' tumour and treatment characteristics, including toxicity.
Two hundred and twenty patients were recruited to complete the Hospital Anxiety and Depression Scale and the Late Effects on Normal Tissue (Subjective, Objective, Management and Analytic) (‘LENT-SOMA’) questionnaires, both pre- and post-treatment.
Anxiety was highest pre-treatment (38 per cent) and depressive symptoms peaked at the end of treatment (44 per cent). Anxiety significantly decreased and depression significantly increased, comparing pre- versus post-treatment responses (p < 0.001). Hospital Anxiety and Depression Scale scores were significantly correlated with toxicity, age and chemotherapy (p < 0.01 for all).
This is the first study to analyse the relationship between Hospital Anxiety and Depression Scale scores and toxicity scores in head and neck cancer patients. It lends support for the use of the Hospital Anxiety and Depression Scale and the Late Effects on Normal Tissue (Subjective, Objective, Management and Analytic) questionnaire in routine clinical practice; furthermore, continued surveillance is required at multiple measurement points.
The objective of this study was to quantify the effectiveness of selected surgical masks in arresting vegetative cells and endospores in an experimental model that simulated contagious patients.
Five commercially available surgical masks were tested for their ability to arrest infectious agents. Surgical masks were placed over the nose and mouth of mannequin head forms (Simulaids adult model Brad CPR torso). The mannequins were retrofitted with a nebulizer attached to an automated breathing simulator calibrated to a tidal volume of 500 mL/breath and a breathing rate of 20 breaths/min, for a minute respiratory volume of 10 L/min. Aerosols of endospores or vegetative cells were generated with a modified Microbiological Research Establishment (MRE)-type six-jet Collison nebulizer, while air samples were taken with all-glass impinger (AGI-30) samplers downstream of the point source. All experiments were conducted in a horizontal bioaerosol chamber.
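The calibration figures above are internally consistent: minute respiratory volume is simply tidal volume times breathing rate. A one-line check:

```python
tidal_volume_l = 0.500                      # 500 mL per breath
rate_bpm = 20                               # breaths per minute
minute_volume = tidal_volume_l * rate_bpm   # minute respiratory volume, L/min
```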
Mean arrestance of bioaerosols by the surgical masks ranged from 48% to 68% when the masks were challenged with endospores and from 66% to 76% when they were challenged with vegetative cells. When the arrestance of endospores was evaluated, statistical differences were observed between some pairs, though not all, of the models evaluated. There were no statistically significant differences in arrestance observed between models of surgical masks challenged with vegetative cells.
The arrestance of airborne vegetative cells and endospores by surgical masks worn by simulated contagious patients supports surgical mask use as one of the recommended cough etiquette interventions to limit the transmission of airborne infectious agents.
Mid-infrared (mid-IR) spectra from ~5 to 14 μm of five nearby (<70 Mpc) elliptical galaxies are presented that were observed with the Infrared Spectrograph on the Spitzer Space Telescope. The sample galaxies have a main stellar component that is typical of normal, passively evolving ellipticals; however, they are rich in cold gas and dust and have morphological merger signatures from which a time order of the galaxies since the merger or accretion events can be estimated. The presented results are significant because (1) emission due to Polycyclic Aromatic Hydrocarbons (PAHs) and associated species is detected for the first time in these galaxies and (2) the detected mid-IR spectra are independently exploited as a probe of current or recent star formation that, in this case, is assumed to be triggered by the merger. As shown in exemplary spectra of the early-age merger NGC 3656, the PAH emission is more centrally peaked in the earlier-age mergers, suggesting that the PAH data are indeed probing star formation that is correlated with the time since the merger and that systematically depletes the centrally located gas; the emission becomes weaker and more flatly distributed as the merger evolves.
This paper compares strains of foot-and-mouth disease (FMD) serotype SAT (South African Territories) 2 viruses isolated from Zimbabwe and other African countries using monoclonal antibodies (MAbs). A sandwich ELISA was used to examine the relative binding of anti-SAT 2 MAbs to the various viruses. The MAb-binding profiles of viruses isolated from field samples were compared using hierarchical cluster analysis. Viruses were obtained from game animals, mainly African buffalo (Syncerus caffer), which is the natural host and reservoir for SAT serotypes in Africa, and from cattle showing clinical signs of FMD, as well as from animals suspected of carrying the virus subclinically. Some isolates have been adapted for use as vaccine strains. The results showed that most of the Zimbabwe isolates collected between 1989 and 1992 formed an antigenically closely related group. Although differences were observed between Zimbabwe isolates collected between 1989 and 1992 and those collected in 1987, there was no correlation between the different MAb-binding patterns within the 1987 group and the epidemiological information received from the field. Similar profiles were observed for many SAT 2 viruses, including viruses isolated over a 50-year period and from geographically distant areas. This indicates an inherent stability in the antigenic profiles of SAT 2 viruses. The MAb panel was capable of assessing antigenic variation, since very different profiles were obtained for some isolates. The work also allowed comparison and characterization of anti-SAT 2 MAbs from different laboratories. The findings are discussed with reference to the selection of vaccine strains.
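As an illustration of the comparison step, hierarchical cluster analysis of MAb-binding profiles starts from a pairwise distance between profiles; the isolate names and binding values below are hypothetical, not data from the paper:

```python
# Hypothetical relative-binding profiles (one value per MAb) for three isolates.
profiles = {
    "ZIM/A": [1.0, 0.8, 0.9, 0.1],
    "ZIM/B": [0.9, 0.8, 1.0, 0.2],
    "ZIM/C": [0.2, 0.1, 0.3, 0.9],
}

def profile_distance(a, b):
    """Euclidean distance between two MAb-binding profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Similar profiles (A, B) come out closer than dissimilar ones (A, C);
# this pairwise distance matrix is what the clustering operates on.
```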
1. Two hundred and ten blood samples were obtained from workers in the fish trade. Of these, fifty-one, or 24·2 per cent., gave positive sero-reactions in dilutions of the serum ranging from 1/30 to 1/1000.
2. There was no significant difference in the number of positive sero-reactions occurring in male or female workers.
3. In a control series of 406 blood specimens from individuals not engaged in the fish trade, no positive sero-reactions were obtained.
4. Evidence is adduced that leptospiral infections occur in three grades: (a) severe infections associated with jaundice; (b) mild infections with pyrexia and no jaundice; and (c) latent or inapparent infections with no clinical manifestations.
5. Preventive methods have been detailed which, if put into practice, would, it is believed, reduce the incidence of this disease.
6. Sufficient evidence has been produced to show that this disease is occupational in nature, and as such should be scheduled under the Workmen's Compensation Act.
Acknowledgements. We are indebted to Drs Hill, Goldie and Fullerton of the Department of Medicine for much help in the collection of specimens.
Many of the most pervasive disease challenges to livestock are transmitted via oral contact with faeces (or by faecal aerosol), and the current paper focuses on how disease risk may depend on spatial heterogeneity, animal searching behaviour, different grazing systems and faecal deposition patterns, including those representative of livestock and a range of wildlife. A spatially explicit agent-based model was developed to describe the impact of empirically observed foraging and avoidance behaviours on the risk of disease presented by investigative and grazing contact with both livestock and wildlife faeces. To highlight the role of spatial heterogeneity in disease risk, an analogous deterministic model, which ignores spatial heterogeneity and searching behaviour, was compared with the spatially explicit agent-based model. The models were applied to assess disease risks in temperate grazing systems. The results suggest that spatial heterogeneity is crucial in defining the disease risks to which individuals are exposed, even at relatively small scales. Interestingly, however, although sensitive to other aspects of behaviour such as faecal avoidance, disease risk was found to be insensitive to search distance for typical domestic livestock restricted to small field plots. In contrast, disease risk is highly sensitive to the distribution of faecal contamination, in that contacts with highly clumped distributions of wildlife contamination are rare in comparison with those with more dispersed contamination. Finally, it is argued that the model is a suitable framework for studying the relative inter- and intra-specific disease risks posed to livestock under different realistic management regimes.
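The sensitivity to faecal distribution can be illustrated with a far simpler calculation than the authors' agent-based model (all numbers below are illustrative assumptions, not parameters from the paper): aggregating the same number of faecal pats into fewer patches shrinks the contaminated area, and with it the chance of grazing contact.

```python
def contact_risk(n_pats, n_cells, pats_per_patch, n_bites):
    """Chance that at least one of n_bites randomly grazed cells is
    contaminated, when pats are aggregated into patches sharing one cell."""
    occupied_cells = max(1, n_pats // pats_per_patch)
    p_cell = occupied_cells / n_cells
    return 1 - (1 - p_cell) ** n_bites

# Same total contamination, different spatial pattern:
dispersed = contact_risk(200, 10_000, pats_per_patch=1, n_bites=500)
clumped = contact_risk(200, 10_000, pats_per_patch=20, n_bites=500)
# Clumped (wildlife-like) contamination yields far fewer contacts.
```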
Reduction of wildlife populations is a common method for the control of livestock infections that have wildlife hosts, but its success depends on the characteristics of the infection itself, as well as on the spatial and social structure of the wildlife host. Paratuberculosis (Mycobacterium avium subsp. paratuberculosis; Map) is a widespread infection that is difficult to control in livestock populations and also has possible links to Crohn's disease in humans. Rabbits have recently been identified as a key wildlife species in terms of paratuberculosis persistence in the environment and risk to the wider host community, including cattle. Here we use a spatially explicit stochastic simulation model of Map dynamics in rabbit populations to quantify the effects of rabbit population control on infection persistence. The model parameters were estimated from empirical studies of rabbit population dynamics and rabbit-to-rabbit routes of Map transmission. Three rabbit control strategies were compared: single unrepeated population reductions based on removing individual animals; single unrepeated population reductions based on removal of entire social groups; and repeated annual population reductions based on removing individual animals. Unrealistically high rabbit culls (>95% population reduction) are needed if infection is to be eradicated from local rabbit populations with a single one-off population reduction event, whether of individuals or social groups. Repeated annual culls are more effective at reducing the prevalence of infection in rabbit populations and eradicating infection. However, annual population reductions of >40% are required over extended periods of time (many years). Thus, using an approach which is both highly conservative and parsimonious with respect to estimating lower bounds on the time to eradicate the infection, we find that Map is extremely persistent in rabbit populations and requires significant and prolonged effort to achieve control.
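A deterministic toy model, much simpler than the paper's spatially explicit stochastic simulation and with illustrative parameter values rather than the authors' estimates, reproduces the qualitative finding that repeated moderate culls outperform a single large one: after a one-off cull the population, and with it transmission, rebounds.

```python
def simulate(years, cull_fraction):
    """Annual-step SI model; cull_fraction(year) -> fraction removed that year."""
    S, I = 95.0, 5.0                         # susceptible / infected rabbits
    K, beta, m, r = 100.0, 0.02, 0.3, 1.0    # capacity, transmission, mortality,
                                             # growth rate (all illustrative)
    for year in range(years):
        f = cull_fraction(year)
        S, I = S * (1 - f), I * (1 - f)      # non-selective cull
        N = S + I
        new_inf = min(S, beta * S * I)       # mass-action transmission
        births = r * N * max(0.0, 1 - N / K) # density-dependent recruitment
        S = S - new_inf + births - m * S
        I = I + new_inf - m * I
    return I

one_off = simulate(10, lambda y: 0.95 if y == 0 else 0.0)  # single 95% cull
annual = simulate(10, lambda y: 0.40)                      # 40% every year
# Infection rebounds after the one-off cull but declines under annual culls.
```

The mass-action term is the design choice that matters here: because transmission scales with absolute density, repeated culls that hold the population below the transmission threshold drive infection down, whereas a single cull only delays regrowth.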
Objective: To audit sore throat management in adults, to introduce proforma-based guidelines and to re-audit clinical practice.
Setting: Adult emergency department of an inner city teaching hospital.
Methods: A literature search was carried out to identify relevant guidelines. In stage one, patients presenting to the emergency department with sore throat were identified retrospectively from the emergency department attendance register. Proformas were completed retrospectively. In stage two, new guidelines were introduced and staff educated about the guidelines. In stage three, patients presenting with sore throat were identified at triage and proformas were completed at time of consultation.
Outcome Measures: (1) appropriate clinical assessment of the likelihood of bacterial infection using the clinical scoring system, (2) appropriateness of antibiotic prescription, (3) recommendation of supportive treatments to patients.
Results: Introduction of a clinical scoring system reduced inappropriate prescribing of antibiotics from 44 per cent to 11 per cent. Correct antibiotic prescription rose from 60 per cent to 100 per cent. Although the variety of advice given about supportive treatment increased, the proportion of patients receiving documented supportive advice fell from 67.8 per cent in stage one to 58 per cent in stage three.
Conclusion: The introduction of clinically based guidelines for the diagnosis and management of sore throat in adults can reduce inappropriate antibiotic prescribing.
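The abstract does not name the scoring system used; the Centor criteria are a common choice for assessing the likelihood of group A streptococcal infection in adult sore throat, and a proforma-style check can be sketched as follows (a hypothetical illustration, not the audit's actual proforma or thresholds):

```python
def centor_score(tonsillar_exudate, tender_anterior_cervical_nodes,
                 fever_over_38c, absence_of_cough):
    """Centor criteria: one point per feature present, giving a 0-4 score."""
    return sum([tonsillar_exudate, tender_anterior_cervical_nodes,
                fever_over_38c, absence_of_cough])

def consider_antibiotics(score):
    # A common rule of thumb: antibiotics (or testing) considered at 3+;
    # lower scores receive supportive treatment advice only.
    return score >= 3
```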