Adverse childhood experiences (ACEs) of parents are associated with a variety of negative health outcomes in offspring. Little is known about the mechanisms by which ACEs are transmitted to the next generation. Given that maternal depression and anxiety are related to ACEs and negatively affect children’s behaviour, these exposures may be pathways between maternal ACEs and child psychopathology. Child sex may modify these associations. Our objectives were to determine: (1) the association between ACEs and children’s behaviour, (2) whether maternal symptoms of prenatal and postnatal depression and anxiety mediate the relationship between maternal ACEs and children’s behaviour, and (3) whether these relationships are moderated by child sex. Pearson correlations and latent path analyses were undertaken using data from 907 children and their mothers enrolled in the Alberta Pregnancy Outcomes and Nutrition study. Overall, maternal ACEs were associated with symptoms of anxiety and depression during the perinatal period, and externalizing problems in children. Furthermore, we observed indirect associations between maternal ACEs and children’s internalizing and externalizing problems via maternal anxiety and depression. Sex differences were observed, with boys demonstrating greater vulnerability to the indirect effects of maternal ACEs via both anxiety and depression. Findings suggest that maternal mental health may be a mechanism by which maternal early life adversity is transmitted to children, especially boys. Further research is needed to determine if targeted interventions with women who have both high ACEs and mental health problems can prevent or ameliorate the effects of ACEs on children’s behavioural psychopathology.
Vertically aligned nitrogen-doped nanocrystalline diamond nanorods are fabricated from nitrogen-doped nanocrystalline diamond films using reactive ion etching in oxygen plasma. These nanorods show enhanced thermionic electron emission (TEE) characteristics, viz., a high current density of 12.0 mA/cm² and a work function value of 4.5 eV with an applied voltage of 3 V at 923 K. The enhanced TEE characteristics of these nanorods are ascribed to the induction of nanographitic phases at the grain boundaries and the field penetration effect through the local field enhancement from nanorods owing to a high aspect ratio and an excellent field enhancement factor.
Introduction: Diagnosing pulmonary embolism (PE) can be challenging because the signs and symptoms are often non-specific. Studies have shown that evidence-based algorithms are not always adhered to in the Emergency Department (ED) and are often not used correctly, which leads to unnecessary CT scanning. The YEARS diagnostic algorithm, consisting of three items (clinical signs of deep vein thrombosis, hemoptysis, and whether pulmonary embolism is the most likely diagnosis) and D-dimer, is a novel and simplified way to approach suspected acute PE. The purpose of this study was to 1) evaluate the use of the YEARS algorithm in the ED and 2) compare the rates of testing for PE if the YEARS algorithm was used. Methods: This was a health records review of ED patients investigated for PE at two emergency departments over a two-year period (April 2013-March 2015). Inclusion criteria were an ED physician-ordered CT pulmonary angiogram, ventilation-perfusion scan, or D-dimer for investigation of PE. Patients under the age of 18 and those without a D-dimer test were excluded. PE was considered to be present during the emergency department visit if PE was diagnosed on CT or VQ (subsegmental level or above), or if the patient was subsequently found to have PE or deep vein thrombosis during the next 30 days. Trained researchers extracted anonymized data. The rate of CT/VQ imaging and the false negative rate were calculated. Results: There were 1,163 patients tested for PE, and 1,083 patients were eligible for our analysis. Of the total, 317/1,083 (29.3%; 95%CI 26.6-32.1%) had CT/VQ imaging for PE, and 41/1,083 (3.8%; 95%CI 2.8-5.1%) patients were diagnosed with PE at baseline. Three patients had a missed PE, resulting in a false negative rate of 0.4% (95%CI 0.1-1.2%). Had the YEARS algorithm been used, 211/1,083 (19.5%; 95%CI 17.2-22.0%) would have required imaging for PE.
Of the patients who would not have required imaging according to the YEARS algorithm, 8/872 (0.9%; 95%CI 0.5-1.8%) would have had a missed PE. Conclusion: Had the YEARS algorithm been used in all patients with suspected PE, fewer patients would have required imaging, with a small increase in the false negative rate.
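The decision logic of the YEARS algorithm described above can be sketched in code. The D-dimer cut-offs (1000 ng/mL with no YEARS items, 500 ng/mL with one or more) follow the published YEARS study rather than this abstract, so treat them as assumptions of the sketch:

```python
def years_requires_imaging(years_items: int, d_dimer_ng_ml: float) -> bool:
    """Sketch of the YEARS algorithm for suspected pulmonary embolism.

    years_items: number of positive YEARS items (clinical signs of deep
    vein thrombosis, hemoptysis, PE the most likely diagnosis), 0-3.
    d_dimer_ng_ml: D-dimer concentration in ng/mL.
    Returns True when CT pulmonary angiography would be indicated.
    The 1000/500 ng/mL cut-offs are taken from the published YEARS
    study, not from this abstract.
    """
    if years_items == 0:
        # No YEARS items: PE is considered excluded below the higher threshold.
        return d_dimer_ng_ml >= 1000
    # One or more items: the conventional threshold applies.
    return d_dimer_ng_ml >= 500
```

For example, a patient with no YEARS items and a D-dimer of 800 ng/mL would avoid imaging under YEARS but not under a fixed 500 ng/mL cut-off, which is how the algorithm reduces scanning.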
Introduction: Diagnosing pulmonary embolism (PE) can be challenging because the signs and symptoms are often non-specific. Studies have shown that evidence-based algorithms are not always adhered to in the Emergency Department (ED), which leads to unnecessary CT scanning. The pulmonary embolism rule-out criteria (PERC) can identify patients who can be safely discharged from the ED without further investigation for PE. The purpose of this study was to evaluate the use of the PERC rule in the ED and to compare the rates of testing for PE if the PERC rule was used. Methods: This was a health records review of ED patients investigated for PE at two emergency departments over a two-year period (April 2013-March 2015). Inclusion criteria were an ED physician-ordered CT pulmonary angiogram, ventilation-perfusion scan, or D-dimer for investigation of PE. Patients under the age of 18 were excluded. PE was considered to be present during the emergency department visit if PE was diagnosed on CT or VQ (subsegmental level or above), or if the patient was subsequently found to have PE or deep vein thrombosis during the next 30 days. Trained researchers extracted anonymized data. The rate of CT/VQ imaging and the negative predictive value were calculated. Results: There were 1,163 patients tested for PE, and 1,097 patients were eligible for our analysis. Of the total, 330/1,097 (30.1%; 95%CI 27.4-32.3%) had CT/VQ imaging for PE, and 48/1,097 (4.4%; 95%CI 3.3-5.8%) patients were diagnosed with PE. 806/1,097 (73.5%; 95%CI 70.8-76.0%) were PERC positive, and of these, 44 patients had a PE (5.5%; 95%CI 4.1-7.3%). Conversely, 291/1,097 (26.5%; 95%CI 24.0-29.2%) patients were PERC negative, and of these, 4 patients had a PE (1.4%; 95%CI 0.5-3.5%). Of the PERC negative patients, 291/291 (100.0%; 95%CI 98.7-100.0%) had a D-dimer test done, and 33/291 (11.3%; 95%CI 8.2-15.5%) had a CT angiogram.
Had the PERC rule been used, CT/VQ imaging would have been avoided in 33/1,097 (3.0%; 95%CI 2.2-4.2%) patients and the D-dimer would have been avoided in 291/1,097 (26.5%; 95%CI 24.0-29.2%) patients. Conclusion: Had the PERC rule been used in all patients with suspected PE, fewer patients would have undergone further testing. The false negative rate for the PERC rule was low.
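The PERC rule evaluated above consists of eight bedside criteria. A minimal sketch follows; the criterion definitions are taken from the published PERC rule, not from this abstract, and should be treated as assumptions:

```python
def perc_negative(age: int, heart_rate: int, sao2_pct: float,
                  hemoptysis: bool, estrogen_use: bool,
                  prior_dvt_pe: bool, unilateral_leg_swelling: bool,
                  recent_surgery_or_trauma: bool) -> bool:
    """Sketch of the pulmonary embolism rule-out criteria (PERC).

    A patient is PERC negative (no further PE workup suggested) only
    when all eight criteria are met. Criterion definitions follow the
    published PERC rule and are assumptions of this sketch.
    """
    return (age < 50
            and heart_rate < 100
            and sao2_pct >= 95
            and not hemoptysis
            and not estrogen_use
            and not prior_dvt_pe
            and not unilateral_leg_swelling
            and not recent_surgery_or_trauma)
```

Note that the rule is intended only for patients already judged low risk by clinical gestalt; a positive result on any single criterion makes the patient PERC positive.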
The correlation between objective and subjective nasal obstruction is poor, and dissatisfaction rates after surgery for nasal obstruction are high. Accordingly, novel assessment techniques may be required. This survey aimed to determine patient experience and preferences for the measurement of nasal obstruction.
Prospective survey of rhinology patients.
Of 72 questionnaires distributed, 60 were completed (response rate of 83 per cent). Obstruction duration (more than one year) (χ² = 13.5, p = 0.00024), but not obstruction severity, affected willingness to spend more time being assessed. Questionnaires (48 per cent) and nasal inspiratory peak flow measurement (53 per cent) are the most commonly used assessment techniques. Forty-nine per cent of participants found their assessment unhelpful in understanding their obstruction. Eighty-two per cent agreed or strongly agreed that a visual and numerical aid would help them understand their blockage.
Many patients are dissatisfied with current assessment techniques; a novel device with visual or numerical results may help. Obstruction duration determines willingness to undergo longer assessment.
Multidrug-resistant organisms (MDROs) are increasingly reported in residential care homes for the elderly (RCHEs). We assessed whether implementation of directly observed hand hygiene (DOHH) by hand hygiene ambassadors can reduce environmental contamination with MDROs.
From July to August 2017, a cluster-randomized controlled study was conducted at 10 RCHEs (5 intervention versus 5 nonintervention controls), where DOHH was performed at two-hourly intervals during daytime, before meals and medication rounds, by one trained nurse in each intervention RCHE. Environmental contamination by MDROs, such as methicillin-resistant Staphylococcus aureus (MRSA), carbapenem-resistant Acinetobacter species (CRA), and extended-spectrum β-lactamase (ESBL)–producing Enterobacteriaceae, was evaluated using specimens collected from communal areas at baseline, then twice weekly. The volume of alcohol-based hand rub (ABHR) consumed per resident per week was measured.
The overall environmental contamination of communal areas was culture-positive for MRSA in 33 of 100 specimens (33%), CRA in 26 of 100 specimens (26%), and ESBL-producing Enterobacteriaceae in 3 of 100 specimens (3%) in intervention and nonintervention RCHEs at baseline. Serial monitoring of environmental specimens revealed a significant reduction in MRSA (79 of 600 [13.2%] vs 197 of 600 [32.8%]; P<.001) and CRA (56 of 600 [9.3%] vs 94 of 600 [15.7%]; P=.001) contamination in the intervention arm compared with the nonintervention arm during the study period. The volume of ABHR consumed per resident per week was 3 times higher in the intervention arm compared with the baseline (59.3±12.9 mL vs 19.7±12.6 mL; P<.001) and was significantly higher than the nonintervention arm (59.3±12.9 mL vs 23.3±17.2 mL; P=.006).
The direct observation of hand hygiene of residents could reduce environmental contamination by MDROs in RCHEs.
Despite the growing interest in the phenomenon of learning without intention, the incidental learning of phonological features, especially prosodic features, has received relatively little attention. This paper reports an experiment on incidental learning of lexical stress rules, and investigates whether the resultant knowledge can be unconscious, abstract, and rule based. Participants were incidentally exposed to a lexical stress system where stress location of a word is mainly determined by the final phoneme, syllable type, and syllable weight. Learning was assessed by a pronunciation judgment task. Results indicate that participants were able to transfer their knowledge of stress patterns to novel words whose final phoneme was not previously encountered, suggesting that participants had acquired abstract and potentially rule-based knowledge. The combined use of subjective and objective measures of awareness in the present study provides strong evidence for the acquisition of implicit knowledge.
To assess the level of all-hazards disaster preparedness and training needs of emergency department (ED) doctors and nurses in Hong Kong from their perspective, and identify factors associated with high perceived personal preparedness.
This study was a cross-sectional territory-wide online survey conducted from 9 September to 26 October, 2015.
The participants were doctors from the Hong Kong College of Emergency Medicine and nurses from the Hong Kong College of Emergency Nursing.
We assessed various components of all-hazards preparedness using a 25-item questionnaire. Backward logistic regression was used to identify factors associated with perceived preparedness.
A total of 107 responses were analyzed. Respondents lacked training in disaster management, emergency communication, psychological first aid, public health interventions, disaster law and ethics, media handling, and humanitarian response in an overseas setting. High perceived workplace preparedness, length of practice, and willingness to respond were associated with high perceived personal preparedness.
Given the current gaps in and needs for increased disaster preparedness training, ED doctors and nurses in Hong Kong may benefit from the development of core-competency-based training targeting the under-trained areas, measures to improve staff confidence in their workplaces, and efforts to remove barriers to staff willingness to respond. (Disaster Med Public Health Preparedness. 2018; 12: 329–336)
Mycobacterium marinum, a bacterium found in freshwater and saltwater, can infect persons with direct exposure to fish or aquariums. During December 2013, the New York City Department of Health and Mental Hygiene learned of four suspected or confirmed M. marinum skin or soft tissue infections (SSTIs) among persons who purchased whole fish from Chinese markets. Ninety-eight case-patients with non-tuberculous mycobacteria (NTM) SSTIs were identified with onset June 2013–March 2014. Of these, 77 (79%) were female. The median age was 62 years (range 30–91). Whole genome sequencing of clinical isolates revealed two main clusters and marked genetic diversity. Environmental samples from distributors yielded NTM though not M. marinum. We compared 56 case-patients with 185 control subjects who shopped in Chinese markets, frequency-matched by age group and sex. Risk factors for infection included skin injury to the finger or hand (odds ratio [OR]: 15.5; 95% confidence interval [CI]: 6.9–37.3), hand injury while preparing fish or seafood (OR 8.3; 95% CI 3.8–19.1), and purchasing tilapia (OR 3.6; 95% CI 1.1–13.9) or whiting (OR 2.7; 95% CI 1.1–6.6). A definitive environmental outbreak source was not identified.
Introduction: The Surviving Sepsis Campaign (SSC) suggests that hypovolemic patients, in the setting of hypoperfusion, be administered 30 mL/kg crystalloid fluid within the first 3 hours of presentation to hospital. More recent evidence suggests that fluid resuscitation within 30 min of sepsis identification is associated with reduced mortality, hospital length of stay and ICU days. This study describes Emergency Department (ED) fluid resuscitation of patients with septic shock and/or sepsis-related in-hospital mortality, prior to implementation of a sepsis medical directive. Methods: Retrospective chart review of adult patients (18+ years), presenting to two tertiary care EDs between 01 Nov 2014 and 31 Oct 2015, with ≥2 SIRS criteria and/or ED suspicion of infection and/or ED or hospital discharge sepsis diagnosis. Data were abstracted from electronic health records. Patients with septic shock, or who expired in the ED/hospital, were selected for manual chart review of clinical variables including: time, type and volume of ED IV fluid administration. Results: 13,506 patient encounters met inclusion criteria. In-hospital mortality rates were 2% (sepsis), 11.5% (severe sepsis), and 24.1% (septic shock). Of patients hypotensive at triage, fluids were administered to 33/50 (66.0%) septic shock patients and 22/43 (51.2%) patients who eventually expired. For all septic shock and expired patients (n=943), median time to IV fluid initiation was 60.50 minutes [29.75 to 101.25] for septic shock and 77.00 minutes [36.00 to 127.00] for expired patients. Median volume of fluid administered was 1.50 L [1.00 to 2.00] for septic shock and 1.00 L [1.00 to 2.00] for expired patients. Of septic shock and expired patients, IV fluid administration and body weight data were available for 148 encounters (15.6%). Within this group, 19 (12.8%) received no IV fluid, and 90 (60.8%) received 0.1-75% of their recommended IV fluid volume.
25 (16.9%) received 75.1-125%, and 14 (9.4%) received >125% of their recommended fluid volume. Conclusion: In this study, severe forms of sepsis were often treated with <30 mL/kg crystalloid fluid. Fluids were administered outside the recommended 30-minute window but within the 3-hour window. In-hospital mortality was consistent with published data. Future research will examine a broader data set for IV fluid resuscitation in sepsis, and will measure the impact of a sepsis fluid resuscitation medical directive.
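The fluid-volume bands reported above come from comparing the volume administered against the SSC-recommended 30 mL/kg crystalloid bolus. A minimal sketch of that arithmetic (the function and band names are hypothetical, not from the study):

```python
def percent_of_recommended_fluid(given_ml: float, weight_kg: float) -> float:
    """Fluid given as a percentage of the SSC-recommended 30 mL/kg bolus."""
    recommended_ml = 30.0 * weight_kg
    return 100.0 * given_ml / recommended_ml

def fluid_band(pct: float) -> str:
    """Bands used in the abstract: none, 0.1-75%, 75.1-125%, >125%."""
    if pct == 0:
        return "no IV fluid"
    if pct <= 75:
        return "0.1-75%"
    if pct <= 125:
        return "75.1-125%"
    return ">125%"
```

For a 70 kg patient given 1.5 L, the recommended volume is 2,100 mL, so the patient received about 71% of the target and falls in the 0.1-75% band.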
Introduction: When ventricular fibrillation (VF) cannot be terminated with conventional external defibrillation, it is classified as refractory VF (RVF). There is a paucity of information regarding prehospital or patient factors that may be associated with RVF. The objectives of this study were to determine factors that may be associated with RVF, the initial ED rhythm for patients with prehospital RVF, and the incidence of survival in patients who had RVF and were transported to hospital. Methods: Ambulance Call Records (ACRs) of patients with out of hospital cardiac arrest between Mar. 1 2012 and Apr. 1 2016 were reviewed. Cases of RVF (≥5 consecutive shocks delivered) were determined by manual review of the ACR. ED and hospital records were analyzed to determine outcomes of patients who were in RVF and transported to hospital. Descriptive statistics were calculated and all variables were tested for an association with initial ED rhythm, survival to admission, and survival to discharge. Results: Eighty-five cases of RVF were identified. A history of coronary artery disease (47.1%) and hypertension (50.6%) were the most common comorbidities in patients transported to the ED with RVF. Upon arrival to the ED, 24 (28.2%) remained in RVF, 38 (44.7%) had a non-shockable rhythm, and 23 (27.1%) had return of spontaneous circulation. Thirty-four (40%) survived to admission, while only 18 (21.2%) survived to discharge. Pre-existing comorbidities, time to first shock, time on scene, and transport time were not statistically associated with initial ED rhythm, survival to admission or discharge. Patient age was statistically associated with improved rhythm on ED arrival (p=0.013) and survival to discharge (58.24 yrs vs 67.40 yrs, Δ9.17, 95% CI 1.82 to 16.52, p=0.015). Conclusion: The majority of patients with prehospital RVF have a rhythm deterioration by the time care is transferred to the ED.
Of these patients with a rhythm deterioration, few survive to hospital discharge. Younger patients are more likely to remain in RVF and survive to discharge. Further research is required to determine prehospital treatment strategies for RVF, as well as patient populations that may benefit from those treatments.
Introduction: Diagnosing pulmonary embolism (PE) can be challenging because the signs and symptoms are often non-specific. Studies have shown that evidence-based diagnostic algorithms are not always adhered to in the Emergency Department (ED), which leads to unnecessary CT scanning. In 2013, the American College of Chest Physicians identified CT pulmonary angiography as one of the top five avoidable tests. One solution is to use a clinical prediction rule combined with the D-dimer, which safely reduces the use of CT scanning. The objective of this study was to compare, among patients tested for PE in two emergency departments, the proportion who 1) had a CT-PE and 2) had a missed diagnosis of PE, against the rates expected if the Wells rule and D-dimer had been applied as standard. Methods: This was a retrospective chart review of ED patients investigated for PE at two hospitals from April 2013 to March 2015 (24 months). Inclusion criteria were an ED physician-ordered CT-PE, Ventilation-Perfusion (VQ) scan, or D-dimer for investigation of PE. Patients under the age of 18 were excluded. PE was defined as CT/VQ diagnosis of acute PE or acute PE/DVT in 30-day follow-up. Trained researchers extracted anonymized data. The rates of CT/VQ imaging and the false-negative rates were calculated. The false-negative rate was calculated as the number of patients diagnosed with PE within 30 days as a proportion of those patients who did not have a CT/VQ scan at initial presentation. Results: There were 1,189 patients included in this study. 55/1,189 patients (4.6%; 95%CI 3.6-6.0%) were ultimately diagnosed with PE within 30 days. 397/1,189 patients (33.4%; 95%CI 30.8-36.1%) had CT/VQ scans for PE. Three of the 792 patients who were not scanned had a missed PE, resulting in a false-negative rate of 0.4% (95% CI 0.1-1.1%). 80 patients had an elevated D-dimer or high Wells score but were not imaged.
Furthermore, 75 patients who had neither an elevated D-dimer nor a high Wells score were imaged. Had the Wells rule and D-dimer been adhered to, 402/1,189 patients (33.8%; 95%CI 31.9-36.6%) would have undergone imaging and the false negative rate would have been 0/727 (0%; 95%CI 0.0-0.5%). Conclusion: If the Wells rule and D-dimer were used in all patients tested for PE, a similar proportion would have a CT scan but fewer PEs would be missed.
Introduction: Diagnosing pulmonary embolism (PE) in the emergency department can be challenging due to non-specific signs and symptoms; this often results in the over-utilization of CT pulmonary angiography (CT-PA). In 2013, the American College of Chest Physicians identified CT-PA as one of the top five avoidable tests. Age-adjusted D-dimer has been shown to decrease CT utilization rates. Recently, clinical-probability adjusted D-dimer has been promoted as an alternative strategy to reduce CT scanning. The aim of this study was to compare the safety and efficacy of the age-adjusted D-dimer rule and the clinical probability-adjusted D-dimer rule in Canadian ED patients tested for PE. Methods: This was a retrospective chart review of ED patients investigated for PE at two hospitals from April 2013 to March 2015 (24 months). Inclusion criteria were an ED physician-ordered CT-PA, Ventilation-Perfusion (VQ) scan, or D-dimer for investigation of PE. Patients under the age of 18 were excluded. PE was defined as CT/VQ diagnosis of acute PE or acute PE/DVT in 30-day follow-up. Trained researchers extracted anonymized data. The age-adjusted D-dimer and the clinical probability-adjusted D-dimer rules were applied retrospectively. The rate of CT/VQ imaging and the false negative rates were calculated. Results: In total, 1,189 patients were tested for PE. 1,129 patients had a D-dimer test and a Wells score less than 4.0. 364/1,129 (32.3%, 95%CI 29.6-35.0%) would have undergone imaging for PE if the age-adjusted D-dimer rule was used. 1,120 patients had a D-dimer test and a Wells score less than 6.0. 217/1,120 patients (19.4%, 95%CI 17.2-21.2%) would have undergone imaging for PE if the clinical probability-adjusted D-dimer rule was used. The false-negative rate for the age-adjusted D-dimer rule was 0.3% (95%CI 0.1-0.9%). The false-negative rate of the clinical probability-adjusted D-dimer was 1.0% (95%CI 0.5-1.9%).
Conclusion: The false-negative rates for both the age-adjusted D-dimer and clinical probability-adjusted D-dimer are low. The clinical probability-adjusted D-dimer results in a 13% absolute reduction in CT scanning compared to age-adjusted D-dimer.
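The age-adjusted D-dimer rule compared above is commonly implemented as a cut-off of age × 10 ng/mL (FEU) for patients over 50, with the conventional 500 ng/mL otherwise. The abstract does not state the exact cut-off used, so the following is a sketch under that common formulation:

```python
def age_adjusted_threshold_ng_ml(age_years: int) -> float:
    """Age-adjusted D-dimer cut-off (FEU): age x 10 ng/mL above age 50,
    otherwise the conventional 500 ng/mL. This follows the widely used
    age-adjustment formula; the abstract does not state the cut-off."""
    return age_years * 10.0 if age_years > 50 else 500.0

def requires_imaging_age_adjusted(age_years: int, d_dimer_ng_ml: float) -> bool:
    """True if the D-dimer meets or exceeds the age-adjusted cut-off
    (imaging indicated) for a patient already judged low/moderate risk."""
    return d_dimer_ng_ml >= age_adjusted_threshold_ng_ml(age_years)
```

An 80-year-old with a D-dimer of 750 ng/mL would be imaged under a fixed 500 ng/mL cut-off but not under the age-adjusted cut-off of 800 ng/mL, which is how the rule reduces CT utilization.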
Recent observations on the strength and deformation of small-volume metals containing microstructures, including dislocation patterns, grain boundaries, and second-phase precipitates, are reviewed. These microstructures impose an internal length scale that may interplay with the extrinsic length scale set by the specimen size to affect strength and deformation in an intricate manner. For micro-crystals containing pre-existing dislocations, Taylor work-hardening may dictate the dependence of strength on specimen size. The presence of grain boundaries in a small specimen may lead to effects far from the conventional Hall–Petch behavior. Precipitate–dislocation interactions in a small specimen may lead to an interesting weakest-size behavior.
To study the operational impact of process improvements on emergency department (ED) patient flow. The changes did not require any increase in resources or expenditures.
This was a 36-month pre- and post-intervention study to evaluate the effect of implementing process improvements at a community ED from January 2010 to December 2012. The intervention comprised streamlining triage by having patients accepted into internal waiting areas immediately after triage. Within the ED, parallel processes unfolded, and there was no restriction on when registration occurred or which health care provider a patient saw first. Flexible nursing ratios allowed nursing staff to redeploy and move to areas of highest demand. Last, demand-based physician scheduling was implemented. The main outcome was length of stay (LOS). Secondary outcomes included time to physician initial assessment (PIA), left-without-being-seen (LWBS) rates, and left-against-medical-advice (LAMA) rates. Segmented regression of interrupted time series analysis was performed to quantify the impact of the intervention, and whether it was sustained.
A total of 251,899 patients attended the ED during the study period. Daily patient volumes increased 17.3% during the post-intervention period. Post-intervention, mean LOS decreased by 0.64 hours (p<0.005). LOS for non-admitted Canadian Triage and Acuity Scale 2 (-0.58 hours, p<0.005), 3 (-0.75 hours, p<0.005), and 4 (-0.32 hours, p<0.005) patients also decreased. There were also reductions in PIA (by 43.81 minutes, p<0.005), LWBS (by 35.2%, p<0.005), and LAMA (by 61.9%, p<0.005).
A combination of process improvements in the ED was associated with clinically significant reductions in LOS, PIA, LWBS, and LAMA for non-resuscitative patients.
To study the association between gastrointestinal colonization of carbapenemase-producing Enterobacteriaceae (CPE) and proton pump inhibitors (PPIs).
We analyzed 31,526 patients with prospective collection of fecal specimens for CPE screening: upon admission (targeted screening) and during hospitalization (opportunistic screening, safety net screening, and extensive contact tracing), in our healthcare network with 3,200 beds from July 1, 2011, through December 31, 2015. Specimens were collected at least once weekly during hospitalization for CPE carriers and subjected to broth enrichment culture and multiplex polymerase chain reaction.
Of 66,672 fecal specimens collected, 345 specimens (0.5%) from 100 patients (0.3%) had CPE. The number and prevalence (per 100,000 patient-days) of CPE increased from 2 (0.3) in 2012 to 63 (8.0) in 2015 (P<.001). Male sex (odds ratio, 1.91 [95% CI, 1.15–3.18], P=.013), presence of wound or drain (3.12 [1.70–5.71], P<.001), and use of cephalosporins (3.06 [1.42–6.59], P=.004), carbapenems (2.21 [1.10–4.48], P=.027), and PPIs (2.84 [1.72–4.71], P<.001) in the preceding 6 months were significant risk factors by multivariable analysis. Of 79 patients with serial fecal specimens, spontaneous clearance of CPE was noted in 57 (72.2%), with a median (range) of 30 (3–411) days. Compared with patients who used neither antibiotics nor PPIs, consumption of both antibiotics and PPIs after CPE identification was associated with later clearance of CPE (hazard ratio, 0.35 [95% CI, 0.17–0.73], P=.005).
Concomitant use of antibiotics and PPIs prolonged duration of gastrointestinal colonization by CPE.
We use Monte Carlo techniques to simulate the statistical properties of rotation-powered pulsars in the Gould Belt. The gamma-ray properties of these pulsars are calculated by using a self-consistent outer gap model and other pulsar properties, i.e., initial magnetic field and period, and velocity distribution of the neutrons stars at birth, are obtained from the statistics of radio pulsars. We obtain distributions of the magnetic inclination angle, period, distance and age for these gamma-ray pulsars in the Gould Belt.
Research on close binary systems has continued at a high level during the past triennium, although the rate of growth is noticeably slower – probably reflecting the cutbacks in funds to which many of us are subject. There have also been changes of emphasis within the field, which are commented on in the pages that follow. These reflect both changing opportunities for observation and the natural development of the subject. In many areas, the time is ripe for a more critical look at ideas that previously seemed adequate.