Introduction: An ongoing opioid epidemic has seen opioid-related mortality increase by 200% over the past decade. Prescription practices amongst ED physicians may be contributing to this problem. Our objective was to analyze ED physician prescription practices for patients discharged from the ED with acute fractures. Methods: We conducted a health records review of ED patients seen at two campuses of a tertiary care hospital with a total annual census of 160,000 visits. We evaluated a consecutive sample of patients with acute fractures (January 1–April 15, 2016) seen and discharged by ED physicians. Patients admitted to hospital or discharged by consultant services were excluded. The primary outcome measure was the proportion of patients discharged with an opioid prescription. We collected data using a screening list and review of electronic records, and assessed interobserver agreement for key measures. We calculated simple descriptive statistics and estimated that 4 months would be required to enroll 250 patients receiving opioid prescriptions. Results: We enrolled 816 patients, with 442 females (54.2%), a median CTAS score of 3, and a median pain score at triage of 6/10. The most common fractures were wrist/hand (35.2%) and foot excluding ankle (14.8%). An ED pain directive was used at triage for 21.2%, and 281 patients (34.4%) received an opioid during their ED stay, tramadol (21.2%) being the most common. Overall, 250 patients (30.6%) were discharged with the following opioid prescriptions and median total dosages: hydromorphone (N = 114, median dosage 23 mg, range 1–120 mg), tramadol (N = 86, 1000 mg, 200–2000 mg), oxycodone (N = 33, 100 mg, 10–170 mg), codeine (N = 20, 600 mg, 360–1200 mg), and morphine (N = 9, 100 mg, 25–200 mg). Of patients prescribed hydromorphone, 61 (53.5%) were prescribed >20 mg.
Overall, 35 patients (4.3%) had a pain-related ED visit <1 month after discharge, of whom 14 (40%) had received an opioid prescription on initial discharge and 12 (34.2%) received an opioid prescription upon subsequent discharge. Conclusion: Amongst patients presenting to the ED with acute fractures, the majority were not discharged home with an opioid prescription from ED physicians. Hydromorphone was the most common opioid prescribed, with large variations in total dosage. Although only a minority of patients received opioid prescriptions, there were very few return ED visits. To limit potential abuse, we recommend standardization of opioid prescribing in the ED, with attention to limiting the total dosage given.
In this review article, we discuss selected developments regarding the role of the equation of state in simulations of core-collapse supernovae. There are no first-principles calculations of the state of matter under supernova conditions, since a wide range of conditions is covered in terms of density, temperature, and isospin asymmetry. Instead, model equations of state are commonly employed in supernova studies. These can be divided into regimes with intrinsically different degrees of freedom: heavy nuclei at low temperatures, inhomogeneous nuclear matter where light and heavy nuclei coexist together with unbound nucleons, and the transition to homogeneous matter at high densities and temperatures. In this article, we discuss each of these phases with a particular view to their role in supernova simulations.
Epstein–Barr virus (EBV) infects 95% of the global population and is associated with up to 2% of cancers globally. Immunoglobulin G (IgG) antibody levels to EBV have been shown to be heritable and associated with the development of malignancies. We therefore performed a pilot genome-wide association analysis of anti-EBV IgG traits in an African population, using a combined approach including array genotyping, whole-genome sequencing and imputation to a panel with African sequence data. In 1562 Ugandans, we identify a variant in human leukocyte antigen (HLA)-DQA1, rs9272371 (p = 2.6 × 10⁻¹⁷), associated with anti-EBV nuclear antigen-1 responses. Trans-ancestry meta-analysis and fine-mapping with European-ancestry individuals suggest the presence of distinct HLA class II variants driving associations in Uganda. In addition, we identify four putative, novel, very rare African-specific loci with preliminary evidence of association with anti-viral capsid antigen IgG responses, which will require replication for validation. These findings reinforce the need for the expansion of such studies in African populations with relevant datasets to capture genetic diversity.
We describe a versatile array controller developed at RAL and SAAO. The original concept was due to Waltham, van Breda and Newton (1990). A Transputer-based microcomputer forms the heart of the device.
Introduction: Data regarding adverse events (AEs) (unintended harm to the patient from health care provided) among children seen in the emergency department (ED) are scarce despite the high-risk setting and population. The objective of our study was to estimate the risk and type of AEs, and their preventability and severity, among children treated in pediatric EDs. Methods: Our prospective cohort study enrolled children <18 years of age presenting for care during 21 randomized 8-hour shifts at 9 pediatric EDs from November 2014 to October 2015. Exclusion criteria included unavailability for follow-up or an insurmountable language barrier. Research assistants (RAs) collected demographic, medical history, ED course, and systems-level data. At days 7, 14, and 21, an RA administered a structured telephone interview to all patients to identify flagged outcomes (e.g. repeat ED visits, worsening/new symptoms). A validated trigger tool was used to screen admitted patients’ health records. For any patient with a flagged outcome or trigger, 3 ED physicians independently determined whether an AE had occurred. The primary outcome was the proportion of patients with an AE related to ED care within 3 weeks of their ED visit. Results: We enrolled 6377 (72.0%) of 8855 eligible patients; 545 (8.5%) were lost to follow-up. Median age was 4.4 years (range 3 months to 17.9 years). Eight hundred and seventy-seven (13.8%) were triaged as CTAS 1 or 2, 2638 (41.4%) as CTAS 3, and 2839 (44.7%) as CTAS 4 or 5. The top presenting complaints were fever (11.2%) and cough (8.8%). Flagged outcomes/triggers were identified for 2047 (32.1%) patients. While 252 (4.0%) patients suffered at least one AE within 3 weeks of the ED visit, 163 (2.6%) suffered an AE related to ED care. In total, patients suffered 286 AEs, most (67.9%) of which were preventable. The most common AE types were management issues (32.5%) and procedural complications (21.9%).
The need for a medical intervention (33.9%) and another ED visit (33.9%) were the most frequent clinical consequences. In univariate analysis, older age, chronic conditions, hospital admission, initial location in a high-acuity area of the ED, having >1 ED physician or a consultant involved in care (all p<0.001), and longer length of stay (p<0.01) were associated with AEs. Conclusion: While our multicentre study found a lower risk of AEs among pediatric ED patients than reported among pediatric inpatients and adult ED patients, a high proportion of these AEs were preventable.
Introduction: Active substance use and unstable housing are both associated with increased emergency department (ED) utilization. This study examined ED health care costs among a cohort of substance-using and/or homeless adults following an index ED visit, relative to a control ED population. Methods: Consecutive patients presenting to an inner-city ED between August 2010 and November 2011 who reported unstable housing and/or who had a chief presenting complaint related to acute or chronic substance use were evaluated. Controls were enrolled in a 1:4 ratio. Participants’ health care utilization was tracked via electronic medical record for six months after the index ED visit. Costing data across all EDs in the region were obtained from Alberta Health Services and calculated to include physician billing and the cost of an ED visit excluding investigations. The cost impact of ED utilization was estimated by multiplying the derived ED cost per visit by the median number of visits, with interquartile ranges (IQR), for each group during follow-up. Proportions were compared using non-parametric tests. Results: From 4679 patients screened, 209 patients were enrolled (41 controls, 46 substance-using, 91 unstably housed, and 31 both unstably housed and substance-using). Median costs (IQR) per group over the six-month period were $0 ($0–$345.42) for control, $345.42 ($0–$1139.89) for substance-using, $345.42 ($0–$1381.68) for unstably housed, and $1381.68 ($690.84–$4248.67) for unstably housed and substance-using patients (p<0.05). Conclusion: The intensity of excess ED costs was greatest in patients who were both unstably housed and presenting with a chief complaint related to substance use. This group had a significantly larger impact on health care expenditure relative to ED users who were not unstably housed or who presented with a substance use-related complaint.
Further research into how care or connection to community resources in the ED can reduce these costs is warranted.
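The cost-impact arithmetic described in the Methods above (derived cost per visit multiplied by the median visit count, bracketed by the IQR) can be sketched in a few lines; the function name and inputs are illustrative, not taken from the study's analysis code.

```python
def cost_impact(cost_per_visit, median_visits, iqr_visits):
    """Estimate the per-group ED cost burden over follow-up by scaling
    the derived cost per visit by the median visit count, with the
    interquartile range (lo, hi) of visits scaled the same way."""
    lo, hi = iqr_visits
    return (cost_per_visit * median_visits,
            (cost_per_visit * lo, cost_per_visit * hi))
```

With a derived cost of $345.42 per visit, a group with a median of 1 visit (IQR 0–4) would have a median cost of $345.42 (IQR $0–$1381.68), matching the pattern of figures reported above.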
Aerial photography was obtained for the Beaufort Sea north of Tuktoyaktuk. The flight path covered two distinct ice zones over a 15.5 km transect extending perpendicular to the coast, yielding fifty-nine photographs at a scale of 1:2000. The process of ridge extraction was automated using a series of computer algorithms for image filtering, edge detection and edge linking. Examples from two different sections along the transect are chosen for presentation: (a) a heavily ridged area, and (b) an area with one dominant linear ridge feature that separates ice cover of different ages. Two parameters used in the automated process, a minimum edge gradient and a minimum number of connected pixels required to form a continuous ridge segment, influence the number, length and spatial pattern of the extracted ridges. Direct one-to-one correlations between manually interpreted ridges from photographs and the algorithm-extracted ridges from digital data are not always possible. However, results indicate that the automated ridge-extraction procedure reliably characterizes the overall direction and density of the ice ridges. The distribution of ice-ridge directions is estimated from circular (angular) histograms constructed directly from the digital data. Analysis of the Beaufort Sea transect reveals that the ice ridging is strongly anisotropic, with a principal direction parallel to the local coastline.
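The edge detection → edge linking pipeline, governed by the two parameters named in the abstract, can be illustrated with a minimal sketch. The function, the simple central-difference gradient, and the flood-fill linking are illustrative stand-ins, not the authors' actual algorithms.

```python
import numpy as np

def extract_ridges(img, min_gradient=0.5, min_pixels=5):
    """Toy ridge extraction: threshold the gradient magnitude (edge
    detection), then group edge pixels into 8-connected components
    (edge linking). Returns one pixel-coordinate array per segment."""
    # Edge detection: central-difference gradient magnitude
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    edges = np.hypot(gx, gy) >= min_gradient   # minimum-edge-gradient parameter

    # Edge linking: 8-connected flood fill over edge pixels
    labels = np.zeros(img.shape, dtype=int)
    segments, current = [], 0
    for seed in zip(*np.nonzero(edges)):
        if labels[seed]:
            continue
        current += 1
        labels[seed] = current
        stack, pixels = [seed], []
        while stack:
            r, c = stack.pop()
            pixels.append((r, c))
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                            and edges[rr, cc] and not labels[rr, cc]):
                        labels[rr, cc] = current
                        stack.append((rr, cc))
        if len(pixels) >= min_pixels:          # minimum connected-pixel parameter
            segments.append(np.array(pixels))
    return segments
```

Raising either parameter prunes short or faint segments, which is exactly how the two parameters influence the number, length and spatial pattern of extracted ridges.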
With the changing distribution of infectious diseases and an increase in the burden of non-communicable diseases, low- and middle-income countries, including those in Africa, will need to expand their health care capacities to respond effectively to these epidemiological transitions. The interrelated risk factors for chronic infectious and non-communicable diseases and the need for long-term disease management argue for combined strategies to understand their underlying causes and to design strategies for effective prevention and long-term care. Through multidisciplinary research and implementation partnerships, we advocate an integrated approach for research and healthcare for chronic diseases in Africa.
Layers of volcanic ash, or tephra, form widespread chronostratigraphic marker horizons which are important because of their distinctive characteristics and rapid deposition over large areas. Absolute dating of prehistoric layers effectively depends upon 14C analysis. We focus here on Icelandic tephra layers at both proximal and distal sites and consider three strategies to obtain age estimates: 1) the conventional dating of individual profiles; 2) high-precision multisample techniques or “wiggle-matching” using stratigraphic sequences of peat; and 3) a combination of routine analyses from multiple sites. The first approach is illustrated by the dating of a peat profile in Scotland containing tephra from the AD 1510 eruption of Hekla. This produced a 14C age compatible with AD 1510, independently derived by geochemical correlation with historically dated Icelandic deposits. In addition, the ca. 2100 BP date for the Glen Garry tephra in Scotland, determined by a series of dates on a peat profile in Caithness, is supported by its stratigraphic position within 14C-dated profiles in Sutherland, and may be applied over a very large area of Scotland. More precise dates for individual tephras may be produced by “wiggle-matching”, although this approach could be biased by changes in peat-bog stratigraphy close to the position of the tephra fall. As appropriate sites for “wiggle-match” exercises may be found for only a few Icelandic tephras, we also consider the results of a spatial approach to 14C dating of tephra layers. We combined dates on peat underlying the same layer at several sites to estimate the age of the tephra: 3826 ± 12 BP for the Hekla-4 tephra and 2879 ± 34 BP for the Hekla-3 tephra. This approach is effective in terms of cost, the need for widespread applicability to Icelandic tephra stratigraphy and the production of ages of a useful resolution.
We stress the need for accurate identification of tephra deposits without which the conclusions drawn from subsequent 14C dating will be fundamentally flawed.
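The multi-site combination in strategy 3 is, in essence, an inverse-variance weighted mean of the individual 14C determinations on material underlying the same tephra. A minimal sketch (function and data names are illustrative; the study's actual statistical treatment may differ):

```python
import math

def pool_radiocarbon_dates(dates):
    """Combine several 14C determinations, each given as
    (age BP, 1-sigma error), into a pooled age using the standard
    inverse-variance weighted mean; the pooled error shrinks as
    more sites are added."""
    weights = [1.0 / (err ** 2) for _, err in dates]
    pooled_age = sum(w * age for (age, _), w in zip(dates, weights)) / sum(weights)
    pooled_err = math.sqrt(1.0 / sum(weights))
    return pooled_age, pooled_err
```

Pooling many routine determinations in this way is what allows a tightly constrained estimate such as 3826 ± 12 BP to emerge from individually less precise dates.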
Introduction: Substance use and unstable housing are associated with heavy use of the emergency department (ED). This study examined the impact of substance use and unstable housing on the probability of future ED use. Methods: Case-control study of patients presenting to an urban ED. Patients were eligible if they had been unstably housed for the past 30 days and/or if their chief complaint was related to substance use. Following written informed consent, patients completed a baseline survey, and health care use was tracked via electronic medical records for the next six months. Controls were enrolled in a 1:4 ratio. More than 2 ED visits during follow-up was pre-specified as a measure of excess ED use. Descriptive analyses included proportions and medians with interquartile ranges (IQR). Binomial logistic regression models were used to estimate the impact of housing status, high-risk alcohol use (AUDIT) and drug use (DUDIT), and combinations of these factors on subsequent acute care system contacts (ED visits + admissions). We controlled for age, gender, comorbidities at baseline, and baseline presenting acuity. Results: 41 controls, 46 substance-using, 91 unstably housed, and 31 both unstably housed and substance-using patients were enrolled (n = 209). Median ED visits during follow-up were 0 (IQR: 0–1.0) for controls, 1.0 (IQR: 0–3.3) for substance-using, 1.0 (IQR: 0–4.0) for unstably housed, and 4 (IQR: 2–12.3) for unstably housed and substance-using patients. Median acute care system contacts over the same period were 1.0 (IQR: 0–2.0) for controls, 1.0 (IQR: 0–4.0) for substance-using, 1.0 (IQR: 0–5.0) for unstably housed, and 4.5 (IQR: 2.8–14.3) for unstably housed and substance-using patients. Being unstably housed was the factor most strongly associated with having >2 ED visits (b=3.288, p<0.005), followed by combined high-risk alcohol and drug use (b=2.149, p<0.08); high-risk alcohol use alone was not significantly associated with ED visits (b=1.939, p<0.1).
The number of comorbidities present at baseline was a small but statistically significant additional risk factor (b=0.478, p<0.05). The model correctly predicted 70.1% of patients’ ED utilization status. Conclusion: Unstable housing is a substantial risk factor for ED use; high-risk alcohol and drug use, and comorbidities at baseline increased this risk. The intensity of excess ED use was greatest in patients who were unstably housed and substance using.
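The binomial logistic regression underlying the reported b coefficients can be sketched in pure NumPy; the gradient-descent fitter below is an illustrative stand-in (applied studies would typically use a statistics package with standard errors and p-values), and the variable names are hypothetical.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Minimal binomial logistic regression fitted by gradient ascent
    on the log-likelihood. X: (n, p) predictor matrix (an intercept
    column is added here); y: binary outcome, e.g. an indicator for
    >2 ED visits during follow-up. Returns the coefficient vector;
    beta[1:] are log-odds coefficients for each predictor."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))   # predicted probabilities
        beta += lr * Xb.T @ (y - p) / len(y)   # log-likelihood gradient step
    return beta
```

A positive fitted coefficient (like b=3.288 for unstable housing) means the predictor raises the log-odds of excess ED use, holding the other covariates fixed.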
Periodicity and new properties of the frequency curve. Bruno Hanisch uses the method of autocorrelation, introduced into geophysics by W. Pollack, to search for periods in the frequency series of sunspots from 1794 to 1925. Dividing the whole interval into three sections, he finds an eleven-year and an eight-year period common to the three sections, whereas other periods found in the three sections differ widely from each other. The new method gives a main-period length of 11.8 years for the interval 1880 to 1925. This result agrees strikingly with the revolution period of Jupiter (Gerlands Beiträge zur Geophysik, 46, 1935).
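The autocorrelation method for period finding can be sketched briefly: correlate the series with a lagged copy of itself and read the main period off the first peak. The function below is a modern illustrative reconstruction, not Hanisch's or Pollack's original procedure.

```python
import numpy as np

def dominant_period(series, max_lag):
    """Autocorrelation-based period search: compute the normalized
    autocorrelation at lags 1..max_lag and return the lag of the
    first interior local maximum (the main period)."""
    x = np.asarray(series, dtype=float) - np.mean(series)
    var = np.dot(x, x)
    ac = np.array([np.dot(x[:-k], x[k:]) / var for k in range(1, max_lag + 1)])
    for k in range(1, len(ac) - 1):
        if ac[k] > ac[k - 1] and ac[k] >= ac[k + 1]:
            return k + 1   # lags are 1-based
    return int(np.argmax(ac)) + 1
```

Applied to an annual sunspot-number series, a peak near lag 11 would correspond to the familiar eleven-year cycle.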
In western Canada, more money is spent on wild oat herbicides than on
herbicides for any other weed, and wild oat resistance to herbicides is the
most widespread resistance issue. A direct-seeded field experiment was conducted
from 2010 to 2014 at eight Canadian sites to determine crop life cycle, crop
species, crop seeding rate, crop usage, and herbicide rate combination
effects on wild oat management and canola yield. Combining 2× seeding rates
of early-cut barley silage with 2× seeding rates of winter cereals and
excluding wild oat herbicides for 3 of 5 yr (2011 to 2013) often led to
similar wild oat density, aboveground wild oat biomass, wild oat seed
density in the soil, and canola yield as a repeated canola–wheat rotation
under a full wild oat herbicide rate regime. Wild oat was similarly well
managed after 3 yr of perennial alfalfa without wild oat herbicides.
Forgoing wild oat herbicides in only 2 of 5 yr from exclusively summer
annual crop rotations resulted in higher wild oat density, biomass, and seed
banks. Management systems that effectively combine diverse and optimal
cultural practices against weeds, and limit herbicide use, reduce selection
pressure for weed resistance to herbicides and prolong the utility of
threatened herbicide tools.
DSM-5 contains substantial changes to eating disorder diagnoses. We examined
relative prevalence rates of DSM-IV and DSM-5 eating disorder diagnoses
using Eating Disorder Examination–Questionnaire diagnostic algorithms in 117
community out-patients. DSM-5 criteria produced a reduction in combined
‘other specified feeding or eating disorder’ and ‘unspecified feeding or
eating disorder’ diagnoses from 46% to 29%, an increase in anorexia nervosa
diagnoses from 35% to 47%, the same number of bulimia nervosa diagnoses and
a 5% rate of binge eating disorder diagnoses.
Goals for this study were to characterize the substances being used by youth who presented to an emergency department (ED), their demographic descriptors, and to describe the associated acute morbidity and mortality.
We conducted a retrospective review of all youth, ages 10–16 years, who presented to a pediatric ED with complaints related to recreational drug use (n=641) during the 2 years ending December 31, 2009.
The median age of patients was 15 years; 56% were female. Six percent of patients were homeless, and 21% were wards of the state. The most frequent ingestions included ethanol (74%), marijuana (20%), ecstasy (19%), and medications (15%). Over one third of patients had ingested two or more substances. Ninety percent of patients were brought to the ED by emergency medical services; 63% of these activations were by non-acquaintances. Of the 47% of youth who presented with a decreased level of consciousness, half had a Glasgow Coma Scale score of less than 13. The Canadian Triage and Acuity Scale score was 1 or 2 for 44% of patients. Sixty-eight percent received IV fluids, 42% received medication, and 4% were intubated. The admission rate was 9%.
Youth who presented to the ED for substance use represented a socially vulnerable population whose use of recreational substances resulted in high medical acuity and significant morbidity. Improved clinical identification of such high-risk youth and subsequent design of interventions to address problematic substance use and social issues are urgently needed to complement the acute medical care that youth receive.
The purpose of the present study was to measure the trace-element distribution in waster pieces of coarse ware found on the pottery kiln site associated with Roman Tocra. These would then be compared with other coarse-ware sherds found in Cyrenaica.
Neutron activation has been chosen as the method of analysis because several elements can be determined simultaneously, and also many samples can be analysed routinely.
Samples from each sherd were obtained as follows:
The sherd was brushed lightly to remove any foreign matter. It was then sliced with a diamond-coated saw into pieces about 5 mm × 5 mm × 2 mm, all surface material being removed in the process. Each piece was washed quickly in dilute nitric acid (ANALAR), rinsed in distilled water and finally boiled for 2 minutes in distilled water. The samples were dried in an oven at 120 °C. During all procedures, polythene-gloved hands and clean tweezers were used for manipulation.
Each sample was wrapped individually in aluminium foil and labelled with a heat resistant indelible marker. All the samples were placed in an aluminium can together with suitable standards for irradiation.
In this investigation the elements determined were scandium and iron. The oxides of these elements are commercially available in a very pure form (SPECPURE).
A known weight of the oxide was dissolved in hot concentrated ANALAR acid, hydrochloric acid for iron and nitric acid for scandium, and diluted to 25 ml in a graduated flask. The resulting solutions contained about 1.5 mg per ml of iron and 0.15 mg per ml of scandium.
Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks among these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space–time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) than isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space–time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space–time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection.
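The core of a space–time scan in the Kulldorff style is a Poisson likelihood ratio computed over many candidate windows. The sketch below is a deliberately simplified illustration (a dict of per-location time series, no Monte Carlo significance testing); real analyses would use dedicated software such as SaTScan.

```python
import math

def scan_clusters(cases, expected, max_window=3):
    """Simplified space-time scan. cases and expected are dicts
    {location: [count per period]}. For every location and every time
    window up to max_window periods, compute the Poisson log-likelihood
    ratio of observed vs expected counts and return candidate clusters
    ranked by LLR as (llr, location, start, end) tuples."""
    C = sum(sum(v) for v in cases.values())      # total observed cases
    E = sum(sum(v) for v in expected.values())   # total expected cases
    results = []
    for loc in cases:
        T = len(cases[loc])
        for start in range(T):
            for end in range(start, min(start + max_window, T)):
                c = sum(cases[loc][start:end + 1])
                # scale expected so that it sums to C overall
                e = sum(expected[loc][start:end + 1]) * C / E
                if 0 < e < c < C:                # only excess-risk windows
                    llr = (c * math.log(c / e)
                           + (C - c) * math.log((C - c) / (C - e)))
                    results.append((llr, loc, start, end))
    return sorted(results, reverse=True)
```

In practice the top-ranked window's LLR would be compared against a Monte Carlo null distribution before a cluster is flagged for shoe-leather investigation.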
In North America, terrestrial records of biodiversity and climate change that span Marine Oxygen Isotope Stage (MIS) 5 are rare. Where found, they provide insight into how the coupling of the ocean–atmosphere system is manifested in biotic and environmental records and how the biosphere responds to climate change. In 2010–2011, construction at Ziegler Reservoir near Snowmass Village, Colorado (USA) revealed a nearly continuous, lacustrine/wetland sedimentary sequence that preserved evidence of past plant communities between ~140 and 55 ka, including all of MIS 5. At an elevation of 2705 m, the Ziegler Reservoir fossil site also contained thousands of well-preserved bones of late Pleistocene megafauna, including mastodons, mammoths, ground sloths, horses, camels, deer, bison, black bear, coyotes, and bighorn sheep. In addition, the site contained more than 26,000 bones from at least 30 species of small animals including salamanders, otters, muskrats, minks, rabbits, beavers, frogs, lizards, snakes, fish, and birds. The combination of macro- and micro-vertebrates, invertebrates, terrestrial and aquatic plant macrofossils, a detailed pollen record, and a robust, directly dated stratigraphic framework shows that high-elevation ecosystems in the Rocky Mountains of Colorado are climatically sensitive and varied dramatically throughout MIS 5.
The Columbarium of Pomponius Hylas is not by any means so well known as it deserves to be; for it is certainly one of the best preserved monuments of its kind, and it is with great pleasure that we are able to include in the present volume of the Papers of the School the interesting series of drawings by Mr. F. G. Newton. It is situated on the Via Latina, immediately before the Porta Latina of the Aurelian Wall, on the right-hand side going out (Lanciani, Forma Urbis, 46), in the former Vigna Sassi: it is, however, best approached from the Via Appia, inasmuch as the custodian of the tomb of the Scipiones keeps the key.
Previous excavations in the sixteenth century are mentioned by Flaminio Vacca (Mem. 100, ed. Fea), but nothing of importance was found. The monument in question was excavated in 1831, permission having been granted at the end of January of that year: its discovery was announced in a letter of Campana's bearing date March 28 of that year (Atti del Camerlengato, Tit. iv. fasc. 1460). The columbarium was approached, not from the Via Latina, but from a branch road running S.E. and passing in front of the entrance, according to Campana (tav. i. A, frontispiece, and p. 301), while Lanciani shows pavement on the N.W. side, on which Campana, in the frontispiece, seems to represent the remains of another tomb, so that this pavement is probably modern.
Analytic results are derived for the apparent slip length, the change in drag and the optimum air layer thickness of laminar channel and pipe flow over an idealised superhydrophobic surface, i.e. a gas layer of constant thickness retained on a wall. For a simple Couette flow the gas layer always has a drag reducing effect, and the apparent slip length is positive, assuming that there is a favourable viscosity contrast between liquid and gas. In pressure-driven pipe and channel flow blockage limits the drag reduction caused by the lubricating effects of the gas layer; thus an optimum gas layer thickness can be derived. The values for the change in drag and the apparent slip length are strongly affected by the assumptions made for the flow in the gas phase. The standard assumptions of a constant shear rate in the gas layer or an equal pressure gradient in the gas layer and liquid layer give considerably higher values for the drag reduction and the apparent slip length than an alternative assumption of a vanishing mass flow rate in the gas layer. Similarly, a minimum viscosity contrast of four must be exceeded to achieve drag reduction under the zero mass flow rate assumption whereas the drag can be reduced for a viscosity contrast greater than unity under the conventional assumptions. Thus, traditional formulae from lubrication theory lead to an overestimation of the optimum slip length and drag reduction when applied to superhydrophobic surfaces, where the gas is trapped.
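The contrast between the two gas-phase assumptions can be made concrete with a short lubrication-theory sketch. The notation here is ours, not the paper's: gas layer thickness $h$, gas and liquid viscosities $\mu_g$ and $\mu_l$, interfacial shear stress $\tau$, and slip length $b$ measured from the solid wall.

```latex
% Constant shear rate in the gas layer: the interface velocity is
%   u_i = \tau h / \mu_g ,
% so the apparent slip length measured from the solid wall is
b_{\mathrm{shear}} = h\left(\frac{\mu_l}{\mu_g} - 1\right),
% which is positive (drag-reducing) whenever \mu_l/\mu_g > 1.
%
% Zero net mass flow in the gas layer: a recirculating pressure
% gradient develops, the interface velocity drops to
%   u_i = \tau h / (4\mu_g) ,
% and the slip length becomes
b_{\mathrm{zero\text{-}flux}} = h\left(\frac{\mu_l}{4\mu_g} - 1\right),
% which is positive only for a viscosity contrast \mu_l/\mu_g > 4.
```

Both thresholds match those quoted in the abstract: the factor of four arises because a Couette–Poiseuille profile carrying zero net flux transmits only a quarter of the interface velocity of a pure shear profile under the same stress.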