Use of the herbicide atrazine (ATR) is banned in the European Union; yet, it is still widely used in the USA and Australia. ATR is known to alter testosterone and oestrogen production and thus reproductive characteristics in numerous species. In this proof-of-concept study, we examined the effect of ATR exposure at a supra-environmental dose (5 mg/kg bw/day), beginning on embryonic day 9.5 (E9.5) in utero, prior to sexual differentiation of the reproductive tissues, and continuing until 26 weeks of age, on the development of the mouse penis. Notably, this is the first study to specifically investigate whether ATR can affect penis characteristics. We show that ATR exposure beginning in utero causes a shortening (demasculinisation) of penis structures and increases the incidence of hypospadias in mice. These data indicate the need for further studies of ATR on human reproductive development and fertility, especially considering its continued and widespread use.
Older adults with dementia are particularly vulnerable to adverse outcomes resulting from anticholinergic use. We aimed to: (i) examine the anticholinergic burden of patients with dementia attending a Psychiatry of Later Life (PLL) service; (ii) examine concomitant prescription of acetylcholinesterase inhibitors (AChEIs) and anticholinergics; and (iii) compare the Anticholinergic Cognitive Burden (ACB) scale with a recently published composite list of anticholinergics.
Retrospective chart review of new referrals with a diagnosis of dementia (n = 66) seen by the PLL service, Tallaght University Hospital, Dublin, Ireland, over a consecutive period of 4 months.
The mean ACB score was 2.2 (range = 0–9, SD = 2.1). 37.9% (n = 25) had a clinically significant ACB score (>3) and 42.1% (n = 8) of those taking AChEIs had a clinically significant ACB score. A significantly greater number of medications with anticholinergic activity were identified using the composite list versus the traditional ACB scale (2.3 v. 1.5, p = 0.001).
We demonstrated a significant anticholinergic burden amongst patients with dementia attending a specialist PLL service. There was no difference in anticholinergic burden between groups prescribed and not prescribed AChEIs, indicating that these medications are being prescribed without discontinuation of potentially inappropriate medications with anticholinergic activity. The true anticholinergic burden experienced by patients may be underestimated by the use of the ACB score alone, although the clinical significance of this finding is unclear. Calculation of true clinical anticholinergic burden load and its translation to a specific rating scale remains a challenge.
There is increasing evidence to support integration of simulation into medical training; however, no national emergency medicine (EM) simulation curriculum exists. Using Delphi methodology, we aimed to identify and establish content validity for adult EM curricular content best suited for simulation-based training, to inform national postgraduate EM training.
A national panel of experts in EM simulation iteratively rated potential curricular topics, on a 4-point scale, to determine those best suited for simulation-based training. After each round, responses were analyzed. Topics scoring <2/4 were removed and the remaining topics were resent to the panel for further rating until consensus was achieved, defined as Cronbach α ≥ 0.95. At the conclusion of the Delphi process, topics rated ≥3.5/4 were considered “core” curricular topics, while those rated 3.0-3.5 were considered “extended” curricular topics.
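The consensus and classification rules described above (Cronbach α ≥ 0.95 across panellists, with mean-rating cutoffs of ≥3.5 for “core” and 3.0-3.5 for “extended” topics) can be sketched as follows. This is a minimal illustration, not the study's analysis code; the ratings layout (topics as rows, raters as columns) and function names are assumptions.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha over a topics x raters matrix of scores.

    Treats raters as 'items': alpha = k/(k-1) * (1 - sum(item vars)/total var).
    """
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                        # number of raters
    item_vars = ratings.var(axis=0, ddof=1)     # sample variance per rater
    total_var = ratings.sum(axis=1).var(ddof=1) # variance of topic sums
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def classify(mean_score):
    """Apply the abstract's cutoffs on a topic's mean 4-point rating."""
    if mean_score >= 3.5:
        return "core"
    if mean_score >= 3.0:
        return "extended"
    return "excluded"
```

With perfect rater agreement the alpha statistic reaches 1.0, which would satisfy the ≥0.95 stopping rule after a single round.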
Forty-five experts from 13 Canadian centres participated. Two hundred eighty potential curricular topics, in 29 domains, were generated from a systematic literature review, relevant educational documents and Delphi panellists. Three rounds of surveys were completed before consensus was achieved, with response rates ranging from 93-100%. Twenty-eight topics, in eight domains, reached consensus as “core” curricular topics. Thirty-five additional topics, in 14 domains, reached consensus as “extended” curricular topics.
Delphi methodology allowed for achievement of expert consensus and content validation of EM curricular content best suited for simulation-based training. These results provide a foundation for improved integration of simulation into postgraduate EM training and can be used to inform a national simulation curriculum to supplement clinical training and optimize learning.
Advancements in computer technology have enabled three-dimensional (3D) reconstruction, data-stitching, and manipulation of 3D data obtained on X-ray imaging systems such as micro-computed tomography (μ-CT). Likewise, intuitive evaluation of these 3D datasets can be enhanced by recent advances in virtual reality (VR) hardware and software. Additionally, the generation, viewing, and manipulation of 3D X-ray diffraction datasets, such as pole figures employed for texture analysis, can also benefit from these advanced visualization techniques. We present newly developed protocols for porting 3D data (as TIFF-stacks) into a Unity gaming software platform so that data may be toured, manipulated, and evaluated within a more intuitive VR environment through the use of game-like controls and 3D headsets. We demonstrate this capability by rendering μ-CT data of a polymer dogbone test bar at various stages of in situ mechanical strain. An additional experiment is presented showing 3D XRD data collected on an aluminum test block with vias. These 3D XRD data for texture analysis (χ, ϕ, 2θ dimensions) enable the viewer to visually inspect 3D pole figures and detect the presence or absence of in-plane residual macrostrain. These two examples serve to illustrate the benefits of this new methodology for multidimensional analysis.
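As a minimal sketch of the data-preparation step implied above, the following assembles 2-D slices (such as those read from a TIFF stack) into a single normalised 3-D volume, a common prerequisite before handing the data to a rendering engine. The function name and the [0, 1] intensity normalisation are assumptions for illustration; the actual Unity-side import protocol is not shown.

```python
import numpy as np

def slices_to_volume(slices):
    """Stack 2-D image slices into a (z, y, x) volume and normalise
    intensities to [0, 1], as is typical before 3-D texture rendering."""
    vol = np.stack([np.asarray(s, dtype=np.float32) for s in slices], axis=0)
    lo, hi = float(vol.min()), float(vol.max())
    if hi > lo:
        vol = (vol - lo) / (hi - lo)
    return vol
```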
Disarticulated human remains were recovered from a first-century fort ditch at Vindolanda on the north-west frontier of the Roman Empire. Ancient DNA analysis revealed the skeleton to be that of a male individual and forensic taphonomic analysis suggested a primary deposition of the body in a waterlogged environment with no obvious evidence of formal burial. Occurrences of disarticulated human remains outside a cemetery context are often overlooked in Roman bioarchaeology. This discovery adds to the growing body of literature regarding alternative funerary practice in the Empire, highlighting that the concept of burial and disposal of the dead is more complex than ancient historical sources suggest. Details of the DNA analysis are provided in the Supplementary Material available at https://doi.org/10.1017/S0068113X1900014X.
Introduction: Previous systematic reviews suggest early mobilization in the intensive care unit (ICU) population is feasible, safe, and may improve outcomes. Only one review investigated mobilization specifically in trauma ICU patients and failed to identify any relevant articles. The objective of the present systematic review was to conduct an up-to-date search of the literature to assess the effect of early mobilization in adult trauma ICU patients on mortality, length of stay (LOS) and duration of mechanical ventilation. Methods: We performed a systematic search of four electronic databases (Ovid MEDLINE, Embase, CINAHL, Cochrane Library) and the grey literature. To be included, studies must have compared early mobilization to delayed or no mobilization among trauma patients admitted to the ICU. Meta-analysis was performed to determine the effect of early mobilization on mortality, hospital LOS, ICU LOS, and duration of mechanical ventilation. Results: The search yielded 2,975 records from the 4 databases and 7 records from grey literature and bibliographic searches; of these, 9 articles met all eligibility criteria and were included in the analysis. There were 7 studies performed in the United States, 1 study from China and 1 study from Norway. Study populations included neurotrauma (3 studies), blunt abdominal trauma (2 studies), mixed injury types (2 studies) and burns (1 study). Cohorts ranged in size from 15 to 1,132 patients (median, 63) and varied in inclusion criteria. Most studies used some form of stepwise progressive mobility protocol. Two studies used simple ambulation as the mobilization measure, and 1 study employed upright sitting as their only intervention. Time to commencement of the intervention was variable across studies, and only 2 studies specified the timing of mobilization initiation. We did not detect a difference in mortality with early mobilization, although the pooled risk ratio (RR) was reduced (RR 0.90, 95% CI 0.74 to 1.09). 
Hospital LOS and ICU LOS were decreased with early mobilization, though this difference did not reach significance. Duration of mechanical ventilation was significantly shorter in the early mobilization group (mean difference −1.18, 95% CI −2.17 to −0.19). Conclusion: Our review identified few studies that examined mobilization of critically ill trauma patients in the ICU. On meta-analysis, early mobilization was found to reduce duration of mechanical ventilation, but the effects on mortality and LOS were not significant.
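The pooled risk ratio reported above comes from a standard meta-analytic calculation. As a hedged sketch (the review does not state whether a fixed- or random-effects model was used; this shows generic inverse-variance fixed-effect pooling on the log scale, with illustrative function and parameter names):

```python
import math

def pooled_risk_ratio(studies, z=1.96):
    """Inverse-variance fixed-effect pooling of risk ratios.

    studies: iterable of (events_exposed, n_exposed, events_control, n_control).
    Returns (pooled RR, (lower 95% CI, upper 95% CI)).
    """
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2  # approx. variance of log RR
        w = 1.0 / var                           # inverse-variance weight
        num += w * log_rr
        den += w
    mean = num / den
    se = math.sqrt(1.0 / den)
    return math.exp(mean), (math.exp(mean - z * se), math.exp(mean + z * se))
```

A confidence interval that spans 1.0, as in the mortality result above (RR 0.90, 95% CI 0.74 to 1.09), is why no mortality difference was detected despite the reduced point estimate.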
Introduction: Hypotension is known to be associated with increased mortality in severe traumatic brain injury (TBI) patients. Systolic blood pressure (SBP) of <90 mmHg is the threshold for hypotension in consensus TBI treatment guidelines; however, evidence suggests hypotension should be defined at higher levels for these patients. Our objective was to determine the influence of hypotension on mortality in TBI patients requiring ICU admission using different thresholds of SBP on arrival at the emergency department (ED). Methods: Retrospective cohort study of patients with severe TBI (Abbreviated Injury Scale Head score ≥3) admitted to ICU at the QEII Health Sciences Centre (Halifax, Canada) between 2002 and 2013. Patients were grouped by SBP on ED arrival (<90 mmHg, <100 mmHg, <110 mmHg). We performed multiple logistic regression analysis with mortality as the dependent variable. Models were adjusted for confounders including age, gender, Injury Severity Score (ISS), injury mechanism, and trauma team activation (TTA). Results: A total of 1233 patients sustained a severe TBI and were admitted to the ICU during the study period. The mean age was 43.4 ± 23.9 years and most patients were male (919/1233; 74.5%). The most common mechanism of injury was motor vehicle collision (491/1233; 41.2%) followed by falls (427/1233; 35.8%). Mean length of stay in the ICU was 6.1 ± 6.4 days, and the overall mortality rate was 22.7%. SBP on arrival was available for 1182 patients. The <90 mmHg group had 4.6% (54/1182) of these patients; mean ISS was 20.6 ± 7.8 and mortality was 40.7% (22/54). The <100 mmHg group had 9.3% (110/1182) of patients; mean ISS was 19.3 ± 7.9 and mortality was 34.5% (38/110). The <110 mmHg group had 16.8% (198/1182) of patients; mean ISS was 17.9 ± 8.0 and mortality was 28.8% (57/198).
After adjusting for confounders, the odds ratio for mortality associated with hypotension was 2.22 (95% CI 1.19-4.16) using a <90 mmHg cutoff, 1.79 (95% CI 1.12-2.86) using a <100 mmHg cutoff, and 1.50 (95% CI 1.02-2.21) using a <110 mmHg cutoff. Conclusion: While we found that TBI patients with a SBP <90 mmHg were over 2 times more likely to die, patients with an SBP <110 mmHg on ED arrival were still 1.5 times more likely to die from their injuries compared to patients without hypotension. These results suggest that establishing a higher threshold for clinically meaningful hypotension in TBI patients is warranted.
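The exposure-outcome comparison underlying estimates like these starts from a simple 2×2 table. As an illustration only: the sketch below computes a crude (unadjusted) odds ratio, whereas the study's reported figures were additionally adjusted for age, gender, ISS, mechanism, and TTA via logistic regression, which this sketch does not reproduce.

```python
def odds_ratio(deaths_exposed, n_exposed, deaths_unexposed, n_unexposed):
    """Crude odds ratio of death for an exposed (e.g. hypotensive) group
    versus an unexposed group, from counts of deaths and group sizes."""
    a = deaths_exposed                 # exposed, died
    b = n_exposed - deaths_exposed     # exposed, survived
    c = deaths_unexposed               # unexposed, died
    d = n_unexposed - deaths_unexposed # unexposed, survived
    return (a * d) / (b * c)
```

The illustrative counts in the usage below are invented, not taken from the study.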
Introduction: Oxygen is commonly administered to prehospital patients presenting with acute myocardial infarction (AMI). We conducted a systematic review to determine if oxygen administration in AMI impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central, clinicaltrials.gov and ISRCTN for relevant randomized controlled trials and observational studies comparing oxygen administration and no oxygen administration. The outcomes of interest were: mortality (≤30 days, in-hospital, and intermediate 2-11 months), infarct size, and major adverse cardiac events (MACE). Risk of Bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. A meta-analysis was performed using RevMan 5 software. Results: Our search yielded 1192 citations, of which 48 studies were reviewed as full texts and a total of 8 studies were included in the analysis. All evidence was considered low or very low quality. Five studies reported on mortality, providing low quality evidence of no benefit or harm from supplemental oxygen administration. Similarly, no benefit or harm was found for MACE or infarct size (very low quality evidence). Normoxia was defined as oxygen saturation measured via pulse oximetry at ≥90% in one recent study and ≥94% in another. Conclusion: We found low and very low quality evidence that the administration of supplemental oxygen to normoxic patients experiencing AMI provides no clear benefit or harm for mortality or MACE. The evidence on infarct size was inconsistent and warrants further prospective examination.
Introduction: Long-term immobility has detrimental effects for critically ill patients admitted to the intensive care unit (ICU) including ICU-acquired weakness. Early mobilization of patients admitted to ICU has been demonstrated to be a safe, feasible and effective strategy to improve patient outcomes. The optimal mobilization of trauma ICU patients has not been extensively studied. Our objective was to determine the impact of an early mobilization protocol on outcomes among trauma patients admitted to the ICU. Methods: We analyzed all adult trauma patients (>18 years old) admitted to ICU over a 2-year period prior to and following implementation of an early mobilization protocol, allowing for a 1-year transition period. Data were collected from the Nova Scotia Trauma Registry. We compared patient characteristics and outcomes (mortality, length of stay [LOS], ventilator days) between the pre- and post-implementation groups. Associations between early mobilization and clinical outcomes were estimated using binary and linear regression models. Results: Overall, there were 526 patients included in the analysis (292 pre-implementation, 234 post-implementation). The study population ranged in age from 18 to 92 years (mean age 49.0 ± 20.4 years) and 74.3% of all patients were male. The pre- and post-implementation groups were similar in age, sex, and injury severity. In-hospital mortality was reduced in the post-implementation group (25.3% vs. 17.5%; p = 0.031). In addition, there was a reduction in ICU mortality in the post-implementation group (21.6% vs. 12.8%; p = 0.009). We did not observe any difference in overall hospital LOS, ICU LOS, or ventilator days between the two groups. Compared to the pre-implementation period, trauma patients admitted to the ICU following protocol implementation were less likely to die in-hospital (OR = 0.52, 95% CI 0.30-0.91; p = 0.021) or in the ICU (OR = 0.40, 95% CI 0.21-0.76, p = 0.005).
Results were similar following a sensitivity analysis limited to patients with blunt or penetrating injuries. There was no difference between the pre- and post-implementation groups with respect to in-hospital LOS, ICU LOS, or the number of ventilator days. Conclusion: We found that trauma patients admitted to ICU during the post-implementation period had decreased odds of in-hospital mortality and ICU mortality. Ours is the first study to demonstrate a significant reduction in trauma mortality following implementation of an ICU mobility protocol.
Introduction: The Prehospital Evidence-Based Practice (PEP) program is an online, freely accessible, continuously updated Emergency Medical Services (EMS) evidence repository. This summary describes the research evidence for the identification and management of adult patients suffering from sepsis syndrome or septic shock. Methods: PubMed was searched in a systematic manner. One author reviewed titles and abstracts for relevance and two authors appraised each study selected for inclusion. Primary outcomes were extracted. Studies were scored by trained appraisers on a three-point Level of Evidence (LOE) scale (based on study design and quality) and a three-point Direction of Evidence (DOE) scale (supportive, neutral, or opposing findings based on the studies’ primary outcome for each intervention). LOE and DOE of each intervention were plotted on an evidence matrix (DOE × LOE). Results: Eighty-eight studies were included for 15 interventions listed in PEP. The interventions with the most evidence were related to identification tools (ID) (n = 26, 30%) and early goal directed therapy (EGDT) (n = 21, 24%). ID tools included Systemic Inflammatory Response Syndrome (SIRS), quick Sequential Organ Failure Assessment (qSOFA) and other unique measures. The most common primary outcomes were related to diagnosis (n = 30, 34%), mortality (n = 40, 45%) and treatment goals (e.g. time to antibiotic) (n = 14, 16%). The evidence rankings for the supported interventions were: supportive-high quality (n = 1, 7%) for crystalloid infusion; supportive-moderate quality (n = 7, 47%) for identification tools, prenotification, point of care lactate, titrated oxygen, and temperature monitoring; and supportive-low quality (n = 1, 7%) for vasopressors. The benefit of prehospital antibiotics and EGDT remains inconclusive with a neutral DOE. There is moderate level evidence opposing use of high flow oxygen.
Conclusion: EMS sepsis interventions are informed primarily by moderate quality supportive evidence. Several standard treatments are well supported by moderate to high quality evidence, as are identification tools. However, some standard in-hospital therapies are not supported by evidence in the prehospital setting, such as antibiotics, and EGDT. Based on primary outcomes, no identification tool appears superior. This evidence analysis can guide selection of appropriate prehospital therapies.
Introduction: The Canadian Association of Emergency Physicians (CAEP) Atrial Fibrillation (AF) Guidelines prioritize early cardioversion and discharge home in the management of rapid AF; however, not all patients can be safely cardioverted in the emergency department (ED). Given limited ED-based evidence on rate control, we sought to better understand the burden of disease in AF patients not managed by rhythm control and identify opportunities for improved care. Methods: We conducted a health records review of consecutive AF patient visits at two Canadian academic hospital EDs over a 12-month period. We included all patients ≥18 years with AF on electrocardiogram, a heart rate ≥100 beats per minute (bpm), and who did not receive cardioversion. Outcomes included: (1) incidence of patients managed by rate control; (2) specific rate control management practices including choice of agent, route of administration, dosing, and timing; (3) adverse events; (4) compliance with CAEP AF Guidelines; and (5) disposition and outcomes. Results: Of 972 rapid AF patient visits, 307 were excluded and 665 were included, with a mean age of 77.2 years; 51.6% were female. Of those included, 43.0% were given rate control medications, the most common being metoprolol (72.0%). Admission to hospital occurred in 61.4% of visits, and 77.9% of AF cases were secondary to another medical condition. In those given rate control medications, 9.1% suffered adverse events and only 55.6% had a final ED heart rate ≤100 bpm. Inappropriate use of rate control medications was found in 44.8% of cases, specifically inappropriate choice of agent (4.5%), inappropriate route of administration (26.9%), overdosing (2.4%), underdosing (5.2%), and inappropriate timing (5.6%).
Conclusion: We demonstrated that for rapid AF patients not receiving cardioversion, most cases were secondary to a medical cause and of those receiving rate control, there were a concerning number of adverse events related to inappropriate choice of agent, route of administration, dosage, and timing. Moving forward, better awareness of the CAEP AF Guidelines by ED physicians will ensure safer use of rate control agents for rapid AF patients.
Introduction: Opioids are routinely administered for analgesia to prehospital patients experiencing chest discomfort from acute myocardial infarction (AMI). We conducted a systematic review to determine if opioid administration impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central and Clinicaltrials.gov for relevant randomized controlled trials and observational studies comparing opioid administration in AMI patients from 1990 to 2017. The outcomes of interest were: all-cause short-term mortality (≤30 days), major adverse cardiac events (MACE), platelet activity and aggregation, immediate adverse events, infarct size, and analgesia. Included studies were hand searched for additional citations. Risk of Bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. Results: Our search yielded 3001 citations, of which 19 studies were reviewed as full texts and a total of 9 studies were included in the analysis. The studies predominantly reported on morphine as the opioid. Five studies reported on mortality (≤30 days), seven on MACE, four on platelet activity and aggregation, two on immediate adverse events, two on infarct size and none on analgesic effect. We found low quality evidence suggesting no benefit or harm in terms of mortality or MACE. However, low quality evidence indicates that opioids increase infarct size. Low quality evidence also shows reduced serum levels of active P2Y12 inhibitor metabolites (e.g., clopidogrel and ticagrelor) and increased platelet reactivity in the first several hours post administration, together with an increase in vomiting.
Conclusion: We found low and very low quality evidence that the administration of opioids in STEMI may be adversely related to vomiting and some surrogate outcomes including increased infarct size, reduced serum P2Y12 levels, and increased platelet activity. We found no clear benefit or harm on patient-oriented clinical outcomes including mortality.
The majority of self-management interventions are designed with a narrow focus on patient skills and fail to consider their potential as “catalysts” for improving care delivery. A project was undertaken to develop a patient self-management resource to support evidence-based, person-centered care for cancer pain and overcome barriers at the levels of the patient, provider, and health system.
The project used a mixed-method design with concurrent triangulation, including the following: a national online survey of current practice; two systematic reviews of cancer pain needs and education; a desktop review of online patient pain diaries and other related resources; consultation with stakeholders; and interviews with patients regarding acceptability and usefulness of a draft resource.
Findings suggested that an optimal self-management resource should encourage pain reporting, build patients’ sense of control, and support communication with providers and coordination between services. Each of these characteristics was identified as important in overcoming established barriers to cancer pain care. A pain self-management resource was developed to include: (1) a template for setting specific, measurable, achievable, relevant and time-bound goals of care, as well as identifying potential obstacles and ways to overcome these; and (2) a pain management plan detailing exacerbating and alleviating factors, current strategies for management, and contacts for support.
Significance of results
Self-management resources have the potential for addressing barriers not only at the patient level, but also at provider and health system levels. A cluster randomized controlled trial is under way to test effectiveness of the resource designed in this project in combination with pain screening, audit and feedback, and provider education. More research of this kind is needed to understand how interventions at different levels can be optimally combined to overcome barriers and improve care.
Objectives: Visual-spatial neglect is a common attentional disorder after right-hemisphere stroke and is associated with poor rehabilitation outcomes. The presence of neglect symptoms has been reported to vary across personal, peripersonal, and extrapersonal space. Currently, no measure is available to assess neglect severity equally across these spatial regions, so existing measures may miss subsets of symptoms or patients with neglect entirely. We sought to provide initial construct validity for a novel assessment tool that measures neglect symptoms equally for these spatial regions: the Halifax Visual Scanning Test (HVST). Methods: In Study I, the HVST was compared to conventional measures of neglect and functional outcome scores (wheelchair navigation) in 15 stroke inpatients and 14 healthy controls. In Study II, 19 additional controls were combined with the control data from Study I to establish cutoffs for impairment. Patterns of neglect in the stroke group were examined. Results: In Study I, performance on all HVST subtests was correlated with the majority of conventional subtests and wheelchair navigation outcomes. In Study II, neglect-related deficits in visual scanning showed dissociations across spatial regions. Four inpatients exhibited symptoms of neglect on the HVST that were not detected on conventional measures, one of whom showed symptoms in personal and extrapersonal space exclusively. Conclusions: The HVST appears to be a useful measure of neglect symptoms in different spatial regions that may not be detected with conventional measures and that correlates with functional wheelchair performance. Preliminary control data are presented and further research to add to this normative database appears warranted. (JINS, 2019, 25, 490–500)
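Norm-based cutoffs like those established in Study II are commonly derived from the distribution of control scores. The abstract does not specify the HVST's exact rule, so the sketch below illustrates one widely used convention (flagging scores more than a set number of standard deviations below the control mean); the function name and the 2-SD default are assumptions.

```python
import statistics

def impairment_cutoff(control_scores, n_sd=2.0):
    """Return the score below which performance would be flagged as
    impaired, defined here as n_sd sample standard deviations below
    the control-group mean (an assumed convention, for illustration)."""
    m = statistics.mean(control_scores)
    s = statistics.stdev(control_scores)  # sample SD (ddof = 1)
    return m - n_sd * s
```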
An X-ray/laser technique is described for producing Laue patterns on very small (1 mm²) crystal surfaces where it is otherwise not possible to mechanically align the surface of the crystal perpendicular to an X-ray beam. This technique has been used to determine the orientation of the diamond inserts in cutting tool bits.
Synchrotron white beam transmission topography of GaAs as previously reported by the authors relied on scanning specimen and film synchronously through the incident X-ray beam to record transmission topographic images on film. Sometimes the total dose required for reasonable contrast on film carried with it enough thermal deposition to cause elastic warping of the wafer. To escape these problems, a real time system was assembled. This system included an image intensifier, a solid state camera, a computer board to frame-grab and digitize images, and appropriate image processing software. With this system, a three-inch specimen was scanned from edge to edge in one minute. At this scan rate, the incident X-ray beam had to be significantly attenuated to avoid saturating the intensifier output.
The strength of ceramics or glasses can be increased by placing their surfaces into compression. Techniques include ion exchange, temperature glazing, surface chemical reactions and stress-induced phase transformations. Although most of these techniques are well recognized, little effort has been expended in experimentally determining the magnitude of the compressive stress, and in particular, in using experimental evidence to identify important material and process parameters that need to be controlled. The goal of this investigation was to determine some of the factors that affect the magnitude, profile and depth of the compressive layer introduced by a structural phase transformation. X-ray residual stress measurements were used to directly determine the state of the surface residual stress.
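X-ray residual stress measurement is most often performed with the sin²ψ method, in which the in-plane stress is recovered from the slope of lattice spacing d versus sin²ψ. The abstract does not state the exact procedure used, so the following is a hedged sketch of that standard relation with illustrative names and values.

```python
def sin2psi_stress(E, nu, d0, slope):
    """In-plane residual stress from the classic sin^2(psi) relation:

        sigma = (E / ((1 + nu) * d0)) * d(d)/d(sin^2 psi)

    E: Young's modulus (Pa), nu: Poisson's ratio,
    d0: unstressed lattice spacing, slope: slope of d vs sin^2(psi)
    in the same length units as d0."""
    return (E / ((1.0 + nu) * d0)) * slope
```

A positive slope (d increasing with ψ tilt) indicates tension; the compressive surface layers discussed above would show a negative slope.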
We sought to address the prior limitations of symptom checker accuracy by analysing the diagnostic and triage feasibility of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters, and addressing a complex patient population – those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers in relation to an emergency room physician-determined diagnosis. An ED retrospective analysis was performed on 8363 consecutive adult patients. Eligible patients included: 90 HIV, 67 hepatitis C, 11 both HIV and hepatitis C. Five online symptom checkers were utilised for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three with triage capabilities. Symptom checker output was compared with ED physician-determined diagnosis data with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy with regard to Top1 (<20%), Top3 (<35%), Top10 (<40%) and Listed at All (<45%). Significant variations existed for each individual symptom checker, as some appeared more accurate at listing the diagnosis at the top of the differential, while others were more apt to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) were found to have an initial diagnosis with emergent criteria than HIV patients (35.6%; 32/90). Symptom checker diagnostic capabilities are quite inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting symptom checkers to have diagnostic algorithms accounting for such complexity.
Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and subsequent symptom checker diagnosis could allow health officials a means to track illnesses in specific patient populations and geographic regions. In order to do this, accurate and reliable symptom checkers are warranted.
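The Top1/Top3/Top10/Listed-at-All metrics above are all instances of top-k accuracy: the fraction of encounters in which the physician's diagnosis appears within the first k entries of the checker's ranked differential. A minimal sketch (data layout and names are illustrative, not from the study):

```python
def top_k_accuracy(cases, k):
    """cases: iterable of (physician_dx, ranked_differential) pairs,
    where ranked_differential is an ordered list of candidate diagnoses.
    Returns the fraction of cases whose physician diagnosis appears
    among the checker's top k candidates."""
    cases = list(cases)
    hits = sum(1 for truth, ranked in cases if truth in ranked[:k])
    return hits / len(cases)
```

"Listed at All" corresponds to setting k to the full length of each differential.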
In the sixteenth century the pseudo-science of hieroglyphics attracted many scholars for much the same reasons that the picture-puzzle game of imprese was appealing to an enlightened and leisured nobility. Like dianoetics and phrenology in more modern times, both pursuits belonged to that seemingly endless line of intellectual fads that enchant with idealistic and largely empty promises. But though they failed to become the improved means of written communication they were supposed to, they both possessed genuine intellectual qualities, in that they were wittily conceived and dependent upon a wide range of information, much of it obscure, and most of it classical.
There are no estimates of the heritability of phenotypic udder traits in suckler sheep, which produce meat lambs, and whether these are associated with resilience to mastitis. Mastitis is a common disease which damages the mammary gland and reduces productivity. The aims of this study were to investigate the feasibility of collecting udder phenotypes, their heritability and their association with mastitis in suckler ewes. Udder and teat conformation, teat lesions, intramammary masses (IMM) and litter size were recorded from 10 Texel flocks in Great Britain between 2012 and 2014; 968 records were collected. Pedigree data were obtained from an online pedigree recording system. Univariate quantitative genetic parameters were estimated using animal and sire models. Linear mixed models were used to analyse continuous traits and generalised linear mixed models were used to analyse binary traits. Continuous traits had higher heritabilities than binary traits, with teat placement and teat length heritabilities (h²) highest at 0.35 (SD 0.04) and 0.42 (SD 0.04), respectively. Udder width, drop and separation heritabilities were lower and varied with udder volume. The heritabilities of IMM and teat lesions (sire model) were 0.18 (SD 0.12) and 0.17 (SD 0.11), respectively. All heritabilities were sufficiently high to be used in a selection programme to increase resilience to mastitis in the population of Texel sheep. Further studies are required to investigate genetic relationships between traits and to determine whether udder traits predict IMM, and the potential benefits from including traits in a selection programme to increase resilience to chronic mastitis.
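The animal- and sire-model heritabilities above come from standard variance-component ratios. As a sketch of those textbook formulas (not the study's actual mixed-model fitting, which would require the pedigree data): under an animal model, h² is the additive variance over the total phenotypic variance, while under a sire model the sire variance captures only a quarter of the additive variance and is scaled up accordingly.

```python
def h2_animal(var_additive, var_residual):
    """Narrow-sense heritability under an animal model:
    h2 = Va / (Va + Ve)."""
    return var_additive / (var_additive + var_residual)

def h2_sire(var_sire, var_residual):
    """Heritability under a sire model: half-sib groups share a quarter
    of the additive variance, so h2 = 4 * Vs / (Vs + Ve)."""
    return 4.0 * var_sire / (var_sire + var_residual)
```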