Antipsychotics are widely used to treat patients with psychosis and target threshold psychotic symptoms. Individuals at clinical high risk (CHR) for psychosis are characterized by subthreshold psychotic symptoms, and it is currently unclear who among them might benefit from antipsychotic treatment. Our objective was to apply a risk calculator (RC) to identify people who would benefit from antipsychotics.
From a sample of 400 CHR individuals recruited between 2011 and 2016, 208 who received antipsychotic treatment were included. Clinical and cognitive variables were entered into an individualized RC for psychosis; personal risk was estimated, and four risk components (negative symptoms, RC-NS; general function, RC-GF; cognitive performance, RC-CP; and positive symptoms, RC-PS) were constructed. The sample was further stratified by risk level, with higher risk defined as an estimated risk score of 20% or higher.
In total, 208 CHR individuals received antipsychotic treatment at a mean olanzapine-equivalent dose of 8.7 mg/day, with a mean administration duration of 58.4 weeks. Of these, 39 (18.8%) developed psychosis within 2 years. A new index, the factors ratio (FR), was generated as the ratio of RC-PS plus RC-GF to RC-NS plus RC-CP. In the higher-risk group, the conversion rate decreased as FR increased. A small group (15%) of CHR individuals at higher risk with an FR >1 benefitted from the antipsychotic treatment.
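Stated as a formula (our restatement of the definition just given), the factors ratio is

\[ \mathrm{FR} = \frac{\text{RC-PS} + \text{RC-GF}}{\text{RC-NS} + \text{RC-CP}} \]

so FR > 1 indicates that the positive-symptom and general-function components outweigh the negative-symptom and cognitive components of the estimated risk.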
Based on this personal risk assessment, the administration of antipsychotics should be limited to CHR individuals with predominantly positive symptoms and related functional decline. A strict antipsychotic prescription strategy should be introduced to reduce inappropriate use.
Beryl from Xuebaoding, Sichuan Province, western China is known for its unusual tabular habit and W–Sn–Be paragenesis in a greisen-type deposit. The crystals are typically transparent, colourless to pale blue, often with screw dislocations of hexagonal symmetry on the (0001) crystal faces. The crystal chemical characteristics are determined by combining electron microprobe analyses and laser ablation inductively coupled plasma mass spectrometry with single-crystal X-ray diffraction (XRD), correlated with Raman and micro-infrared (IR) spectroscopy and imaging. The contents of Na+ (0.24–0.38 atoms per formula unit (apfu)) and Li+ (up to 0.38 apfu) are at the high end compared with beryl from other localities worldwide. Li+ substitution for Be2+ on the tetrahedral (T2) site is predominantly charge balanced by Na+ on the smaller channel (C2) site, with Na+ ranging from 91.5% to 99.7% (apfu) of the sum of all other alkali elements. Cs+ and minor Rb+ and K+ primarily charge balance the minor M2+ substitution for Al3+ at the A site; all iron at the A site is suggested to be trivalent. The a axis ranges from 9.2161(2) to 9.2171(4) Å, with unit-cell volume from 678.03(3) to 678.48(7) Å3. The c/a ratio of 1.0002–1.0005 is characteristic of T2-type beryl, with unit-cell parameters controlled primarily by Be2+ substitution. Transmission micro-IR vibrational spectroscopy and imaging identify coordination of one or two water molecules to Na+ (type IIs and type IId, respectively) as well as alkali-free water (type I). Based on IR absorption cross sections and XRD, a C1-site water content of 0.4–0.5 apfu is derived, i.e. close to 50% site occupancy. Secondary crystal phases showing a decrease in Fe and Mg but an increase in Na suggest early crystallisation of aquamarine, with goshenite forming late. Because its crystal chemistry is similar to that of columnar beryl from other localities worldwide, the tabular habit of Xuebaoding beryl appears to be unrelated to chemical composition and alkali content.
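The coupled substitution described above can be summarised as an exchange vector (our notation for the mechanism stated in the text, with \(\square\) denoting a vacant channel site):

\[ {}^{T2}\mathrm{Be}^{2+} + {}^{C2}\square \rightarrow {}^{T2}\mathrm{Li}^{+} + {}^{C2}\mathrm{Na}^{+} \]

that is, each Li+ entering the tetrahedral T2 site is charge balanced by one Na+ entering the C2 channel site.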
To examine associations between diet and risk of developing gastro-oesophageal reflux disease (GERD).
Prospective cohort with a median follow-up of 15·8 years. Baseline diet was measured using a FFQ. GERD was defined as self-reported current or history of daily heartburn or acid regurgitation beginning at least 2 years after baseline. Sex-specific logistic regressions were performed to estimate OR for GERD associated with diet quality scores and intakes of nutrients, food groups and individual foods and beverages. The effect of substituting saturated fat for monounsaturated or polyunsaturated fat on GERD risk was examined.
A cohort of 20 926 participants (62 % women) aged 40–59 years at recruitment between 1990 and 1994.
For men, total fat intake was associated with increased risk of GERD (OR 1·05 per 5 g/d; 95 % CI 1·01, 1·09; P = 0·016), whereas total carbohydrate (OR 0·89 per 30 g/d; 95 % CI 0·82, 0·98; P = 0·010) and starch intakes (OR 0·84 per 30 g/d; 95 % CI 0·75, 0·94; P = 0·005) were associated with reduced risk. No nutrient intakes were associated with risk for women. For both sexes, substituting saturated fat for polyunsaturated or monounsaturated fat did not change risk. For both sexes, fish, chicken, cruciferous vegetables and carbonated beverages were associated with increased risk, whereas total fruit and citrus were associated with reduced risk. No association was observed with diet quality scores.
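As an illustration of the sex-specific substitution modelling reported above, the sketch below (Python, statsmodels) fits a logistic regression in which polyunsaturated fat replaces saturated fat gram for gram; all variable names and the simulated data are hypothetical, and the covariate set is heavily simplified relative to the study.

```python
# Hypothetical sketch of a fat-substitution logistic regression (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "gerd": rng.integers(0, 2, n),        # 1 = developed GERD (simulated)
    "sat_fat": rng.normal(30, 8, n),      # g/d saturated fat
    "poly_fat": rng.normal(12, 4, n),     # g/d polyunsaturated fat
    "mono_fat": rng.normal(25, 6, n),     # g/d monounsaturated fat
    "energy": rng.normal(9000, 1500, n),  # kJ/d total energy
    "age": rng.normal(55, 6, n),
})
df["total_fat"] = df.sat_fat + df.poly_fat + df.mono_fat

# With total_fat, mono_fat and energy held fixed, the coefficient on poly_fat
# is the log-odds change per gram of polyunsaturated fat replacing saturated fat.
model = smf.logit("gerd ~ poly_fat + mono_fat + total_fat + energy + age",
                  data=df).fit(disp=False)
print(np.exp(model.params["poly_fat"]))  # OR per 1 g/d substitution
```

Because total fat, monounsaturated fat and energy are held fixed, a one-unit increase in poly_fat can only come at the expense of saturated fat; this is the standard leave-one-out substitution formulation.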
Diet is a possible risk factor for GERD, but foods considered to be triggers of GERD symptoms might not necessarily contribute to disease development. Potential differential associations for men and women warrant further investigation.
To determine whether a previously reported association between the Special Supplemental Nutrition Program for Women, Infants and Children (WIC) food package change and reduced child obesity risk among WIC-participating children in Los Angeles County holds across levels of family income and neighbourhood poverty.
Analysis of prospectively collected WIC administrative data. The outcome was obesity at age 4 years (BMI-for-age ≥ 95th percentile). Poisson regression was applied to a matched sample (n 79 502) to determine if the association between the WIC food package change and child obesity was modified by family income (<50 % federal poverty level (FPL), 50–100 % FPL, >100 % but <185 % FPL) and neighbourhood poverty.
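A minimal sketch of the effect-modification analysis described above, assuming hypothetical column names (obese, new_package, income_cat); Poisson regression with robust standard errors is a common way to estimate relative risks for a binary outcome:

```python
# Hypothetical sketch: Poisson regression with an interaction term to test
# whether family income modifies the food-package/obesity association.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wic_matched_sample.csv")  # hypothetical matched data set

# Robust (HC1) errors give valid inference for a binary outcome under Poisson.
model = smf.poisson(
    "obese ~ new_package * C(income_cat)", data=df
).fit(cov_type="HC1", disp=False)

print(model.summary())
```

A statistically significant interaction coefficient would indicate that the food-package association differs across income categories; the study found no such modification.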
Los Angeles County, California.
Children who participated in WIC in Los Angeles County between 2003 and 2016; children were grouped as receiving the old WIC food package (2003–2009) or the new WIC food package (2010–2016).
Receiving the new WIC food package (i.e., post-2009) was associated with 7–18 % lower obesity risk across all family income categories. Neither family income nor neighbourhood poverty significantly modified the association between the WIC food package and child obesity. However, certain sub-groups seemed to benefit more from the food package change than others. In particular, boys from families with income above poverty but residing in the poorest neighbourhoods experienced the greatest reductions in obesity risk (relative risk = 0·77; 95 % CI 0·66, 0·88).
The WIC food package revisions were associated with reduced childhood obesity risk among all WIC-participating families in Los Angeles County, across levels of income eligibility and neighbourhood poverty.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, providing clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors, at a fraction of the cost. Such sensitivity increases the expected detection rate for post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and could potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
This timeline is concerned with Dynamic Assessment (henceforth, DA) as it has been taken up and elaborated in contexts involving the teaching, learning, and assessment of learners of second languages (L2s). DA is distinguished by its insistence that an individual's independent performance of assessment tasks reveals only part of his/her abilities, namely those that have completed their development at the time of the assessment; insights into abilities that have begun to emerge but have not yet fully developed can be gained from an individual's responsiveness to particular kinds of support, referred to as mediation (e.g., reminders, leading questions, hints, provision of a model, feedback), offered during the assessment as difficulties arise (Haywood & Lidz, 2006). In this respect, DA differs from more conventional distinctions in assessment, such as that between assessments concerned with the results of previous learning (‘summative assessment’) and those intended to provide information relevant to subsequent instruction (‘formative assessment’). Instead, the embedding of an interactive, instructional element within the assessment procedure allows for the possibility of expanding the evidential basis upon which summative interpretations of learner abilities are made; that is, the results encompass previous learning that has resulted in both complete and partial understanding of relevant concepts. At the same time, DA serves a formative function in so far as interaction allows insights into the underlying sources of learner difficulties and the kind of support to which they are most responsive (Sternberg & Grigorenko, 2002).
An observational study was conducted to characterize high-touch surfaces in emergency departments and hemodialysis facilities. Certain surfaces were touched with much greater frequency than others. A small number of surfaces accounted for the majority of touch episodes. Prioritizing disinfection of these surfaces may reduce pathogen transmission within healthcare environments.
Infection prevention and control (IPC) workflows are often retrospective and manual. New tools, however, have entered the field to facilitate rapid, prospective monitoring of infections in hospitals. Although artificial intelligence (AI)–enabled platforms facilitate timely, on-demand integration of clinical data feeds with pathogen whole-genome sequencing (WGS), a standardized workflow to fully harness the power of such tools is lacking. We report a novel, evidence-based workflow that promotes faster infection surveillance via AI-assisted analysis of clinical and WGS data. The algorithm suggests clusters based on a combination of similar minimum inhibitory concentration (MIC) data, timing of sample collection, and shared location stays between patients. It helps to proactively guide IPC professionals during investigation of infectious outbreaks and surveillance of multidrug-resistant organisms and healthcare-acquired infections. Methods: Our team established a 1-year workgroup comprising IPC practitioners, clinical experts, and scientists in the field. We held weekly roundtables to study lessons learned in an ongoing surveillance effort at a tertiary care hospital—utilizing Philips IntelliSpace Epidemiology (ISEpi), an AI-powered system—to understand how such a tool can enhance practice. Based on real-time case discussions and evidence from the literature, a workflow guidance tool and checklist were codified. Results: In our workflow, data-informed clusters posed by ISEpi underwent triage and expert follow-up analysis to assess: (1) likelihood of transmission(s); (2) identity of potential vector(s); (3) need to request WGS; and (4) intervention(s) to be pursued, if warranted. In a representative sample (spanning October 17, 2019, to November 7, 2019) of 67 total isolates suggested for inclusion in 19 unique cluster investigations, we determined that 9 investigations merited follow-up. Collectively, these 9 investigations involved 21 patients and required 115 minutes to review in ISEpi and an additional 70 minutes of review outside of ISEpi. After review, 6 investigations were deemed unlikely to represent a transmission; the other 3 had potential to represent transmission for which interventions would be performed. Conclusions: This study offers an important framework for adapting existing infection control workflow strategies to leverage the utility of rapidly integrated clinical and WGS data. This workflow can also facilitate time-sensitive decisions regarding sequencing of specific pathogens, given the preponderance of available clinical data supporting investigations. In this regard, our work sets a new standard of practice: precision infection prevention (PIP). Ongoing effort is aimed at developing AI-powered capabilities for enterprise-level quality and safety improvement initiatives.
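The ISEpi clustering logic itself is proprietary and not specified in this report; purely as a toy illustration of the kind of rule described above (similar MIC profiles, close collection dates, shared location stays), a heuristic might look like the following, with every name and threshold hypothetical:

```python
# Toy sketch of a cluster-suggestion heuristic (not the ISEpi algorithm):
# flag isolate pairs with similar MIC profiles, close collection dates,
# and an overlapping location stay.
from dataclasses import dataclass
from datetime import date
from itertools import combinations

@dataclass
class Isolate:
    patient: str
    organism: str
    collected: date
    mics: dict       # antibiotic -> MIC (mg/L)
    locations: set   # ward identifiers during stay

def mic_similarity(a, b):
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    # Fraction of shared antibiotics within roughly one doubling dilution.
    close = sum(1 for ab in shared
                if abs(a[ab] - b[ab]) <= max(a[ab], b[ab]) / 2)
    return close / len(shared)

def suggest_clusters(isolates, days=14, min_sim=0.8):
    pairs = []
    for x, y in combinations(isolates, 2):
        if (x.organism == y.organism
                and abs((x.collected - y.collected).days) <= days
                and x.locations & y.locations
                and mic_similarity(x.mics, y.mics) >= min_sim):
            pairs.append((x.patient, y.patient))
    return pairs  # candidate pairs for IPC triage and possible WGS
```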
Funding: Philips Healthcare provided support for this study.
Disclosures: Alan Doty and Juan Jose Carmona report salary from Philips Healthcare.
Background: The healthcare environment can serve as a reservoir for many microorganisms and, in the absence of appropriate cleaning and disinfection, can contribute to pathogen transmission. Identification of high-touch surfaces (HTSs) in hospital patient rooms has allowed recognition of the surfaces that represent the greatest transmission risk and prioritization of cleaning and disinfection resources for infection prevention. HTSs in other healthcare settings, including high-volume and high-risk settings such as emergency departments (EDs) and hemodialysis facilities (HDFs), have not been well studied or defined. Methods: Observations were conducted in 2 EDs and 3 HDFs using structured observation tools. All touch episodes, defined as hand-to-surface contact regardless of hand hygiene and/or glove use, were recorded. Touches by healthcare personnel, patients, and visitors were included. Surfaces were classified as being allocated to individual patients or shared among multiple patients. The number of touch episodes per hour was calculated for each surface to rank surfaces by frequency of touch. Results: In total, 28 hours of observation (14 hours each in EDs and HDFs) were conducted, during which 1,976 touch episodes were observed across 62 surfaces. On average, more touch episodes were observed per hour in HDFs than in EDs (89 vs 52, respectively). The most frequently touched surfaces in EDs included stretcher rails, privacy curtains, visitor chair armrests and seats, and patient bedside tables, which together accounted for 68.8% of all touch episodes in EDs (Fig. 1). Frequently touched surfaces in HDFs included both shared and single-patient surfaces (27.8% and 72.2% of HDF touch episodes, respectively). The most frequently touched surfaces in HDFs were supply cart drawers, dialysis machine control panels and keyboards, handwashing faucet handles, bedside work tables, and bed rail or dialysis chair armrests, which accounted for 68.4% of all touch episodes recorded. Conclusions: To our knowledge, this is the first quantitative study to identify HTSs in EDs and HDFs. Our observations reveal that certain surfaces within these environments are subject to a substantially greater frequency of hand contact than others and that a relatively small number of surfaces account for most touch episodes. Notably, whereas HTSs in EDs were primarily single-patient surfaces, HTSs in HDFs included surfaces shared in the care of multiple patients, which may represent an even greater risk of patient-to-patient pathogen transmission than single-patient surfaces. The identification of HTSs in EDs and HDFs contributes to a better understanding of the risk of environment-related pathogen transmission in these settings and may allow prioritization and optimization of cleaning and disinfection resources within facilities.
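The per-surface ranking described in the methods reduces to a simple rate calculation; a minimal pandas sketch (hypothetical file and column names) would be:

```python
# Minimal sketch: rank surfaces by touch episodes per observation hour.
import pandas as pd

touches = pd.read_csv("touch_log.csv")  # hypothetical: one row per touch episode
hours_observed = {"ED": 14, "HDF": 14}  # observation hours per setting

rates = (touches.groupby(["setting", "surface"]).size()
                .rename("episodes")
                .reset_index())
rates["per_hour"] = rates.apply(
    lambda r: r.episodes / hours_observed[r.setting], axis=1)

# Highest-rate surfaces are candidates for prioritized disinfection.
print(rates.sort_values("per_hour", ascending=False).head(10))
```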
Although seafood is considered an important part of a balanced diet, many national food consumption surveys suggest that seafood is not consumed in sufficient amounts. As consumers move to diversify their diets away from animal-based protein, it is important to understand the factors influencing consumption of marine foods. This review aims to assess the characteristics of seafood consumers as well as the influences on seafood consumption in Europe, the USA, Canada, Australia and New Zealand. Systematic search strategies were used to identify relevant journal articles from three electronic databases (PubMed, Web of Science and Embase). Three searches were carried out and identified 4405 unique publications, of which 121 met the criteria for the review process. The reviewed studies revealed that seafood consumers were more likely to be older, more affluent and more physically active, and less likely to smoke, than non-seafood consumers. Sex and BMI did not appear to have a directional association with seafood consumption. The most commonly reported barriers to seafood consumption were cost, followed by sensory or physical barriers, health and nutritional beliefs, habits, availability and cooking skills. The most commonly reported influences were beliefs about the contribution of seafood to health, environmental influences and personal preferences. Based on the findings of this review, future intervention strategies to increase seafood consumption may need to consider affordability and education in terms of health, nutrition and cooking skills. More research is needed to explore the effectiveness of specific interventions at increasing seafood consumption.
Family coaggregation of attention-deficit/hyperactivity disorder (ADHD), autism spectrum disorder (ASD), bipolar disorder (BD), major depressive disorder (MDD) and schizophrenia has been reported in previous studies, but the shared genetic and environmental factors among these psychiatric disorders remain elusive.
This nationwide population-based study examined familial coaggregation of major psychiatric disorders in first-degree relatives (FDRs) of individuals with ASD. Taiwan's National Health Insurance Research Database was used to identify 26 667 individuals with ASD and 67 998 FDRs of individuals with ASD. The cohort was matched in 1:4 ratio to 271 992 controls. The relative risks (RRs) and 95% confidence intervals (CI) of ADHD, ASD, BD, MDD and schizophrenia were assessed among FDRs of individuals with ASD and ASD with intellectual disability (ASD-ID).
FDRs of individuals with ASD had higher RRs of major psychiatric disorders compared with controls: ASD 17.46 (CI 15.50–19.67), ADHD 3.94 (CI 3.72–4.17), schizophrenia 3.05 (CI 2.74–3.40), BD 2.22 (CI 1.98–2.48) and MDD 1.88 (CI 1.76–2.00). Higher RRs of schizophrenia (4.47, CI 3.95–5.06) and ASD (18.54, CI 16.18–21.23) were observed in FDRs of individuals with ASD-ID compared with ASD only.
The risk for major psychiatric disorders was consistently elevated across all types of FDRs of individuals with ASD. FDRs of individuals with ASD-ID are at even higher risk for ASD and schizophrenia. Our results provide leads for future investigation of shared etiologic pathways of ASD, ID and major psychiatric disorders, and highlight the importance of mental health care delivered to at-risk families for early diagnosis and intervention.
Pollen-mediated gene flow (PMGF) refers to the transfer of genetic information (alleles) from one plant to another compatible plant. With the evolution of herbicide-resistant (HR) weeds, PMGF plays an important role in the transfer of resistance alleles from HR to susceptible weeds; however, little attention has been given to this topic. The objective of this work was to review reproductive biology, PMGF studies and interspecific hybridization, as well as the potential for herbicide resistance alleles to transfer, in economically important broadleaf weeds, including common lambsquarters, giant ragweed, horseweed, kochia, Palmer amaranth, and waterhemp. The PMGF studies involving these species reveal that transfer of herbicide resistance alleles routinely occurs under field conditions and is influenced by several factors, such as reproductive biology, environment, and production practices. Interspecific hybridization studies within Amaranthus and Ambrosia spp. show that herbicide resistance allele transfer is possible between species of the same genus, but at relatively low levels. The widespread occurrence of HR weed populations and their high genetic diversity are at least partly due to PMGF, particularly in dioecious species such as Palmer amaranth and waterhemp compared with monoecious species such as common lambsquarters and horseweed. Prolific pollen production in giant ragweed also contributes to PMGF. Kochia, a wind-pollinated species, can efficiently disseminate herbicide resistance alleles via both PMGF and tumbleweed seed dispersal, resulting in the widespread occurrence of multiple HR kochia populations. The findings of this review verify that intra- and interspecific gene flow can occur and, even at a low rate, could contribute to the rapid spread of herbicide resistance alleles. More research is needed to determine the role of PMGF in transferring multiple herbicide resistance alleles at the landscape level.
The age-adjusted rate of suicide death in the USA has increased significantly since 2000 and little is known about national trends in non-fatal suicidal behaviors (ideation, plan, and attempt) among adults and their associated sociodemographic and clinical characteristics. This study examined trends in non-fatal suicidal behaviors among adults in the USA.
Data were obtained from adults 18–65 years of age who participated in the National Survey on Drug Use and Health (NSDUH), including its mental health assessment, from 2009 to 2017 (n = 335 359). Trends were examined using logistic regression models with interaction terms.
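As a sketch of this trend-analysis approach (hypothetical file and variable names; NSDUH analyses would also require survey weights and design adjustment, omitted here): regressing each behavior on survey year gives the average annual change in odds, and year-by-subgroup interaction terms test whether trends differ across subgroups.

```python
# Sketch: linear time trend in suicide plan, plus a year-by-sex interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nsduh_2009_2017.csv")  # hypothetical pooled file
df["year_c"] = df["year"] - 2009         # center year at baseline

trend = smf.logit("plan ~ year_c + age_group + sex", data=df).fit(disp=False)
# exp(coef) of about 1.03 would correspond to a 3% average annual increase in odds.
print(np.exp(trend.params["year_c"]))

# Interaction: does the trend differ between males and females?
inter = smf.logit("plan ~ year_c * sex + age_group", data=df).fit(disp=False)
print(inter.summary())
```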
Suicidal ideation fluctuated from 2009 to 2017, whereas suicide plan and attempt showed significant positive linear trends, with the odds increasing by an average of 3% and 4%, respectively. Suicide plan increased the most for females and adults ages 18–34, and attempt increased the most for adults with drug dependence. Both plan and attempt increased the most among adults who either had mental illness but were not in treatment or had no mental illness.
Given that attempted suicide is the strongest known risk factor for suicide death, reducing non-fatal suicidal behaviors, including attempts, is an important public health and clinical goal. The interaction findings for age, sex, mental health status, and drug dependence point toward the importance of tailoring prevention efforts to sociodemographic and clinical factors.
With the development of evidence-based interventions for treatment of priority mental health conditions in humanitarian settings, it is important to establish the cost-effectiveness of such interventions to enable their scale-up.
To evaluate the cost-effectiveness of the Problem Management Plus (PM+) intervention compared with enhanced usual care (EUC) for common mental disorders in primary healthcare in Peshawar, Pakistan. Trial registration ACTRN12614001235695 (anzctr.org.au).
We randomly allocated 346 participants to either PM+ (n = 172) or EUC (n = 174). Effectiveness was measured using the Hospital Anxiety and Depression Scale (HADS) at 3 months post-intervention. Cost-effectiveness analysis was performed as incremental costs (measured in Pakistani rupees, PKR) per unit change in anxiety, depression and functioning scores.
The total cost of delivering PM+ per participant was estimated at PKR 16 967 (US$163.14) using an international trainer and supervisor, and PKR 3645 (US$35.04) employing a local trainer. The mean cost per unit score improvement in anxiety and depression symptoms on the HADS was PKR 2957 (95% CI 2262–4029) (US$28) with an international trainer/supervisor and PKR 588 (95% CI 434–820) (US$6) with a local trainer/supervisor. The mean incremental cost-effectiveness ratio (ICER) to successfully treat a case of depression (PHQ-9 ≥ 10) using an international supervisor was PKR 53 770 (95% CI 39 394–77 399) (US$517), compared with PKR 10 705 (95% CI 7731–15 627) (US$102.93) using a local supervisor.
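For reference, the ICERs quoted above follow the standard definition (our restatement):

\[ \mathrm{ICER} = \frac{C_{\mathrm{PM+}} - C_{\mathrm{EUC}}}{E_{\mathrm{PM+}} - E_{\mathrm{EUC}}} \]

where C is the mean cost per participant in each arm and E is the effect measure (here, unit change in HADS score, or cases of depression successfully treated).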
The PM+ intervention was more effective but also more costly than EUC in reducing symptoms of anxiety and depression and in improving functioning among adults impaired by psychological distress in a post-conflict setting in Pakistan.
Decontamination of N95 respirators is being used by clinicians in the face of a global shortage of these devices. Some decontamination treatments, such as certain vaporized hydrogen peroxide and ultraviolet methods, had no impact on respirator performance, whereas other treatments resulted in substantial damage to masks.
Both 1- and 2-hour rapid diagnostic algorithms using high-sensitivity troponin (hs-cTn) have been validated to diagnose acute myocardial infarction (MI), leaving physicians uncertain which algorithm is preferable. The objective of this study was to prospectively evaluate the diagnostic performance of 1- and 2-hour algorithms in clinical practice in a Canadian emergency department (ED).
ED patients with chest pain had high-sensitivity cardiac troponin-T (hs-cTnT) collected on presentation and 1- and 2-hours later at a single academic centre over a 2-year period. The primary outcome was index MI, and the secondary outcome was 30-day major adverse cardiac events (MACE). All outcomes were adjudicated.
We enrolled 608 patients undergoing serial hs-cTnT sampling; of these, 350 had a valid 1-hour sample and 550 had a valid 2-hour sample. The prevalence of index MI was approximately 12% and of 30-day MACE 14%. Sensitivity of the 1- and 2-hour algorithms was similar for index MI (97.3% [95% CI: 85.8–99.9%] vs 100% [95% CI: 91.6–100%]) and for 30-day MACE (80.9% [95% CI: 66.7–90.9%] vs 83.3% [95% CI: 73.2–90.8%]). Both algorithms accurately identified about 10% of patients as high risk.
Both algorithms were able to classify almost two-thirds of patients as low risk, effectively ruling out MI and conferring a low risk of 30-day MACE for this group, while reliably identifying high-risk patients. While both algorithms had equivalent diagnostic performance, the 2-hour algorithm offers several practical advantages, which may make it preferable to implement. Broad implementation of similar algorithms across Canada can expedite patient disposition and lead to resource savings.
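The exact cut-offs are assay- and algorithm-specific and are not reported in this abstract; purely as an illustration of how a 0/1-hour rule-out/rule-in algorithm classifies patients, the sketch below uses values resembling published ESC 0/1-hour hs-cTnT thresholds, which should be read as assumptions rather than the study's algorithm:

```python
# Illustrative 0/1-hour hs-cTnT classification (thresholds are assumptions,
# resembling published ESC 0/1-h cutoffs in ng/L; not for clinical use).
def classify_0_1h(t0, t1):
    delta = abs(t1 - t0)
    if t0 < 5 or (t0 < 12 and delta < 3):
        return "rule-out"   # low risk: MI effectively excluded
    if t0 >= 52 or delta >= 5:
        return "rule-in"    # high risk: proceed to urgent workup
    return "observe"        # intermediate: serial sampling / further testing

print(classify_0_1h(t0=4, t1=5))    # rule-out
print(classify_0_1h(t0=60, t1=64))  # rule-in
print(classify_0_1h(t0=20, t1=22))  # observe
```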
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide a full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of <1 μs in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio-frequency interference mitigation, as is required when analysing raw data files.
Introduction: There is ongoing concern about the burden placed on healthcare systems by lab tests. Although these concerns are widespread, it is difficult to quantify the extent of the problem. One approach involves use of a metric known as the Mean Abnormal Response Rate (MARR), which is the proportion of tests ordered that return an abnormal result; a higher MARR value indicates higher yield. The primary objective of this study was to calculate MARRs for tests ordered between April 2014 and March 2019 at the four adult emergency departments (EDs) covering a metropolitan population of 1.3 million. Secondary objectives included identifying tests with highest and lowest MARRs; comparison of MARRs for nurse- and physician-initiated orders; correlation of the number of tests per order requisition to MARR; and correlation of physician experience to MARR. Methods: In total, 40 laboratory tests met inclusion criteria for this study. Administrative data on these tests as ordered at the four EDs were obtained and analyzed. Multi-component test results, such as from CBC, were consolidated such that an abnormal result for any component was coded as an abnormal result for the entire test. Repeat tests ordered within a single patient visit were excluded. Physician experience was quantified for 209 ED physicians as number of years since licensure. Analyses were descriptive where appropriate for whole-population data. Risk of bias was attenuated by the focus on administrative data. Results: The population dataset comprised 33,757,004 test results on 415,665 unique patients. Of these results, 30.3% were the outcomes of nurse-initiated orders. The 5-year MARRs for the four hospitals were 38.3%, 40.0%, 40.7% and 40.9%. The highest per-test MARRs were for BNP (80.5%) and CBC (62.6%), while the lowest were for glucose (7.9%) and sodium (11.6%). MARRs were higher for nurse-initiated orders than for physician-initiated orders (44.7% vs. 38.1%), likely due to the greater order frequency of high-yield CBC in nurse-initiated orders (38.6% vs. 18.1%). The number of tests per order requisition was inversely associated with MARR (r = -0.90, p < 0.001). Finally, the number of years since licensure was modestly but significantly associated with MARR (r = 0.28, p < 0.001). Conclusion: This is the first and largest study to apply the MARR in an ED setting. As a metric, MARR effectively identifies differences in test ordering practices on per-test and per-hospital bases, which could be useful for data-informed practice optimization.
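A minimal sketch of the MARR computation as described in the methods (hypothetical file and column names): consolidate multi-component panels so that any abnormal component marks the whole test abnormal, drop repeat orders within a visit, then take the abnormal proportion per test.

```python
# Sketch: Mean Abnormal Response Rate (MARR) from administrative lab data.
import pandas as pd

results = pd.read_csv("ed_lab_results.csv")  # hypothetical: one row per component

# 1) Consolidate multi-component tests (e.g., CBC): an abnormal result for
#    any component counts as an abnormal result for the whole test.
per_order = (results.groupby(["visit_id", "order_time", "test_name"])["abnormal"]
                    .max().reset_index())

# 2) Exclude repeat tests within a single visit: keep the earliest order.
per_visit = (per_order.sort_values("order_time")
                      .drop_duplicates(["visit_id", "test_name"]))

# 3) MARR = proportion of tests returning an abnormal result.
print(per_visit.groupby("test_name")["abnormal"].mean())  # per-test MARR
print(per_visit["abnormal"].mean())                       # overall MARR
```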