A new fossil site in a previously unexplored part of western Madagascar (the Beanka Protected Area) has yielded remains of many recently extinct vertebrates, including giant lemurs (Babakotia radofilai, Palaeopropithecus kelyus, Pachylemur sp., and Archaeolemur edwardsi), carnivores (Cryptoprocta spelea), the aardvark-like Plesiorycteropus sp., and giant ground cuckoos (Coua). Many of these represent considerable range extensions. Extant species that were extirpated from the region (e.g., Prolemur simus) are also present. Calibrated radiocarbon ages for 10 bones from extinct primates span the last three millennia. The largely undisturbed taphonomy of bone deposits supports the interpretation that many specimens fell in from a rock ledge above the entrance. Some primates and other mammals may have been prey items of avian predators, but human predation is also evident. Strontium isotope ratios (87Sr/86Sr) suggest that fossils were local to the area. Pottery sherds and bones of extinct and extant vertebrates with cut and chop marks indicate human activity in previous centuries. Scarcity of charcoal and human artifacts suggests only occasional visitation to the site by humans. The fossil assemblage from this site is unusual in that, while it contains many sloth lemurs, it lacks ratites, hippopotami, and crocodiles typical of nearly all other Holocene subfossil sites on Madagascar.
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics, including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the superiority of the Drug-Shock strategy (P = 0.04).
There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
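The trial's headline result is an absolute difference in conversion proportions with a 95% CI. As a rough cross-check, the arithmetic can be sketched with a simple Wald interval. The event counts below are back-calculated from the reported percentages (97.0% of 198 and 92.2% of 180) rather than taken from the trial data set, and the authors likely used a different interval method, since the Wald bounds come out slightly different from the reported 0.2-9.9%.

```python
from math import sqrt

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Absolute risk difference p1 - p2 with a Wald (normal-approximation) 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Counts inferred from the abstract's percentages, not the study's raw data.
diff, lo, hi = risk_difference_ci(192, 198, 166, 180)
```

For proportions this close to 1, a Newcombe or score-based interval is generally preferred over Wald; the sketch only illustrates the order of magnitude of the reported difference.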
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke-Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/−10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients with mean age 68.4 +/− 14.7 years and 52.4% female, of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%, iLR 0.20 [95% CI 0.091-0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68-0.92]); High (probability 2.6%, iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Moderate iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk for stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
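The interval likelihood ratios reported above have a simple definition: the proportion of outcome patients falling in a given risk stratum divided by the proportion of non-outcome patients in that stratum. A minimal sketch, using purely illustrative stratum counts (the abstract gives the totals of 181 outcomes among 7,569 patients, but not the per-stratum breakdown):

```python
def interval_lr(cases_in_stratum, total_cases, noncases_in_stratum, total_noncases):
    """Interval likelihood ratio: P(stratum | outcome) / P(stratum | no outcome)."""
    return (cases_in_stratum / total_cases) / (noncases_in_stratum / total_noncases)

# Hypothetical high-risk stratum counts chosen for illustration only:
# 100 of 181 outcome patients and 1,850 of 7,388 non-outcome patients.
ilr_high = interval_lr(100, 181, 1850, 7388)  # comes out near the reported high-risk iLR of 2.2
```

Unlike a dichotomized positive/negative LR, interval LRs are computed per stratum, which is why each risk band in the score gets its own value.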
Depression frequently co-occurs with disorders of glucose and insulin homeostasis (DGIH) and obesity. Low-grade systemic inflammation and lifestyle factors in childhood may predispose to DGIH, obesity and depression. We aim to investigate the cross-sectional and longitudinal associations among DGIH, obesity and depression, and to examine the effect of demographics, lifestyle factors and antecedent low-grade inflammation on such associations in young people.
Using the Avon Longitudinal Study of Parents and Children birth cohort, we used regression analyses to examine: (1) cross-sectional and (2) longitudinal associations between measures of DGIH [insulin resistance (IR); impaired glucose tolerance] and body mass index (BMI) at ages 9 and 18 years, and depression (depressive symptoms and depressive episode) at age 18 years and (3) whether sociodemographics, lifestyle factors or inflammation [interleukin-6 (IL-6) at age 9 years] confounded any such associations.
We included 3208 participants. At age 18 years, IR and BMI were positively associated with depression. These associations may be explained by sociodemographic and lifestyle factors. There were no longitudinal associations between DGIH/BMI and depression, and adjustment for IL-6 and C-reactive protein did not attenuate associations between IR/BMI and depression; however, the longitudinal analyses may have been underpowered.
Young people with depression show evidence of DGIH and raised BMI, which may be related to sociodemographic and lifestyle effects such as deprivation, smoking, ethnicity and gender. In future, studies with larger samples are required to confirm this. Preventative strategies for the poorer physical health outcomes associated with depression should focus on malleable lifestyle factors.
Toca 511 (vocimagene amiretrorepvec) is an investigational, conditionally lytic, retroviral replicating vector (RRV). RRVs selectively infect cancer cells due to innate and adaptive immune response defects in cancers that allow virus replication, and the requirement for cell division for virus integration into the genome. Toca 511 spreads through tumors, stably delivering an optimized yeast cytosine deaminase gene that converts the prodrug Toca FC (investigational, extended-release 5-FC) into 5-FU within the tumor microenvironment. 5-FU kills infected dividing cancer cells and surrounding tumor, myeloid-derived suppressor cells, and tumor-associated macrophages, resulting in long-term tumor immunity in preclinical models. Data from a Phase 1 resection trial showed six durable complete responses (CRs) and extended median overall survival (mOS) compared to historical controls. The FDA granted Breakthrough Therapy Designation for Toca 511 & Toca FC in the treatment of patients with recurrent high-grade glioma (rHGG). Toca 5 is an international, randomized, open-label Phase 3 trial (NCT02414165) of Toca 511 & Toca FC versus standard of care (SOC) in patients undergoing resection for first or second recurrence of rHGG. Patients will be stratified by IDH1 status, KPS, and geographic region. The primary endpoint is OS, and secondary endpoints are durable response rate, durable clinical benefit rate, duration of durable response, and 12-month survival rate. Key inclusion criteria are histologically proven glioblastoma (GBM) or anaplastic astrocytoma (AA), tumor size ≥1 cm and ≤5 cm, and KPS ≥70. Immune monitoring and molecular profiling will be performed. Approximately 380 patients will be randomized. An independent data monitoring committee (IDMC) is commissioned to review the safety and efficacy data, including 2 interim analyses. Enrollment is ongoing.
Introduction: The Ottawa SAH Rule was developed to identify patients at high risk for subarachnoid hemorrhage (SAH) who require investigations, and the 6-Hour CT Rule found that computed tomography (CT) was 100% sensitive for SAH within 6 hours of headache onset. Together, they form the Ottawa SAH Strategy. Our objectives were to assess: 1) the safety of the Ottawa SAH Strategy and 2) its impact on: a) CTs, b) LPs, c) ED length of stay, and d) CT angiography (CTA). Methods: We conducted a multicentre prospective before/after study at 6 tertiary-care EDs from January 2010 to December 2016 (implementation July 2013). Consecutive alert, neurologically intact adults with a headache peaking within one hour were included. SAH was defined by subarachnoid blood on head CT (radiologist's final report); xanthochromia in the cerebrospinal fluid (CSF); or >1 × 10⁶/L red blood cells in the final tube of CSF with an aneurysm on CTA. Results: We enrolled 3,669 patients, 1,743 before and 1,926 after implementation, including 185 with SAH. The investigation rate before implementation was 89.0% (range 82.9 to 95.6%) versus 88.4% (range 85.2 to 92.3%) after implementation. The proportion who had CT remained stable (88.0% versus 87.4%; p=0.60), while the proportion who had LP decreased from 38.9% to 25.9% (p<0.001), and the proportion investigated with CTA increased from 18.8% to 21.6% (p=0.036). The additional testing rate (i.e. LP or CTA) diminished from 50.1% to 40.8% (p<0.001). The proportion admitted declined from 9.8% to 7.3% (p=0.008), while the mean length of ED stay was stable (6.2 +/− 4.0 to 6.4 +/− 4.1 hours; p=0.45). For the 1,201 patients with CT within 6 hours, there was an absolute decrease in additional testing (i.e. LP or CTA) of 15.0% (46.6% versus 31.6%; p<0.001). The sensitivity of the Ottawa SAH Rule was 100% (95% CI: 98-100%), and that of the 6-Hour CT Rule was 95.3% (95% CI: 88.9-98.3%) for SAH.
Five patients with early CT had SAH with CT reported as normal: 2 with unruptured aneurysms on CTA and a presumed traumatic LP (determined by the treating neurosurgeon); 1 missed by the radiologist on the initial interpretation; 1 dural vein fistula (i.e. non-aneurysmal); and 1 profoundly anemic (Hgb 63 g/L). Conclusion: The Ottawa SAH Strategy is highly sensitive and can be used routinely when SAH is being considered in alert and neurologically intact headache patients. Its implementation was associated with a decrease in LPs and admissions to hospital.
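A sensitivity of 100% (95% CI 98-100%) on 185/185 detected cases is the kind of bound a Wilson score interval yields for a proportion at the boundary, where the naive Wald interval degenerates to zero width. Which method the authors actually used is an assumption here; the sketch simply shows that Wilson reproduces the reported lower bound:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion; well behaved near 0 and 1."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo_sens, hi_sens = wilson_ci(185, 185)  # 185/185 SAH cases flagged by the rule
```

The lower bound comes out at about 0.98, matching the abstract's "95% CI: 98-100%".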
Introduction: Carotid artery stenosis (CAS) is a common cause of stroke. Patients with severe, symptomatic CAS can have their subsequent stroke risk reduced by carotid endarterectomy or stenting when completed soon after a TIA or non-disabling stroke. Patients presenting to a peripheral ED with TIA/stroke may require transfer to another hospital for imaging to rule out CAS. The purpose of this study was to determine the test characteristics of carotid artery POCUS in detecting greater than 50% stenosis in patients presenting with TIA/stroke. Methods: We conducted a prospective cohort study on a convenience sample of adult patients presenting to a tertiary care academic ED with TIA/stroke between June and October 2017. Carotid POCUS was performed by a trained medical student or a trained emergency physician. Our outcome measure, CAS >50%, was determined by the final radiology report of CTA imaging by a trained radiologist, blinded to our study. A blinded POCUS expert reviewed the carotid POCUS scans. We calculated the sensitivity and specificity for CAS >50% using carotid POCUS versus the gold standard of CTA. Results: We enrolled 75 patients, of whom 5 did not meet inclusion criteria. The mean age was 70.4 years and 57% were male; 16% were diagnosed with greater than 50% CAS, 47% were stroke codes, and 37% were admitted to hospital. Carotid POCUS had a sensitivity and specificity of 72% (46%-99%) and 88% (80%-96%), respectively. There were three false negatives, of which two were exactly 50% ICA stenosis on CTA and the other was 100% occlusion of the distal ICA. The kappa coefficient for inter-rater reliability between standard and expert interpretation was 0.68, indicating moderate agreement. The scan took a mean time of 6.2 minutes to complete. Conclusion: Carotid POCUS has moderate correlation with CTA for detection of CAS greater than 50%. Carotid POCUS identified all the critical 70-99% stenosis lesions that would need urgent surgery.
Further research is needed to confirm these findings.
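The sensitivity and specificity above follow from an ordinary 2×2 diagnostic table. The cell counts in this sketch are reconstructed from figures in the abstract (70 analyzed patients, 16% with >50% CAS giving 11 disease-positive, and 3 reported false negatives), so they are an inference rather than the study's published table:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from 2x2 diagnostic-table cell counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Reconstructed (not published) cell counts: 8 TP, 3 FN, 52 TN, 7 FP.
sens, spec = sens_spec(tp=8, fn=3, tn=52, fp=7)  # ~72% and ~88%
```

With only 11 disease-positive patients, the wide sensitivity CI reported (46%-99%) is exactly what such a small positive count produces.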
Following a cluster of serious pseudomonas skin infections linked to a body piercing and tattooing premises, a look-back exercise was carried out to offer clients a screen for blood-borne viruses. Of those attending for screening 72% (581/809) had a piercing procedure in the premises of interest: 94 (16%) were under 16 years of age at the time of screening. The most common site of piercing was ear (34%), followed by nose (27%), nipple (21%) and navel (21%). A small number (<5) tested positive for hepatitis B and C, with no evidence this was linked to the premises. However, 36% (211/581) of clients reported a skin infection associated with their piercing. Using data from client forms, 36% provided a false age. Those aged under 16 years (OR 4.5, 95% CI 2.7–7.7) and those receiving a piercing at an intimate site (OR 2.1, 95% CI 1.3–3.6) were more likely to provide a false age. The findings from this exercise were used to support the drafting of the Public Health (Wales) Bill which proposed better regulation of piercing premises and the need to provide proof of being 18 years of age or over before having a piercing of an intimate site.
Current commercial poultry production in the UK faces many challenges which make it difficult to confidently predict the future. Changing legislation, responding largely to welfare pressures, is one such challenge. Additionally, consumer demands are widening. Eggs and meat from stock which is organically produced, or fed on rations containing no genetically modified ingredients, or free-range produced, or corn fed are some of the assurances sought by the purchaser and consumer. Although the marketplace already offers such produce, it is difficult to predict the extent to which it will penetrate a market which developed largely through the use of intensive production systems. The alternatives to intensively produced eggs and meat are more expensive to produce, and therefore to purchase, and are consequently susceptible to changes in standards of living and the affluence of the consumer.
This paper briefly describes current commercial practices and some of the specific challenges arising from new legislation.
A description of some specific requirements of birds highlights areas where improvements, in terms of performance, production efficiency, and welfare might be gained. Since the overwhelming majority of eggs and meat is produced in intensive, highly automated systems, there is an obvious need for an integrated approach featuring engineers and the poultry industry to refine and further develop technology which better serves the birds, and ultimately, the consumer.
It is concluded that the UK cannot compete in production costs with some other areas of the world and as retailers increasingly source their goods worldwide, the UK poultry producer may have to resort to the production of products which satisfy niche demands.
The objective of this panel was to generate recommendations to promote the engagement of front-line emergency department (ED) clinicians in clinical and implementation research.
Panel members conducted semi-structured interviews with 37 Canadian adult and pediatric emergency medicine researchers to elicit barriers and facilitators to clinician engagement in research activities, and to glean strategies for promoting clinician engagement.
Responses were organized by themes, and, based on these responses, recommendations were developed and refined in an iterative fashion by panel members.
We offer eight recommendations to promote front-line clinician engagement in clinical research activities. Recommendations to promote clinician engagement specifically address the creation of a research-friendly culture in the ED, minimizing the burden of data collection on clinical staff through the careful design of data collection tools and the use of research staff, and communication between researchers and clinical staff to promote adherence to study protocols.
The objective of Panel 2b was to present an overview of and recommendations for the conduct of implementation trials and multicentre studies in emergency medicine.
Panel members engaged methodologists to discuss the design and conduct of implementation and multicentre studies. We also conducted semi-structured interviews with 37 Canadian adult and pediatric emergency medicine researchers to elicit barriers and facilitators to conducting these kinds of studies.
Responses were organized by themes, and, based on these responses, recommendations were developed and refined in an iterative fashion by panel members.
We offer eight recommendations to facilitate multicentre clinical and implementation studies, along with guidance for conducting implementation research in the emergency department. Recommendations for multicentre studies reflect the importance of local study investigators and champions, requirements for research infrastructure and staffing, and the cooperation and communication between the coordinating centre and participating sites.
Pigs housed under artificial lighting currently experience a wide range of illuminances and photoperiods, which may be more appropriate to the visual capabilities and needs of stockpersons rather than pigs. Pigs and wild boar can show nocturnal, diurnal and crepuscular activity patterns, suggesting that their visual system may function well under a wide range of light levels, unlike humans. Inappropriate lighting affects many aspects of an animal’s physiology, anatomy and behaviour and may compromise welfare. This experiment was designed to investigate the preference of juvenile pigs to occupy and conduct certain behaviours in different illuminances, and gain some indication of their preferred photoperiod.
Rates of opioid-related deaths have reached the level of national public health crisis in Canada. Community-based opioid overdose education and naloxone distribution (OEND) programs distribute naloxone to people at risk, and the emergency department (ED) may be an underutilized setting to deliver naloxone to these people. The goal of this study was to identify Canadian emergency physicians’ attitudes and perceived barriers to the implementation of take-home naloxone programs.
This was an anonymous Web-based survey of members of the Canadian Association of Emergency Physicians. Survey questions were developed by the research team and piloted for face validity and clarity. Two reminder emails were sent to non-responders at 2-week intervals. Respondent demographics were collected, and Likert scales were used to assess attitudes and barriers to the prescription of naloxone from the ED.
A total of 459 physicians responded. The majority of respondents were male (64%), worked in urban tertiary centres (58.3%), and lived in Ontario (50.6%). Overall, attitudes to OEND were strongly positive; 86% identified a willingness to prescribe naloxone from the ED. Perceived barriers included support for patient education (57%), access to follow-up (44%), and inadequate time (37%). In addition to people at risk of overdose, 77% of respondents identified that friends and family members may also benefit.
Canadian emergency physicians are willing to distribute take-home naloxone, but thoughtful systems are required to facilitate opioid OEND implementation. These data will inform the development of these programs, with emphasis on multidisciplinary training and education.
Following release from the emergency department (ED) for acute heart failure (AHF), return visits to the ED represent important adverse health outcomes. The objective of this study was to document relapse events and the factors associated with return to the ED in the 14-day period following ED release for patients with AHF.
The primary outcome was the number of return visits to the ED, for any related medical problem, within 14 days of the initial visit among patients released by the ED after that initial visit.
Return visits to the ED occurred in 166 (20%) patients. Of all patients who returned to the ED within the 14-day period, 77 (47%) were secondarily admitted to the hospital. The following factors were associated with return visits to the ED: past medical history of percutaneous coronary intervention or coronary artery bypass graft (aOR = 1.51; 95% CI [1.01-2.24]), current use of antiarrhythmic medications (1.96 [1.05-3.55]), heart rate above 80/min (1.89 [1.28-2.80]), systolic blood pressure below 140 mm Hg (1.67 [1.14-2.47]), oxygen saturation (SaO2) above 96% (1.58 [1.08-2.31]), troponin above the upper reference limit of normal (1.68 [1.15-2.45]), and chest X-ray with pleural effusion (1.52 [1.04-2.23]).
Many heart failure patients (1 in 5) are released from the ED and subsequently return to the ED. Patients with multiple medical comorbidities and those with abnormal initial vital signs are at increased risk of a return visit and should be identified.
Accurate models of X-ray absorption and re-emission in partly stripped ions are necessary to calculate the structure of stars, the performance of hohlraums for inertial confinement fusion and many other systems in high-energy-density plasma physics. Despite theoretical progress, a persistent discrepancy exists with recent experiments at the Sandia Z facility studying iron in conditions characteristic of the solar radiative–convective transition region. The increased iron opacity measured at Z could help resolve a longstanding issue with the standard solar model, but requires a radical departure for opacity theory. To replicate the Z measurements, an opacity experiment has been designed for the National Ignition Facility (NIF). The design uses established techniques scaled to NIF. A laser-heated hohlraum will produce X-ray-heated uniform iron plasmas in local thermodynamic equilibrium (LTE) at temperatures of … eV and electron densities of … . The iron will be probed using continuum X-rays emitted in a …-diameter source from a 2 mm diameter polystyrene (CH) capsule implosion. In this design, … of the NIF beams deliver 500 kJ to the … mm diameter hohlraum, and the remaining … directly drive the CH capsule with 200 kJ. Calculations indicate this capsule backlighter should outshine the iron sample, delivering a point-projection transmission opacity measurement to a time-integrated X-ray spectrometer viewing down the hohlraum axis. Preliminary experiments to develop the backlighter and hohlraum are underway, informing simulated measurements to guide the final design.
Experiments on the National Ignition Facility show that multi-dimensional effects currently dominate implosion performance. Low-mode implosion asymmetry and hydrodynamic instabilities seeded by capsule mounting features appear to be two key limiting factors for implosion performance. One reason these factors have a large impact on the performance of inertial confinement fusion implosions is the high convergence required to achieve high fusion gains. To tackle these problems, a predictable implosion platform is needed, meaning experiments must trade off high gain for predictable performance. LANL has adopted three main approaches to develop a one-dimensional (1D) implosion platform, where 1D performance means the measured yield relative to the clean 1D calculation. A high-adiabat, low-convergence platform is being developed using beryllium capsules, enabling larger case-to-capsule ratios to improve symmetry. The second approach uses liquid fuel layers in wetted-foam targets; with liquid fuel layers, the implosion convergence can be controlled via the initial vapor pressure, which is set by the target fielding temperature. The last method is double-shell targets. For double shells, the smaller inner shell houses the DT fuel, and the convergence of this cavity is relatively small compared with hot-spot ignition. However, double-shell targets have a different set of trade-offs and advantages. Details of each of these approaches are described.
Genetically similar nulliparous Polled Hereford heifers from a closed pedigree herd were used to evaluate the effects of dietary protein during the first and second trimester of gestation upon foetal, placental and postnatal growth. Heifers were randomly allocated into two groups at 35 days after artificial insemination (35 days post conception (dpc)) to a single bull and fed high (15.7% CP) or low (5.9% CP) protein in the first trimester (T1). At 90 dpc, half of each nutritional treatment group changed to a high- or low-protein diet for the second trimester until 180 dpc (T2). High protein intake in the second trimester increased birth weight in females (P=0.05), but there was no effect of treatment upon birth weight when taken over both sexes. Biparietal diameter was significantly increased by high protein in the second trimester with the effect being greater in the female (P=0.02), but also significant overall (P=0.05). Placental weight was positively correlated with birth weight, fibroblast volume and relative blood vessel volume (P<0.05). Placental fibroblast density was increased and trophoblast volume decreased in the high-protein first trimester treatment group (P<0.05). There was a trend for placental weight to be increased by high protein in the second trimester (P=0.06). Calves from heifers fed the high-protein treatment in the second trimester weighed significantly more on all occasions preweaning (at 1 month (P=0.0004), 2 months (P=0.006), 3 months (P=0.002), 4 months (P=0.01), 5 months (P=0.03), 6 months (P=0.001)), and grew at a faster rate over the 6-month period. By 6 months of age, the calves from heifers fed high nutrition in the second trimester weighed 33 kg heavier than those fed the low diet in the second trimester. These results suggest that dietary protein in early pregnancy alters the development of the bovine placenta and calf growth to weaning.
In the dairy industry, excess dietary CP is consistently correlated with decreased conception rates. However, the source from which excess CP is derived and how it affects reproductive function in beef cattle is largely undefined. The objective of this experiment was to determine the effects of feeding excess metabolizable protein (MP) from feedstuffs differing in rumen degradability on ovulatory follicular dynamics, subsequent corpus luteum (CL) development, steroid hormone production and circulating amino acids (AA) in beef cows. Non-pregnant, non-lactating mature beef cows (n=18) were assigned to 1 of 2 isonitrogenous diets (150% of MP requirements) designed to maintain similar BW and body condition score (BCS) between treatments. Diets consisted of ad libitum corn stalks supplemented with corn gluten meal (moderate rumen undegradable protein (RUP); CGM) or soybean meal (low RUP; SBM). After a 20-day supplement adaptation period, cows were synchronized for ovulation. After 10 days of synchronization, gonadotropin releasing hormone (GnRH) was administered to reset ovarian follicular growth. Starting at GnRH administration and daily thereafter until spontaneous ovulation, transrectal ultrasonography was used to diagram ovarian follicular growth, and blood samples were collected for hormone, metabolite and AA analyses. After 7 days of visual detection of estrus, CL size was determined via ultrasound. Data were analyzed using the MIXED procedures of SAS. As designed, cow BW and BCS were not different (P⩾0.33). Ovulatory follicular wavelength, antral follicle count, ovulatory follicle size at dominance and duration of dominance were not different (P>0.13) between treatments. Cows supplemented with CGM had greater post-dominance ovulatory follicle growth, larger dominant follicles at spontaneous luteolysis, shorter proestrus, and larger ovulatory follicles (P⩽0.03) than SBM cows. 
No differences (P⩾0.44) in peak estradiol, ratio of estradiol to ovulatory follicle volume, or plasma urea nitrogen were observed. While CL volume and the ratio of progesterone to CL volume were not affected by treatment (P⩾0.24), CGM treated cows tended to have decreased (P=0.07) circulating progesterone 7 days post-estrus compared with SBM cows. Although total circulating plasma AA concentration did not differ (P=0.70) between treatments, CGM cows had greater phenylalanine (P=0.03) and tended to have greater leucine concentrations (P=0.07) than SBM cows. In summary, these data illustrate that excess MP when supplemented to cows consuming a low quality forage may differentially impact ovarian function depending on ruminal degradability of the protein source.