Control of barnyardgrass is becoming increasingly difficult as plants evolve resistance to herbicides. ROXY oxyfluorfen-resistant rice (ROXY® Rice Production System) has been developed to allow for an alternative mode of action to control barnyardgrass and other weeds. In 2021 and 2022, field trials were conducted at the Pine Tree Research Station near Colt, AR, the Northeast Research and Extension Center in Keiser, AR, and the University of Arkansas Pine Bluff Small Farm Research Center near Lonoke, AR, to determine the level of weed control and crop tolerance following oxyfluorfen applied preemergence or postemergence relative to herbicides currently labeled for use in rice. When applied post-plant preemergence on silt loam soil, oxyfluorfen alone at 1,120 and 1,680 g ai ha-1 resulted in barnyardgrass control comparable to clomazone alone at 336 g ha-1; however, injury to rice was often greater than with clomazone, ranging from 20% to 45%. On clay soil, oxyfluorfen at 1,680 g ha-1 resulted in barnyardgrass control comparable to clomazone alone in both site-years at three weeks after emergence but caused up to 18% injury to rice. When oxyfluorfen was applied at 560 to 1,680 g ha-1 at the 2-leaf rice growth stage, barnyardgrass control was ≥85% in three of four site-years one week after treatment. However, injury to rice ranged from 38% to 73% for the rates evaluated. Propanil caused the greatest injury among herbicides currently labeled for use in rice, at 34%. Oxyfluorfen should be used as a post-plant preemergence herbicide rather than a postemergence herbicide due to the injury observed after a postemergence application. These data indicate that, if used as a preemergence herbicide, oxyfluorfen should be applied at 560 g ha-1 to reduce the injury observed on silt loam and clay soils.
Pediatric patients transferred by Emergency Medical Services (EMS) from urgent care (UC) and office-based physician practices to the emergency department (ED) following activation of the 9-1-1 EMS system are an under-studied population with scarce literature regarding outcomes for these children. The objectives of this study were to describe this population, explore EMS level-of-care transport decisions, and examine ED outcomes.
This was a retrospective review of patients zero to <15 years of age transported by EMS from UC and office-based physician practices to the ED of two pediatric receiving centers from January 2017 through December 2019. Variables included reason for transfer, level of transport, EMS interventions and medications, ED medications/labs/imaging ordered in the first hour, ED procedures, ED disposition, and demographics. Data were analyzed with descriptive statistics, the chi-square test, point biserial correlation, the two-sample z test, the Mann-Whitney U test, and two-way ANOVA.
A total of 450 EMS transports were included in this study: 382 Advanced Life Support (ALS) runs and 68 Basic Life Support (BLS) runs. The median patient age was 2.66 years, 60.9% were male, and 60.7% had private insurance. Overall, 48.9% of patients were transported from an office-based physician practice and 25.1% were transported from UC. Almost one-half (48.7%) of ALS patients received an EMS intervention or medication, as did 4.41% of BLS patients. Respiratory distress was the most common reason for transport (46.9%). Supplemental oxygen was the most common EMS intervention and albuterol was the most administered EMS medication. There was no significant association between level of transport and ED disposition (P = .23). The in-patient admission rate for transported patients was significantly higher than the general ED admission rate (P <.001).
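The comparison of admission rates above rests on a standard test of proportions. As an illustration only, a two-sample z test for a difference in proportions can be sketched as follows; the counts below are hypothetical, since the abstract reports only percentages and p-values:

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided two-sample z test for a difference in proportions,
    using the pooled estimate under the null hypothesis."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value: 2 * P(Z > |z|) = erfc(|z| / sqrt(2))
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: 180 admissions among the 450 transported patients,
# versus a 12% admission rate across 10,000 general ED visits.
z, p = two_proportion_z_test(180, 450, 1200, 10000)
```

A difference of this size against a large reference denominator yields a p-value far below .001, consistent with the significance level reported above.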
This study demonstrates that pediatric patients transferred via EMS after activation of the 9-1-1 system from UC and medical offices are more acutely ill than the general pediatric ED population and are likely sicker than the general pediatric EMS population. Paramedics appear to be making appropriate level-of-care transport decisions.
A Health Equity Task Force (HETF) of members from seven Centers funded by the National Cancer Institute’s (NCI) Implementation Science in Cancer Control Centers (ISC3) network sought to identify case examples of how Centers were applying a focus on health equity in implementation science to inform future research and capacity-building efforts.
HETF members at each ISC3 collected information on how health equity was conceptualized, operationalized, and addressed in initial research and capacity-building efforts across the seven ISC3 Centers funded in 2019–2020. Each Center completed a questionnaire assessing five health equity domains central to implementation science (e.g., community engagement; implementation science theories, models, and frameworks (TMFs); and engaging underrepresented scholars). Data generated illustrative examples from these five domains.
Centers reported a range of approaches focusing on health equity in implementation research and capacity-building efforts, including (1) engaging diverse community partners/settings in making decisions about research priorities and projects; (2) applying health equity within a single TMF applied across projects or various TMFs used in specific projects; (3) evaluating health equity in operationalizing and measuring health and implementation outcomes; (4) building capacity for health equity-focused implementation science among trainees, early career scholars, and partnering organizations; and (5) leveraging varying levels of institutional resources and efforts to engage, include, and support underrepresented scholars.
Examples of approaches to integrating health equity across the ISC3 network can inform other investigators and centers’ efforts to build capacity and infrastructure to support growth and expansion of health equity-focused implementation science.
Pharmacogenomic testing has emerged to aid medication selection for patients with major depressive disorder (MDD) by identifying potential gene-drug interactions (GDI). Many pharmacogenomic tests are available with varying levels of supporting evidence, including direct-to-consumer and physician-ordered tests. We retrospectively evaluated the safety of using a physician-ordered combinatorial pharmacogenomic test (GeneSight) to guide medication selection for patients with MDD in a large, randomized, controlled trial (GUIDED).
Materials and Methods
Patients diagnosed with MDD who had an inadequate response to ≥1 psychotropic medication were randomized to treatment as usual (TAU) or combinatorial pharmacogenomic test-guided care (guided-care). All received combinatorial pharmacogenomic testing and medications were categorized by predicted GDI (no, moderate, or significant GDI). Patients and raters were blinded to study arm, and physicians were blinded to test results for patients in TAU, through week 8. Measures included adverse events (AEs, present/absent), worsening suicidal ideation (increase of ≥1 on the corresponding HAM-D17 question), and symptom worsening (HAM-D17 increase of ≥1). These measures were evaluated based on medication changes [add only, drop only, switch (add and drop), any, and none] and study arm, as well as baseline medication GDI.
Most patients had a medication change between baseline and week 8 (938/1,166; 80.5%), including 269 (23.1%) who added only, 80 (6.9%) who dropped only, and 589 (50.5%) who switched medications. In the full cohort, changing medications resulted in an increased relative risk (RR) of experiencing AEs at both week 4 and 8 [RR 2.00 (95% CI 1.41–2.83) and RR 2.25 (95% CI 1.39–3.65), respectively]. This was true regardless of arm, with no significant difference observed between guided-care and TAU, though the RRs for guided-care were lower than for TAU. Medication change was not associated with increased suicidal ideation or symptom worsening, regardless of study arm or type of medication change. Special attention was focused on patients who entered the study taking medications identified by pharmacogenomic testing as likely having significant GDI; those who were only taking medications subject to no or moderate GDI at week 8 were significantly less likely to experience AEs than those who were still taking at least one medication subject to significant GDI (RR 0.39, 95% CI 0.15–0.99, p=0.048). No other significant differences in risk were observed at week 8.
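The relative risks quoted above (e.g., RR 2.00, 95% CI 1.41–2.83) follow the standard construction of an RR with a Wald confidence interval on the log scale. A minimal sketch with hypothetical 2×2 counts (the underlying event counts are not given in the abstract):

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Relative risk of group A vs. group B, with a 95% Wald
    confidence interval computed on the log scale."""
    risk_a, risk_b = events_a / n_a, events_b / n_b
    rr = risk_a / risk_b
    # Standard error of log(RR)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    z = 1.96  # ~97.5th percentile of the standard normal
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts: 120 of 900 patients with a medication change had
# an AE, versus 15 of 225 patients with no change (RR = 2.0).
rr, lower, upper = relative_risk(120, 900, 15, 225)
```

An interval whose lower bound stays above 1.0, as in the RRs reported at weeks 4 and 8, indicates a statistically significant increase in risk.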
These data indicate that patient safety in the combinatorial pharmacogenomic test-guided care arm was no worse than TAU in the GUIDED trial. Moreover, combinatorial pharmacogenomic-guided medication selection may reduce some safety concerns. Collectively, these data demonstrate that combinatorial pharmacogenomic testing can be adopted safely into clinical practice without risking symptom degradation among patients.
Impaired facial emotion recognition is a transdiagnostic risk factor for a range of psychiatric disorders. Childhood behavioral difficulties and parental emotional environment have been independently associated with impaired emotion recognition; however, no study has examined the contribution of these factors in conjunction. We measured recognition of negative (sad, fear, anger), neutral, and happy facial expressions in 135 children aged 5–7 years referred by their teachers for behavioral problems. Parental emotional environment was assessed for parental expressed emotion (EE) – characterized by negative comments, reduced positive comments, low warmth, and negativity towards their child – using the 5-minute speech sample. Child behavioral problems were measured using the teacher-informant Strengths and Difficulties Questionnaire (SDQ). Child behavioral problems and parental EE were independently associated with impaired recognition of negative facial expressions specifically. An interactive effect revealed that the combination of both factors was associated with the greatest risk for impaired recognition of negative faces, and in particular sad facial expressions. No relationships emerged for the identification of happy facial expressions. This study furthers our understanding of multidimensional processes associated with the development of facial emotion recognition and supports the importance of early interventions that target this domain.
Background: Increasing Emergency Department (ED) stretcher occupancy with admitted patients at our tertiary care hospital has contributed to long Physician Initial Assessment (PIA) times. As of Oct 2019, median PIA was 2.3 hours and 90th percentile PIA was 5.3 hours, with a consequent 71/74 PIA ranking compared to all Ontario EDs. Ambulatory zone (AZ) models are more commonly used in community EDs compared to tertiary level EDs. An interdisciplinary team trialled an AZ model for five days in our ED to improve PIA times. Aim Statement: We sought to decrease the median PIA for patients in our ED during the AZ trial period as compared to days with similar occupancy and volume. Measures & Design: The AZ was reserved for patients who could walk from a chair to stretcher. In this zone, ED rooms with stretchers were for patient assessment only; when waiting for results or receiving treatment, patients were moved into chairs. We removed nursing assignment ratios to increase patient flow. Our outcome measure was the median PIA for all patients in our ED. Our balancing measure was the 90th percentile PIA, which could increase if we negatively impacted patients who require stretchers. The median and 90th percentile PIA during the AZ trial were compared to similar occupancy and volume days without the AZ. Additional measures included ED Length of Stay (LOS) for non-admitted patients, and patients who leave without being seen (LWBS). Clinicians and patients provided qualitative feedback through surveys. Evaluation/Results: The median PIA during the AZ trial was 1.5 hours, compared to 2.1 hours during control days. Our balancing measure, the 90th percentile PIA was 3.7 hours, compared to 5.0 during control days. A run chart revealed both median and 90th percentile PIA during the trial were at their lowest points over the past 18 months. The number of LWBS patients decreased during the trial; EDLOS did not change. 
The majority of patients, nurses, and physicians felt the trial could be implemented permanently. Discussion/Impact: Although our highly specialized tertiary care hospital faces unique challenges and high occupancy pressures, a community-hospital style AZ model was successful in improving PIA. Shorter PIA times can improve other quality metrics, such as timeliness of analgesia and antibiotics. We are working to optimize the model based on feedback before we cycle another trial. Our findings suggest that other tertiary care EDs should consider similar AZ models.
Background: Most emergency departments (ED) utilize medical directives to initiate lab investigations for patients prior to physician assessment. This practice facilitates expedited patient care in the ED, resulting in safer and more efficient care. However, some patients choose to leave the ED prior to seeing a physician due to prolonged waiting. Previously, at our hospital there was no defined process for identifying and following up on abnormal test results for patients who leave without being seen (LWBS), resulting in lab results often not being reviewed by a nurse or physician. Aim Statement: By April 2020, we aim to have 90% of ED LWBS patients with abnormal results identified and followed up. Measures & Design: A series of consultations and information gathering occurred that included an environmental scan of other EDs and discussions with emergency nurses, emergency physicians, Risk Management, Legal Department, College of Nurses of Ontario and Canadian Medical Protective Association. A process map was developed collaboratively to standardize the process to identify and follow up on abnormal investigations of LWBS patients and a new hospital policy was developed to officially outline this process. The family of measures was as follows: outcome measure – % of LWBS patients with abnormal tests that had follow-up documented in the chart; process measures – number of LWBS patients with investigations initiated by medical directive, number of LWBS patients, and % of LWBS patients; balancing measure – satisfaction of nurses with the new process for LWBS patients. Evaluation/Results: At baseline, 29% of LWBS patients with abnormal lab results had follow up documented in the chart. After implementation of the new standardized process and policy, the follow up rate of LWBS patients with abnormal results in August, September and October 2019 was 47%, 28% and 29% respectively.
Discussion/Impact: These results indicate that standardization and new policy implementation are insufficient to change practice, even for one that aims to provide safer patient care. Nevertheless, these interventions are important first steps toward improving safety for ED LWBS patients. We plan to implement an audit and feedback approach to encourage nursing staff to routinely check lab results for LWBS patients.
The Genomics Used to Improve DEpression Decisions (GUIDED) trial assessed outcomes associated with combinatorial pharmacogenomic (PGx) testing in patients with major depressive disorder (MDD). Analyses used the 17-item Hamilton Depression (HAM-D17) rating scale; however, studies demonstrate that the abbreviated, core depression symptom-focused HAM-D6 rating scale may have greater sensitivity in detecting differences between treatment and placebo. The sensitivity of the HAM-D6, however, has not been tested in trials comparing two active treatment arms. Here, we evaluated the sensitivity of the HAM-D6 scale, relative to the HAM-D17 scale, when assessing outcomes for actively treated patients in the GUIDED trial.
Outpatients (N=1,298) diagnosed with MDD and an inadequate treatment response to >1 psychotropic medication were randomized into treatment as usual (TAU) or combinatorial PGx-guided (guided-care) arms. Combinatorial PGx testing was performed on all patients, though test reports were only available to the guided-care arm. All patients and raters were blinded to study arm until after week 8. Medications on the combinatorial PGx test report were categorized based on the level of predicted gene-drug interactions: ‘use as directed’, ‘moderate gene-drug interactions’, or ‘significant gene-drug interactions.’ Patient outcomes were assessed by arm at week 8 using HAM-D6 and HAM-D17 rating scales, including symptom improvement (percent change in scale), response (≥50% decrease in scale), and remission (HAM-D6 ≤4 and HAM-D17 ≤7).
At week 8, the guided-care arm demonstrated statistically significant symptom improvement over TAU using the HAM-D6 scale (Δ=4.4%, p=0.023), but not using the HAM-D17 scale (Δ=3.2%, p=0.069). The response rate increased significantly for guided-care compared with TAU using both HAM-D6 (Δ=7.0%, p=0.004) and HAM-D17 (Δ=6.3%, p=0.007). Remission rates were also significantly greater for guided-care versus TAU using both scales (HAM-D6 Δ=4.6%, p=0.031; HAM-D17 Δ=5.5%, p=0.005). Patients taking medication(s) predicted to have gene-drug interactions at baseline showed further increased benefit over TAU at week 8 using HAM-D6 for symptom improvement (Δ=7.3%, p=0.004), response (Δ=10.0%, p=0.001), and remission (Δ=7.9%, p=0.005). Comparatively, the magnitude of the differences in outcomes between arms at week 8 was lower using HAM-D17 (symptom improvement Δ=5.0%, p=0.029; response Δ=8.0%, p=0.008; remission Δ=7.5%, p=0.003).
Combinatorial PGx-guided care achieved significantly better patient outcomes compared with TAU when assessed using the HAM-D6 scale. These findings suggest that the HAM-D6 scale is better suited than is the HAM-D17 for evaluating change in randomized, controlled trials comparing active treatment arms.
Major depressive disorder (MDD) is a leading cause of disease burden worldwide, with lifetime prevalence in the United States of 17%. Here we present the results of the first prospective, large-scale, patient- and rater-blind, randomized controlled trial evaluating the clinical importance of achieving congruence between combinatorial pharmacogenomic (PGx) testing and medication selection for MDD.
A total of 1,167 outpatients diagnosed with MDD and an inadequate response to ≥1 psychotropic medications were enrolled and randomized 1:1 to a Treatment as Usual (TAU) arm or PGx-guided care arm. Combinatorial PGx testing categorized medications in three groups based on the level of gene-drug interactions: use as directed, use with caution, or use with increased caution and more frequent monitoring. Patient assessments were performed at weeks 0 (baseline), 4, 8, 12 and 24. Patients, site raters, and central raters were blinded in both arms until after week 8. In the guided-care arm, physicians had access to the combinatorial PGx test result to guide medication selection. Primary outcomes utilized the Hamilton Depression Rating Scale (HAM-D17) and included symptom improvement (percent change in HAM-D17 from baseline), response (≥50% decrease in HAM-D17 from baseline), and remission (HAM-D17 ≤7) at the fully blinded week 8 time point. The durability of patient outcomes was assessed at week 24. Medications were considered congruent with PGx test results if they were in the ‘use as directed’ or ‘use with caution’ report categories, while medications in the ‘use with increased caution and more frequent monitoring’ category were considered incongruent. Patients who started on incongruent medications were analyzed separately according to whether they changed to congruent medications by week 8.
At week 8, symptom improvement for individuals in the guided-care arm was not significantly different than TAU (27.2% versus 24.4%, p=0.11). However, individuals in the guided-care arm were more likely than those in TAU to achieve remission (15% versus 10%; p<0.01) and response (26% versus 20%; p=0.01). Remission rates, response rates, and symptom reductions continued to improve in the guided-treatment arm through the 24-week time point. Congruent prescribing increased to 91% in the guided-care arm by week 8. Among patients who were taking one or more incongruent medications at baseline, those who changed to congruent medications by week 8 demonstrated significantly greater symptom improvement (p<0.01), response (p=0.04), and remission rates (p<0.01) compared to those who persisted on incongruent medications.
Combinatorial PGx testing improves short- and long-term response and remission rates for MDD compared to standard of care. In addition, prescribing congruency with PGx-guided medication recommendations is important for achieving symptom improvement, response, and remission for MDD patients.
Funding Acknowledgements: This study was supported by Assurex Health, Inc.
Current measures for major depressive disorder focus primarily on the assessment of depressive symptoms, while often omitting other common features. However, the presence of comorbid features in the anxiety spectrum influences outcome and may affect treatment. More comprehensive measures of depression are needed that include the assessment of symptoms in the anxiety–depression spectrum. This study examines the reliability and validity of the Symptoms of Depression Questionnaire (SDQ), which assesses irritability, anger attacks, and anxiety symptoms together with the commonly considered symptoms of depression. Analysis of the factor structure of the SDQ identified 5 subscales, including one in the anxiety–depression spectrum, with adequate internal consistency and concurrent validity. The SDQ may be a valuable new tool to better characterize depression and identify and administer more targeted interventions.
Efforts to respond to performance-based accountability mandates for public health emergency preparedness have been hindered by a weak evidence base linking preparedness activities with response outcomes. We describe an approach to measure development that was successfully implemented in the Centers for Disease Control and Prevention Public Health Emergency Preparedness Cooperative Agreement. The approach leverages insights from process mapping and experts to guide measure selection, and provides mechanisms for reducing performance-irrelevant variation in measurement data. Also, issues are identified that need to be addressed to advance the science of measurement in public health emergency preparedness.
Determining the rate of hoof horn growth in sheep is important for understanding the physiology and pathology of the foot and the impact of the environment and the treatment of diseased feet on foot health. It could lead to improved understanding of the interaction between hoof horn and pasture/barn floor characteristics and in methods for prevention and treatment of ovine foot diseases. In the current study, the hoof horn was measured using a previously tested protocol on all eight digits of 21 healthy yearling mule ewes on a farm in North Wales on four occasions over a period of 53 days. The mean hoof horn growth rate was 0·11 mm (s.e.m. 0·02) per day; the residual error variance was 0·024 and the R2 was 0·245. There were no significant differences between hoof horn growth rates in front and hind feet or between medial and lateral claws or over time.
Fourteen young wether sheep were fed freshly cut Lotus pedunculatus as a sole diet to examine the effects of condensed tannins (CT; 55 g/kg lotus DM) on nitrogenous aspects of digestion. The experiment was carried out indoors at Palmerston North, New Zealand over 32 days with one group of sheep receiving an intraruminal infusion of polyethylene glycol (PEG; 100 g/day) to preferentially bind CT (PEG group) so that the lotus was essentially ‘CT-free’. The other sheep, not given PEG, were termed the ‘Tannin’ group.
The principal effects of CT were to increase the flow of feed nitrogen (N) to the abomasum despite a 12% reduction in DM intake of the Tannin sheep. Rumen microbial N turnover rate was slower in Tannin animals than in those receiving PEG (1·86 v. 2·63/day) but microbial N flux to the abomasum was similar in both treatments. The proportion of N intake disappearing from the rumen was lower in Tannin (0·13) than in PEG sheep (0·26) and the N digestibility was 0·67 and 0·81 for the respective treatments (P < 0·001).
The beneficial effects of CT in reducing rumen degradation of feed protein were negated in part by a reduction in fractional absorption of amino acids (AA) from the small intestine. Fractional absorption of essential AA was 0·66 in Tannin and 0·79 in PEG sheep; values for non-essential AA were 0·59 in Tannin and 0·73 in PEG groups. Amino acid concentrations in blood were similar for both groups, but Tannin sheep had lower plasma urea concentrations, a more rapid plasma urea turnover rate and a higher irreversible loss than those receiving PEG. Growth hormone concentrations in plasma were similar for both treatments.
Lotus pedunculatus was grown under high fertility conditions and its nutritive value was determined in a feeding trial with sheep at Palmerston North, New Zealand in 1989. The condensed tannins (CT) accounted for 5·5% of lotus dry matter (DM) and their effect on digestion was evaluated by giving an intraruminal infusion of polyethylene glycol (PEG) to six of the sheep (PEG group). PEG preferentially binds with CT so that the lotus becomes essentially CT-free.
The experiment was carried out with 14 sheep (six PEG and eight ‘Tannin’) held in metabolism crates indoors and given freshly cut lotus hourly, for 32 days. This paper presents data relating to carbohydrate and mineral digestion, together with aspects of rumen function.
Digestibility of lotus DM was 68%, and the digestibility of fibre was not affected by CT. Infusion of PEG increased rumen concentrations of NH3 and volatile fatty acids (P < 0·001) but effects on molar ratios of VFA were inconsistent with time. CT reduced rumen degradation and absorption of sulphur and increased net absorption of both phosphorus and zinc, but other effects on mineral digestion were small.
Although the lotus was offered at c. 90% of ad libitum, intakes of the tannin sheep began to decline after c. 15 days of feeding and were c. 12% lower than those of the PEG sheep at the end of the trial (P < 0·05). At slaughter, rumen pool sizes were similar for the two treatments but the Tannin sheep had a lower fractional outflow rate, which suggests a slower rate of digestion in the rumen. Growth rate and wool production were similar for sheep on both treatments. It is concluded that the CT in Lotus pedunculatus grown under high fertility conditions had little effect on fibre and mineral digestion but the depression in DM intake reduced its nutritive value for sheep.
An experiment was conducted at Palmerston North, New Zealand, to determine the effect of condensed tannins (CT) on the true and apparent digestion of methionine and cysteine in the small intestine (SI) of sheep fed fresh Lotus corniculatus. The lotus contained c. 30 g total CT/kg dry matter (DM) and was fed hourly to sheep in metabolism crates. Four sheep were prepared with rumen and abomasal cannulae which enabled the indigestible liquid phase marker, chromium ethylene diamine tetra-acetic acid (Cr-EDTA), to be infused into the rumen to estimate digesta flow. True digestibility of plant methionine and cysteine in the SI and their site of absorption in the SI were determined from 35S-labelled L. corniculatus homogenate continuously infused into the abomasum. After 9 h infusion of the 35S-labelled lotus homogenate, the sheep were slaughtered and digesta samples were taken at intervals along the small and large intestines. The effect of CT was determined by comparing two control sheep (CT-acting) with two sheep given a continuous intraruminal infusion of polyethylene glycol (PEG, MW 3500) to bind and inactivate the CT.
The CT reduced the true digestibility of plant methionine (0·72 v. 0·88) and cysteine (0·65 v. 0·81) in the SI relative to sheep receiving PEG. Condensed tannins also appeared to alter the site of digestion of both [35S]methionine and [35S]cysteine in the SI, and increased the flux of both amino acids in the mid and latter thirds of the SI. CT did not affect the apparent digestibility of total methionine (0·82 v. 0·84) in the SI but reduced the apparent digestibility of total cysteine from 0·77 to 0·66. In control sheep, CT increased the abomasal flux (as a proportion of that eaten) of total digesta methionine (0·88 v. 0·76) and total digesta cysteine (0·74 v. 0·62). The apparent absorption of total methionine (plant + microbial + endogenous) was increased by the action of CT (0·72 v. 0·63 g/g eaten) but was similar for total cysteine (0·49 v. 0·48 g/g eaten) in both groups. It was concluded that CT reduced the true digestibility of plant methionine and cysteine in the SI. However, it was calculated that the action of CT actually increased the total amounts (g/g eaten) of plant methionine and cysteine absorbed from the SI, due to its effect in increasing abomasal flux.
A field experiment was conducted to assess the effect of competition between a leucaena hybrid and maize (Zea mays L.) when planted simultaneously in an alley cropping system. The leucaena hybrid (a cross between L. diversifolia and L. leucocephala) was planted at hedgerow spacings of 3 and 5.25 m, while maize was planted in rows 75 cm apart between the hedgerows. The spacing between the leucaena hedgerow and maize was varied by removal of 0, 1 or 2 rows of maize to give three spacing treatments of 37.5, 75 or 112.5 cm between leucaena and maize. A control plot of leucaena alone was also included in the treatments. The growth and yield of individual maize rows were virtually unaffected by the presence of leucaena, but maize had a significant influence on the growth and yield of leucaena. At full maize canopy development, photosynthetically active radiation reaching the leucaena was reduced in all treatments, resulting in a 75% yield reduction in leucaena at the closest spacing. Overall, maize grain yield reached 10.3 t ha−1 in the continuous maize plots (37.5 cm treatment). This was reduced by up to 40% after removal of two maize rows in the closest leucaena row spacing treatment. The implications of these results for the practical establishment of leucaena hedgerows with a maize crop are discussed.