The work of Ed Zigler spans decades of research, all singularly dedicated to using science to improve the lives of children facing different challenges. The focus of this article is on one of Zigler's numerous lines of work: advocating for the practice of mental age (MA) matching in empirical research, wherein groups of individuals are matched on the basis of developmental level rather than chronological age. While MA matching represented a paradigm shift that provided the seeds from which the developmental approach to developmental disability sprouted, it is not without its own limits. Here, we examine and test the assumption of linearity inherent in MA matching using three commonly used IQ measures. Results delineate practical constraints on the use of MA matching, which we hope will refine future clinical and empirical practices, furthering Zigler's legacy of continued commitment to compassionate, meaningful, and rigorous science in the service of children.
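The linearity assumption at issue can be made concrete. A minimal sketch of one way to probe it, on simulated data: fit linear and quadratic models of raw score against age and check whether the curvature term matters. The column names and data are illustrative, not the authors' procedure.

```python
# Sketch: does raw test score scale linearly with age? (Simulated data;
# a significant quadratic term would undermine simple MA matching, since a
# one-unit gap in mental age would not mean the same thing at every age.)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"age": rng.uniform(3, 12, 300)})
df["raw_score"] = 4 * df["age"] + 0.15 * df["age"] ** 2 + rng.normal(0, 2, 300)

linear = smf.ols("raw_score ~ age", data=df).fit()
quadratic = smf.ols("raw_score ~ age + I(age ** 2)", data=df).fit()
print(quadratic.params, quadratic.pvalues["I(age ** 2)"])
```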
Objective:
To investigate the timing and routes of contamination of the rooms of patients newly admitted to the hospital.
Design:
Observational cohort study and simulations of pathogen transfer.
Setting:
A Veterans’ Affairs hospital.
Patients:
Patients newly admitted to the hospital with no known carriage of healthcare-associated pathogens.
Methods:
Interactions between the participants and personnel or portable equipment were observed, and cultures of high-touch surfaces, floors, bedding, and patients’ socks and skin were collected for up to 4 days. Cultures were processed for Clostridioides difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Simulations were conducted with bacteriophage MS2 to assess the plausibility of transfer from contaminated floors to high-touch surfaces and to assess the effectiveness of wearing slippers in reducing transfer.
Results:
Environmental cultures became positive for at least 1 pathogen in 10 (59%) of the 17 rooms, with cultures positive for MRSA, C. difficile, and VRE in the rooms of 10 (59%), 2 (12%), and 2 (12%) participants, respectively. For all 14 instances of pathogen detection, the initial site of recovery was the floor, followed in a subset of patients by detection on sock bottoms, bedding, and high-touch surfaces. In simulations, wearing slippers over hospital socks dramatically reduced transfer of bacteriophage MS2 from the floor to hands and to high-touch surfaces.
Conclusions:
Floors may be an underappreciated source of pathogen dissemination in healthcare facilities. Simple interventions, such as having patients wear slippers, could potentially reduce the risk of transfer of pathogens from floors to hands and high-touch surfaces.
Background:
Reduction in the use of fluoroquinolone antibiotics has been associated with reductions in Clostridioides difficile infections (CDIs) due to fluoroquinolone-resistant strains.
Objective:
To determine whether facility-level fluoroquinolone use predicts healthcare facility-associated (HCFA) CDI due to fluoroquinolone-resistant 027 strains.
Methods:
Using a nationwide cohort of hospitalized patients in the Veterans’ Affairs Healthcare System, we identified hospitals that categorized >80% of CDI cases as positive or negative for the 027 strain for at least one-quarter of fiscal years 2011–2018. Within these facilities, we used visual summaries and multilevel logistic regression models to assess the association between facility-level fluoroquinolone use and rates of HCFA-CDI due to 027 strains, controlling for time and facility complexity level, and adjusting for correlated outcomes within facilities.
Results:
Between 2011 and 2018, 55 hospitals met criteria for reporting 027 results, including a total of 5,091 HCFA-CDI cases, with 1,017 infections (20.0%) due to 027 strains. Across these facilities, the use of fluoroquinolones decreased by 52% from 2011 to 2018, with concurrent reductions in the overall HCFA-CDI rate and the proportion of HCFA-CDI cases due to the 027 strain of 13% and 55%, respectively. A multilevel logistic model demonstrated a significant effect of facility-level fluoroquinolone use on the proportion of infections in the facility due to the 027 strain, most noticeably in low-complexity facilities.
Conclusions:
Our findings provide support for interventions to reduce the use of fluoroquinolones as a control measure for CDI, particularly in settings where fluoroquinolone use is high and fluoroquinolone-resistant strains are common causes of infection.
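The study's exact multilevel specification is not reproduced here. As one hedged illustration of the "correlated outcomes within facilities" adjustment, a GEE logistic regression with facility clustering can be sketched on simulated data; every column name and value below is hypothetical.

```python
# Sketch: facility-clustered logistic model relating fluoroquinolone use to
# the odds that an HCFA-CDI case is an 027 strain (GEE analogue of the
# multilevel model described above; simulated data, hypothetical names).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "facility": rng.integers(0, 55, n),    # 55 reporting hospitals
    "fq_use": rng.gamma(2.0, 40.0, n),     # proxy for FQ days of therapy
    "year": rng.integers(2011, 2019, n),
    "complexity": rng.integers(1, 4, n),   # facility complexity level
})
logit = -3 + 0.01 * df["fq_use"] - 0.1 * (df["year"] - 2011)
df["is_027"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.GEE.from_formula(
    "is_027 ~ fq_use + year + C(complexity)",
    groups="facility",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),  # within-facility correlation
)
print(model.fit().summary())
```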
Background:
Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
Objective:
To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
Methods:
In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
Setting:
Rooms of patients with CDI at University of Iowa Hospitals and Clinics.
Results:
Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
Conclusions:
Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
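The pre- versus post-flush comparison is a standard two-sample rank test. A minimal sketch using scipy's Wilcoxon rank-sum test, with made-up particle counts standing in for the study's measurements:

```python
# Sketch: Wilcoxon rank-sum test on pre- vs post-flush particle
# concentrations. All values are invented placeholders.
from scipy.stats import ranksums

pre_5um = [12, 8, 15, 9, 11, 7, 13, 10]      # hypothetical pre-flush counts
post_5um = [20, 18, 25, 16, 22, 19, 28, 21]  # hypothetical post-flush counts

stat, p = ranksums(pre_5um, post_5um)
print(f"rank-sum statistic = {stat:.3f}, p = {p:.4f}")
```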
This study examined the long-term effects of a randomized controlled trial of the Family Check-Up (FCU) intervention initiated at age 2 on inhibitory control in middle childhood and adolescent internalizing and externalizing problems. We hypothesized that the FCU would promote higher inhibitory control in middle childhood relative to the control group, which in turn would be associated with lower internalizing and externalizing symptomology at age 14. Participants were 731 families, with half (n = 367) of the families assigned to the FCU intervention. Using an intent-to-treat design, results indicate that the FCU intervention was indirectly associated with both lower internalizing and externalizing symptoms at age 14 via its effect on increased inhibitory control in middle childhood (i.e., ages 8.5–10.5). Findings highlight the potential for interventions initiated in toddlerhood to have long-term impacts on self-regulation processes, which can further reduce the risk for behavioral and emotional difficulties in adolescence.
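The indirect pathway reported above (intervention → inhibitory control → symptoms) is typically tested by bootstrapping the product of the two path coefficients. A simplified sketch on simulated data follows; variable names are hypothetical, and the b path here omits the covariate adjustment a full mediation model would include.

```python
# Sketch: bootstrap CI for an indirect (mediated) effect,
# intervention -> inhibitory control -> age-14 symptoms. Simulated data.
import numpy as np

rng = np.random.default_rng(2)
n = 731
fcu = rng.integers(0, 2, n)                      # 1 = assigned to FCU
inhib = 0.3 * fcu + rng.normal(0, 1, n)          # inhibitory control
symptoms = -0.4 * inhib + rng.normal(0, 1, n)    # age-14 symptoms

def indirect(idx):
    a = np.polyfit(fcu[idx], inhib[idx], 1)[0]       # intervention -> mediator
    b = np.polyfit(inhib[idx], symptoms[idx], 1)[0]  # mediator -> outcome
    return a * b

boot = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")  # excludes 0 -> mediation
```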
Objective:
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
Design:
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
Methods:
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time series and generalized linear mixed-effects multivariable analyses. These analyses were run separately for days to line removal from identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were evaluated.
Results:
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3 after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the multivariable regression of days to removal, lines with a CLISA score of 2 or 3 were removed 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
Conclusions:
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
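The interrupted time series in the methods above is commonly estimated as a segmented regression with level and trend terms plus an intervention indicator. A minimal sketch on simulated monthly rates (names hypothetical, not the study's model):

```python
# Sketch: segmented regression for an interrupted time series.
# 'post' captures the immediate level change at go-live; 't_post' captures
# any change in slope afterward. Simulated monthly data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
months = np.arange(40)
post = (months >= 10).astype(int)          # intervention begins at month 10
rate = 22 - 0.1 * months - 15 * post + rng.normal(0, 1.5, 40)
df = pd.DataFrame({"t": months, "post": post,
                   "t_post": (months - 10) * post, "rate": rate})

its = smf.ols("rate ~ t + post + t_post", data=df).fit()
print(its.params)
```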
Building on prior work using Tom Dishion's Family Check-Up, the current article examined intervention effects on dysregulated irritability in early childhood. Dysregulated irritability, defined as reactive and intense response to frustration and prolonged angry mood, is an ideal marker of neurodevelopmental vulnerability to later psychopathology because it is a transdiagnostic indicator of decrements in self-regulation that are measurable in the first years of life and have lifelong implications for health and disease. This study is perhaps the first randomized trial to examine the direct effects of an evidence- and family-based intervention, the Family Check-Up (FCU), on irritability in early childhood and the effects of reductions in irritability on later risk of child internalizing and externalizing symptomatology. Data from the geographically and sociodemographically diverse multisite Early Steps randomized prevention trial were used. Path modeling revealed intervention effects on irritability at age 4, which predicted lower externalizing and internalizing symptoms at age 10.5. Results indicate that family-based programs initiated in early childhood can reduce early childhood irritability and later risk for psychopathology. This holds promise for earlier identification and prevention approaches that target transdiagnostic pathways. Implications for future basic and prevention research are discussed.
Several research teams have previously traced patterns of emerging conduct problems (CP) from early or middle childhood. The current study expands on this literature by using a genetically informed, experimental, and long-term longitudinal design to examine trajectories of early-emerging conduct problems and early-childhood discriminators of such patterns from the toddler period to adolescence. The sample represents a diverse cohort of 731 toddlers and their families, recruited based on socioeconomic, child, and family risk, varying in urbanicity and assessed on nine occasions between ages 2 and 14. In addition to examining child-, family-, and community-level discriminators of patterns of emerging conduct problems, we were able to account for genetic susceptibility using polygenic scores and the study's experimental design to determine whether random assignment to the Family Check-Up (FCU) discriminated trajectory groups. In addition, in accord with differential susceptibility theory, we tested whether the effects of the FCU were stronger for children with higher genetic susceptibility. Results augmented previous findings documenting the influence of child (inhibitory control [IC], gender) and family (harsh parenting, parental depression, and educational attainment) risk. In addition, children in the FCU were overrepresented in the persistent-low versus persistent-high CP group, but such direct effects were qualified by an interaction between the intervention and genetic susceptibility that was consistent with differential susceptibility. Implications are discussed for early identification and, specifically, prevention efforts addressing early child and family risk.
The ALMA twenty-six arcmin² survey of GOODS-S at one millimeter (ASAGAO) is a deep (1σ ∼ 61 μJy beam−1) and wide-area (26 arcmin²) survey on a contiguous field at 1.2 mm. By combining with archival data, we obtained a deeper map in the same region (1σ ∼ 30 μJy beam−1, synthesized beam size 0.59″ × 0.53″), providing the largest sample of sources (25 sources at 5σ, 45 sources at 4.5σ) among ALMA blank-field surveys. The median redshift of the 4.5σ sources is 2.4. The number counts show that 52% of the extragalactic background light at 1.2 mm is resolved into discrete sources. We create IR luminosity functions (LFs) at z = 1–3, and constrain the faintest luminosity of the LF at 2 < z < 3. The LFs are consistent with previous results based on other ALMA and SCUBA-2 observations, which suggests a positive luminosity evolution and negative density evolution.
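The quoted resolved fraction comes from arithmetic of this kind: sum the fluxes of detected sources over the survey area and compare with the total extragalactic background light (EBL) intensity at 1.2 mm. The sketch below uses placeholder numbers only, not the survey's actual fluxes or assumed EBL level.

```python
# Sketch: fraction of the 1.2 mm EBL resolved into discrete sources.
# All numbers are illustrative placeholders.
import numpy as np

source_fluxes_mJy = np.array([0.9, 0.7, 0.6, 0.5, 0.4] * 9)  # 45 sources
survey_area_arcmin2 = 26.0
ebl_intensity_mJy_per_arcmin2 = 1.9   # assumed total EBL surface brightness

resolved = source_fluxes_mJy.sum() / survey_area_arcmin2
fraction = resolved / ebl_intensity_mJy_per_arcmin2
print(f"resolved fraction of the 1.2 mm EBL: {fraction:.0%}")
```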
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study located 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6 ± 2.9 to 3.8 ± 2.7 at 12 months (P < 0.001). VAS-AP improved from 5.8 ± 2.9 to 3.5 ± 3.0 at 12 months (P < 0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% with mild/no pain worsened to moderate/severe pain (P < 0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P < 0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
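MCID attainment is computed from change scores against the stated thresholds (2.6 for VAS-NP, 4.1 for VAS-AP). A minimal sketch with simulated scores matching the reported means/SDs; note the real analysis used paired within-patient change, whereas these draws are unpaired for brevity.

```python
# Sketch: change scores vs the VAS-NP MCID threshold. Simulated placeholders.
import numpy as np

rng = np.random.default_rng(4)
baseline_np = rng.normal(5.6, 2.9, 402).clip(0, 10)
month12_np = rng.normal(3.8, 2.7, 402).clip(0, 10)

change = baseline_np - month12_np      # positive = improvement
MCID_VAS_NP = 2.6
print(f"mean improvement: {change.mean():.2f}")
print(f"proportion reaching MCID: {(change >= MCID_VAS_NP).mean():.0%}")
```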
Introduction: Recently there have been many studies on the effectiveness of implementing LEAN principles to improve wait times in emergency departments (EDs), but relatively few on applying these concepts to length of stay (LOS) in the ED. This research aims to explore the initial feasibility of applying the LEAN model to length-of-stay metrics in an ED by identifying areas of non-value-added time for patients staying in the ED. Methods: In this project we used a sample of 10,000 ED visits at the Health Science Centre in St. John's over a 1-year period and compared patients’ LOS in the ED on four criteria: day of the week, hour of presentation, whether laboratory tests were ordered, and whether diagnostic imaging was ordered. Two sets of analyses were then performed. First, a two-sided Wilcoxon rank-sum test was used to evaluate whether ordering either lab tests or diagnostic imaging affected LOS. Second, a generalized linear model (GLM) was created using 10-fold cross-validation with a LASSO operator to analyze the effect size and significance of each of the four criteria on LOS. Additionally, a post-test analysis of the GLM was performed on a second sample of 10,000 ED visits from the same 1-year period to assess its predictive power and infer the degree to which a patient's LOS is determined by the four criteria. Results: In the Wilcoxon rank-sum test there was no significant difference in LOS for patients who were ordered diagnostic imaging compared to those who were not (p = 0.6998), but there was a statistically significant decrease in LOS for patients who were ordered lab tests compared to those who were not (p = 2.696 × 10−10). The GLM yielded two significant takeaways: ordering lab tests reduced LOS (95% CI = 42.953–68.173 min reduction), and arriving at the ED on a Thursday increased LOS significantly (95% CI = 6.846–52.002 min increase). Conclusion: This preliminary analysis identified several factors that increased patients’ LOS in the ED, which would be suitable for potential LEAN interventions. The increase in LOS for patients who were not ordered lab tests and for those who visit the ED on a Thursday warrants further investigation to identify causal factors. Finally, while this analysis revealed several actionable criteria for improving ED LOS, the relatively low predictive power of the final GLM in the post-test analysis (R² = 0.00363) indicates there are more criteria influencing LOS to explore in future analyses.
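The cross-validated LASSO GLM described above can be sketched with scikit-learn's LassoCV; the study's exact implementation is not specified, so this is one plausible reading, on simulated data with hypothetical effect sizes.

```python
# Sketch: LOS regressed on day of week, hour, and lab/imaging orders, with
# a 10-fold cross-validated LASSO penalty. Simulated data throughout.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(5)
n = 10_000
day = rng.integers(0, 7, n)        # day of the week (0 = Monday)
hour = rng.integers(0, 24, n)      # hour of presentation
labs = rng.integers(0, 2, n)       # lab tests ordered?
imaging = rng.integers(0, 2, n)    # diagnostic imaging ordered?
los = 240 - 55 * labs + 30 * (day == 3) + rng.normal(0, 120, n)  # minutes

X = np.column_stack([np.eye(7)[day],   # one-hot day-of-week indicators
                     hour, labs, imaging])
model = LassoCV(cv=10).fit(X, los)     # 10-fold CV selects the penalty
print(model.coef_)
```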
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practices algorithm targeted at emergency medicine physicians. Methods: We chose to adapt existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee of 21 members from across Canada, including academic, community, and remote/rural emergency physicians/nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists, and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails, and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pretest probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined risk levels as low (≤0.5%, no further testing), moderate (0.6%–5%, further testing required), and high (>5%, computed tomography, magnetic resonance imaging, or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns. D-dimer can be used to reduce the probability of AAS in the intermediate-risk group, but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pretest probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
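The risk-stratified testing logic maps directly onto a small decision function. The cut-points below follow the abstract; the function name and interface are illustrative, not part of the published algorithm.

```python
# Sketch of the risk-stratified testing recommendations described above.
def aas_testing_recommendation(pretest_probability: float) -> str:
    if pretest_probability <= 0.005:    # low risk (<=0.5%)
        return "no further testing"
    if pretest_probability <= 0.05:     # moderate risk (0.6%-5%)
        return "further testing required (D-dimer may reduce probability)"
    return "CT, MRI, or transesophageal echocardiography"  # high risk (>5%)

print(aas_testing_recommendation(0.02))  # -> further testing required
```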
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to be dependent upon the formation time of quasi-static magnetic field structures throughout the target volume and the extent of the rear surface proton expansion over the same period. This is observed via both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear surface proton expansion and magnetic field formation time can be varied, resulting in modification to the degree of filamentary structure present within the laser-driven proton beam.
Development involves synergistic interplay among genotypes and the physical and cultural environments, and integrating genetics into experimental designs that manipulate the environment can improve understanding of developmental psychopathology and intervention efficacy. Consistent with differential susceptibility theory, individuals can vary in their sensitivity to environmental conditions including intervention for reasons including their genotype. As a consequence, understanding genetic influences on intervention response is critical. Empirically, we tested an interaction between a genetic index representing sensitivity to the environment and the Family Check-Up intervention. Participants were drawn from the Early Steps Multisite randomized prevention trial that included a low-income and racially/ethnically diverse sample of children and their families followed longitudinally (n = 515). As hypothesized, polygenic sensitivity to the environment moderated the effects of the intervention on 10-year-old children's symptoms of internalizing psychopathology, such that children who were genetically sensitive and were randomly assigned to the intervention had fewer symptoms of child psychopathology than genetically sensitive children assigned to the control condition. A significant difference in internalizing symptoms assessed with a clinical interview emerged between the intervention and control groups for those 0.493 SD above the mean on polygenic sensitivity, or 25% of the sample. Similar to personalized medicine, it is time to understand individual and sociocultural differences in treatment response and individualize psychosocial interventions to reduce the burden of child psychopathology and maximize well-being for children growing up in a wide range of physical environments and cultures.
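The gene-by-intervention moderation reported above is, at its core, an interaction term in a regression of symptoms on intervention status and polygenic sensitivity. A toy sketch on simulated data (names hypothetical):

```python
# Sketch: estimating an intervention x polygenic-sensitivity interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 515
df = pd.DataFrame({"fcu": rng.integers(0, 2, n),     # 1 = intervention arm
                   "pgs": rng.normal(0, 1, n)})      # standardized sensitivity
df["sympt"] = 5 - 0.8 * df["fcu"] * df["pgs"].clip(0) + rng.normal(0, 1, n)

fit = smf.ols("sympt ~ fcu * pgs", data=df).fit()
print(fit.params)
# The intervention effect at sensitivity level pgs is
# beta_fcu + beta_interaction * pgs; solving for where its confidence band
# excludes zero yields a threshold like the 0.493 SD reported above.
```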
To achieve their conservation goals, individuals, communities and organizations need to acquire a diversity of skills, knowledge and information (i.e. capacity). Despite current efforts to build and maintain appropriate levels of conservation capacity, it has been recognized that there will need to be a significant scaling-up of these activities in sub-Saharan Africa. This is because of the rapid increase in the number and extent of environmental problems in the region. We present a range of socio-economic contexts relevant to four key areas of African conservation capacity building: protected area management, community engagement, effective leadership, and professional e-learning. Under these core themes, 39 specific recommendations are presented. These were derived from multi-stakeholder workshop discussions at an international conference held in Nairobi, Kenya, in 2015. At the meeting 185 delegates (practitioners, scientists, community groups and government agencies) represented 105 organizations from 24 African nations and eight non-African nations. The 39 recommendations constituted six broad types of suggested action: (1) the development of new methods, (2) the provision of capacity building resources (e.g. information or data), (3) the communication of ideas or examples of successful initiatives, (4) the implementation of new research or gap analyses, (5) the establishment of new structures within and between organizations, and (6) the development of new partnerships. A number of cross-cutting issues also emerged from the discussions: the need for a greater sense of urgency in developing capacity building activities; the need to develop novel capacity building methodologies; and the need to move away from one-size-fits-all approaches.
Background:
Infants and young children are frequently colonized with C. difficile but rarely have symptomatic disease. However, C. difficile testing remains prevalent in this age group.
Objective:
To design a computerized provider order entry (CPOE) alert to decrease testing for C. difficile in young children and infants.
Design:
An interventional, age-targeted, before-after trial with a comparison group.
Setting:
Monroe Carell Jr. Children’s Hospital at Vanderbilt University, Nashville, Tennessee.
Patients:
All children seen in the inpatient or emergency room settings from July 2012 through July 2013 (pre-CPOE alert) and September 2013 through September 2014 (post-CPOE alert).
Methods:
In August 2013, we implemented a CPOE alert, based on American Academy of Pediatrics recommendations, advising against testing in infants and young children, with an optional override. We also offered healthcare providers educational seminars on recommended C. difficile testing.
Results:
The average monthly testing rate decreased significantly after the CPOE alert for children 0–11 months old (11.5 pre-alert vs 0 post-alert per 10,000 patient days; P < .001) and 12–35 months old (61.6 pre-alert vs 30.1 post-alert per 10,000 patient days; P < .001), but not for children ≥36 months old (50.9 pre-alert vs 46.4 post-alert per 10,000 patient days; P = .3), who were not targeted by the CPOE alert. There were no complications in children who tested positive for C. difficile.
Conclusions:
The average monthly C. difficile testing rate for children <36 months old decreased after implementation of the CPOE alert, with no complications among those who tested positive for C. difficile.
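The rates above follow a simple normalization, tests per 10,000 patient-days. A one-line sketch with invented counts:

```python
# Sketch: monthly testing rate per 10,000 patient-days. Counts are invented.
def monthly_rate(tests: int, patient_days: int) -> float:
    return tests / patient_days * 10_000

print(monthly_rate(14, 12_170))  # e.g., 14 tests over 12,170 patient-days
```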
Background:
There is limited longitudinal research on the longer-term incidence of depressive symptoms comparing women with a hysterectomy to women without one. We aimed to investigate the association between hysterectomy status and the 12-year incidence of depressive symptoms in a mid-aged cohort of Australian women, and whether these relationships were modified by use of exogenous hormones.
Methods:
We used generalised estimating equation models for binary outcome data to assess the associations between the incidence of depressive symptoms (measured by the 10-item Centre for Epidemiologic Studies Depression Scale) across five surveys over a 12-year period and hysterectomy status: hysterectomy with ovarian conservation, or hysterectomy with bilateral oophorectomy, compared with no hysterectomy. We further stratified women with a hysterectomy by their current use of menopausal hormone therapy (MHT). Women who reported prior treatment for depression were excluded from the analysis.
Results:
Compared with women without a hysterectomy (n = 4002), both women with a hysterectomy with ovarian conservation (n = 884) and women with a hysterectomy and bilateral oophorectomy (n = 450) had a higher risk of depressive symptoms (relative risk (RR) 1.20; 95% confidence interval (CI) 1.06–1.36 and RR 1.44; 95% CI 1.22–1.68, respectively). The strength of the risk for women with a hysterectomy with ovarian conservation, compared with those without, differed when we stratified by current MHT use. Compared with women without a hysterectomy who did not use MHT, women with a hysterectomy with ovarian conservation who were also MHT users had a higher risk of depressive symptoms (RR 1.57; 95% CI 1.31–1.88) than women with a hysterectomy with ovarian conservation who did not use MHT (RR 1.17; 95% CI 1.02–1.35). For women with a hysterectomy and bilateral oophorectomy, MHT use did not attenuate the risk. We could not rule out, however, that the higher risk seen among MHT users may be due to confounding by indication, i.e. MHT was prescribed to treat depressive symptoms, but the symptoms persisted.
Conclusions:
Women with a hysterectomy (with or without bilateral oophorectomy) had a higher longer-term risk of incident depressive symptoms that was not explained by lifestyle or socio-economic factors.
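The abstract reports relative risks from GEE models on repeated binary outcomes. One common way to obtain RRs for binary data is a modified-Poisson GEE (log link with robust errors), sketched below on simulated data; this is an illustration of the technique, not the authors' exact specification, and all names are hypothetical.

```python
# Sketch: GEE for repeated binary depression outcomes, modified-Poisson
# approach so exponentiated coefficients read as relative risks.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_women, n_surveys = 500, 5
df = pd.DataFrame({
    "id": np.repeat(np.arange(n_women), n_surveys),
    # 0 = no hysterectomy, 1 = ovarian conservation, 2 = bilateral oophorectomy
    "hysterectomy": np.repeat(rng.integers(0, 3, n_women), n_surveys),
})
df["depressed"] = rng.binomial(1, 0.15 + 0.03 * df["hysterectomy"])

gee = sm.GEE.from_formula(
    "depressed ~ C(hysterectomy)", groups="id", data=df,
    family=sm.families.Poisson(),              # log link -> RRs
    cov_struct=sm.cov_struct.Exchangeable(),   # within-woman correlation
)
print(np.exp(gee.fit().params))                # exponentiated = RR estimates
```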
The Antarctic Roadmap Challenges (ARC) project identified critical requirements to deliver high-priority Antarctic research in the 21st century. The ARC project addressed the challenges of enabling technologies, facilitating access, providing logistics and infrastructure, and capitalizing on international co-operation. Technological requirements include: i) innovative automated in situ observing systems, sensors and interoperable platforms (including power demands), ii) realistic and holistic numerical models, iii) enhanced remote sensing and sensors, iv) expanded sample collection and retrieval technologies, and v) greater cyber-infrastructure to process ‘big data’ collection, transmission and analyses while promoting data accessibility. These technologies must be widely available, their performance and reliability must be improved, and technologies used elsewhere must be applied to the Antarctic. Considerable Antarctic research is field-based, making access to vital geographical targets essential. Future research will require continent- and ocean-wide environmentally responsible access to coastal and interior Antarctica and the Southern Ocean. Year-round access is indispensable. The cost of future Antarctic science is great, but there are opportunities for all to participate commensurate with national resources, expertise and interests. The scope of future Antarctic research will necessitate enhanced and inventive interdisciplinary and international collaborations. The full promise of Antarctic science will only be realized if nations act together.
Fetal alcohol spectrum disorder (FASD) is increasingly recognized as a growing public health issue worldwide. Although more research is needed on both the diagnosis and treatment of FASD, and a broader and more culturally diverse range of services are needed to support those who suffer from FASD and their families, both research and practice for FASD raise significant ethical issues. In response, from the point of view of both research and clinical neuroethics, we provide a framework that emphasizes the need to maximize benefits and minimize harm, promote justice, and foster respect for persons within a global context.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. BETA is the first aperture synthesis radio telescope to use phased array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations that have been made to the traditional calibration and imaging procedures to allow BETA to function as a multi-beam aperture synthesis telescope. We describe the commissioning of the instrument and present details of BETA’s performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final ASKAP telescope.
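Electronic beam forming with a phased array amounts to a weighted sum of element signals, with weights chosen to phase up a given direction. The toy sketch below illustrates that principle only; BETA's actual beam weights come from calibration measurements, and every number here is illustrative.

```python
# Toy delay-and-sum beamforming: weights phase up a chosen arrival direction.
import numpy as np

c = 3e8                                      # speed of light (m/s)
f = 1.0e9                                    # observing frequency (Hz)
positions = np.arange(8) * 0.15              # 8 elements, 0.15 m spacing (m)

def steering_weights(theta_rad):
    delays = positions * np.sin(theta_rad) / c
    return np.exp(-2j * np.pi * f * delays)  # conjugate the geometric phase

# Tone arriving from 10 degrees off boresight, sampled at each element:
t = np.arange(1024) / 4e9
incoming = np.exp(2j * np.pi * f * (t[None, :] +
                  positions[:, None] * np.sin(np.radians(10)) / c))

beam = steering_weights(np.radians(10)) @ incoming   # phased-up beam
print(abs(beam).mean() / incoming.shape[0])          # ~1.0 -> coherent gain
```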