The duration of immunity after a first severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, and the extent to which prior immunity prevents reinfection, remain uncertain and are important questions in the context of new variants. In this retrospective population-based matched observational study, we identified the first polymerase chain reaction (PCR)-positive test of each primary SARS-CoV-2 infection case between 1 March 2020 and 30 September 2020. Each case was matched by age, sex, upper-tier local authority of residence and testing route to one individual testing PCR-negative in the same week (control). After a 90-day pre-follow-up period for cases and controls, any subsequent positive tests up to 31 December 2020, and deaths within 28 days of testing positive, were identified; this encompassed an essentially vaccine-free period. We analysed the results using conditional logistic regression. There were 517 870 individuals in the matched cohort, with 2815 reinfection cases and 12 098 first infections. The protective effect of a prior SARS-CoV-2 PCR-positive episode was 78% (odds ratio (OR) 0.22, 0.21–0.23). Protection rose to 82% (OR 0.18, 0.17–0.19) in a sensitivity analysis that excluded 933 individuals with a first test between March and May and a subsequent positive test between June and September 2020. Amongst individuals testing positive by PCR during follow-up, reinfection cases had 77% lower odds of symptoms at the second episode (adjusted OR 0.23, 0.20–0.26) and 45% lower odds of dying in the 28 days after reinfection (adjusted OR 0.55, 0.42–0.71). Prior SARS-CoV-2 infection offered protection against reinfection in this population. There was some evidence that reinfections increased with the alpha variant compared with the wild-type SARS-CoV-2 variant, highlighting the importance of continued monitoring as new variants emerge.
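For a 1:1 matched design with a single binary factor, the conditional logistic estimate reduces to the ratio of outcome-discordant pairs. A minimal sketch in Python, treating the reported counts of reinfections and first infections as approximate discordant-pair counts; this is an illustration of the estimator, not a re-analysis of the study data:

```python
# Sketch: odds ratio from 1:1 matched case-control pairs.
# For a single binary exposure (here, a prior PCR-positive episode),
# conditional logistic regression reduces to the ratio of pairs that
# are discordant on the outcome. Counts are taken from the abstract as
# a crude approximation, ignoring pairs concordant on the outcome.

def matched_pair_or(case_member_events, control_member_events):
    """OR = pairs where only the previously infected member had the
    outcome / pairs where only the matched control had the outcome."""
    return case_member_events / control_member_events

# 2815 reinfections among cases vs. 12 098 first infections among controls
print(round(matched_pair_or(2815, 12098), 2))  # close to the reported OR 0.22
```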
Robert Pervine and colleagues employ Ice Cube’s 1990 song, “Endangered Species,” to explicate the popular narrative of black men as an endangered species. The song was preceded by seven years by Walter Leavy’s 1983 article in Ebony Magazine, which introduced the black community to the provocative question, “Is the black male an endangered species?” To emphasize the deteriorating condition of the African American male, Leavy pointed to a number of factors, including high rates of unemployment, homicide, and imprisonment, as well as a decrease in life expectancy, that negatively impacted black men’s ability to prosper in life. The term “endangered species” refers to a species that is very likely to become extinct in the near future, either worldwide or in a particular area. The causes of endangerment are usually loss of habitat, poaching, and the unleashing of an invasive species, in this case police violence. The concept of black males as an endangered species was a reversal of the newly created sense in the US, coming out of the 1950s and 1960s, that America needed to improve the conditions of the Black community in order to make good on its commitment to abolish the basic injustice of segregation.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
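A multistate transition model rests on counting observed day-to-day state changes across patient trajectories. A minimal sketch of the maximum-likelihood transition-probability estimate, using hypothetical states and trajectories rather than the study's data:

```python
# Sketch: estimating a daily multistate transition matrix from patient
# trajectories. States and trajectories below are hypothetical.
from collections import Counter, defaultdict

def transition_probs(trajectories):
    """Maximum-likelihood estimate: P(s -> t) = n(s -> t) / n(s -> any)."""
    counts = defaultdict(Counter)
    for traj in trajectories:
        for s, t in zip(traj, traj[1:]):  # consecutive daily assessments
            counts[s][t] += 1
    return {s: {t: c / sum(cs.values()) for t, c in cs.items()}
            for s, cs in counts.items()}

trajs = [
    ["moderate", "moderate", "severe", "critical", "dead"],
    ["moderate", "severe", "moderate", "recovered"],
    ["severe", "critical", "critical", "recovered"],
]
probs = transition_probs(trajs)
print(probs["severe"])  # daily transition probabilities out of "severe"
```

In a full analysis these counts would feed a Markov or semi-Markov model with covariates; the counting step above is the common core.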
Developmental approaches to child and adolescent offending emphasise the role of individual and psychological factors in explaining the onset of offending, as well as the role of early risk and protective factors in future offending. This chapter looks at the incidence and prevalence of young offending, including the age–crime curve; risk and protective factors; some key theoretical approaches; and interventions. In England and Wales (2018/19), 60,208 arrests for notifiable offences were made of those aged 10–17 years. Interventions that limit social experiences at the critical age of adolescence have not been shown to be effective, with two-thirds of young offenders in secure environments re-offending within 12 months. Secure schools, specialist foster care and the ‘child first’ approach aim to provide an environment in which children and adolescents feel secure, whilst promoting a positive learning environment. This may enhance confidence that the young offender can break the cycle of offending.
Throughout the coronavirus disease 2019 (COVID-19) pandemic, health and social care workers have faced unprecedented professional demands, all of which are likely to have placed considerable strain on their psychological well-being.
To measure the national prevalence of mental health symptoms within healthcare staff, and identify individual and organisational predictors of well-being.
The COVID-19 Staff Wellbeing Survey is a longitudinal online survey of psychological well-being among health and social care staff in Northern Ireland. The survey included four time points separated by 3-month intervals; time 1 (November 2020; n = 3834) and time 2 (February 2021; n = 2898) results are presented here. At time 2, 84% of respondents had received at least one dose of a COVID-19 vaccine. The survey included four validated psychological well-being questionnaires (depression, anxiety, post-traumatic stress and insomnia), as well as demographic and organisational measures.
At times 1 and 2, a high proportion of staff reported moderate-to-severe symptoms of depression (30–36%), anxiety (26–27%), post-traumatic stress (30–32%) and insomnia (27–28%); overall, significance tests and effect size data suggested psychological well-being was generally stable between November 2020 and February 2021 for health and social care staff. Multiple linear regression models indicated that perceptions of less effective communication within their organisation predicted greater levels of anxiety, depression, post-traumatic stress and insomnia.
This study highlights the need to offer psychological support to all health and social care staff, and to communicate with staff regularly, frequently and clearly regarding COVID-19 to help protect staff psychological well-being.
San Francisco (California USA) is a relatively compact city with a population of 884,000 and nine stroke centers within a 47 square mile area. Emergency Medical Services (EMS) transport distances and times are short and there are currently no Mobile Stroke Units (MSUs).
This study evaluated EMS activation to computed tomography (CT [EMS-CT]) and EMS activation to thrombolysis (EMS-TPA) times for acute stroke in the first two years after implementation of an emergency department (ED) focused, direct EMS-to-CT protocol entitled “Mission Protocol” (MP) at a safety net hospital in San Francisco and compared performance to published reports from MSUs. The EMS times were abstracted from ambulance records. Geometric means were calculated for MP data and pooled means were similarly calculated from published MSU data.
From July 2017 through June 2019, a total of 423 patients with suspected stroke were evaluated under the MP, and 166 of these patients were either ultimately diagnosed with ischemic stroke or were treated as a stroke but later diagnosed as a stroke mimic. The EMS and treatment time data were available for 134 of these patients with 61 patients (45.5%) receiving thrombolysis, with mean EMS-CT and EMS-TPA times of 41 minutes (95% CI, 39-43) and 63 minutes (95% CI, 57-70), respectively. The pooled estimates for MSUs suggested a mean EMS-CT time of 35 minutes (95% CI, 27-45) and a mean EMS-TPA time of 48 minutes (95% CI, 39-60). The MSUs achieved faster EMS-CT and EMS-TPA times (P <.0001 for each).
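Geometric means of treatment times, as computed for the MP data, are obtained by averaging on the log scale. A short sketch with an approximate normal-theory confidence interval; the sample times are illustrative, not the study's records:

```python
# Sketch: geometric mean of EMS-to-treatment times with an approximate
# 95% CI, computed on the log scale. Times below are hypothetical.
import math

def geo_mean_ci(xs, z=1.96):
    """Geometric mean and approximate 95% CI (normal theory on logs)."""
    logs = [math.log(x) for x in xs]
    n = len(logs)
    m = sum(logs) / n
    sd = math.sqrt(sum((l - m) ** 2 for l in logs) / (n - 1))
    se = sd / math.sqrt(n)
    return math.exp(m), math.exp(m - z * se), math.exp(m + z * se)

times = [35, 41, 38, 52, 44, 40, 47, 36]  # minutes, hypothetical sample
gm, lo, hi = geo_mean_ci(times)
print(round(gm, 1), round(lo, 1), round(hi, 1))
```

With small samples a t-quantile would be more appropriate than z = 1.96; the z form is kept here for brevity.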
In a moderate-sized, urban setting with high population density, MP was able to achieve EMS activation to treatment times for stroke thrombolysis that were approximately 15 minutes slower than the published performance of MSUs.
With no approved treatments for COVID-19 initially available, the Food and Drug Administration utilized multiple preapproval pathways to provide access to investigational agents and/or medical devices: Expanded Access, Emergency Use Authorizations, and Clinical Trials. Regulatory units within an Academic Medical Center (AMC), including those part of the Clinical and Translational Science Award (CTSA) consortium, have provided support for clinicians in navigating these options prior to the pandemic. As such, they were positioned to be a resource for accessing therapies during the COVID-19 public health emergency.
A small survey and a follow-on poll of the national Investigational New Drug (IND)/Investigational Device Exemption (IDE) Workgroup were conducted in October and December 2020 to determine whether CTSA regulatory units assisted in facilitating access to COVID-19 therapies and the extent of pandemic-related challenges these units faced.
Fifteen survey and 21 poll responses were received, which provided insights into the demands placed on these regulatory support units due to the pandemic and the changes required to provide critical support during this and future crises. Key changes and lessons learned included the importance of regulatory knowledge to support the institutional response, the critical need for electronic submission capacity for Food and Drug Administration (FDA) documents, and the nimble reallocation of regulatory and legal resources to support patient access to investigational agents and/or medical devices during the pandemic.
AMC- and CTSA-based regulatory units played a meaningful role in the COVID-19 pandemic, but further unit modifications are needed to enable more robust regulatory support in the future.
Efforts to move community engagement in research from marginalized to mainstream include the NIH requiring community engagement programs in all Clinical and Translational Science Awards (CTSAs). However, the COVID-19 pandemic has exposed how little these efforts have changed the dominant culture of clinical research. When faced with the urgent need to generate knowledge about prevention and treatment of the novel coronavirus, researchers largely neglected to involve community stakeholders early in the research process. This failure cannot be divorced from the broader context of systemic racism in the US that has contributed to Black, Indigenous, and People of Color (BIPOC) communities bearing a disproportionate toll from COVID-19, being underrepresented in COVID-19 clinical trials, and expressing greater hesitancy about COVID-19 vaccination. We call on research funders and research institutions to take decisive action to make community engagement obligatory, not optional, in all clinical and translational research and to center BIPOC communities in this process. Recommended actions include funding agencies requiring all research proposals involving human participants to include a community engagement plan, providing adequate funding to support ongoing community engagement, including community stakeholders in agency governance and proposal reviews, promoting racial and ethnic diversity in the research workforce, and making a course in community-engaged research a requirement for Master of Clinical Research curricula.
This article investigates the implications of recent research findings that establish that older victims of crime are less likely to obtain procedural justice than other age groups. It explores original empirical data from the United Kingdom that finds evidence of a systemic failure amongst agencies to identify vulnerability in the older population and to put in place appropriate support mechanisms to allow older victims to participate fully in the justice system. The article discusses how the legally defined gateways to additional support, which are currently relied upon by many common law jurisdictions, disadvantage older victims and require reimagining. It argues that international protocols, especially the current European Union Directive on victims’ rights, are valuable guides in this process of re-conceptualisation. To reduce further the inequitable treatment of older victims, the article advocates for jurisdictions to introduce a presumption in favour of special assistance for older people participating in the justice system.
Interfacility patient movement plays an important role in the dissemination of antimicrobial-resistant organisms throughout healthcare systems. We evaluated how 3 alternative measures of interfacility patient sharing were associated with C. difficile infection incidence in Ontario acute-care facilities.
The cohort included adult acute-care facility stays of ≥3 days between April 2003 and March 2016. We measured 3 facility-level metrics of patient sharing: general patient importation, incidence-weighted patient importation, and C. difficile case importation. Each of the 3 patient-sharing metrics was examined against the incidence of C. difficile infection in the facility per 1,000 stays, using Poisson regression models.
The analyzed cohort included 6.70 million stays at risk of C. difficile infection across 120 facilities. Over the 13-year period, we included 62,189 new cases of healthcare-associated C. difficile infection (incidence, 9.3 per 1,000 stays). After adjustment for facility characteristics, general importation was not strongly associated with C. difficile infection incidence (risk ratio [RR] per doubling, 1.10; 95% confidence interval [CI], 0.97–1.24; proportional change in variance [PCV], −2.0%). Incidence-weighted patient importation (RR per doubling, 1.18; 95% CI, 1.06–1.30; PCV, −8.4%) and C. difficile case importation (RR per doubling, 1.43; 95% CI, 1.29–1.58; PCV, −30.1%) were strongly associated with C. difficile infection incidence.
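The "RR per doubling" phrasing corresponds to a Poisson model in which the exposure enters on the log scale. A small sketch of the conversion from a log-scale coefficient to a per-doubling risk ratio; the coefficient value 0.52 is back-calculated for illustration, not a reported estimate:

```python
# Sketch: interpreting "risk ratio per doubling" of an exposure in a
# Poisson regression with ln(exposure) as the covariate.
import math

def rr_per_doubling(beta_ln):
    """beta_ln: fitted coefficient on ln(exposure).
    RR for a twofold increase is exp(beta * ln 2)."""
    return math.exp(beta_ln * math.log(2))

# an illustrative coefficient of 0.52 on ln(case importation):
print(round(rr_per_doubling(0.52), 2))
```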
In this 13-year study of acute-care facilities in Ontario, interfacility variation in C. difficile infection incidence was associated with importation of patients from other high-incidence acute-care facilities or specifically of patients with a recent history of C. difficile infection. Regional infection control strategies should consider the potential impact of importation of patients at high risk of C. difficile shedding from outside facilities.
Nudging in microbiology is an antimicrobial stewardship strategy to influence decision making through the strategic reporting of microbiology results while preserving prescriber autonomy. The purpose of this scoping review was to identify the evidence that demonstrates the effectiveness of nudging strategies in susceptibility result reporting to improve antimicrobial use.
A search for studies in Ovid MEDLINE, Embase, PsycINFO, and All EBM Reviews was conducted. All simulated and vignette studies were excluded. Two independent reviewers were used throughout screening and data extraction.
Of a total of 1,346 citations screened, 15 relevant studies were identified. Study types included pre- and postintervention (n = 10), retrospective cohort (n = 4), and a randomized controlled trial (n = 1). Most studies were performed in acute-care settings (n = 13), and the remainder were in primary care (n = 2). Most studies used a strategy to alter the default antibiotic choices on the antibiotic report. All studies reported at least 1 outcome of antimicrobial use: utilization (n = 9), appropriateness (n = 7), de-escalation (n = 2), and cost (n = 1). Moreover, 12 studies reported an overall benefit in antimicrobial use outcomes associated with nudging, and 4 studies evaluated the association of nudging strategy with subsequent antimicrobial resistance, with 2 studies noting overall improvement.
The number of heterogeneous studies evaluating the impact of applying nudging strategies to susceptibility result reports is small; however, most strategies do show promise in altering prescribers’ antibiotic selection. Selective and cascade reporting of targeted agents in a hospital setting represent the majority of current research. Gaps and opportunities for future research identified from our scoping review include performing prospective randomized controlled trials and evaluating other approaches aside from selective reporting.
Antimicrobial stewardship program (ASP) interventions, such as prospective audit and feedback (PAF), have been shown to reduce antimicrobial use and improve patient outcomes. However, the optimal approach to PAF is unknown.
We examined the impact of a high-intensity, interdisciplinary, rounds-based PAF compared to low-intensity PAF on antimicrobial use on internal medicine wards in a 400-bed community hospital.
Prior to the intervention, ASP pharmacists performed low-intensity PAF with a focus on targeted antibiotics. Recommendations were made directly to the internist for each patient. High-intensity, rounds-based PAF was then introduced sequentially to 5 internal medicine wards. This PAF format included twice-weekly interdisciplinary rounds, with a review of all internal medicine patients receiving any antimicrobial agent. Antibiotic use and clinical outcomes were measured before and after the transition to high-intensity PAF. An interrupted time-series analysis was performed adjusting for seasonal and secular trends.
With the transition from low-intensity to high-intensity PAF, a reduction in overall usage was seen from 483 defined daily doses (DDD)/1,000 patient days (PD) during the low-intensity phase to 442 DDD/1,000 PD in the high-intensity phase (difference, −42; 95% confidence interval [CI], −74 to −9). The reduction in usage was more pronounced in the adjusted analysis, in the latter half of the high-intensity period, and for targeted agents. There were no differences seen in clinical outcomes in the adjusted analysis.
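The usage unit above is straightforward to compute from raw dispensing and census data. A minimal sketch with hypothetical totals, giving a crude unadjusted difference rather than the study's time-series-adjusted estimate:

```python
# Sketch: antibiotic use as defined daily doses (DDD) per 1,000
# patient-days (PD). Totals below are hypothetical, chosen only to
# reproduce rates of the same magnitude as those reported.

def ddd_per_1000_pd(total_ddd, patient_days):
    return 1000 * total_ddd / patient_days

low_phase = ddd_per_1000_pd(24150, 50000)   # low-intensity phase
high_phase = ddd_per_1000_pd(22100, 50000)  # high-intensity phase
print(low_phase, high_phase, high_phase - low_phase)
```

The study's reported difference of −42 additionally adjusts for seasonal and secular trends via interrupted time-series analysis, which a raw before/after difference does not capture.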
High-intensity PAF was associated with a reduction in antibiotic use compared to a low-intensity approach, without any adverse impact on patient outcomes. A decision to implement a high-intensity PAF approach should be weighed against the increased workload required.
This study investigated the characteristics of subjective memory complaints (SMCs) and their association with current and future cognitive functions.
A cohort of 209 community-dwelling individuals without dementia aged 47–90 years old was recruited for this 3-year study. Participants underwent neuropsychological and clinical assessments annually. Participants were divided into SMCs and non-memory complainers (NMCs) using a single question at baseline and a memory complaints questionnaire following baseline, to evaluate differential patterns of complaints. In addition, comprehensive assessment of memory complaints was undertaken to evaluate whether severity and consistency of complaints differentially predicted cognitive function.
SMC and NMC individuals were significantly different on various features of SMCs. Greater overall severity (but not consistency) of complaints was significantly associated with current and future cognitive functioning.
SMC individuals present distinctive features of memory complaints as compared to NMCs. Further, the severity of complaints was a significant predictor of future cognition. However, SMC did not significantly predict change over time in this sample. These findings warrant further research into the specific features of SMCs that may portend subsequent neuropathological and cognitive changes when screening individuals at increased future risk of dementia.
Clostridium difficile spores play an important role in transmission and can survive in the environment for several months. Optimal methods for measuring environmental C. difficile are unknown. We sought to determine whether increased sample surface area improved detection of C. difficile from environmental samples.
Samples were collected from 12 patient rooms in a tertiary-care hospital in Toronto, Canada.
Samples represented small surface-area and large surface-area floor and bedrail pairs from single-bed rooms of patients with low (without prior antibiotics), medium (with prior antibiotics), and high (C. difficile infected) shedding risk. Presence of C. difficile in samples was measured using quantitative polymerase chain reaction (qPCR) with targets on the 16S rRNA and toxin B genes and using enrichment culture.
Of the 48 samples, 64.6% were positive by 16S qPCR (geometric mean, 13.8 spores); 39.6% were positive by toxin B qPCR (geometric mean, 1.9 spores); and 43.8% were positive by enrichment culture. By 16S qPCR, each 10-fold increase in sample surface area yielded 6.6 times (95% CI, 3.2–13) more spores. Floor surfaces yielded 27 times (95% CI, 4.9–181) more spores than bedrails, and rooms of C. difficile–positive patients yielded 11 times (95% CI, 0.55–164) more spores than those of patients without prior antibiotics. Toxin B qPCR and enrichment culture returned analogous findings.
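The reported yield gain (about 6.6 times more spores per 10-fold increase in surface area) implies a power-law relationship between spore yield and sampled area. A short sketch of that implication, with the exponent derived from the reported fold change:

```python
# Sketch: "6.6x more spores per 10-fold surface area" implies
# yield proportional to area**b with b = log10(6.6), roughly 0.82,
# i.e. sub-linear scaling of recovered spores with sampled area.
import math

b = math.log10(6.6)  # exponent implied by the reported fold change

def fold_change(area_ratio, b=b):
    """Expected multiplicative change in spore yield for a given
    ratio of sampled surface areas."""
    return area_ratio ** b

print(round(b, 2))                # implied exponent
print(round(fold_change(2), 2))   # expected gain from doubling the area
```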
Clostridium difficile spores were identified in most floor and bedrail samples, and increased surface area improved detection. Future research aiming to understand the role of environmental C. difficile in transmission should prefer samples with large surface areas.
Antibiotic use varies widely between hospitals, but the influence of antimicrobial stewardship programs (ASPs) on this variability is not known. We aimed to determine the key structural and strategic aspects of ASPs associated with differences in risk-adjusted antibiotic utilization across facilities.
Observational study of acute-care hospitals in Ontario, Canada.
A survey was sent to hospitals asking about both structural (8 elements) and strategic (32 elements) components of their ASP. Antibiotic use from hospital purchasing data was acquired for January 1 to December 31, 2014. Crude and adjusted defined daily doses per 1,000 patient days, accounting for hospital and aggregate patient characteristics, were calculated across facilities. Rate ratios (RR) of defined daily doses per 1,000 patient days were compared for hospitals with and without each antimicrobial stewardship element of interest.
Of 127 eligible hospitals, 73 (57%) participated in the study. There was a 7-fold range in antibiotic use across these facilities (min, 253 defined daily doses per 1,000 patient days; max, 1,872 defined daily doses per 1,000 patient days). The presence of designated funding or resources for the ASP (RRadjusted, 0.87; 95% CI, 0.75–0.99), prospective audit and feedback (RRadjusted, 0.80; 95% CI, 0.67–0.96), and intravenous-to-oral conversion policies (RRadjusted, 0.79; 95% CI, 0.64–0.99) were associated with lower risk-adjusted antibiotic use.
Wide variability in antibiotic use across hospitals may be partially explained by both structural and strategic ASP elements. The presence of funding and resources, prospective audit and feedback, and intravenous-to-oral conversion should be considered priority elements of a robust ASP.
To study the antibody response to tetanus toxoid and measles by age following vaccination in children aged 4 months to 6 years in Entebbe, Uganda. Serum samples were obtained from 113 children aged 4–15 months at the Mother-Child Health Clinic (MCHC), Entebbe Hospital, and from 203 of the 206 children aged between 12 and 75 months recruited through the Outpatients Department (OPD). Antibodies to measles were quantified by plaque reduction neutralisation test (PRNT) and with the Siemens IgG EIA. The VaccZyme IgG EIA was used to quantify anti-tetanus antibodies. Sera from 96 of 113 (85.0%) children attending the MCHC contained measles PRNT titres below the protective level (120 mIU/ml). Sera from 24 of 203 (11.8%) children attending the OPD contained PRNT titres <120 mIU/ml. There was no detectable decline in anti-measles antibody concentrations between 1 and 6 years. The anti-tetanus antibody titres in all 113 children attending the MCHC and in 189 of 203 (93.1%) children attending the OPD were >0.15 IU/ml by EIA, a level considered protective. The overall concentration of anti-tetanus antibody was sixfold higher in children under 12 months compared with the older children, with geometric mean concentrations of 3.15 IU/ml and 0.49 IU/ml, respectively. For each doubling in age between 4 and 64 months, the anti-tetanus antibody concentration declined by 50%. As time since the administration of the third DTP vaccination doubled, anti-tetanus antibody concentration declined by 39%. The low measles antibody prevalence in the children presenting at the MCHC is consistent with the current measles epidemiology in Uganda, where a significant number of measles cases occur in children under 1 year of age; earlier vaccination may therefore be indicated. The consistent fall in anti-tetanus antibody titre over time following vaccination supports the need for further vaccine boosters at age 4–5 years, as recommended by the WHO.
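A 50% decline per doubling of age corresponds to an inverse power-law relationship between titre and age. A small sketch under that stated rate, using the reported 3.15 IU/ml geometric mean as a baseline; the specific ages are illustrative:

```python
# Sketch: a 50% decline per doubling of age implies
# titre(age) = titre(age0) * (age / age0) ** log2(0.5),
# i.e. an exponent of -1.0 (titre roughly proportional to 1/age).
import math

def titre_at(age, age0, titre0, decline_per_doubling=0.50):
    """Projected titre at `age`, given titre0 at reference age0 and a
    fixed fractional decline per doubling of age."""
    b = math.log2(1 - decline_per_doubling)  # -1.0 for a 50% decline
    return titre0 * (age / age0) ** b

# two doublings of age (4 -> 16 months) leave 25% of the titre
print(titre_at(16, 4, 3.15))
```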
A significant cause of death and long-term disability from head injuries and pathologic conditions is elevation of the intracranial pressure (ICP) due to vascular compromise and secondary sequelae causing edema. ICP measurements before and after injury in a completely closed-head environment have significant research value, particularly in the acute postinjury period. With current technology, a tethered fiberoptic probe penetrates the brain and therefore can remain implanted only for relatively short periods. Use of the probe can also cause complications such as infection and hemorrhage, and it prohibits immediate (at the time of injury) and long-term measurements of ICP. A small, fully embedded, wireless ICP device may simplify clinical management and research protocols by offering a means of semi-invasive, long-term ICP measurement following brain injury. In this chapter, a new digital wireless ICP (DICP) device is described. The dynamic ICP measurement performance of both the analog ICP (AICP) devices (described in Chapter 2) and the DICP devices is evaluated in a specific traumatic brain injury (TBI) swine model of closed-head rotational injury.
In Chapter 2, a prototype of an AICP device operating in the industrial-scientific-medical (ISM) band at 2.4 GHz was described that successfully simplified the surgical procedure by reducing the infection rate, the risk of hemorrhage, and the degree of tissue injury.
The AICP device was implanted in a canine model only for a static test, and hypo- and hyperventilation were used to effect variations in ICP. Dynamic ICP variations as a result of TBI in a completely closed-head environment are of paramount importance for understanding the development of a prolonged postconcussion syndrome and for facilitating institution of the correct treatment at different stages, particularly in the acute postinjury period. Currently, in experimental (animal) models of TBI, a tethered fiberoptic probe (if inserted before the injury) has to be removed before an injury is induced in order to avoid significant focal damage at the point of probe insertion. Moreover, reinsertion of the probe is possible only after the animal's vital signs have stabilized. However, the act of breaching the cranium after the injury affects the fidelity of the ICP measurements. In addition, proposed noninvasive ICP (NICP) solutions, such as the pulsatility index method based on transcranial Doppler discussed by Figaji et al., have been shown to be insufficient for accurate ICP estimation.
Alexandre Fréchette, Department of Computer Science, University of British Columbia
Neil Newman, Department of Computer Science, University of British Columbia
Kevin Leyton-Brown, Department of Computer Science, University of British Columbia
Over 13 months in 2016–17, the US government held an innovative “incentive auction” for radio spectrum, in which television broadcasters were paid to relinquish broadcast rights via a “reverse auction”, remaining broadcasters were repacked into a narrower band of spectrum, and the cleared spectrum was sold to telecommunications companies. The stakes were enormous: the auction was forecast to net the government tens of billions of dollars, while creating massive economic value by reallocating spectrum to more socially beneficial uses (Congressional Budget Office 2015). As a result of both its economic importance and its conceptual novelty, the auction has been the subject of considerable recent study by the research community, mostly focusing on elements of the auction design (Bazelon, Jackson, and McHenry 2011; Kwerel, LaFontaine, and Schwartz 2012; Milgrom et al. 2012; Calamari et al. 2012; Marcus 2013; Milgrom and Segal 2014; Dütting, Gkatzelis, and Roughgarden 2014; Vohra 2014; Nguyen and Sandholm 2014; Kazumori 2014). After considerable study and discussion, the FCC selected an auction design based on a descending clock (FCC 2014c; 2014a). Such an auction offers each participating station a price for relinquishing its broadcast rights, with this price offer falling for a given station as long as it remains repackable. A consequence of this design is that the auction must (sequentially!) solve hundreds of thousands of such repacking problems. This is challenging, because the repacking problem is NP-complete. It also makes the performance of the repacking algorithm extremely important, as every failure to solve a single, feasible repacking problem corresponds to a lost opportunity to lower a price offer. Given the scale of the auction, individual unsolved problems can cost the government millions of dollars each.
This chapter shows how the station repacking problem can be solved exactly and reliably at the national scale. It describes the results of an extensive, multi-year investigation into the problem, which culminated in a solver that we call SATFC.
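At its core, station repacking is a list-colouring feasibility problem: assign each station a channel from its allowed set so that interfering stations receive different channels. The toy brute-force checker below illustrates the constraint structure on a tiny, hypothetical instance with co-channel constraints only; the real problem also includes adjacent-channel constraints and is solved at national scale by encoding to SAT, as SATFC does:

```python
# Sketch: station repacking as list colouring. Brute force is only
# viable for toy instances; it is shown here to make the feasibility
# question concrete. Stations, domains, and interference are hypothetical.
from itertools import product

def repackable(domains, interference):
    """domains: {station: list of allowed channels};
    interference: iterable of (a, b) station pairs that must not
    share a channel. Returns a feasible assignment dict, or None."""
    stations = list(domains)
    for assignment in product(*(domains[s] for s in stations)):
        chan = dict(zip(stations, assignment))
        if all(chan[a] != chan[b] for a, b in interference):
            return chan
    return None

domains = {"A": [14, 15], "B": [14], "C": [14, 15]}
interference = [("A", "B"), ("B", "C"), ("A", "C")]
# three mutually interfering stations but only two channels: infeasible
print(repackable(domains, interference))  # None
```

Each such feasibility check corresponds to one "is this station still repackable?" query in the descending-clock auction, which is why solver reliability translates directly into money saved.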
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°) and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H0) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo mass, stellar mass, and environment. The final survey, which will be completed within 5 years, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 106 galaxies, supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 106 galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline.
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.