The diet of most adults is low in fish and therefore provides limited quantities of the long-chain omega-3 fatty acids (LCn-3FAs) eicosapentaenoic and docosahexaenoic acids (EPA, DHA). Since these compounds serve important roles in the brain, we sought to determine whether healthy adults with low LCn-3FA consumption would exhibit improvements in neuropsychological performance and parallel changes in brain morphology following repletion through fish oil supplementation.
In a randomized, controlled trial, 271 mid-life adults (30–54 years of age, 118 men, 153 women) consuming ⩽300 mg/day of LCn-3FAs received 18 weeks of supplementation with fish oil capsules (1400 mg/day of EPA and DHA) or matching placebo. All participants completed a neuropsychological test battery examining four cognitive domains: psychomotor speed, executive function, learning/episodic memory, and fluid intelligence. A subset of 122 underwent neuroimaging before and after supplementation to measure whole-brain and subcortical tissue volumes.
Capsule adherence was over 95%, participant blinding was verified, and red blood cell EPA and DHA levels increased as expected. Supplementation did not affect performance in any of the four cognitive domains. Exploratory analyses revealed that, compared with placebo, fish oil supplementation improved executive function in participants with low baseline DHA levels. No changes were observed in any indicator of brain morphology.
In healthy mid-life adults reporting low dietary intake, supplementation with LCn-3FAs at a moderate dose for a moderate duration did not affect neuropsychological performance or brain morphology. Whether salutary effects occur in individuals with particularly low DHA exposure requires further study.
Diagnosis, treatment, and prevention of vector-borne disease (VBD) in pets is a cornerstone of companion animal practice. Veterinarians face new challenges associated with the emergence, reemergence, and rising incidence of VBDs, including heartworm disease, Lyme disease, anaplasmosis, and ehrlichiosis. Increases in the observed prevalence of these diseases have been attributed to multiple factors, including diagnostic tests with improved sensitivity, expanded annual testing practices, climatologic and ecological changes enhancing vector survival and expansion, emergence or recognition of novel pathogens, and increased movement of pets as travel companions. Veterinarians have the additional responsibility of providing information about zoonotic pathogen transmission from pets, especially to vulnerable human populations: the immunocompromised, children, and the elderly. Hindering efforts to protect pets and people is the dynamic and ever-changing nature of VBD prevalence and distribution. To address this gap in understanding, the Companion Animal Parasite Council (CAPC) began annual forecasting of VBD prevalence in 2011. These forecasts provide veterinarians and pet owners with expected disease prevalence in advance of potential changes. This review summarizes the fidelity of VBD forecasts and illustrates the practical use of CAPC pathogen prevalence maps and forecast data in veterinary practice and client education.
TwinsUK is the largest cohort of community-dwelling adult twins in the UK. The registry comprises over 14,000 volunteer twins (14,838 including mixed, single and triplets); it is predominantly female (82%) and middle-aged (mean age 59). In addition, over 1800 parents and siblings of twins are registered volunteers. During the last 27 years, TwinsUK has collected numerous questionnaire responses, physical/cognitive measures and biological measures on over 8500 subjects. Data were collected alongside four comprehensive phenotyping clinical visits to the Department of Twin Research and Genetic Epidemiology, King’s College London. Such collection methods have resulted in very detailed longitudinal clinical, biochemical, behavioral, dietary and socioeconomic cohort characterization; it provides a multidisciplinary platform for the study of complex disease during the adult life course, including the process of healthy aging. The major strength of TwinsUK is the availability of several ‘omic’ technologies for a range of sample types from participants, which includes genome-wide scans of single-nucleotide variants, next-generation sequencing, metabolomic profiles, microbiomics, exome sequencing, epigenetic markers, gene expression arrays, RNA sequencing and telomere length measures. TwinsUK facilitates and actively encourages sharing the TwinsUK resource with the scientific community — interested researchers may request data via the TwinsUK website (http://twinsuk.ac.uk/resources-for-researchers/access-our-data/) for their own use or future collaboration with the study team. In addition, further cohort data collection is planned via the Wellcome Open Research gateway (https://wellcomeopenresearch.org/gateways). The current article presents an up-to-date report on the application of technological advances, new study procedures in the cohort and future direction of TwinsUK.
Objectives: This study aimed to evaluate the influence of lower limb loss (LL) on mental workload by assessing neurocognitive measures in individuals with unilateral transtibial (TT) versus transfemoral (TF) LL during dual-task walking under varying cognitive demand. Methods: Electroencephalography (EEG) was recorded as participants performed a task of varying cognitive demand while seated or walking (i.e., varying physical demand). Results: Both groups (TT LL vs. TF LL) exhibited a similar EEG theta synchrony response as either the cognitive or the physical demand increased. Whereas individuals with TT LL maintained similar performance on the cognitive task across seated and walking conditions, those with TF LL exhibited performance decrements (slower response times) on the cognitive task during the walking condition compared with the seated condition. Furthermore, those with TF LL exhibited neither regional differences in EEG low-alpha power while walking nor EEG high-alpha desynchrony as a function of cognitive task difficulty while walking. This lack of alpha modulation coincided with no elevation of theta/alpha ratio power as a function of cognitive task difficulty in the TF LL group. Conclusions: This work suggests that the two groups share some common but also distinct neurocognitive features during dual-task walking. Although all participants were able to recruit neural mechanisms critical for maintaining cognitive-motor performance under elevated cognitive or physical demands, the observed differences indicate that walking with a prosthesis while concurrently performing a cognitive task imposes additional cognitive demand on individuals with more proximal levels of amputation.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
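The core idea behind the direct estimator can be illustrated in a one-dimensional toy model (this is a hedged sketch, not the MWA pipeline): multiply three Fourier modes of the field whose wavenumbers close a triangle, k1 + k2 + k3 = 0, and average over samples. All names below are illustrative.

```python
import numpy as np

def direct_bispectrum(field, k1, k2):
    """Toy analogue of a direct bispectrum estimate: the product of three
    Fourier modes whose wavenumbers close a triangle, k1 + k2 + k3 = 0."""
    n = field.size
    fk = np.fft.fft(field) / n          # discrete Fourier modes
    k3 = (-(k1 + k2)) % n               # closure condition (mod n for the DFT)
    return fk[k1 % n] * fk[k2 % n] * fk[k3]

n = 256
x = np.arange(n)
# Three phase-coupled cosines whose wavenumbers (3, 5, 8) close a triangle:
coupled = (np.cos(2 * np.pi * 3 * x / n)
           + np.cos(2 * np.pi * 5 * x / n)
           + np.cos(2 * np.pi * 8 * x / n))
b = direct_bispectrum(coupled, 3, 5)
print(round(b.real, 3))  # each cosine puts amplitude 1/2 in a mode, so (1/2)**3 = 0.125
```

A triangle that does not close on modes present in the field (e.g. wavenumbers 3 and 6 here) returns essentially zero, which is why the choice of triangle configuration controls sensitivity to coupled (non-Gaussian) structure versus noise and foregrounds.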
Maternal mental health during pregnancy and postpartum predicts later emotional and behavioural problems in children. Even though most perinatal mental health problems begin before pregnancy, the consequences of preconception maternal mental health for children's early emotional development have not been prospectively studied.
We used data from two prospective Australian intergenerational cohorts, in which 756 women were assessed repeatedly for mental health problems before pregnancy, between the ages of 13 and 29 years, and during pregnancy and at 1 year postpartum for 1231 subsequent pregnancies. Offspring infant emotional reactivity, an early indicator of differential sensitivity denoting increased risk of emotional problems under adversity, was assessed at 1 year postpartum.
Thirty-seven percent of infants born to mothers with persistent preconception mental health problems were categorised as high in emotional reactivity, compared to 23% born to mothers without preconception history (adjusted OR 2.1, 95% CI 1.4–3.1). Ante- and postnatal maternal depressive symptoms were similarly associated with infant emotional reactivity, but these perinatal associations reduced somewhat after adjustment for prior exposure. Causal mediation analysis further showed that 88% of the preconception risk was a direct effect, not mediated by perinatal exposure.
Maternal preconception mental health problems predict infant emotional reactivity independently of maternal perinatal mental health, while associations between perinatal depressive symptoms and infant reactivity are partially explained by prior exposure. The findings suggest that processes shaping early vulnerability for later mental disorders arise well before conception. There is an emerging case for expanding developmental theories and trialling preventive interventions in the years before pregnancy.
The Astrophysics Telescope for Large Area Spectroscopy (ATLAS) Probe is a concept for a National Aeronautics and Space Administration (NASA) probe-class space mission that will achieve ground-breaking science in the fields of galaxy evolution, cosmology, the Milky Way, and the Solar System. It is the follow-up space mission to the Wide Field Infrared Survey Telescope (WFIRST), boosting its scientific return by obtaining deep 1–4 μm slit spectroscopy for ∼70% of all galaxies imaged by the ∼2 000 deg² WFIRST High Latitude Survey at z > 0.5. ATLAS will measure accurate and precise redshifts for ∼200 M galaxies out to z < 7, and deliver spectra that enable a wide range of diagnostic studies of the physical properties of galaxies over most of cosmic history. ATLAS Probe and WFIRST together will produce a 3D map of the Universe over 2 000 deg², the definitive data sets for studying galaxy evolution, probing dark matter, dark energy and modifications of General Relativity, and quantifying the 3D structure and stellar content of the Milky Way. ATLAS Probe science spans four broad categories: (1) revolutionising galaxy evolution studies by tracing the relation between galaxies and dark matter from galaxy groups to cosmic voids and filaments, from the epoch of reionisation through the peak era of galaxy assembly; (2) opening a new window into the dark Universe by weighing the dark matter filaments using 3D weak lensing with spectroscopic redshifts, and obtaining definitive measurements of dark energy and modification of General Relativity using galaxy clustering; (3) probing the Milky Way’s dust-enshrouded regions, reaching the far side of our Galaxy; and (4) exploring the formation history of the outer Solar System by characterising Kuiper Belt Objects.
ATLAS Probe is a 1.5 m telescope with a field of view of 0.4 deg², which uses digital micro-mirror devices (DMDs) as slit selectors. It has a spectroscopic resolution of R = 1 000 and a wavelength range of 1–4 μm. The lack of wide-field slit spectroscopy from space is the obvious gap in current and planned space missions; ATLAS fills this gap with an unprecedented spectroscopic capability based on DMDs (with an estimated spectroscopic multiplex factor greater than 5 000). ATLAS is designed to fit within the NASA probe-class mission cost envelope; it has a single instrument, a telescope aperture that allows for a lighter launch vehicle, and mature technology (we have identified a path for DMDs to reach Technology Readiness Level 6 within 2 yr). ATLAS Probe will lead to transformative science over the entire range of astrophysics: from galaxy evolution to the dark Universe, from Solar System objects to the dusty regions of the Milky Way.
We sought to define the prevalence of echocardiographic abnormalities in long-term survivors of paediatric hematopoietic stem cell transplantation and determine the utility of screening in asymptomatic patients. We analysed echocardiograms performed on survivors who underwent hematopoietic stem cell transplantation from 1982 to 2006. A total of 389 patients were alive in 2017, with 114 having an echocardiogram obtained ⩾5 years post-infusion. A total of 95 patients had an echocardiogram performed for routine surveillance. The mean time post-hematopoietic stem cell transplantation was 13 years. Of 95 patients, 77 (82.1%) had ejection fraction measured, and 10/77 (13.0%) had ejection fraction z-scores ⩽−2.0, which is abnormally low. Patients with abnormal ejection fraction were significantly more likely to have been exposed to anthracyclines or total body irradiation. Among individuals who received neither anthracyclines nor total body irradiation, only 1/31 (3.2%) was found to have an abnormal ejection fraction of 51.4%, z-score −2.73. In the cohort of 77 patients, the negative predictive value of a normal ejection fraction given no exposure to total body irradiation or anthracyclines was 96.7% (95% confidence interval, 83.3–99.8%). Systolic dysfunction is relatively common in long-term survivors of paediatric hematopoietic stem cell transplantation who have received anthracyclines or total body irradiation. Survivors who are asymptomatic and did not receive radiation or anthracyclines likely do not require surveillance echocardiograms, unless otherwise indicated.
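The reported negative predictive value follows directly from the counts in the abstract; a quick arithmetic check (a sketch, assuming the 1 abnormal result among the 31 unexposed survivors is the relevant false negative):

```python
# Arithmetic check of the reported negative predictive value (NPV).
# Assumed reconstruction from the abstract: of 31 survivors with neither
# anthracycline nor total-body-irradiation exposure, 30 had a normal
# ejection fraction and 1 did not.
true_negatives = 30    # normal EF, no exposure
false_negatives = 1    # abnormal EF despite no exposure
npv = true_negatives / (true_negatives + false_negatives)
print(f"NPV = {npv:.3f}")  # 30/31, consistent with the reported 96.7% up to rounding
```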
Concentrations of total organic carbon (TOC), total petroleum hydrocarbons (TPH), polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs) were determined in 84 near-surface soils (5–20 cm depth) taken from a 255 km² area of Glasgow in the Clyde Basin, UK, during July 2011. The TPH range was 79–2,505 mg kg⁻¹ (mean 388 mg kg⁻¹; median 272 mg kg⁻¹), of which the aromatic fraction was 13–74% (mean 44%, median 43%) and saturates were 28–87% (mean 56%, median 57%). ∑16 PAH varied from 2 to 653 mg kg⁻¹ (mean 32.4 mg kg⁻¹; median 12.5 mg kg⁻¹) and the ∑31 PAH range was 2.47–852 mg kg⁻¹ (mean 45.4 mg kg⁻¹; median 19.0 mg kg⁻¹). The ∑PCB tri–hepta range was 2.2–1,052 μg kg⁻¹ (mean 32.4 μg kg⁻¹; median 12.7 μg kg⁻¹) and the ∑PCB7 range was 0.3–344 μg kg⁻¹ (mean 9.8 μg kg⁻¹; median 2.7 μg kg⁻¹). The concentration, distribution and source of the persistent organic pollutants were compared with those found in urban soils from other cities and with human health assessment criteria for chronic exposure to chemicals in soil. Total concentrations were generally similar to those in other urban areas with a similar industrial history. Benzo[a]pyrene concentrations were assessed against four different land-use scenarios (irrespective of current land use) using generic assessment criteria, with six of 84 samples exceeding the residential criteria. Isomeric PAH ratios and the relative abundance of perylene suggest multiple, environmentally modified pyrogenic PAH sources, inferred to be representative of diffuse pollution. ∑PCB7 concentrations exceeded the Dutch target value of 20 μg kg⁻¹ at 10% of sites. PCB congener profiles were environmentally attenuated and generally dominated by penta-, hexa- and hepta-chlorinated congeners.
To quantify the frequency and outcomes of receiving an antibiotic prescription upon discharge from the hospital to long-term care facilities (LTCFs).
Retrospective cohort study.
A 576-bed, academic hospital in Portland, Oregon.
Adult inpatients (≥18 years of age) discharged to an LTCF between January 1, 2012, and June 30, 2016.
Our primary outcome was receiving a systemic antibiotic prescription upon discharge to an LTCF. We also quantified the association between receiving an antibiotic prescription and 30-day hospital readmission, 30-day emergency department (ED) visit, and Clostridium difficile infection (CDI) on a readmission or ED visit at the index facility within 60 days of discharge.
Among 6,701 discharges to an LTCF, 22.9% were prescribed antibiotics upon discharge. The most prevalent antibiotic classes prescribed were cephalosporins (20.4%), fluoroquinolones (19.1%), and penicillins (16.7%). The medical records of ~82% of patients included a diagnosis code for a bacterial infection on the index admission. Among patients prescribed an antibiotic upon discharge, the incidence of 30-day hospital readmission to the index facility was 15.9%, the incidence of 30-day ED visit at the index facility was 11.0%, and the incidence of CDI on a readmission or ED visit within 60 days of discharge was 1.6%. Receiving an antibiotic prescription upon discharge was significantly associated with 30-day ED visits (adjusted odds ratio [aOR], 1.2; 95% confidence interval [CI], 1.02–1.5) and with CDI within 60 days (aOR, 1.7; 95% CI, 1.02–2.8) but not with 30-day readmissions (aOR, 1.01; 95% CI, 0.9–1.2).
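For readers unfamiliar with the effect measures quoted here, a crude odds ratio is the ratio of the odds of an outcome between exposed and unexposed groups; the aORs above come from a regression model with covariate adjustment, which plain arithmetic cannot reproduce. A purely illustrative sketch (the counts below are invented for illustration and are not the study's data):

```python
# Hypothetical 2x2 table (invented counts, NOT from the study):
#                outcome   no outcome
# exposed           20         80
# unexposed         10         90
exposed_outcome, exposed_none = 20, 80
unexposed_outcome, unexposed_none = 10, 90

# Crude odds ratio: odds of the outcome in the exposed group divided by
# the odds in the unexposed group.
crude_or = (exposed_outcome / exposed_none) / (unexposed_outcome / unexposed_none)
print(round(crude_or, 2))  # (20/80) / (10/90) = 2.25
```

An OR above 1 indicates higher odds of the outcome in the exposed group, which is how the reported aORs of 1.2 (ED visits) and 1.7 (CDI) should be read.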
Antibiotics were frequently prescribed upon discharge to LTCFs, which may be associated with increased risk of poor outcomes post discharge.
In 2017, dicamba-resistant (DR) soybean became commercially available to farmers in the United States. In August and September of 2017, a survey of 312 farmers from 60 Nebraska soybean-producing counties was conducted during extension field days or online. The objective of this survey was to understand farmers’ adoption and perceptions of DR soybean technology in Nebraska. The survey contained 16 questions and was divided into three parts: (1) demographics, (2) dicamba application in DR soybean, and (3) dicamba off-target injury to sensitive soybean cultivars. According to the results, 20% of soybean hectares represented by the survey were planted to DR soybean in 2017, and respondents expected this share to roughly double in 2018. Sixty-five percent of survey respondents own a sprayer and apply their own herbicide programs. More than 90% of respondents who adopted DR soybean technology reported significant improvement in weed control. Nearly 60% of respondents used dicamba alone or glyphosate plus dicamba for POST weed control in DR soybean; the remaining 40% added an additional herbicide with an alternative site of action (SOA) to the POST application. All survey respondents used one of the approved dicamba formulations for application in DR soybean. Survey results indicated that late POST dicamba applications (after late June) were more likely to result in injury to non-DR soybean than early POST applications (e.g., May and early June) in 2017. According to respondents, off-target dicamba movement resulted both from applications in DR soybean and from dicamba-based herbicides applied in corn. Although 51% of respondents noted dicamba injury on non-DR soybean, only 7% of those who noted injury filed an official complaint with the Nebraska Department of Agriculture.
Although DR soybean technology allowed farmers to achieve better weed control during 2017 than in previous growing seasons, it is apparent that off-target movement and resistance management must be addressed to maintain the viability and effectiveness of the technology in the future.
To evaluate whole-genome sequencing (WGS) as a molecular typing tool for MRSA outbreak investigation.
Investigation of MRSA colonization/infection in a neonatal intensive care unit (NICU) over 3 years (2014–2017).
Single-center level IV NICU.
NICU infants and healthcare workers (HCWs).
Infants were screened for MRSA using a swab of the anterior nares, axilla, and groin, initially by targeted (ring) screening, and later by universal weekly screening. Clinical cultures were collected as indicated. HCWs were screened once using swabs of the anterior nares. MRSA isolates were typed using WGS with core-genome multilocus sequence typing (cgMLST) analysis and by pulsed-field gel electrophoresis (PFGE). Colonized and infected infants and HCWs were decolonized. Control strategies included reinforcement of hand hygiene, use of contact precautions, cohorting, enhanced environmental cleaning, and remodeling of the NICU.
We identified 64 MRSA-positive infants: 53 (83%) by screening and 11 (17%) by clinical cultures. Of 85 screened HCWs, 5 (6%) were MRSA positive. WGS of MRSA isolates identified 2 large clusters (WGS groups 1 and 2), 1 small cluster (WGS group 3), and 8 unrelated isolates. PFGE failed to distinguish WGS group 2 and 3 isolates. WGS groups 1 and 2 were codistributed over time. HCW MRSA isolates were primarily in WGS group 1. New infant MRSA cases declined after implementation of the control interventions.
We identified 2 contemporaneous MRSA outbreaks alongside sporadic cases in a NICU. WGS was used to determine strain relatedness at a higher resolution than PFGE and was useful in guiding efforts to control MRSA transmission.
From 1565 to 1570, Spain established no fewer than three networks of presidios (fortified military settlements) across portions of its frontier territories in La Florida and New Spain. Juan Pardo's network of six forts, extending from the Atlantic coast over the Appalachian Mountains, was the least successful of these presidio systems, lasting only from late 1566 to early 1568. The failure of Pardo's defensive network has long been attributed to poor planning and an insufficient investment of resources. Yet recent archaeological discoveries at the Berry site in western North Carolina—the location of both the Native American town of Joara and Pardo's first garrison, Fort San Juan—warrant a reappraisal of this interpretation. While previous archaeological research at Berry concentrated on the domestic compound where Pardo's soldiers resided, the location of the fort itself remained unknown. In 2013, the remains of Fort San Juan were finally identified south of the compound, the first of Pardo's interior forts to be discovered by archaeologists. Data from excavations and geophysical surveys suggest that it was a substantial defensive construction. We attribute the failure of Pardo's network to the social geography of the Native South rather than to an insufficient investment of resources.
To test the feasibility of using telehealth to support antimicrobial stewardship at Veterans Affairs medical centers (VAMCs) that have limited access to infectious disease-trained specialists.
A prospective quasi-experimental pilot study.
Two rural VAMCs with acute-care and long-term care units.
At each intervention site, medical providers, pharmacists, infection preventionists, staff nurses, and off-site infectious disease physicians formed a videoconference antimicrobial stewardship team (VAST) that met weekly to discuss cases and antimicrobial stewardship-related education.
Descriptive measures included fidelity of implementation, number of cases discussed, infectious syndromes, types of recommendations, and acceptance rate of recommendations made by the VAST. Qualitative results stemmed from semi-structured interviews with VAST participants at the intervention sites.
Each site adapted the VAST to suit their local needs. On average, sites A and B discussed 3.5 and 3.1 cases per session, respectively. At site A, 98 of 140 cases (70%) were from the acute-care units; at site B, 59 of 119 cases (50%) were from the acute-care units. The most common clinical syndrome discussed was pneumonia or respiratory syndrome (41% and 35% for sites A and B, respectively). Providers implemented most VAST recommendations, with an acceptance rate of 73% (186 of 256 recommendations) and 65% (99 of 153 recommendations) at sites A and B, respectively. Qualitative results based on 24 interviews revealed that participants valued the multidisciplinary aspects of the VAST sessions and felt that it improved their antimicrobial stewardship efforts and patient care.
This pilot study has successfully demonstrated the feasibility of using telehealth to support antimicrobial stewardship at rural VAMCs with limited access to local infectious disease expertise.
SNPs in the vitamin D receptor (VDR) gene are associated with risk of lower respiratory infections. The influence of genetic variation in the vitamin D pathway on susceptibility to upper respiratory infections (URI) has not been investigated. We evaluated the influence of thirty-three SNPs in eleven vitamin D pathway genes (DBP, DHCR7, RXRA, CYP2R1, CYP27B1, CYP24A1, CYP3A4, CYP27A1, LRP2, CUBN and VDR) on URI risk in 725 adults in London, UK, using an additive model with adjustment for potential confounders and correction for multiple comparisons. Significant associations in this cohort were investigated in a validation cohort of 737 children in Manchester, UK. In all, three SNPs in VDR (rs4334089, rs11568820 and rs7970314) and one SNP in CYP3A4 (rs2740574) were associated with risk of URI in the discovery cohort after adjusting for potential confounders and correcting for multiple comparisons (adjusted incidence rate ratio per additional minor allele ≥1·15, P for trend ≤0·030). This association was replicated for rs4334089 in the validation cohort (P for trend = 0·048) but not for rs11568820, rs7970314 or rs2740574. Carriage of the minor allele of the rs4334089 SNP in VDR was thus associated with increased susceptibility to URI in both the child and adult cohorts in the United Kingdom.
Infants with prenatally diagnosed CHD are at high risk for adverse outcomes owing to multiple physiologic and psychosocial factors. Lack of immediate physical postnatal contact because of rapid initiation of medical therapy impairs maternal–infant bonding. On the basis of expected physiology, maternal–infant bonding may be safe for select cardiac diagnoses.
This is a single-centre study assessing the safety of maternal–infant bonding in prenatally diagnosed CHD.
In total, 157 fetuses with prenatally diagnosed CHD were reviewed. On the basis of cardiac diagnosis, 91 fetuses (58%) were prenatally approved for bonding and successfully bonded, 38 (24%) were prenatally approved but deemed not suitable for bonding at delivery, and 28 (18%) were not prenatally approved to bond. There were no complications attributable to bonding. Infants who successfully bonded had higher birth weight (3.26 versus 2.6 kg, p<0.001) and were delivered at later gestation (39 versus 38 weeks, p<0.001). Those unsuccessful at bonding were more likely to have been delivered via Caesarean section (74 versus 49%, p=0.011) and to have additional non-cardiac diagnoses (53 versus 29%, p=0.014). There was no significant difference in the need for cardiac intervention before hospital discharge. Infants who bonded had shorter hospital (7 versus 26 days, p=0.02) and ICU stays (5 versus 23 days, p=0.002) and higher survival (98 versus 76%, p<0.001).
Fetal echocardiography combined with a structured bonding programme can permit mothers and infants with select types of CHD to successfully bond before ICU admission and intervention.
The optimal approach to unifocalisation in pulmonary atresia with ventricular septal defect and major aortopulmonary collateral arteries remains controversial. Moreover, the impact of collateral vessel disease burden on surgical decision-making and late outcomes remains poorly defined. We investigated our centre’s experience in the surgical management of this lesion.
Materials and methods
Between 1996 and 2015, 84 consecutive patients with pulmonary atresia, ventricular septal defect, and major aortopulmonary collaterals underwent unifocalisation. In all, 41 patients received single-stage unifocalisation (Group 1) and 43 patients underwent multi-stage repair (Group 2). Preoperative collateral vessel anatomy, branch pulmonary artery reinterventions, ventricular septal defect status, and late right ventricle/left ventricle pressure ratio were evaluated.
Median follow-up was 4.8 compared with 5.7 years for Groups 1 and 2, respectively, p = 0.65. Median number of major aortopulmonary collaterals/patient was 3, ranging from 1 to 8, in Group 1 compared with 4, ranging from 1 to 8, in Group 2, p = 0.09. Group 2 had a higher number of lobar/segmental stenoses within collateral vessels (p = 0.02). Group 1 had fewer catheter-based branch pulmonary artery reinterventions, with 5 (inter-quartile range from 1 to 7) per patient, compared with 9 (inter-quartile range from 4 to 14) in Group 2, p = 0.009. Among patients who achieved ventricular septal defect closure, median right ventricle/left ventricle pressure was 0.48 in Group 1 compared with 0.78 in Group 2, p = 0.03. Overall mortality was 6 (17%) in Group 1 compared with 9 (21%) in Group 2.
Single-stage unifocalisation is a promising repair strategy in select patients, achieving low rates of reintervention for branch pulmonary artery restenosis and excellent mid-term haemodynamic outcomes. However, specific anatomic substrates of pulmonary atresia with ventricular septal defect and major aortopulmonary collaterals may be better suited to multi-stage repair. Preoperative evaluation of collateral vessel calibre and function may help inform more patient-specific surgical management.
Transcatheter right ventricle decompression in neonates with pulmonary atresia and intact ventricular septum is technically challenging, with risk of cardiac perforation and death. Further, despite successful right ventricle decompression, re-intervention on the pulmonary valve is common. The associations between technical factors during right ventricle decompression and the risks of complications and re-intervention are not well described.
This is a multicentre retrospective study among the participating centres of the Congenital Catheterization Research Collaborative. Between 2005 and 2015, all neonates with pulmonary atresia and intact ventricular septum and attempted transcatheter right ventricle decompression were included. Technical factors evaluated included the use and characteristics of radiofrequency energy, maximal balloon-to-pulmonary valve annulus ratio, infundibular diameter, and right ventricle systolic pressure before and after balloon pulmonary valvuloplasty (BPV). The primary end point was cardiac perforation or death; the secondary end point was re-intervention.
A total of 99 neonates underwent transcatheter right ventricle decompression at a median of 3 days (IQR 2–5) of age, including 63 patients by radiofrequency and 32 by wire perforation of the pulmonary valve. There were 32 complications including 10 (10.5%) cardiac perforations, of which two resulted in death. Cardiac perforation was associated with the use of radiofrequency (p=0.047), longer radiofrequency duration (3.5 versus 2.0 seconds, p=0.02), and higher maximal radiofrequency energy (7.5 versus 5.0 J, p<0.01) but not with patient weight (p=0.09), pulmonary valve diameter (p=0.23), or infundibular diameter (p=0.57). Re-intervention was performed in 36 patients and was associated with higher post-intervention right ventricle pressure (median 60 versus 50 mmHg, p=0.041) and residual valve gradient (median 15 versus 10 mmHg, p=0.046), but not with balloon-to-pulmonary valve annulus ratio, atmospheric pressure used during BPV, or the presence of a residual balloon waist during BPV. Re-intervention was not associated with any right ventricle anatomic characteristics, including pulmonary valve diameter.
Technical factors surrounding transcatheter right ventricle decompression in pulmonary atresia and intact ventricular septum influence the risk of procedural complications but not the risk of future re-intervention. Cardiac perforation is associated with the use of radiofrequency energy, as well as radiofrequency application characteristics. Re-intervention after right ventricle decompression for pulmonary atresia and intact ventricular septum is common and relates to haemodynamic measures surrounding initial BPV.