Many industrial applications require finding solutions to challenging combinatorial problems. Efficient elimination of symmetric solution candidates is one of the key enablers for high-performance solving. However, existing model-based approaches for symmetry breaking are limited to problems for which a set of representative and easily solvable instances is available, which is often not the case in practical applications. This work extends the learning framework and implementation of a model-based approach for Answer Set Programming to overcome these limitations and address challenging problems, such as the Partner Units Problem. In particular, we incorporate a new conflict analysis algorithm in the Inductive Logic Programming system ILASP, redefine the learning task, and suggest a new example generation method to scale up the approach. The experiments conducted for different kinds of Partner Units Problem instances demonstrate the applicability of our approach and the computational benefits due to the first-order constraints learned.
Recurrent laryngeal nerve injury leading to vocal cord paralysis is a known complication of cardiothoracic surgery. Its occurrence during interventional catheterisation procedures has been documented in case reports, but there have been no studies to determine an incidence.
Objective:
To establish the incidence of left recurrent laryngeal nerve injury leading to vocal cord paralysis after left pulmonary artery stenting, patent ductus arteriosus device closure and the combination of the procedures either consecutively or simultaneously.
Methods:
Members of the Congenital Cardiovascular Interventional Study Consortium were asked to perform a retrospective analysis to identify cases of recurrent laryngeal nerve injury after the aforementioned procedures. Twelve institutions participated in the analysis. They also contributed the total number of each procedure performed at their respective institutions for statistical purposes.
Results:
Of the 1337 patients who underwent left pulmonary artery stent placement, six patients (0.45%) had confirmed vocal cord paralysis. A total of 4001 patients underwent patent ductus arteriosus device closure, and two patients (0.05%) developed left vocal cord paralysis. Patients who underwent both left pulmonary artery stent placement and patent ductus arteriosus device closure had the highest incidence of vocal cord paralysis, which occurred in 4 of the 26 patients (15.4%). Overall, 92% of affected patients in our study population had resolution of symptoms.
Conclusion:
Recurrent laryngeal nerve injury is a rare complication of left pulmonary artery stent placement or patent ductus arteriosus device closure. However, the incidence is highest in patients undergoing both procedures either consecutively or simultaneously. Additional research is necessary to determine contributing factors that might reduce the risk of recurrent laryngeal nerve injury.
The goal of inductive logic programming (ILP) is to learn a program that explains a set of examples. Until recently, most research on ILP targeted learning Prolog programs. The ILASP system instead learns answer set programs (ASP). Learning such expressive programs widens the applicability of ILP considerably; for example, enabling preference learning, learning common-sense knowledge, including defaults and exceptions, and learning non-deterministic theories. Early versions of ILASP can be considered meta-level ILP approaches, which encode a learning task as a logic program and delegate the search to an ASP solver. More recently, ILASP has shifted towards a new method, inspired by conflict-driven SAT and ASP solvers. The fundamental idea of the approach, called Conflict-driven ILP (CDILP), is to iteratively interleave the search for a hypothesis with the generation of constraints which explain why the current hypothesis does not cover a particular example. These coverage constraints allow ILASP to rule out not just the current hypothesis, but an entire class of hypotheses that do not satisfy the coverage constraint. This article formalises the CDILP approach and presents the ILASP3 and ILASP4 systems for CDILP, which are demonstrated to be more scalable than previous ILASP systems, particularly in the presence of noise.
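The iterative interleaving at the heart of CDILP can be illustrated with a toy sketch. Everything here is simplified for illustration: hypotheses are plain sets, "coverage" is set membership, and the conflict-analysis step just records a predicate; none of this is the actual ILASP representation or API.

```python
# Toy, self-contained sketch of the conflict-driven ILP (CDILP) loop described
# above: alternate hypothesis search with the generation of coverage
# constraints that explain why a candidate fails a particular example.

def cdilp(hypothesis_space, examples, covers):
    """Interleave hypothesis search with coverage-constraint generation."""
    constraints = []  # each constraint is a predicate a hypothesis must satisfy
    while True:
        # Find a candidate consistent with every constraint collected so far.
        candidate = next(
            (h for h in hypothesis_space if all(c(h) for c in constraints)), None)
        if candidate is None:
            return None  # search space exhausted: no covering hypothesis
        # Look for an example the candidate fails to cover.
        failed = next((e for e in examples if not covers(candidate, e)), None)
        if failed is None:
            return candidate  # all examples covered
        # "Conflict analysis": a coverage constraint ruling out the entire
        # class of hypotheses that fail this example in the same way.
        constraints.append(lambda h, e=failed: covers(h, e))

# Toy instance: a hypothesis "covers" an example if it contains it.
space = [{1}, {2}, {1, 2}, {1, 2, 3}]
result = cdilp(space, examples=[1, 2], covers=lambda h, e: e in h)
print(result)  # -> {1, 2}
```

The pay-off the abstract describes is visible even here: each constraint prunes not just the failed candidate but every later candidate that fails the same example.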
Multicentre research databases can provide insights into healthcare processes to improve outcomes and make practice recommendations for novel approaches. Effective audits can establish a framework for reporting research efforts, ensuring accurate reporting, and spearheading quality improvement. Although a variety of data auditing models and standards exist, barriers to effective auditing including costs, regulatory requirements, travel, and design complexity must be considered.
Materials and methods:
The Congenital Cardiac Research Collaborative (CCRC) conducted a virtual data training initiative and remote source data verification audit on a retrospective multicentre dataset. CCRC investigators across nine institutions were trained to extract and enter data into a robust dataset on patients with tetralogy of Fallot who required neonatal intervention. Centres provided de-identified source files for a randomised 10% patient sample audit. Key auditing variables, discrepancy types, and severity levels were analysed across two study groups: primary repair and staged repair.
Results:
Of the total 572 study patients, data from 58 patients (31 staged repairs and 27 primary repairs) were source data verified. Amongst the 1790 variables audited, 45 discrepancies were discovered, resulting in an overall accuracy rate of 97.5%. High accuracy rates were consistent across all CCRC institutions, ranging from 94.6% to 99.4%, and were reported for both minor (1.5%) and major (1.1%) discrepancy classifications.
Conclusion:
Findings indicate that implementing a virtual multicentre training initiative and remote source data verification audit can identify data quality concerns and produce a reliable, high-quality dataset. Remote auditing capacity is especially important during the current COVID-19 pandemic.
Cardiac fibromas are primary cardiac tumours that are more common in children than in adults. Surgical intervention is often not required except in the case of limited cardiac output or significant arrhythmia burden. We present a symptomatic 3-month-old infant who underwent successful surgical intervention for a giant right ventricle fibroma found on prenatal imaging.
According to Ontario, Canada’s Basic Life Support Patient Care Standards, Emergency Medical Services (EMS) on-scene time (OST) for trauma calls should not exceed 10 minutes, unless there are extenuating circumstances. The time to definitive care can have a significant impact on the morbidity and mortality of trauma patients. This is the first Canadian study to investigate why this is the case by giving a voice to those most involved in prehospital care: the paramedics themselves. It is also the first study to explore this issue from a complex, adaptive systems approach which recognizes that OSTs may be impacted by local, contextual features.
Problem
Research addressed the following problem: what are the facilitators and barriers to achieving 10-minute OSTs?
Methods
This project used a descriptive, qualitative design to examine facilitators and barriers to achieving 10-minute OSTs on trauma calls, from the perspective of paramedics. Paramedics from a regional Emergency Services organization were interviewed over the course of one year, using qualitative interviewing techniques developed by experts in that field. All interviews were recorded, transcribed, and entered into NVivo for Mac (QSR International; Victoria, Australia), software that supports qualitative research, for ease of data analysis. Researcher triangulation was used to ensure credibility of the data.
Results
Thirteen percent of the calls had OSTs that were less than 10 minutes. The following six categories were outlined by the paramedics as impacting the duration of OSTs: (1) scene characteristics; (2) the presence and effectiveness of allied services; (3) communication with dispatch; (4) the paramedics’ ability to effectively manage the scene; (5) current policies; and (6) the quantity and design of equipment.
Conclusion
These findings demonstrate the complexity of the prehospital environment and bring into question the feasibility of the 10-minute OST standard.
Levitan M, Law MP, Ferron R, Lutz-Graul K. Paramedics' Perspectives on Factors Impacting On-Scene Times for Trauma Calls. Prehosp Disaster Med. 2018;33(3):250–255.
In this article, we present a case of a desaturated Fontan patient with an infra-diaphragmatic venous collateral to the pulmonary vein, which was too tortuous to attempt closure at the source. A trans-septal approach was successfully used to close the collateral in a retrograde manner.
In recent years, several frameworks and systems have been proposed that extend Inductive Logic Programming (ILP) to the Answer Set Programming (ASP) paradigm. In ILP, examples must all be explained by a hypothesis together with a given background knowledge. In existing systems, the background knowledge is the same for all examples; however, examples may be context-dependent. This means that some examples should be explained in the context of some information, whereas others should be explained in different contexts. In this paper, we capture this notion and present a context-dependent extension of the Learning from Ordered Answer Sets framework. In this extension, contexts can be used to further structure the background knowledge. We then propose a new iterative algorithm, ILASP2i, which exploits this feature to scale up the existing ILASP2 system to learning tasks with large numbers of examples. We demonstrate the gain in scalability by applying both algorithms to various learning tasks. Our results show that, compared to ILASP2, the newly proposed ILASP2i system can be two orders of magnitude faster and use two orders of magnitude less memory, whilst preserving the same average accuracy.
In the United States alone, ∼14,000 children are hospitalised annually with acute heart failure. The science and art of caring for these patients continue to evolve. The International Pediatric Heart Failure Summit of Johns Hopkins All Children’s Heart Institute was held on February 4 and 5, 2015. The Summit was funded through the Andrews/Daicoff Cardiovascular Program Endowment, a philanthropic collaboration between All Children’s Hospital and the Morsani College of Medicine at the University of South Florida (USF). Sponsored by the All Children’s Hospital Andrews/Daicoff Cardiovascular Program, the Summit assembled leaders in clinical and scientific disciplines related to paediatric heart failure and created a multi-disciplinary “think-tank”. The purpose of this manuscript is to summarise the lessons of the 2015 Summit, to describe the “state of the art” of the treatment of paediatric cardiac failure, and to discuss future directions for research in this domain.
This paper contributes to the area of inductive logic programming by presenting a new learning framework that allows the learning of weak constraints in Answer Set Programming (ASP). The framework, called Learning from Ordered Answer Sets, generalises our previous work on learning ASP programs without weak constraints, by considering a new notion of examples as ordered pairs of partial answer sets that exemplify which answer sets of a learned hypothesis (together with a given background knowledge) are preferred to others. In this new learning task inductive solutions are searched within a hypothesis space of normal rules, choice rules, and hard and weak constraints. We propose a new algorithm, ILASP2, which is sound and complete with respect to our new learning framework. We investigate its applicability to learning preferences in an interview scheduling problem and also demonstrate that when restricted to the task of learning ASP programs without weak constraints, ILASP2 can be much more efficient than our previously proposed system.
This article focuses attention on research examining workplace discrimination against employees from marginalized groups. We particularly consider the experiences of seven different groups of marginalized individuals, some of which have legal protection and some of which do not but all of whom we feel have been overlooked by the field of industrial–organizational (I–O) psychology. We briefly describe the importance of studying each group and then delineate the brief amount of research that has been conducted. Finally, we make recommendations for I–O psychologists in terms of research and advocacy. Overall, we argue that I–O psychologists are missing an opportunity to be at the forefront of understanding and instigating changes that would result in maximizing the fairness and optimization of these often forgotten employees and their experiences in the workplace.
The Australian Square Kilometre Array Pathfinder (ASKAP) will give us an unprecedented opportunity to investigate the transient sky at radio wavelengths. In this paper we present VAST, an ASKAP survey for Variables and Slow Transients. VAST will exploit the wide-field survey capabilities of ASKAP to enable the discovery and investigation of variable and transient phenomena from the local to the cosmological, including flare stars, intermittent pulsars, X-ray binaries, magnetars, extreme scattering events, interstellar scintillation, radio supernovae, and orphan afterglows of gamma-ray bursts. In addition, it will allow us to probe unexplored regions of parameter space where new classes of transient sources may be detected. In this paper we review the known radio transient and variable populations and the current results from blind radio surveys. We outline a comprehensive program based on a multi-tiered survey strategy to characterise the radio transient sky through detection and monitoring of transient and variable sources on the ASKAP imaging timescales of 5 s and greater. We also present an analysis of the expected source populations that we will be able to detect with VAST.
We investigated AlAs0.56Sb0.44 epitaxial layers lattice-matched to InP grown by molecular beam epitaxy (MBE). Silicon (Si) and tellurium (Te) were studied as n-type dopants in AlAs0.56Sb0.44 material. Similar to most Sb-based materials, AlAs0.56Sb0.44 demonstrates a maximum active carrier concentration around low-10¹⁸ cm⁻³ when using Te as a dopant. We propose the use of a heavily Si-doped InAlAs layer embedded in the AlAsSb barrier as a modulation-doped layer. The In0.53Ga0.47As/AlAs0.56Sb0.44 double heterostructures with a 10 nm InGaAs well show an electron mobility of about 9400 cm²/V·s at 295 K and 32000 cm²/V·s at 46 K. A thinner 5 nm InGaAs well has an electron mobility of about 4300 cm²/V·s at 295 K. This study demonstrates that AlAs0.56Sb0.44 is a promising barrier material for highly scaled InGaAs MOSFETs and HEMTs.
Level set methods have been used for solid phase epitaxial regrowth, etching, and deposition. This study models the growth of nickel silicide accurately using the level set method. NiSi growth has been observed to follow a linear-parabolic law which takes into account both diffusion and interfacial reaction. This linear-parabolic system is very similar to the Deal and Grove model of SiO2 growth, and uses similar diffusion transport and reaction rate equations. The simulation models the growth of the silicide by coupling diffusion solutions to level-set techniques. Dual level sets were used for propagation of the top and bottom interfaces of the silicide; velocities were estimated from the nickel concentrations at both interfaces as well as the diffusivity and reaction rate of nickel. This is important for predicting the precise shape of the silicide, which allows current crowding and field-focusing effects to be modeled in transport out of the intrinsic device into the contacting layers. These simulation models can be used for the latest technology nodes (45, 32, and 22 nm) and for special devices such as FinFETs. The level set method has been successfully implemented and verified in the Florida Object Oriented Process Simulator, and the growth shapes match well with Transmission Electron Microscopy data from the literature.
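The linear-parabolic law referenced above can be made concrete with a short numeric sketch of the Deal–Grove-type relation x² + A·x = B·t, where B is the parabolic (diffusion-limited) rate constant and B/A the linear (reaction-limited) rate constant. The numeric values below are arbitrary illustrations, not measured NiSi parameters.

```python
# Minimal sketch of linear-parabolic growth: solve x^2 + A*x = B*t for the
# positive root x(t). Illustrative constants only, not fitted NiSi data.
import math

def thickness(t, A, B):
    """Silicide thickness x(t) from x^2 + A*x = B*t (positive root)."""
    return (A / 2.0) * (math.sqrt(1.0 + 4.0 * B * t / A**2) - 1.0)

# Limiting behaviour: at short times x ~ (B/A)*t (reaction limited);
# at long times x ~ sqrt(B*t) (diffusion limited).
A, B = 0.1, 0.01  # arbitrary units
x = thickness(1.0, A, B)
```

Coupling such interface velocities to a level-set representation is what lets the simulation track arbitrary silicide shapes rather than only planar films.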
Plate tectonics is the kinematic theory that describes the large-scale motions and events of the outermost shell of the solid Earth in terms of the relative motions and interactions of large, rigid, interlocking fragments of lithosphere called tectonic plates. Plates form and disappear incrementally over time as a result of tectonic processes. There are currently about a dozen major plates on the surface of the Earth, and many minor ones. The present-day configuration of tectonic plates is illustrated in Figure 7.1. As the interlocking plates move relative to each other, they interact at plate boundaries, where adjacent plates collide, diverge, or slide past each other. The interactions of plates result in a variety of observable surface phenomena, including the occurrence of earthquakes and the formation of large-scale surface features such as mountains, sedimentary basins, volcanoes, island arcs, and deep ocean trenches. In turn, the appearance of these phenomena and surface features indicates the location of plate boundaries. For a detailed review of the theory of plate tectonics, consult Wessel and Müller (2007).
A plate-tectonic reconstruction is the calculation of positions and orientations of tectonic plates at an instant in the history of the Earth. The visualization of reconstructions is a valuable tool for understanding the evolution of the systems and processes of the Earth's surface and near subsurface. Geological and geophysical features may be “embedded” in the simulated plates, to be reconstructed along with the plates, enabling a researcher to trace the motions of these features through time.
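The core calculation behind such a reconstruction is a finite rotation of plate points about an Euler pole on the sphere. The sketch below is a minimal illustration using Rodrigues' rotation formula; the pole and angle values are invented for the example and do not correspond to any published plate rotation.

```python
# Rotate a point on the unit sphere about an Euler pole by a finite angle
# (Rodrigues' formula) -- the basic step of a plate-tectonic reconstruction.
import math

def to_xyz(lat, lon):
    """Geographic coordinates (degrees) to a unit Cartesian vector."""
    la, lo = math.radians(lat), math.radians(lon)
    return (math.cos(la) * math.cos(lo), math.cos(la) * math.sin(lo), math.sin(la))

def rotate(point, pole, angle_deg):
    """Rotate unit vector `point` about unit vector `pole` by angle_deg."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    dot = sum(p * q for p, q in zip(pole, point))
    cross = (pole[1] * point[2] - pole[2] * point[1],
             pole[2] * point[0] - pole[0] * point[2],
             pole[0] * point[1] - pole[1] * point[0])
    return tuple(c * pt + s * cr + (1 - c) * dot * pl
                 for pt, cr, pl in zip(point, cross, pole))

# Rotating (lat 0, lon 0) by 90 degrees about the north pole shifts the
# longitude to 90 degrees east, i.e. the vector (0, 1, 0).
p = rotate(to_xyz(0.0, 0.0), to_xyz(90.0, 0.0), 90.0)
```

Real reconstructions compose many such rotations over time, and embedded geological features are carried along by applying the same rotation to their coordinates.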
We present photometry and spectroscopy of the peculiar Type II supernova SN 2010jp, also named PTF10aaxi. The light curve exhibits a linear decline with a relatively low peak absolute magnitude of only −15.9 (unfiltered), and a low radioactive decay luminosity at late times that suggests a low synthesized nickel mass of about 0.003 M⊙ or less. Spectra of SN 2010jp display an unprecedented triple-peaked Hα line profile, showing: (1) a narrow central component that suggests shock interaction with a dense circumstellar medium (CSM); (2) high-velocity blue and red emission features centered at −12,600 and +15,400 km s⁻¹; and (3) very broad wings extending from −22,000 to +25,000 km s⁻¹. We propose that this line profile indicates a bipolar jet-driven explosion, with the central component produced by normal SN ejecta and CSM interaction at mid and low latitudes, while the high-velocity bumps and broad line wings arise in a nonrelativistic bipolar jet. Jet-driven SNe II are predicted for collapsars resulting from a wide range of initial masses above 25 M⊙, especially at the sub-solar metallicity consistent with the SN host environment. It also seems consistent with the apparently low ⁵⁶Ni mass that may accompany black hole formation. We speculate that the jet survives to produce observable signatures because the star's H envelope was very low mass, having been mostly stripped away by the previous eruptive mass loss.
Flash-assist Rapid Thermal Processing (RTP) presents an opportunity to investigate annealing time and temperature regimes which were previously not accessible with conventional annealing techniques such as Rapid Thermal Annealing (RTA). This provides a unique opportunity to explore the early stages of the End of Range (EOR) damage evolution and also to examine how the damage evolves during the high-temperature portion of the temperature profile. However, the nature of Flash-assist RTP makes it extremely difficult to reasonably compare it to alternative annealing techniques, largely because the annealing time at a given temperature is dictated by the FWHM of the radiation pulse. The FWHM for current flash tools varies between 0.85 and 1.38 milliseconds, which is three orders of magnitude smaller than that required for an RTA to achieve similar temperatures. Traditionally, the kinetics of extended defects have been studied through time-dependent studies utilizing isothermal anneals, in which specific defect structures could be isolated. The characteristics of Flash-assist RTP do not allow for such investigations, in which the EOR defect evolution could be closely tracked with time. Since the annealing time at the target temperature for Flash-assist RTP is essentially fixed to very small times on the order of milliseconds, isochronal anneals are a logical experimental approach to temperature-dependent studies. This fact presents a challenge in the data analysis and comparison. Another feature of Flash-assist RTP which makes the analysis complex is the ramp time relative to the dwell time spent at the peak fRTP temperature. As the flash anneal temperature is increased, the total ramp time can exceed the dwell time at the peak temperature, which may play a significantly larger role in dictating the final material properties.
The inherent characteristics of Flash-assist RTP have consequently required the development of another approach to analyzing the attainable experimental data, such that a meaningful comparison could be made to past studies. The adopted analysis entails the selection of a reference anneal, from which the decay in the trapped interstitial density can be tracked with the flash anneal temperature, allowing for the kinetics of the interstitial decay to be extracted.
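An isochronal analysis of this kind typically ends in an Arrhenius fit: the decay in trapped interstitial density (relative to the reference anneal) is plotted against inverse temperature to extract an activation energy. The sketch below uses synthetic numbers, not measured data, and a 2.0 eV activation energy invented purely so the fit has a known answer to recover.

```python
# Illustrative Arrhenius extraction: fit ln(decay) versus 1/(kT) over a set
# of isochronal anneal temperatures; the negated slope is the activation
# energy. All values are synthetic.
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_slope(temps_K, decays):
    """Least-squares slope of ln(decay) vs 1/(kT)."""
    xs = [1.0 / (K_B * t) for t in temps_K]
    ys = [math.log(d) for d in decays]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic decays generated with a 2.0 eV activation energy are recovered:
temps = [1100.0, 1200.0, 1300.0]
decays = [math.exp(-2.0 / (K_B * t)) for t in temps]
Ea = -arrhenius_slope(temps, decays)
```

With real flash data the decays would come from the measured trapped interstitial densities at each peak temperature, normalized to the reference anneal.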
As millisecond annealing is increasingly utilized, the as-implanted profile dominates the final dopant distribution. We characterized boron diffusion in amorphous silicon prior to post-implantation annealing. SIMS confirmed that both fluorine and germanium enhance boron motion in amorphous materials. The magnitude of boron diffusion in germanium-amorphized silicon scales with increasing fluorine dose. Boron atoms are mobile at concentrations approaching 1×10¹⁹ atoms/cm³. It appears that defects inherent to the structure of amorphous silicon can trap and immobilize boron atoms at room temperature, but that chemical reactions involving Si–F and Si–Ge eliminate potential trapping sites. Sequential Ge+, F+, and B+ implants result in 80% more boron motion than do sequential Si+, F+, and B+ implants. The mobile boron dose and trapping site concentration change as functions of the fluorine dose through power-law relationships. As the fluorine dose increases, the trapping site population decreases and the mobile boron dose increases. This reduction in trap density can result in as-implanted “junction depths” (taken at 1×10¹⁸ atoms/cm³) that are as much as 75% deeper for samples implanted with 500 eV, 1×10¹⁵ atoms/cm² boron.
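Power-law relationships like the ones reported above are usually extracted by a linear fit in log-log space. The sketch below demonstrates that extraction on synthetic numbers; the exponent and prefactor are invented for illustration and are not the fitted values from this work.

```python
# Hedged sketch: recover the prefactor a and exponent b of y = a * x**b
# (e.g. mobile boron dose vs. fluorine dose) by least squares on log-log data.
# All doses below are synthetic.
import math

def fit_power_law(xs, ys):
    """Fit y = a * x**b in log-log space; return (a, b)."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic data following y = 1e3 * x**0.5 is recovered exactly:
doses = [1e13, 1e14, 1e15]
mobile = [1e3 * d ** 0.5 for d in doses]
a, b = fit_power_law(doses, mobile)
```

On measured data the fit residuals would also indicate how well a single power law actually describes the dose dependence.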