Copper is a commonly used metal in microelectronic interconnects due to its exceptional electrical and thermal properties. In 2.5D and 3D integration in particular, Cu is used in through-silicon vias (TSVs) and flip-chip interconnects between microelectronic chips, providing greater miniaturization, lower power, and higher performance than current 2D packaging approaches. SnAg-capped Cu pillars are a common high-density interconnect technology for flip-chip bonding. For these interconnects, properties of the Cu surface such as roughness and cleanliness are important factors in ensuring quality solder bumps. During electroplating, tight processing parameters must be met so that defects are avoided and high bump uniformity is achieved. An understanding of the interactions at the solder/Cu pillar interface, as a function of the electroplating parameters, is needed to determine the best method for populating solder on the wafer surface. In this study, surface treatments such as oxygen plasma cleaning were performed on the Cu surfaces, and the SnAg plating chemistry used to deposit the solder was evaluated through Hull cell testing to qualitatively determine the range of current densities to investigate. Current density during plating played a large role in solder bump deposition morphology: at current densities greater than 60 mA/cm², bump height non-uniformity and dendritic growth were observed, whereas at current densities of 60 mA/cm² or less, bump height was uniform and continuous.
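As a back-of-the-envelope illustration of the reported threshold (a hypothetical helper, not part of the authors' workflow), the current-density dependence of bump morphology can be summarised as:

```python
# Illustrative sketch only: maps plating current density to the bump
# morphology reported in this study. The 60 mA/cm^2 threshold is taken
# from the abstract; the function and its name are hypothetical.

def expected_bump_morphology(current_density_ma_cm2: float) -> str:
    """Morphology observed at a given plating current density (mA/cm^2)."""
    if current_density_ma_cm2 <= 60:
        return "uniform, continuous bump height"
    return "bump height non-uniformity and dendritic growth"

if __name__ == "__main__":
    for j in (40, 60, 80):
        print(f"{j} mA/cm^2 -> {expected_bump_morphology(j)}")
```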
Infants born preterm miss out on the peak period of in utero docosahexaenoic acid (DHA) accretion to the brain during the last trimester of pregnancy, which is hypothesized to contribute to the increased prevalence of neurodevelopmental deficits in this population. This study aimed to determine whether DHA supplementation in infants born preterm improves attention at 18 months' corrected age. This is a follow-up of a subset of infants who participated in the N3RO randomised controlled trial. Infants were randomised to receive an enteral emulsion of high-dose DHA (60 mg/kg/day) or no DHA (soy oil; control) from within the first days after birth until 36 weeks' post-menstrual age. The assessment of attention involved three tasks requiring the child to maintain attention on a toy or toys in either the presence or absence of competition or a distractor. The primary outcome was the child's latency of distractibility when attention was focused on a toy, and was available for 73 of the 120 infants eligible to participate. There was no evidence of a difference between groups in the latency of distractibility (adjusted mean difference: 0.08 s; 95% CI: -0.81, 0.97; P = 0.86). Enteral DHA supplementation did not improve attention in infants born preterm at 18 months' corrected age.
The purpose of this paper is to identify a workhorse mortality model for the adult age range (i.e., excluding the accident hump and younger ages). It applies the “general procedure” (GP) of Hunt & Blake [(2014), North American Actuarial Journal, 18, 116–138] to identify an age-period model that fits the data well before adding in a cohort effect that captures the residual year-of-birth effects arising in the original age-period model. The resulting model is intended to be suitable for a variety of populations, but economises on the number of period effects in comparison with a full implementation of the GP. We estimate the model using two different iterative maximum likelihood (ML) approaches – one Partial ML and the other Full ML – that avoid the need to specify identifiability constraints.
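As context for readers unfamiliar with the GP, the model class it searches over has the generic age-period-cohort structure below (an illustrative specification only; the paper's fitted model, link function, and identifiability treatment may differ in detail):

\[
\log m(x,t) \;=\; \alpha_x \;+\; \sum_{i=1}^{N} \beta_x^{(i)} \kappa_t^{(i)} \;+\; \gamma_{t-x},
\]

where \(m(x,t)\) is the central mortality rate at age \(x\) in year \(t\), \(\alpha_x\) is a static age function, each period effect \(\kappa_t^{(i)}\) is modulated in age by \(\beta_x^{(i)}\), and \(\gamma_{t-x}\) is the cohort effect added once the age-period terms have been identified.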
Radar surveys across ice sheets typically measure numerous englacial layers that can often be regarded as isochrones. Such layers are valuable for extrapolating age–depth relationships away from ice-core locations, reconstructing palaeoaccumulation variability, and investigating past ice-sheet dynamics. However, the use of englacial layers in Antarctica has been hampered by underdeveloped techniques for characterising layer continuity and geometry over large distances, with techniques developed independently and little opportunity for inter-comparison of results. In this paper, we present a methodology to assess the performance of automated layer-tracking and layer-dip-estimation algorithms through their ability to propagate a correct age–depth model. We use this to assess isochrone-tracking techniques applied to two test case datasets, selected from CreSIS MCoRDS data over Antarctica from a range of environments including low-dip, continuous layers and layers with terminations. We find that dip-estimation techniques are generally successful in tracking englacial dip but break down in the upper and lower regions of the ice sheet. The results of testing two previously published layer-tracking algorithms show that further development is required to attain a good constraint of age–depth relationships away from dated ice cores. We recommend that auto-tracking techniques focus on improved linking of picked stratigraphy across signal disruptions to enable accurate determination of the Antarctic-wide age–depth structure.
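The assessment principle can be made concrete with a small sketch (hypothetical data and function names; the paper's actual implementation is not shown here): an auto-tracked layer is dated at one core, and the tracker is scored by the age error the track implies at a second dated core.

```python
# Minimal sketch: score an auto-tracked isochrone by propagating an
# age-depth relationship from one dated core to another. All depths,
# ages, and picked layer positions below are made up for illustration.

import numpy as np

def age_at_depth(depths_m, ages_ka, query_depth_m):
    """Interpolate a core's age-depth model at a given depth."""
    return np.interp(query_depth_m, depths_m, ages_ka)

# Hypothetical age-depth models at two ice-core sites (depth in m, age in ka).
core_a_depths, core_a_ages = np.array([0, 500, 1000, 2000]), np.array([0.0, 4.0, 10.0, 30.0])
core_b_depths, core_b_ages = np.array([0, 500, 1000, 2000]), np.array([0.0, 3.0, 8.0, 25.0])

# An auto-tracked layer: its picked depth at core A and at core B.
layer_depth_at_a, layer_depth_at_b = 800.0, 950.0

# Date the layer at core A, then check the age the track implies at core B.
layer_age = age_at_depth(core_a_depths, core_a_ages, layer_depth_at_a)
implied_age_at_b = age_at_depth(core_b_depths, core_b_ages, layer_depth_at_b)
age_error_ka = implied_age_at_b - layer_age  # tracker performance metric
print(f"layer age {layer_age:.1f} ka; error at core B {age_error_ka:+.1f} ka")
```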
Cardiac catheterisations for CHD produce anxiety for patients and families. Current strategies to mitigate anxiety and explain complex anatomy include pre-procedure meetings and educational tools (cardiac diagrams, echocardiograms, imaging, and angiography). More recently, three-dimensionally printed patient-specific models can be added to the armamentarium. The purpose of this study was to evaluate the efficacy of pre-procedure meetings and of different educational tools to reduce patient and parent anxiety before a catheterisation.
Prospective study of patients ≥18 years and parents of patients <18 years scheduled for clinically indicated catheterisations. Participants completed online surveys before and after meeting with the interventional cardiologist, who was blinded to study participation. Both the pre- and post-meeting surveys measured anxiety using the State-Trait Anxiety Inventory. In addition, the post-meeting survey evaluated the subjective value (from 1 to 4) of individual educational tools: physician discussion, cardiac diagrams, echocardiograms, prior imaging, angiograms, and three-dimensionally printed cardiac models. Data were compared using paired t-tests.
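For readers unfamiliar with the analysis, a paired t-test of this kind can be run as follows (illustrative, made-up scores; not the study's data):

```python
# Illustrative paired t-test on hypothetical pre/post State-Trait
# Anxiety Inventory scores for the same participants.
import numpy as np
from scipy import stats

pre  = np.array([45, 38, 52, 41, 36, 48, 39, 44])  # hypothetical pre-meeting STAI
post = np.array([33, 30, 40, 35, 29, 37, 31, 34])  # hypothetical post-meeting STAI

t_stat, p_value = stats.ttest_rel(pre, post)  # paired (related-samples) t-test
print(f"mean change {np.mean(post - pre):.1f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```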
Twenty-three patients consented to participate; 16 had complete data for evaluation. Mean State-Trait Anxiety Inventory scores were abnormally elevated at baseline and decreased into the normal range after the pre-procedure meeting (39.8 versus 31, p = 0.008). Physician discussion, angiograms, and three-dimensional models were reported to be most effective at increasing understanding and reducing anxiety.
In this pilot study, we have found that pre-catheterisation meetings produce a measurable decrease in patient and family anxiety before a procedure. Discussions of the procedure, angiograms, and three-dimensionally printed cardiac models were the most effective educational tools.
Despite a rapidly growing understanding of hoarding disorder (HD), there has been relatively limited systematic research into the impact of hoarding on children and adolescents. The goal of this paper is to suggest future research directions, both for children with hoarding behaviours and children living in a cluttered home. Key areas reviewed in this paper include (1) the need for prospective studies of children with hoarding behaviours and those who grow up with a parent with HD; (2) downward extensions of cognitive-behavioural models of adult HD that emphasise different information processing and behavioural biases in youth HD; (3) developmental research into the presentation of emerging HD in childhood compared with adulthood presentations of the disorder, with consideration of typical childhood development and unique motivators for childhood saving behaviours; (4) developmentally sensitive screening and assessment; and (5) the development of evidence-based treatments for this population. The paper concludes with a discussion of methodological suggestions to meet these aims.
Humans are contributing to large carnivore declines around the globe, and conservation interventions should focus on increasing local stakeholder tolerance of carnivores and be informed by both biological and social considerations. In the Okavango Delta (Botswana), we tested new conservation strategies alongside a pre-existing government compensation programme. The new strategies included the construction of predator-proof livestock enclosures, the establishment of an early warning system linked to GPS satellite lion collars, depredation event investigations and educational programmes. We conducted pre- and post-assessments of villagers’ livestock management practices, attitudes towards carnivores and conservation, perceptions of human–carnivore coexistence and attitudes towards established conservation programmes. Livestock management levels were low; 50% of farmers lost livestock to carnivores, and 5–10% of owned stock was lost. Respondents had strong negative attitudes towards lions, which kill most depredated livestock. Following the new management interventions, tolerance of carnivores significantly increased, although tolerance of lions near villages did not. The number of respondents who believed that coexistence with carnivores was possible significantly increased. Respondents had negative attitudes towards the government-run compensation programme, citing low and late payments, but were supportive of the new management interventions. These efforts show that targeted, intensive management can increase stakeholder tolerance of carnivores.
Different mortality rates for different socio-economic groups within a population have been consistently reported throughout the years. In this study, we aim to exploit data from multiple public sources, including highly detailed cause-of-death data from the United States Centers for Disease Control and Prevention, to explore the mortality gap between the better and worse off in the US during the period 1989–2015, using education as a proxy.
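Schematically, the comparison amounts to computing group-specific death rates from death counts and population exposures and taking their difference; a minimal sketch with made-up numbers and hypothetical column names follows (the study's actual data processing is more involved):

```python
# Schematic of the mortality-gap comparison with hypothetical figures:
# death rates by education group (the socio-economic proxy) from death
# counts and person-years of exposure.
import pandas as pd

data = pd.DataFrame({
    "year":      [2010, 2010],
    "education": ["low", "high"],          # proxy for socio-economic group
    "deaths":    [12_500, 7_800],          # hypothetical CDC-style death counts
    "exposure":  [1_000_000, 1_200_000],   # hypothetical person-years at risk
})

data["mortality_rate"] = data["deaths"] / data["exposure"]
gap = (data.loc[data.education == "low", "mortality_rate"].values[0]
       - data.loc[data.education == "high", "mortality_rate"].values[0])
print(data)
print(f"mortality gap (low minus high education): {gap:.4f} per person-year")
```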
Historically, the cardiac catheterization laboratory has been used for blood sampling, contrast-enhanced imaging and intravascular pressure measurement to provide diagnostic and prognostic information and to guide surgical intervention. In recent years, technological advancements have made less invasive therapies feasible and driven tremendous growth in percutaneous procedures. While this now encompasses a wide range of cardiovascular interventions, this chapter will focus on percutaneous therapies for structural heart disease, where the anaesthetist is most likely to be involved.
The INSYTE study provides an understanding of the management of Parkinson disease psychosis (PDP) in actual practice settings, including the use of antipsychotics (APs) and their impact on clinical, economic, and humanistic outcomes. Treatment paradigms and the benefits/consequences of various “real world” PDP treatment strategies have not previously been evaluated. Thus, providers may be using a wide range of AP treatment strategies that contrast with consensus recommendations.
The INSYTE study is enrolling up to 750 patients from up to 100 sites in the US. Data are compiled at the baseline (BL) visit and from standard-of-care follow-up visits over 3 years. PDP treatment pathways are defined from 3 BL cohorts reflecting (1) no AP medication, (2) use of pimavanserin (PIM), or (3) other AP treatment. Information about the APs used is collected at each follow-up visit: history, duration, dose, adjustments, and the rationale for any adjustment of treatment. Outcome assessments (clinical, quality of life, disease burden) by the physician, patient, and caregiver are also collected. AP medication and outcomes data are analyzed for patients completing a BL and one follow-up visit (FU1).
For the 404 patients with BL and FU1 visits (mean 120.7 days from BL), 56.8% used no AP medications, 26.0% used PIM, and 13.6% used other APs at BL. The No Medication group was less severe on key BL disease parameters. Considering primary PDP treatments at BL and FU1 (including no treatment), 26 distinct pathways were being employed. Overall, 12.6% of patients had AP medication adjustments between the BL and FU1 visits, most frequently in the non-PIM group. Adjustments of APs took many forms: introduction of a single AP (64.7%), introduction of multiple APs (5.9%), switching to another AP (3.9%), decreasing the number of APs (5.9%), and discontinuation (19.6%).
Multiple, divergent AP treatment strategies for PDP exist in actual practice. No identifiable BL characteristics correlated with the broad range of AP treatment pathways. The numerous distinct AP treatment pathways utilized (n=26) reflect discordance with the updated 2019 MDS evidence-based recommendations, which recognize only 2 APs as “efficacious” and “clinically useful”: pimavanserin and clozapine. Education of healthcare professionals remains a priority for PDP management.
Understanding the effects of crop management practices on weed survival and seed production is imperative in improving long-term weed management strategies, especially for herbicide-resistant weed populations. Kochia [Bassia scoparia (L.) A.J. Scott] is an economically important weed in western North American cropping systems for many reasons, including prolific seed production and evolved resistance to numerous herbicide sites of action. Field studies were conducted in 2014 in a total of four field sites in Wyoming, Montana, and Nebraska to quantify the impact of different crop canopies and herbicide applications on B. scoparia density and seed production. Crops used in this study were spring wheat (Triticum aestivum L.), dry bean (Phaseolus vulgaris L.), sugar beet (Beta vulgaris L.), and corn (Zea mays L.). Herbicide treatments included either acetolactate synthase (ALS) inhibitors effective on non-resistant B. scoparia or a non–ALS inhibiting herbicide effective for both ALS-resistant and ALS-susceptible B. scoparia. Bassia scoparia density midseason was affected more by herbicide choice than by crop canopy, whereas B. scoparia seed production per plant was affected more by crop canopy compared with herbicide treatment. Our results suggest that crop canopy and herbicide treatments were both influential on B. scoparia seed production per unit area, which is likely a key indicator of long-term management success for this annual weed species. The lowest germinable seed production per unit area was observed in spring wheat treated with non–ALS inhibiting herbicides, and the greatest germinable seed production was observed in sugar beet treated with ALS-inhibiting herbicides. The combined effects of crop canopy and herbicide treatment can minimize B. scoparia establishment and seed production.
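The per-unit-area result follows from simple arithmetic: seed production per unit area is the product of plant density (driven mainly by herbicide choice) and seeds per plant (driven mainly by crop canopy), so suppressing either factor alone is not enough. A sketch with purely hypothetical numbers:

```python
# Illustrative arithmetic only (hypothetical numbers): seed production
# per unit area combines plant density, which herbicide choice mainly
# affected, with seeds per plant, which crop canopy mainly affected.

def seeds_per_m2(plants_per_m2: float, seeds_per_plant: float) -> float:
    return plants_per_m2 * seeds_per_plant

# Hypothetical scenarios echoing the reported extremes.
best  = seeds_per_m2(plants_per_m2=0.5, seeds_per_plant=200)    # e.g. wheat + non-ALS herbicide
worst = seeds_per_m2(plants_per_m2=5.0, seeds_per_plant=2_000)  # e.g. sugar beet + ALS herbicide
print(f"best case: {best:.0f} seeds/m^2; worst case: {worst:.0f} seeds/m^2")
```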
Executive functions (EF) drive health and educational outcomes and therefore are increasingly common treatment targets. Most treatment trials rely on questionnaires to capture meaningful change because ecologically valid, pediatric performance-based EF tasks are lacking. The Executive Function Challenge Task (EFCT) is a standardized, treatment-sensitive, objective measure which assesses flexibility and planning in the context of provocative social interactions, making it a “hot” EF task.
We investigate the structure, reliability, and validity of the EFCT in youth with autism spectrum disorder (n = 129), youth with attention deficit hyperactivity disorder and flexibility problems (n = 93), and typically developing (TD) youth (n = 52).
The EFCT can be coded reliably, has a two-factor structure (flexibility and planning), and shows adequate internal consistency and consistency across forms. Unlike a traditional performance-based EF task (verbal fluency), it correlates significantly with parent-reported EF, indicating ecological validity. EFCT performance distinguishes youth with known EF problems from TD youth and is not significantly related to visual pattern recognition or to social communication/understanding in autistic children.
The EFCT demonstrates adequate reliability and validity and may provide developmentally appropriate, treatment-sensitive, and ecologically valid assessment of “hot” EF in youth. It can be administered in controlled settings by masked administrators.
To assess whether the implementation of an intensive care unit (ICU) rounding checklist reduces the number of catheter-associated urinary tract infections (CAUTIs).
Retrospective before-and-after study that took place between March 2013 and February 2017.
A 16-bed mixed surgical, cardiac, and medical ICU in an academic community hospital.
Participants were all patients admitted to the adult mixed ICU who had a diagnosis of CAUTI.
Initiation of an ICU rounding checklist that prompts physicians to address any use of urinary catheters, with analyses comparing the preintervention period (before roll-out of the rounding checklist) with the postintervention periods.
There were 19 CAUTIs and 9,288 urinary catheter days (2.04 CAUTIs per 1,000 catheter days). The catheter utilization ratio increased in the first year after the intervention (0.67 vs 0.60; P = .0079), then decreased in the second year after the intervention (0.53 vs 0.60; P = .0992) and in the third year after the intervention (0.53 vs 0.60; P = .0224). The rate of CAUTI (ie, CAUTI per 1,000 urinary catheter days) decreased from 4.62 before the checklist was implemented to 2.12 in the first year after the intervention (P = .2104). The CAUTI rate was 0.45 in the second year (P = .0275) and 0.96 in the third year (P = .0532).
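The pooled rate is straightforward arithmetic from the counts reported above; a minimal check:

```python
# Reproducing the pooled rate reported above: CAUTIs per 1,000 urinary
# catheter days over the whole study period.
cautis = 19
catheter_days = 9_288

rate_per_1000 = cautis / catheter_days * 1_000
print(f"{rate_per_1000:.3f} CAUTIs per 1,000 catheter days")  # 2.046, reported as 2.04
```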
Our study suggests that utilization of a daily rounding checklist is associated with a decrease in the rates of CAUTI in ICU patients. Incorporating a rounding checklist is feasible in the ICU.
To measure caregivers’ and clinicians’ perception of false memories in the lives of patients with memory loss due to Alzheimer’s disease (AD) and mild cognitive impairment (MCI) using a novel false memories questionnaire. Our hypothesis was that, according to clinicians and family members, false memories occur as often as forgetting.
This prospective, questionnaire-based study used 20 false-memory questions paired with 20 forgetting questions, in two forms: one for clinicians and one for family members of older subjects. In total, 226 clinicians and 150 family members of 49 patients with AD, 44 patients with MCI, and 57 healthy older controls (OCs) completed the questionnaire.
False memories occurred nearly as often as forgetting, according to clinicians and family members of patients with MCI and AD. Family members of OCs and of patients with MCI reported fewer false memories than those of the AD group. As Mini-Mental State Examination scores decreased, mean scores increased for both forgetting and false memories. Among clinicians, the dementia severity of the patients they saw correlated with both the forgetting and false-memory questionnaire scores, as well as with the reported impact of forgetting and false memories on daily life.
Patients with AD experience false memories almost as frequently as they do forgetting. Given how common false memories are in AD patients, additional work is needed to understand the clinical implications of these false memories on patients’ daily lives. The novel false memories questionnaire developed may be a valuable tool.
For outbreaks of gastrointestinal disease, rapid identification of the source is crucial to enable public health intervention and prevent further cases. Outbreak investigation comprises analysis of exposure information from cases and, if required, analytical epidemiological studies. Hypothesis generation has been reliant on empirical knowledge of exposures historically associated with a given pathogen. Epidemiological studies are resource-intensive and prone to bias, partly because of the difficulty of recruiting appropriate controls. For this paper, information from cases was compared against pre-defined background exposure information. Three past outbreaks were used as exemplars: one involving a common exposure and two involving rare exposures. Information from historical case trawling questionnaires was used to define background exposure, after removing any exposures implicated in the outbreaks. The case-background approach showed good sensitivity and specificity, correctly identifying all outbreak-related exposures. One additional exposure related to a retailer was identified, along with four food items to which all cases had been exposed. In conclusion, the case-background method, a development of the case-case design, can be used to assist with hypothesis generation or when a case-control study is not possible to carry out.
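The mechanics of the case-background comparison can be sketched as follows (hypothetical exposures, counts, and background frequencies; the paper's actual statistical procedure may differ): each exposure's frequency among cases is tested against its pre-defined background frequency, and exposures reported far more often than background are flagged as hypotheses.

```python
# Minimal sketch of the case-background idea with made-up data: flag
# exposures whose frequency among cases exceeds the pre-defined
# background exposure frequency.
from scipy import stats

n_cases = 30
background = {"chicken": 0.65, "raw_milk": 0.02, "watercress": 0.05}  # hypothetical background frequencies
case_exposed = {"chicken": 24, "raw_milk": 11, "watercress": 13}       # hypothetical counts among cases

for exposure, k in case_exposed.items():
    p0 = background[exposure]
    # One-sided binomial test: are cases exposed more often than background?
    p_value = stats.binomtest(k, n_cases, p0, alternative="greater").pvalue
    print(f"{exposure}: {k}/{n_cases} exposed vs background {p0:.0%}, p = {p_value:.4f}")
```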
“Temporal plus” epilepsy (TPE) is a term that is used when the epileptogenic zone (EZ) extends beyond the boundaries of the temporal lobe. Stereotactic electroencephalography (SEEG) has been essential to identify additional EZs in adjacent structures that might be part of the temporal lobe/limbic network.
We present a small case series of TPE cases successfully identified by SEEG in which patients were seizure-free after resective surgery.
We conducted a retrospective analysis of 156 patients who underwent SEEG over a 5-year period. Six cases had TPE and underwent anterior temporal lobectomy (ATL) with additional extra-temporal resections.
Five cases had a focus in the right hemisphere and one in the left. Three cases were non-lesional and three were lesional. Mean follow-up time since surgery was 2.9 years (SD ± 1.8). Three patients had subdural electrode investigations prior to or in addition to SEEG. All patients underwent standard ATL and additional extra-temporal resections during the same procedure or at a later date. All patients were seizure-free at their last follow-up appointment (Engel Ia = 3; Engel Ib = 2; Engel Ic = 1). Pathology was nonspecific/gliosis in all six cases.
TPE might explain some of the failures in temporal lobe epilepsy surgery. We present a small case series of six patients in whom SEEG successfully identified this phenomenon and surgery proved effective.