The science of studying diamond inclusions for understanding Earth history has developed significantly over the past decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories contained within diamond inclusions. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth’s history: how the geochemistry of diamonds and their inclusions informs us about that cycle, the origin of diamonds in Earth’s mantle, and the evolution of diamonds through time.
The German Twin Family Panel (TwinLife) is a longitudinal study of monozygotic and dizygotic same-sex twin pairs and their families that was designed to investigate the development of social inequalities over the life course. The study covers an observation period from approximately 2014 to 2023. The target population comprises reared-together twins from four age cohorts, born in 2009/2010 (cohort 1), 2003/2004 (cohort 2), 1997/1998 (cohort 3), and between 1990 and 1993 (cohort 4). In the first wave, the study included data on 4097 twin families. Families were recruited in all parts of Germany, so the sample comprises the whole range of the educational, occupational and income structure. As of 2019, two face-to-face, at-home interviews and two telephone interviews have been conducted. Data from the first home and telephone interviews are already available free of charge as a scientific use-file from the GESIS data archive. This report aims to provide an overview of the study sample and design, as well as constructs that are unique in TwinLife in comparison with previous twin studies — such as an assessment of cognitive abilities or information based on the children’s medical records and report cards. In addition, major findings based on the data already released are summarized, and future directions of the study are presented and discussed.
Childhood disruptive behaviors are highly prevalent and associated with adverse long-term social and economic outcomes. Trajectories of welfare receipt in early adulthood and the association of childhood behaviors with high welfare receipt trajectories have not been examined.
Boys (n = 1000) from low socioeconomic backgrounds were assessed by kindergarten teachers for inattention, hyperactivity, aggression, opposition, and prosociality, and prospectively followed up for 30 years. We used group-based trajectory modeling to estimate trajectories of welfare receipt from ages 19 to 36 years using government tax return records, then examined the association between teacher-rated behaviors and trajectory group membership using mixed effects multinomial regression models.
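Group-based trajectory modeling is typically fitted with specialized software (e.g., SAS Proc Traj or R's lcmm); as a rough illustration of the idea, the sketch below clusters simulated per-person welfare sequences with a Gaussian mixture. All sample sizes and trajectory shapes are invented for the example, not taken from the study.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
years = np.arange(19, 37)                       # ages 19-36

def make_group(n, base):
    """Simulate n welfare-receipt curves scattered around a mean trajectory."""
    return base + rng.normal(0, 0.05, size=(n, len(years)))

# Three invented latent groups mirroring the abstract's low / declining /
# chronic pattern (proportions here are illustrative only).
low       = make_group(700, np.full(len(years), 0.05))
declining = make_group(200, np.linspace(0.8, 0.1, len(years)))
chronic   = make_group(100, np.full(len(years), 0.85))
X = np.vstack([low, declining, chronic])

# Cluster whole trajectories; each component's mean is an estimated
# group trajectory, and posterior assignments give group membership.
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
labels = gmm.predict(X)
print(np.bincount(labels) / len(X))             # recovered group proportions
```

Membership labels from such a model can then serve as the outcome in a multinomial regression on the teacher-rated behaviors.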
Three trajectories of welfare receipt were identified: low (70.8%), declining (19.9%), and chronic (9.3%). The mean annual personal employment earnings (US$) for the three groups at age 35/36 years were $36 500 (s.d. = $24 000), $15 600 (s.d. = $16 275), and $1700 (s.d. = $4800), respectively. Relative to the low welfare receipt group, a unit increase in inattention (mean = 2.64; s.d. = 2.32, range = 0–8) at age 6 was associated with an increased risk of being in the chronic group (relative risk ratio; RRR = 1.16, 95% CI 1.03–1.31) and in the declining group (RRR = 1.13, 95% CI 1.03–1.23), after adjustment for child IQ and family adversity, and independent of other behaviors. Family adversity was more strongly associated with trajectories of welfare receipt than any behavior.
Boys from disadvantaged backgrounds exhibiting high inattention in kindergarten are at elevated risk of chronic welfare receipt during adulthood. Screening and support for inattentive behaviors beginning in kindergarten could have long-term social and economic benefits for individuals and society.
Among children exposed to elevated maternal depression symptoms (MDS), recent studies have demonstrated reduced internalizing and externalizing problems for those who have attended formal childcare (i.e., center-based, family-based childcare). However, these studies did not consider whether childcare attendance is associated with benefits for the child only or also with reduced MDS. Using a four-wave longitudinal cross-lagged model, we evaluated whether formal childcare attendance was associated with MDS or child behavior problems and whether it moderated longitudinal associations between MDS and child behavior problems and between child behavior problems and MDS. The sample was drawn from a population-based cohort study and consisted of 908 biologically related mother–child dyads, followed from 5 months to 5 years. Attending formal childcare was not associated with MDS or child behavior problems but moderated the association between MDS at 3.5 years and child internalizing and externalizing problems at 5 years as well as between girls’ externalizing problems at 3.5 years and MDS at 5 years. No other moderation of formal childcare was found. Findings suggest that attending formal childcare reduces the risks of behavior problems in the context of MDS but also the risk of MDS in the context of girls’ externalizing problems.
The understanding of the genetic basis of grain dormancy in wheat has rapidly improved in the last few years, and a number of genes related to that trait have been identified. We recently identified the wheat genes TaPM19-A1 and -A2, and we have now taken the first step towards understanding the role of this class of genes in seeds. By investigating the Arabidopsis homolog PM19-Like 1 (PM19L1), we have found that it has a seed-specific expression pattern and that, while its expression is higher in dormant than in non-dormant seeds, knock-out mutations produced seeds with increased dormancy. Not only primary dormancy but also secondary dormancy in response to high temperature was increased by the loss of function. We have also examined the function of PM19L1 by localizing the PM19 protein primarily to the cotyledon cells in seeds, possibly in membranes. By investigating the co-expression network of this gene, we have found that it is connected to a small group of abscisic acid (ABA)-induced seed maturation and storage-related genes. PM19L1 thus represents a good opportunity to explore the interactions of key factors that can influence seed dormancy, such as ABA, temperature and membrane properties.
To assess the societal cost-effectiveness of the Transmural Trauma Care Model (TTCM), a multidisciplinary transmural rehabilitation model for trauma patients, compared with regular care.
The economic evaluation was performed alongside a before-and-after study, with a convenience control group measured only afterward, and a 9-month follow-up. Control group patients received regular care and were measured before implementation of the TTCM. Intervention group patients received the TTCM and were measured after its implementation. The primary outcome was generic health-related quality of life (HR-QOL). Secondary outcomes included disease-specific HR-QOL, pain, functional status, and perceived recovery.
Eighty-three trauma patients were included in the intervention group and fifty-seven in the control group. Total societal costs were lower in the intervention group than in the control group, but not statistically significantly so (−EUR267; 95 percent confidence interval [CI], −EUR4,175 to EUR3,011). At 9 months, there were no statistically significant between-group differences in generic HR-QOL (0.05; 95 percent CI, −0.02 to 0.12) or perceived recovery (0.09; 95 percent CI, −0.09 to 0.28). However, mean between-group differences were statistically significantly in favor of the intervention group for disease-specific HR-QOL (−8.2; 95 percent CI, −15.0 to −1.4), pain (−0.84; 95 percent CI, −1.42 to −0.26), and functional status (−20.1; 95 percent CI, −29.6 to −10.7). Cost-effectiveness acceptability curves indicated that if decision makers are not willing to pay anything per unit of effect gained, the TTCM has a 0.54–0.58 probability of being cost-effective compared with regular care. For all outcomes, this probability increased with increasing values of willingness-to-pay.
The TTCM may be cost-effective compared with regular care, depending on decision makers’ willingness to pay and the probability of cost-effectiveness that they perceive as acceptable.
Cognitive deficits in depressed adults may reflect impaired decision-making. To investigate this possibility, we analyzed data from unmedicated adults with Major Depressive Disorder (MDD) and healthy controls as they performed a probabilistic reward task. The Hierarchical Drift Diffusion Model (HDDM) was used to quantify decision-making mechanisms recruited by the task, to determine if any such mechanism was disrupted by depression.
Data came from two samples (Study 1: 258 MDD, 36 controls; Study 2: 23 MDD, 25 controls). On each trial, participants indicated which of two similar stimuli was presented; correct identifications were rewarded. Quantile-probability plots and the HDDM quantified the impact of MDD on response times (RT), speed of evidence accumulation (drift rate), and the width of decision thresholds, among other parameters.
RTs were more positively skewed in depressed v. healthy adults, and the HDDM revealed that drift rates were reduced—and decision thresholds were wider—in the MDD groups. This pattern suggests that depressed adults accumulated the evidence needed to make decisions more slowly than controls did.
Depressed adults responded more slowly than controls in both studies, and poorer performance led the MDD group to receive fewer rewards than controls in Study 1. These results did not reflect a sensorimotor deficit but were instead due to sluggish evidence accumulation. Thus, slowed decision-making—not slowed perception or response execution—caused the performance deficit in MDD. If these results generalize to other tasks, they may help explain the broad cognitive deficits seen in depression.
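The mechanism described above—slower evidence accumulation plus wider decision thresholds producing slower, more right-skewed response times—can be sketched with a basic drift-diffusion simulation. The parameter values below are illustrative, not fitted HDDM estimates from either study.

```python
import numpy as np

def simulate_ddm(drift, threshold, n_trials=1000, dt=0.001,
                 noise=1.0, non_decision=0.3, seed=0):
    """Simulate a symmetric drift-diffusion process: evidence starts at 0
    and accumulates noisily with the given drift until it crosses
    +threshold or -threshold. Returns response times in seconds,
    including a fixed non-decision (sensorimotor) component."""
    rng = np.random.default_rng(seed)
    rts = []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < threshold:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + non_decision)
    return np.array(rts)

# Lower drift rate and wider threshold (the pattern reported for MDD)
# yield slower RTs than the control settings, with identical
# non-decision time -- i.e., no sensorimotor difference.
control = simulate_ddm(drift=2.0, threshold=1.0)
mdd     = simulate_ddm(drift=1.2, threshold=1.3)
print(control.mean(), mdd.mean())
```

Because the non-decision component is held equal across groups, any RT difference in this sketch arises purely from the accumulation stage, mirroring the paper's interpretation.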
Earthquakes, landslides, and floods are the most frequent natural disasters in Turkey. The country has also recently experienced an increased number of terrorist attacks. The purpose of this study is to understand the expectations and training of Turkish emergency medicine attending physicians in disaster medicine.
An online questionnaire was administered to the 937 members of the Emergency Medicine Association of Turkey, 191 of whom (20%) completed the survey.
Most participants (68%) worked at a Training and Research Hospital (TRH) or a University Hospital (UH), and 69% had practiced as an attending for 5 years or less. Mass immigration, refugee problems, and war/terror attacks were considered to be the highest perceived risk topics. Most (95%) agreed that disaster medicine trainings should occur during residency training. Regular disaster drills and exercises and weekly or monthly trainings were the most preferred educational modalities. Most respondents (85%) were interested in advanced training in disaster medicine, and this was highest for those working less than 5 years as an attending. UH and TRH residency training programs were not considered in themselves to be sufficient for learning disaster medicine.
Turkish emergency medicine residency training should include more disaster medicine education and training.
Determine the effectiveness of a personal protective equipment (PPE)-free zone intervention on healthcare personnel (HCP) entry hand hygiene (HH) and PPE donning compliance in rooms of patients in contact precautions.
Quasi-experimental, multicenter intervention, before-and-after study with concurrent controls.
All patient rooms on contact precautions on 16 units (5 medical-surgical, 6 intensive care, 5 specialty care units) at 3 acute-care facilities (2 academic medical centers, 1 Veterans Affairs hospital). Observations of PPE donning and entry HH compliance by HCP were conducted during both study phases. Surveys of HCP perceptions of the PPE-free zone were distributed in both study phases.
A PPE-free zone: a low-risk area inside the door thresholds of contact precautions rooms, demarcated by red tape on the floor. Inside this area, HCP were not required to wear PPE.
We observed 3,970 room entries. HH compliance did not change between study phases among intervention units (relative risk [RR], 0.92; P = .29) and declined in control units (RR, 0.70; P = .005); however, the PPE-free zone did not significantly affect compliance (P = .07). The PPE-free zone effect on HH was significant only for rooms on enteric precautions (P = .008). PPE use was not significantly different before versus after the intervention (P = .15). HCP perceived the zone positively; 65% agreed that it facilitated communication and 66.8% agreed that it permitted checking on patients more frequently.
HCP viewed the PPE-free zone favorably and it did not adversely affect PPE or HH compliance. Future infection prevention interventions should consider the complex sociotechnical system factors influencing behavior change.
Rapid increases in herbicide resistance have highlighted the ability of weeds to undergo genetic change within a short period of time. That change, in turn, has resulted in an increasing emphasis in weed science on the evolutionary ecology and potential adaptation of weeds to herbicide selection. Here we argue that a similar emphasis would also be invaluable for understanding another challenge that will profoundly alter weed biology: the rapid rise in atmospheric carbon dioxide (CO2) and the associated changes in climate. Our review of the literature suggests that elevated CO2 and climate change will impose strong selection pressures on weeds and that weeds will often have the capacity to respond with rapid adaptive evolution. Based on current data, climate change and rising CO2 levels are likely to alter the evolution of agronomic and invasive weeds, with consequences for distribution, community composition, and herbicide efficacy. In addition, we identify four key areas that represent clear knowledge gaps in weed evolution: (1) differential herbicide resistance in response to a rapidly changing CO2/climate confluence; (2) shifts in the efficacy of biological constraints (e.g., pathogens) and resultant selection shifts in affected weed species; (3) climate-induced phenological shifts in weed distribution, demography, and fitness relative to crop systems; and (4) understanding and characterization of epigenetics and the differential expression of phenotypic plasticity versus evolutionary adaptation. These consequences, in turn, should be of fundamental interest to the weed science community.
The Comprehensive Framework for Disaster Evaluation Typologies, developed in 2017 (CFDET 2017), aims to unify and facilitate agreement regarding the identification, structure, and relationships between various evaluation typologies found in the disaster setting. A peer-reviewed validation process sought input from international experts in the fields of disaster medicine, disaster/emergency management, humanitarian/development, and evaluation. This paper discusses the validation process, its results, and outcomes.
Previous frameworks, identified in the literature, lack validation and consistent terminology. To gain credibility and utility, this unique framework needed to be validated by international experts in the disaster setting.
A mixed methods approach was designed to validate the framework. An initial iterative process informed an online survey which used a combination of a five-point Likert scale and open-ended questions. Pre-determined consensus thresholds, informed by a targeted literature review, provided the validation criteria.
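A consensus check of this kind can be illustrated with a short sketch. The 80% agreement threshold and the response counts below are invented for illustration; the study's actual thresholds came from its targeted literature review.

```python
# Hypothetical five-point Likert responses for one framework element
# (counts sum to the study's n = 33, but the distribution is made up).
responses = {"strongly agree": 14, "agree": 13, "neutral": 3,
             "disagree": 2, "strongly disagree": 1}

n = sum(responses.values())
# A common convention: an element is validated when the proportion of
# "agree" + "strongly agree" responses meets the pre-determined threshold.
agreement = (responses["strongly agree"] + responses["agree"]) / n
THRESHOLD = 0.80   # assumed consensus threshold
print(f"agreement = {agreement:.2f}, validated = {agreement >= THRESHOLD}")
```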
A sample of 33 experts from 11 countries responded to the validation process. Quantitative measures largely supported the elements and relationships of the framework, and strongly supported its value and usefulness for supporting, promoting, and undertaking evaluations, as well as its usefulness for teaching evaluation in the disaster setting. Qualitative input suggested opportunities to strengthen and enhance the framework. Responses intended to shed light on the barriers to, and enablers of, undertaking disaster evaluations were limited. A potential for self-selection bias among respondents may be a limitation of this study. The attainment of high consensus thresholds, however, provides confidence in the validity of the results.
For the first time, a framework of this nature has undergone a rigorous validation process by experts in three related disciplines at an international level. The modified framework, CFDET 2018, provides a unifying framework within which existing evaluation typologies can be structured. It gives evaluators confidence to choose an appropriate strategy for their particular evaluation in the disaster setting and facilitates consistency in reporting across the different phases of a disaster to better understand the process, outcomes, and impacts of the efficacy and efficiency of interventions. Future research could create a series of toolkits to support improved disaster evaluation processes and to evaluate the utility of the framework in the real-world setting.
This paper concerns an adaptive satellite communication system for data transmission from small, low-cost, low-Earth-orbit satellites. Tests run in a set-up consisting of a number of software-defined radio (SDR) modules operating as a satellite, a ground station, and a satellite channel simulator have shown that, by changing the modulation scheme and code rate, the amount of data that can be downloaded from a satellite during a single pass over a ground station can be increased approximately by a factor of 2. To determine the data rates obtainable in an SDR system using a common personal computer as the digital signal processing device, the execution times of the particular processing steps involved in the reception process were measured.
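The factor-of-2 gain can be motivated by a back-of-envelope calculation: link quality is poor near the horizon and best at closest approach, so a fixed modulation/coding scheme must run at the worst-case rate for the whole pass, while an adaptive scheme tracks the achievable rate. The pass profile and rates below are invented for illustration; the paper's figure came from SDR hardware tests.

```python
# (segment duration in s, assumed achievable data rate in kbit/s)
segments = [
    (120,  75),   # low elevation: robust modulation, low code rate
    (240, 150),   # mid elevation
    (120, 300),   # near zenith: dense modulation, high code rate
    (240, 150),
    (120,  75),
]

fixed_rate = min(rate for _, rate in segments)          # worst-case rate
fixed_total = sum(dur for dur, _ in segments) * fixed_rate
adaptive_total = sum(dur * rate for dur, rate in segments)
print(f"per-pass gain = {adaptive_total / fixed_total:.1f}x")
```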
A power MOSFET-based push–pull configuration nanosecond-pulse generator has been designed, constructed, and characterized to permeabilize cells for biological and medical applications. The generator can deliver pulses with durations ranging from 80 ns up to 1 µs and pulse amplitudes up to 1.4 kV. The unit has been tested for in vitro experiments on a medulloblastoma cell line. Following the exposure of cells to 100, 200, and 300 ns electric field pulses, permeabilization tests were carried out, and viability tests were conducted to verify the performance of the generator. The maximum temperature rise of the biological load was also calculated based on Joule heating energy conservation and experimental validation. Our results indicate that the developed device has good capabilities to achieve well-controlled electro-manipulation in vitro.
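The Joule-heating temperature estimate mentioned above can be sketched with an adiabatic worst-case calculation, dT = E / (m·c). The load resistance, pulse count, and sample volume below are assumptions for illustration, not values from the paper's measurements.

```python
V = 1.4e3          # pulse amplitude, V (the generator's stated maximum)
R = 100.0          # assumed cuvette load resistance, ohm
tau = 300e-9       # pulse duration, s (longest duration tested)
n_pulses = 100     # assumed number of pulses in the train
volume = 100e-9    # assumed 100 uL sample, in m^3
rho, c = 1000.0, 4186.0   # density (kg/m^3) and heat capacity (J/kg/K) of water

# Total Joule energy delivered, assuming the full amplitude appears
# across the resistive load for the full pulse duration.
energy = n_pulses * (V**2 / R) * tau

# Adiabatic (no-heat-loss) temperature rise of the sample.
dT = energy / (rho * volume * c)
print(f"worst-case temperature rise ~ {dT:.2f} K")
```

Under these assumed numbers the rise is on the order of a kelvin, which is why such estimates are paired with experimental validation before drawing biological conclusions.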
The archaeological site of Saruq al-Hadid, Dubai, United Arab Emirates, presents a long sequence of persistent temporary human occupation on the northern edge of the Rub’ al-Khali desert. The site is located in active dune fields, and evidence for human activity is stratified within a deep sequence of natural dune deposits that reflect complex taphonomic processes of deposition, erosion and reworking. This study presents the results of a program of radiocarbon (14C) and thermoluminescence dating on deposits from Saruq al-Hadid, allied with studies of material remains, which are amalgamated with the results of earlier absolute dating studies provide a robust chronology for the use of the site from the Bronze Age to the Islamic period. The results of the dating program allow the various expressions of human activity at the site—ranging from subsistence activities such as hunting and herding, to multi-community ritual activities and large scale metallurgical extraction—to be better situated chronologically, and thus in relation to current debates regarding the development of late prehistoric and early historic societies in southeastern Arabia.
Hydrilla is an invasive aquatic plant that has rapidly spread through many inland water bodies across the globe by outcompeting native aquatic plants. The negative impacts of hydrilla invasion have become a concern for water resource management authorities, power companies, and environmental scientists. The early detection of hydrilla infestation is very important to reduce the costs associated with control and removal efforts of this invasive species. Therefore, in this study, we aimed to develop a tool for rapid, frequent, and large-scale monitoring and predicting of the spatial extent of hydrilla habitat. This was achieved by integrating in situ and Landsat 8 Operational Land Imager satellite data for Lake J. Strom Thurmond, the largest US Army Corps of Engineers lake east of the Mississippi River, located on the Georgia–South Carolina border. The predictive model for presence of hydrilla incorporated radiometric and physical measurements, including remote-sensing reflectance, Secchi disk depth (SDD), light-attenuation coefficient (Kd), maximum depth of colonization (Zc), and percentage of light available through the water column (PLW). The model-predicted ideal habitat for hydrilla featured high SDD, Zc, and PLW values and low Kd values. Monthly analyses based on satellite images showed that hydrilla starts growing in April, reaches peak coverage around October, begins retreating in the following months, and disappears in February. Analysis of physical and meteorological factors (i.e., water temperature, surface runoff, net inflow, precipitation) revealed that these parameters are closely associated with hydrilla extent. Management agencies can use these results not only to plan removal efforts but also to evaluate and adapt their current mitigation efforts.
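The optical quantities in such a habitat model are related through simple limnological rules of thumb, sketched below. The Kd ≈ 1.7/SDD conversion and the 1% surface-light level defining Zc are common textbook approximations assumed here, not the study's fitted coefficients.

```python
import math

def light_metrics(sdd_m, bottom_depth_m, light_fraction=0.01):
    """Derive optical habitat metrics from Secchi disk depth.

    kd  : light-attenuation coefficient (1/m), via the empirical
          rule of thumb Kd ~ 1.7 / SDD
    zc  : maximum depth of colonization (m), taken as the depth where
          light falls to `light_fraction` of its surface value
    plw : fraction of surface light reaching the bottom,
          from the Beer-Lambert decay exp(-Kd * z)
    """
    kd = 1.7 / sdd_m
    zc = -math.log(light_fraction) / kd
    plw = math.exp(-kd * bottom_depth_m)
    return kd, zc, plw

# Example: a moderately clear site (values invented for illustration).
kd, zc, plw = light_metrics(sdd_m=2.0, bottom_depth_m=3.0)
print(f"Kd = {kd:.2f} /m, Zc = {zc:.1f} m, PLW = {plw:.1%}")
```

The model's "ideal habitat" pattern follows directly: clearer water (high SDD) lowers Kd, which raises both Zc and PLW, allowing hydrilla to colonize deeper.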
This paper reports on an ultra-wideband low-noise distributed amplifier (LNDA) in a transferred-substrate InP double heterojunction bipolar transistor (DHBT) technology which exhibits a uniform low-noise characteristic over a large frequency range. To obtain very high bandwidth, a distributed architecture has been chosen with cascode unit gain cells. Each unit cell consists of two cascode-connected transistors with 500 nm emitter length and ft/fmax of ~360/492 GHz, respectively. Due to optimum line-impedance matching, low common-base transistor capacitance, and low collector-current operation, the circuit exhibits a low noise figure (NF) over a broad frequency range. A 3-dB bandwidth from 40 to 185 GHz is measured, with an NF of 8 dB within the frequency range between 75 and 105 GHz. Moreover, this circuit demonstrates the widest 3-dB bandwidth operation among all reported single-stage amplifiers with a cascode configuration. Additionally, this work has proposed that the noise sources of the InP DHBTs are largely uncorrelated. As a result, the NF of ultra-wideband circuits can be reliably predicted beyond the frequency range of the measurement equipment.