Precise instrumental calibration is of crucial importance to 21-cm cosmology experiments. The Murchison Widefield Array’s (MWA) Phase II compact configuration offers opportunities for both redundant calibration and sky-based calibration algorithms; using the two in tandem is a potential approach to mitigating calibration errors caused by inaccurate sky models. The MWA Epoch of Reionization (EoR) experiment targets three patches of the sky (dubbed EoR0, EoR1, and EoR2) with deep observations. Previous work (Li et al. 2018, 2019) studied the effect of tandem calibration on the EoR0 field and found that it yielded no significant improvement in the power spectrum (PS) over sky-based calibration alone. In this work, we apply similar techniques to the EoR1 field and find a distinct result: the improvements in the PS from tandem calibration are significant. To understand this result, we analyse both the calibration solutions themselves and the effects on the PS over three nights of EoR1 observations. We conclude that the presence of the bright radio galaxy Fornax A in EoR1 degrades the performance of sky-based calibration, which in turn enables redundant calibration to have a larger impact. These results suggest that redundant calibration can indeed mitigate some level of model incompleteness error.
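The core idea behind redundant calibration can be sketched on a toy array: baselines with the same spacing measure the same true sky visibility, so taking logarithms of the visibility amplitudes yields a linear system in the unknown per-antenna gains. The minimal pure-Python sketch below uses a made-up 4-antenna linear array and gain values; it is an illustration of the principle, not the MWA pipeline. Anchoring antenna 0's gain to a known value stands in for the absolute reference that sky-based calibration provides in the tandem approach.

```python
import math

# True values used only to simulate "measurements" (illustrative):
gains = [1.0, 1.2, 0.9, 1.1]            # per-antenna gain amplitudes
v_true = {1: 5.0, 2: 3.0}               # true |V| for each redundant spacing
# measured |v_ij| = g_i * g_j * |V(spacing)| for baselines (i, j)
measured = {(i, j): gains[i] * gains[j] * v_true[j - i]
            for i in range(4) for j in range(i + 1, 4) if (j - i) in v_true}

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][-1] / M[i][i] for i in range(n)]

# Unknowns: log g1..g3 and log |V| per spacing; g0 is anchored to a known
# value (the role the sky-based absolute calibration plays in tandem).
idx = {"g1": 0, "g2": 1, "g3": 2, "V1": 3, "V2": 4}
A, b = [], []
log_g0 = math.log(gains[0])
for (i, j), v in measured.items():
    row = [0.0] * 5
    rhs = math.log(v)
    if i == 0:
        rhs -= log_g0                    # move the known gain to the RHS
    else:
        row[idx[f"g{i}"]] += 1.0
    row[idx[f"g{j}"]] += 1.0
    row[idx[f"V{j - i}"]] += 1.0
    A.append(row)
    b.append(rhs)

x = solve(A, b)
recovered = [math.exp(xi) for xi in x[:3]]
print("recovered gains g1..g3:", [round(g, 3) for g in recovered])
```

Because the simulated data are exactly consistent with the model, the solver recovers the input gains; with noisy data the same system would be solved in a least-squares sense.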
Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r2 = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h2 ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly-emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure is generalized across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates had been tested on a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B).
Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
Background: Carbapenem-resistant Pseudomonas aeruginosa (CRPA) is a frequent cause of healthcare-associated infections (HAIs). The CDC Emerging Infections Program (EIP) conducted population- and laboratory-based surveillance of CRPA in selected areas in 8 states from August 1, 2016, through July 31, 2018. We aimed to describe the molecular epidemiology and mechanisms of resistance of CRPA isolates collected through this surveillance. Methods: We defined a case as the first isolate of P. aeruginosa resistant to imipenem, meropenem, or doripenem from the lower respiratory tract, urine, wounds, or normally sterile sites identified from a resident of the EIP catchment area in a 30-day period; EIP sites submitted a systematic random sample of isolates to CDC for further characterization. Of 1,021 CRPA clinical isolates submitted, 707 have been sequenced to date using an Illumina MiSeq. Sequenced genomes were classified using the 7-gene multilocus sequence typing (MLST) scheme, and a core genome MLST (cgMLST) scheme was used to determine phylogeny. Antimicrobial resistance genes were identified using publicly available databases, and chromosomal mechanisms of carbapenem resistance were determined using previously validated genetic markers. Results: There were 189 sequence types (STs) among the 707 sequenced genomes (Fig. 1). The most frequently occurring were high-risk clones ST235 (8.5%) and ST298 (4.7%), which were found across all EIP sites. Carbapenemase genes were identified in 5 (<1%) isolates. Overall, 95.6% of the isolates had chromosomal mutations associated with carbapenem resistance: 93.2% had porin (OprD)-associated mutations that decrease membrane permeability to the drugs; 24.8% had mutations associated with overexpression of the multidrug efflux pump MexAB-OprM; and 22.9% had mutations associated with overexpression of the endogenous β-lactamase ampC. More than 1 such chromosomal resistance mutation type was present in 37.8% of the isolates.
Conclusions: The diversity of the sequence types demonstrates that HAIs caused by CRPA can arise from a variety of strains and that high-risk clones are broadly disseminated across the EIP sites but are a minority of CRPA strains overall. Carbapenem resistance in P. aeruginosa was predominantly driven by chromosomal mutations rather than acquired mechanisms (ie, carbapenemases). The diversity of the CRPA isolates and the lack of carbapenemase genes suggest that this ubiquitous pathogen can readily evolve chromosomal resistance mechanisms, but unlike carbapenemases, these cannot be easily spread through horizontal transfer.
Quantifying tree biomass is an important research and management goal across many disciplines. For species that exhibit predictable relationships between structural metrics (e.g. diameter, height, crown breadth) and total weight, allometric calculations produce accurate estimates of above-ground biomass. However, such methods may be insufficient where inter-individual variation is large relative to individual biomass and is itself of interest (for example, variation due to herbivory). In an East African savanna bushland, we analysed photographs of small (<5 m) trees from perpendicular angles and fixed distances to estimate above-ground biomass. Pixel area of trees in photos and diameter were more strongly related to measured, above-ground biomass of destructively sampled trees than biomass estimated using a published allometric relation based on diameter alone (R2 = 0.86 versus R2 = 0.68). When tested on trees in herbivore-exclusion plots versus unfenced (open) plots, our predictive equation based on photos confirmed higher above-ground biomass in the exclusion plots than in unfenced (open) plots (P < 0.001), in contrast to no significant difference based on the allometric equation (P = 0.43). As such, our new technique based on photographs offers an accurate and cost-effective complement to existing methods for tree biomass estimation at small scales with potential application across a wide variety of settings.
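The core of the photographic method is an ordinary least-squares fit of measured biomass against image-derived predictors, with fit quality judged by R². The sketch below shows a minimal single-predictor version (biomass against pixel area) in pure Python; the calibration numbers are invented for illustration and are not the study's destructively sampled data, which also used stem diameter as a second predictor.

```python
# Minimal sketch: fit measured above-ground biomass against photographed
# pixel area by ordinary least squares and report R^2.
# The data points below are illustrative, not the study's measurements.

def fit_ols(x, y):
    """Ordinary least squares y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical calibration set: mean pixel area (10^3 px) vs dry biomass (kg)
pixel_area = [1.2, 2.5, 3.1, 4.8, 6.0, 7.4]
biomass = [0.8, 1.9, 2.2, 3.6, 4.3, 5.5]

intercept, slope, r2 = fit_ols(pixel_area, biomass)
print(f"biomass ~ {intercept:.2f} + {slope:.2f} * pixel_area, R^2 = {r2:.3f}")
```

Comparing this R² against the R² of a diameter-only allometric fit on the same trees is what motivates the paper's 0.86-versus-0.68 contrast.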
On Hawai‘i Island, an increase in human neuroangiostrongyliasis cases has been primarily associated with the accidental ingestion of Angiostrongylus cantonensis L3 in snails or slugs, or potentially, from larvae left behind in the slug's slime or feces. We evaluated more than 40 different treatments in vitro for their ability to kill A. cantonensis larvae, with the goal of identifying a safe and effective fruit and vegetable wash to reduce the risk of exposure. Our evaluation of treatment lethality was carried out in two phases: initially using motility as an indicator of larval survival after treatment, followed by the development and application of a propidium iodide staining assay to document larval mortality. Treatments tested included common household products, consumer vegetable washes and agricultural crop washes. We found minimal larvicidal efficacy among consumer-grade fruit and vegetable washes, among botanical extracts such as those from ginger or garlic, and among acid solutions such as vinegar. Alkaline solutions, on the other hand, as well as oxidizers such as bleach and chlorine dioxide, did show larvicidal potential. Surfactants, a frequent ingredient in detergents that lowers surface tension, had variable results, but dodecylbenzene sulfonic acid as a 70% w/w solution in 2-propanol was very effective, both in terms of the speed and the thoroughness with which it killed A. cantonensis L3 nematodes. Thus, our results suggest promising directions for future investigation.
We are asked to answer a seemingly simple question certified by the US District Court for the District of New Hampshire – namely: “Is a child conceived after her father’s death via artificial insemination eligible to inherit from her father as his surviving issue under New Hampshire intestacy law?” Majority Opinion at 1181 (emphasis added).
Spatially and temporally unpredictable rainfall patterns presented food production challenges to small-scale agricultural communities, requiring multiple risk-mitigating strategies to increase food security. Although site-based investigations of the relationship between climate and agricultural production offer insights into how individual communities may have created long-term adaptations to manage risk, the inherent spatial variability of climate-driven risk makes a landscape-scale perspective valuable. In this article, we model risk by evaluating how the spatial structure of ancient climate conditions may have affected the reliability of three major strategies used to reduce risk: drawing upon social networks in time of need, hunting and gathering of wild resources, and storing surplus food. We then explore how climate-driven changes to this reliability may relate to archaeologically observed social transformations. We demonstrate the utility of this methodology by comparing the Salinas and Cibola regions in the prehispanic U.S. Southwest to understand the complex relationship among climate-driven threats to food security, risk-mitigation strategies, and social transformations. Our results suggest key differences in how communities buffered against risk in the Cibola and Salinas study regions, with the structure of precipitation influencing the range of strategies to which communities had access through time.
This editorial describes current considerations regarding psychiatric diagnoses for transgender and gender-diverse (TGD) people. In addition to offering an assessment of the limitations in current diagnostic standards, the authors articulate a vision for psychiatric practice marked by renewed commitment to an affirmative framework that reduces stigma.
Suppliers of system components face the challenge that customer requirements influence functionally integral product architectures at the property level. For this situation, solution approaches that focus on the re-use of pre-engineered part variants are not applicable. However, to generate a valid product structure, customer-specific properties have to fit the modelled product knowledge. The approach therefore models a reference class structure and analyses compatibilities at the property level for customer-specific inputs, drawing on explicit product knowledge and constraints.
Facing rising competitive pressure, manufacturers gain an advantage when they are able to offer customer-specific products under the conditions of mass production. Traditional configurators support the creation of personalized products from the elements of a modular product system but are based on a pre-defined set of rules. The model-based approach changes the environment of configuration from static configuration rules to the dependencies defined within the product's system model. Thus, by taking the user's target quantities into account, the configurator identifies the optimal variant.
Polysaccharide-based nanoparticles, such as those made from pectin, have long been of great interest because of pectin's excellent solubility and mucoadhesive nature, which make it highly suitable for oral drug delivery. In this study, we used commercially available pectin samples differing in their degree of esterification, and nanoparticles were fabricated by the ionotropic gelation method using magnesium (Mg2+) as the divalent cross-linker. We conducted a comparative analysis of the three pectin NPs—high-methoxylated pectin (HMP), low-methoxylated pectin (LMP), and amidated LMP (AMP)—to examine differences in characteristics such as shape, size, and biocompatibility. HMP and AMP NPs were found to be similar in size (~850 nm), whereas LMP NPs were ~700 nm. The three NPs were also tested for their biocompatibility toward THP-1 cells. All three NPs show potential as nanocarriers of therapeutic and preventive drugs, especially via the oral route.
OBJECTIVES/GOALS: In 2017, new guidelines recommended multi-step algorithms for CDI diagnosis, and clinical centers rapidly implemented changes despite limited pediatric data. We assessed a multi-step algorithm using NAAT followed by EIA for its ability to differentiate symptomatic CDI from colonization in children. METHODS/STUDY POPULATION: We prospectively enrolled pediatric patients with cancer, cystic fibrosis, or inflammatory bowel disease who were not being tested or treated for CDI and obtained a stool sample for NAAT. If positive by NAAT (colonized), EIA was performed. Children with symptomatic CDI who tested positive by NAAT via the clinical laboratory were also enrolled, and EIA was performed on residual stool. A functional cell cytotoxicity neutralization assay (CCNA) was also performed. RESULTS/ANTICIPATED RESULTS: Of the 138 asymptomatic children enrolled, 24 (17%) were colonized. An additional 37 children with symptomatic CDI were enrolled. Neither EIA positivity (41% versus 21%, P = 0.11) nor CCNA positivity (49% versus 46%, P = 0.84) differed significantly between symptomatic and colonized children. When both EIA and CCNA were positive, children were more commonly symptomatic than colonized (33% versus 13%, P = 0.04). DISCUSSION/SIGNIFICANCE OF IMPACT: A multi-step testing algorithm with NAAT and EIA failed to differentiate symptomatic CDI from colonization in our pediatric cohort. As multi-step algorithms move into clinical care, pediatric providers will need to be aware of the continued limitations of diagnostic testing.
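Comparisons of positivity proportions between small groups like these are typically done with an exact test on the 2x2 table. A minimal pure-Python one-sided Fisher exact test is sketched below; the table counts are illustrative placeholders, not the study's data.

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    P(observing >= a positives in row 1) under the null of no association,
    via the hypergeometric distribution."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(row1, k) * comb(row2, col1 - k) / comb(n, col1)
    return p

# Illustrative counts (not the study's): both-tests-positive vs. not,
# among symptomatic vs. colonized children.
p = fisher_exact_one_sided(12, 25, 3, 21)
print(f"one-sided P = {p:.3f}")
```

In practice a two-sided test (e.g. `scipy.stats.fisher_exact`) would usually be reported; the one-sided tail is shown here because it is the simplest to write out by hand.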
OBJECTIVES/GOALS: Lung transplant (LTx) candidates benefit from use of non-ideal donor organs. Each organ procurement organization (OPO) defines “acceptable” donor organs, introducing unmeasured variation in donor pursuit. We characterized non-ideal donor pursuit among OPOs to identify drivers of risk aversion in LTx. METHODS/STUDY POPULATION: We queried the UNOS registry for adult donors who donated ≥1 organ for transplantation from 12/2007 to 12/2018. Non-ideal donors were those with any of: age >50 years, smoking history ≥20 pack-years, PaO2/FiO2 (P/F) ratio <350, donation after cardiac death (DCD) status, or CDC increased-risk (IRD) status. Non-ideal donor pursuit rate was defined as the proportion of non-ideal donors at each OPO from whom consent for lung donation was requested, with lower numbers indicating increased risk aversion. We estimated the correlation between non-ideal and overall donor pursuit using a Spearman correlation coefficient. Adjusted non-ideal donor pursuit rates were estimated using multivariable logistic regression. RESULTS/ANTICIPATED RESULTS: Overall, 18,333 deceased donors were included and classified as ideal or non-ideal. Among 58 OPOs, rates of non-ideal donor pursuit ranged from 0.24 to 1.00 (Figure). Of the 5 non-ideal characteristics, DCD and IRD status were associated with the most and least risk aversion, respectively. Non-ideal donor pursuit was strongly correlated with overall donor pursuit (r = 0.99). On adjusted analysis, older age (OR 0.15, 95% CI 0.13-0.16), smoking history (OR 0.38, 95% CI 0.34-0.44), low P/F ratio (OR 0.12, 95% CI 0.11-0.14), and DCD status (OR 0.04, 95% CI 0.03-0.04) were all independently associated with significant risk aversion, corresponding to decreased rates of donor pursuit. DISCUSSION/SIGNIFICANCE OF IMPACT: OPOs differ in their levels of risk aversion in LTx, and risk aversion is not uniform across selected categories of non-ideal lung donor.
Consideration of new OPO performance metrics that encourage the pursuit of non-ideal lung donors is warranted.
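The Spearman coefficient used above is just the Pearson correlation of the rank-transformed pursuit rates. The pure-Python sketch below shows the computation on hypothetical per-OPO rates (invented for illustration, not the registry data), including average-rank handling of ties.

```python
def rank(values):
    """Average ranks (1-based), assigning tied values their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical per-OPO pursuit rates (fractions), not the registry data:
nonideal = [0.24, 0.41, 0.55, 0.62, 0.78, 0.90, 1.00]
overall = [0.30, 0.45, 0.58, 0.66, 0.80, 0.99, 0.93]
print(f"Spearman r = {spearman(nonideal, overall):.2f}")
```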
Introduction: Emergency care serves as an important health resource for First Nations (FN) persons. Previous reporting shows that FN persons visit emergency departments at almost double the rate of non-FN persons. Working collaboratively with FN partners, academic researchers, and health authority staff, we investigated FN emergency care patient visit statistics in Alberta over a five-year period. Methods: Through a population-based retrospective cohort study for the period from April 1, 2012 to March 31, 2017, patient demographics and emergency care visit characteristics for status FN patients in Alberta were analyzed and compared to non-FN statistics. Frequencies and percentages (%) describe patients and visits by categorical variables (e.g., Canadian Triage Acuity Scale (CTAS)). Means and standard deviations (medians and interquartile ranges (IQR)) describe continuous variables (e.g., distances) as appropriate for the data distribution. These descriptions are repeated for the FN and non-FN populations separately. Results: The data set contains 11,686,288 emergency facility visits by 3,024,491 unique persons. FN people make up 4.8% of unique patients and 9.4% of emergency care visits. FN persons live further from emergency facilities than their non-FN counterparts (FN median 6 km, IQR 1-24; vs. non-FN median 4 km, IQR 2-8). FN visits arrive more often by ground ambulance (15.3% vs. 10%). FN visits are more commonly triaged as less acute (59% CTAS levels 4 and 5, compared to non-FN 50.4%). More FN visits end in leaving without completing treatment (6.7% vs. 3.6%). FN visits are more often in the evening, 4:01pm to 12:00am (43.6% vs. 38.1%). Conclusion: In a collaborative validation session, FN Elders and health directors contextualized emergency care presentation in evenings and receiving less acute triage scores as related to difficulties accessing primary care.
They explained presentation in evenings, arrival by ambulance, and leaving without completing treatment in terms of issues accessing transport to and from emergency facilities. Many factors interact to determine FN patients’ emergency care visit characteristics and outcomes. Further research needs to separate the impact of FN identity from factors such as reasons for visiting emergency facilities, distance traveled to care, and the size of facility where care is provided.
We investigated the contribution of polymorphisms known to moderate transcription of the serotonin transporter (5HTT) and monoamine oxidase A (MAOA) to the development of violence, and tested for gene x environment interactions. To do so, a cohort of 184 adult male volunteers referred for forensic assessment was assigned to a violent or non-violent group. 45% of violent, but only 30% of non-violent, individuals carried the low-activity, short MAOA allele. Low-function variants of 5HTT were carried by 77% of the violent group, compared with 59% of the non-violent group. Logistic regression was performed, and the best-fitting model revealed significant, independent effects of childhood environment and MAOA genotype. A significant influence of an interaction between childhood environment and 5HTT genotype was also found (Fig. 1). MAOA thus appears to be independently associated with violent crime, while there is a relevant 5HTT x environment interaction.
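The model described above can be written as a logistic regression with a genotype-by-environment product term. The sketch below fits such a model by plain gradient descent on simulated data; the cohort, coefficients, and effect sizes are all invented for illustration and are not the study's estimates.

```python
import math
import random

# Sketch of the gene x environment logistic model:
# P(violent) = sigmoid(b0 + b1*MAOA + b2*5HTT + b3*env + b4*(5HTT x env)).
# Data are simulated; only the positive interaction sign is built in.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(rows, ys, lr=0.05, epochs=500):
    """Per-sample gradient-descent logistic regression (rows include a 1.0
    intercept column); returns the fitted weight vector."""
    w = [0.0] * len(rows[0])
    for _ in range(epochs):
        for x, y in zip(rows, ys):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
            for i, xi in enumerate(x):
                w[i] += lr * (y - p) * xi
    return w

random.seed(0)
data, labels = [], []
for _ in range(400):
    maoa = random.randint(0, 1)   # 1 = low-activity MAOA allele
    htt = random.randint(0, 1)    # 1 = low-function 5HTT variant
    env = random.randint(0, 1)    # 1 = adverse childhood environment
    logit = -1.0 + 0.8 * maoa + 1.5 * htt * env  # interaction drives risk
    y = 1 if random.random() < sigmoid(logit) else 0
    data.append([1.0, maoa, htt, env, htt * env])
    labels.append(y)

w = fit_logistic(data, labels)
print("fitted interaction coefficient:", round(w[4], 2))
```

Because the simulated interaction effect is large and positive, the fitted interaction coefficient comes out positive, mirroring the kind of 5HTT x environment effect the abstract reports.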
Psychiatric patients are more often tobacco smokers than the general population. These findings may indicate a causal relation between tobacco smoking and the occurrence of psychiatric diseases. Therefore, in the present study, psychiatric comorbidity was investigated in ‘healthy’ probands who were either smokers or non-smokers.
Students of medicine or psychology (mean age 25.3 years, SD 5.3), comprising 70 healthy smokers and 83 healthy non-smokers (both groups without known psychiatric disorder or treatment), were assessed for psychiatric axis I disorders using the Mini-DIPS, a questionnaire covering the DSM-IV and ICD-10 criteria of nicotine dependence, the Fagerström test, a visual craving scale, the CAGE test, and a questionnaire on sociodemographic factors, organic and psychiatric diseases, and psychiatric/psychotherapeutic treatments. Urine was analysed for addictive drugs, and cotinine levels were determined in urine and saliva.
Of the 70 smokers, 40 were dependent and 30 non-dependent according to DSM-IV; according to the Fagerström test, 51 of the 70 were dependent smokers. The urine cotinine level was significantly higher in dependent smokers and correlated with the degree of dependence according to the Fagerström test (p < 0.001). The saliva cotinine level correlated significantly with the degree of craving (p < 0.006). Phobic and anxiety disorders, together with high cotinine levels, were found in 12 (9 female, 3 male) of the 40 dependent smokers, but not in the non-dependent smokers or non-smokers.
A relationship between dependent smoking, higher cotinine and craving levels, and phobic/anxiety disorders seems to exist, especially in females.
We report an interim analysis of 1-year outcomes in schizophrenia patients enrolled in e-STAR in Australia and treated with RLAI continuously for 12 months.
e-STAR is a secure web-based, international, long-term (1 year retrospective, 2 years prospective) observational study of schizophrenia patients who initiate a new antipsychotic drug during their routine clinical management.
Currently, 315 patients have received RLAI continuously for 12 months; mean age 39.6 years, 68.9% male, mean duration of illness at baseline 11.8 years. Mean Clinical Global Impression Severity (CGI-S) scores at baseline (4.6) decreased significantly at 3, 6 and 12 months (n=284) (4.0, 3.7, 3.7, respectively; all p<0.001 vs baseline) indicating a reduction in illness severity from moderately-marked to mildly-moderate at month 3 and maintained to 1 year. The proportion of patients with CGI-S scores of 1–3 (not ill to mild severity) increased from 12.7% at baseline to 40.8% at 12 months (p<0.0001). Mean Global Assessment of Functioning (GAF) scale scores improved from 41.7 at baseline (serious impairment) to 56.7 (moderate impairment) at 12 months with improvements evident from month 3 after the start of RLAI (p<0.001 for both timepoints). Other significant improvements included fewer hospital stays (p<0.001) and rehospitalisations (p<0.001), reduced suicidal ideation (p=0.008) and violent behaviour (p=0.03), and decreased use of concomitant psychiatric medication.
These interim data show that a significant degree of clinical improvement and reduction in hospitalisation occurs early at 3 months in patients treated with RLAI and is maintained with continued treatment over 12 months.
Attention Deficit Hyperactivity Disorder (ADHD) is a serious risk factor for co-occurring psychiatric disorders and negative psychosocial consequences in adulthood. Given this background, there is great need for an effective treatment of adult ADHD patients.
Therefore, our research group has conducted the first randomized controlled multicenter study evaluating a disorder-tailored, DBT-based group program in adult ADHD compared with psychopharmacological treatment.
Between 2007 and 2010, 433 patients were randomized in a four-arm design to a manualized dialectical behavioural therapy (DBT)-based group program plus methylphenidate or placebo, or clinical management plus methylphenidate or placebo, with weekly sessions in the first twelve weeks and monthly sessions thereafter. Therapists are graduate psychologists or physicians. Treatment integrity is ensured by independent supervision. The primary endpoint (ADHD symptoms measured by the Conners Adult ADHD Rating Scale) is rated by interviewers blind to treatment allocation (Current Controlled Trials ISRCTN54096201). The trial is funded by the German Federal Ministry of Education and Research (01GV0606) and is part of the German network for the treatment of ADHD in children and adults (ADHD-NET). In the lecture, the first data of our interim analysis are presented (baseline data, results on treatment compliance and adherence).
The German version of the Conners Adult ADHD Rating Scales (CAARS) has shown very high model fit in confirmatory factor analyses, with the established factors inattention/memory problems, hyperactivity/restlessness, impulsivity/emotional lability, and problems with self-concept, in both large healthy-control and ADHD patient samples. This study presents data on the psychometric properties of the German CAARS self-report (CAARS-S) and observer report (CAARS-O) questionnaires.
The CAARS-S/O and questions on sociodemographic variables were completed by 466 patients with ADHD and 847 healthy control subjects who had already participated in two prior studies; a total of 896 observer data sets were available. Cronbach's alpha was calculated to obtain internal reliability coefficients. Pearson correlations were used to assess test-retest reliability and concurrent, criterion, and discriminant validity. Receiver operating characteristic (ROC) analyses were used to establish sensitivity and specificity for all subscales.
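The internal-consistency statistic named above, Cronbach's alpha, compares the sum of per-item variances with the variance of subjects' total scores. The sketch below computes it on a small made-up item-response matrix (rows = subjects, columns = scale items); these are placeholder numbers, not the CAARS data.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(matrix):
    k = len(matrix[0])                       # number of items
    item_vars = [variance([row[i] for row in matrix]) for i in range(k)]
    total_var = variance([sum(row) for row in matrix])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 5 subjects x 4 items on a 1-5 response scale:
responses = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 1],
    [3, 3, 4, 3],
]
alpha = cronbach_alpha(responses)
print(f"alpha = {alpha:.2f}")
```

Values in the .74-.95 range reported below indicate acceptable to excellent internal consistency by the usual rules of thumb.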
Coefficient alphas ranged from .74 to .95; test-retest reliability ranged from .85 to .92 for the CAARS-S and from .65 to .85 for the CAARS-O. All CAARS subscales except problems with self-concept correlated significantly with the Barratt Impulsiveness Scale (BIS), but not with the Wender Utah Rating Scale (WURS). Criterion validity was established with ADHD subtype and diagnosis based on DSM-IV criteria. Sensitivity and specificity were high for all four subscales.
The reported results confirm our previous study and show that the German CAARS-S/O do indeed represent a reliable and cross-culturally valid measure of current ADHD symptoms in adults.