Bloodstream infections (BSIs) are a frequent cause of morbidity in patients with acute myeloid leukemia (AML), due in part to the presence of central venous access devices (CVADs) required to deliver therapy.
Objective:
To determine the differential risk of bacterial BSI during neutropenia by CVAD type in pediatric patients with AML.
Methods:
We performed a secondary analysis in a cohort of 560 pediatric patients (1,828 chemotherapy courses) receiving frontline AML chemotherapy at 17 US centers. The exposure was CVAD type at course start: tunneled externalized catheter (TEC), peripherally inserted central catheter (PICC), or totally implanted catheter (TIC). The primary outcome was course-specific incident bacterial BSI; secondary outcomes included mucosal barrier injury (MBI)-BSI and non-MBI BSI. Poisson regression was used to compute adjusted rate ratios comparing BSI occurrence during neutropenia by line type, controlling for demographic, clinical, and hospital-level characteristics.
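A minimal sketch (not the study's analysis code) of how a Poisson regression with an offset for neutropenic days yields incidence rate ratios by CVAD type; the data frame, counts, and column names below are hypothetical placeholders:

```python
# Hypothetical course-level data: BSI counts, neutropenic days at risk, and CVAD type.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

courses = pd.DataFrame({
    "bsi_count":        [0, 1, 0, 2, 0, 1, 1, 0, 2],
    "neutropenic_days": [18, 25, 12, 30, 20, 22, 28, 15, 26],
    "cvad_type":        ["TEC", "PICC", "TIC", "PICC", "TEC", "TIC",
                         "TEC", "PICC", "TIC"],
})

# Poisson GLM with a log(person-time) offset; exponentiated coefficients are IRRs vs. TEC.
model = smf.glm(
    "bsi_count ~ C(cvad_type, Treatment('TEC'))",
    data=courses,
    family=sm.families.Poisson(),
    offset=np.log(courses["neutropenic_days"]),
)
result = model.fit()
print(np.exp(result.params))      # incidence rate ratios
print(np.exp(result.conf_int()))  # 95% confidence intervals on the IRR scale
```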
Results:
The rate of BSI did not differ by CVAD type: 11 BSIs per 1,000 neutropenic days for TECs, 13.7 for PICCs, and 10.7 for TICs. After adjustment, there was no statistically significant association between CVAD type and BSI: PICC incident rate ratio [IRR] = 1.00 (95% confidence interval [CI], 0.75–1.32) and TIC IRR = 0.83 (95% CI, 0.49–1.41) compared to TEC. When MBI and non-MBI were examined separately, results were similar.
Conclusions:
In this large, multicenter cohort of pediatric AML patients, we found no difference in the rate of BSI during neutropenia by CVAD type. This may be due to a risk-profile for BSI that is unique to AML patients.
OBJECTIVES/GOALS: We aimed to determine if GLP-1 receptor agonists exert beneficial effects on surrogate measures of cardiovascular function independently of weight loss. Our objective was to compare the outcomes between GLP-1 receptor agonist treatment versus a similar drug without cardiovascular benefit versus weight loss through diet alone. METHODS/STUDY POPULATION: We enrolled 88 individuals with obesity (BMI ≥ 30kg/m2) and pre-diabetes and randomized them in a 2:1:1 ratio to 14 weeks of the GLP-1 receptor agonist liraglutide, the dipeptidyl peptidase-4 inhibitor sitagliptin, or hypocaloric diet. Sitagliptin blocks degradation of endogenous GLP-1 but does not cause weight loss or lower adverse cardiovascular outcomes. Treatment was double-blinded and placebo-controlled for drug, and unblinded for diet. Primary endpoints were flow-mediated dilation (FMD) to assess endothelial vasodilatory function, and plasminogen activator inhibitor-1 (PAI-1) to assess endothelial fibrinolytic function. We used a general linear model for each outcome and included gender as a covariate for FMD. Baseline characteristics were similar. Mean age was 50, with 32% men and 13% black. RESULTS/ANTICIPATED RESULTS: At 14 weeks, diet and liraglutide caused weight loss (diet -4.3 ± 3.2 kg, P<0.01; liraglutide -2.7 ± 3.2, P<0.01), while sitagliptin did not (-0.7 ± 2.0, P=0.17). Diet did not improve FMD at 14 weeks compared to baseline (+0.9%, 95% CI [-1.5, 3.3], P=0.46). FMD tended to increase after liraglutide and sitagliptin but was not significant (liraglutide +1.2 [-0.3, 2.8], P=0.12; sitagliptin +1.6 [-0.6, 3.8], P=0.15). Given that liraglutide and sitagliptin work through the same GLP-1 pathway, we combined the liraglutide and sitagliptin groups for overall effect on FMD, which was significantly improved from baseline (+1.4 [0.1, 2.8], P=0.04). Diet and liraglutide improved PAI-1 at 14 weeks (diet -4.4U/mL, [-8.5, -0.2], P=0.04; liraglutide -3.4 [-6.0, -0.7], P=0.01), while sitagliptin did not (-1.4 [-5.1, 2.3], P=0.46). DISCUSSION/SIGNIFICANCE: Activation of the GLP-1 pathway by liraglutide or sitagliptin improves FMD independent of weight loss, while PAI-1 improvement is weight-loss dependent and is only seen after liraglutide or diet. Our study suggests the cardiovascular benefit of liraglutide may be due to combined improvements in endothelial vasodilatory and fibrinolytic function.
A Mediterranean-style eating pattern (MED-EP) may include moderate red meat intake. However, it is unknown whether the pro-atherogenic metabolite trimethylamine N-oxide (TMAO) is affected by the amount of red meat consumed within a MED-EP. The results presented are from a secondary, retrospective objective of an investigator-blinded, randomised, crossover, controlled feeding trial (two 5-week interventions separated by a 4-week washout) to determine whether a MED-EP with 200 g unprocessed lean red meat/week (MED-CONTROL) reduces circulating TMAO concentrations compared with a MED-EP with 500 g unprocessed lean red meat/week (MED-RED). Participants were seventy-seven women and twelve men (n 39 total) who were either overweight or obese (BMI: mean 30·5 (sem 0·3) kg/m2). Serum samples were obtained following an overnight fast both before (pre) and after (post) each intervention. Fasting serum TMAO, choline, carnitine and betaine concentrations were measured using a targeted liquid chromatography-MS assay. Data were analysed to assess (a) whether TMAO and related metabolites differed by intervention and (b) whether changes in TMAO were associated with changes in Framingham 10-year risk score. Serum TMAO was lower post-intervention following MED-CONTROL compared with MED-RED (post-MED-CONTROL 3·1 (sem 0·2) µm v. post-MED-RED 5·0 (sem 0·5) µm, P < 0·001), and decreased following MED-CONTROL (pre- v. post-MED-CONTROL, P = 0·025). Exploratory analysis using mixed-model ANCOVA identified a positive association between changes in TMAO and changes in homoeostatic model assessment of insulin resistance (P = 0·036). These results suggest that lower red meat intake leads to lower TMAO concentrations in the context of a MED-EP.
Pharmacogenomic testing has emerged to aid medication selection for patients with major depressive disorder (MDD) by identifying potential gene-drug interactions (GDI). Many pharmacogenomic tests are available with varying levels of supporting evidence, including direct-to-consumer and physician-ordered tests. We retrospectively evaluated the safety of using a physician-ordered combinatorial pharmacogenomic test (GeneSight) to guide medication selection for patients with MDD in a large, randomized, controlled trial (GUIDED).
Materials and Methods
Patients diagnosed with MDD who had an inadequate response to ≥1 psychotropic medication were randomized to treatment as usual (TAU) or combinatorial pharmacogenomic test-guided care (guided-care). All patients received combinatorial pharmacogenomic testing, and medications were categorized by predicted GDI (no, moderate, or significant GDI). Patients and raters were blinded to study arm, and physicians were blinded to test results for patients in TAU, through week 8. Measures included adverse events (AEs, present/absent), worsening suicidal ideation (increase of ≥1 on the corresponding HAM-D17 question), and symptom worsening (HAM-D17 increase of ≥1). These measures were evaluated based on medication changes [add only, drop only, switch (add and drop), any, and none] and study arm, as well as baseline medication GDI.
Results
Most patients had a medication change between baseline and week 8 (938/1,166; 80.5%), including 269 (23.1%) who added only, 80 (6.9%) who dropped only, and 589 (50.5%) who switched medications. In the full cohort, changing medications resulted in an increased relative risk (RR) of experiencing AEs at both week 4 and 8 [RR 2.00 (95% CI 1.41–2.83) and RR 2.25 (95% CI 1.39–3.65), respectively]. This was true regardless of arm, with no significant difference observed between guided-care and TAU, though the RRs for guided-care were lower than for TAU. Medication change was not associated with increased suicidal ideation or symptom worsening, regardless of study arm or type of medication change. Special attention was focused on patients who entered the study taking medications identified by pharmacogenomic testing as likely having significant GDI; those who were only taking medications subject to no or moderate GDI at week 8 were significantly less likely to experience AEs than those who were still taking at least one medication subject to significant GDI (RR 0.39, 95% CI 0.15–0.99, p=0.048). No other significant differences in risk were observed at week 8.
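For readers unfamiliar with the relative-risk estimates quoted above, the short sketch below shows how a crude RR and a Wald-type 95% CI are computed from 2×2 counts; the event counts are invented for illustration and this is not the trial's analysis code:

```python
import math

def relative_risk(a, n1, c, n2):
    """Crude RR of an event in group 1 (a events out of n1) vs. group 2 (c out of n2),
    with a Wald 95% CI computed on the log scale."""
    rr = (a / n1) / (c / n2)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# Hypothetical counts: AEs among patients who changed medication (n = 938)
# vs. those who did not (n = 228); the event counts are made up.
print(relative_risk(a=120, n1=938, c=14, n2=228))
```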
Conclusion
These data indicate that patient safety in the combinatorial pharmacogenomic test-guided care arm was no worse than TAU in the GUIDED trial. Moreover, combinatorial pharmacogenomic-guided medication selection may reduce some safety concerns. Collectively, these data demonstrate that combinatorial pharmacogenomic testing can be adopted safely into clinical practice without risking symptom degradation among patients.
An acute gastroenteritis (AGE) outbreak caused by a norovirus that occurred at a hospital in Shanghai, China, was studied for molecular epidemiology, host susceptibility and serological factors. Rectal and environmental swabs, paired serum samples and saliva specimens were collected. Pathogens were detected by real-time polymerase chain reaction and DNA sequencing. Histo-blood group antigen (HBGA) phenotypes of saliva samples and their binding to norovirus protruding proteins were determined by enzyme-linked immunosorbent assay. The HBGA-binding interfaces and the surrounding region were analysed with the MegAlign program of DNAstar 7.1. Twenty-seven individuals in two care units developed AGE, with attack rates of 9.02% and 11.68%. Eighteen (78.2%) symptomatic and five (38.4%) asymptomatic individuals were GII.6/b norovirus positive. Saliva-based HBGA phenotyping showed that all symptomatic and asymptomatic cases were A, B, AB or O secretors. Only four (16.7%) of the 24 tested serum samples showed low blockade activity against HBGA–norovirus binding in the acute phase, whereas 11 (45.8%) samples at the convalescent stage showed seroconversion of such blockade. Specific blockade antibody in the population played an essential role in this norovirus epidemic. The wide HBGA-binding spectrum of GII.6 supports a need for continuous health attention and surveillance in different settings.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, which will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
In support of the ICRF experiments planned on the Wendelstein 7-X (W7-X) stellarator, i.e. fast ion generation, wall conditioning, target plasma production and heating, a first experimental study of plasma production has been made in the Uragan-2M (U-2M) stellarator using a W7-X-like two-strap antenna. In all the experiments, antenna monopole phasing was used. The W7-X-like antenna was operated with a launched radiofrequency power of ~100 kW in helium (p = (4–14) × 10−2 Pa) with the vacuum vessel walls pre-loaded with hydrogen. Production of plasma with a density higher than 10^12 cm^−3 was observed near the first harmonic of the hydrogen cyclotron frequency. Operation at the first hydrogen harmonic is therefore feasible for future W7-X ICRF experiments.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. Examining food combinations offers a way to address the complexity and unpredictability of the diet and to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control studies and one nested case–cohort study, comprising 8320 BC cases out of 31 551 participants. Dietary data on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status), were available. Five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond to previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
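C5.0 itself is distributed as R/commercial software, so the sketch below uses scikit-learn's CART decision tree as a stand-in to illustrate the general workflow described above (supervised tree induction with 10-fold cross-validation and feature-importance ranking); the feature set and data are simulated placeholders, not the study data:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Simulated intake of five Eurocode-2-style food groups plus case/control labels.
rng = np.random.default_rng(0)
n = 500
food_groups = ["beverages", "grains", "vegetables", "fats_oils", "meats"]
X = rng.normal(size=(n, len(food_groups)))   # placeholder dietary intakes
y = rng.integers(0, 2, size=n)               # 1 = bladder cancer case, 0 = control

# Tree induction evaluated with 10-fold cross-validation (AUC as the metric).
clf = DecisionTreeClassifier(max_depth=4, random_state=0)
print(cross_val_score(clf, X, y, cv=10, scoring="roc_auc").mean())

# Feature importances from a fit on all data indicate which food groups drive the splits.
clf.fit(X, y)
print(dict(zip(food_groups, clf.feature_importances_.round(3))))
```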
Diffusion tensor imaging (DTI) is a relatively new imaging technique that is increasingly used across psychiatric disorders to characterize white matter microstructural organization. In the present study we used DTI to explore the structure of the white matter of patients with borderline personality disorder (BPD), using a novel voxel-based approach, tract-based spatial statistics (TBSS), to analyze the data.
Methods and materials
DTI was performed on a 1.5T MRI unit in 9 young male patients with DSM-IV-defined BPD and 14 healthy male control subjects (no significant age difference between groups). Voxel-wise analysis was performed using TBSS (part of the FSL diffusion toolbox) to localize regions of white matter showing significant changes in fractional anisotropy (FA). Additional high-resolution three-dimensional datasets were also acquired, and normalised white matter volume was estimated with SIENAX (part of FSL).
Results
The TBSS analysis revealed a statistically significant decrease in FA at the anterior part of the body and the genu of the corpus callosum and frontal white matter. This finding is consistent with previously reported findings of subtle prefrontal white matter abnormalities in BPD.
Conclusion
Significant white matter tract alterations were observed in patients with BPD in frontal regions involved in emotional, behavioural and cognitive regulation; these abnormalities may be linked to key aspects of psychopathology in these patients.
Life events and accompanying psychological and behavioral reactions frequently have an impact upon people's daily lives and are believed to predispose them to disease. Psychological stressors impact many physiological and pathological disease outcomes, including mental illness. Positive social interactions have in turn been shown to exert powerful beneficial effects on health outcomes and longevity.
Objectives
The objective of this study was to analyze the relationships among psychological distress, social support, and mental fitness in patients of mental health services.
Aims
This article aims to discuss the evidence supporting the mediating effect of social support between psychological stress and mental health.
Methods
This study was performed on patients who visited mental health services in Daejeon from October to December 2011. In total, 395 patients were evaluated with the Mental Fitness Scale, the Kessler Psychological Distress Scale (KPDS), and the Multidimensional Scale of Perceived Social Support (MSPSS).
Results
Correlations among psychological distress, social support, and the subordinate variables of patients' mental fitness were significant. In the regression analysis, psychological distress and social support each had a significant positive influence on patients' mental fitness, and social support showed a mediating effect between psychological distress and mental fitness.
Conclusion
These results suggest that health care providers ought to seek out social support for patients in order to promote positive mental fitness.
We aimed to comprehensively examine the association of breast-feeding and the types and initial timing of complementary foods with adolescent cognitive development in low- and middle-income countries. We conducted a prospective cohort study of 745 adolescents aged 10–12 years who were born to women who participated in a randomised trial of prenatal micronutrient supplementation in rural Western China. An infant feeding index was constructed based on the current WHO recommendations. Full-scale intelligence quotient (FSIQ) was assessed and derived using the fourth edition of the Wechsler Intelligence Scale for Children. The duration of exclusive or any breast-feeding was not significantly associated with adolescent cognitive development. Participants who regularly consumed Fe-rich or Fe-fortified foods during 6–23 months of age had higher FSIQ than those who did not (adjusted mean difference 4·25; 95 % CI 1·99, 6·51). For cows’/goats’ milk and high-protein foods, the highest FSIQ was found in participants who initially consumed them at 10–12 and 7–9 months, respectively. A strong dose–response relationship with the composite infant feeding index was also identified, with participants in the highest tertile of overall feeding quality having 3·03 (95 % CI 1·37, 4·70) points higher FSIQ than those in the lowest tertile. These findings suggest that appropriate infant feeding practices (breast-feeding plus timely introduction of appropriate complementary foods) were associated with significantly improved early adolescent cognitive development scores in rural China. In addition, improving complementary feeding with Fe-rich or Fe-fortified foods may produce better adolescent cognitive development outcomes.
Rumen-protected betaine (RPB) can enhance betaine absorption in the small intestine of ruminants, while betaine can alter fat distribution and has the potential to affect the meat quality of livestock. Hence, we hypothesized that RPB might also affect the meat quality of lambs. Sixty male Hu sheep of similar weight (30.47 ± 2.04 kg) were selected and randomly subjected to five different treatments. The sheep were fed a control diet (control treatment, CTL); 1.1 g/day unprotected-betaine supplemented diet (UPB); or doses of 1.1 g/day (low RPB treatment; L-PB), 2.2 g/day (middle RPB treatment; M-PB) or 3.3 g/day (high RPB treatment; H-PB) RPB-supplemented diet for 70 days. Slaughter performance, meat quality, fatty acid and amino acid content in the longissimus dorsi (LD) muscle, shoulder muscle (SM) and gluteus muscle (GM) were measured. Compared with CTL, betaine (including UPB and RPB) supplementation increased the average daily weight gain (ADG) (P < 0.05) and average daily feed intake (P < 0.01) of lambs. Rumen-protected betaine increased ADG (P < 0.05) compared with UPB. With increasing RPB doses, the eye muscle area of the lambs linearly increased (P < 0.05). Compared with CTL, betaine supplementation decreased water loss (P < 0.05) in SM and increased pH24 in the SM (P < 0.05) and GM (P < 0.05). Compared with UPB, RPB decreased water loss in the GM (P < 0.01), decreased shear force (P < 0.05) in the LD and SM and increased the pH of the meat 24 h after slaughter (pH24). With increasing RPB doses, the shear force and b* value in the LD linearly decreased (P < 0.05), and the pH24 of the meat quadratically increased (P < 0.05). Compared with CTL, betaine supplementation increased the polyunsaturated fatty acid in the GM (P < 0.05). Compared with UPB, RPB supplementation decreased the saturated fatty acid (SFA) content in the LD (P < 0.05) and increased the unsaturated fatty acids (UFA), mono-unsaturated fatty acids and UFA/SFA ratio in the LD (P < 0.05). Compared with CTL, the content of histidine in the LD increased with betaine supplementation. Compared with UPB, RPB supplementation increased the content of total free amino acids and flavor amino acids in the LD of lambs (P < 0.05). With increasing RPB, the isoleucine and phenylalanine contents in the LD linearly increased (P < 0.05). Overall, the data collected indicated that the meat quality of lambs (especially in the LD) improved as a result of betaine supplementation, and RPB showed better effects than those of UPB.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
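As a rough, back-of-the-envelope illustration of how twin data yield heritability estimates (the CODATwins analyses themselves rely on formal variance-component models), the classical Falconer decomposition can be sketched as follows, using invented intraclass correlations:

```python
def falconer_ace(r_mz, r_dz):
    """Classical Falconer approximation of the ACE decomposition from
    monozygotic (r_mz) and dizygotic (r_dz) twin intraclass correlations."""
    h2 = 2 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz     # shared environment
    e2 = 1 - r_mz            # unique environment (plus measurement error)
    return h2, c2, e2

# Invented correlations resembling adult height: h2 ~ 0.74, c2 ~ 0.11, e2 ~ 0.15.
print(falconer_ace(r_mz=0.85, r_dz=0.48))
```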
Background: Canadian Stroke Guidelines recommend that Transient Ischemic Attack (TIA) patients at highest risk of stroke recurrence should undergo immediate vascular imaging. Computed tomography angiography (CTA) of the head and neck is recommended over carotid doppler because it allows for enhanced visualization of the intracranial and posterior circulation vasculature. Imaging while patients are in the emergency department (ED) is optimal for high-risk patients because the risk of stroke recurrence is highest in the first 48 hours. Aim Statement: At our hospital, a designated stroke centre, less than 5% of TIA patients meet national recommendations by undergoing CTA in the ED. We sought to increase the rate of CTA in high-risk ED TIA patients from less than 5% to at least 80% in 10 months. Measures & Design: We used a multi-faceted approach to improve our adherence to guidelines, including: 1) education for staff ED physicians; 2) agreements between ED and radiology to facilitate rapid access to CTA; 3) agreements between ED and neurology for consultations regarding patients with abnormal CTA; and 4) the creation of an electronic decision support tool to guide ED physicians as to which patients require CTA. We measured the rate of CTA in high-risk patients every two weeks using retrospective chart review of patients referred to the TIA clinic from the ED. As a balancing measure, we also measured the rate of CTA in non-high-risk patients. Evaluation/Results: Data collection is ongoing. An interim run chart at 19 weeks shows a complete shift above the median after implementation, with CTA rates between 70% and 100%. At the time of submission, we had no downward trends below 80%, showing sustained improvement. The CTA rate in non-high-risk patients also increased. Discussion/Impact: After 19 weeks of our intervention, 112 (78.9%) of high-risk TIA patients had a CTA, compared with 10 (9.8%) in the 19 weeks prior to our intervention. On average, 10-15% of high-risk patients will have an identifiable lesion on CTA, leading to an immediate change in management (at minimum, an inpatient consultation with neurology). Our multi-faceted approach could be replicated in any ED with the engagement and availability of the same multi-disciplinary team (ED, radiology, and neurology), access to CTA, and electronic orders.
A stable reference gene is a key prerequisite for accurate assessment of gene expression. The real-time reverse transcription quantitative polymerase chain reaction is now widely used to analyze gene expression in a variety of organisms. Neoseiulus barkeri Hughes (Acari: Phytoseiidae) is a major predator of mites on many economically important crops. Until now, however, there have been no reports evaluating the stability of reference genes in this species. In view of this, we used the GeNorm, NormFinder, BestKeeper, and RefFinder software tools to evaluate the expression stability of 11 candidate reference genes across developmental stages and under various abiotic stresses. According to our results, β-ACT and Hsp40 were the two most stable reference genes across developmental stages, while Hsp60 and Hsp90 were the most stable under acaricide stress. For alterations in temperature, Hsp40 and α-TUB were the most suitable reference genes. For UV stress, EF1α and α-TUB were the best choice, and for different prey treatments, β-ACT and α-TUB were best suited. Under normal conditions and across all of the stresses examined, β-ACT and α-TUB were among the most stable reference genes. The current study provides a valuable foundation for further analysis of gene expression in N. barkeri.
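As an illustration of the kind of stability measure these tools compute, the sketch below implements a simplified geNorm-style M value (mean standard deviation of pairwise log2 expression ratios; lower means more stable); it is not the GeNorm/NormFinder/BestKeeper software itself, and the expression values are made up:

```python
import numpy as np

def genorm_m(expr, gene_names):
    """Simplified geNorm stability measure M for each candidate gene:
    mean SD of pairwise log2 expression ratios across samples (lower = more stable)."""
    log_expr = np.log2(expr)
    m_values = {}
    for j, name in enumerate(gene_names):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(expr.shape[1]) if k != j]
        m_values[name] = round(float(np.mean(sds)), 3)
    return m_values

# Made-up relative expression values (rows = samples, columns = candidate genes).
expr = np.array([
    [1.0, 2.1, 0.9],
    [1.2, 2.0, 1.5],
    [0.8, 1.9, 0.7],
    [1.1, 2.2, 1.2],
])
print(genorm_m(expr, ["beta-ACT", "Hsp40", "alpha-TUB"]))
```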
Cow routines and behavioral responses are altered substantially following the installation of robotic milking. The present study was designed to analyze the effect that switching from a milking parlor to an automatic milking system (AMS) had on the culling rate (due to various causes) of dairy cattle. For this purpose, culling records and causes for culling were tracked in 23 dairy farms in the Galicia region (NW Spain). The animals in these farms were monitored for 5 years. This period was divided into three stages: the 2 years before switching from a milking parlor to AMS (stage 1), the first year following the implementation of AMS (stage 2), and the second and third years following the implementation of AMS (stage 3). Cox models for survival analysis were used to estimate the time to culling due to different reasons during stage 1 relative to stages 2 and 3. The data indicated that the risk of loss due to death or emergency slaughter decreased significantly following the installation of AMS. In contrast, the risk of culling due to low production, udder problems, infertility or lameness increased significantly. Low-production cows (such as cows in advanced lactation due to infertility) or sick cows (such as mastitic or lame cows) presumably have a noticeable effect on both the performance and the amortization of the cost of AMS, which in turn would lead to a higher probability of culling than in conventional systems.
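A minimal sketch, assuming the Python lifelines package, of the kind of Cox proportional hazards model used to compare time to culling across stages; the data frame and column names are hypothetical, not the study's records:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-cow records: follow-up time, culling indicator, and covariates.
cows = pd.DataFrame({
    "days_followed": [300, 450, 120, 600, 220, 510, 90, 400, 350, 275],
    "culled":        [1,   0,   1,   0,   1,   1,   1,  0,   1,   0],   # 1 = culled, 0 = censored
    "ams_stage":     [1,   1,   1,   2,   2,   2,   3,  3,   3,   3],   # treated as numeric for simplicity
    "parity":        [2,   3,   1,   4,   2,   3,   1,  2,   3,   2],
})

# Fit the proportional hazards model; exp(coef) gives hazard ratios for culling.
cph = CoxPHFitter()
cph.fit(cows, duration_col="days_followed", event_col="culled")
cph.print_summary()
```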
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.