To analyse the results of treatment for nasolabial cysts according to whether an intraoral sublabial or endoscopic transnasal approach was used, and to determine the recent surgical trend in our hospital.
Twenty-four patients with histopathologically and radiologically confirmed nasolabial cysts treated between January 2010 and December 2017 were enrolled in this study.
Nasolabial cysts were predominant in females (91.7 per cent) and on the left side (54.2 per cent). Treatment involved an intraoral sublabial approach in 12 cases (48.0 per cent) and a transnasal endoscopic approach in 13 cases (52.0 per cent). In 13 cases (52.0 per cent) surgery was performed under local anaesthesia, while in 12 cases (48.0 per cent) it was conducted under general anaesthesia. The most common post-operative complication was numbness of the upper lip or teeth (n = 9, 36.0 per cent). Only one patient (4.0 per cent), who underwent a transnasal endoscopic approach, experienced a recurrence.
Surgical resection through an intraoral sublabial or transnasal endoscopic approach is the best treatment for a nasolabial cyst, showing very good results and a low recurrence rate. The recent surgical trend in our hospital is to treat nasolabial cysts using a transnasal endoscopic approach under local anaesthesia.
To analyse how the auditory brainstem response changes in patients with sudden sensorineural hearing loss.
Data were collected via retrospective medical chart review.
Forty-three patients were included in this study. The mean latency of auditory brainstem response wave 1 was significantly longer for the affected side than for the unaffected side (p = 0.003). The mean latency of auditory brainstem response wave 1 was significantly shorter, and the mean amplitude of auditory brainstem response wave 1 was significantly larger, in the good response group compared to the poor response group. In forward conditional logistic regression analysis, auditory brainstem response wave 1 latency was an independent predictor of a good response (odds ratio = 34.37, 95 per cent confidence interval = 1.56–757.15, p = 0.025).
In patients with sudden sensorineural hearing loss, the latency of wave 1 of the auditory brainstem response was significantly increased and was related to prognosis.
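The odds ratio and confidence interval quoted above have the form produced by exponentiating a logistic-regression coefficient and its Wald interval. As a sketch of that conversion (the coefficient and standard error below are invented for illustration, not taken from the study's data):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard
    error into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values only (not from the study): a large coefficient
# with a large standard error yields the kind of wide CI often seen
# in small samples.
or_, lo, hi = odds_ratio_ci(3.54, 1.58)
print(or_, lo, hi)
```

Note how a standard error of this size inflates the upper bound into the hundreds, which is why very wide intervals like the one reported above usually signal a small number of events.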
Maternal systemic inflammation during pregnancy may restrict embryo-fetal growth, but the extent of this effect remains poorly established in undernourished populations. In a cohort of 653 maternal-newborn dyads participating in a multi-armed micronutrient supplementation trial in southern Nepal, we investigated associations between maternal inflammation, assessed by serum α1-acid glycoprotein and C-reactive protein in the first and third trimesters of pregnancy, and newborn weight, length, and head and chest circumferences. Median (IQR) maternal concentrations of α1-acid glycoprotein in the first and third trimesters were 0.65 (0.53–0.76) and 0.40 (0.33–0.50) g/l, and those of C-reactive protein were 0.56 (0.25–1.54) and 1.07 (0.43–2.32) mg/l, respectively. α1-acid glycoprotein was inversely associated with birth size: weight, length, head circumference and chest circumference were lower by 116 g (P = 2.3 × 10⁻⁶), 0.45 cm (P = 3.1 × 10⁻⁵), 0.18 cm (P = 0.0191) and 0.48 cm (P = 1.7 × 10⁻⁷), respectively, per 50% increase in α1-acid glycoprotein averaged across both trimesters. Adjustment for maternal age, parity, gestational age, nutritional and socio-economic status and daily micronutrient supplementation did not alter any association. Serum C-reactive protein concentration was largely unassociated with newborn size. In rural Nepal, birth size was inversely associated with low-grade, chronic inflammation during pregnancy as indicated by serum α1-acid glycoprotein.
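Effects reported "per 50% increase" in an exposure are typically obtained by regressing the outcome on the log-transformed exposure and scaling the coefficient by ln(1.5). That model form is an assumption on my part; a sketch with a hypothetical slope chosen so the result is of the same order as the birth-weight figure above:

```python
import math

def effect_per_pct_increase(beta_per_ln_unit, pct):
    """Scale a regression coefficient expressed per ln-unit of the
    exposure to the effect of a `pct` percent increase in the exposure."""
    return beta_per_ln_unit * math.log(1 + pct / 100)

# Hypothetical slope: -286 g of birth weight per ln-unit of AGP
# translates to roughly -116 g per 50% increase in AGP,
# since ln(1.5) is about 0.405.
delta_g = effect_per_pct_increase(-286, 50)
```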
Online self-reported 24-h dietary recall systems promise increased feasibility of dietary assessment. Comparison against interviewer-led recalls established their convergent validity; however, reliability and criterion-validity information is lacking. The validity of energy intakes (EI) reported using Intake24, an online 24-h recall system, was assessed against concurrent measurement of total energy expenditure (TEE) using doubly labelled water in ninety-eight UK adults (40–65 years). Accuracy and precision of EI were assessed using correlation and Bland–Altman analysis. Test–retest reliability of energy and nutrient intakes was assessed using data from three further UK studies where participants (11–88 years) completed Intake24 at least four times; reliability was assessed using intra-class correlations (ICC). Compared with TEE, participants under-reported EI by 25 % (95 % limits of agreement −73 % to +68 %) in the first recall, 22 % (−61 % to +41 %) for average of first two, and 25 % (−60 % to +28 %) for first three recalls. Correlations between EI and TEE were 0·31 (first), 0·47 (first two) and 0·39 (first three recalls), respectively. ICC for a single recall was 0·35 for EI and ranged from 0·31 for Fe to 0·43 for non-milk extrinsic sugars (NMES). Considering pairs of recalls (first two v. third and fourth recalls), ICC was 0·52 for EI and ranged from 0·37 for fat to 0·63 for NMES. EI reported with Intake24 was moderately correlated with objectively measured TEE and underestimated on average to the same extent as seen with interviewer-led 24-h recalls and estimated weight food diaries. Online 24-h recall systems may offer low-cost, low-burden alternatives for collecting dietary information.
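The Bland–Altman analysis mentioned above summarizes agreement between two methods as a mean bias with 95% limits of agreement (bias ± 1.96 SD of the paired differences). A minimal sketch with synthetic numbers (the study expressed agreement as percentage differences, but the mechanics are the same):

```python
import statistics as st

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement between paired
    measurements from two methods (here: reported EI vs. TEE)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)  # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Synthetic paired values (MJ/day), illustrating under-reporting;
# these are invented numbers, not the study's data.
ei  = [8.1, 9.5, 7.2, 10.0, 6.8]    # self-reported energy intake
tee = [10.4, 11.0, 9.8, 12.1, 9.5]  # doubly labelled water TEE
bias, lower, upper = bland_altman(ei, tee)
```

A consistently negative bias with limits that exclude zero, as in this toy example, is the pattern behind the ~25% under-reporting reported above.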
Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles. For the disease to occur, a second event that disrupts the expression of the other inherited Pkd1 allele must occur. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, we generated mice with WD-specific deletion of one or both Pkd1 alleles. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from control in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of these data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m²) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Introduction: Individualizing the risk of stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke-Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 ± 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) underwent CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the risk of stroke/CEA ≤7 days as: low (probability <0.2%; iLR 0.20, 95% CI 0.091–0.44), moderate (probability 1.3%; iLR 0.79, 95% CI 0.68–0.92) and high (probability 2.6%; iLR 2.2, 95% CI 1.9–2.6). Sensitivity analysis for stroke alone ≤7 days yielded similar results: low iLR 0.17 (95% CI 0.056–0.52), moderate iLR 0.89 (0.75–1.1), high iLR 2.0 (1.6–2.4).
Conclusion: The Canadian TIA Score accurately identifies the risk of stroke/CEA ≤7 days in TIA patients. Patients classified as low risk can be safely discharged after a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke-specialist follow-up. Patients at high risk should, in most cases, be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
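The interval (stratum-specific) likelihood ratios reported for each risk band are the proportion of outcome-positive patients falling in a stratum divided by the proportion of outcome-negative patients falling in the same stratum. A sketch using made-up stratum counts (only the totals, 181 outcomes among 7,569 patients, come from the abstract):

```python
def interval_lr(cases_in_stratum, total_cases,
                noncases_in_stratum, total_noncases):
    """Stratum-specific (interval) likelihood ratio: share of
    outcome-positive patients in the stratum over the share of
    outcome-negative patients in the same stratum."""
    return ((cases_in_stratum / total_cases)
            / (noncases_in_stratum / total_noncases))

TOTAL_CASES = 181            # strokes/CEA <=7 days (from the abstract)
TOTAL_NONCASES = 7569 - 181  # remaining enrolled patients

# Hypothetical split: 120 outcomes and 2,000 non-outcomes in a
# high-risk stratum (invented counts, for illustration only).
lr_high = interval_lr(120, TOTAL_CASES, 2000, TOTAL_NONCASES)
```

An iLR above 1 means outcomes are over-represented in the stratum relative to non-outcomes; values near the reported 2.2 arise when roughly two-thirds of outcomes fall in a stratum holding only about a quarter of the non-outcomes.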
Introduction: Many drugs, including cannabis and alcohol, cause impairment and contribute to motor vehicle collisions (MVCs). Policy makers require knowledge of the prevalence of drug use in crash-involved drivers, and of the types of drugs used, in order to develop effective prevention programs. This issue is particularly relevant with the recent legalization of cannabis. We aim to study the prevalence of alcohol, cannabis, sedating medications, and other drugs in injured drivers from 4 Canadian provinces. Methods: This prospective cohort study obtained excess clinical blood samples from consecutive injured drivers who attended a participating Canadian trauma centre following an MVC. Blood samples were analyzed using a broad-spectrum toxicology screen capable of detecting cannabinoids, cocaine, amphetamines (including their major analogues), and opioids, as well as psychotropic pharmaceuticals (including antihistamines, benzodiazepines, other hypnotics, and sedating antidepressants). Alcohol and cannabinoids were quantified. Health records were reviewed to extract demographic, medical, and MVC information using a standardized data collection tool. Results: This study has been collecting data in 4 trauma centres in British Columbia (BC) since 2011 and was launched in 2 trauma centres in Alberta (AB), 1 in Saskatchewan (SK), and 2 in Ontario (ON) in 2018. In preliminary results from BC (n = 2412), 8% of injured drivers tested positive for THC and 13% for alcohol. Preliminary results from the other provinces (n = 301) suggest regional variation in the prevalence of drivers testing positive for THC (10%–27%), alcohol (17%–29%), and other drugs. By May 2018, an estimated 4500 cases from BC, 600 from AB, 150 from SK, and 650 from ON will have been analyzed. We will report the prevalence of positive tests for alcohol, THC, other recreational drugs, and sedating medications, before and after cannabis legalization.
The number of cases with alcohol and/or THC levels above Canadian per se limits will also be reported. Results will be reported by province, driver sex, age, single- vs. multi-vehicle crashes, and requirement for hospital admission. Conclusion: This will be among the largest international datasets on drug use by injured drivers. Our findings will describe patterns of drug and alcohol impairment in 4 Canadian provinces before and after cannabis legalization. The significance of these findings and their implications for impaired-driving policy and prevention programs in Canada will be discussed.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks by the second patient, except for one outbreak involving >1 transmission route, which was detected at the eighth patient. Up to 40 infections (78% of possibly preventable infections) or 34 infections (66%) could have been prevented had data mining been running in real time, assuming initiation of an effective intervention within 7 or 14 days of identification of the transmission route, respectively.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo, and has been implicated (in vitro) in similar protection in the rumen. However, all in vivo comparisons have compared RC with other forages, typically ones with lower PPO levels, which introduces confounding factors as to the cause of the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) in RC silage. This study compared two RC silages which, when ensiled, had contrasting PPO activities (RC+ and RC−), against a control of perennial ryegrass silage (PRG), to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow in six Hereford × Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance in six non-lactating Holstein-Friesian dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3 × 3 Latin squares using big-bale silages ensiled in 2010 and 2011, respectively. In the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate- and liquid-phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than on PRG and no difference between RC+ and RC−, even when reported on a N intake basis. C18 PUFA biohydrogenation was lower on the RC silage diets than on PRG, but with no difference between RC+ and RC−. The N balance trial showed greater retention of N on RC+ than on RC−; however, this response is more likely related to the difference in N intake than to any PPO-driven protection.
The lack of difference between the RC silages, despite contrasting levels of PPO, may reflect the similar level of protein-bound-phenol complexing determined in each RC silage. This complexing has previously been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years of age ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
Among 39 pediatric hospitals, we observed that pediatric S. aureus hospitalizations decreased by 36%, from 26.3 to 16.8 infections per 1,000 admissions, between 2009 and 2016, with methicillin-resistant S. aureus (MRSA) infections decreasing by 52% and methicillin-susceptible S. aureus infections by 17%. Similar decreases were observed in days of therapy of anti-MRSA antibiotics.
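The headline figure is plain rate arithmetic; a quick check using the rates given in the abstract:

```python
def pct_decrease(before, after):
    """Percentage decrease from `before` to `after`."""
    return 100 * (before - after) / before

# 26.3 -> 16.8 infections per 1,000 admissions: about a 36% decline.
drop = pct_decrease(26.3, 16.8)
```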
The white-backed planthopper, Sogatella furcifera (Horváth) (Hemiptera: Delphacidae), has emerged as a serious rice pest in Asia. In the present study, 12 microsatellite markers were employed to investigate the genetic structure, diversity and migration routes of 43 populations sampled from seven Asian countries (Bangladesh, China, Korea, Laos, Nepal, Thailand and Vietnam). In the isolation-by-distance analysis, a significant positive correlation between genetic and geographic distances was observed by the Mantel test (r² = 0.4585, P = 0.01), indicating a role of geographic isolation in the genetic structure of S. furcifera. A population assignment test using the first-generation migrant detection method (threshold α = 0.01) revealed southern China and northern Vietnam as the main sources of S. furcifera in Korea. Nepal and Bangladesh might be additional potential sources via interconnection with the Vietnam populations. This paper provides useful data on the migration route and origin of S. furcifera in Korea and will contribute to planthopper resistance management.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Chlamydia trachomatis (CT) infections remain highly prevalent. CT reinfection occurs frequently within months after treatment, likely contributing to the sustained high prevalence of CT infection. A few studies have suggested that CT reinfection is associated with a lower organism load, but it is unclear whether CT load at the time of treatment influences CT reinfection risk. In this study, women presenting for treatment of a positive CT screening test were enrolled, treated and asked to return for 3- and 6-month follow-up visits. CT organism load was quantified at each visit. We evaluated for an association of CT bacterial load at initial infection with reinfection risk, and investigated factors influencing CT load at baseline and follow-up in those with CT reinfection. We found no association of initial CT load with reinfection risk. We found a significant decrease in the median log₁₀ CT load from baseline to follow-up in those with reinfection (5.6 vs. 4.5 log₁₀ CT/ml; P = 0.015). Upon stratification of reinfected subjects by the presence or absence of a history of CT infection prior to their infection at the baseline visit, we found a significant decline in CT load from baseline to follow-up (5.7 vs. 4.3 log₁₀ CT/ml; P = 0.021) exclusively in patients with a history of CT infection prior to our study. Our findings suggest that repeated CT infections may lead to the development of partial immunity against CT.
Our ALMA observations of HCO⁺ and HCN show redshifted absorption toward an isolated core, BHR 71. Both lines show a similar redshifted absorption profile. We also found emission from complex organic molecules (COMs) around 345 GHz in a compact region centered on the continuum source, which is barely resolved with a beam of 0.27″, corresponding to ∼50 AU.
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most meningioma patients, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies of adult patients were considered. The QOL tools used in the various studies were analyzed to identify prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools were used, assessing social and physical functioning and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared with observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable in the social and physical functioning domains. All of these findings must be interpreted with great caution owing to marked clinical heterogeneity, limited generalizability and risk of bias. For meningioma patients, the ideal QOL questionnaire would present outcomes that can be easily measured, presented and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard and simple meningioma-specific survey can be prospectively developed and validated.
OBJECTIVES/SPECIFIC AIMS: Reducing radiologic exams has been a focus of cost reduction in healthcare systems. The utility and justification of obtaining pre-procedure cross-sectional imaging (PPCSI) before surgical intervention continue to be evaluated. For peripheral artery disease (PAD), consensus guidelines regarding PPCSI do not exist, and its use may be influenced by patient complexity, variation in disease presentation, and physician preference. The objective of this study was to determine the utility of PPCSI before percutaneous PAD intervention. METHODS/STUDY POPULATION: Patients receiving a first-time endovascular revascularization procedure for PAD from 2013 to 2015 were evaluated for PPCSI performed within 180 days prior to revascularization. Patient and physician demographics, perioperative characteristics, and disease distribution/severity were evaluated. The primary outcome was technical success, defined as improving inflow and/or revascularization of the target outflow vessels to <50% stenosis. RESULTS/ANTICIPATED RESULTS: Of the 348 patients who underwent an attempted revascularization procedure, 159 (45.7%) underwent PPCSI, including 151 CTA and 8 MRA. Of these, 48% were ordered by the referring provider (84% at an outside institution) and 52% by the treating physician. PPCSI was performed a median of 26 days (IQR 9–53) before the procedure. Individual vascular surgeon practice showed PPCSI rates ranging from 31% to 70%. On multivariate analysis, chronic kidney disease (OR=0.35; CI 0.17–0.73) was most strongly associated with not undergoing PPCSI, while inpatient/ED evaluation (OR=3.20; CI 1.58–6.50), aorto-iliac occlusions (OR=2.78; CI 1.46–5.29) and femoral-popliteal occlusions (OR=2.51; CI 1.38–4.55) most strongly predicted PPCSI. After excluding 31 diagnostic procedures, technical success did not differ between endovascular procedures with PPCSI (91.3%) and without PPCSI (85.6%), p=0.11.
When analyzing 89 femoral-popliteal occlusions, technical success was higher with PPCSI (88%) than without PPCSI (69%), p=0.026. DISCUSSION/SIGNIFICANCE OF IMPACT: PPCSI use is influenced by inpatient status, chronic kidney disease, and anatomic considerations. PPCSI was not associated with overall technical success, although it appeared beneficial for femoral-popliteal occlusions. Routine ordering of PPCSI may not be warranted on the basis of technical success alone but may be important in treatment planning. Further studies are warranted to determine whether radiation, cost, and contrast load justify PPCSI.
Childhood obesity rates are higher among Indigenous than non-Indigenous Australian children. It has been hypothesized that early-life influences, beginning with the intrauterine environment, predict the development of obesity in the offspring. The aim of this paper was to assess, in 227 mother–child dyads from the Gomeroi gaaynggal cohort, associations between prematurity, Gestation-Related Optimal Weight (GROW) centiles, maternal adiposity (percentage body fat, visceral fat area), maternal non-fasting plasma glucose levels (measured at a mean gestational age of 23.1 weeks) and offspring BMI and adiposity (abdominal circumference, subscapular skinfold thickness) in early childhood (mean age 23.4 months). Maternal non-fasting plasma glucose concentrations were positively associated with infant birth weight (P=0.005) and GROW customized birth weight centiles (P=0.008). There were significant associations of maternal percentage body fat (P=0.02) and visceral fat area (P=0.00) with infant body weight in early childhood. Body mass index (BMI) in early childhood was significantly higher in offspring born preterm than in those born at term (P=0.03). GROW customized birth weight centiles were significantly associated with body weight (P=0.01), BMI (P=0.007) and abdominal circumference (P=0.039) in early childhood. Our findings suggest that being born preterm, large for gestational age, or exposed to an obesogenic intrauterine environment with higher maternal non-fasting plasma glucose concentrations is associated with increased obesity risk in early childhood. Future strategies should aim to reduce the prevalence of overweight/obesity in women of child-bearing age and emphasize the importance of optimal glycemia during pregnancy, particularly in Indigenous women.