The Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that the Frascati criteria are too liberal, resulting in a high false-positive rate. Meyer et al. recommended more conservative revisions to the HAND criteria and urged exploration of other commonly used methods for identifying neurocognitive impairment (NCI) in HIV, including the global deficit score (GDS). This study compares NCI classifications by the Frascati, Meyer, and GDS methods in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati criteria versus Meyer criteria: concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were formed using GDS criteria in place of Frascati criteria.
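For context, the GDS method referenced above follows a widely published convention: demographically corrected T-scores are mapped to deficit scores, averaged across the battery, and a cutoff of GDS ≥ 0.5 conventionally denotes impairment. A minimal sketch of that convention (the T-scores below are hypothetical, not study data):

```python
def deficit_score(t: float) -> int:
    # Conventional mapping from a demographically corrected T-score
    # to a deficit score (0 = no deficit ... 5 = severe deficit).
    if t >= 40:
        return 0
    if t >= 35:
        return 1
    if t >= 30:
        return 2
    if t >= 25:
        return 3
    if t >= 20:
        return 4
    return 5

def global_deficit_score(t_scores):
    # The GDS is the mean deficit score across all tests in the battery.
    return sum(deficit_score(t) for t in t_scores) / len(t_scores)

battery = [42, 38, 51, 29, 44, 36, 47]   # hypothetical T-scores
gds = global_deficit_score(battery)
print(f"GDS = {gds:.2f} -> {'impaired' if gds >= 0.5 else 'unimpaired'}")
```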
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA.

Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status.

Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, higher employment, and better health-related quality of life than non-SA participants.

Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
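A minimal sketch of the three-way classification defined above; the "within normal range" cutoff of T ≥ 40 is an assumption for illustration, as the abstract does not state the exact threshold used:

```python
def classify(global_t_vs_25yo: float, global_t_vs_own_age: float) -> str:
    # Assumed cutoff: "within normal range" taken as T >= 40, i.e. no
    # more than 1 SD below the normative mean for 25-year-olds.
    if global_t_vs_25yo >= 40:
        return "SA"   # SuperAger: performs in the normal range for age 25
    # Everyone else is judged against their own-age norms.
    return "CN" if global_t_vs_own_age >= 40 else "CI"

print(classify(44, 48))   # SA
print(classify(36, 43))   # CN
print(classify(31, 37))   # CI
```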
Locally acquired hepatitis A infection is re-emerging in Australia, owing both to person-to-person outbreaks among men who have sex with men and to imported frozen produce. This paper describes a multi-state foodborne outbreak in the first half of 2018. Enhanced human epidemiological investigation, including a case–control study, as well as microbial surveillance and trace-back investigations, concluded that the outbreak was caused by consumption of imported frozen pomegranate arils. A total of 30 cases of hepatitis A infection (genotype IB, with identical sequences) met the outbreak case definition, including 27 primary cases and three secondary cases. Twenty-five (83%) of the cases were hospitalised for their illness and there was one death. Imported frozen pomegranate arils from Egypt were strongly implicated as the source of infection through case interviews (19 of 26 primary cases) as well as through a case–control study (adjusted odds ratio 43.4, 95% confidence interval 4.2–448.8, P = 0.002). Hepatitis A virus (HAV) was subsequently detected by polymerase chain reaction in two food samples of the frozen pomegranate aril product. This outbreak was detected and responded to promptly owing to routine genetic characterisation of HAVs from all hepatitis A infections in Australia as part of a national hepatitis A enhanced surveillance project. This is now the third outbreak of hepatitis A in Australia from imported frozen fruits. A re-assessment of the risk of these types of imported foods is strongly recommended.
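For readers unfamiliar with the case–control estimate quoted above, a crude odds ratio and its Wald 95% confidence interval can be computed from a 2×2 table as below. The counts are hypothetical, and the outbreak's reported aOR of 43.4 came from an adjusted model, not this crude calculation:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical exposure counts for illustration only:
print(odds_ratio_ci(a=19, b=5, c=7, d=45))
```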
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared rates of NCI in large, well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos.

Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group, age: M=42.65 years, SD=8.93; 86% male; education: M=13.17, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education, and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry.

Results: Compared with Whites, Latinos had higher rates of global NCI (54% vs. 42%) and of domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60; 71%) versus Mexican (n=79; 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03).

Conclusions: HIV+ Latinos, especially those of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry. Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
Research has long noted higher prevalence rates of suicidal thoughts and behaviors among individuals with psychotic symptoms. Major theories have proposed several explanations to account for this association. Given the differences in the literature regarding the operationalization of psychosis and sample characteristics, a quantitative review is needed to determine to what extent and how psychosis confers risk for suicidality.
We searched PsycInfo, PubMed, and Google Scholar for studies published before 1 January 2016. To be included in the analysis, studies must have used at least one psychosis-related factor to longitudinally predict suicide ideation, attempt, or death. The initial search yielded 2541 studies. Fifty studies were retained for analysis, yielding 128 statistical tests.
Suicide death was the most commonly studied outcome (43.0%), followed by attempt (39.1%) and ideation (18.0%). The median follow-up length was 7.5 years. Overall, psychosis significantly conferred risk across three outcomes, with weighted mean ORs of 1.70 (1.39–2.08) for ideation, 1.36 (1.25–1.48) for attempt, and 1.40 (1.14–1.72) for death. Detailed analyses indicated that positive symptoms consistently conferred risk across outcomes; negative symptoms were not significantly associated with ideation, and were protective against death. Some small moderator effects were detected for sample characteristics.
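The weighted mean ORs above are the kind of estimate produced by inverse-variance pooling of study-level log odds ratios. A minimal sketch of fixed-effect pooling follows; the meta-analysis may have used a random-effects estimator, and the study inputs below are hypothetical:

```python
import math

def pooled_or(ors, cis, z=1.96):
    """Fixed-effect (inverse-variance) pooling of odds ratios.
    cis holds (lower, upper) 95% confidence limits; the SE of each
    log-OR is recovered from the CI width."""
    log_ors = [math.log(o) for o in ors]
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    ws = [1 / se**2 for se in ses]                       # weights
    pooled = sum(w * lor for w, lor in zip(ws, log_ors)) / sum(ws)
    se_pooled = math.sqrt(1 / sum(ws))
    return (math.exp(pooled),
            (math.exp(pooled - z * se_pooled),
             math.exp(pooled + z * se_pooled)))

# Three hypothetical studies, not the review's actual data:
print(pooled_or([1.5, 2.1, 1.2], [(1.1, 2.0), (1.4, 3.2), (0.9, 1.6)]))
```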
Psychosis is a significant risk factor for suicide ideation, attempt, and death. The finding that positive symptoms increased suicide risk and negative symptoms seemed to decrease risk sheds light on the potential mechanisms for the association between psychosis and suicidality. We note several limitations of the literature and offer suggestions for future directions.
During May 2015, an increase in Salmonella Agona cases was reported from western Sydney, Australia. We examine the public health actions used to investigate and control this increase. A descriptive case-series investigation was conducted. Six outbreak cases were identified; all had consumed cooked tuna sushi rolls purchased within a western Sydney shopping complex. Onset of illness for outbreak cases occurred between 7 April and 24 May 2015. Salmonella was isolated from food samples collected from the implicated premises and a prohibition order was issued. No further cases were identified following this action. Whole genome sequence (WGS) analysis was performed on isolates recovered during this investigation, together with additional S. Agona isolates from sporadic clinical cases and routine food sampling in New South Wales from January to July 2015. Clinical isolates of outbreak cases were indistinguishable from food isolates collected from the implicated sushi outlet. Five additional clinical isolates not originally considered to be linked to the outbreak were genomically similar to outbreak isolates, indicating that the point-source contamination may have begun before routine surveillance identified an increase. This investigation demonstrated the value of genomics-guided public health action, where near real-time WGS enhanced the resolution of the epidemiological investigation.
To determine the length and position of a thyroidectomy scar that is cosmetically most appealing to naïve raters.
Images of thyroidectomy scars were reproduced on male and female necks using digital imaging software. Surgical variables studied were scar position and length. Fifteen raters were presented with 56 scar pairings and asked to identify which was preferred cosmetically. Twenty duplicate pairings were included to assess rater reliability. Analysis of variance was used to determine preference.
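A simplified stand-in for the analysis just described: tally how often each scar condition wins a pairing per rater, then compare conditions with a one-way ANOVA. The data below are simulated for illustration, not the study's ratings:

```python
import numpy as np
from scipy.stats import f_oneway

# Simulated layout: wins[rater, condition] = number of pairings in
# which that scar condition (low/short, high/short, low/long,
# high/long) was the one the rater preferred.
rng = np.random.default_rng(0)
conditions = ["low-short", "high-short", "low-long", "high-long"]
wins = rng.multinomial(56, [0.40, 0.30, 0.17, 0.13], size=15)  # 15 raters

# One-way ANOVA on per-rater win counts across the four conditions,
# a simplified stand-in for the study's analysis of variance.
f, p = f_oneway(*(wins[:, j] for j in range(len(conditions))))
print(dict(zip(conditions, wins.mean(axis=0))))
print(f"F = {f:.2f}, p = {p:.4f}")
```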
Raters preferred low, short scars, followed by high, short scars, with long scars in either position being less desirable (p < 0.05). Twelve of 15 raters had acceptable intra-rater and inter-rater reliability.
Naïve raters preferred low, short scars over the alternatives. High, short scars were the next most favourably rated. If other factors influencing incision choice are considered equal, surgeons should consider these preferences in scar position and length when planning their thyroidectomy approach.
A recent outbreak of Q fever was linked to an intensive goat and sheep dairy farm in Victoria, Australia, 2012-2014. Seventeen employees and one family member were confirmed with Q fever over a 28-month period, including two culture-positive cases. The outbreak investigation and management involved a One Health approach with representation from human, animal, environmental and public health. Seroprevalence in non-pregnant milking goats was 15% [95% confidence interval (CI) 7–27]; active infection was confirmed by positive quantitative PCR on several animal specimens. Genotyping of Coxiella burnetii DNA obtained from goat and human specimens was identical by two typing methods. A number of farming practices probably contributed to the outbreak, with similar precipitating factors to the Netherlands outbreak, 2007-2012. Compared to workers in a high-efficiency particulate arrestance (HEPA) filtered factory, administrative staff in an unfiltered adjoining office and those regularly handling goats and kids had 5·49 (95% CI 1·29–23·4) and 5·65 (95% CI 1·09–29·3) times the risk of infection, respectively, suggesting factory workers were protected from windborne spread of organisms. Reduction in the incidence of human cases was achieved through an intensive human vaccination programme plus environmental and biosecurity interventions. Subsequent non-occupational acquisition of Q fever in the spouse of an employee indicates that infection remains endemic in the goat herd and remains a challenge to manage without source control.
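The risk ratios quoted above compare attack rates between exposure groups; a crude risk ratio and Wald 95% CI can be computed from cohort-style counts as sketched below (hypothetical numbers, not the investigation's data):

```python
import math

def relative_risk(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    """Crude risk ratio with a Wald 95% CI from cohort-style counts."""
    rr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
    se = math.sqrt((1/cases_exp - 1/n_exp)
                   + (1/cases_unexp - 1/n_unexp))   # SE of log(RR)
    return rr, (math.exp(math.log(rr) - z * se),
                math.exp(math.log(rr) + z * se))

# Hypothetical counts for illustration; the outbreak's reported
# estimates (5.49 and 5.65) came from the investigation's own data.
print(relative_risk(cases_exp=8, n_exp=20, cases_unexp=3, n_unexp=41))
```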
A history of self-injurious thoughts and behaviors (SITBs) is consistently cited as one of the strongest predictors of future suicidal behavior. However, stark discrepancies in the literature raise questions about the true magnitude of these associations. The objective of this study is to examine the magnitude and clinical utility of the associations between SITBs and subsequent suicide ideation, attempts, and death.
We searched PubMed, PsycInfo, and Google Scholar for papers published through December 2014. Inclusion required that studies include at least one longitudinal analysis predicting suicide ideation, attempts, or death using any SITB variable. We identified 2179 longitudinal studies; 172 met inclusion criteria.
The most common outcome was suicide attempt (47.80%), followed by death (40.50%) and ideation (11.60%). Median follow-up was 52 months (mean = 82.52, s.d. = 102.29). Overall prediction was weak, with weighted mean odds ratios (ORs) of 2.07 [95% confidence interval (CI) 1.76–2.43] for ideation, 2.14 (95% CI 2.00–2.30) for attempts, and 1.54 (95% CI 1.39–1.71) for death. Adjusting for publication bias further reduced estimates. Diagnostic accuracy analyses indicated acceptable specificity (86–87%) and poor sensitivity (10–26%), with areas under the curve marginally above chance (0.60–0.62). Most risk factors generated OR estimates of <2.0 and no risk factor exceeded 4.5. Effects were consistent regardless of sample severity, sample age groups, or follow-up length.
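The sensitivity, specificity, and AUC figures above come from treating a prior-SITB history as a binary test for the later outcome. A small sketch with hypothetical counts shows how such figures arise; for a single binary cutoff, balanced accuracy coincides with the AUC:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, and balanced accuracy from a 2x2
    confusion table (risk-factor status vs. observed outcome)."""
    sens = tp / (tp + fn)        # proportion of outcomes flagged
    spec = tn / (tn + fp)        # proportion of non-outcomes cleared
    bal = (sens + spec) / 2      # equals AUC for one binary cutoff
    return sens, spec, bal

# Hypothetical counts echoing the pattern reported above:
# high specificity, low sensitivity, AUC marginally above chance.
print(diagnostic_accuracy(tp=26, fp=130, fn=74, tn=870))
```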
Prior SITBs confer risk for later suicidal thoughts and behaviors. However, they only provide a marginal improvement in diagnostic accuracy above chance. Addressing gaps in study design, assessment, and underlying mechanisms may prove useful in improving prediction and prevention of suicidal thoughts and behaviors.
The 22·2 Mc./s. crossed array of the Carnegie Institution of Washington has been in use since 20 July 1954. This antenna system consists of two linear arrays 2047 ft. in length, each composed of sixty-six half-wave folded dipoles. The amplitude gains of the two arrays are, in effect, multiplied together by a phase-switching system similar to that used in phase-switching interferometers (Ryle, 1952). The design differs somewhat from the arrangement first used by Mills (Mills and Little, 1953) in that the arrays are arranged in the form of a slightly flattened X. The resulting pencil beam is slightly elliptical in cross-section, measuring 1°·6 by 2°·4 at half-power points, and is directed by inserting lengths of line into the feeder system of each array, phasing the dipoles such that the maximum response is at the desired zenith angle.
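The phasing scheme just described amounts to compensating the geometric path difference d·sin(θ) between adjacent elements with extra feeder line. A small sketch under stated assumptions (open-wire line with velocity factor near 1; the spacing and steering angle are illustrative, not the instrument's actual values):

```python
import math

C = 299_792_458              # speed of light, m/s
F = 22.2e6                   # observing frequency, Hz
WAVELENGTH = C / F           # ~13.5 m at 22.2 Mc./s.

def line_insert_per_element(spacing_m, zenith_deg, velocity_factor=1.0):
    """Length of transmission line to insert per successive element
    so a uniform linear array points at the given zenith angle.
    A velocity factor below 1 applies to real cable; open-wire
    feeders are close to 1 (an assumption here)."""
    path_delay = spacing_m * math.sin(math.radians(zenith_deg))
    return path_delay / velocity_factor

# Half-wave element spacing, steering 20 degrees from the zenith:
d = WAVELENGTH / 2
print(f"insert {line_insert_per_element(d, 20):.2f} m of line per element")
```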
Records obtained at a declination of about + 22° during the first quarter of 1955 with the 22·2 Mc./s. Mills Cross of the Carnegie Institution of Washington occasionally exhibited an interference-like event which was apparently a function of sidereal time. A plot of the right ascensions of beginning and end of each event (duration roughly fifteen minutes) against the date of occurrence revealed a smooth change of right ascension corresponding, initially, to a westward motion (Fig. 1). This precluded correlation of the event with passage through the pencil beam of fixed objects such as the galactic cluster NGC 2420 and the planetary nebula NGC 2392, which otherwise would have been candidates. The retrograde motion suggested a planet, and a canvass of the solar system uncovered only Jupiter and Uranus as possibilities. Of these, Jupiter exhibited the same position and the same change of position as did the event recorded, while Uranus was well out of the pencil beam much of the time. It was therefore concluded that the source of the radio emission was associated with Jupiter.
In the United States alone, ∼14,000 children are hospitalised annually with acute heart failure. The science and art of caring for these patients continues to evolve. The International Pediatric Heart Failure Summit of Johns Hopkins All Children's Heart Institute was held on February 4 and 5, 2015. The summit was funded through the Andrews/Daicoff Cardiovascular Program Endowment, a philanthropic collaboration between All Children's Hospital and the Morsani College of Medicine at the University of South Florida (USF). Sponsored by the All Children's Hospital Andrews/Daicoff Cardiovascular Program, the summit assembled leaders in clinical and scientific disciplines related to paediatric heart failure and created a multi-disciplinary "think-tank". The purpose of this manuscript is to summarise the lessons of the 2015 summit, to describe the "state of the art" of the treatment of paediatric cardiac failure, and to discuss future directions for research in this domain.
Carbon nanotubes come in many varieties, with chemical, mechanical, and electrical properties that depend on carbon nanotube (CNT) structural morphology. To provide a platform for CNT structural tuning, a membrane reactor was designed and constructed. This reactor provided more intimate gas–catalyst contact by decoupling the carbon feedstock gas from the carrier gas in a chemical vapour deposition (CVD) environment, using an asymmetric membrane and a macroporous membrane. Growth using this membrane reactor demonstrated normalized yield improvements of ∼300% and ∼1000% for the asymmetric and macroporous membrane cases, respectively, over standard CVD methods. To illustrate the possibility for control, growth variation with time was demonstrated by growing vertically aligned multi-walled CNTs to heights of 0.71 mm, 1.36 mm, and 1.84 mm after growth for 15, 30, and 60 minutes in a commercial thermal CVD reactor. To demonstrate CNT diameter control via catalyst particle size, dip-coating and spray-coating methods were explored using ferrofluid and Fe(NO3)3 systems. CNT diameter increased with increasing particle size, yielding CNT-like growth with diameters ranging from 15–150 nm. Demonstration of these dimensions of control, coupled with the dramatic efficiency increases over growth in a commercial CVD reactor, establishes this new reactor technology as a starting point for further research into CNT structural tuning.
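As an illustration of the time dependence reported above, the three height/time points can be fitted with a saturating-growth model; the text does not state a kinetic model, so the functional form here is an assumption, not the authors' analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

# MWCNT forest heights reported above.
t = np.array([15.0, 30.0, 60.0])    # growth time, minutes
h = np.array([0.71, 1.36, 1.84])    # height, mm

# Saturating growth, h(t) = h_max * (1 - exp(-t / tau)), is one common
# description of catalyst-deactivation-limited CNT growth (assumed).
def model(t, h_max, tau):
    return h_max * (1 - np.exp(-t / tau))

(h_max, tau), _ = curve_fit(model, t, h, p0=(2.0, 30.0))
print(f"h_max ≈ {h_max:.2f} mm, tau ≈ {tau:.0f} min")
```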
Thermodynamic modeling of the MOCVD process, using the standard free energy minimization algorithm, cannot always explain the deposition of hybrid films that occurs. The present investigation explores a modification of the procedure to account for the observed simultaneous deposition of metallic iron, Fe3O4, and carbon nanotubes from a single precursor. Such composite films have potential application in various device architectures and sensors, and are being studied as electrode material in energy storage devices such as lithium ion batteries and supercapacitors.
With ferric acetylacetonate [Fe(acac)3] as the precursor, MOCVD in argon ambient results in a nanocomposite of CNT, Fe, and Fe3O4 (characterized by XRD and Raman spectroscopy) when growth temperature T and total reactor pressure P are in the ranges 600–800°C and 5–30 torr, respectively. No previous report could be found on the single-step formation of a CNT-metal-metal oxide composite. Equilibrium thermodynamic modeling using available software predicts the deposition of only Fe3C and carbon, without any co-deposition of Fe and Fe3O4, in contrast with experimental observations. To reconcile this contradiction, the modeling of the process was approached by taking the molecular structure of the precursor into account, whereas "standard" thermodynamic simulations are restricted to the total number of atoms of each element in the reactant(s) as the input. When Ocon (the statistical average of the oxygen atom(s) taken up by each metal atom during CVD) is restricted to lie between 0 and 1, thermodynamic computations predict simultaneous deposition of FeO1-x, Fe3C, Fe3O4 and C in the inert ambient. At high temperature and in a carbon-rich atmosphere, iron carbide decomposes to iron and carbon. Furthermore, FeO1-x yields Fe and Fe3O4 when cooled below 567°C. Therefore, the resulting film would be composed of Fe3O4, Fe and C, in agreement with experiment. The weight percentage of carbon (∼40%) calculated from thermodynamic analysis matches well with experimental data from TG-DTA.
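To make the Ocon bookkeeping concrete, here is a toy mass balance under simplifying assumptions (all retained oxygen ends up in Fe3O4, the remaining Fe stays metallic, and the retained carbon per Fe is a free parameter, not a thermodynamic prediction):

```python
# Atomic weights, g/mol.
M_FE, M_O, M_C = 55.85, 16.00, 12.01

def film_weight_percent(ocon: float, mol_c_per_fe: float):
    """Per mole of Fe from Fe(acac)3, assume Ocon moles of oxygen
    (0 <= Ocon <= 1) are retained and bound in Fe3O4 (4 O : 3 Fe);
    the rest of the Fe is metallic; carbon content is a free input."""
    mol_fe_in_oxide = 0.75 * ocon            # Fe3O4 stoichiometry
    m_oxide = mol_fe_in_oxide * M_FE + ocon * M_O
    m_fe = (1.0 - mol_fe_in_oxide) * M_FE
    m_c = mol_c_per_fe * M_C
    total = m_oxide + m_fe + m_c
    return {name: 100 * m / total
            for name, m in (("Fe3O4", m_oxide), ("Fe", m_fe), ("C", m_c))}

# With Ocon = 0.8, about 3.9 mol C per mol Fe reproduces a ~40 wt%
# carbon figure comparable to the TG-DTA value quoted above.
print(film_weight_percent(ocon=0.8, mol_c_per_fe=3.9))
```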
Nano thermal sensors were made by depositing carbon nanotubes from media containing (a) methylene chloride, (b) sodium dodecyl sulfate, or (c) Baytron-P (polymer)-assisted sodium dodecyl sulfate. Sensors made by procedures (a) or (b) showed d.c. electrical resistance independent of temperature. When the sensor was made with carbon nanotubes by the polymer-assisted method (c), the d.c. electrical resistance decreased with temperature; this negative temperature coefficient of resistance (TCR) reflects the semiconducting character of the active material. The sensor behaviour is reproducible and varies linearly with temperature. Sensors made with non-assisted carbon nanotubes showed zero TCR. This is probably the first report of a polymer-assisted thermal sensor made with single-walled carbon nanotubes.
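A TCR of the kind discussed above is conventionally extracted from a linear fit of resistance against temperature. A short sketch with hypothetical readings for the negative-TCR case (c):

```python
import numpy as np

def tcr(temps_c, resistances_ohm, t_ref=25.0):
    """Linear temperature coefficient of resistance: fit
    R(T) = R_ref * (1 + alpha * (T - T_ref)) and return alpha."""
    slope, r_ref = np.polyfit(np.asarray(temps_c) - t_ref,
                              np.asarray(resistances_ohm), 1)
    return slope / r_ref       # alpha, per degree C

# Hypothetical readings: resistance falls linearly with temperature,
# i.e. a negative TCR, as for the polymer-assisted sensor (c).
T = [25, 40, 55, 70, 85]          # deg C
R = [1000, 991, 982, 973, 964]    # ohm
print(f"TCR = {tcr(T, R):.2e} /degC")   # negative for semiconducting CNTs
```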
As in much of the world, Australia’s birds have suffered greatly from habitat loss, feral predators and direct exploitation. Less universal have been the declines caused by post-colonial changes in fire regime after 40 000 years of Indigenous fire management. Climate change and a disengagement by Australians from nature loom as threats for the future. However, Australia is a country of climatic extremes and many birds are well-adapted to stressful conditions. Given adequate investment, all the major classes of threat have potential solutions, with particular success in recent decades in the removal of feral predators from islands and in reducing the by-catch from fishing. The biggest threat of all is possibly a failure to invest in conservation as modern lifestyles take people further and further away from the natural environment.
Australia’s birds are, like those in so much of the world, travelling poorly. Of the 1239 species and subspecies regularly occurring in Australia, 17% are Threatened or Near Threatened on the basis of the IUCN Red List Criteria (Garnett et al. 2011). This number has been increasing steadily (Szabo et al. 2012a) and, while originally it was taxa of Australia’s oceanic islands that were most likely to be threatened, taxa from the mainland are now starting to slip away (Szabo et al. 2012b). Sadly, some of those most threatened are the most distinctive: birds at the end of long, slender branches of the evolutionary tree whose closest relatives are long gone. Other species, however, are thriving under the conditions that have arisen over the past few centuries of intense development.
Using data on 825 under-5 children from the Ouagadougou Health and Demographic Surveillance System collected in 2010, this article examines the effects of aspects of the immediate environment on childhood fever. Logit regression models were estimated to assess the effects of the quality of the local environment on the probability that a child was reported to have had a fever in the two weeks preceding the survey, after controlling for various demographic and socioeconomic variables. While the estimated impact of some environmental factors persisted in the full models, the effects of variables such as access to water and type of household waste management decreased in the presence of demographic, socioeconomic and neighbourhood factors. The management of waste water was found to significantly affect the occurrence of childhood fever. Overall, the results of the study call for greater efforts to promote household access to tap water at prices affordable for the local population in areas where the threats to child health appear greatest.
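A hedged sketch of the kind of logit model described above, using statsmodels with hypothetical file and variable names (the actual Ouagadougou HDSS variables are not given here):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and column names standing in for the survey data.
df = pd.read_csv("ouaga_under5.csv")

# Fever in the prior two weeks modelled on environmental quality,
# controlling for demographic and socioeconomic covariates.
model = smf.logit(
    "fever_2wk ~ tap_water + waste_water_mgmt + waste_disposal"
    " + child_age + C(sex) + household_wealth + C(neighbourhood)",
    data=df,
).fit()
print(model.summary())   # coefficients are on the log-odds scale
```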
Laboratory-based surveillance data are essential for monitoring trends in the incidence of enteric disease. Current Canadian human enteric surveillance systems report only confirmed cases of human enteric disease and are often unable to capture the number of negative test results. Data from 9116 hospital stool specimens from the Waterloo Region in Canada, with a mixed urban and rural population of about 500 000, were analysed to investigate the use of stool submission data and its role in reporting bias when determining the incidence of enteric disease. The proportion of stool specimens positive for Campylobacter spp. was highest in the 15–29 years age group, and in the 5–14 years age group for Salmonella spp. and E. coli O157:H7. By contrast, age-specific incidence rates for all three pathogens were highest in the 0–4 years age group, which also had the highest stool submission rate. This suggests that variations in age-specific stool submission rates are influencing the current interpretation of surveillance data.
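The distinction drawn above, between the proportion of submitted stools testing positive and the population incidence rate, is easy to make concrete; the numbers below are hypothetical:

```python
def positivity_and_incidence(positives, submissions, cases, population):
    """Two views of one pathogen in one age group:
    positivity = positives / submissions (laboratory-based),
    incidence  = cases per 100,000 population (surveillance-based)."""
    return positives / submissions, 1e5 * cases / population

# Hypothetical numbers: a 0-4 age group can show the highest incidence
# simply because far more stools are submitted for young children,
# even when its test-positivity proportion is unremarkable.
print(positivity_and_incidence(positives=30, submissions=1500,
                               cases=30, population=25_000))   # 2%, 120/100k
print(positivity_and_incidence(positives=45, submissions=900,
                               cases=45, population=90_000))   # 5%, 50/100k
```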