There is evidence of an association between life events and psychosis in Europe, North America and Australasia, but few studies have examined this association in the rest of the world.
Aims
To test the association between exposure to life events and psychosis in catchment areas in India, Nigeria, and Trinidad and Tobago.
Method
We conducted a population-based, matched case–control study of 194 participants in India, Nigeria, and Trinidad and Tobago. Cases were recruited through comprehensive population-based case-finding strategies. The Harvard Trauma Questionnaire was used to measure life events, and the Screening Schedule for Psychosis was used to screen for psychotic symptoms. The association between psychosis and life events (experienced or witnessed) was estimated by conditional logistic regression.
Results
There was no overall evidence of an association between psychosis and having experienced or witnessed life events (adjusted odds ratio 1.19, 95% CI 0.62–2.28). We found evidence of effect modification by site (P = 0.002), with stronger evidence of an association in India (adjusted odds ratio 1.56, 95% CI 1.03–2.34), inconclusive evidence in Nigeria (adjusted odds ratio 1.17, 95% CI 0.95–1.45) and evidence of an inverse association in Trinidad and Tobago (adjusted odds ratio 0.66, 95% CI 0.44–0.97).
Conclusions
This study found no overall evidence of an association between witnessing or experiencing life events and psychotic disorder across three culturally and economically diverse countries. There was preliminary evidence that the association varies between settings.
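The abstract names the estimator but not its implementation. Below is a minimal sketch of a conditional logistic regression for matched case–control data in Python; the file and column names (case, match_id, life_event, age) are hypothetical stand-ins for the study's variables, not its actual code.

```python
# Minimal sketch: conditional logistic regression for matched
# case-control data (hypothetical column names; not the study's code).
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# One row per participant: 'case' is 1 for cases, 0 for matched controls;
# 'match_id' identifies each case-control stratum; 'life_event' is the
# exposure (experienced or witnessed); 'age' is an assumed covariate.
df = pd.read_csv("participants.csv")  # hypothetical file

model = ConditionalLogit(
    df["case"],
    df[["life_event", "age"]],
    groups=df["match_id"],
)
result = model.fit()

# Exponentiate coefficients to obtain adjusted odds ratios with 95% CIs.
odds_ratios = np.exp(result.params)
ci = np.exp(result.conf_int())
print(pd.concat([odds_ratios.rename("aOR"), ci], axis=1))
```

Conditioning on the matched stratum (match_id) removes each stratum's baseline odds from the likelihood, which is what makes this estimator appropriate for a matched design.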
This study aimed to compare the effectiveness of pharmacological therapy with and without direct maxillary sinus saline irrigation for the management of chronic rhinosinusitis without polyps.
Methods
In this prospective randomised controlled trial, 39 non-operated patients were randomly assigned to treatment with direct maxillary sinus saline irrigation in conjunction with systemic antibiotics and topical sprays (n = 24) or with pharmacological therapy alone (n = 15). Endoscopy, Sino-Nasal Outcome Test and Lund–Mackay computed tomography scores were obtained before treatment, six weeks after treatment and one to two years after treatment.
Results
Post-treatment Lund–Mackay computed tomography scores improved significantly in both cohorts, with no inter-cohort difference identified. Post-treatment nasal endoscopy scores improved significantly in the study group, but were similar to those measured in the control group. Sino-Nasal Outcome Test-20 results improved in both cohorts, with no difference between treatment arms.
Conclusion
Maxillary sinus puncture and irrigation with saline, combined with pharmacological treatment, improves endoscopic findings in patients with chronic rhinosinusitis without polyps, but has no beneficial effect on symptoms or imaging findings over conservative treatment alone.
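The abstract does not state which statistical tests were used. The sketch below shows one plausible way to run the within-arm and between-arm comparisons with nonparametric tests, using hypothetical file and column names.

```python
# Minimal sketch: within-arm and between-arm comparisons of pre/post
# scores (hypothetical column names; the abstract does not state which
# tests the authors used).
import pandas as pd
from scipy.stats import wilcoxon, mannwhitneyu

scores = pd.read_csv("crs_scores.csv")        # hypothetical file
study = scores[scores["arm"] == "irrigation"]
ctrl = scores[scores["arm"] == "control"]

# Within-arm: did Lund-Mackay CT scores improve after treatment?
print(wilcoxon(study["lm_pre"], study["lm_post"]))

# Between-arm: do pre-to-post changes differ between cohorts?
delta_study = study["lm_post"] - study["lm_pre"]
delta_ctrl = ctrl["lm_post"] - ctrl["lm_pre"]
print(mannwhitneyu(delta_study, delta_ctrl, alternative="two-sided"))
```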
Despite evidence for the prenatal onset of abnormal head growth in children with ASD, studies of fetal ultrasound data in ASD are few and their findings conflicting.
Objectives
To determine whether children with ASD show abnormal head growth during gestation.
Methods
A longitudinal matched case–sibling–control study of prenatal ultrasound biometric measures of children with ASD was conducted. Children with ASD were matched to two control groups: (1) typically developing siblings (TDS) and (2) typically developing population controls (TDP). The cohort comprised 528 children (72.7% males): 174 ASD, 178 TDS and 176 TDP.
Results
Second-trimester ASD and TDS fetuses had significantly smaller biparietal diameter (BPD) than TDP fetuses (aOR for zBPD = 0.685, 95% CI 0.527–0.890, and aOR for zBPD = 0.587, 95% CI 0.459–0.751, respectively). However, these differences became statistically indistinguishable in the third trimester. Head biometric measures were associated with fetal sex, with males having larger heads than females within and across groups. A linear mixed-effects model assessing the effects of sex and group assignment on longitudinal fetal head growth indicated faster BPD growth in TDS than in both ASD and TDP among males (β = 0.084 and β = 0.100, respectively; p < 0.001) but not among females, suggesting an ASD–sex interaction in head growth during gestation. Fetal head shape showed sex-specific characteristics, and head growth was inversely correlated with ASD severity in both males and females, further supporting a sex effect on the association between fetal head growth and ASD.
Conclusions
Our findings suggest that abnormal fetal head growth is a familial trait of ASD, which is modulated by sex and is associated with the severity of the disorder.
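For readers unfamiliar with the modelling, a minimal sketch of a linear mixed-effects model for longitudinal head growth follows, using statsmodels with hypothetical column names (zbpd, ga_week, group, sex, fetus_id); it is not the authors' code.

```python
# Minimal sketch: a linear mixed-effects model for longitudinal fetal
# head growth (hypothetical column names; not the study's code).
import pandas as pd
import statsmodels.formula.api as smf

# One row per ultrasound scan: 'zbpd' is the standardized biparietal
# diameter, 'ga_week' gestational age, 'group' ASD/TDS/TDP, 'sex' M/F,
# and 'fetus_id' links repeated scans of the same fetus.
scans = pd.read_csv("ultrasound_scans.csv")  # hypothetical file

# Random intercept and slope per fetus; the ga_week-by-group interaction
# captures group differences in growth rate, fitted separately by sex
# as in the abstract.
for sex, subset in scans.groupby("sex"):
    model = smf.mixedlm(
        "zbpd ~ ga_week * group",
        subset,
        groups=subset["fetus_id"],
        re_formula="~ga_week",
    )
    fit = model.fit()
    print(sex, fit.summary())
```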
Hypotension is an adverse event that may be related to systemic exposure to milrinone; however, the true exposure–safety relationship is unknown.
Methods:
Using the Pediatric Trials Network multicentre repository, we identified children ≤17 years of age treated with milrinone. Hypotension was defined according to age, using the Pediatric Advanced Life Support guidelines. Clinically significant hypotension was defined as hypotension with a concomitant lactate >3 mmol/l. A previously published population pharmacokinetic model was used to simulate milrinone exposures and evaluate exposure–safety relationships.
Results:
We included 399 children with a median (quartile 1, quartile 3) age of 1 year (0, 5) who received 428 intravenous doses of milrinone (median infusion rate 0.31 mcg/kg/min [0.29, 0.5]). The median maximum plasma milrinone concentration was 110.7 ng/ml (48.4, 206.2). Median lowest systolic and diastolic blood pressures were 74 mmHg (60, 85) and 35 mmHg (25, 42), respectively. At least one episode of hypotension occurred in 178 (45%) subjects; clinically significant hypotension occurred in 10 (2%). Maximum simulated milrinone plasma concentrations were higher in subjects with clinically significant hypotension (251 ng/ml [129, 329]) than in those with hypotension alone (86 ng/ml [44, 173]) or without hypotension (122 ng/ml [57, 208]; p = 0.002); however, this relationship was not retained on multivariable analysis (odds ratio 1.01; 95% confidence interval 0.998, 1.01).
Conclusions:
We successfully leveraged a population pharmacokinetic model and electronic health record data to evaluate the relationship between simulated plasma milrinone concentrations and the occurrence of systemic hypotension, supporting the broader applicability of our novel, efficient and cost-effective study design for examining drug exposure–response and exposure–safety relationships.
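The published population pharmacokinetic model is not reproduced here. As a simplified illustration of how exposures can be simulated from dosing records, the sketch below uses a one-compartment constant-rate infusion model; the clearance and volume values are hypothetical, not the milrinone model's estimates.

```python
# Minimal sketch: simulating plasma concentration during and after a
# constant-rate IV infusion with a one-compartment model. Parameter
# values are illustrative, not the published milrinone model.
import numpy as np

def one_compartment_infusion(t, rate, cl, v, t_inf):
    """Concentration (ng/ml) at times t (h) for an infusion of `rate`
    (mcg/h) lasting t_inf hours, with clearance cl (l/h), volume v (l)."""
    k = cl / v
    return np.where(
        t <= t_inf,
        rate / cl * (1 - np.exp(-k * t)),                  # during infusion
        rate / cl * (1 - np.exp(-k * t_inf)) * np.exp(-k * (t - t_inf)),
    )  # mcg/l == ng/ml

# Example: the abstract's median rate, 0.31 mcg/kg/min, in a 10 kg child
# for 24 h; CL and V below are hypothetical placeholders.
weight = 10.0                        # kg
rate = 0.31 * weight * 60.0          # mcg/h
t = np.linspace(0, 36, 361)          # hours
conc = one_compartment_infusion(t, rate, cl=3.0, v=4.0, t_inf=24.0)
print(f"simulated Cmax ~ {conc.max():.1f} ng/ml")
```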
The care pathway of patients with ADHD is poorly understood. This national survey aimed to describe the successive assessment stages leading to the diagnosis and treatment of ADHD, and to identify aspects that could be improved.
Method
A cross-sectional survey was conducted in France from 04/11/2013 to 31/01/2014 among a national sample of 61 physicians caring for children with ADHD, using a self-administered questionnaire given to patients/parents.
Results
Four hundred and seventy-three questionnaires were analysed. The first signs (behavioural problems [78.2%] and attention difficulties [70%]) are noticed at around 4.5 years of age, mostly outside the family environment. The diagnosis is made at a mean age of 8.1 years, about 4 years after the first signs are observed. Families consult an average of 3.5 health professionals before the diagnosis is raised. Psychiatrists/child psychiatrists are the most frequently consulted professionals at every assessment stage. At the first stage, only 10.7% of patients are diagnosed. This delay may partly explain the high rates of grade repetition (31.5%), particularly in the first two years of primary school (CP and CE1), and of dissatisfaction with care, mainly at the first assessment stage (38.6% dissatisfied). A cluster analysis identified two groups of patients: the first (89.9% boys) presents behavioural problems, agitation and family difficulties; the second (49% boys), with less pronounced hyperactivity, took an additional year to receive an ADHD diagnosis. In this sample, more than two-thirds of patients receive drug treatment, methylphenidate in 98% of cases. Late diagnosis was the main concern reported by families.
Conclusion
The delay of about 4 years between the first signs and the diagnosis may represent a loss of opportunity for children with ADHD.
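The Results report a cluster analysis without naming the algorithm. The sketch below shows one common approach, k-means on standardized questionnaire features; the feature names are hypothetical and this is not the study's actual analysis.

```python
# Minimal sketch: a two-group cluster analysis of survey responses
# (hypothetical features; not the study's code or algorithm).
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per questionnaire: assumed numeric/binary items such as
# behavioural problems, hyperactivity, family difficulties, sex,
# and delay from first signs to diagnosis.
answers = pd.read_csv("adhd_survey.csv")   # hypothetical file
features = ["behaviour", "hyperactivity", "family_difficulties",
            "male", "years_to_diagnosis"]

X = StandardScaler().fit_transform(answers[features])
answers["cluster"] = KMeans(n_clusters=2, n_init=10,
                            random_state=0).fit_predict(X)

# Profile each cluster, e.g. proportion of boys and mean diagnostic delay.
print(answers.groupby("cluster")[["male", "years_to_diagnosis"]].mean())
```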
Given the limited knowledge on the long-term outcome of adolescents who receive electroconvulsive therapy (ECT), this study aimed to follow up adolescents treated with ECT for severe mood disorder. Eleven subjects treated during adolescence with bilateral ECT for psychotic depression (n = 6) or mania (n = 5), and ten psychiatric controls matched for sex, age, school level and clinical diagnosis, completed a clinical and social evaluation at least 1 year after treatment. Mean duration between the index episode and the follow-up evaluation was 5.2 years (range 2–9 years). At follow-up: (1) all patients except two in the control group received a diagnosis of bipolar disorder; (2) fifteen patients had had more than one episode of mood disorder; (3) the two groups did not differ in social functioning or school achievement; (4) impact on school achievement was related to the severity of the mood disorder rather than to ECT treatment. The results suggest that adolescents given ECT for bipolar disorder, depressed or manic, do not differ in subsequent school and social functioning from carefully matched controls.
Systemic lupus erythematosus (SLE) is a chronic autoimmune disease with a wide variety of physical manifestations, including neuropsychiatric features. Bipolar disorder (BD) is a chronic, episodic illness that may present as depression or as mania. The objective of this study was to investigate the association between SLE and BD using big data analysis methods.
Methods:
Patients with SLE were compared with age- and sex-matched controls regarding the prevalence of BD in a cross-sectional study. Chi-square and t-tests were used for univariate analysis, and a logistic regression model was used for multivariate analysis, adjusting for confounders. The study was performed using the chronic disease registry of the Clalit Health Services medical database.
Results:
The study included 5018 SLE patients and 25,090 matched controls. BD was more prevalent among SLE patients than among controls (0.62% vs. 0.26%, P < 0.001). Smoking was more prevalent among BD patients than among non-BD patients (62.5% vs. 23.5%, P < 0.001). In a multivariate analysis, smoking and SLE were both found to be significantly associated with BD.
Conclusions:
SLE was found to be independently associated with BD. These findings may imply that an autoimmune process affecting the central nervous system among SLE patients facilitates the expression of concomitant BD.
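As an illustration of the Methods, here is a minimal sketch of the univariate (chi-square) and multivariate (logistic regression) analyses in Python; the file and column names are hypothetical, and the real model adjusted for additional confounders.

```python
# Minimal sketch of the analyses described in Methods: a univariate
# chi-square test and a multivariate logistic regression. File and
# column names are hypothetical, not the registry's.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.formula.api as smf

cohort = pd.read_csv("clalit_cohort.csv")   # hypothetical file
# 'bd', 'sle' and 'smoking' are assumed 0/1 indicator columns.

# Univariate: BD prevalence in SLE patients vs matched controls.
chi2, p, _, _ = chi2_contingency(pd.crosstab(cohort["sle"], cohort["bd"]))
print(f"chi-square = {chi2:.2f}, p = {p:.4g}")

# Multivariate: adjust for smoking (the study adjusted for further
# confounders as well).
fit = smf.logit("bd ~ sle + smoking", data=cohort).fit()
print(np.exp(fit.params))   # coefficients expressed as odds ratios
```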
During pregnancy, mothers-to-be should adapt their diet to meet increases in nutrient requirements. Pregnant women appear to be keener to adopt healthier diets, but are not always successful. The objective of the present study was to determine whether a guided, stepwise and tailored dietary counselling programme, designed using an optimisation algorithm, could improve the nutrient adequacy of the diet of pregnant women, beyond generic guidelines. Pregnant women (n 80) who attended Notre-Dame-de-Bon-Secours Maternity Clinic were randomly allocated to the control or intervention arm. Dietary data were obtained twice from an online 3-d dietary record. The nutrient adequacy of the diet was calculated using the PANDiet score, a 100-point diet quality index adapted to the specific nutrient requirements for pregnancy. Women were supplied with generic dietary guidelines in a reference booklet. In the intervention arm, they also received nine sets of tailored dietary advice identified by an optimisation algorithm as best improving their PANDiet score. Pregnant women (n 78) completed the 12-week dietary follow-up. Initial PANDiet scores were similar in the control and intervention arms (60·4 (sd 7·3) v. 60·3 (sd 7·3), P = 0·92). The PANDiet score increased in the intervention arm (+3·6 (sd 9·3), P = 0·02) but not in the control arm (−0·3 (sd 7·3), P = 0·77), and these changes differed between arms (P = 0·04). In the intervention arm, there were improvements in the probabilities of adequacy for α-linolenic acid, thiamin, folate and cholesterol intakes (P < 0·05). Tailored dietary counselling using a computer-based algorithm is more effective than generic dietary counselling alone in improving the nutrient adequacy of the diet of French women in mid-pregnancy.
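The optimisation algorithm itself is not specified in the abstract. The sketch below illustrates the general idea of stepwise tailored advice with a generic greedy loop that repeatedly applies whichever candidate dietary change yields the largest gain in a diet-quality score such as the PANDiet; all names are illustrative, and this is not the study's algorithm.

```python
# Generic greedy sketch of stepwise tailored dietary advice: at each of
# up to n_advice steps, apply whichever candidate change most improves
# the diet-quality score. Illustrative only; not the study's algorithm.
from typing import Callable, Dict, List

Diet = Dict[str, float]                      # e.g. nutrient intakes

def select_advice(diet: Diet,
                  candidates: List[Callable[[Diet], Diet]],
                  score: Callable[[Diet], float],
                  n_advice: int = 9) -> List[Callable[[Diet], Diet]]:
    """Return up to n_advice changes, chosen greedily by score gain."""
    chosen = []
    remaining = list(candidates)
    for _ in range(n_advice):
        current = score(diet)
        best, best_gain = None, 0.0
        for change in remaining:
            gain = score(change(diet)) - current
            if gain > best_gain:
                best, best_gain = change, gain
        if best is None:                     # nothing improves the score
            break
        diet = best(diet)                    # apply and continue stepwise
        chosen.append(best)
        remaining.remove(best)
    return chosen
```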
The search for life in the Universe is a fundamental problem of astrobiology and modern science. Current progress in the detection of terrestrial-type exoplanets has opened a new avenue in the characterization of exoplanetary atmospheres and in the search for biosignatures of life with upcoming ground-based and space missions. To specify the conditions favourable for the origin, development and sustainment of life as we know it in other worlds, we need to understand the nature of the global (astrospheric) and local (atmospheric and surface) environments of exoplanets in the habitable zones (HZs) around G-K-M dwarf stars, including our young Sun. The global environment is shaped by disturbances propagating from the planet-hosting star in the form of stellar flares, coronal mass ejections, energetic particles and winds, collectively known as astrospheric space weather. Its characterization will help in understanding how an exoplanetary ecosystem interacts with its host star, and in specifying the physical, chemical and biochemical conditions that can be favourable and/or detrimental for planetary climate and habitability, along with the evolution of planetary internal dynamics over geological timescales. The key linkage of (astro)physical, chemical and geological processes can only be understood in the framework of interdisciplinary studies incorporating progress in heliophysics, astrophysics, planetary and Earth sciences. Assessing the impacts of host stars on the climate and habitability of terrestrial (exo)planets will significantly expand the current definition of the HZ to the biogenic zone and provide new observational strategies for the search for signatures of life. The major goal of this paper is to describe and discuss the current status of, and recent progress in, this interdisciplinary field in light of presentations and discussions during the NASA Nexus for Exoplanetary System Science funded workshop ‘Exoplanetary Space Weather, Climate and Habitability’, and to provide a new roadmap for the future development of the emerging field of exoplanetary science and astrobiology.
Hepatitis E virus (HEV) is an emerging cause of viral hepatitis worldwide. Recently, HEV-7 has been shown to infect camels and humans. We studied HEV seroprevalence in dromedary camels and among Bedouins, non-Bedouin Arabs (Muslims) and Jews, and assessed factors associated with anti-HEV seropositivity. Serum samples from dromedary camels (n = 86) were used to determine camel anti-HEV IgG and HEV RNA positivity. Human samples collected between 2009 and 2016 from Bedouins (n = 305), non-Bedouin Arabs (n = 320) and Jews (n = 195) aged >20 years were randomly selected using an age-stratified sampling design. Human HEV IgG levels were determined using the Wantai IgG ELISA. Of the samples obtained from camels, 68.6% were anti-HEV positive. Among the human populations, Bedouins and non-Bedouin Arabs had a significantly higher prevalence of HEV antibodies (21.6% and 15.0%, respectively) than the Jewish population (3.1%). Seropositivity increased significantly with age in all human populations, reaching 47.6% and 34.8% among Bedouins and non-Bedouin Arabs aged ⩾40 years, respectively. The high seropositivity in camels and in Bedouins and non-Bedouin Arabs aged ⩾40 years suggests that HEV is endemic in Israel. The low HEV seroprevalence in Jews could be attributed to higher socio-economic status.
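A minimal sketch of an age-stratified random sampling design of the kind described above, with hypothetical file, column names and stratum size:

```python
# Minimal sketch: age-stratified random selection of banked sera
# (hypothetical file, column names and stratum size).
import pandas as pd

bank = pd.read_csv("serum_bank.csv")            # hypothetical file
bank = bank[bank["age"] > 20]                   # adults, as in the study
bank["stratum"] = pd.cut(bank["age"], bins=[20, 30, 40, 50, 60, 120])

# Draw a fixed number of sera per population group and age stratum
# (assumes every stratum holds at least n sera).
sample = (bank.groupby(["population", "stratum"], observed=True)
              .sample(n=20, random_state=1))
print(sample.groupby(["population", "stratum"], observed=True).size())
```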
This study aimed to investigate the prevalence of and risk factors for Eustachian tube dysfunction leading to middle-ear pathology in patients on chronic mechanical ventilation via tracheostomy tube.
Methods:
A total of 40 patients on chronic ventilation were included in a prospective cohort study. Middle-ear status was determined by tympanometry. Tympanograms were categorised as types A, B or C; types B and C were defined as middle-ear pathology.
Results:
In all, 57 ears of 40 patients were examined. Disease was found in at least one ear in 26 of the 40 patients. Middle-ear pathology was found in 25 of the 34 patients who were tube fed (via nasogastric tube or percutaneous endoscopic gastrostomy) versus 1 of the 6 fed orally (p = 0.014), and in 23 of the 31 patients with impaired consciousness or cognition versus 3 of the 9 cognitively intact patients (p = 0.044).
Conclusion:
Middle-ear pathology is common in patients on chronic mechanical ventilation via tracheostomy tube. The highest prevalence was in those with impaired consciousness or cognition, and oral feeding appeared protective.
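With counts this small, an exact test is natural. The sketch below applies Fisher's exact test to the feeding-route comparison reported in the Results; the abstract does not state which test produced its p-values, so the output may differ slightly.

```python
# Minimal sketch: Fisher's exact test on the feeding-route comparison,
# using the counts reported above.
from scipy.stats import fisher_exact

#            [pathology, no pathology]
tube_fed = [25, 9]    # 25 of 34 tube-fed patients
oral_fed = [1, 5]     # 1 of 6 orally fed patients

odds_ratio, p = fisher_exact([tube_fed, oral_fed])
print(f"OR = {odds_ratio:.1f}, p = {p:.3f}")
```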
One view of major solar energetic particle (SEP) events is that these (proton-dominated) fluxes are accelerated in heliospheric shock sources created by interplanetary coronal mass ejections (ICMEs), and then travel mainly along the interplanetary magnetic field lines connecting the shock(s) to the observer(s). This places particular emphasis on the role of heliospheric conditions during the event, requiring a realistic description of those conditions to interpret and/or model SEP events. The well-known ENLIL heliospheric simulation, with cone-model-generated ICME shocks, is used together with the SEPMOD particle event modelling scheme to demonstrate the value of applying these concepts at multiple inner heliosphere sites.
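The magnetic connection between a shock source and an observer is central to this picture. ENLIL and SEPMOD are not reproduced here, but the zeroth-order estimate, the Parker-spiral footpoint offset for a steady solar wind, can be written in a few lines:

```python
# Minimal sketch: the standard Parker-spiral estimate of the angular
# offset between an observer and the solar footpoint of its magnetic
# field line. A textbook relation, not the ENLIL/SEPMOD machinery.
import math

def parker_connection_longitude(r_au: float, v_sw_kms: float) -> float:
    """Angular offset (deg) of the field-line footpoint, west of the
    sub-observer point, for heliocentric distance r_au and solar wind
    speed v_sw_kms."""
    omega_sun = 2 * math.pi / (25.38 * 86400)   # sidereal rotation, rad/s
    r_m = r_au * 1.496e11                       # AU -> m
    return math.degrees(omega_sun * r_m / (v_sw_kms * 1e3))

# Example: Earth (1 AU) in a 400 km/s wind connects roughly 60 degrees
# west of the sub-observer point.
print(f"{parker_connection_longitude(1.0, 400.0):.0f} deg")
```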
To use variable-rate irrigation (VRI) systems, a field is divided into irrigation management zones (IMZs). While IMZs are dynamic in nature, most IMZ prescription maps are static. High-resolution thermal images (TI) coupled with measured atmospheric conditions have been utilized to map within-field water status variability and to delineate in-season IMZs. Unfortunately, spaceborne TIs have coarse spatial resolution, and aerial platforms require substantial financial investments, which may inhibit their large-scale adoption. Three approaches are proposed to facilitate large-scale adoption of TI-based IMZs: (1) increasing the capacity of aerial TI by enhancing its spatial resolution; (2) sharpening the spatial resolution of satellite TI by fusing it with satellite multi-spectral images in the visible-near-infrared (VIS-NIR) range; (3) increasing the capacity of aerial TI by fusing it with satellite multi-spectral images in the VIS-NIR range. The scientific and engineering basis of each approach is described, together with initial results.
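As an illustration of approach (2), the sketch below implements a TsHARP-style sharpening step: regress coarse land-surface temperature on coarse-aggregated NDVI, apply the regression at the NDVI resolution, and add back the coarse residuals. It assumes co-registered arrays and is not the authors' implementation.

```python
# Minimal sketch of approach (2): sharpening a coarse thermal image with
# a fine-resolution NDVI image, in the spirit of TsHARP-style methods.
# Assumes co-registered numpy arrays; not the authors' implementation.
import numpy as np

def sharpen_thermal(lst_coarse, ndvi_fine, scale):
    """lst_coarse: (H, W) land-surface temperature; ndvi_fine:
    (H*scale, W*scale) NDVI; returns LST at the NDVI resolution."""
    # 1. Aggregate NDVI to the thermal resolution (block mean).
    h, w = lst_coarse.shape
    ndvi_coarse = ndvi_fine.reshape(h, scale, w, scale).mean(axis=(1, 3))

    # 2. Fit a linear LST~NDVI relation at the coarse resolution.
    slope, intercept = np.polyfit(ndvi_coarse.ravel(),
                                  lst_coarse.ravel(), 1)

    # 3. Apply the relation at the fine resolution, then add back the
    #    coarse residuals so coarse-scale information is conserved.
    residual = lst_coarse - (slope * ndvi_coarse + intercept)
    residual_fine = np.kron(residual, np.ones((scale, scale)))
    return slope * ndvi_fine + intercept + residual_fine
```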
The Middle and Upper Palaeolithic deposits of Kebara Cave include numerous fireplaces throughout most of the Middle Palaeolithic sequence. These layers are followed by more geogenic sediments (end of the Middle Palaeolithic, Ahmarian and Aurignacian), deposited under relatively wetter conditions. Radiocarbon ages place the appearance of the Ahmarian lithic industries between 48.5 and 46.5 ka cal BP, the earliest in the Levant. Lithic, archaeozoological, palaeobotanical and combustion-feature density data, together with the spatial organization of activities, indicate a change through the sequence, from repetitive, intensive, long-term occupations of the cave by Mousterian populations with Levallois technology (units XI–VIII) to more ephemeral occupations by the end of the Middle Palaeolithic (units VI–V). The cave was then used for limited initial carcass processing of gazelles and Persian fallow deer. During the Early Upper Palaeolithic (units IV–I), the hunting economy remained similar, despite radical changes in lithic technology. A well-preserved adult burial (unit XII) provided detailed information on the series of gestures involved in the mortuary practice.
Controlling invasive plant infestations frequently depends on accelerating the depletion of their persistent seed banks, which are often associated with physical dormancy mechanisms. We hypothesized that breaking dormancy by heat would enhance the vulnerability of the non-dormant seeds to hydrothermal stresses. The aim of the present study was to examine the effect of soil solarization treatments (heating the soil by means of polyethylene mulching) on buried Australian Acacia seeds, with emphasis on Acacia saligna. The results of three field experiments indicate that soil solarization caused an almost complete eradication of buried seeds of Acacia saligna and two other Australian Acacia species, Acacia murrayana and Acacia sclerosperma. The killing mechanism of solarization was further studied in laboratory experiments. We observed two phases in the heat-induced deterioration of seed persistence: breaking the dormancy of the seeds, and exposure of the "weakened" non-dormant seeds to lethal temperatures. From a conservation perspective, the present study shows for the first time that solar energy, applied by means of soil solarization, can be used to reduce the persistent seed banks of invasive woody plants.
Melanoma is the third most common cause of CNS metastases. Immunotherapy has evolved as a treatment option for patients with stage IV melanoma. Stereotactic radiosurgery (SRS) also elicits an immune response within the brain and may interact with immunotherapy. We report a cohort of patients treated for brain metastases with immunotherapy and evaluate the effect of SRS timing on the intracranial response. Methods: All consecutively treated melanoma patients receiving ipilimumab and SRS for their brain metastases were included in this retrospective analysis. Forty-six patients harboring 232 brain metastases were reviewed. The median clinical follow-up was 7.9 months (3-42.6) and the median age was 63 years (24.3-83.6). Thirty-two patients received SRS before or during ipilimumab cycles (group A), whereas 14 patients received SRS after ipilimumab treatment (group B). Radiographic and clinical responses were assessed at approximately 3-month intervals after SRS. Results: The two cohorts were comparable in pertinent pre-treatment aspects, with the exception of SRS timing relative to ipilimumab. Local recurrence-free duration (LRFD) was significantly longer in group A patients (19.6 months, range 1.1-34.7) than in group B patients (3 months, range 0.4-20.4; p=0.002). Post-SRS perilesional edema was more significant in group A. Conclusions: The effect of SRS and ipilimumab in attaining LRFD seems greater when SRS is performed before or during ipilimumab treatment. The timing of immunotherapy and SRS may affect LRFD and post-radiosurgical edema. The interactions between immunotherapy and SRS warrant further investigation in order to optimize therapeutic benefits and mitigate the risks associated with multimodality, targeted therapy.
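Comparisons of recurrence-free duration such as this are typically made with Kaplan–Meier estimates and a log-rank test; the abstract does not state the exact method used. A minimal sketch with hypothetical file and column names:

```python
# Minimal sketch: comparing local recurrence-free duration between the
# two timing groups (hypothetical column names; not the study's code).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

lesions = pd.read_csv("srs_lesions.csv")   # hypothetical file
a = lesions[lesions["group"] == "A"]       # SRS before/during ipilimumab
b = lesions[lesions["group"] == "B"]       # SRS after ipilimumab

km = KaplanMeierFitter()
km.fit(a["months"], event_observed=a["recurred"], label="group A")
print(km.median_survival_time_)

result = logrank_test(a["months"], b["months"],
                      event_observed_A=a["recurred"],
                      event_observed_B=b["recurred"])
print(f"log-rank p = {result.p_value:.3f}")
```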
For patients with recurrent or residual acromegaly or Cushing's disease (CD) after resection, Gamma Knife radiosurgery (GKRS) is often used. Hypopituitarism is the most common adverse effect after GKRS. The paucity of studies with long-term follow-up has hampered understanding of the latent risks of hypopituitarism in patients with acromegaly or CD. We report the long-term risks of hypopituitarism for patients treated with GKRS for acromegaly or CD. Methods: From a prospectively created, IRB-approved database, we identified all patients with acromegaly or CD treated with GKRS at the University of Virginia from 1989 to 2008. Only patients with a minimum endocrine follow-up of 60 months were included. The median follow-up was 159.5 months (60.1-278). Thorough radiological and endocrine assessments were performed immediately before GKRS and at regular follow-up intervals. New onset of hypopituitarism was defined as a pituitary hormone deficit arising after GKRS and requiring corresponding hormone replacement. Results: Sixty patients with either acromegaly or CD were included. Median tumor volume at the time of GKRS was 1.3 cm3 (0.3-13.4), and the median margin dose was 25 Gy (6-30). GKRS-induced new pituitary deficiency occurred in 58.3% (n=35) of patients. Growth hormone deficiency was the most common (28.3%, n=17). The actuarial rates of hypopituitarism at 3, 5 and 10 years were 10%, 21.7% and 53.3%, respectively. The median time to hypopituitarism was 61 months after GKRS (range 12-160). Cavernous sinus invasion by the tumor was found to correlate with the occurrence of new or progressive hypopituitarism after GKRS (p=0.018). Conclusions: Delayed hypopituitarism increases as a function of time after radiosurgery. Hormone axes appear to vary in terms of radiosensitivity. Patients with adenomas in the cavernous sinus are more prone to loss of pituitary function after GKRS.
Meningiomas are the most common benign primary brain tumor. Radiosurgery (primary or adjuvant) allows excellent local control. The geriatric scoring system (GSS) for pre-operative risk stratification and outcome prediction in patients with meningiomas has been previously reported. The GSS incorporates eight tumor and patient parameters on admission; a GSS score higher than 16 was previously reported to be associated with a more favorable outcome. We assessed the validity of the GSS score and its influence on outcome in patients treated with Gamma Knife radiosurgery (GKRS). Methods: Patients treated with single-session GKRS for WHO grade I meningioma during 1989-2013 at the University of Virginia were reviewed. The cohort comprised 323 patients, 50.2% (n=162) of whom were male. Median age was 56 years (29-84), and median follow-up was 53.6 months (6-235). Median tumor volume was 4.5 cm3 (0.2-23). Median margin and maximal doses were 15 Gy (8-36) and 32.3 Gy (20-65), respectively. Results: Tumor volume control was achieved in 87% (n=281), and post-GKRS clinical neurological improvement was reported in 66.3% (n=214). The median change in KPS was +10 (range -30 to +40). The most common complications were intermittent headaches (34.1%, n=110) and cranial nerve deficits (14.2%, n=46). The GSS (calculated and grouped as GSS>16 and GSS≤16) was found to correlate with post-GKRS functional status (p<0.0001) and tumor control (p=0.028). Conclusion: The GSS, used for risk stratification and outcome prediction in patients with meningiomas, seems valid for patients undergoing single-session GKRS. A GSS score greater than 16 is associated with better long-term functional status and tumor control.
Hemangiopericytomas (HPCs) are widely recognized for their aggressive clinical behavior. We report a large multicenter study, conducted through the International Gamma Knife Research Foundation, reviewing management and outcome following stereotactic radiosurgery (SRS) for recurrent or newly discovered HPCs. Methods: Eight centers participated, reviewing a total of 90 patients harboring 133 tumors. Prior treatments included embolization (n=8), chemotherapy (n=2) and fractionated radiotherapy (n=34). The median tumor volume at the time of SRS was 4.9 ml (range 0.2-42.4). WHO grade II (typical) HPCs formed 78.9% (n=71) of the cohort. The median margin and maximal doses delivered were 15 Gy (2.8-24) and 32 Gy (8-51), respectively. The median clinical and radiographic follow-up periods were both 59 months (ranges 6-190 and 6-183, respectively). Results: At last follow-up, 55% of tumors and 62.2% of patients demonstrated local tumor control. New remote intracranial tumors were found in 27.8% of patients, and 24.4% developed extracranial metastases. Adverse radiation effects were noted in 6.7%. Overall survival was 91.5%, 82.1%, 73.9%, 56.7% and 53.7% at 2, 4, 6, 8 and 10 years, respectively, after initial SRS. Local progression-free survival (PFS) was 81.7%, 66.3%, 54.5%, 37.2% and 25.5% at 2, 4, 6, 8 and 10 years, respectively, after initial SRS. In our cohort, 32 patients underwent 48 repeat SRS procedures for 76 lesions. A margin dose greater than 16 Gy (p=0.037) and tumor histology (p=0.006) were shown to influence PFS. Conclusions: SRS provides a reasonable rate of local tumor control and a low risk of adverse effects.
The radiological detection of brain metastases (BMs) is essential for optimizing a patient's treatment. This is even more true when stereotactic radiosurgery (SRS), a non-invasive image-guided treatment that can target BMs as small as 1-2 mm, is delivered as part of that care. The timing of image acquisition after contrast administration can influence the diagnostic sensitivity of contrast-enhanced MRI for BMs. Objective: To investigate the effect of time-delayed acquisition after administration of intravenous Gadavist® (gadobutrol, 1 mmol/ml) on the detection of BMs. Methods: This is a prospective IRB-approved study of 50 patients with BMs who underwent post-contrast MRI sequences immediately after injection of 0.1 mmol/kg Gadavist® as part of clinical care (t0), followed by axial T1 sequences after 10-minute (t1) and 20-minute (t2) delays. MRI studies were blindly compared by 3 neuroradiologists. Results: Single-measure intraclass correlation coefficients were very high (0.914, 0.904 and 0.905 for t0, t1 and t2, respectively), indicating reliable inter-observer agreement. The t2 delayed sequences showed a significantly and consistently higher diagnostic sensitivity for BMs for every participating neuroradiologist as well as for the entire cohort (p=0.016, p=0.035 and p=0.034, respectively). A disproportionately high representation of the BMs detected on the delayed studies was located within posterior circulation territories (compared with predictions based on tissue volume and blood-flow volumes). Conclusion: Considering their safety and potentially high yield, delayed MRI sequences should supplement the basic MRI sequences in all patients in need of precise delineation of their intracranial disease.
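Single-measure intraclass correlation across three readers can be computed as in the sketch below, using the pingouin package and a hypothetical long-format table; this is an illustration, not the study's code.

```python
# Minimal sketch: single-measure intraclass correlation across the three
# readers at one time point (hypothetical layout; not the study's code).
import pandas as pd
import pingouin as pg

# Long format: one row per (patient, reader) with the number of
# metastases counted on, e.g., the t2-delayed sequence.
counts = pd.read_csv("bm_counts_t2.csv")   # hypothetical file

icc = pg.intraclass_corr(data=counts, targets="patient",
                         raters="reader", ratings="n_metastases")
# ICC1 is the single-measure, one-way random-effects estimate.
print(icc.set_index("Type").loc["ICC1", ["ICC", "CI95%"]])
```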