To radiologically determine the incidence of nasolacrimal duct injury after functional endoscopic sinus surgery, using computed tomography.
Fifty patients of either sex who underwent functional endoscopic sinus surgery were evaluated for nasolacrimal duct injury by computed tomography. Computed tomography was performed pre-operatively and at four weeks post-operatively, and the scans were analysed for nasolacrimal duct injury.
The prevalence of nasolacrimal duct dehiscence was 1.16 per cent, and the post-operative incidence of nasolacrimal duct injury was similarly 1.16 per cent. However, no cases of symptomatic nasolacrimal duct injury were recorded.
Computed tomography is an effective, non-invasive method of evaluating nasolacrimal duct injury following functional endoscopic sinus surgery, in accordance with evidence-based medicine.
South Asians, who are at a disproportionately greater risk of atherosclerotic CVD (ASCVD), represent a rapidly growing population in the USA. The relationship between dairy products, a major component of South Asian diets, and body composition (an established risk factor for ASCVD) is unclear. The aim of the present study was to examine associations between dairy intake and multiple measures of body composition (BMI; waist and hip circumference; waist:hip ratio; abdominal lean mass; and subcutaneous, visceral and intermuscular fat areas) among South Asian adults in the USA. A baseline analysis was conducted using existing data from the Mediators of Atherosclerosis in South Asians Living in America cohort. In women, the highest (>1·9 servings/d) v. lowest (<1 serving/d) tertile of dairy intake was associated with 53 % lower odds of a waist circumference >80 cm (95 % CI 0·25, 0·89; P for trend < 0·05). No associations were observed between dairy intake and the other measures of body composition. However, >3 servings of low-fat yogurt/week was associated with a 9·9 cm2 lower visceral fat area (95 % CI –19·07, –0·72; P < 0·05) and a 2·3 cm2 lower intermuscular fat area (95 % CI –3·76, –0·79; P < 0·05) compared with three or fewer servings/week. Milk and cheese were not associated with body composition measures. These analyses suggest that higher consumption of low-fat yogurt is associated with lower visceral and intermuscular fat in the whole sample, and that women with higher dairy intake have lower waist circumference. Our study supports dietary incorporation of dairy products, and recognises the utility of multidimensional measures of central adiposity.
Although increased weight, and particularly obesity, has been associated with a more severe clinical course of COVID-19 and a higher risk of fatality, the course of the illness can also lead to a prolonged length of stay. Changes in nutritional status and weight loss during hospitalisation are widely reported in some populations, but remain unexplored in COVID-19 patients. Considering that patients with COVID-19 show an increased inflammatory response, other signs and symptoms that can lead to weight and muscle loss should be monitored. The aim of this article was to establish possible connections between COVID-19, prolonged hospitalisation and muscle wasting, and to propose nutritional recommendations for the prevention and treatment of cachexia, through a narrative review. Identification of the risk and presence of malnutrition should be an early step in the general assessment of all patients, with particular regard to at-risk categories including older adults and individuals suffering from chronic and acute disease conditions, such as COVID-19. The deterioration of nutritional status, and consequently cachexia, increases the risk of mortality and needs to be treated with the same attention as other complications. There is, however, little hard evidence on nutritional approaches to assisting COVID-19 treatment or its management, including cachexia.
The prevalence of malnutrition in patients with cancer is one of the highest of all patient groups. Weight loss (WL) is a frequent manifestation of malnutrition in cancer and several large-scale studies have reported that involuntary WL affects 50–80% of patients with cancer, with the degree of WL dependent on tumour site, type and stage of disease. The study of body composition in oncology using computed tomography has unearthed the importance of both low muscle mass (sarcopenia) and low muscle attenuation as important prognostic indicators of unfavourable outcomes, including poorer tolerance to chemotherapy, significant deterioration in performance status and quality of life (QoL), poorer post-operative outcomes and shortened survival. While often hidden by excess fat and high BMI, muscle abnormalities are highly prevalent in patients with cancer (ranging from 10 to 90%). Early screening to identify individuals with sarcopenia and decreased muscle quality would allow for earlier multimodal interventions to attenuate adverse body compositional changes. Multimodal therapies (combining nutritional counselling, exercise and anti-inflammatory drugs) are currently the focus of randomised trials to examine whether this approach can provide a sufficient stimulus to prevent or slow the cascade of tissue wasting and whether this then impacts on outcomes in a positive manner. This review will focus on the aetiology of musculoskeletal degradation in cancer; the impact of sarcopenia on chemotherapy tolerance, post-operative complications, QoL and survival; and outline current strategies for attenuation of muscle loss in clinical practice.
To determine the radiological prevalence of frontal cells according to the International Frontal Sinus Anatomy Classification in patients undergoing computed tomography of the paranasal sinuses for clinical symptoms of chronic rhinosinusitis, and to examine the association between cell classification and frontal sinusitis development.
A total of 180 (left and right) sides of 90 patients were analysed. The prevalence of each International Frontal Sinus Anatomy Classification cell was assessed. Logistic regression analysis was used to compare the distribution of various cells in patients with and without frontal sinusitis.
The agger nasi cell was the most commonly occurring cell, seen in 95.5 per cent of patients. The prevalence rates for supra agger cells, supra agger frontal cells, supra bullar frontal cells, supra bullar cells, supra-orbital ethmoid cells and frontal septal cells were 33.3 per cent, 22.2 per cent, 21.1 per cent, 36.1 per cent, 39.4 per cent and 21.1 per cent, respectively. There was no significant difference in the occurrence of any of the cell types in patients with frontal sinusitis compared to those without (p > 0.05).
The presence of any of the International Frontal Sinus Anatomy Classification cells was not significantly associated with frontal sinusitis.
Sarcopenia, defined as a decrease in skeletal muscle mass (SMM) and strength, might be associated with reduced survival. We investigated the impact of sarcopenia and decrease in SMM in patients with advanced pancreatic cancer during FOLFIRINOX (FX) therapy. Sixty-nine consecutive patients who received FX were evaluated. The skeletal muscle index (SMI) (cm2/m2) was used to evaluate SMM. The cut-off value for sarcopenia was defined as SMI <42 for males and <38 for females, based on the Asian Working Group for Sarcopenia criteria. Sarcopenia was diagnosed in thirty-three (48 %) subjects. Comparison of the baseline characteristics of the two groups (sarcopenia group v. non-sarcopenia group) showed a significant difference in sex, tumour size and BMI. There was no significant difference in the incidence of adverse events of grades 3–5 or in progression-free survival (PFS) during FX between the two groups (PFS 8·1 and 8·8 months; P = 0·88). On multivariate analysis, progressive disease at the first follow-up computed tomography (hazard ratio (HR) 3·87, 95 % CI 1·53, 9·67), decreased SMI ≥ 7·9 % in 2 months (HR 4·02, 95 % CI 1·87, 8·97) and carcinoembryonic antigen ≥ 4·6 (HR 2·52, 95 % CI 1·10, 6·11) were significant risk factors for poor overall survival (OS), but sarcopenia at diagnosis was not. OS in patients with decreased SMI of ≥7·9 % and <7·9 % was 10·9 and 21·0 months, respectively (P < 0·01). In conclusion, patients with a decrease in SMM within 2 months after the initiation of chemotherapy had significantly shorter OS, although sarcopenia at diagnosis did not affect OS. Therefore, it might be important to maintain SMM during chemotherapy for a better prognosis.
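The two thresholds this abstract relies on (the Asian Working Group SMI cut-offs and the 7·9 % two-month SMI-loss criterion) can be sketched as simple rules. This is a minimal illustration; the function names are my own, not from the study:

```python
def is_sarcopenic(smi: float, sex: str) -> bool:
    """SMI in cm2/m2; cut-off <42 for males, <38 for females (AWGS criteria)."""
    cutoff = 42.0 if sex == "male" else 38.0
    return smi < cutoff

def high_risk_smi_loss(smi_baseline: float, smi_2mo: float) -> bool:
    """An SMI decrease of >=7.9% within 2 months was a poor-prognosis factor."""
    pct_decrease = (smi_baseline - smi_2mo) / smi_baseline * 100
    return pct_decrease >= 7.9

print(is_sarcopenic(41.5, "male"))     # 41.5 < 42 -> True
print(high_risk_smi_loss(45.0, 41.0))  # (45-41)/45 = 8.9% -> True
```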
Although the gross and microscopic pathology in rats infected with Angiostrongylus cantonensis has been well described, corresponding changes detected using diagnostic imaging modalities have not been reported. This work describes the cardiopulmonary changes in mature Wistar rats chronically infected with moderate burdens of A. cantonensis using radiology, computed tomography (CT), CT angiography, echocardiography, necropsy and histological examinations. Haematology and coagulation studies were also performed. Thoracic radiography, CT and CT angiography showed moderately severe alveolar pulmonary patterns mainly affecting caudal portions of the caudal lung lobes and associated dilatation of the caudal lobar pulmonary arteries. Presumptive worm profiles could be detected using echocardiography, with worms seen in the right ventricular outflow tract or straddling the pulmonary and/or tricuspid valves. Extensive, multifocal, coalescing dark areas and multiple pale foci affecting the caudal lung lobes were observed at necropsy. Histologically, these were composed of numerous large, confluent granulomas and fibrotic nodules. Adult worms were found predominantly in the mid- to distal pulmonary arteries. An inflammatory leukogram, hyperproteinaemia and hyperfibrinogenaemia were found in most rats. These findings provide a comparative model for A. cantonensis in its accidental hosts, such as humans and dogs. In addition, the pathological and imaging changes are comparable to those seen in dogs infected with Angiostrongylus vasorum, suggesting rats infected with A. cantonensis could be a model for dogs with A. vasorum infection.
Anomalous origin of a branch pulmonary artery from the aorta is a rare malformation, accounting for 0.12% of all congenital heart defects. We present the case of a 43-year-old man with an anomalous origin of the right pulmonary artery (AORPA) from the ascending aorta. Reimplantation of the right pulmonary artery was carried out successfully, with a favourable outcome at medium-term follow-up. This is the first described case to receive corrective treatment in adulthood with a favourable outcome.
Chest CT evaluation is often vital in assessing patients suspected of having COVID-19 pneumonia. The aim of this study was to determine the evolution of lung abnormalities, evaluated by quantitative CT techniques, in patients with COVID-19 infection from initial diagnosis to recovery. This retrospective study included 16 patients with COVID-19 infection from 30 January 2020 through 11 March 2020. Repeat chest CT examinations were obtained, yielding three or more scans per patient. We measured the total volume and mean CT value of lung lesions in each patient per scan, and then calculated the mass, which equals volume × (CT value + 1000). The dynamic evolution of chest CT imaging as a function of time was fitted by a non-linear regression model in terms of mass, volume and CT value, respectively. According to the fitting curves, we redefined the evolution of lung abnormalities: a progressive stage (0–5 days), in which infection emerged and rapidly worsened; a peak stage (5–15 days), with the greatest severity at approximately 7–8 days after onset; and an absorption stage (15–30 days), in which the lesions slowly and gradually resolved.
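The lesion "mass" metric defined in this abstract combines volume with mean CT value in Hounsfield units, where air is −1000 HU (so a lesion at air density contributes zero mass). A minimal sketch of the stated formula, with illustrative input values:

```python
def lesion_mass(volume: float, mean_ct_hu: float) -> float:
    """Quantitative lesion mass as defined in the abstract:
    mass = volume * (CT value + 1000), with CT value in HU."""
    return volume * (mean_ct_hu + 1000.0)

# e.g. a 200-unit lesion volume with a mean CT value of -400 HU:
print(lesion_mass(200.0, -400.0))  # 200 * 600 = 120000.0
```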
Concurrent three-dimensional imaging of the renal vascular and tubular systems on the whole-kidney scale with capillary level resolution is labor-intensive and technically difficult. Approaches based on vascular corrosion casting and X-ray micro computed tomography (μCT), for example, suffer from vascular filling artifacts and necessitate imaging with an additional modality to acquire tubules. In this work, we report on a new sample preparation, image acquisition, and quantification protocol for simultaneous vascular and tubular μCT imaging of whole, uncorroded mouse kidneys. The protocol consists of vascular perfusion with the water-soluble, aldehyde-fixable, polymeric X-ray contrast agent XlinCA, followed by laboratory-source μCT imaging and structural analysis using the freely available Fiji/ImageJ software. We achieved consistent filling of the entire capillary bed and staining of the tubules in the cortex and outer medulla. After imaging at isotropic voxel sizes of 3.3 and 4.4 μm, we segmented vascular and tubular systems and quantified luminal volumes, surface areas, diffusion distances, and vessel path lengths. This protocol permits the analysis of vascular and tubular parameters with higher reliability than vascular corrosion casting and less labor than serial sectioning, and leaves the tissue intact for subsequent histological examination with light and electron microscopy.
Vomiting is common in children after minor head injury. In previous research, isolated vomiting was not a significant predictor of intracranial injury after minor head injury; however, the significance of recurrent vomiting is unclear. This study aimed to determine the value of recurrent vomiting in predicting intracranial injury after pediatric minor head injury.
This secondary analysis of the CATCH2 prospective multicenter cohort study included participants (0–16 years) who presented to a pediatric emergency department (ED) within 24 hours of a minor head injury. ED physicians completed standardized clinical assessments. Recurrent vomiting was defined as ≥4 episodes. Intracranial injury was defined as acute intracranial injury on computed tomography scan. Predictors were examined using chi-squared tests and logistic regression models.
A total of 855 (21.1%) of the 4,054 CATCH2 participants had recurrent vomiting, 197 (4.9%) had intracranial injury, and 23 (0.6%) required neurosurgical intervention. Children with recurrent vomiting were significantly more likely to have intracranial injury (odds ratio [OR], 2.3; 95% confidence interval [CI], 1.7–3.1) and to require neurosurgical intervention (OR, 3.5; 95% CI, 1.5–7.9). Recurrent vomiting remained a significant predictor of intracranial injury (OR, 2.8; 95% CI, 1.9–3.9) when controlling for other CATCH2 criteria. The probability of intracranial injury increased with the number of vomiting episodes, especially when accompanied by other high-risk factors, including signs of a skull fracture, or irritability and a Glasgow Coma Scale score < 15 at 2 hours postinjury. Timing of the first vomiting episode and age were not significant predictors.
Recurrent vomiting (≥4 episodes) was a significant risk factor for intracranial injury in children after minor head injury. The probability of intracranial injury increased with the number of vomiting episodes and when accompanied by other high-risk factors, such as signs of a skull fracture or an altered level of consciousness.
Damage to the corticospinal tract (CST) from stroke leads to motor deficits. The damage can be quantified as the amount of overlap between the stroke lesion and CST (CST Injury). Previous literature has shown that the degree of motor deficits post-stroke is related to the amount of CST Injury. These studies delineate the stroke lesion from structural T1-weighted magnetic resonance imaging (MRI) scans, often acquired for research. In Canada, computed tomography (CT) is the most common imaging modality used in routine acute stroke care. In this proof-of-principle study, we determine whether CST Injury, using lesions delineated from CT scans, significantly explains the variability in motor impairment in individuals with stroke.
Thirty-seven participants with stroke were included in this study. These individuals had a CT scan within the acute stage (7 days) of their stroke and underwent motor assessments. Brain images from CT scans were registered to MRI space. We performed a stepwise regression analysis to determine the contribution of CST injury and demographic variables in explaining motor impairment variability.
Using clinically available CT scans, we found modest evidence that CST Injury explains variability in motor impairment (R2adj = 0.12, p = 0.02). None of the participant demographic variables entered the model.
We show for the first time a relationship between CST Injury and motor impairment using CT scans. Further work is required to evaluate the utility of data derived from clinical CT scans as a biomarker of stroke motor recovery.
Dual energy X-ray absorptiometry (DEXA) is an imaging modality that has been used to predict the computed tomography (CT)-determined carcass composition of multiple species, including sheep and pigs, with minimal inaccuracies, using medical-grade DEXA scanners. An online DEXA scanner in an Australian abattoir has shown that a high level of precision can be achieved when predicting lamb carcass composition in real time. This study investigated the accuracy of that same online DEXA when predicting fat and lean percentages as determined by CT over a wide range of phenotypic and genotypic variables across 454 lambs in six kill groups, and contrasted these results against the current Australian industry standard of grade-rule (GR) measurements for grading carcasses. Lamb carcasses were DEXA scanned and then CT scanned to determine CT Fat % and CT Lean %. All phenotypic traits and genotypic information, including Australian Sheep Breeding Values, were recorded for each carcass. Residuals between the DEXA-predicted and actual CT Fat % and CT Lean % were calculated and tested against all phenotypic and genotypic variables. Excellent overall precision was recorded when predicting CT Fat % (R2 = 0.91, RMSE = 1.19%). Small biases present for sire breed, sire type, dam breed, hot carcass weight and c-site eye muscle area could be explained by a regression paradox; however, biases among kill group (−0.73% to 1.01% for CT Fat %, −1.48% to 0.76% for CT Lean %) and the Merino sire type (0.36% for CT Fat %, −0.73% for CT Lean %) could not be explained by this effect. Over the large range of phenotypic and genotypic variation, there was excellent precision when predicting CT Fat % and CT Lean % by an online DEXA, with only minor biases, showing superiority to the existing Australian standard of GR measurements.
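The precision metrics reported above (R2 and RMSE for predicted v. actual CT Fat %) are standard regression diagnostics. The sketch below computes both from illustrative predicted/actual values, not the study's data:

```python
import math

def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

def rmse(actual, predicted):
    """Root-mean-square error, in the same units as the measurements."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

# Illustrative CT Fat % values only:
actual = [20.1, 25.3, 18.7, 30.2, 22.5]
predicted = [21.0, 24.1, 19.5, 29.0, 23.3]
print(round(r_squared(actual, predicted), 3))
print(round(rmse(actual, predicted), 3))
```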
Introduction: As the availability of computed tomography pulmonary angiography (CTPA) to rule out pulmonary embolism (PE) increases, so too does its utilization, and consequent overutilization. A variety of evidence-based algorithms and decision rules using clinical criteria and D-Dimer testing have been proposed as instruments to allow physicians to safely rule out a PE in low-risk patients. However, studies have shown mixed results with respect to both physician uptake of these decision rules and their impact on improving ordering practices among physicians. The objective of this study is to describe the prevalence of D-Dimer utilization among ED physicians and its impact on the positive yield rates of CTPAs in a community setting. Methods: Data were collected on all CTPA studies ordered by ED physicians at two very high-volume community hospitals and an affiliated urgent care centre during the 2-year period between January 1, 2016 and December 31, 2017. For each CTPA, we determined 1) whether a D-Dimer had been ordered prior to the CTPA, 2) whether the D-Dimer was positive, and 3) whether the CTPA was positive for a PE. Using a chi-square test, we compared the diagnostic yield for those patients who had a D-Dimer prior to their CTPA with those who did not. Results: A total of 2,811 CTPAs were included in the analysis. Of these, 964 CTPAs (34.3%) were ordered without a D-Dimer. Of the 1,847 patients who underwent D-Dimer testing prior to the CTPA, 343 (18.7%) underwent a CTPA despite a negative D-Dimer. When compared as a group, those CTPAs preceded by a D-Dimer showed no significant difference in positive yield compared to those CTPAs ordered without a prior D-Dimer (9.9% versus 11.3%, p = 0.26). Conclusion: The findings of this study present a complicated picture of the impact of D-Dimer utilization on CTPA ordering patterns.
There is evidence of suboptimal uptake of routine D-Dimer ordering and of adherence to guidelines in terms of forgoing CTPAs in low-risk patients with negative D-Dimers. While this study design leaves unanswered the question of how many CTPAs were avoided as a result of a negative D-Dimer, the finding of a similar positive yield among patients who had a D-Dimer ordered versus those who did not is interesting, and illustrative of the issues arising from the high false-positive rates associated with D-Dimer screening.
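The yield comparison above (9.9% v. 11.3%, p = 0.26) is a chi-square test on a 2×2 table. The sketch below reproduces that test with counts reconstructed from the reported totals and percentages, so they are approximate, not the study's raw data:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared for the 2x2 table [[a, b], [c, d]] (1 df).
    Returns (statistic, p-value); the 1-df upper-tail probability is
    erfc(sqrt(stat / 2))."""
    n = a + b + c + d
    stat = 0.0
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        exp = row * col / n
        stat += (obs - exp) ** 2 / exp
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# ~9.9% positives of 1,847 D-Dimer-first CTPAs v. ~11.3% of 964 without:
stat, p = chi2_2x2(183, 1664, 109, 855)
print(round(stat, 2), round(p, 2))  # p lands near the reported 0.26
```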
Introduction: Most emergency departments (EDs) in Canada have a population of high-frequency users who present to the ED on a regular basis. These patients are well described in the literature and are typically defined by a frequency of 8-10 visits/year. In Thunder Bay, Ontario, we have a significant population of patients who present more often, whom we have termed “super-users”. These patients are typically from a vulnerable population with multiple co-morbidities and a high mortality rate. Although their risk for poor health outcomes is well recognized, both the chronicity and complexity of their symptoms often contribute to diagnostic dilemmas. The decision to order a computed tomography (CT) scan can involve a difficult balance between ruling out life-threatening diagnoses and exposing the patient to excessive radiation. Our objective was to describe how often these super-users of the ED received a CT scan and what types of imaging were completed. Methods: The Thunder Bay Regional Health Sciences Centre is a geographically isolated hospital in Northwestern Ontario, with the next closest hospital-based CT scanner greater than 300 km away. Based on previous literature and our preliminary scoping of the super-user group, we identified a minimum of 25 visits as the threshold. A retrospective chart review was conducted for the year 2017 using our electronic medical record. Patient demographic data were collected, along with the type and number of CT scans, using a standardized collection tool. Results: Our preliminary results showed that our total population of super-users was 75 patients, with an average of 32 visits to the ED per year. A total of 76% of the patients had a CT scan completed at least once. On average, these patients had a CT during 10% of their visits, with head CT comprising 50% of the imaging and abdominal/pelvic imaging comprising another 45%. For 20% of these super-users, CTs were performed on 20% of their visits.
From this population, only 10% of the patients had surgery in 2017, while 7% of visits required admission to hospital. The most common diagnoses for these patient visits related to mental health/addictions, gastrointestinal complaints and infection. Conclusion: This study has shown that a significant number of our super-user population are receiving multiple CTs. Our next step is to collect data on individual radiation doses and calculate exposure risks. We hope to inform policy- and decision-makers who are developing programs to treat the underlying causes of their high resource use.
To investigate the prevalence of bony dehiscence in the tympanic facial canal in patients with acute otitis media with facial paresis compared to those without facial paresis.
A retrospective case–control study was conducted on acute otitis media patients with facial paresis undergoing high-resolution temporal bone computed tomography.
Forty-eight patients were included (24 per group). Definitive determination of the presence of a bony dehiscence was possible in 44 out of 48 patients (91.7 per cent). The prevalence of bony dehiscence in acute otitis media patients with facial paresis was not different from that in acute otitis media patients without facial paresis (p = 0.21). Presence of a bony dehiscence was associated with a positive predictive value of 66.7 per cent with regard to the development of facial paresis. However, an intact bony tympanic facial canal did not prevent facial paresis in 44.8 per cent of cases (95 per cent confidence interval = 34.6–55.6).
Prevalence of bony dehiscence in acute otitis media patients with facial paresis did not differ from that in acute otitis media patients without facial paresis. An intact tympanic bony facial canal does not protect from facial paresis development.
Introduction: Choosing Wisely Nova Scotia (CWNS), an affiliate of Choosing Wisely Canada™ (CWC), aims to address unnecessary care and testing through literature-informed lists developed by various disciplines. CWC has identified unnecessary head CTs among the top five interventions to question in the Emergency Department (ED). Zyluk (2015) determined the Canadian CT Head Rule (CCHR) to be the most effective clinical decision rule for adults with minor head injuries. To better understand the current status of CCHR use in Nova Scotia, we conducted a retrospective audit of patient charts at the Charles V. Keating Emergency and Trauma Center, in Halifax, Nova Scotia. Methods: Our mixed methods design included a literature review, a retrospective chart audit, and a qualitative audit-feedback component with participating physicians. The chart audit applied the guidelines for adherence to the CCHR and reported on the level of compliance within the ED. Analysis of qualitative data is included here in parallel, to contextualize findings from the audit. Results: 302 charts of patients having presented to the surveyed site were retrospectively reviewed. Of the 37 cases where a CT head was indicated as per the CCHR, a CT was ordered 32 (86.5%) times. Of the 176 cases where a CT head was not indicated, a CT was not ordered 155 (88.1%) times. Therefore, the CCHR was followed in 187 (87.8%) of the total 213 cases where the CCHR should be applied. Conclusion: Our study reveals adherence to the CCHR in 87.8% of cases at this ED. Identifying contextual factors that facilitate or hinder the application of the CCHR in practice is critical for reducing unnecessary CTs. This work has been presented to the physician group to gain physician engagement and to elucidate enablers and barriers to guideline adherence. In light of the frequency of CT heads ordered in EDs, even a small reduction would be impactful.
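The 87.8% adherence figure follows directly from the two compliance counts reported in the results. A quick arithmetic check (counts taken from the abstract):

```python
# CCHR adherence = (CTs ordered when indicated + CTs withheld when not
# indicated) / all cases where the rule applied.
ct_indicated, ordered_when_indicated = 37, 32
ct_not_indicated, withheld_when_not_indicated = 176, 155

followed = ordered_when_indicated + withheld_when_not_indicated
applicable = ct_indicated + ct_not_indicated
adherence = followed / applicable * 100
print(followed, applicable, round(adherence, 1))  # 187 213 87.8
```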
Background: Radiologists have traditionally recommended oral contrast agents (such as Telebrix®) for patients undergoing computed tomography of the abdomen/pelvis (CTAP), but recent evidence has shown limited diagnostic benefit for most emergency department (ED) patients. Additionally, the use of oral contrast has numerous drawbacks, including patient nausea/vomiting, risk of aspiration, delays to CTAP completion and increased ED length of stay (LOS). Aim Statement: The aim was to safely reduce the number of ED patients receiving oral contrast prior to undergoing CTAP and thereby reduce ED length of stay. Measures & Design: An evidence-based ED protocol was developed in collaboration with radiology. PDSA cycle #1 was implementation at a pilot site to identify potential barriers. Challenges identified included the need to change the electronic order sets to reflect the new protocol, improved communication with frontline providers and the addition of an online BMI calculator. PDSA cycle #2 was widespread implementation across all four EDs in the Calgary zone. The protocol was incorporated into all relevant electronic ED order sets to act as a physician prompt. Using administrative data, we extracted and analyzed data using descriptive and inferential statistics for the outcomes and balancing measures over a period of 12 months pre- and 12 months post-intervention. Evaluation/Results: A total of 14,868 and 17,995 CTAP exams were included in the pre and post periods, respectively. Usage of oral contrast fell from 71% in the pre-study period to 30% in the post-study period (P < 0.0001). This corresponded to a reduction in the average time from CT requisition to CT report completion from 3.30 hours to 2.31 hours (-0.99 hrs, P = 0.001) and a reduction in average ED LOS from 11.01 hours to 9.92 hours (-1.08 hrs, P < 0.0001). The protocol resulted in a reduction of 19,434.6 patient hrs in the ED. Run charts demonstrate that the change was sustained over time.
Our protocol did not demonstrate an increase in rates of repeat CTAP (P = 0.563) at 30 days, nor an increase in patient re-admission within 7 days (P = 0.295). Discussion/Impact: Successful implementation of an ED and radiology developed protocol significantly reduced the use of oral contrast in patients requiring enhanced CTAP as part of their diagnostic work up and, thereby, reduced overall ED LOS without increasing the need for repeat examinations within 30 days or re-admission within 7 days.
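The ~19,434.6 patient-hour saving quoted in the abstract above is consistent with the post-period CTAP volume multiplied by the reported per-visit ED LOS reduction:

```python
# Values as reported in the abstract.
post_period_ctap_exams = 17_995
ed_los_reduction_hours = 1.08  # 11.01 h -> 9.92 h, reported as -1.08 hrs

hours_saved = post_period_ctap_exams * ed_los_reduction_hours
print(round(hours_saved, 1))  # 19434.6
```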
Computed tomography studies concerning pineal calcification (PC) in schizophrenia have been conducted mainly by one author, who correlated this calcification with several aspects of the illness. On the basis of these findings, the aim of the present study was to analyze the size and incidence of pineal gland calcification by CT in schizophrenics and healthy controls, to verify the relationship between pineal calcification and age, and to examine the possible correlation with psychopathologic variables. Pineal calcification was measured on CT scans of 87 schizophrenics and 46 controls divided into seven age subgroups of five years each. No significant differences in PC incidence and mean size between patients and controls were observed when the entire group was considered. PC size correlated with age both in schizophrenics and in controls. We found a higher incidence of PC in schizophrenics in the age subgroup of 21–25 years, and a negative correlation with positive symptoms of schizophrenia in the overall group. These findings could suggest a premature calcific process in schizophrenics and a probable association with ‘non-paranoid’ aspects of the illness. Nevertheless, the potential role of this process, possibly related to some aspects of altered neurodevelopment in schizophrenia, is still unclear.
To evaluate the upper airway morphology changes associated with ageing in adult Chinese patients with obstructive sleep apnoea.
A total of 124 male patients diagnosed with obstructive sleep apnoea by overnight polysomnography, who underwent upper airway computed tomography, were enrolled. The linear dimensions, cross-sectional area and volume of the upper airway region and the surrounding bony frame were measured. The association between ageing and upper airway morphology was analysed.
Soft palate length, minimum cross-sectional area of the retroglossal region, lateral dimensions at the minimum cross-sectional area of the retropalatal and retroglossal regions, nasopharyngeal volume, and average cross-sectional area of the nasopharyngeal region were found to significantly increase with ageing in all patients, while the upper airway shape flattened with ageing. The volume of the retropalatal region increased with ageing among the patients with a body mass index of less than 24 kg/m2. The volume of parapharyngeal fat pad increased with ageing among patients with a body mass index greater than 28 kg/m2.
A number of dimensional, cross-sectional and volumetric parameters of the pharynx increased with age, indicating that non-anatomical factors may play a more important role in the pathogenesis of obstructive sleep apnoea in aged patients.