For historic property types such as archaeological sites and historic buildings, data recovery is often the main part of mitigation plans offered by federal agencies with undertakings that will destroy part or all of a cultural resource. In theory, by extracting important information before destruction, we recover some part of a historic resource's cultural value. In some situations, however, data recovery is impossible or otherwise undesirable, and “creative” or off-site mitigation measures are necessary to mitigate adverse effects. In such circumstances, the Washington State Department of Archaeology and Historic Preservation has accepted funding from federal agencies to create, implement, and enhance an online digital information system for cultural resources. This article describes the Washington Information System for Architectural and Archaeological Records Data (WISAARD) and provides an example of a federal agency funding WISAARD development as creative mitigation for the transfer of archaeological sites out of federal ownership. We discuss the benefits of such systems and address how their development meets preservation goals established by the Advisory Council on Historic Preservation.
OBJECTIVES/GOALS: The detection of liver fibrotic changes at an early and reversible stage is essential to prevent progression to end-stage cirrhosis and hepatocellular carcinoma. Liver biopsy, the current gold standard for fibrosis assessment, is accompanied by several complications due to its invasive nature, in addition to sampling errors and reader variability. In this study, we evaluate the use of quantitative parameters extracted from hybrid ultrasound and photoacoustic imaging to detect and monitor fibrotic changes in a DEN rat model. METHODS/STUDY POPULATION: Liver fibrotic changes were induced in 34 Wistar male rats by oral administration of diethylnitrosamine (DEN) for 12 weeks. Twenty-two rats were imaged with B-mode ultrasound at 3 time points (baseline, 10 weeks and 13 weeks) to monitor liver texture changes. Texture features studied included tissue echointensity (liver brightness normalized to kidney brightness) and tissue heterogeneity. Twelve rats were imaged with photoacoustic imaging at 4 time points (baseline, 5 weeks, 10 weeks, and 13 weeks) to assess changes in tissue oxygenation. Hemoglobin oxygen saturation (sO2A) and hemoglobin concentration (HbT) in the right and left lobes of the liver were measured. Eight rats were used as controls. Liver tissue samples were obtained 13 weeks after the start of DEN administration for METAVIR histopathology staging of fibrosis. RESULTS/ANTICIPATED RESULTS: The texture features studied increased over time in DEN rats. Normalized echointensity increased from 0.28 ± 0.06 at baseline to 0.46 ± 0.10 at 10 weeks (p < 0.0005) and 0.53 ± 0.15 at 13 weeks (p < 0.0005). In the control rats, echointensity remained at an average of 0.25 ± 0.05 (p = 0.31). Tissue heterogeneity increased over time in the DEN-exposed rats from a baseline of 208.7 ± 58.3 to 344.6 ± 52.9 at 10 weeks (p < 0.0005) and 376.8 ± 54.9 at 13 weeks (p = 0.06), whereas it remained constant at 225.7 ± 37.6 in control rats (p = 0.58).
The quantitative analyses of the photoacoustic signals showed that blood oxygen saturation significantly increased with time. At 5 weeks, sO2AvT had increased by 53.83% (± 0.25) and HbT by 35.31% (± 0.07); following 10 weeks of DEN, sO2AvT had increased by 92.04% (± 0.29) and HbT by 55.24% (± 0.1). All increases were significant (p < 0.05). At 13 weeks, however, the values of all of these parameters were lower than those at 10 weeks, although the decrease was not statistically significant. DISCUSSION/SIGNIFICANCE OF IMPACT: Quantitative features from B-mode ultrasound and photoacoustic imaging consistently increased over time as hepatic damage, inflammation and fibrosis progressed. The use of this hybrid imaging method in clinical practice can help meet the significant need for noninvasive assessment of liver fibrosis.
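The two quantitative features reported above reduce to simple arithmetic. As a minimal sketch (with hypothetical ROI intensities, not study data, and with ROI handling assumed since the abstract does not specify it):

```python
import numpy as np

def normalized_echointensity(liver_roi, kidney_roi):
    """Tissue echointensity: mean liver brightness normalized to mean
    kidney brightness from a B-mode image (ROI handling is an
    assumption here, not specified in the abstract)."""
    return float(np.mean(liver_roi) / np.mean(kidney_roi))

def percent_change(baseline, followup):
    """Percent change of a quantitative feature relative to baseline."""
    return 100.0 * (followup - baseline) / baseline

# Hypothetical 8-bit ROI intensities (illustrative only)
liver = np.array([[60, 62], [58, 64]])
kidney = np.array([[120, 118], [122, 124]])
ei = normalized_echointensity(liver, kidney)
```

Applied to the reported group means, `percent_change(0.28, 0.46)` gives the roughly 64% rise in normalized echointensity between baseline and 10 weeks.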
Introduction: Time-to-treatment plays a pivotal role in survival from sudden cardiac arrest (SCA). Every minute of delay in defibrillation results in a 7-10% reduction in survival. This is particularly problematic in rural and remote regions, where bystander and EMS response is often prolonged and automated external defibrillators (AEDs) are often not available. Our objective was to examine the feasibility of a novel AED drone delivery method for rural and remote SCA. A secondary objective was to compare times between AED drone delivery and ambulance response to various mock SCA resuscitations. Methods: We conducted 6 simulations in two different rural communities in southern Ontario. During phase 1 (4 simulations), a “mock” call was placed to 911 and a single AED drone and an ambulance were simultaneously dispatched from the same location to a pre-determined destination. Once on scene, trained first responders retrieved the AED from the drone and initiated resuscitative efforts on a manikin. The second phase (2 scenarios) was conducted in a similar manner, save for the drone being dispatched from a regionally optimized location for drone response. Results: Phase 1: The distance from dispatch location to scene varied from 6.6 km to 8.8 km. Mean (SD) response time from 911 call to scene arrival was 11.2 (+/- 1.0) minutes for EMS compared to 8.1 (+/- 0.1) minutes for AED drone delivery. In all four simulations, the AED drone arrived before EMS, ranging from 2.1 to 4.4 minutes faster. The mean time for trained responders to retrieve the AED and apply it to the manikin was 35 (+/- 5) sec. No difficulties were encountered in drone activation by dispatch, drone lift-off, landing, or removal of the AED from the drone by responders. Phase 2: The ambulance response distance was 20 km compared to 9 km for the drone. The drone arrived at the scene 7 and 8 minutes faster, with AED application 6 and 7 minutes before ambulance arrival, respectively.
Conclusion: This implementation study suggests AED drone delivery is feasible with improvements in response time during a simulated SCA scenario. These results suggest the potential for AED drone delivery to decrease time to first defibrillation in rural and remote communities. Further research is required to determine the appropriate distance for drone delivery of an AED in an integrated EMS system as well as optimal strategies to simplify bystander application of a drone delivered AED.
Some patients with schizophrenia switch medications due to lack of efficacy or side effects; improvement in symptoms and side effects following a switch must be assessed.
In a 12-week, open-label, baseline-controlled, flexible-dose switch study, adult outpatients with schizophrenia experiencing suboptimal efficacy or tolerability problems were switched from haloperidol (n=99), olanzapine (n=82), or risperidone (n=104) to ziprasidone (80–160 mg/d; dosed bid with food). The primary efficacy evaluation was the BPRS score at Week 12. Safety evaluations included change from baseline in movement disorders (SAS, BAS, AIMS), weight, prolactin, and fasting lipid levels. Statistical tests were 1-sided non-inferiority comparisons with correction for multiple comparisons (0.025/3 significance level) for the primary efficacy endpoint, and 2-sided (0.05 significance level) for secondary endpoints.
BPRS scores improved significantly compared with all 3 preswitch medications at Week 12. Mean change from baseline (SD) for patients switched from haloperidol, olanzapine, and risperidone was –11.3 (16.3), –6.3 (14.2), and –9.9 (13.2), respectively (p < 0.0001 vs baseline). Movement disorders, measured by SAS, BAS, and AIMS, improved significantly for subjects switched from haloperidol and risperidone. Change in weight (kg ± SD) from baseline was 0.4 ± 3.97, –2.0 ± 3.99 (p < 0.001), and –0.6 ± 3.21 for subjects switched from haloperidol, olanzapine, and risperidone, respectively.
Patients switched to ziprasidone demonstrated improvement in symptoms and movement disorders, with a weight neutral effect. Ziprasidone is an appropriate switch option for patients experiencing suboptimal efficacy or poor tolerability with their current treatment.
South Africa (SA) is a developing country with an ageing population. Adequate nutrition and physical activity (PA) protect against the loss of muscle mass and physical function, both of which are important components of sarcopenia. This study aimed to measure the prevalence of sarcopenia in older black SA women and investigate its associations with PA and protein intake.
Materials and Methods
Older black SA women (n = 122; age 68 years, range 60–85) completed sociodemographic questionnaires, 24 h urine collection (to estimate protein intake), venous blood sampling (hs-C-reactive protein (hs-CRP) and ferritin), functional tests (grip strength, 3 m timed-up-and-go (TUG), 10 m walk test) and PA monitoring (activPAL). Dual-energy x-ray absorptiometry whole-body scans assessed fat and fat-free soft tissue mass (FFSTM).
According to the European Working Group on Sarcopenia in Older People (EWGSOP2) criteria, 2.5% (n = 3) had confirmed sarcopenia of a low severity based on normal physical function. Of the total cohort, 9% (n = 11) had low grip strength, 22.1% (n = 27) had a low appendicular skeletal muscle index (ASMI), and no women had low TUG (s) or gait speed (m/s). Higher ASMI was associated with lower hs-CRP (Rho = -0.209; p = 0.05) and higher ferritin (Rho = 0.252; p = 0.019), grip strength (kg; Rho = 0.223; p = 0.015), and gait speed (m/s; Rho = 0.180; p = 0.050). Estimated protein intake was 41.8 g/day, or 0.51 g/kg of body mass/day. Higher total protein intake (g/24 h) was associated with higher FFSTM (kg) and ASMI (p < 0.001). PA outcomes were not correlated with FFSTM or ASMI (p > 0.05); however, there was a strong positive correlation of TUG (s) and gait speed (m/s) with time spent (1) stepping per day (min) and (2) at a high cadence (> 100 steps/min) (all p < 0.01). Daily step count was 7137 ± 3233 (mean ± standard deviation), with 97.9 ± 38.7 min of total time spent stepping and 12.6 ± 16.8 min spent stepping at a high cadence (> 100 steps/min). Of note, 13.9% (n = 17) of women were completing > 10,000 steps/day.
Based on the EWGSOP2 criteria, there is a low prevalence of sarcopenia in older black SA women, explained by the maintenance of strength and physical function, which was directly related to PA, especially PA performed at higher intensities. In contrast, low muscle mass was relatively prevalent (22.1%) and was associated with low dietary protein and not PA. Notably, it may be important to review the cut-points of the EWGSOP2 criteria to make them specific to older SA women from disadvantaged communities.
Osteoporosis was not a public health concern in black South African (SA) women, until recently when it was reported that the prevalence of vertebral fractures was 9.1% in black compared to 5.0% in white SA women. Accordingly, this study aimed to measure bone mineral density (BMD) of older black SA women and to investigate its association with risk factors for osteoporosis, including strength, muscle and fat mass, dietary intake and objectively measured physical activity (PA).
Methods and materials
Older black SA women (n = 122; age 68 years, range 60–85) completed sociodemographic and quantitative food frequency questionnaires (QFFQ), fasting venous blood sampling (25-hydroxycholecalciferol: vitamin D-25), 24 h urine collection (to estimate protein intake), grip strength testing and PA monitoring (activPAL). Dual-energy x-ray absorptiometry (DXA) scans of the hip (femoral neck and total) and lumbar spine determined BMD, and whole-body scans measured fat and fat-free soft tissue mass (FFSTM). WHO classifications were used to determine osteopenia (t-score -2.5 to -1) and osteoporosis (t-score < -2.5).
At the lumbar spine, 34.4% of the women (n = 42) had osteopenia and 19.7% (n = 24) had osteoporosis. At the left femoral neck, 32% (n = 40) of participants had osteopenia and 13.1% (n = 16) had osteoporosis. The total left hip BMD indicated osteopenia in 27.9% (n = 34) and osteoporosis in 13.1% (n = 16) of participants. Multinomial regression revealed no differences in age (y) or frequency of falls in the past year between all groups (p = 0.727). Compared to those with normal BMD, participants with osteoporosis at the femoral neck and lumbar spine were shorter, weighed less and had a lower body mass index (BMI) (all p < 0.05). When adjusted for height, the osteoporotic group (femoral neck and lumbar spine) had lower trunk fat (% whole body), FFSTM (kg) and grip strength (kg), compared to those with normal BMD (p < 0.05). Only protein intake (g; 24 h urine analyses) was lower in women with osteoporosis (all sites) compared to those with normal BMD. Fat, carbohydrate and micronutrient intakes (relative to total daily energy intake), and vitamin D concentrations were not associated with BMD (all sites). Daily step count and stepping time (min) were inversely associated with BMI (p < 0.05), but not with BMD (all sites; p > 0.05).
A high prevalence of osteopenia and osteoporosis was evident at the lumbar spine and hip in older black SA women. This study highlights the importance of strength, body composition, and protein intake in maintaining BMD and preventing the development of osteoporosis in older women.
We show that the isomorphism problems for left distributive algebras, racks, quandles and kei are as complex as possible in the sense of Borel reducibility. These algebraic structures are important for their connections with the theory of knots, links and braids. In particular, Joyce showed that a quandle can be associated with any knot, and this serves as a complete invariant for tame knots. However, such a classification of tame knots heuristically seemed to be unsatisfactory, due to the apparent difficulty of the quandle isomorphism problem. Our result confirms this view, showing that, from a set-theoretic perspective, classifying tame knots by quandles replaces one problem with (a special case of) a much harder problem.
To validate digitally displayed photographic portion-size estimation aids (PSEA) against a weighed meal record and compare findings with an atlas of printed photographic PSEA and actual prepared-food PSEA in a low-income country.
Participants served themselves water and five prepared foods, which were weighed separately before the meal and again after the meal to measure any leftovers. Participants returned the following day and completed a meal recall. They estimated the quantities of foods consumed three times using the different PSEA in a randomized order.
Two urban and two rural communities in southern Malawi.
Women (n 300) aged 18–45 years, equally divided by urban/rural residence and years of education (≤4 years and ≥5 years).
Responses for digital and printed PSEA were highly correlated (>91 % agreement for all foods, Cohen’s κw = 0·78–0·93). Overall, at the individual level, digital and actual-food PSEA had a similar level of agreement with the weighed meal record. At the group level, the proportion of participants who estimated within 20 % of the weighed grams of food consumed ranged by type of food from 30 to 45 % for digital PSEA and 40–56 % for actual-food PSEA. Digital PSEA consistently underestimated grams and nutrients across foods, whereas actual-food PSEA provided a mix of under- and overestimates that balanced each other to produce accurate mean energy and nutrient intake estimates. Results did not differ by urban and rural location or participant education level.
Digital PSEA require further testing in low-income settings to improve accuracy of estimations.
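The group-level accuracy metric reported above (the proportion of participants whose estimate fell within 20% of the weighed grams consumed) can be sketched as follows; the paired values here are hypothetical, not study data:

```python
def within_tolerance_share(estimated_g, weighed_g, tol=0.20):
    """Proportion of paired estimates falling within ±tol (relative)
    of the weighed reference amount for one food."""
    if len(estimated_g) != len(weighed_g):
        raise ValueError("paired lists must have equal length")
    hits = sum(
        abs(est - ref) <= tol * ref
        for est, ref in zip(estimated_g, weighed_g)
    )
    return hits / len(weighed_g)

# Hypothetical estimates vs weighed grams for one food (illustrative only)
weighed = [100, 150, 80, 200, 120]
estimates = [110, 190, 70, 205, 95]
share = within_tolerance_share(estimates, weighed)
```

With these illustrative pairs, three of the five estimates fall within the ±20% band, a share in the same 30–56% range the study reports across foods and PSEA types.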
To investigate preferences for and ease-of-use perceptions of different aspects of printed and digitally displayed photographic portion-size estimation aids (PSEA) in a low-resource setting and to document accuracy of portion-size selections using PSEA with different visual characteristics.
A convergent mixed-methods design and stepwise approach were used to assess characteristics of interest in isolation. Participants served themselves food and water, which were weighed before and after consumption to measure leftovers and quantity consumed. Thirty minutes later, data collectors administered a meal recall using a PSEA and then a semi-structured interview.
Blantyre and Chikwawa Districts in the southern region of Malawi.
Ninety-six women, aged 18–45 years.
Preferences and ease-of-use perceptions favoured photographs rather than drawings of shapes, three and five portion-size options rather than three with four virtual portion-size options, a 45° rather than a 90° photograph angle, and simultaneous rather than sequential presentation of portion-size options. Approximately half to three-quarters of participants found the portion-size options represented appropriate amounts of foods or water consumed. Photographs with three portion sizes resulted in more accurate portion-size selections (closest to measured consumption) than other format and number of portion-size option combinations. A 45° angle and simultaneous presentation were more accurate than a 90° angle and sequential presentation of images.
Results from testing PSEA visual characteristics separately can be used to generate optimal PSEA, which can improve participants’ experiences during meal recalls.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of both CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
The Lung Cam expanded stratigraphic succession in Vietnam is correlated herein to the Meishan D section in China, the GSSP for the Permian–Triassic boundary. The first appearance datum of the conodont Hindeodus parvus at Meishan defines the Permian–Triassic boundary, and using published graphic correlation, the Permian–Triassic boundary level has been projected into the Lung Cam section. Using time-series analysis of magnetic susceptibility (χ) data, it is determined that H. parvus arrived at Lung Cam ∼18 kyr before the Permian–Triassic boundary. Data indicate that the Lung Cam section is expanded by ∼90% relative to the GSSP section at Meishan. Given the expanded Lung Cam section, it is possible to resolve the timing of significant events during the Permian–Triassic transition with high precision. These events include major stepped extinctions, beginning at ∼135 kyr and ending at ∼110 kyr below the Permian–Triassic boundary, with a duration of ∼25 kyr, followed by deposition of the Lung Cam ash Bed +13, which is equivalent to Meishan Beds 25 and 26, at ∼100 kyr before the Permian–Triassic boundary. Siberian Traps volcanism is graphically correlated to a precession time-series model, placing the onset of this major volcanic event at ∼242 kyr before the Permian–Triassic boundary. In addition, the elemental geochemical, carbon and oxygen isotope stratigraphy, magnetic susceptibility and magnetostratigraphy datasets from Lung Cam allow good correlation to other Permian–Triassic boundary successions. These datasets are helpful when the conodont biostratigraphy is poorly known in sections with problems such as lithofacies variability, or is undefined, owing possibly to lithofacies exclusions, anoxia or other reasons. The Lung Pu Permian–Triassic boundary section, ∼45 km from Lung Cam, is used to test these approaches.
Available twin-family data on sex differences in antisocial behavior (ASB) simultaneously suggest that ASB is far more prevalent in males than in females, and that its etiology (i.e. the effects of genes, environments, hormones, culture) does not differ across sex. This duality presents a conundrum: How do we make sense of mean sex differences in ASB if not via differences in genes, environments, hormones, and/or cultures? The current selective review and critique explores possible contributions to these seemingly incompatible sets of findings. We asked whether the presence of sex differences in behavior could be smaller than is typically assumed, or confined to a specific set of behaviors. We also asked whether there might be undetected differences in etiology across sex in twin-family studies. We found little evidence that bias or measurement invariance across sex account for phenotypic sex differences in ASB, but we did identify some key limitations to current twin-family approaches. These included the questionable ability of qualitative sex difference analyses to detect gender norms and prenatal exposure to testosterone, and concerns regarding specific analytic components of quantitative sex difference analyses. We conclude that the male preponderance in ASB is likely to reflect a true sex difference in observed behavior. It was less clear, however, that the genetic and environmental contributions to ASB are indeed identical across sex, as argued by prior twin-family studies. It is our hope that this review will inspire the development of new, genetically-informed methods for studying sex differences in etiology.
We argue that understanding of autism can be strengthened by increasing involvement of autistic individuals as researchers and by exploring cascading impacts of early sensory, perceptual, attentional, and motor atypicalities on social and communicative developmental trajectories. Participatory action research that includes diverse participants or researchers may help combat stigma while expanding research foci to better address autistic people's needs.
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying that no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of an RI threshold of ⩾100 days minimized time-to-detection while maximizing positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
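The daily-analysis evaluation above treats each day as signal/no-signal versus outbreak-day/non-outbreak-day, so the four reported metrics follow from a 2×2 confusion matrix. A minimal sketch (the daily counts below are hypothetical, not the study's):

```python
def detection_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from daily
    outbreak-detection counts: tp = signal on an outbreak day,
    fp = signal on a non-outbreak day, etc."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical daily counts for one simulated outbreak at one RI threshold
m = detection_metrics(tp=20, fp=5, tn=330, fn=10)
```

Raising the RI threshold converts some signal days to non-signal days (tp and fp fall, fn and tn rise), which reproduces the reported pattern: sensitivity and NPV decrease while PPV and specificity increase.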
Introduction: Take Home Naloxone (THN) programs prevent death from opioid poisoning by training laypersons to recognize an overdose and administer naloxone. Dispensing THN through the emergency department (ED) is particularly critical because an ED visit for opioid poisoning strongly predicts future mortality. Many EDs have implemented THN programs, yet almost no literature examines the reach of such initiatives. To address this gap, we conducted a chart review of all patients presenting for opioid poisoning to an urban tertiary hospital with a large ED-based THN program. This exploratory study hypothesized that more than 50% of ED patients presenting for opioid poisoning would be offered a THN kit. Methods: Data on demographics, clinical characteristics, and THN kit dispensing were extracted and analyzed from the charts of all ED patients presenting with a primary diagnosis of opioid poisoning between April 1, 2016 and April 30, 2017. Logistic regression analyzed predictors of being offered a THN kit. Results: A total of 347 ED visits for 301 unique patients occurred during the study period. The mean age ± SD of patients was 38 ± 14 years, and 69% were male. In 49% of ED visits, a THN kit was offered; 73% of these episodes had a THN kit dispensation. Patients who were male (AOR=1.94; 95% CI 1.11–3.40), and reported that their overdose was unintentional (AOR=2.95; 95% CI 1.04–8.35) and caused by illegal opioids (AOR=4.73; 95% CI 2.63–8.52) were significantly more likely to be offered a THN kit. Conclusion: ED-based THN programs have the potential to reach significant proportions of patients at high risk of mortality. However, these programs may have differential reach within the target population. Further research is needed to examine barriers and facilitators to offering all eligible ED patients a THN kit.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
Water cultures were significantly more sensitive than concurrently collected swab cultures (n=2,147 each) in detecting Legionella pneumophila within a Veterans Affairs healthcare system. Sensitivity for water versus swab cultures was 90% versus 30% overall, 83% versus 48% during a nosocomial Legionnaires’ disease outbreak, and 93% versus 22% post outbreak.
As part of further investigations into three linked haemorrhagic fever with renal syndrome (HFRS) cases in Wales and England, 21 rats from a breeding colony in Cherwell, and three rats from a household in Cheltenham were screened for hantavirus. Hantavirus RNA was detected in either the lungs and/or kidney of 17/21 (81%) of the Cherwell rats tested, higher than previously detected by blood testing alone (7/21, 33%), and in the kidneys of all three Cheltenham rats. The partial L gene sequences obtained from 10 of the Cherwell rats and the three Cheltenham rats were identical to each other and the previously reported UK Cherwell strain. Seoul hantavirus (SEOV) RNA was detected in the heart, kidney, lung, salivary gland and spleen (but not in the liver) of an individual rat from the Cherwell colony suspected of being the source of SEOV. Serum from 20/20 of the Cherwell rats and two associated HFRS cases had high levels of SEOV-specific antibodies (by virus neutralisation). The high prevalence of SEOV in both sites and the moderately severe disease in the pet rat owners suggest that SEOV in pet rats poses a greater public health risk than previously considered.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
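The pooled analysis above regresses years of education on zygosity; with a single MZ indicator and no covariates, the OLS slope is simply the MZ-DZ difference in mean education years. A minimal sketch under that simplification (the data below are hypothetical, not cohort data, and the published models also adjust for birth cohort and sex):

```python
import numpy as np

def zygosity_effect(edu_years, is_mz):
    """OLS regression of education years on an MZ indicator
    (1 = MZ, 0 = DZ). With no other covariates, the slope equals
    the MZ-DZ difference in mean years of education."""
    x = np.column_stack([np.ones(len(edu_years)), np.asarray(is_mz, float)])
    coef, *_ = np.linalg.lstsq(x, np.asarray(edu_years, float), rcond=None)
    return coef[1]  # MZ-DZ difference in years

# Hypothetical data: MZ mean 12.3 years, DZ mean 12.0 years
edu = [12.0, 12.6, 12.3, 11.8, 12.2, 12.0]
mz = [1, 1, 1, 0, 0, 0]
diff = zygosity_effect(edu, mz)
```

Here `diff` recovers the 0.3-year gap built into the illustrative data, on the same scale as the 0.26- and 0.17-year MZ-DZ differences the study reports for males and females.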