England has recently started a new paediatric influenza vaccine programme using a live-attenuated influenza vaccine (LAIV). There is uncertainty over how well the vaccine protects against more severe end-points. A test-negative case–control study was used to estimate vaccine effectiveness (VE) against laboratory-confirmed influenza hospitalisation in vaccine-eligible children aged 2–16 years in England in the 2015–2016 season, using a national sentinel laboratory surveillance system. Logistic regression was used to estimate VE with adjustment for sex, risk group, age group, region, ethnicity, deprivation and month of sample collection. A total of 977 individuals were included in the study (348 cases and 629 controls). The overall adjusted VE for all study ages and vaccine types was 33.4% (95% confidence interval (CI) 2.3–54.6). Risk group was shown to be an important confounder. The adjusted VE for all influenza types was 41.9% (95% CI 7.3–63.6) for the live-attenuated vaccine and 28.8% (95% CI −31.1 to 61.3) for the inactivated vaccine. The study provides evidence of the effectiveness of influenza vaccination in preventing hospitalisation due to laboratory-confirmed influenza in children in 2015–2016 and continues to support the rollout of the LAIV childhood programme.
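As an aside, the way VE falls out of a test-negative design can be illustrated in a few lines: VE = (1 − OR) × 100, where the odds ratio compares the odds of vaccination between test-positive cases and test-negative controls (the full analysis additionally adjusts for confounders via logistic regression). The counts below are hypothetical illustrations, not the study's data.

```python
# Sketch of VE estimation in a test-negative design.
# VE = (1 - OR) * 100, where OR is the odds ratio of vaccination
# comparing test-positive cases with test-negative controls.
# All counts below are made up for illustration.

def odds_ratio(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """Crude odds ratio of vaccination, cases vs. controls."""
    return (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)

def vaccine_effectiveness(or_estimate):
    """VE (%) from an odds ratio, as used in test-negative designs."""
    return (1.0 - or_estimate) * 100.0

# Hypothetical 2x2 table: 60/288 cases vaccinated, 150/479 controls vaccinated.
or_hat = odds_ratio(60, 288, 150, 479)
ve = vaccine_effectiveness(or_hat)
print(round(ve, 1))
```

In practice the crude OR is replaced by the adjusted OR from the fitted logistic model, but the VE transformation is the same.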
Decreases in cognitive function related to increases in oxidative stress and inflammation occur with ageing. Acknowledging the free radical-quenching activity and anti-inflammatory action of the carotenoid lycopene, the aim of the present review was to assess whether there is evidence for a protective relationship between lycopene and maintained cognitive function, or between lycopene and the development or progression of dementia. A systematic literature search identified five cross-sectional and five longitudinal studies examining these outcomes in relation to circulating or dietary lycopene. Among four studies evaluating relationships between lycopene and maintained cognition, three reported significant positive relationships. Neither of the two studies reporting on the relationship between lycopene and the development of dementia reported significant results. Of four studies investigating circulating lycopene and pre-existing dementia, only one reported significant associations between lower circulating lycopene and higher rates of Alzheimer's disease mortality. Acknowledging heterogeneity among studies, there is insufficient evidence and a paucity of data to draw firm conclusions or tease apart direct effects of lycopene. Nevertheless, as low circulating lycopene is a predictor of all-cause mortality, further investigation into its relationship with cognitive longevity and dementia-related mortality is warranted.
Introduction: Little is known about the variety of roles volunteers play in the emergency department (ED), and the potential impact they have on patient experience. The objective of this scoping review was to identify published and unpublished reports that described volunteer programs in EDs, and determine how these programs impacted patient experiences or outcomes. Methods: Electronic searches of Medline, EMBASE, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews and CINAHL were conducted and reference lists were hand-searched. A grey literature search was also conducted (Web of Science, ProQuest, Canadian Business and Current Affairs Database, ProQuest Dissertations and Theses Global). Two reviewers independently screened titles and abstracts, reviewed full text articles, and extracted data. Results: The search strategy yielded 4,589 potentially relevant citations. After eliminating duplicate citations and articles that did not meet eligibility criteria, 87 reports were included in the review. Of the included reports, 18 were peer-reviewed articles, 6 were conference proceedings, 59 were magazine or newspaper articles, and 4 were graduate dissertations or theses. Volunteer activities were categorized as non-clinical tasks (e.g., provision of meals/snacks, comfort items and mobility assistance), navigation, emotional support/communication, and administrative duties. Fifty-two (59.8%) programs had general volunteers in the ED and 35 (40.2%) had volunteers targeting a specific patient population, including pediatrics, geriatrics, patients with mental health and addiction issues and other vulnerable populations. Twenty (23.0%) programs included an evaluative component describing how ED volunteers affected patient experiences and outcomes.
Patient satisfaction, follow-up and referral rates, ED and hospital costs and length of stay, subsequent ED visits, medical complications, and malnutrition in the hospital were all reported to be positively affected by volunteers in the ED. Conclusion: This scoping review demonstrates the important role volunteers play in enhancing patient and caregiver experience in the ED. Future volunteer engagement programs implemented in the ED should be formally described and evaluated to share their success and experience with others interested in implementing similar programs in the ED.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
Approximately 70% of the 30 000 known bee (Hymenoptera) species and most flower-visiting, solitary wasps (Hymenoptera) nest in the ground. However, nesting behaviours of most ground-nesting bees and wasps are poorly understood. Habitat loss, including nesting habitat, threatens populations of ground-nesting bees and wasps. Most ground-nesting bee and wasp studies implement trapping methods that capture foraging individuals, but provide little insight into the nesting preferences of these taxa. Some researchers have suggested that emergence traps may provide a suitable means by which to determine ground-nesting bee and wasp abundance. We sought to evaluate nest-site selection of ground-nesting bees and wasps using emergence traps in two study systems: (1) planted wildflower enhancement plots and fallow control plots in agricultural land; and (2) upland pine and hammock habitat in forests. Over the course of three years (2015–2017), we collected 306 ground-nesting bees and wasps across all study sites from emergence traps. In one study, we compared captures per trap between coloured pan traps and emergence traps and found that coloured pan traps captured far more ground-nesting bees and wasps than did emergence traps. Based on our emergence trap data, our results also suggest ground-nesting bees and wasps are more apt to nest within wildflower enhancement plots than in fallow control plots, and in upland pine habitats than in hammock forests. In conclusion, emergence traps have potential to be a unique tool to gain understanding of ground-nesting bee and wasp habitat requirements.
In 1995, Robert Ambrogi, former columnist for Legal Technology News, wrote about the Internet's potential to revolutionize the accessibility and delivery of legal information. Almost 25 years later, Ambrogi now describes his initial optimism as a “pipe dream.” Perhaps one of the greatest problems facing the legal industry today is the sheer inaccessibility of legal information. Not only does this inaccessibility prevent millions of Americans from obtaining reliable legal information, but it also prevents many attorneys from adequately providing legal services to their clients. Whether locked behind government paywalls or corporate cash registers, legal information is simply not efficiently and affordably attainable through traditional means.
Unfavourable dietary habits, such as skipping breakfast, are common among ethnic minority children and may contribute to inequalities in cardiometabolic disease. We conducted a longitudinal follow-up of a subsample of the UK multi-ethnic Determinants of Adolescent Social well-being and Health cohort, which represents the main UK ethnic groups and is now aged 21–23 years. We aimed to describe longitudinal patterns of dietary intake and investigate their impact on cardiometabolic risk in young adulthood. Participants completed a dietary behaviour questionnaire and a 24 h dietary intake recall; anthropometry, blood pressure, total cholesterol and HDL-cholesterol and HbA1c were measured. The cohort consisted of 107 White British, 102 Black Caribbean, 132 Black African, 98 Indian, 111 Bangladeshi/Pakistani and 115 other/mixed ethnicity. Unhealthful dietary behaviours such as skipping breakfast and low intake of fruits and vegetables were common (56, 57 and 63 %, respectively). Rates of skipping breakfast and low fruit and vegetable consumption were highest among Black African and Black Caribbean participants. BMI and cholesterol levels at 21–23 years were higher among those who regularly skipped breakfast at 11–13 years (BMI 1·41 (95 % CI 0·57, 2·26), P=0·001; cholesterol 0·15 (95 % CI –0·01, 0·31), P=0·063) and at 21–23 years (BMI 1·05 (95 % CI 0·22, 1·89), P=0·014; cholesterol 0·22 (95 % CI 0·06, 0·37), P=0·007). Childhood breakfast skipping is more common in certain ethnic groups and is associated with cardiometabolic risk factors in young adulthood. Our findings highlight the importance of targeting interventions to improve dietary behaviours such as breakfast consumption at specific population groups.
Flexible piezoelectric generators (PEGs) present a unique opportunity for renewable and sustainable energy harvesting. Here, we present a low-temperature and low-energy deposition method using solvent evaporation-assisted three-dimensional printing to deposit up to 19 structured layers of electroactive poly(vinylidene fluoride-trifluoroethylene) (PVDF-TrFE). Visible-wavelength transmittance was above 92%, while ATR-FTIR spectroscopy showed little change in the electroactive phase fraction between layer depositions. Electroactivity measurements from the fabricated PVDF-TrFE PEGs showed that a single structured layer gave the greatest output, at 289.3 mV peak-to-peak voltage. This was proposed to be due to shear-induced polarization affording the alignment of the fluoropolymer dipoles without an electric field or high temperature.
Two Category 5 storms, Hurricane Irma and Hurricane Maria, hit the U.S. Virgin Islands (USVI) within 13 days of each other in September 2017. These storms caused catastrophic damage across the territory, including widespread loss of power, destruction of homes, and devastation of critical infrastructure. During large scale disasters such as Hurricanes Irma and Maria, public health surveillance is an important tool to track emerging illnesses and injuries, identify at-risk populations, and assess the effectiveness of response efforts. The USVI Department of Health (DoH) partnered with shelter staff volunteers to monitor the health of the sheltered population and help guide response efforts.
Shelter volunteers collect data on the American Red Cross Aggregate Morbidity Report form that tallies the number of client visits at a shelter’s health services every 24 hours. Morbidity data were collected at all 5 shelters on St. Thomas and St. Croix between September and October 2017. This article describes the health surveillance data collected in response to Hurricanes Irma and Maria.
Following Hurricanes Irma and Maria, 1130 health-related client visits were reported, accounting for 1655 reasons for the visits (each client may have more than 1 reason for a single visit). Only 1 shelter reported data daily. Over half of visits (51.2%) were for health care management; 17.7% for acute illnesses, which include respiratory conditions, gastrointestinal symptoms, and pain; 14.6% for exacerbation of chronic disease; 9.8% for mental health; and 6.7% for injury. Shelter volunteers treated many clients within the shelters; however, reporting of the disposition (eg, referred to physician, pharmacist) was often missed (78.1%).
Shelter surveillance is an efficient means of quickly identifying and characterizing health issues and concerns in sheltered populations following disasters, allowing for the development of evidence-based strategies to address identified needs. When incorporated into broader surveillance strategies using multiple data sources, shelter data can enable disaster epidemiologists to paint a more comprehensive picture of community health, thereby planning and responding to health issues both within and outside of shelters. The findings from this report illustrated that managing chronic conditions presented a more notable resource demand than acute injuries and illnesses. Although there remains room for improvement because reporting was inconsistent throughout the response, the capacity of shelter staff to address the health needs of shelter residents and the ability to monitor the health needs in the sheltered population were critical resources for the USVI DoH overwhelmed by the disaster. (Disaster Med Public Health Preparedness. 2019;13:38-43)
Two Category 5 storms hit the US Virgin Islands (USVI) within 13 days of each other in September 2017, causing an almost complete loss of power and devastating critical infrastructure such as hospitals and airports.
The USVI Department of Health conducted 2 response Community Assessments for Public Health Emergency Response (CASPERs) in November 2017 and a recovery CASPER in February 2018. CASPER is a 2-stage cluster sampling method designed to provide household-based information about a community’s needs in a timely, inexpensive, and representative manner.
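The two-stage cluster design behind CASPER can be sketched in a few lines of code: clusters (e.g., census blocks) are sampled with probability proportional to their household counts, then a fixed number of households is sampled within each selected cluster (the standard CASPER design uses 30 clusters of 7 households each). The cluster frame and parameters below are invented for illustration.

```python
# Sketch of CASPER-style two-stage cluster sampling.
# Stage 1: probability-proportional-to-size (PPS) selection of clusters.
# Stage 2: simple random sample of households within each chosen cluster.
# All cluster/household data here are invented.
import random

def two_stage_sample(clusters, n_clusters, households_per_cluster, seed=0):
    """clusters: dict mapping cluster id -> list of household ids.
    Returns a list of (cluster_id, sampled_households) pairs."""
    rng = random.Random(seed)
    ids = list(clusters)
    # Weight each cluster by its number of households (PPS, with replacement).
    weights = [len(clusters[c]) for c in ids]
    chosen = rng.choices(ids, weights=weights, k=n_clusters)
    return [(c, rng.sample(clusters[c], households_per_cluster)) for c in chosen]

# Toy frame: three census blocks with their households.
clusters = {
    "block_A": ["A1", "A2", "A3", "A4"],
    "block_B": ["B1", "B2"],
    "block_C": ["C1", "C2", "C3", "C4", "C5", "C6"],
}
sample = two_stage_sample(clusters, n_clusters=2, households_per_cluster=2, seed=7)
```

A real CASPER additionally weights the analysis by the inverse probability of selection so that results are representative of the whole community.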
Almost 70% of homes were damaged or destroyed, 81.2% of homes still needed repair, and 10.4% of respondents felt their home was unsafe to live in approximately 5 months after the storms. Eighteen percent of individual respondents indicated that their mental health was “not good” for 14 or more days in the past month, a significant increase from 2016.
The CASPERs helped characterize the status and needs of residents after the devastating hurricanes and illustrate the evolving needs of the community and the progression of the recovery process. CASPER findings were shared with response and recovery partners to promote data-driven recovery efforts, improve the efficiency of the current response and recovery efforts, and strengthen emergency preparedness in USVI. (Disaster Med Public Health Preparedness. 2019;13:53-62)
Effective communication is a critical part of managing an emergency. During an emergency, the ways in which health agencies normally communicate warnings may not reach all of the intended audience. Not all communities are the same, and households within communities are diverse. Because different communities prefer different communication methods, community leaders and emergency planners need to know their communities’ preferred methods for seeking information about an emergency. This descriptive report explores findings from previous community assessments that have collected information on communication preferences, including television (TV), social media, and word-of-mouth (WoM) delivery methods. Data were analyzed from 12 Community Assessments for Public Health Emergency Response (CASPERs) conducted from 2014-2017 that included questions regarding primary and trusted communication sources. A CASPER is a rapid needs assessment designed to gather household-based information from a community. In 75.0% of the CASPERs, households reported TV as their primary source of information for specific emergency events (range = 24.0%-83.1%). Households reporting social media as their primary source of information differed widely across CASPERs (3.2%-41.8%). In five of the CASPERs, nearly one-half of households reported WoM as their primary source of information. These CASPERs were conducted in response to a specific emergency (ie, chemical spill, harmful algal bloom, hurricane, and flood). The CASPERs conducted as part of a preparedness activity had lower percentages of households reporting WoM as their primary source of information (8.3%-10.4%). The findings in this report demonstrate the need for emergency plans to include hybrid communication models, combining traditional methods with newer technologies to reach the broadest audience. 
Although TV was the most commonly reported preferred source of information, segments of the population relied on social media and WoM messaging. By using multiple methods for risk communication, emergency planners are more likely to reach the whole community and engage vulnerable populations that might not have access to, trust in, or understanding of traditional news sources. Multiple communication channels that include user-generated content, such as social media and WoM, can increase the timeliness of messaging and provide community members with message confirmation from sources they trust encouraging them to take protective public health actions.
Wolkin AF, Schnall AH, Nakata NK, Ellis EM. Getting the Message Out: Social Media and Word-of-Mouth as Effective Communication Methods during Emergencies. Prehosp Disaster Med. 2019;34(1):89–94.
Giardiasis is one of the most important non-viral causes of human diarrhoea. Yet, little is known about the epidemiology of giardiasis in the context of developed countries such as Australia, and there is limited information about local sources of exposure to inform prevention strategies in New South Wales. This study aimed to (1) describe the epidemiology of giardiasis and (2) identify potential modifiable risk factors associated with giardiasis that are unique to south-western Sydney, Australia. A 1:2 matched case-control study of 190 confirmed giardiasis cases notified to the South-Western Local Health District Public Health Unit from January to December 2016 was employed to investigate the risk factors for giardiasis. Two groups of controls were selected to increase response rate: pertussis cases and neighbourhood (NBH) controls. A matched analysis was carried out for both control groups separately. Variables with a significant odds ratio (OR) in the univariate analysis were placed into a multivariable regression for each matched group, respectively. In the regression model with the NBH controls, age and sex were controlled as potential confounders. Identified risk factors included being under 5 years of age (aOR = 7.08; 95% confidence intervals (CI) 1.02–49.36), having a household member diagnosed with a gastrointestinal illness (aOR = 15.89; 95% CI 1.53–164.60) and having contact with farm animals, domestic animals or wildlife (aOR = 3.03; 95% CI 1.08–8.54). Cases that travelled overseas were at increased risk of infection (aOR = 19.89; 95% CI 2.00–197.37) when compared with pertussis cases. This study provides an update on the epidemiology and associated risk factors of a neglected tropical disease, which can inform enhanced surveillance and prevention strategies in developed metropolitan areas.
Little is known about what motivates people to enroll in research registries. The purpose of this study is to identify facilitators of registry enrollment among diverse older adults.
Participants completed an 18-item Research Interest Assessment Tool. We used logistic regression analyses to examine responses across participants and by race and gender.
Participants (N=374) were 58% black, 76% women, with a mean age of 68.2 years. All participants were motivated to maintain their memory while aging. Facilitators of registry enrollment varied by both race and gender. Notably, blacks (estimate=0.71, p<0.0001) and women (estimate=0.32, p=0.03) were more willing to enroll in the registry due to home visits compared with whites and men, respectively.
Researchers must consider participant desire for maintaining memory while aging and home visits when designing culturally tailored registries.
Decades of fetal programming research indicates that we may be able to map the origins of many physical, psychological, and medical variations and morbidities before the birth of the child. While great strides have been made in identifying associations between prenatal insults, such as undernutrition or psychosocial stress, and negative developmental outcomes, far less is known about how adaptive responses to adversity regulate the developing phenotype to match stressful conditions. As the application of epigenetic methods to human behavior has exploded in the last decade, research has begun to shed light on the role of epigenetic mechanisms in explaining how prenatal conditions shape later susceptibilities to mental and physical health problems. In this review, we describe and attempt to integrate two dominant fetal programming models: the cumulative stress model (a disease-focused approach) and the match–mismatch model (an evolutionary–developmental approach). In conjunction with biological sensitivity to context theory, we employ these two models to generate new hypotheses regarding epigenetic mechanisms through which prenatal and postnatal experiences program child stress reactivity and, in turn, promote development of adaptive versus maladaptive phenotypic outcomes. We conclude by outlining priority questions and future directions for the fetal programming field.
Objectives: The objective of this study was to evaluate the feasibility and implementation of a standardized medically supervised concussion protocol established between a city-wide AAA hockey league and a multi-disciplinary concussion program. Methods: We conducted a retrospective review of injury surveillance, clinical and healthcare utilization data from all athletes evaluated and managed through the Winnipeg AAA Hockey concussion protocol during the 2016-2017 season. We also conducted post-season email surveys of head coaches and parents responsible for athletes who competed in the same season. Results: During the 2016-2017 season, 28 athletes were evaluated through the medically supervised concussion protocol, with two athletes undergoing evaluation for repeat injuries (a total of 30 suspected injuries and consultations). In all, 96.7% of the athletes managed through the concussion protocol were captured by the league-designated Concussion Protocol Coordinator and 100% of eligible athletes underwent complete medical follow-up and clearance to return to full hockey activities. Although 90% of responding head coaches and 91% of parents were aware of the concussion protocol, survey results suggest that some athletes who sustained suspected concussions were not managed through the protocol. Head coaches and parents also indicated that athlete education and communication between medical and sport stakeholders were other elements of the concussion protocol that could be improved. Conclusion: Successful implementation of a medically supervised concussion protocol for youth hockey requires clear communication between sport stakeholders and timely access to multi-disciplinary experts in traumatic brain and spine injuries. Standardized concussion protocols for youth sports may benefit from periodic evaluations by sport stakeholders and incorporation of national guideline best practices and resources.
Background: Heterozygous loss-of-function mutations in the synaptic scaffolding gene SHANK2 are strongly associated with autism spectrum disorder (ASD). However, their impact on the function of human neurons is unknown. Derivation of induced pluripotent stem cells (iPSC) from affected individuals permits generation of live neurons to answer this question. Methods: We generated iPSCs by reprogramming dermal fibroblasts of neurotypic and ASD-affected donors. To isolate the effect of SHANK2, we used CRISPR/Cas9 to knock out SHANK2 in control iPSCs and correct a heterozygous nonsense mutation in ASD-affected donor iPSCs. We then derived cortical neurons from SOX1+ neural precursor cells differentiated from these iPSCs. Using a novel assay that overcomes line-to-line variability, we compared neuronal morphology, total synapse number, and electrophysiological properties between SHANK2 mutants and controls. Results: Relative to controls, SHANK2 mutant neurons have increased dendrite complexity, dendrite length, total synapse number (1.5-2-fold), and spontaneous excitatory postsynaptic current (sEPSC) frequency (3-7.6-fold). Conclusions: ASD-associated heterozygous loss-of-function mutations in SHANK2 increase synaptic connectivity among human neurons by increasing synapse number and sEPSC frequency. This is partially supported by increased dendrite length and complexity, providing evidence that SHANK2 functions as a suppressor of dendrite branching during neurodevelopment.
Significant increases in excess all-cause mortality, particularly in the elderly, were observed during the winter of 2014/15 in England. With influenza A(H3N2) the dominant circulating influenza A subtype, this paper determines the contribution of influenza to this excess, controlling for weather. A standardised multivariable Poisson regression model was employed with weekly all-cause deaths as the dependent variable for the period 2008–2015. Adjusting for extreme temperature, a total of 26 542 (95% CI 25 301–27 804) deaths in those aged 65+ and 1942 (95% CI 1834–2052) in 15–64-year-olds were associated with influenza from week 40, 2014 to week 20, 2015. This is compatible with the circulation of influenza A(H3N2). It is the largest estimated number of influenza-related deaths in England since at least 2008/09. The findings highlight the potential health impact of influenza and the important role of the annual influenza vaccination programme that is required to protect the population including the elderly, who are vulnerable to a severe outcome.
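The attribution step in such regression models can be sketched simply: the fitted model is evaluated twice, once with the influenza-activity covariate at its observed values and once with it set to zero (temperature terms kept in both), and the weekly differences are summed to give influenza-attributable deaths. The weekly figures below are invented for illustration, not the study's estimates.

```python
# Sketch of extracting influenza-attributable deaths from a weekly
# regression model: sum the differences between model predictions with
# the influenza term included vs. set to zero. Numbers are made up.

def attributable_deaths(predicted_full, predicted_no_flu):
    """Sum of weekly differences between predictions with and without
    the influenza-activity term (other covariates held fixed)."""
    return sum(full - base for full, base in zip(predicted_full, predicted_no_flu))

# Hypothetical weekly predictions for a short run of winter weeks:
with_flu = [10450, 10980, 11620, 11300, 10710]
without_flu = [10020, 10110, 10240, 10200, 10090]
print(attributable_deaths(with_flu, without_flu))
```

Confidence intervals around such totals are typically obtained from the uncertainty of the fitted regression coefficients rather than from the weekly sums themselves.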
Introduction: Our study objectives were to assess the acceptability of using the emergency department (ED) waiting room to provide knowledge on, and offer opportunities for, organ and tissue donor registration; and to identify barriers to the donor registration process in Ontario. Methods: We conducted a paper-based in-person survey over nine days in eight-hour blocks in March and April 2017. The survey instrument was created in English using existing literature and expert opinion, pilot tested and then translated into French. The study collected data from patients and visitors in an urban academic Canadian tertiary care ED waiting room. All adults in the waiting room were approached to participate during the study periods. Individuals waiting in clinical care areas were excluded, as well as those who required immediate treatment. Results: The number of attempted surveys was 324; 67 individuals (20.7%) refused to participate. A total of 257 surveys were distributed and five were returned blank. This gave us a response rate of 77.8% with 252 completed surveys. The median age group was 51-60 years, and 55.9% of respondents were female. Forty-six percent were Christian and 34.1% did not declare a religious affiliation. Nearly half of participants (44.1%) were registered organ donors. The majority of participants agreed or were neutral (83.3%) that the ED waiting room was an acceptable place to provide information on organ and tissue donation. Further, 82.1% agreed or were neutral that the ED was an acceptable place to register as an organ donor. Nearly half (47.2%) agreed that they would consider registering while in the ED waiting room. A number of barriers to registering as an organ and tissue donor were identified. The most common were: not knowing how to register (22.0%), a lack of time to register (21.1%), and having unanswered questions regarding organ and tissue donation (18.7%).
Conclusion: Individuals waiting in the ED are supportive of using the ED waiting room for distributing information regarding organ and tissue donation, and facilitating organ and tissue donation registration. Developing such a practice could help to reduce some of the identified barriers, including a lack of time and having unanswered questions regarding donation.
Recent evidence suggests that exercise plays a role in cognition and that the posterior cingulate cortex (PCC) can be divided into dorsal and ventral subregions based on distinct connectivity patterns.
To examine the effect of physical activity and division of the PCC on brain functional connectivity measures in subjective memory complainers (SMC) carrying the ɛ4 allele of apolipoprotein E (APOE ɛ4).
Participants were 22 SMC carrying the APOE ɛ4 allele (ɛ4+; mean age 72.18 years) and 58 SMC non-carriers (ɛ4–; mean age 72.79 years). Connectivity of four dorsal and ventral seeds was examined. Relationships between PCC connectivity and physical activity measures were explored.
ɛ4+ individuals showed increased connectivity between the dorsal PCC and dorsolateral prefrontal cortex, and the ventral PCC and supplementary motor area (SMA). Greater levels of physical activity correlated with the magnitude of ventral PCC–SMA connectivity.
The results provide the first evidence that ɛ4+ individuals at increased risk of cognitive decline show distinct alterations in dorsal and ventral PCC functional connectivity.
As grazing ruminants rely almost entirely on mastication to disrupt plant tissues, a series of processes (mastication, bolus formation and ingestion) will impact on the viability and number of cells that remain intact, and consequently alive, after ingestion (Kingston-Smith and Theodorou, 2000). Preliminary work in our group has shown substantial variation in the degree of cell damage during mastication and ingestion between grass species, resulting in differences in the rate of release of cell contents (protein, sugars and lipids) into the rumen (E.J. Kim, unpublished). These differences may affect nutrient utilisation by ruminal micro-organisms. The aim of this study was to compare the extent of nutrient release from three contrasting grass species following ingestion of the fresh forage by dairy cows.