The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and more recent anthropogenic climate change. Inter- and intra-specific species interactions along sharp local environmental gradients shape distributions and community structure and hence ecosystem functioning. Shifts in domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, owing to greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, hence affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
Introduction: Low acuity patients have been controversially tagged as a source of emergency department (ED) misuse. Authorities in many Canadian health regions have set up policies so that these patients preferentially present to walk-in clinics (WIC). We compared the cost and quality of the care given to low acuity patients in an academic ED and a WIC of Québec City during fiscal year 2015-16. Methods: We conducted an ambidirectional (prospective and retrospective) cohort study using a time-driven activity-based costing method. This method uses the duration of care processes (e.g., triage) to allocate to patient care all direct costs (e.g., personnel, consumables), overheads (e.g., building maintenance) and physician charges. We included consecutive adult patients, ambulatory at all times and discharged from the ED or WIC with a diagnosis of upper respiratory tract infection (URTI), urinary tract infection (UTI) or low back pain. Mean cost [95% CI] per patient per condition was compared between settings after risk adjustment for age, sex, vital signs, number of regular medications and co-morbidities using generalized log-gamma regression models. Proportions [95% CI] of antibiotic prescription and chest X-ray use in URTI, compliance with provincial guidelines on use of antibiotics in UTI, and column X-ray use in low back pain were compared between settings using a Pearson chi-square test. Results: A total of 409 patients were included. ED and WIC groups were similar in terms of age, sex and vital signs on presentation, but ED patients had a greater burden of comorbidities. Adjusted mean cost (2016 CAN$) of care was significantly higher in the ED than in the WIC (p < 0.0001) for URTI (78.42 [64.85-94.82] vs. 59.43 [50.43-70.06]), UTI (78.88 [69.53-89.48] vs. 53.29 [43.68-65.03]), and low back pain (87.97 [68.30-113.32] vs. 61.71 [47.90-79.51]). For URTI, antibiotics were more frequently prescribed in the WIC (44.1% [34.3-54.3] vs. 5.8% [1.2-16.0]; p < 0.0001) and chest X-rays more frequently used in the ED (26.9% [15.6-41.0] vs. 13.7% [7.7-22.0]; p = 0.05). No significant differences were observed in compliance with guidelines on use of antibiotics in UTI or in the use of column X-ray in low back pain. Conclusion: The total cost of care for low acuity patients is lower in walk-in clinics than in EDs. However, our results suggest that quality-of-care issues should be considered in determining the best alternate setting for treating ambulatory emergency patients.
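The core of time-driven activity-based costing, as described in the Methods above, is to multiply the measured duration of each care process by a capacity cost rate for the resources involved. The sketch below illustrates that arithmetic only; the process names, dollar amounts, and durations are hypothetical and are not the study's figures.

```python
# Illustrative sketch of time-driven activity-based costing (TDABC).
# All resource costs, capacities, and durations below are hypothetical.

def capacity_cost_rate(resource_cost_per_period, practical_capacity_minutes):
    """Cost per minute of supplying a resource (e.g., a triage nurse)."""
    return resource_cost_per_period / practical_capacity_minutes

def patient_cost(process_minutes, rates):
    """Allocate cost to one patient: sum of duration x rate over care processes."""
    return sum(process_minutes[p] * rates[p] for p in process_minutes)

# Hypothetical example: two care processes for one visit.
rates = {
    "triage": capacity_cost_rate(8000.0, 9600.0),     # $/min, triage nurse
    "physician": capacity_cost_rate(30000.0, 8000.0), # $/min, physician
}
minutes = {"triage": 10.0, "physician": 15.0}
print(round(patient_cost(minutes, rates), 2))  # 64.58
```

In the study itself, overheads and physician charges are folded into the rates in the same way, so per-patient costs scale directly with observed process durations.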
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising numbers of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
From 1565 to 1570, Spain established no fewer than three networks of presidios (fortified military settlements) across portions of its frontier territories in La Florida and New Spain. Juan Pardo's network of six forts, extending from the Atlantic coast over the Appalachian Mountains, was the least successful of these presidio systems, lasting only from late 1566 to early 1568. The failure of Pardo's defensive network has long been attributed to poor planning and an insufficient investment of resources. Yet recent archaeological discoveries at the Berry site in western North Carolina—the location of both the Native American town of Joara and Pardo's first garrison, Fort San Juan—warrant a reappraisal of this interpretation. While previous archaeological research at Berry concentrated on the domestic compound where Pardo's soldiers resided, the location of the fort itself remained unknown. In 2013, the remains of Fort San Juan were finally identified south of the compound, the first of Pardo's interior forts to be discovered by archaeologists. Data from excavations and geophysical surveys suggest that it was a substantial defensive construction. We attribute the failure of Pardo's network to the social geography of the Native South rather than to an insufficient investment of resources.
Anthropologists have become increasingly aware of the importance of population as a factor in a systematic view of human biological and cultural development. This awareness has generated an interest in the field of demography, and consequently, techniques once utilized almost exclusively by demographers are now frequently employed in anthropological studies. Anthropological-paleodemographic inquiry traditionally starts with the excavation of a skeletal population sample. The sample is aged and sexed, and the data are put into a descriptive analytic model, the life table. The life table, through a process of inference, is taken to represent the life processes of a local biological population and often forms the basis for further inference on the relationships between populational and cultural processes (Green and others 1974; Howell-Lee 1971).
Critics have questioned the assumptions underlying life table construction and cited various sources of error in data collection to argue against the use of life tables as a source of inference concerning the biological population. In this paper we attempt to address some of these sources of error. Specifically, we discuss the effects of enumeration errors, population growth, and small population size on life table values. To assess the impact of these errors on the life table values of “anthropological” populations, we make use of computer simulation. We conclude that, once the implications of these factors are understood, the life table can provide a useful model for paleodemographic research.
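The life table referred to above is a simple tabular computation: counts of deaths per age interval are converted into survivorship and probability-of-death columns. A minimal sketch follows, using hypothetical age-at-death counts (not data from the paper) and only the first few standard columns.

```python
# Minimal sketch of an abridged life table built from skeletal
# age-at-death counts, as used in paleodemography. Counts are hypothetical.

def life_table(ages, deaths, radix=100.0):
    """ages: start age of each interval; deaths: deaths per interval (Dx).

    Returns rows of (age, lx, dx, qx): survivors entering the interval,
    deaths in the interval (scaled to the radix), and probability of death.
    """
    n = len(deaths)
    total = sum(deaths)
    dx = [radix * d / total for d in deaths]   # deaths scaled to the radix
    lx = [radix]                               # survivors entering interval x
    for i in range(1, n):
        lx.append(lx[-1] - dx[i - 1])
    qx = [dx[i] / lx[i] if lx[i] else 0.0 for i in range(n)]
    return list(zip(ages, lx, dx, qx))

# Hypothetical skeletal sample: deaths in 20-year intervals.
table = life_table([0, 20, 40, 60], [25, 30, 30, 15])
for age, l, d, q in table:
    print(age, round(l, 1), round(d, 1), round(q, 3))
```

The enumeration and small-sample errors the paper discusses enter through the `deaths` counts: misaged or missing skeletons distort every downstream column, which is why the simulation approach is used to gauge their effect.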
Traumatic stressors during childhood and adolescence are associated with psychopathology, mostly studied in the context of post-traumatic stress disorder (PTSD) and depression. We investigated broader associations of traumatic stress exposure with psychopathology and cognition in a youth community sample.
The Philadelphia Neurodevelopmental Cohort (N = 9498) is an investigation of clinical and neurobehavioral phenotypes in a diverse (56% Caucasian, 33% African American, 11% other) US youth community population (aged 8–21). Participants were ascertained through a children's hospital pediatric (not psychiatric) healthcare network in 2009–2011. Structured psychiatric evaluation included screening for lifetime exposure to traumatic stressors, and a neurocognitive battery was administered.
Exposure rate to traumatic stressful events was high (none, N = 5204; one, N = 2182; two, N = 1092; three or more, N = 830). Higher stress load was associated with increased psychopathology across all clinical domains evaluated: mood/anxiety (standardized β = .378); psychosis spectrum (β = .360); externalizing behaviors (β = .311); and fear (β = .256) (controlling for covariates, all p < 0.001). Associations remained significant controlling for lifetime PTSD and depression. Exposure to high-stress load was robustly associated with suicidal ideation and cannabis use (odds ratio compared with non-exposed 5.3 and 3.2, respectively, both p < 0.001). Among youths who experienced traumatic stress (N = 4104), history of assaultive trauma was associated with greater psychopathology and, in males, vulnerability to psychosis and externalizing symptoms. Stress load was negatively associated with performance on executive functioning, complex reasoning, and social cognition.
Traumatic stress exposure in community non-psychiatric help-seeking youth is substantial, and is associated with more severe psychopathology and neurocognitive deficits across domains, beyond PTSD and depression.
It is now widely recognised that feeding high levels of cereal-based concentrate feeds to horses can precipitate episodes of colic, laminitis and developmental orthopaedic disease. However, when feeding performance horses or fast-growing young stock, particularly Thoroughbreds, traditional feeding regimes still persist whereby animals are fed high-cereal, low-fibre diets. Given current knowledge of the digestibility and availability of crude protein, this feeding regime is not surprising, as previous work by Gibbs et al. (1988, 1996) and Potter et al. (1992) reported that the large intestine was the major site of crude protein degradation in hay. Protein is held within the cell wall matrix in fibre feeds and is fermented by microbes to NH3, which may subsequently be processed in the liver to urea, recycled or excreted via the urine. Moreover, a high proportion of the protein passing through the ileo-caecal junction is excreted in the faeces as intact microbial protein and is therefore unavailable for metabolic purposes in the horse.
The carbohydrate (CHO) fraction of pasture grasses is a major source of energy for many domestic herbivores. However, the amounts and types of the water-soluble carbohydrate (WSC) fraction (i.e. glucose, fructose, sucrose, and polymers of sucrose and fructose, the fructans) present in such grasses vary with species and environmental conditions. As the WSC constitute a highly digestible, energy-yielding fraction of grasses, it is important to be able to measure their levels in a sward so that the diets of pastured animals may be designed to elicit optimal health and productivity. The aim of this study was to characterise the WSC profile of six UK pasture grasses, and to develop a technique for extracting the fructan portion of the WSC.
Six species of UK pasture grasses [Cocksfoot (C), Timothy (T), Meadow Fescue (M), Italian Ryegrass (IR), Perennial Ryegrass (PR) and Hybrid Ryegrass (HR)] were grown in experimental field plots at IGER.
Fibrous foods are major sources of energy and protein for equids. The potentially energy-yielding fraction of dietary fibre consists of non-starch polysaccharides (NSP). NSP cannot be digested by equine enzymes, and for the animal to obtain energy from NSP it must be fermented by the gut microflora to yield volatile fatty acids. This fermentation process is less efficient in terms of yield of ATP than the digestion of starch to glucose, and is generally believed to occur solely in the large intestine. Plant protein may be associated with the NSP or the protoplast. However, horses can only utilize protein which has been digested and absorbed in the small intestine. Thus, knowledge of the site and extent of nutrient degradation is important to enable the accurate formulation of diets for horses. Therefore, in the current study the site and extent of NSP and crude protein (CP) degradation from four fibrous foods commonly given to horses in the United Kingdom were determined in caecally fistulated ponies using mobile bags.
The trial was a 4 × 3 incomplete Latin square design with three caecally fistulated Welsh × pony geldings (ca. 250 kg live weight) and four botanically diverse sources of dietary fibre. The ponies were maintained on a basal diet of hay and grass nuts, and water was available ad libitum. Bags (6 × 1 cm) of monofilament polyester mesh (pore size 4 μm) were filled with 350 mg of either unmolassed sugar-beet pulp (SB), hay cubes (HC), soya hulls (SH) or oat hulls:naked oats (2:1) (OH:NO), which had been ground to pass a 1-mm steel mesh. On two consecutive mornings 20 bags were introduced into the ponies via a naso-gastric tube. Each bag contained two 100 mg steel washers, which enabled their capture by a magnet placed inside the caecal fistula; the cannulae were positioned just posterior to the ileo-caecal junction. Between 10 and 16 bags were recovered on the magnet; the remaining bags were allowed to continue through the hind-gut and were subsequently collected in the faeces.
It has been shown that horses and ponies at pasture usually graze for 15-17 hours per day, and consume between 16 and 33 g dry matter (DM)/kg live weight per day, depending on animal size and physiological status. However, many predominantly stabled horses have restricted access to pasture, often only 1-3 hours/day. There is no information on voluntary food intake (VFI) of horses under such regimens. Therefore the aim of this pilot study was to determine the voluntary intake of fresh herbage by ponies when their access to pasture was restricted.
Little information is available on digestion of the non-starch polysaccharide (NSP) fraction of fibrous feeds in equines. Two studies were conducted which examined the in vivo apparent digestibilities of proximate constituents and NSP in ponies offered diets based on botanically diverse fibrous foodstuffs.
In study 1 (S1), three mature caecally-fistulated Welsh-cross pony geldings (266 kg LW) were used in a 3 × 3 Latin square changeover design experiment consisting of three 21-day periods. Ponies were offered 4 kg dry matter (DM) per day of either unmolassed sugar beet pulp (USBP), hay cubes (HC) or a 2:1 mix of oat hulls:naked oats (OHNO), plus minerals, in two equal meals per day. After completion of S1 the same three ponies were used in study 2 (S2), where they were offered 4 kg DM/day of a 50:50 mix of USBP:HC (USHC), plus minerals, fed as in S1 for a 21-day period.
Estimates of digesta passage through specific segments of the alimentary tract are a vital component of modelling approaches which attempt to quantitatively partition digestive processes in equines. This paper reports results from three studies in which digesta passage of chromium (Cr)-mordanted feeds was determined in the caecum of ponies.
Caecal outflow rates were determined during three in vivo apparent digestibility studies conducted using three caecally-fistulated ponies, as described by Moore-Colyer et al. (1999) for studies 1 and 2 and by McLean et al. (1999) for study 3. Pony basal diets consisted of unmolassed sugar beet pulp (USBP), hay cubes (HC) or a 2:1 mix of oat hulls:naked oats (OHNO) in study 1; a 1:1 mix of USBP:HC (USHC) in study 2; and either 100% HC or one of three diets consisting of a 1:1 HC:barley mix where the barley was either rolled (RBHC), micronised (MBHC) or extruded (EBHC) in study 3.
A wide range of roughage foodstuffs is available for feeding to horses. Considerable variation exists in the quality of these feeds in terms of nutrient composition and freedom from dust, fungal and bacterial contamination and infestation by mites. Many horse-owners are now feeding forages such as haylage and baled silage to horses as opposed to hay. However, limited information is available regarding the effect of chop length on digestibility and thus the feeding value of chopped or long-cut silage to horses. More information on fodder type and preparation could help to provide the basis for improvements in the practical guidelines and recommendations for those involved with the production and feeding of forages to horses.
Feeding horses high levels of cereal starch can result in diet-related azoturia, laminitis and colic, whereas high-fibre, forage-based diets do not generally elicit these conditions. Therefore, it would be advantageous to develop fibrous feeds with increased digestibilities, permitting horses with high energy demands to be sustained on greater forage:cereal starch ratios. High-temperature dried (HT) alfalfa has been fed to horses for a number of years, and it is common practice to combine this with sugar beet pulp (SB), another nutritious fibrous feed for horses. Synergistic effects of SB when added to fibre-based diets have been observed in other species in vivo (Longland et al., 1994), whereby the digestibility of graminaceous feeds has been increased. However, such effects have been little examined in horses fed a leguminous-forage diet. The aim of this study, therefore, was to determine whether SB enhanced the digestibility of alfalfa, a forage legume that is increasingly being fed to equines in the UK.
There is a dearth of information on the effects of donor animal on the fermentative capacity of equine faecal inocula for use in in vitro digestibility determinations. Furthermore, there is little knowledge of the degradation characteristics of feedstuffs incubated with equine faecal inocula. As such, this study aimed to elucidate the effect of donor animal on the fermentation of feedstuffs in vitro and to assess the in vitro degradation characteristics of three commonly fed components of horse diets incubated with equine faecal inocula.
We use the upper 81 m of the record of stable isotopes of water from a 122 m long ice core from Lomonosovfonna, central Spitsbergen, Svalbard, to construct an ice-core chronology and the annual accumulation rates over the icefield. The isotope cycles are counted in the ice-core record using a model that neglects short-wavelength and low-amplitude cycles. We find approximately the same number of δ18O cycles as years between known reference horizons, and assume these cycles represent annual cycles. Testing the validity of this assumption using cycles in δD shows that both records give similar numbers of cycles. Using the δ18O chronology, and decompressing the accumulation records using the Nye flow model, we calculate the annual accumulation for the ice-core site back to AD 1715. We find that the accumulation rate from 1715 to 1950 averaged 0.30 m w.e. Accumulation rates increased by about 25% during the later part of the 20th century, to an average of 0.41 m w.e. for the period 1950–97. The accumulation rates show highly significant 2.1 and 21 year periodicities, which gives credibility to our timescale.
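The "decompression" step mentioned above inverts the thinning that annual layers undergo as they are buried. Under the Nye flow model, a layer deposited with thickness a (ice equivalent) thins to a(1 − z/H) at depth z in an ice column of total thickness H, so the original accumulation is recovered by dividing the measured layer thickness by the thinning factor. The sketch below illustrates that inversion; the depth and layer-thickness values are hypothetical, not the Lomonosovfonna data.

```python
# Sketch of the Nye flow-model correction used to "decompress" measured
# annual layer thicknesses into accumulation rates. Under the Nye model a
# layer deposited with thickness `a` thins to a * (1 - z / H) at depth z
# in an ice column of thickness H. Example values are hypothetical.

def nye_accumulation(layer_thickness, depth, ice_thickness):
    """Invert Nye thinning: original annual accumulation from a measured layer."""
    thinning = 1.0 - depth / ice_thickness
    if thinning <= 0:
        raise ValueError("depth must be less than ice thickness")
    return layer_thickness / thinning

# A 0.20 m layer measured at 61 m depth in a 122 m column (50% thinning):
print(round(nye_accumulation(0.20, 61.0, 122.0), 2))  # 0.4
```

At the surface the correction vanishes (thinning factor 1), and it grows without bound near the bed, which is one reason such reconstructions use only the upper part of a core.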
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group, age: M=42.65 years, SD=8.93; 86% male; education: M=13.17, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (42% vs. 54%), and domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60; 71%) versus Mexican (n=79; 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry.
Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
VLBI observations of 3C 345 at 10.8 GHz and 22.2 GHz show that the position angle of the new component is increasing as it separates from the core. Also, the apparent velocity of the component is increasing. This is the first clear evidence for non-radial motion and acceleration of an individual component in an extragalactic radio source.