This proposed contribution to the special issue of ILWCH offers a theoretical re-consideration of the Liberian project. If, as is commonly supposed in its historiography and across contemporary discourse regarding its fortunes into the twenty-first century, Liberia is a notable, albeit contested, instance of the modern era's correctable violence in that it stands as an imperfect realization of the emancipated slave, the liberated colony, and the freedom to labor unalienated, then such representation continues to hide more than it reveals. This essay, instead, reads Liberia as an instructive leitmotif for the conversion of racial slavery's synecdochical plantation system in the Americas into the plantation of the world writ large: the global scene of antiblackness and the immutable qualification for enslavement accorded black positionality alone. Transitions between political economic systems—from slave trade to “re-colonization,” from Firestone occupation to dictatorial-democratic regimes—reemerge from this re-examination as crucial but inessential to understanding Liberia's position, and thus that of black laboring subjects, in the modern world. I argue that slavery is the simultaneous primitive accumulation of black land and bodies, but that this reality largely escapes current conceptualization of not only the history of labor but also that of enslavement. In other words, the African slave trade (driven first by Arabs in the Indian Ocean region, then Europeans in the Mediterranean, and, subsequently, Euro-Americans in the Atlantic) did not simply leave as its corollary effect, or byproduct, the underdevelopment of African societies. The trade in African flesh was at once the co-production of a geography of desire in which blackness is perpetually fungible at every scale, from the body to the nation-state to its soil—all treasures not simply for violation and exploitation, but more importantly, for accumulation and all manner of usage. 
The Liberian project elucidates this ongoing reality in distinctive ways—especially when we regard it through the lens of the millennium-plus paradigm of African enslavement. Conceptualizing slavery's “afterlife” entails exploring the ways that emancipation extended, not ameliorated, the chattel condition, and as such, impugns the efficacy of key analytic categories like “settler,” “native,” “labor,” and “freedom” when applied to black existence. Marronage, rather than colonization or emancipation, situates Liberia within the intergenerational struggle of, and over, black work against social death. Read as enslavement's conversion, this essay neither impugns nor heralds black action and leadership on the Liberian project at a particular historical moment, but rather agitates for centering black thought on the ongoing issue of black fungibility and social captivity that Liberia exemplifies. I argue that such a reading of Liberia presents a critique of both settler colonialism and of a certain conceptualization of the black radical tradition and its futures in heavily optimist, positivist, and political economic terms that are enjoying considerable favor in leading discourse on black struggle today.
Objectives: Research has shown that analyzing intrusion errors generated on verbal learning and memory measures is helpful for distinguishing between the memory disorders associated with Alzheimer’s disease (AD) and other neurological disorders, including Huntington’s disease (HD). Moreover, preliminary evidence suggests that certain clinical populations may be prone to exhibit different types of intrusion errors. Methods: We examined the prevalence of two new California Verbal Learning Test-3 (CVLT-3) intrusion subtypes – across-trial novel intrusions and across/within trial repeated intrusions – in individuals with AD or HD. We hypothesized that the encoding/storage impairment associated with medial-temporal involvement in AD would result in a greater number of novel intrusions on the delayed recall trials of the CVLT-3, whereas the executive dysfunction associated with subcortical-frontal involvement in HD would result in a greater number of repeated intrusions across trials. Results: The AD group generated significantly more across-trial novel intrusions than across/within trial repeated intrusions on the delayed cued-recall trials, whereas the HD group showed the opposite pattern on the delayed free-recall trials. Conclusions: These new intrusion subtypes, combined with traditional memory analyses (e.g., recall versus recognition performance), promise to enhance our ability to distinguish between the memory disorders associated with primarily medial-temporal versus subcortical-frontal involvement.
Introduction: There is increasing evidence supporting ultrasonography for the determination of optimal chest compression location during cardiac arrest. Radiological studies have demonstrated that in up to one-third of patients the aortic root or outflow tract is being compressed during standard CPR. Out-of-hospital cardiac arrests (OHCA) could benefit from cardiac localization, undertaken with scaled-down ultrasound equipment by which the largest fluid-filled structure in the chest (the heart) is identified to guide optimal compression location. We intend to evaluate 1) the location of the left ventricle in supine patients, 2) the accuracy and precision, and 3) the feasibility and reliability of cardiac localization with a scaled-down ultrasound device (bladder scanner). Methods: We are recruiting men and women over the age of 40. The scanning protocol involves using a bladder scanner on a 15-point grid over the subject's left chest, covering the parasternal, midclavicular, and anterior axillary lines at intercostal spaces 3-7. Detected volumes will be recorded, with the presumption that the intercostal space with the largest measured volume is centered over the heart. Echocardiography will then be used to confirm the bladder scanner's accuracy and to better describe the patient's internal chest anatomy. Having assessed procedural feasibility on 3 pilot subjects, we are now recruiting 100 participants, with a planned interim analysis at 50 participants for sample size reassessment. Maximal volume location frequencies from the echocardiograms will be described and assessed for variation using a goodness-of-fit test. The proportion of agreement between the two modalities regarding the maximal volume location will also be examined. Results: Among the 3 volunteers (pilot study), the scanner identified fluid in 4-8 of 15 intercostal spaces.
In each of the three pilot study patients, the maximal volume identified by the bladder scanner was found to be at the parasternal location of the 6th intercostal space. This was also the location of the mid left ventricular diameter on echocardiography. Conclusion: Our literature review and pilot study data support the premise that lay persons and emergency medical personnel may improve compressions (and thus outcomes) during OHCA by using a scaled-down ultrasound to identify the location of optimal compression. We are currently enrolling patients in our study.
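The planned goodness-of-fit analysis can be sketched as follows. This is a hypothetical illustration only, not the study's analysis code; the counts are invented for demonstration.

```python
# Hypothetical sketch: Pearson goodness-of-fit test for whether
# maximal-volume locations cluster at particular points of the 15-point
# scanning grid, against a uniform null. Counts below are invented.
observed = [0, 1, 0, 2, 5, 30, 8, 2, 1, 1, 0, 0, 0, 0, 0]
n = sum(observed)
expected = n / len(observed)  # uniform-null expectation per grid point

# Pearson chi-square statistic with df = 15 - 1 = 14
chi_sq = sum((o - expected) ** 2 / expected for o in observed)

# 23.68 is the chi-square critical value for df = 14 at alpha = 0.05
clusters = chi_sq > 23.68
```

With counts this concentrated at one grid point, the statistic far exceeds the critical value, i.e. the locations are not uniformly distributed.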
Community-acquired pneumonia (CAP) results in substantial numbers of hospitalisations and deaths in older adults. There are known lifestyle and medical risk factors for pneumococcal disease but the magnitude of the additional risk is not well quantified in Australia. We used a large population-based prospective cohort study of older adults in the state of New South Wales (45 and Up Study) linked to cause-specific hospitalisations, disease notifications and death registrations from 2006 to 2015. We estimated the age-specific incidence of CAP hospitalisation (ICD-10 J12-18), invasive pneumococcal disease (IPD) notification and presumptive non-invasive pneumococcal CAP hospitalisation (J13 + J18.1, excluding IPD), comparing those with at least one risk factor to those with no risk factors. The hospitalised case-fatality rate (CFR) included deaths in a 30-day window after hospitalisation. Among 266 951 participants followed for 1 850 000 person-years there were 8747 first hospitalisations for CAP, 157 IPD notifications and 305 non-invasive pneumococcal CAP hospitalisations. In persons 65–84 years, 54.7% had at least one identified risk factor, increasing to 57.0% in those ⩾85 years. The incidence of CAP hospitalisation in those ⩾65 years with at least one risk factor was twofold higher than in those without risk factors, 1091/100 000 (95% confidence interval (CI) 1060–1122) compared with 522/100 000 (95% CI 501–545), and IPD in equivalent groups was almost threefold higher (18.40/100 000 (95% CI 14.61–22.87) vs. 6.82/100 000 (95% CI 4.56–9.79)). The CFR increased with age but there were limited differences by risk status, except in those aged 45 to 64 years. Adults ⩾65 years with at least one risk factor have much higher rates of CAP and IPD, suggesting that additional risk factor-based vaccination strategies may be cost-effective.
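The incidence measure used throughout the abstract is a simple rate per 100,000 person-years; a minimal sketch (not the study's code) using the cohort totals given above:

```python
# Sketch: crude incidence per 100,000 person-years from event counts and
# follow-up time, the measure reported in the abstract.
def incidence_per_100k(events, person_years):
    return events / person_years * 100_000

# Cohort totals from the text: 8747 first CAP hospitalisations over
# 1,850,000 person-years of follow-up (crude, all-ages rate).
crude_cap_rate = incidence_per_100k(8747, 1_850_000)
```

The age- and risk-stratified figures quoted in the abstract (e.g. 1091/100 000) are the same calculation restricted to the relevant subgroup's events and person-years.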
Hospital environmental surfaces are frequently contaminated by microorganisms. However, the causal mechanism of bacterial contamination of the environment as a source of transmission is still debated. This prospective study was performed to characterize the nature of multidrug-resistant organism (MDRO) transmission between the environment and patients using standard microbiological and molecular techniques.
Prospective cohort study at 2 academic medical centers.
A prospective multicenter study to characterize the nature of bacterial transfer events between patients and environmental surfaces in rooms that previously housed patients with 1 of 4 ‘marker’ MDROs: methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, Clostridium difficile, and MDR Acinetobacter baumannii. Environmental and patient microbiological samples were obtained on admission into a freshly disinfected inpatient room. Repeat samples from room surfaces and patients were taken on days 3 and 7 and each week the patient stayed in the same room. The bacterial identity, antibiotic susceptibility, and molecular sequences were compared between organisms found in the environment samples and patient sources.
We enrolled 80 patient–room admissions; 9 of these patients (11.3%) were asymptomatically colonized with MDROs at study entry. Hospital room surfaces were contaminated with MDROs despite terminal disinfection in 44 cases (55%). Microbiological Bacterial Transfer events either to the patient, the environment, or both occurred in 12 patient encounters (18.5%) from the microbiologically evaluable cohort.
Microbiological Bacterial Transfer events between patients and the environment were observed in 18.5% of patient encounters and occurred early in the admission. This study suggests that research on prevention methods beyond the standard practice of room disinfection at the end of a patient’s stay is needed to better prevent acquisition of MDROs through the environment.
During the summer of 2016, the Hawaii Department of Health responded to the second-largest domestic foodborne hepatitis A virus (HAV) outbreak in the post-vaccine era. The epidemiological investigation included case finding and investigation, sequencing of RNA positive clinical specimens, product trace-back and virologic testing and sequencing of HAV RNA from the product. Additionally, an online survey open to all Hawaii residents was conducted to estimate baseline commercial food consumption. We identified 292 confirmed HAV cases, of whom 11 (4%) were possible secondary cases. Seventy-four (25%) were hospitalised and there were two deaths. Among all cases, 94% reported eating at Oahu or Kauai Island branches of Restaurant Chain A, with 86% of those cases reporting raw scallop consumption. In contrast, a food consumption survey conducted during the outbreak indicated 25% of Oahu residents patronised Restaurant Chain A in the 7 weeks before the survey. Product trace-back revealed a single distributor that supplied scallops imported from the Philippines to Restaurant Chain A. Recovery, amplification and sequence comparison of HAV recovered from scallops revealed viral sequences matching those from case-patients. Removal of product from implicated restaurants and vaccination of those potentially exposed led to the cessation of the outbreak. This outbreak further highlights the need for improved imported food safety.
Objectives: Although subjective cognitive complaints (SCC) are an integral component of the diagnostic criteria for mild cognitive impairment (MCI), previous findings indicate they may not accurately reflect cognitive ability. Within the Alzheimer’s Disease Neuroimaging Initiative, we investigated longitudinal change in the discrepancy between self- and informant-reported SCC across empirically derived subtypes of MCI and normal control (NC) participants. Methods: Data were obtained for 353 MCI participants and 122 “robust” NC participants. Participants were classified into three subtypes at baseline via cluster analysis: amnestic MCI, mixed MCI, and cluster-derived normal (CDN), a presumptive false-positive group who performed within normal limits on neuropsychological testing. SCC at baseline and two annual follow-up visits were assessed via the Everyday Cognition Questionnaire (ECog), and discrepancy scores between self- and informant-report were calculated. Analysis of change was conducted using analysis of covariance. Results: The amnestic and mixed MCI subtypes demonstrated increasing ECog discrepancy scores over time. This was driven by an increase in informant-reported SCC, which corresponded to participants’ objective cognitive decline, despite stable self-reported SCC. Increasing unawareness was associated with cerebrospinal fluid Alzheimer’s disease biomarker positivity and progression to Alzheimer’s disease. In contrast, CDN and NC groups over-reported cognitive difficulty and demonstrated normal cognition at all time points. Conclusions: MCI participants’ discrepancy scores indicate progressive underappreciation of their evolving cognitive deficits. Consistent over-reporting in the CDN and NC groups despite normal objective cognition suggests that self-reported SCC do not predict impending cognitive decline. Results demonstrate that self-reported SCC become increasingly misleading as objective cognitive impairment becomes more pronounced. 
(JINS, 2018, 24, 842–853)
Objectives: The third edition of the California Verbal Learning Test (CVLT-3) includes a new index termed List A versus Novel/Unrelated recognition discriminability (RD) on the Yes/No Recognition trial. Whereas the Total RD index incorporates false positive (FP) errors associated with all distractors (including List B and semantically related items), the new List A versus Novel/Unrelated RD index incorporates only FP errors associated with novel, semantically unrelated distractors. Thus, in minimizing levels of source and semantic interference, the List A versus Novel/Unrelated RD index may yield purer assessments of yes/no recognition memory independent of vulnerability to source memory difficulties or semantic confusion, both of which are often seen in individuals with primarily frontal-system dysfunction (e.g., early Huntington’s disease [HD]). Methods: We compared the performance of individuals with Alzheimer’s disease (AD) and HD in mild and moderate stages of dementia on CVLT-3 indices of Total RD and List A versus Novel/Unrelated RD. Results: Although AD and HD subgroups exhibited deficits on both RD indices relative to healthy comparison groups, those with HD generally outperformed those with AD, and group differences were more robust on List A versus Novel/Unrelated RD than on Total RD. Conclusions: Our findings highlight the clinical utility of the new CVLT-3 List A versus Novel/Unrelated RD index, which (a) maximally assesses yes/no recognition memory independent of source and semantic interference; and (b) provides a greater differentiation between individuals whose memory disorder is primarily at the encoding/storage level (e.g., as in AD) versus at the retrieval level (e.g., as in early HD). (JINS, 2018, 24, 833–841)
It has been suggested that cattle have a greater ability to digest fibrous feeds and a lower ability to digest non-fibrous feeds than sheep (McDonald et al., 1995). This statement applies mainly to forages, and few direct comparisons have been conducted using concentrate ingredients. The digestibility of concentrate ingredients may be influenced by the level of consumption, since an increase in intake of a complete diet resulted in a decrease in digestibility (El Khidir and Vestergaard Thomsen, 1983). The aims of this study were (a) to determine the effect of level of consumption by cattle and (b) to examine the effect of animal species (sheep and cattle) on the digestibility of concentrate ingredients.
Mechanical forces during machine milking induce changes in teat condition which can be differentiated into short-term and long-term changes. Machine milking-induced short-term changes in teat condition (STC) are defined as tissue responses to a single milking and have been associated with the risk of new intramammary infection. However, their association with teat characteristics, such as teat-end shape, has not been investigated with rigorous methods. The primary objective was to determine the association of STC, as measured by ultrasonography, with teat-end shape. The secondary objective was to describe possible differences in the recovery time of teat tissue after machine milking among teats with different teat-end shapes. Holstein cows (n=128) were enrolled in an observational study, housed in free-stall pens with sand bedding and milked three times a day. Ultrasonography of the left front and right hind teat was performed after teat preparation before milking (t−1), immediately after milking (t0) and 1, 3, 5 and 7 h after milking (t1, t3, t5, t7). The teat tissue parameters measured from ultrasound scans were teat canal length, teat-end diameter, teat-end diameter at the midpoint between the distal and proximal end of the teat canal, teat wall thickness, and teat cistern width. Teat-end shape was assessed visually and classified into three categories: pointed, flat and round. Multivariable linear regression analyses showed differences in the relative change of teat tissue parameters (compared with t−1) at t0 among teats with different teat-end shapes, with most parameters showing the largest change for round teats. The premilking values were reached (recovery time) after 7 h in teats with a pointed teat-end shape, whereas recovery time was greater than 7 h in teats with flat and round teat-end shapes. Under the same liner and milking machine conditions, teats with a round teat-end shape had the most severe short-term changes.
The results of this observational study indicated that teat-end shape may be one of the factors that contribute to the severity of STC.
To identify predictors of disagreement with antimicrobial stewardship prospective audit and feedback recommendations (PAFR) at a free-standing children’s hospital.
Retrospective cohort study of audits performed during the antimicrobial stewardship program (ASP) from March 30, 2015, to April 17, 2017.
The ASP included audits of antimicrobial use and communicated PAFR to the care team, with follow-up on adherence to recommendations. The primary outcome was disagreement with PAFR. Potential predictors for disagreement, including patient-level, antimicrobial, programmatic, and provider-level factors, were assessed using bivariate and multivariate logistic regression models.
In total, 4,727 antimicrobial audits were performed during the study period, resulting in 1,323 PAFR (28%); 187 recommendations (15%) were not followed due to disagreement. Providers were more likely to disagree with PAFR when the patient had a gastrointestinal infection (odds ratio [OR], 5.50; 95% confidence interval [CI], 1.99–15.21), febrile neutropenia (OR, 6.14; 95% CI, 2.08–18.12), skin or soft-tissue infections (OR, 6.16; 95% CI, 1.92–19.77), or had been admitted for 31–90 days at the time of the audit (OR, 2.08; 95% CI, 1.36–3.18). The longer the duration since the attending provider had completed training (ie, the more years of experience), the more likely they were to disagree with PAFR recommendations (OR, 1.02; 95% CI, 1.01–1.04).
Evaluation of our program confirmed patient-level predictors of PAFR disagreement and identified additional programmatic and provider-level factors, including years of attending experience. Stewardship interventions focused on specific diagnoses and antimicrobials are unlikely to result in programmatic success unless these factors are also addressed.
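The bivariate odds ratios reported in the abstract above come from 2x2 tables of disagreement by predictor. A minimal sketch of that calculation (cell counts invented for illustration; not the study's data or code):

```python
import math

# Hypothetical sketch: odds ratio and Wald 95% CI from a 2x2 table of
# disagreement by diagnosis. Cell counts below are invented.
#                   disagree   agree
# GI infection        a=11      b=20
# other diagnoses     c=176     d=1116
a, b, c, d = 11, 20, 176, 1116

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

A CI that excludes 1 (as for the diagnoses listed in the abstract) indicates the predictor is associated with higher odds of disagreement; the multivariate estimates additionally adjust for the other factors.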
Singh deploys cultural evolution to explain recurrent features of shamanistic trance forms, but fails to substantively address important distinctions between these forms. Possession trance (vs. trance without possession) is disproportionately female-dominated and found in complex societies. The effects of cultural conditions on shamanism thus extend beyond its presence or absence and are vital for modeling its professionalization and spread.
Reliable and affordable technology for collecting and managing livestock production process information is being developed. The advances in data measurement, collection and transfer technology enable us to retrieve information from one or more remote sites to be processed and managed centrally. This opens up the opportunity to advance from open-loop, prescriptive production to closed-loop systems where factors influencing the actual performance of animals are used to modify and improve their production parameters (feed, environment, medication). We strive to move from producing animals by predicting what is needed from outdated data to measuring what is actually happening as they grow, processing this information and acting to optimise animal performance by modifying production parameters in real time.
This paper describes commercially available systems that make possible the retrieval, collection, processing and distribution of near real time production information. Various aspects of production management using this technology are discussed, and examples of how it can be applied to monitor water usage, how it relates to pig performance and how energy usage can be influenced, are considered.
This study compared the effect of feeding AmyPlus, a moist feed, as opposed to rolled wheat on the yield and composition of milk from dairy cows consuming a grass silage-based total mixed ration (TMR). Seventy-two Holstein-Friesian cows were distributed into AmyPlus (Treatment) and Wheat (Control) groups and loose housed on straw in an open shed. Each kg of the Wheat-based concentrate contained 345 g rolled wheat, 230 g rapeseed meal, 115 g sugarbeet pulp, 115 g Molaferm 20, 115 g soybean meal, 56 g barley straw and 24 g vitamin-minerals. In contrast, each kg of the AmyPlus-based concentrate contained 501 g AmyPlus (480 g DM/kg), 105 g rapeseed meal, 126 g sugarbeet pulp, 126 g Molaferm 20, 84 g soybean meal, 41 g barley straw and 17 g vitamin-minerals. AmyPlus was loaded directly into the mixer wagon to prepare fresh AmyPlus-based TMR with a silage to concentrate ratio of 68:32. Each TMR was fed once daily to the corresponding group of cows, which also received 2 kg of distillers’ grains per cow in the parlour during milking. Daily milk yield and composition were recorded from November 1999 to February 2000. The overall daily dry matter intake (DMI) of each TMR per cow remained uniform across both groups (20.19 vs 20.15 kg for the Treatment and Control groups respectively). Daily milk yield and total cell counts per cow did not vary significantly (P>0.05) between groups during individual months. While milk fat and protein contents were greater in the Treatment than the Control group during each month, the differences were significant (P<0.05) only during November and December for fat and in January for protein. On average, the Treatment group tended to show a non-significant increase (P>0.05) in daily milk yield per cow of 0.144 kg over the Control group. The overall fat (46.2 vs 43.7) and protein (34.5 vs 33.5) contents in g/kg milk were significantly higher (P<0.001) in the Treatment than the Control group. Total cell counts did not vary significantly (P>0.05) and remained within acceptable limits.
The cows consuming AmyPlus maintained their health as indicated by their intake, production, cell counts and general appearance. It would appear that AmyPlus can replace rolled wheat in TMR. However, it may be necessary to evaluate the storage, economic and environmental implications of using such moist co-products in silage based dairy rations.
An algebraic description of the lactation curve is a useful component of any model of the day-to-day production of lactating animals. Observation and common sense suggest that such a function should rise to a peak early in lactation and decline thereafter but simpler models have been used. Ostergaard (1979), for example, used a linear model to study strategies for concentrate feeding to obtain optimum feeding levels in high yielding dairy cows. His model was
y(t) = a + bt
where y(t) was the yield in week t and a and b were the usual linear regression coefficients. The error term is omitted here and elsewhere for clarity. Gaines (1927) used the decay function
y(t) = A e^(-kt)
where A and k are the constants fitted to log y(t).
These models, one linear, the other exponential, peak in Week 1 and require only two parameters. In this paper, more sophisticated functions are described and compared.
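The two simple forms named above can be sketched as follows; the parameter values are hypothetical, chosen only to illustrate the shapes, not taken from either source.

```python
import math

def linear_model(t, a, b):
    # Ostergaard-style linear model: y(t) = a + b*t
    return a + b * t

def gaines_decay(t, A, k):
    # Gaines (1927) exponential decay: y(t) = A * exp(-k*t)
    return A * math.exp(-k * t)

weeks = range(1, 45)  # a 44-week lactation
y_lin = [linear_model(t, 30.0, -0.4) for t in weeks]  # kg/day, hypothetical a, b
y_exp = [gaines_decay(t, 30.0, 0.02) for t in weeks]  # kg/day, hypothetical A, k

# Both two-parameter forms peak in week 1 and decline thereafter,
# as the text describes.
peak_week_lin = 1 + y_lin.index(max(y_lin))
peak_week_exp = 1 + y_exp.index(max(y_exp))
```

Neither form can reproduce the early rise to peak that observation suggests, which is the motivation for the more sophisticated functions compared in the paper.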
A lactation curve is assumed to contain two components, one of which is the intrinsic biological drive to produce milk and the other is an environmental constraint upon it. The algebra may be justified by biological argument according to the skill and the leanings of the modeller.
Nutritional management of pigs to optimise growth demands pig-specific, time-specific and place-specific determination and provision of nutritional requirement. These elements need to be incorporated into response prediction models that operate in a real-time (not retrospective) closed-loop control environment. This implies appropriate means for the on-line measurement of response to change in nutrient provision, and the simultaneous means for manipulation of feeding level and feed quality. The paper describes how response prediction modelling and response measurement may now be achieved. Optimisation may be pursued with mixed objectives, including those of production efficiency and environmental protection.
A key to long-term sustainable enhancement of viable livestock production is the introduction of genetic traits that ensure that fertility and meat quality characteristics are compatible with farming environments and market needs. For example, the sheep industry could benefit if daughters of hill-breed ewes were of a crossbred genotype that enhances both carcass characteristics and fertility traits. Use of sires that confer better conformation is an option but does not significantly boost prolificacy. Introduction of the ‘Inverdale’ fecundity gene could change this. On a flock basis in the Romney breed, mean ovulation rate is increased by 1.0 and litter size by 0.6 in adult ewes carrying a single copy of this gene (designated FecXI because it is on the X chromosome; Davis et al. 1992). Carrier males transmit it to all of their female offspring, these being heterozygous carriers of the gene unless it is also maternally inherited. In the latter instance, young would be infertile, as the homozygous genotype confers an undesirable ‘streak ovary’ phenotype. Although a number of sheep breeds world-wide exhibit significant ‘single gene’ effects on ovulation and litter size (Montgomery et al. 2001), Scottish hill sheep breeds show no evidence of this. Consequently, all ewe lambs generated by crossing these hill ewes with a ram carrying the Inverdale gene should be heterozygous. To ascertain whether such animals exhibit enhanced fecundity, an on-farm study investigated ovulation incidence in cyclic ewe lambs born to Cheviot or Scottish Blackface ewes that had been bred to Texel rams carrying a single copy of the ‘Inverdale’ gene.
Knowledge of the genetic and phenotypic relationships for muscle fibre characteristics with meat and eating quality in pigs is required by the pig breeding industry for two reasons. Firstly, muscle fibre traits, determined from muscle biopsy, could be used as genetic predictors of meat and eating quality traits; secondly, if responses in meat and eating quality traits are partially due to changes in muscle fibre traits, then selection criteria can be designed to compensate for such responses. The current study estimated the genetic and phenotypic relationships for muscle fibre traits with meat and eating quality traits.
The study consisted of 160 Large White pigs from lines divergently selected for lean growth rate on ad-libitum or restricted feeding regimes, lean food conversion ratio and daily food intake for seven generations in the Edinburgh lean growth selection experiment. Within each selection line, there were 10 pairs of full-sibs. Boars and gilts were tested from 30 kg, individually penned and fed a diet consisting of 224 g CP/kg DM and 15.9 MJ DE/kg DM.