The partition of the total genetic variance into its additive and non-additive components can differ from trait to trait and between purebred and crossbred populations. Quantifying these genetic variance components determines the extent to which it is worthwhile to account for dominance in genomic evaluations or to establish mate allocation strategies across different populations and traits. This study aims to assess the contribution of the additive and dominance genomic variances to the phenotypic expression of several purebred Piétrain and crossbred (Piétrain × Large White) pig performances. A total of 636 purebred and 720 crossbred male piglets were phenotyped for 22 traits that can be classified into six groups: growth rate and feed efficiency, carcass composition, meat quality, behaviour, boar taint and puberty. Additive and dominance variances were estimated in univariate genotypic models that included additive and dominance genotypic effects and a genomic inbreeding covariate, from which the additive and dominance single nucleotide polymorphism variances for purebred and crossbred performances were retrieved. These estimated variances were then used, together with the allelic frequencies of the parental populations, to obtain additive and dominance variances in terms of breeding values and dominance deviations. Estimates of the Piétrain and Large White allelic contributions to the crossbred variance were of similar magnitude for all traits. Estimates of additive genetic variances were similar regardless of the inclusion of dominance. Some traits showed a relevant amount of dominance genetic variance relative to phenotypic variance in both populations (growth rate 8%; feed conversion ratio 9% to 12%; backfat thickness 14% to 12%; purebreds to crossbreds). Other traits showed a higher amount in crossbreds (ham cut 8% to 13%; loin 7% to 16%; pH semimembranosus 13% to 18%; pH longissimus dorsi 9% to 14%; androstenone 5% to 13%; estradiol 6% to 11%; purebreds to crossbreds). No clear common pattern of dominance expression emerged across the groups of analysed traits or between populations. These estimates give initial hints as to which traits could benefit from accounting for dominance, for example to improve the accuracy of genomic estimated breeding values in genetic evaluations or to boost the total genetic value of progeny by means of assortative mating.
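As a point of reference for how such variances are obtained, the textbook single-locus decomposition (Falconer and Mackay) converts genotypic effects and allelic frequencies into variances of breeding values and dominance deviations. The sketch below only illustrates that conversion under assumed Hardy-Weinberg proportions and made-up values; it is not the authors' multi-locus genomic model.

    # Textbook single-locus partition of genetic variance (illustrative only;
    # assumes Hardy-Weinberg proportions and independent loci).
    def variance_components(p, a, d):
        """p: allele frequency; a: additive genotypic value; d: dominance value."""
        q = 1.0 - p
        var_breeding_values = 2 * p * q * (a + d * (q - p)) ** 2
        var_dominance_deviations = (2 * p * q * d) ** 2
        return var_breeding_values, var_dominance_deviations

    # Hypothetical locus: allele frequency 0.3, additive value 1.0, dominance 0.5
    print(variance_components(0.3, 1.0, 0.5))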
Errors pertinent to dual beam absorptiometry have been studied. Five areas are treated in detail: 1. scattering, in which a computer analysis of multiple scattering shows little error; 2. geometrical configuration effects, in which the shape of the sample influences accuracy; 3. Poisson variations, in which dosage and statistical error are minimized; 4. absorption coefficients, in which variations among compilations are examined; 5. filtering, in which the need for Kβ filtering is shown. A zero-filter system is outlined.
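For orientation, dual beam (dual-energy) absorptiometry infers two component areal densities from the attenuation measured at two beam energies, via a pair of Beer-Lambert equations. The sketch below shows only that arithmetic with made-up coefficients; it is not the apparatus analysed in the paper.

    # Dual-energy absorptiometry principle (illustrative values only):
    # two Beer-Lambert equations, one per beam energy, solved for two
    # component areal densities.
    import numpy as np

    mu = np.array([[0.20, 0.50],     # mass attenuation coefficients, cm^2/g
                   [0.15, 0.25]])    # rows: low/high energy; columns: components
    log_attenuation = np.array([0.8, 0.5])   # measured ln(I0/I) at each energy

    areal_density = np.linalg.solve(mu, log_attenuation)   # g/cm^2 per component
    print(areal_density)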
Surgery for congenital heart disease (CHD) has been slow to develop in parts of the former Soviet Union. The impact of an 8-year surgical assistance programme between an emerging centre and a multi-disciplinary international team comprising healthcare professionals from developed cardiac programmes is analysed and presented.
Material and methods
The international paediatric assistance programme included five main components – intermittent clinical visits to the site annually, medical education, biomedical engineering support, nurse empowerment, and team-based practice development. Data were analysed from visiting teams and local databases before and since commencement of assistance in 2007 (era A: 2000–2007; era B: 2008–2015). The following variables were compared between periods: annual case volume, operative mortality, case complexity based on Risk Adjustment for Congenital Heart Surgery (RACHS-1), and RACHS-adjusted standardised mortality ratio.
A total of 154 RACHS-classifiable operations were performed during era A, with a mean annual case volume by local surgeons of 19.3 (95% confidence interval 14.3–24.2), an operative mortality of 4.6%, and a standardised mortality ratio of 2.1. In era B, surgical volume increased to a mean of 103.1 annual cases (95% confidence interval 69.1–137.2, p<0.0001). There was a non-significant (p=0.84) increase in operative mortality (5.7%) but a decrease in the standardised mortality ratio (1.2), owing to an increase in case complexity. In era B, the proportion of surgeries led by local surgeons during visits from the international team increased from 0% (0/27) in 2008 to 98% (58/59) in the final year of analysis.
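For readers unfamiliar with the metric, a RACHS-adjusted standardised mortality ratio compares observed deaths with the deaths expected from reference mortality rates in each RACHS-1 category. The sketch below uses hypothetical case counts and reference rates purely to show the calculation; it does not reproduce the programme's data.

    # Hypothetical RACHS-adjusted standardised mortality ratio (SMR) calculation.
    reference_mortality = {1: 0.004, 2: 0.017, 3: 0.048, 4: 0.095}  # assumed rates
    cases = {1: 20, 2: 45, 3: 30, 4: 8}                             # operations per category
    observed_deaths = 6

    expected_deaths = sum(cases[c] * reference_mortality[c] for c in cases)
    smr = observed_deaths / expected_deaths   # >1 means worse than reference
    print(round(smr, 2))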
The model of assistance described in this report led to improved risk-adjusted mortality and to increases in case volume, case complexity, and independent operating skill.
During the 1970s, the “coastal zone”—that ecologically unique area where sea and land meet and strongly influence each other—has become a principal laboratory for experiments in new land management techniques. These coastal experiments are producing institutional arrangements that involve every level of government, and the implementation of the programs may well affect every citizen of the United States.
During the 1970s, the Florida legislature enacted some of the nation's most innovative and comprehensive state and local land-planning and regulatory programs. The Environmental Land and Water Management Act of 1972 adopted large parts of an early draft of article 7 of the ALI Model Land Development Code, thereby asserting a state regulatory role in areas of critical state concern and for developments of regional impact; Florida's Local Government Comprehensive Planning Act of 1975 introduced planning and regulatory innovations that, if ever fully implemented, could place Florida in the vanguard of land regulatory reform at the local governmental level. This study, which is the concluding part of a study of the evolution of federal, state, and local regulatory roles in the management of coastal land resources, examines the intergovernmental, interagency, and separate-branch tensions that have emerged as Florida moves to implement its new laws. Included, inter alia, is an analysis of the Florida Supreme Court's controversial nondelegation decision in Askew v. Cross Key Waterways. Although Florida can claim some limited successes in program implementation, its land management systems are still not adequately integrated and coordinated, and they have not been implemented as successfully as their proponents thought possible. For example, the state has several alternatives for complying with the federal requirements for an approved management program under the Federal Coastal Zone Management Act of 1972—the comprehensive land management system examined in this study being only one of the available ones. Yet Florida still has been unable to obtain federal approval, and, if it ever does, will be one of the last of the major coastal states to do so. Much of Florida's difficulty in forging a well-integrated coastal land management process is attributable to substantial disagreements on two basic propositions: because of Florida's unique ecological characteristics, coastal land management should not be divorced from comprehensive land management for other purposes; and because of substantial regional diversities within the state, coastal land management in Florida should include a significant planning and regulatory role for local governments as well as for regional and state agencies.
Historically, the federal government has had a major role in regulation of coastal activities, restricted until recently, however, primarily to regulation of activities affecting navigable water bodies. On the sea coast, regulation of areas landward of such jurisdictional boundaries as “mean high tide line” was generally left to the states or their delegates.
No discipline has been impacted more by war and armed conflict than health care has. Health systems and health care providers are often the first victims, suffering increasingly heinous acts that cripple the essential health delivery and public health infrastructure necessary for the protection of civilian and military victims of the state at war. This commentary argues that current instructional opportunities to prepare health care providers fall short in both content and preparation, especially in those operational skill sets necessary to manage multiple challenges, threats, and violations under international humanitarian law and to perform triage management in a resource-poor medical setting. Utilizing a historical framework, the commentary addresses the transformation of the education and training of humanitarian health professionals from the Cold War to today followed by recommendations for the future. (Disaster Med Public Health Preparedness. 2019;13:383-396)
In response to Roemer's reformulation of the Marxian concept of exploitation in terms of comparative wealth distributions (1982, 1996), Vrousalis (2013) treats economic exploitation as an explicitly relational phenomenon in which one party takes advantage of the other's economic vulnerability in order to extract a net benefit. This paper offers a critical assessment of Vrousalis's account, prompting a revised formulation that is analysed in the context of a matching and bargaining model. This analysis yields precise representations of Vrousalis's conditions of economic vulnerability and economic exploitation and facilitates comparison to the alternative conceptions of Marx and Roemer.
Since 1945, the reasons for humanitarian crises and the ways in which the world responds to them have changed dramatically every 10 to 15 years or less. Planning, response, and recovery for these tragic events have often been ad hoc, inconsistent, and insufficient, largely because of the complexity of global humanitarian demands and the corresponding response-system capabilities. This historical perspective chronicles the transformation of war and armed conflict from the Cold War to today, emphasizing the impact these events have had on humanitarian professionals and their struggle to adapt to increasing humanitarian, operational, and political challenges. During the Battle for Mosul in Iraq, the United Nations and World Health Organization made an unprecedented independent decision to deploy emergency medical teams to combat zones even though those teams were unprepared in the decades-tested skills of war and armed-conflict preparation and response afforded to health care providers and dictated by International Humanitarian Law and Geneva Convention protections; this decision has abruptly challenged future decision-making and deployments. (Disaster Med Public Health Preparedness. 2019;13:109–115)
For most of the twentieth century, the archetypal angiosperm flower has been viewed as relatively large and multiparted, with spirally arranged fleshy appendages, and as probably beetle-pollinated, as in some extant Magnoliales. However, the preponderance of fossil evidence indicates that flowers with such characters do not appear until the mid-Cretaceous, well after smaller, simpler fossil flowers such as platanoids and chloranthoids. Winteraceous and Chloranthaceous pollen appears more or less simultaneously in the Lower Cretaceous, but rapidly mounting evidence for mosaicism in Cretaceous taxa makes it unwise to extrapolate floral structure from dispersed pollen. Mid–Late Cretaceous fossils show an increasing proportion of simple-flowered Rosidae in the angiosperm flora. We report new fossil evidence of charcoalified flowers and fruits representing at least 20–30 diverse angiosperm taxa from Cenomanian and Turonian deposits of the Atlantic Coastal Plain. These fossil flowers include representatives with hypanthia and floral cups, sympetaly, syncarpy, inferior ovaries, campylotropous ovules, nectaries of various forms, specialized anther dehiscence, epipetalous stamens, and connate filament tubes. Major taxonomic groups (as defined by Cronquist) represented by these fossils include Dilleniidae, Magnoliidae, Rosidae, monocots, and possibly Caryophyllidae. Thus, the early Late Cretaceous angiosperm flora had greater floral diversity than has previously been documented. This array of floral structures includes features that are now associated with bees and other specialized insect pollinators, providing a new perspective on the evolution of insect pollination.
Introduction: Decreasing readmission rates and return emergency department (ED) visits represents a major challenge for health organizations. Seniors are especially vulnerable to discharge adverse events, which can result in unplanned readmissions and loss of physical, functional, and/or cognitive capacity. The ACE Collaborative is a national quality improvement initiative that aims to improve the care of elderly patients. We aimed to adapt Mount Sinai’s Care Transitions program to our local context in order to decrease avoidable readmissions and ED visits among seniors. Methods: We performed a prospective pre-/post-implementation cohort study. We recruited frail elderly hospitalized patients (≥50 years old) discharged home and at risk of readmission (modified LACE index score ≥7/12). We excluded patients discharged to long-term nursing homes or institutions. Our intervention is based on selected strategic ACE Care Transitions best practices: a transition coach, telehealth personal response services, and a structured discharge checklist. The intervention is offered to selected patients before hospital discharge. Our primary outcome is a 30-day post-discharge composite of hospital readmission and return ED visit rates. Our secondary outcomes are functional autonomy, satisfaction with the care transition, quality of life, caregiver strain, and healthcare resource use at recruitment and at 30-day follow-up. Hospital-level administrative data are also collected to measure the global effect of practice changes. Results: The project is currently ongoing and preliminary results are available for the pre-implementation cohort only. Patients in this cohort (n=33) were mainly men (61%), aged 75±10 years, and presented an OARS score (an Activities of Daily Living instrument ranging from 0 to 28) of 5.6±4.9. At 30 days post-discharge, patients in our cohort had a 42.4% readmission rate (14 hospitalisations) and a 54.5% return ED visit rate (18 visits). For the same period, readmission and return ED visit rates for all patients in the corresponding age group at the hospital level were 14.4% and 21.9%, respectively. Further results for our post-intervention cohort will be presented at CAEP 2017. Conclusion: Our cohort of elderly patients has high readmission and return ED visit rates. Our ongoing quality improvement project aims to decrease these readmissions and ED visits.
The primary goal was to investigate the effects of l-carnitine on fuel efficiency, antioxidant status, and muscle recovery in Labrador retrievers. Dogs were split into two groups, with one group supplemented with 250 mg/d of Carniking™ l-carnitine powder. Two experiments (Expt 1 and Expt 2) were performed over a 2-year period and included running programmes, activity monitoring, body composition scans and evaluation of recovery using biomarkers. The experiments differed slightly in dog number and design: fifty-six v. forty dogs; one endurance and two sprint runs per week v. two endurance runs; and differing blood collection time points. All dogs were fed a low-carnitine diet, with a fixed amount offered based on maintaining the minimum starting weight. Results from Expt 1 found that the carnitine dogs produced approximately 4000 more activity points per km compared with the control group during sprint (P = 0·052) and endurance runs (P = 0·0001). Male carnitine dogs produced half the creatine phosphokinase (CPK) following exercise compared with male control dogs (P = 0·05). Carnitine dogs had lower myoglobin, at 6·69 ng/ml, following intensive exercise compared with controls at 24·02 ng/ml (P = 0·0295). Total antioxidant capacity (TAC) and thiobarbituric acid reactive substance (TBARS) results were not considered significant. In Expt 2, body composition scans indicated that the carnitine group gained more total tissue mass while controls lost tissue mass (P = 0·0006), and gained lean mass while the control group lost lean mass (P < 0·0001). Carnitine dogs had lower CPK, at 23·06 v. 28·37 mU/ml for controls, 24 h post-run (P = 0·003). Myoglobin levels were lower in carnitine v. control dogs both 1 h post-run (P = 0·0157; 23·83 v. 37·91 ng/ml) and 24 h post-run (P = 0·0189; 6·25 v. 13·5 ng/ml). TAC indicated more antioxidant activity in carnitine dogs, at 0·16 mm v. 0·13 mm for controls (P = 0·0496). TBARS were also lower in carnitine dogs both pre-run (P = 0·0013; 15·36 v. 23·42 µm) and 1 h post-run (P = 0·056; 16·45 v. 20·65 µm). Supplementing l-carnitine in the form of Carniking™ had positive benefits in Labrador retrievers for activity intensity, body composition, muscle recovery and oxidative capacity.
This review summarizes the results from the INRA (Institut National de la Recherche Agronomique) divergent selection experiment on residual feed intake (RFI) in growing Large White pigs over nine generations of selection, and discusses the remaining challenges and perspectives for the improvement of feed efficiency in growing pigs. The impacts on growing pigs raised under standard conditions and in alternative situations such as heat stress, inflammatory challenges or lactation have been studied. After nine generations, the divergent selection for RFI led to highly significant (P<0.001) line differences for RFI (−165 g/day in the low RFI (LRFI) line compared with the high RFI line) and daily feed intake (−270 g/day). Small responses were observed for growth rate (−12.8 g/day, P<0.05) and body composition (+0.9 mm backfat thickness, P=0.57; −2.64% lean meat content, P<0.001), with a marked response in feed conversion ratio (−0.32 kg feed/kg gain, P<0.001). Reduced ultimate pH and increased lightness of the meat (P<0.001) were observed in LRFI pigs, with minor impact on the sensory quality of the meat. These changes in meat quality were associated with changes in muscular energy metabolism. Reduced maintenance energy requirements (−10% after five generations of selection) and activity (−21% of time standing after six generations of selection) of LRFI pigs contributed greatly to the gain in energy efficiency. However, the impact of selection for RFI on the protein metabolism of the pig remains unclear. Digestibility of energy and nutrients was not affected by selection, either for pigs fed conventional diets or for pigs fed high-fibre diets. A significant improvement of digestive efficiency could likely be achieved by selecting pigs on high-fibre diets. No convincing genetic or blood biomarker has been identified to explain the differences in RFI, suggesting that pigs have various ways of achieving an efficient use of feed. No deleterious impact of the selection on sow reproduction performance was observed. The resource allocation theory states that low RFI may reduce the ability to cope with stressors via the reduction of a buffer compartment dedicated to responses to stress. None of the experiments that focussed on the response of pigs to stress or challenges could confirm this theory. Understanding the relationships between RFI and responses to stress and to energy-demanding processes, such as immunity and lactation, remains a major challenge for a better understanding of the underlying biological mechanisms of the trait and for reconciling the experimental results with the resource allocation theory.
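As a reminder of how the selection criterion is constructed, residual feed intake is typically the residual of observed feed intake regressed on production and maintenance-related traits. The sketch below uses simulated data and only two predictors (average daily gain and backfat thickness); it is not the INRA selection index itself.

    # Residual feed intake (RFI) as a regression residual (simulated data).
    import numpy as np

    rng = np.random.default_rng(0)
    adg = rng.normal(800, 60, 50)        # average daily gain, g/day
    backfat = rng.normal(14, 2, 50)      # backfat thickness, mm
    intake = 1.9 * adg + 40 * backfat + rng.normal(0, 80, 50)   # feed intake, g/day

    X = np.column_stack([np.ones_like(adg), adg, backfat])
    coef, *_ = np.linalg.lstsq(X, intake, rcond=None)
    rfi = intake - X @ coef              # residual feed intake per pig
    print(rfi[:5])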
The chromosphere is a complex region that acts as an intermediary between magnetic flux emergence in the photosphere and the magnetic features seen in the corona. Large chromospheric eruptions of flares and filaments are often accompanied by ejections of coronal mass from the Sun. Several studies have observed fast-moving progressive trains of compact bright points (called Sequential Chromospheric Brightenings, or SCBs) streaming away from chromospheric flares that also produce a coronal mass ejection (CME). In this work, we review studies of SCBs, search for commonalities between them, and place these findings into a larger context with contemporary chromospheric and coronal observations. SCBs are fleeting indicators of the solar atmospheric environment as it existed before their associated eruption. Since they appear at the very outset of a flare eruption, SCBs are a good early indication, measured in the chromosphere, of a CME.
Benchmarks for antimicrobial consumption measured in antimicrobial days are beginning to emerge. The relationship between the traditional measure of days of therapy and antimicrobial days is unclear. We observed a high intermethod correlation (R2=0.99): antimicrobial days were 1.9-fold lower than days of therapy across agents. Individual institutions should correlate these measures.
Shallow-water tropical corals can be used to calibrate the radiocarbon timescale. In this paper, we present a new data set based on the comparison between 14C ages and U-Th ages measured in fossil corals collected offshore the island of Tahiti during Integrated Ocean Drilling Program (IODP) Expedition 310. After applying strict mineralogical and geochemical screening criteria, the Tahiti record provides new data for two distinct time windows: 7 data points for the interval between 29 and 37 cal kyr BP and 58 for the last deglaciation, notably at higher resolution for the 14–16 cal kyr BP interval. There are three main outcomes of this study. First, it extends the previous Tahiti record beyond 13.9 cal kyr BP, the oldest U-Th age obtained on cores drilled onshore in the modern Tahiti barrier reef. Second, it strengthens the data set for the 14–15 cal kyr BP period, allowing better documentation of the 14C age plateau in this time range. This age plateau corresponds to a drop in atmospheric 14C synchronous with an abrupt period of sea-level rise (Meltwater Pulse 1A, MWP-1A). The Tahiti 14C record documents complex changes in the global carbon cycle due to variations in the exchange rates between its different reservoirs. Third, during Heinrich event 1, the Tahiti record disagrees with the Cariaco record but is in broad agreement with other marine and continental data.
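For context, a paired 14C age and U-Th calendar age are conventionally converted into an implied Delta14C (Stuiver and Polach 1977 convention) before interpreting features such as the drop associated with MWP-1A. The sketch below shows that conversion for a hypothetical age pair; any marine reservoir correction to the 14C age is omitted for simplicity.

    # Implied Delta14C (per mil) from a paired conventional 14C age and a
    # U-Th calendar age (illustrative age pair; reservoir correction omitted).
    import math

    def delta14c(radiocarbon_age_yr, calendar_age_yr):
        # 8033 yr: Libby mean life used for conventional 14C ages;
        # 8267 yr: true mean life used for the calendar-age decay correction.
        return (math.exp(-radiocarbon_age_yr / 8033.0)
                * math.exp(calendar_age_yr / 8267.0) - 1.0) * 1000.0

    print(round(delta14c(12500, 14500), 1))   # hypothetical pair near the deglaciation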
Cytomegalovirus (CMV) is the leading cause of congenital infection and of non-genetic sensorineural hearing loss in children. There are no recent data on the incidence of CMV infection during pregnancy in Canada. The present study was undertaken to determine the seroprevalence of CMV IgG antibodies and the rate of seroconversion in a cohort of pregnant women in the province of Québec, Canada. We used serum samples and questionnaire data collected as part of the 3D Pregnancy and Birth Cohort Study (2010–2013) conducted in Québec, Canada. CMV IgG antibodies were determined in serum samples collected at the first and third trimesters. Associations between independent variables and seroprevalence were assessed using logistic regression, and associations with seroconversion using Poisson regression. Of 1938 pregnant women tested, 40·4% were seropositive for CMV at baseline. Previous CMV infection was associated with working as a daycare educator, lower education, lower income, having had children, a first language other than French or English, and being born outside Canada or the United States. Of the 1122 initially seronegative women, 24 (2·1%) seroconverted between their first and third trimesters. The seroconversion rate was 1·4 [95% confidence interval (CI) 0·9–2·1]/10 000 person-days at risk, or 3·9 (95% CI 2·5–5·9)/100 pregnancies (assuming a 280-day gestation). The high proportion of pregnant women susceptible to CMV infection (nearly 60%) and the subsequent rate of seroconversion are of concern.
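To make the relation between the two reported rates explicit, the per-pregnancy figure follows from the per-person-day rate under the stated 280-day gestation; the short sketch below simply reproduces that arithmetic alongside the crude proportion.

    # Reproducing the reported seroconversion figures (arithmetic only).
    seroconversions = 24
    initially_seronegative = 1122
    rate_per_person_day = 1.4e-4                 # 1.4 per 10 000 person-days at risk

    rate_per_100_pregnancies = rate_per_person_day * 280 * 100   # 280-day gestation
    crude_percentage = 100 * seroconversions / initially_seronegative

    print(round(rate_per_100_pregnancies, 1))    # ~3.9 per 100 pregnancies
    print(round(crude_percentage, 1))            # ~2.1% of initially seronegative women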