The present study reports the validity of multiple assessment methods for tracking changes in body composition over time and quantifies the influence of unstandardised pre-assessment procedures. Resistance-trained males underwent 6 weeks of structured resistance training alongside a hyperenergetic diet, with four body composition evaluations in total. Pre-intervention, body composition was estimated in standardised (i.e. overnight fasted and rested) and unstandardised (i.e. no control over pre-assessment activities) conditions within a single day. The same assessments were repeated post-intervention, and body composition changes were estimated from all possible combinations of pre-intervention and post-intervention data. Assessment methods included dual-energy X-ray absorptiometry (DXA), air displacement plethysmography, three-dimensional optical imaging, single- and multi-frequency bioelectrical impedance analysis, bioimpedance spectroscopy and multi-component models. Data were analysed using equivalence testing, Bland–Altman analysis, Friedman tests and validity metrics. Most methods demonstrated meaningful errors when unstandardised conditions were present pre- and/or post-intervention, resulting in blunted or exaggerated changes relative to true body composition changes. However, some methods – particularly DXA and select digital anthropometry techniques – were more robust to a lack of standardisation. In standardised conditions, methods exhibiting the highest overall agreement with the four-component model were other multi-component models, select bioimpedance technologies, DXA and select digital anthropometry techniques. Although specific methods varied, the present study broadly demonstrates the importance of controlling and documenting standardisation procedures prior to body composition assessments across distinct assessment technologies, particularly for longitudinal investigations. Additionally, there are meaningful differences in the ability of common methods to track longitudinal body composition changes.
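For readers unfamiliar with the agreement statistics used here, the following is a minimal sketch in Python of the core Bland–Altman computation (bias and 95 % limits of agreement) for paired estimates from a criterion method and a test method. The variable names and numbers are illustrative only and are not data from the study.

import numpy as np

def bland_altman(criterion, test):
    """Bland-Altman agreement statistics for paired estimates (e.g.,
    fat-mass change in kg from the 4C model vs. a test method)."""
    criterion = np.asarray(criterion, dtype=float)
    test = np.asarray(test, dtype=float)
    diff = test - criterion                     # per-subject error
    bias = diff.mean()                          # constant error (bias)
    sd = diff.std(ddof=1)                       # SD of the differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95 % limits of agreement
    return bias, loa

# Illustrative numbers only (not data from this study)
fourc_change = [1.8, 2.4, 0.9, 3.1, 1.2]   # criterion: 4C-model changes (kg)
dxa_change = [2.1, 2.2, 1.4, 2.8, 1.0]     # test method: DXA changes (kg)
bias, loa = bland_altman(fourc_change, dxa_change)
print(f"bias = {bias:.2f} kg, 95% LoA = ({loa[0]:.2f}, {loa[1]:.2f}) kg")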
We analyse United States presidential appointee positions subject to Senate confirmation without a confirmed appointee in office. These “vacant” positions are byproducts of American constitutional design, shaped by the interplay of institutional politics. Using a novel dataset, we analyse appointee vacancies across executive branch departments and single-headed agencies from 1989 to 2013. We develop a theoretical model that uncovers the dynamics of vacancy onset and length. We then specify an empirical model and report results highlighting both position and principal–agent relations as critical to the politics of appointee vacancies. While high-status positions reduce the frequency and duration of vacancies, we also find important principal–agent considerations from a separation-of-powers perspective: appointee positions in agencies ideologically divergent from the relevant Senate committee chair are vacant for less time than those in ideologically proximal agencies. Importantly, this relationship strengthens as agency ideology diverges from the chair towards the chair’s party extreme.
Objective:
Food insecurity, or self-reports of inadequate food access due to limited financial resources, remains prevalent among people living with HIV (PLHIV). We examined the impact of food insecurity on combination antiretroviral therapy (cART) adherence within an integrated care programme that provides services to PLHIV, including two meals per day.
Design:
We used multivariable logistic regression with generalized estimating equations to estimate adjusted odds ratios (aOR) quantifying the relationship between food insecurity (exposure) and cART adherence (outcome); a minimal sketch of this model appears after the abstract.
Setting:
We drew on survey data collected between February 2014 and March 2016 from the Dr. Peter Centre Study based in Vancouver, Canada.
Participants:
The study included 116 PLHIV at baseline, with ninety-nine participants completing a 12-month follow-up interview. The median (quartile 1–quartile 3) age was 46 (39–52) years at baseline and 87 % (n 101) were biologically male at birth.
Results:
At baseline, 74 % (n 86) of participants were food insecure (≥2 affirmative responses on Health Canada’s Household Food Security Survey Module) and 67 % (n 78) were adherent to cART ≥95 % of the time. In the adjusted regression analysis, food insecurity was associated with lower odds of optimal (≥95 %) cART adherence (aOR = 0·47; 95 % CI 0·24, 0·93).
Conclusions:
While food provision may reduce some health-related harms, food insecurity remains prevalent and associated with suboptimal cART adherence in this integrated care programme. Future studies that elucidate strategies to mitigate food insecurity and its effects on cART adherence among PLHIV in this setting and in similar environments are needed.
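As a rough illustration of the analysis described under Design, here is a minimal sketch in Python using the GEE implementation in statsmodels with an exchangeable working correlation. The data frame, column names and covariates are hypothetical stand-ins, not the study’s variables or data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant-visit.
df = pd.DataFrame({
    "participant_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "adherent":       [1, 0, 1, 0, 0, 0, 1, 1],   # cART adherence >= 95 %
    "food_insecure":  [0, 0, 1, 1, 1, 1, 0, 0],   # HFSSM >= 2 affirmatives
    "age":            [46, 47, 39, 40, 52, 53, 44, 45],
})

# Logistic model with GEE to account for repeated measures within participants
model = smf.gee(
    "adherent ~ food_insecure + age",
    groups="participant_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
res = model.fit()

# Exponentiate coefficients to obtain adjusted odds ratios with 95 % CIs
aor = np.exp(res.params)
ci = np.exp(res.conf_int())
print(pd.concat([aor.rename("aOR"), ci], axis=1))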
A new deep level transient spectroscopy (DLTS) technique, called half-width at variable intensity analysis, is described. This method uses the width and normalized intensity of a DLTS signal to determine the activation energy and capture cross section of the trap that generated the signal via a constant, k0. This constant relates the carrier emission rates giving rise to the differential capacitance signal associated with a given trap at two different temperatures: the temperature at which the maximum differential capacitance is detected, and an arbitrary temperature at which some nonzero differential capacitance signal is detected. The extracted activation energy of the detected trap center is then used along with the position of the peak maximum to extract the capture cross section.
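For context, k0 can be read against the standard textbook expression for the thermal emission rate of a trap; the LaTeX below restates that expression and the resulting two-temperature ratio. The identification of k0 with this ratio is our reading of the abstract, not a formula taken from the paper.

% Standard thermal emission rate of an electron trap:
%   sigma_n : capture cross section, <v_th> : thermal velocity (~ T^{1/2}),
%   N_c     : effective density of states (~ T^{3/2}), E_a : activation energy
\[
  e_n(T) = \sigma_n \langle v_{th} \rangle N_c
           \exp\!\left(-\frac{E_a}{k_B T}\right)
\]
% Ratio of emission rates at the peak temperature T_max and an arbitrary
% temperature T' (our reading of the abstract's k_0):
\[
  k_0 = \frac{e_n(T_{\max})}{e_n(T')}
      = \left(\frac{T_{\max}}{T'}\right)^{2}
        \exp\!\left[-\frac{E_a}{k_B}
        \left(\frac{1}{T_{\max}} - \frac{1}{T'}\right)\right]
\]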
Recent modelling estimates that up to two-thirds of new HIV infections among men who have sex with men occur within partnerships, indicating the importance of dyadic HIV prevention efforts. Although new interventions are available to promote dyadic health-enhancing behaviours, minimal research has examined what factors influence partners’ mutual engagement in these behaviours, a critical component of intervention success. Actor-partner interdependence modelling was used to examine associations between relationship characteristics and several dyadic outcomes theorised as antecedents to health-enhancing behaviours: planning and decision making, communication, and joint effort. Among 270 male-male partnerships, relationship satisfaction was associated with all three outcomes for actors (p = .02, .02 and .06, respectively). Latino men reported poorer planning and decision making (actor p = .032) and communication (partner p = .044). Alcohol use was negatively associated with all outcomes except actors’ planning and decision making (actors: p = .11, .038 and .004, respectively; partners: p = .03, .056 and .02, respectively). Having a sexual agreement was significantly associated with actors’ planning and decision making (p = .007) and communication (p = .008). Focusing on interactions between partners produces a more comprehensive understanding of male couples’ ability to engage in health-enhancing behaviours. This knowledge further identifies new and important foci for the tailoring of dyadic HIV prevention and care interventions.
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m−2. Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
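As an illustration of the curve form reported here, below is a minimal Python sketch that fits a three-parameter Weibull function to cumulative emergence against GDD for a single site-year and inverts it to find the GDD at 10 % emergence. The observations are invented for demonstration, and the random-effects structure of the actual mixed-effects analysis is omitted.

import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(gdd, M, b, c):
    """Cumulative emergence (%) as a function of growing degree days.
    M: asymptote (max emergence, %), b: scale (GDD), c: shape."""
    return M * (1.0 - np.exp(-(gdd / b) ** c))

# Hypothetical cumulative-emergence observations for one site-year
gdd = np.array([50, 100, 150, 200, 300, 400, 600, 800], dtype=float)
emerg = np.array([2, 12, 35, 55, 78, 90, 97, 99], dtype=float)

# Fixed-effects fit for a single site-year (the study pooled site-years
# with nonlinear mixed-effects models, which this sketch omits)
popt, _ = curve_fit(weibull_cdf, gdd, emerg, p0=[100, 200, 1.5])
M, b, c = popt

# GDD to reach 10 % of the asymptote: invert the Weibull CDF
gdd10 = b * (-np.log(1 - 0.10)) ** (1 / c)
print(f"M={M:.1f}%, b={b:.0f} GDD, c={c:.2f}; 10% emergence ~{gdd10:.0f} GDD")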
In 1964 (Solar Cycle 20; SC 20), Patrick McIntosh began creating hand-drawn synoptic maps of solar magnetic features, based on Hα images. These synoptic maps were unique in that they traced magnetic polarity inversion lines, and connected widely separated filaments, fibril patterns, and plage corridors to reveal the large-scale organization of the solar magnetic field. Coronal hole boundaries were later added to the maps, which were produced, more or less continuously, into 2009 (i.e., the start of SC 24). The result was a record of ~45 years (~570 Carrington rotations), or nearly four complete solar cycles of synoptic maps. We are currently scanning, digitizing and archiving these maps, with the final, searchable versions publicly available at NOAA's National Centers for Environmental Information. In this paper we present preliminary scientific studies using the archived maps from SC 23. We show the global evolution of closed magnetic structures (e.g., sunspots, plage, and filaments) in relation to open magnetic structures (e.g., coronal holes), and examine how both relate to the shifting patterns of large-scale positive and negative polarity regions.
Neonatal infections are usually classified according to time and mode of onset into three categories: (1) prenatal, (2) perinatal (early onset), and (3) nursery-acquired (late onset). The dividing line between early and late onset is usually set at 2 to 7 days of age (Table 94.1). Different investigators have divided early-onset from late-onset infections at different days of life, but most early-onset infections are evident during the first day of life. Infections that begin within the first month of life are considered neonatal, but many neonatal intensive care units provide continuing care for infants several months of age with complex problems that result from prematurity and complications of neonatal disorders. Therefore, neonatal nursery-associated infections may occur in infants up to a year or more of age. Bacterial infections with rapidly dividing, high-grade pathogens that begin substantially before birth usually result in stillbirth. It is often difficult to distinguish infections acquired shortly before birth from those acquired through contact with maternal vaginal, fecal, or skin flora during delivery.
Neonatal sepsis occurs in approximately 2 to 4 per 1000 live births in the United States; worldwide, reported rates vary from <2 to 50 per 1000 live births. The rate of early-onset sepsis has fallen to <1.0/1000 in the United States and Western Europe. The risk factors noted in Table 94.1 have a very strong predictive influence on infection rates. Full-term infants born without incident have a very low incidence of infection, lower than any other population of hospitalized patients. Infants susceptible to early-onset postnatal infections are primarily those born prematurely. Premature infants born to mothers with an infection, or whose membranes rupture more than 18 hours before delivery, may have an infection rate of 20% or more. In extremely premature infants, extra vigilance is required for early recognition and treatment of infection. Premature infants are much more likely than full-term infants to develop sepsis as a consequence of amnionitis caused by ascending infection.
Organic wheat and small grains are produced on relatively few acres in the inland Pacific Northwest. The objective of this study was to examine how the nitrogen (N) dynamics of cropping systems (CSs) used during the transition phase affected organic wheat yield and protein levels in the first 2 years of certified organic production. Certified organic spring wheat (SW) was produced in 2006 and winter wheat (WW) in 2007, following nine 3-year transitional CSs that were cereal-, small grain- or legume-intensive. SW and WW following perennial alfalfa + oat/pea forage or 3 years of legume green manure tended to be more productive than wheat following systems that contained a small grain crop for at least 1 year during the transition. In addition to increasing soil N, well-established stands of forage and green manure provided adequate cover to reduce weed establishment prior to organic production. Effective weed control strategies were as important as increasing soil inorganic N levels for improving organic wheat production. Choice of crop type, cultivar and rotation is important in organic wheat systems; in this study, WW had better stand establishment, stronger competition with weeds and higher overall yield than SW, and would be a better-suited class of wheat for organic production where spring weeds are the dominant problem. Regardless of CS or crop type, supplemental soil fertility (primarily N) during the organic production phase will be necessary to maintain high soil N levels and wheat yields in these dryland systems.
TAR DNA-binding protein 43 (TDP-43) has been identified as a major disease protein in frontotemporal lobar degeneration. More recently, TDP-43 proteinopathy has also been observed in Alzheimer's disease (AD) with a characteristic distribution of TDP-43 predominantly in the mesial temporal lobe, and to a lesser degree in the neocortical areas. AD subjects with psychotic symptoms (AD+P) represent a subgroup characterized by greater impairment of frontal cortex-dependent cognitive functions and more severe frontal cortical neuropathology. The aim of this study is to determine whether there is an association between TDP-43 pathology and AD+P. We hypothesized that TDP-43 pathology would be more frequent in AD+P than in AD without psychosis.
Methods:
We studied the presence and distribution of TDP-43 pathology by immunohistochemistry in the dentate gyrus (DG) and prefrontal cortex (FC) of postmortem brain specimens from 68 subjects with a primary neuropathologic diagnosis of AD as determined by the Neuropathology Core of the University of Pittsburgh Alzheimer's Disease Research Center.
Results:
Forty-five (66%) subjects were classified as AD+P. Fourteen (20.6%) subjects had TDP-43 pathology in DG, eight (11.8%) had TDP-43 pathology in FC, and six (8.8%) had TDP-43 pathology in both regions. TDP-43 in DG was not significantly associated with AD+P. However, TDP-43 in FC demonstrated a trend toward reduced likelihood of psychosis (p = 0.068). TDP-43 pathology in DG, but not FC, was significantly associated with greater age at death and longer duration of illness.
Conclusions:
Our findings indicate that there was no association between concomitant TDP-43 pathology in DG or FC and AD+P.
Disasters and the American State offers a thesis about the trajectory of federal government involvement in disaster preparedness, a trajectory shaped by contingent events. Politicians and bureaucrats claim credit for the government's successes in preparing for and responding to disaster, and they are also blamed for failures outside the government's control. New interventions have created precedents and established organizations and administrative cultures that accumulated over time, producing a general trend in which citizens, politicians and bureaucrats expect the government to provide more security from more kinds of disasters. The trend reached its peak when the Federal Emergency Management Agency adopted the idea of preparing for 'all hazards' as its mantra. Despite the rhetoric, however, the federal government's increasingly bold claims and heightened public expectations are disproportionate to its ability to prevent or reduce the damage caused by disaster.