Even though sub-Saharan African women spend millions of person-hours per day fetching water and pounding grain, to date, few studies have rigorously assessed the energy expenditure costs of such domestic activities. As a result, most analyses that consider head-hauling water or hand pounding of grain with a mortar and pestle (pilão use) employ energy expenditure values derived from limited research. The current paper compares estimated energy expenditure values from heart rate monitors v. indirect calorimetry in order to understand some of the limitations of using such monitors to measure domestic activities.
This confirmation study estimates the metabolic equivalent of task (MET) value for head-hauling water and hand-pounding grain using both indirect calorimetry and heart rate monitors under laboratory conditions.
The study was conducted in Nampula, Mozambique.
Participants were forty university students in Nampula city who recurrently engaged in water-fetching activities.
Including all participants, the mean MET value was 4·3 (sd 0·9) for head hauling 20 litres (20·5 kg, including container) of water (2·7 km/h, 0 % slope) and 3·7 (sd 1·2) for pilão use. Estimated energy expenditure predictions from a mixed model correlated with observed energy expenditure (r2 0·68, r 0·82). Re-estimating the model with pilão use data excluded improved the fit substantially (r2 0·83, r 0·91).
The current study finds that heart rate monitors are suitable instruments for providing accurate quantification of energy expenditure for some domestic activities, such as head-hauling water, but are not appropriate for quantifying expenditures of other activities, such as hand-pounding grain.
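The MET values above can be turned into rough absolute energy costs using the conventional definition of a MET (1 MET = 3·5 ml O2 per kg per min, roughly 1 kcal per kg per h). A minimal sketch, assuming an illustrative 60 kg body mass that is not taken from the study:

```python
# Rough energy-cost illustration using the conventional MET definition
# (1 MET = 3.5 ml O2/kg/min). The 60 kg body mass is an assumed
# illustrative value, not a figure reported by the study.

def kcal_per_min(met: float, body_mass_kg: float) -> float:
    """Standard approximation: kcal/min = MET * 3.5 * kg / 200."""
    return met * 3.5 * body_mass_kg / 200

water_hauling = kcal_per_min(4.3, 60)  # mean MET 4.3 for head-hauling water
pilao = kcal_per_min(3.7, 60)          # mean MET 3.7 for pilao use
print(f"head-hauling water: {water_hauling:.1f} kcal/min")
print(f"pilao use:          {pilao:.1f} kcal/min")
```

For a 60 kg person this works out to roughly 4·5 and 3·9 kcal/min respectively, which makes concrete why hours of daily water fetching represent a substantial energy burden.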
The Living Life to the Full courses are based on the Cognitive Behaviour Therapy (CBT) approach and are offered at Further Education Colleges and free of charge online (www.livinglifetothefull.com). The classes teach key skills such as identifying and challenging unhelpful thoughts and problem solving.
In the college course, the total mean score at baseline for knowledge questions was 8.20, increasing to 11.07 (gain = 2.87, p = .042). Self-assessed skills scored 24.00 at baseline, increasing to 34.20 at session 8 (mean difference = 10.20, p = .001). The Training Acceptability Rating Scale showed content scores of 77% at session 1, rising to 91% at session 8. Process scores were 73% at session 1, rising to 89%, indicating training acceptability throughout the course. The online course has over 15,000 registered users. Of these, 70% are clinical cases of anxiety (HAD scale) and 55% of depression. 24% of users are clinical cases who are not receiving any support from a practitioner. The site has had over 4 million hits in 10 months, averaging 1000 hits/hour.
Delivering CBT in this way seems to lead to gains in mental health literacy. Such courses may provide another useful option for helping people access CBT for mild to moderate problems of distress. An RCT of the core course materials has just been completed.
Transnationalism poses a serious challenge to mental health care, especially given the crucial role of communication. Emergency room interactions offer an opportunity to analyze the role of cultural competency among providers and how they relate to immigrants in the clinical encounter.
This study addresses three aims: to assess the level of provider-perceived accuracy of diagnoses; to evaluate the use of restraints; and to compare diagnosis rates between patients of diverse racial/ethnic groups.
We examined patients’ race/ethnicity and their relation to service use and perceived certainty of mental health diagnoses. Three hundred and forty-seven migrants and 67 natives as well as their providers were interviewed in psychiatry emergency rooms in Barcelona (Spain).
The perceived certainty of clinical diagnosis is lower for Asian patients (OR = 0.2, 95% CI [0.07–0.63]) and higher when the clinician feels comfortable with the patient (OR = 5.41, 95% CI [2.53–11.58]). The probability of restraint use is higher for Maghrebi patients than for the native-born (OR = 3.56, 95% CI [1.03–12.26]). The probability of compulsory admission is lower for Latinos than for the native-born (OR = 0.26, 95% CI [0.08–0.88]). The probability of receiving a diagnosis of psychosis is lower when the clinician can communicate in the patient's language (OR = 0.37, 95% CI [0.16–0.83]).
Cultural factors such as level of comfort and communication in the patient's language play a central role in diagnosis and treatment. This study highlights the importance of culture in psychiatric diagnosis and the role of cultural competency for mental health providers.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Scientific quality and feasibility are part of ethics review by Institutional Review Boards (IRBs). Scientific Review Committees (SRCs) were proposed to facilitate this assessment by the Clinical and Translational Science Award (CTSA) SRC Consensus Group. This study assessed SRC feasibility and impact at CTSA-affiliated academic health centers (AHCs).
SRC implementation at 10 AHCs was assessed pre/post-intervention using quantitative and qualitative methods. Pre-intervention, four AHCs had no SRC, and six had at least one SRC needing modifications to better align with Consensus Group recommendations.
Facilitators of successful SRC implementation included broad-based communication, an external motivator, senior-level support, and committed SRC reviewers. Barriers included limited resources and staffing, variable local mandates, limited SRC authority, lack of anticipated benefit, and operational challenges. Research protocol quality did not differ significantly between study periods, but respondents suggested positive effects. During intervention, median total review duration did not lengthen for the 40% of protocols approved within 3 weeks. For the 60% under review after 3 weeks, review was lengthened primarily due to longer IRB review for SRC-reviewed protocols. Site interviews recommended designing locally effective SRC processes, building buy-in by communication or by mandate, allowing time for planning and sharing best practices, and connecting SRC and IRB procedures.
The CTSA SRC Consensus Group recommendations appear feasible. Although not conclusive in this relatively short initial implementation, sites perceived positive impact by SRCs on study quality. Optimal benefit will require local or federal mandate for implementation, adapting processes to local contexts, and employing SRC stipulations.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. All together, these programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer with a ‘slew’ time of less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid-response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and a 100 MHz pulse is of the order of 1–10 min for dispersion measures of 100–2000 pc/cm3. The MWA has previously been used to provide fast follow-up of transient events, including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network (GCN) packet-based notifications. We describe a system for automatically triggering MWA observations of such events based on Virtual Observatory Event standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, making it well suited to searching for prompt coherent radio emission from GRBs, studying FRBs and gravitational waves, single-pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system can trigger observations in both the regular correlator mode (limited to ≥0.5 s integrations) and the Voltage Capture System (VCS, 0.1 ms integration) of the MWA, and represents a new mode of operation for the MWA. The upgraded standard correlator triggering capability has been in use since MWA observing semester 2018B (July–December 2018), and the VCS and buffered-mode triggers will become available for observing in a future semester.
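The quoted 1–10 min delay window follows from the standard cold-plasma dispersion relation, Δt ≈ 4.149 ms × DM × (ν_lo⁻² − ν_hi⁻²) with frequencies in GHz. A minimal sketch of that arithmetic:

```python
# Back-of-envelope check of the dispersion delay quoted in the text,
# using the standard approximation
#   dt ~= 4.149 ms * DM * (nu_lo^-2 - nu_hi^-2),   nu in GHz.

def dispersion_delay_s(dm_pc_cm3: float, nu_lo_ghz: float, nu_hi_ghz: float) -> float:
    """Delay (seconds) of the low-frequency arrival relative to the high."""
    return 4.149e-3 * dm_pc_cm3 * (nu_lo_ghz ** -2 - nu_hi_ghz ** -2)

# Delay between 1 GHz and 100 MHz for the quoted DM range (100-2000 pc/cm^3):
for dm in (100, 2000):
    dt = dispersion_delay_s(dm, 0.1, 1.0)
    print(f"DM = {dm:>4} pc/cm^3 -> {dt / 60:.1f} min")
```

This gives roughly 0.7 min at DM = 100 and about 14 min at DM = 2000 pc/cm3, consistent with the order-of-magnitude window that makes an 8 s slew time useful for catching dispersed pulses.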
The WHO African region is characterised by the largest infectious disease burden in the world. We conducted a retrospective descriptive analysis using records of all infectious disease outbreaks formally reported to the WHO in 2018 by Member States of the African region. We analysed the spatio-temporal distribution, the notification delay as well as the morbidity and mortality associated with these outbreaks. In 2018, 96 new disease outbreaks were reported across 36 of the 47 Member States. The most commonly reported disease outbreak was cholera, which accounted for 20.8% (n = 20) of all events, followed by measles (n = 11, 11.5%) and yellow fever (n = 7, 7.3%). About a quarter of the outbreaks (n = 23) were reported following signals detected through media monitoring conducted at the WHO Regional Office for Africa. The median delay between the disease onset and WHO notification was 16 days (range: 0–184). A total of 107 167 people were directly affected, including 1221 deaths (mean case fatality ratio (CFR): 1.14% (95% confidence interval (CI) 1.07%–1.20%)). The highest CFR was observed for diseases targeted for eradication or elimination: 3.45% (95% CI 0.89%–10.45%). The African region remains prone to outbreaks of infectious diseases. It is therefore critical that Member States improve their capacities to rapidly detect, report and respond to public health events.
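The overall CFR and its interval can be approximately reproduced from the raw counts. A sketch using a simple normal (Wald) approximation; the WHO analysis may have used a different interval method, so the bounds differ slightly in the last digit:

```python
import math

# Back-of-envelope check of the reported case fatality ratio (1.14%,
# 95% CI 1.07%-1.20%) from the raw counts, using a normal (Wald)
# approximation for the binomial proportion.

deaths, affected = 1221, 107167
cfr = deaths / affected
se = math.sqrt(cfr * (1 - cfr) / affected)
lo, hi = cfr - 1.96 * se, cfr + 1.96 * se
print(f"CFR = {cfr:.2%} (95% CI {lo:.2%}-{hi:.2%})")
```

This yields a CFR of 1.14% with an approximate interval of 1.08%–1.20%, close to the figures reported in the abstract.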
Residual herbicides applied to summer cash crops have the potential to injure subsequent winter annual cover crops, yet little information is available to guide growers’ choices. Field studies were conducted in 2016 and 2017 in Blacksburg and Suffolk, Virginia, to determine carryover of 30 herbicides commonly used in corn, soybean, or cotton on wheat, barley, cereal rye, oats, annual ryegrass, forage radish, Austrian winter pea, crimson clover, hairy vetch, and rapeseed cover crops. Herbicides were applied to bare ground either 14 wk before cover crop planting for a PRE timing or 10 wk before for a POST timing. Visible injury was recorded 3 and 6 wk after planting (WAP), and cover crop biomass was collected 6 WAP. There were no differences observed in cover crop biomass among herbicide treatments, despite visible injury that suggested some residual herbicides have the potential to affect cover crop establishment. Visible injury on grass cover crop species did not exceed 20% from any herbicide. Fomesafen resulted in the greatest injury recorded on forage radish, with greater than 50% injury in 1 site-year. Trifloxysulfuron and atrazine resulted in greater than 20% visible injury on forage radish. Trifloxysulfuron resulted in the greatest injury (30%) observed on crimson clover in 1 site-year. Prosulfuron and isoxaflutole significantly injured rapeseed (17% to 21%). Results indicate that commonly used residual herbicides applied in the previous cash crop growing season result in little injury on grass cover crop species, and only a few residual herbicides could potentially affect the establishment of a forage radish, crimson clover, or rapeseed cover crop.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.