OBJECTIVES/GOALS: The goal of this study was to understand the impact of a high-sodium diet on gene networks in the kidney that correlate with blood pressure (BP) in female primates, and to translate findings to women. METHODS/STUDY POPULATION: Sodium-naïve female baboons (n=7) were fed a low-sodium (LS) diet for 6 weeks followed by a high-sodium (HS) diet for 6 weeks. Sodium intake, serum 17 beta-estradiol, and ultrasound-guided kidney biopsies for RNA-Seq were collected at the end of each diet. Blood pressure was continuously measured for 64-hour periods throughout the study by implantable telemetry devices. Weighted gene co-expression network analysis was performed on RNA-Seq data to identify transcripts correlated with BP on each diet. Network analysis was performed on transcripts highly correlated with BP, and in silico findings were validated by immunohistochemistry of kidney tissues. RESULTS/ANTICIPATED RESULTS: On the LS diet, Na+ intake and serum 17 beta-estradiol concentration correlated with BP. Cell type composition of renal biopsies was consistent among all animals for both diets. Kidney transcriptomes differed by diet; unbiased weighted gene co-expression network analysis revealed modules of genes correlated with BP on the HS diet. Network analysis of module genes showed causal networks linking hormone receptors, proliferation and differentiation, methylation, hypoxia, insulin and lipid regulation, and inflammation as regulators underlying variation in BP on the HS diet. Our results show that variation in BP correlated with novel kidney gene networks with master regulators PPARG and MYC in female baboons on an HS diet. DISCUSSION/SIGNIFICANCE: Previous studies in primates to identify molecular networks dysregulated by an HS diet focused on males. Current clinical guidelines do not offer sex-specific treatment plans for sodium-sensitive hypertension.
This study leveraged variation in BP as a first step to identify correlated kidney regulatory gene networks in female primates after an HS diet.
Theories of early cooperation in human society often draw from a small sample of ethnographic studies of surviving populations of hunter–gatherers, most of which are now sedentary. Borneo hunter–gatherers (Punan, Penan) have seldom figured in comparative research because of a decades-old controversy about whether they are the descendants of farmers who adopted a hunting and gathering way of life. In 2018 we began an ethnographic study of a group of still-nomadic hunter–gatherers who call themselves Punan Batu (Cave Punan). Our genetic analysis clearly indicates that they are very unlikely to be the descendants of neighbouring agriculturalists. They also preserve a song language that is unrelated to other languages of Borneo. Dispersed travelling groups of Punan Batu with fluid membership use message sticks to stay in contact, co-operate and share resources as they journey between rock shelters and forest camps. Message sticks were once widespread among nomadic Punan in Borneo, but have largely disappeared in sedentary Punan villages. Thus the small community of Punan Batu offers a rare glimpse of a hunting and gathering way of life that was once widespread in the forests of Borneo, where prosocial behaviour extended beyond the face-to-face community, facilitating successful collective adaptation to the diverse resources of Borneo's forests.
Optical tracking systems typically trade off between astrometric precision and field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers that have relatively low astrometric precision, on the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 Sep 2017. As part of a trajectory designed to take OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation of the observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the capability of a networked approach to Space Surveillance and Tracking.
Parasites sometimes expand their host range and cause new disease aetiologies. Genetic changes can then occur due to host-specific adaptive alterations, particularly when parasites cross between evolutionarily distant hosts. Characterizing genetic variation in Cryptosporidium from humans and other animals may have important implications for understanding disease dynamics and transmission. We analysed sequences from four loci (gp60, HSP-70, COWP and actin) representing multiple Cryptosporidium species reported in humans. We predicted low genetic diversity in species that present unusual human infections, due to founder events and bottlenecks. High genetic diversity was observed in human isolates of Cryptosporidium meleagridis, Cryptosporidium cuniculus, Cryptosporidium hominis and Cryptosporidium parvum. Deviations from neutral expectations, assessed using Tajima's D, were observed in C. cuniculus and C. meleagridis. The high genetic diversity in C. meleagridis and C. cuniculus did not match our expectations, but the deviations from neutrality indicate a recent decrease in genetic variability through a population bottleneck following an expansion event. Cryptosporidium hominis also showed a significantly positive Tajima's D value, likely caused by recent population expansion of unusual genotypes in humans. These insights indicate that changes in genetic diversity can help us to understand host–parasite adaptation and evolution.
We critically review the potential involvement of trimethylamine N-oxide (TMAO) as a link between diet, the gut microbiota and CVD. Generated primarily from dietary choline and carnitine by gut bacteria and hepatic flavin-containing mono-oxygenase (FMO) activity, TMAO could promote cardiometabolic disease when chronically elevated. However, control of circulating TMAO is poorly understood, and diet, age, body mass, sex hormones, renal clearance, FMO3 expression and genetic background may explain as little as 25 % of TMAO variance. The basis of elevations with obesity, diabetes, atherosclerosis or CHD is similarly ill-defined, although gut microbiota profiles/remodelling appear critical. Elevated TMAO could promote CVD via inflammation, oxidative stress, scavenger receptor up-regulation, reverse cholesterol transport (RCT) inhibition, and cardiovascular dysfunction. However, the concentrations that influence inflammation, scavenger receptors and RCT (≥100 µM) are achieved only in advanced heart failure or chronic kidney disease (CKD), and greatly exceed the <1–5 µM levels implicated in some TMAO–CVD associations. There is also evidence that CVD risk is insensitive to TMAO variance beyond these levels in omnivores and vegetarians, and that major TMAO sources are cardioprotective. Assessing the available evidence suggests that modest elevations in TMAO (≤10 µM) are a non-pathogenic consequence of diverse risk factors (ageing, obesity, dyslipidaemia, insulin resistance/diabetes, renal dysfunction), indirectly reflecting CVD risk without participating mechanistically. Nonetheless, TMAO may surpass a pathogenic threshold as a consequence of CVD/CKD, secondarily promoting disease progression. TMAO might thus reflect early CVD risk while providing a prognostic biomarker or secondary target in established disease, although mechanistic contributions to CVD await confirmation.
We present 0.″2–0.″4 resolution ALMA images of the submillimeter dust continuum and the CO, H2O, and H2O+ line emission in a z = 3.63 strongly lensed dusty starburst. We construct the lens model for the system with an MCMC technique. While the average magnification for the dust continuum is about 11, the magnification of the line emission varies from 5 to 22 across the source, resolving the source down to sub-kpc scales. The ISM content reveals that the system is a pre-coalescence major merger of two ultra-luminous infrared galaxies, both with large molecular gas reservoirs. The approaching galaxy in the south shows no apparent kinematic structure, with a half-light radius of 0.4 kpc, while the receding one resembles a 1.2 kpc rotating disk; the two are separated by a projected distance of 1.3 kpc. The distribution of dust and gas emission suggests a large amount of cold ISM concentrated in the interacting region.
Disparities exist among Latino smokers with respect to knowledge and access to smoking cessation resources. This study tested the feasibility of using case management (CM) to increase access to pharmacotherapy and quitlines among Latino smokers.
Latino smokers were randomized to CM (n = 40) or standard care (SC, n = 40). All participants received educational materials describing how to utilize pharmacy assistance for cessation pharmacotherapy and connect with quitlines. CM participants received four phone calls from staff to encourage pharmacotherapy and quitline use. At 6-month follow-up, we assessed the utilization of pharmacotherapy and quitlines. Additional outcomes included self-reported smoking status and approval for pharmacotherapy assistance.
Using intention-to-treat analysis, CM produced higher utilization than SC of both pharmacotherapy (15.0% versus 2.5%; P = 0.108) and quitlines (12.5% versus 5.0%; P = 0.432), although these differences were not statistically significant. Approval for pharmacotherapy assistance programs (20.0% versus 0.0%; P = 0.0005) was significantly higher for CM than for SC participants. Self-reported point-prevalence smoking abstinence at 6 months was 20.0% for CM and 17.5% for SC (P = 0.775).
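The two non-significant comparisons above can be reproduced from the raw counts (with n = 40 per arm, 15.0% vs 2.5% corresponds to 6/40 vs 1/40, and 12.5% vs 5.0% to 5/40 vs 2/40) using a two-sided Fisher's exact test. The abstract does not state which test was used, so treating these as Fisher's exact comparisons is an assumption; a minimal stdlib-only sketch:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables whose point
    probability is no greater than that of the observed table.
    """
    n1, n2, k = a + b, c + d, a + c          # row totals and first-column total
    total = comb(n1 + n2, k)

    def p(x):                                 # P(first cell == x)
        return comb(n1, x) * comb(n2, k - x) / total

    p_obs = p(a)
    return sum(p(x) for x in range(max(0, k - n2), min(k, n1) + 1)
               if p(x) <= p_obs * (1 + 1e-9))  # tolerance for float ties

# Pharmacotherapy use: 6/40 (15.0%) CM vs 1/40 (2.5%) SC
print(round(fisher_exact_two_sided(6, 34, 1, 39), 3))   # 0.108
# Quitline use: 5/40 (12.5%) CM vs 2/40 (5.0%) SC
print(round(fisher_exact_two_sided(5, 35, 2, 38), 3))   # 0.432
```

Both values match the reported P values, consistent with an exact test having been used on these counts.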
CM holds promise as an effective intervention to connect Latino smokers to evidence-based cessation treatment.
Acute flank pain from suspected urolithiasis is a common presenting complaint in the Emergency Department. Computed tomography (CT) has traditionally been the standard imaging modality used to diagnose obstructive kidney stones; however, point-of-care ultrasound (PoCUS) can play an important role in the diagnostic algorithm and risk stratification of acute flank pain. Here, we present the case of a 29-year-old female with suspected urolithiasis who underwent PoCUS, which revealed right-sided hydronephrosis and a normal left kidney, bladder, and aorta. A subsequent KUB radiograph was negative. As the clinical course failed to improve with therapy, an abdominal and pelvic CT was ordered, revealing a 5 mm obstructing calculus at the right vesico-ureteric junction and another 5 mm calculus in the left mid-ureter. To the best of our knowledge, this is the first reported case in which a patient presenting with acute right-sided flank pain demonstrated unilateral hydronephrosis on PoCUS but had clinically significant bilateral ureteric stones on CT. Emergency physicians who employ PoCUS for evaluation of flank pain must be aware of its benefits and drawbacks and how they apply to each patient. As such, we have developed a script emergency physicians can use for shared decision-making with renal colic patients when deciding on the appropriate imaging modality.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). Current established protocols (e.g. RUSH and ACES) were developed by expert user opinion rather than objective, prospective data. Recently the SHoC Protocol was published, recommending 3 core scans (cardiac, lung, and IVC), plus other scans when indicated clinically. We report the abnormal ultrasound findings from our international multicenter randomized controlled trial, to assess whether the 3 recommended core SHoC protocol scans were chosen appropriately for this population. Methods: Recruitment occurred at seven centres in North America (4) and South Africa (3). Screening at triage identified patients (SBP<100 or shock index>1) who were randomized to PoCUS or control (standard care with no PoCUS) groups. All scans were performed by PoCUS-trained physicians within one hour of arrival in the ED. Demographics, clinical details and study findings were collected prospectively. A threshold incidence of 10% for positive findings was established as significant for the purposes of assessing the appropriateness of the core recommendations. Results: 138 patients had a PoCUS screen completed. All patients had cardiac, lung, IVC, aorta, abdominal, and pelvic scans. Reported abnormal findings included hyperdynamic LV function (59; 43%); small collapsing IVC (46; 33%); pericardial effusion (24; 17%); pleural fluid (19; 14%); hypodynamic LV function (15; 11%); large poorly collapsing IVC (13; 9%); peritoneal fluid (13; 9%); and aortic aneurysm (5; 4%). Conclusion: The 3 core SHoC Protocol recommendations included appropriate scans to detect all pathologies recorded at a rate greater than 10%. The 3 most frequent findings were cardiac and IVC abnormalities, followed by lung. Of note, peritoneal fluid was seen at a rate of 9%. Aortic aneurysms were rare.
These data, from the first RCT to compare PoCUS with standard care for undifferentiated hypotensive ED patients, support the use of the prioritized SHoC protocol, though a larger study is required to confirm these findings.
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 or shock index>1), randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data by Student's t-test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality; PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. Control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately.
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefits with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a Shock Index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. 4 North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED, changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t-test as well as stratified binomial log-regression to assess for any significant improvement among the resuscitation outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95%CI 1365-1950) and PoCUS groups (1609 ml; 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients.
No significant difference in fluid administered or in markers of resuscitation was found when comparing the use of a PoCUS protocol with standard care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and improve diagnostic certainty; however, evidence of benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's exact test (two-tailed). Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group 20/127 (15.7%) vs. control 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups PoCUS 39/123 (31.7%) vs control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis; PoCUS 90/127 (70%) vs control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00).
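The headline risk ratio above can be checked from the raw counts (20/127 vs 7/125). The abstract does not state the CI method used, so the Katz log-interval below is an assumption; a minimal sketch:

```python
from math import exp, log, sqrt

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Risk ratio with a Katz log-interval 95% CI.

    a/n1 = events/total in the exposed group, c/n2 in the control group.
    """
    rr = (a / n1) / (c / n2)
    se = sqrt(1/a - 1/n1 + 1/c - 1/n2)   # SE of log(RR)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# Change in perceived shock category: PoCUS 20/127 vs control 7/125
rr, lo, hi = risk_ratio_ci(20, 127, 7, 125)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # RR 2.81 (95% CI 1.23 to 6.41)
```

This reproduces the reported point estimate and lower bound exactly; the upper bound (6.41 vs the reported 6.42) differs only in the last rounded digit, plausibly from a slightly different CI method or rounding.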
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians’ perceived shock category. PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
The public health threat posed by zoonotic Plasmodium knowlesi appears to be growing: it is increasingly reported across South East Asia, and is the leading cause of malaria in Malaysian Borneo. Plasmodium knowlesi threatens progress towards malaria elimination as aspects of its transmission, such as spillover from wildlife reservoirs and reliance on outdoor-biting vectors, may limit the effectiveness of conventional methods of malaria control. The development of new quantitative approaches that address the ecological complexity of P. knowlesi, particularly through a focus on its primary reservoir hosts, will be required to control it. Here, we review what is known about P. knowlesi transmission, identify key knowledge gaps in the context of current approaches to transmission modelling, and discuss the integration of these approaches with clinical parasitology and geostatistical analysis. We highlight the need to incorporate the influences of fine-scale spatial variation, rapid changes to the landscape, and reservoir population and transmission dynamics. The proposed integrated approach would address the unique challenges posed by malaria as a zoonosis, aid the identification of transmission hotspots, provide insight into the mechanistic links between incidence and land use change and support the design of appropriate interventions.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupation stress widely considered responsible for the recent rise in the military suicide rate.
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than never deployed (14.5/100 000 person-years), resulting in the adjusted suicide rate of infantrymen and combat engineers being most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
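As an illustrative consistency check (not part of the study), the crude rate and suicide count above imply the total person-time observed, which can be compared against the cohort size:

```python
# Back out the person-time implied by 496 suicides at 22.4 per 100,000
# person-years, and spread it over the 729,337 soldiers in the cohort.
suicides = 496
rate_per_100k = 22.4

person_years = suicides / rate_per_100k * 100_000
print(round(person_years))                 # 2214286 person-years

avg_years_per_soldier = person_years / 729_337
print(round(avg_years_per_soldier, 1))     # 3.0
```

An average of about 3 years of observation per soldier is consistent with enlistment turnover across the 2004–2009 study window.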
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
During improved oil recovery (IOR), gas may be introduced into a porous reservoir filled with surfactant solution in order to form foam. A model for the evolution of the resulting foam front known as ‘pressure-driven growth’ is analysed. An asymptotic solution of this model for long times is derived that shows that foam can propagate indefinitely into the reservoir without gravity override. Moreover, ‘pressure-driven growth’ is shown to correspond to a special case of the more general ‘viscous froth’ model. In particular, it is a singular limit of the viscous froth, corresponding to the elimination of a surface tension term, permitting sharp corners and kinks in the predicted shape of the front. Sharp corners tend to develop from concave regions of the front. The principal solution of interest has a convex front, however, so that although this solution itself has no sharp corners (except for some kinks that develop spuriously owing to errors in a numerical scheme), it is found nevertheless to exhibit milder singularities in front curvature, as the long-time asymptotic analytical solution makes clear. Numerical schemes for the evolving front shape which perform robustly (avoiding the development of spurious kinks) are also developed. Generalisations of this solution to geologically heterogeneous reservoirs should exhibit concavities and/or sharp corner singularities as an inherent part of their evolution: propagation of fronts containing such ‘inherent’ singularities can be readily incorporated into these numerical schemes.