Background

Recently, artificial intelligence-powered devices have been put forward as potentially powerful tools for the improvement of mental healthcare. An important question is how these devices impact the physician–patient interaction.
Aims
Aifred is an artificial intelligence-powered clinical decision support system (CDSS) for the treatment of major depression. Here, we explore the use of a simulation centre environment in evaluating the usability of Aifred, particularly its impact on the physician–patient interaction.
Method
Twenty psychiatry and family medicine attending staff and residents were recruited to complete a 2.5-h study at a clinical interaction simulation centre with standardised patients. Each physician had the option of using the CDSS to inform their treatment choice in three 10-min clinical scenarios with standardised patients portraying mild, moderate and severe episodes of major depression. Feasibility and acceptability data were collected through self-report questionnaires, scenario observations, interviews and standardised patient feedback.
Results
All 20 participants completed the study. Initial results indicate that the tool was acceptable to clinicians and feasible for use during clinical encounters. Clinicians indicated a willingness to use the tool in real clinical practice, a significant degree of trust in the system's predictions to assist with treatment selection, and reported that the tool helped increase patient understanding of and trust in treatment. The simulation environment allowed for the evaluation of the tool's impact on the physician–patient interaction.
Conclusions
The simulation centre allowed for direct observation of clinicians' use of the tool and of its impact on the physician–patient interaction before clinical studies. It may therefore offer a useful and important environment for the early testing of new technological tools. The present results will inform further tool development and clinician training materials.
Intimations of death often cause symptoms or syndromes of mental or emotional disorder, of which anxiety and depression are the most common. It is entirely appropriate, therefore, for mental health services to be involved with those who care for the dying in whatever setting. Old age psychiatrists have recognized their more direct role in caring for dying patients for many years. And some years ago a particular impetus emerged behind the notion of palliative care in dementia. This followed the early research by Ladislav Volicer and his colleagues in the dementia special care unit (DSCU) as part of the Geriatrics Research Education and Clinical Center at the E.N. Rogers Memorial Veterans Hospital in Bedford, Massachusetts. Since then there has been a burgeoning both in the field and in the literature. Gradually, different sorts of ways to provide palliative care for people with dementia have also emerged.
Palmer amaranth (Amaranthus palmeri S. Watson) populations resistant to acetolactate synthase (ALS)-inhibiting herbicides and glyphosate are fairly common throughout the state of North Carolina (NC). This has led farm managers to rely more heavily on herbicides with other sites of action (SOA) for A. palmeri control, especially protoporphyrinogen oxidase and glutamine synthetase inhibitors. In the fall of 2016, seeds from A. palmeri populations were collected from the NC Coastal Plain, the state’s most prominent agricultural region. In separate experiments, plants with 2 to 4 leaves from the 110 populations were treated with field use rates of glyphosate, glufosinate-ammonium, fomesafen, mesotrione, or thifensulfuron-methyl. Percent visible control and survival were evaluated 3 wk after treatment. Survival frequencies were highest following glyphosate (99%) or thifensulfuron-methyl (96%) treatment. Known mutations conferring resistance to ALS inhibitors were found in populations surviving thifensulfuron-methyl application (Ala-122-Ser, Pro-197-Ser, Trp-574-Leu, and/or Ser-653-Asn), in addition to a new mutation (Ala-282-Asp) that requires further investigation. Forty-two populations had survivors after mesotrione application, with one population having 17% survival. Four populations survived fomesafen treatment, while none survived glufosinate. Dose–response studies showed an increase in fomesafen needed to kill 50% of two populations (LD50); however, these rates were far below the field use rate (less than 5 g ha−1). In two populations following mesotrione dose–response studies, a 2.4- to 3.3-fold increase was noted, with LD90 values approaching the field use rate (72.8 and 89.8 g ha−1). Screening of the progeny of individuals surviving mesotrione confirmed the presence of resistance alleles, as there were a higher number of survivors at the 1X rate compared with the parent population, confirming resistance to mesotrione. These data suggest A. 
palmeri populations resistant to chemistries other than glyphosate and thifensulfuron-methyl are present in NC, which highlights the need for weed management approaches to mitigate the evolution and spread of herbicide-resistant populations.
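For context on the dose–response analysis, an LD50 is typically estimated by fitting a sigmoidal survival model to dose–mortality data. The sketch below is illustrative only (the abstract does not state which model was used): it assumes a two-parameter log-logistic model with hypothetical doses and parameters, linearized via the logit transform and fit with ordinary least squares.

```python
import numpy as np

# Two-parameter log-logistic survival model (an assumption, not the
# study's stated model):  p(dose) = 1 / (1 + (dose / LD50)**b)
# Linearization:          logit(p) = -b*log(dose) + b*log(LD50)

dose = np.array([0.5, 1, 2, 4, 8, 16])          # hypothetical rates, g ha^-1
true_ld50, true_b = 3.0, 1.8                     # hypothetical parameters
p = 1.0 / (1.0 + (dose / true_ld50) ** true_b)   # noise-free survival fractions

logit = np.log(p / (1.0 - p))
slope, intercept = np.polyfit(np.log(dose), logit, 1)

b_hat = -slope                       # recovered slope parameter
ld50_hat = np.exp(intercept / b_hat) # recovered LD50
print(round(ld50_hat, 2))            # recovers 3.0
```

With noise-free data the fit recovers the parameters exactly; with real survival counts, a dedicated dose–response package with binomial error structure would be the standard choice.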
The Advanced Cardiac Life Support (ACLS) guidelines were recently updated to include ultrasound confirmation of endotracheal tube (ETT) location as an adjunctive tool to verify placement. While this method is employed in the emergency department under the guidance of the most recent American College of Emergency Physicians (ACEP; Irving, Texas USA) guidelines, it has yet to gain wide acceptance in the prehospital setting, where it has the potential for greater impact. The objective of this study was to determine whether training critical care medics using simulation is a feasible and reliable way to teach this skill.
Methods:
Twenty critical care paramedics with no previous experience with point-of-care ultrasound volunteered for advanced training in prehospital ultrasound. Four ultrasound-fellowship-trained emergency physicians proctored two three-hour training sessions. Each session included a brief introduction to ultrasound “knobology,” normal sonographic neck and lung anatomy, and how to identify ETT placement within the trachea or esophagus. Immediately afterward, the paramedics were tested with five simulated case scenarios using pre-obtained images that demonstrated a correctly placed ETT, an esophageal intubation, a bronchial intubation, and an improperly functioning ETT. Their accuracy, time to respond, and comfort with using ultrasound were all assessed.
Results:
All 20 critical care medics completed the training and testing session. During the five scenarios, 37/40 (92.5%) identified the correct endotracheal placements, 18/20 (90.0%) identified the esophageal intubations, 18/20 (90.0%) identified the bronchial intubation, and 20/20 (100.0%) identified the ETT malfunctions correctly. The average time to diagnosis was 10.6 seconds for proper placement, 15.5 seconds for esophageal, 15.6 seconds for bronchial intubation, and 11.8 seconds for ETT malfunction.
Conclusions:
The use of ultrasound to confirm ETT placement can be effectively taught to critical care medics using a short, simulation-based training session. Further studies on implementation into patient care scenarios are needed.
Overreliance on herbicides for weed control has led to the evolution of herbicide-resistant Palmer amaranth populations. Farm managers should consider the long-term consequences of their short-term management decisions, especially when considering the soil weed seedbank. The objectives of this research were to (1) determine how soybean population and POST herbicide application timing affect in-season Palmer amaranth control and soybean yield, and (2) determine how those variables influence Palmer amaranth densities and cotton yields the following season. Soybeans were planted (19-cm row spacing) at low-, medium-, and high-density populations (268,000, 546,000, and 778,000 plants ha–1, respectively). Fomesafen and clethodim (280 and 210 g ai ha–1, respectively) were applied at the VE, V1, or V2 to V3 soybean growth stage. Nontreated plots were also included to assess the effect of soybean population alone. The following season, cotton was planted into these plots to assess the effects of soybean planting population on Palmer amaranth densities in the subsequent crop. When an herbicide application occurred at the V1 or V2 to V3 soybean stage, weed control in the high-density soybean population increased 17% to 23% compared to the low-density population. Economic return was not influenced by soybean population and was increased 72% to 94% with herbicide application compared to no treatment. In the subsequent cotton crop, Palmer amaranth densities were 24% to 39% lower 3 wk after planting when following soybean sprayed with herbicides compared to soybean without herbicides. Additionally, Palmer amaranth densities in cotton were 19% lower when soybean was treated at the VE stage compared to later stages. Thus, increasing soybean population can improve Palmer amaranth control without adversely affecting economic returns and can reduce future weed densities. Reducing the weed seedbank and selection pressure from herbicides are critical in mitigating resistance evolution.
The effects of plant phenology and canopy structure of four crops and four weed species on reflectance spectra were evaluated in 2016 and 2017 using in situ spectroscopy. Leaf-level and canopy-level reflectance were collected at multiple phenologic time points in each growing season. Reflectance values at 2 wk after planting (WAP) in both years indicated strong spectral differences between species across the visible (VIS; 350–700 nm), near-infrared (NIR; 701–1,300 nm), shortwave-infrared I (SWIR1; 1,301–1,900 nm), and shortwave-infrared II (SWIR2; 1,901–2,500 nm) regions. Results from this study indicate that plant spectral reflectance changes with plant phenology and is influenced by plant biophysical characteristics. Canopy-level differences were detected in both years across all dates except for 1 WAP in 2017. Species with similar canopy types (e.g., broadleaf prostrate, broadleaf erect, or grass/sedge) were more readily discriminated from species with different canopy types. Asynchronous phenology between species also resulted in spectral differences. SWIR1 and SWIR2 wavelengths are often not included in multispectral sensors but should be considered for species differentiation. Results from this research indicate that wavelengths in SWIR1 and SWIR2, in conjunction with VIS and NIR reflectance, can provide differentiation across plant phenologies and, therefore, should be considered for use in future sensor technologies for species differentiation.
In recent years, there has been increased use of dicamba due to the introduction of dicamba-resistant cotton and soybean in the United States. Therefore, there is a potential increase in off-target movement of dicamba and injury to sensitive crops. Flue-cured tobacco is extremely sensitive to auxin herbicides, particularly dicamba. In addition to yield loss, residue from drift or equipment contamination can have severe repercussions for the marketability of the crop. Studies were conducted in 2016, 2017, and 2018 in North Carolina to evaluate spray-tank cleanout efficiency of dicamba using various cleaning procedures. No difference in dicamba recovery was observed regardless of dicamba formulation and cleaning agent. Dicamba residue decreased with the number of rinses. There was no difference in dicamba residue recovered from the third rinse compared with residue from the tank after being refilled for subsequent tank use. Recovery ranged from 2% to 19% of the original concentration rate among the three rinses. Field studies were also conducted in 2018 to evaluate flue-cured tobacco response to reduced rates of dicamba ranging from 1/5 to 1/10,000 of a labeled rate. Injury and yield reductions varied by environment and application timing. When exposed to 1/500 of a labeled rate at 7 and 11 wk after transplanting, tobacco injury 24 d after application ranged from 39% to 53% and 10% to 16%, respectively. The maximum yield reduction was 62%, with a 55% reduction in value, when tobacco was exposed to 112 g ha−1 of dicamba. Correlations showed significant relationships between crop injury assessments and yield and value reductions, with Pearson values ranging from 0.24 to 0.63. These data can provide guidance to growers and stakeholders and emphasize the need for diligent stewardship when using dicamba technology.
Currently, there are seven herbicides labeled for U.S. tobacco production; however, additional modes of action are greatly needed in order to reduce the risk of herbicide resistance. Field experiments were conducted at five locations during the 2017 and 2018 growing seasons to evaluate flue-cured tobacco tolerance to S-metolachlor applied pretransplanting incorporated (PTI) and pretransplanting (PRETR) at 1.07 (1×) and 2.14 (2×) kg ai ha−1. Severe injury was observed 6 wk after transplanting at the Whiteville environment in 2017 when S-metolachlor was applied PTI. End-of-season plant heights from PTI treatments at Whiteville were likewise reduced by 9% to 29% compared with nontreated controls, although cured leaf yield and value were reduced only when S-metolachlor was applied PTI at the 2× rate. Severe growth reduction was also observed at the Kinston location in 2018 where S-metolachlor was applied at the 2× rate. End-of-season plant heights were reduced 11% (PTI, 2×) and 20% (PRETR, 2×) compared with nontreated control plants. Cured leaf yield was reduced in Kinston when S-metolachlor was applied PRETR at the 2× rate; however, treatments did not impact cured leaf quality or value. Visual injury and reductions in stalk height, yield, quality, and value were not observed at the other three locations. Ultimately, it appears that injury potential from S-metolachlor is promoted by coarse soil texture and high early-season precipitation close to transplanting, both of which were documented at the Whiteville and Kinston locations. To reduce plant injury and the negative impacts to leaf yield and value, application rates lower than 1.07 kg ha−1 may be required in these scenarios.
Field studies were conducted to determine sweetpotato tolerance to and weed control from management systems that included linuron. Treatments included flumioxazin preplant (107 g ai ha−1) followed by (fb) S-metolachlor (800 g ai ha−1), oryzalin (840 g ai ha−1), or linuron (280, 420, 560, 700, and 840 g ai ha−1) alone or mixed with S-metolachlor or oryzalin applied 7 d after transplanting. Weeds did not emerge before the treatment applications. Two of the four field studies were maintained weed-free throughout the season to evaluate sweetpotato tolerance without weed interference. The herbicide program with the greatest sweetpotato yield was flumioxazin fb S-metolachlor. Mixing linuron with S-metolachlor did not improve Palmer amaranth management and decreased marketable yield by up to 28% compared with flumioxazin fb S-metolachlor. Thus, linuron should not be applied POST in sweetpotato if Palmer amaranth has not emerged at the time of application.
We present a calibration component for the Murchison Widefield Array All-Sky Virtual Observatory (MWA ASVO) utilising a newly developed PostgreSQL database of calibration solutions. Since its inauguration in 2013, the MWA has recorded over 34 petabytes of data archived at the Pawsey Supercomputing Centre. According to the MWA Data Access policy, data become publicly available 18 months after collection. Therefore, most of the archival data are now available to the public. Access to public data was provided in 2017 via the MWA ASVO interface, which allowed researchers worldwide to download MWA uncalibrated data in standard radio astronomy data formats (CASA measurement sets or UV FITS files). The addition of the MWA ASVO calibration feature opens a new, powerful avenue for researchers without a detailed knowledge of the MWA telescope and data processing to download calibrated visibility data and create images using standard radio astronomy software packages. In order to populate the database with calibration solutions from the last 6 yr, we developed fully automated pipelines. A near-real-time pipeline has been used to process new calibration observations as soon as they are collected and upload calibration solutions to the database, which enables monitoring of the interferometric performance of the telescope. Based on this database, we present an analysis of the stability of the MWA calibration solutions over long time intervals.
Glyphosate-resistant (GR) Palmer amaranth continues to be challenging to control across the U.S. cotton belt. Timely application of POST herbicides and herbicides applied at planting or during the season with residual activity are utilized routinely to control this weed. Although glyphosate controls large Palmer amaranth that is not GR, herbicides such as glufosinate used in resistance management programs for GR Palmer amaranth must be applied when weeds are small. Dicamba can complement both glyphosate and glufosinate in controlling GR and glyphosate-susceptible (GS) biotypes in resistant cultivars. Two studies were conducted to determine Palmer amaranth control, weed biomass, and cotton yield, as well as to estimate economic net return when herbicides were applied 2, 3, 4, and 5 wk after planting (WAP). In one experiment POST-only applications were made. In the second experiment PRE herbicides were included. In general, Palmer amaranth was controlled at least 98% by herbicides applied at least three times regardless of timing of application or herbicide sequence. Glyphosate plus dicamba applied at 4 and 5 WAP controlled Palmer amaranth similarly compared to three applications by 8 WAP; however, yield was reduced 23% because of early-season interference. The inclusion of PRE herbicides benefited treatments that did not include herbicides applied 2 or 3 WAP. Glyphosate plus dicamba applied as the only herbicides 5 WAP provided 69% control of Palmer amaranth. PRE herbicides increased control to 96% for this POST treatment. Economic returns were similar when three or more POST applications were applied, with or without PRE herbicides.
On September 1, 2019, Hurricane Dorian made landfall as a category 5 hurricane on Great Abaco Island, Bahamas. Hurricane Dorian matched the “Labor Day” hurricane of 1935 as the strongest recorded Atlantic hurricane to make landfall, with maximum sustained winds of 185 miles/h. At the request of the Government of the Bahamas, Team Rubicon activated a World Health Organization Type 1 Mobile Emergency Medical Team and responded to Great Abaco Island. The team provided medical care and reconnaissance of medical clinics on the island and surrounding cays…
Microcredit – joint-liability loans to the poorest of the poor – has been touted as a powerful approach for combatting global poverty, but sustainability varies dramatically across banks. Efforts to improve the sustainability of microcredit have assumed defaults are caused by free-riding. Here, we point out that the response of other group members to delinquent groupmates also plays an important role in defaults. Even in the absence of any free-rider problem, some people will be unable to make their payments due to bad luck. It is other group members’ unwillingness to pitch in extra – due to, among other things, not wanting to have less than other group members – that leads to default. To support this argument, we utilize the Ultimatum Game (UG), a standard paradigm from behavioral economics for measuring one's aversion to inequitable outcomes. First, we show that country-level variation in microloan default rates is strongly correlated (overall r = 0.81) with country-level UG rejection rates, but not free-riding measures. We then introduce a laboratory model ‘Microloan Game’ and present evidence that defaults arise from inequity-averse individuals refusing to make up the difference when others fail to pay their fair share. This perspective suggests a suite of new approaches for combatting defaults that leverage findings on reducing UG rejections.
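The headline country-level result is a simple Pearson correlation between UG rejection rates and microloan default rates. As a minimal sketch of that computation, using entirely hypothetical country-level values (the paper's data are not reproduced here):

```python
import numpy as np

# Hypothetical country-level rates (fractions), for illustration only
ug_rejection_rate = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40])
default_rate      = np.array([0.01, 0.02, 0.04, 0.05, 0.07, 0.08, 0.10, 0.12])

# Pearson correlation coefficient between the two country-level series
r = np.corrcoef(ug_rejection_rate, default_rate)[0, 1]
print(round(r, 2))
```

With real data one would also report a p-value (e.g., via `scipy.stats.pearsonr`) and check robustness to influential countries, since cross-country samples are small.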
Field studies were conducted in 2016 and 2017 at Clinton, NC, to quantify the effects of season-long interference of large crabgrass [Digitaria sanguinalis (L.) Scop.] and Palmer amaranth (Amaranthus palmeri S. Watson) on ‘AG6536’ soybean [Glycine max (L.) Merr.]. Weed density treatments consisted of 0, 1, 2, 4, and 8 plants m−2 for A. palmeri and 0, 1, 2, 4, and 16 plants m−2 for D. sanguinalis with (interspecific interference) and without (intraspecific interference) soybean to determine the impacts on weed biomass, soybean biomass, and seed yield. Biomass per square meter increased with increasing weed density for both weed species with and without soybean present. Biomass per square meter of D. sanguinalis was 617% and 37% greater when grown without soybean than with soybean for 1 and 16 plants m−2, respectively. Biomass per square meter of A. palmeri was 272% and 115% greater when grown without soybean than with soybean for 1 and 8 plants m−2, respectively. Biomass per plant for D. sanguinalis and A. palmeri grown without soybean was greatest at the 1 plant m−2 density. Biomass per plant of D. sanguinalis across measured densities was 33% to 83% greater when grown without soybean compared with biomass per plant when soybean was present, for 1 and 16 plants m−2, respectively. Similarly, biomass per plant for A. palmeri was 56% to 74% greater when grown without soybean for 1 and 8 plants m−2, respectively. Biomass per plant of either weed species was not affected by weed density when grown with soybean, owing to interspecific competition with soybean. Yield loss for soybean grown with A. palmeri ranged from 14% to 37% for densities of 1 to 8 plants m−2, respectively, with a maximum yield loss estimate of 49%. Similarly, predicted yield loss for soybean grown with D. sanguinalis was 0% to 37% for densities of 1 to 16 plants m−2, with a maximum yield loss estimate of 50%. Soybean biomass was not affected by weed species or density.
Results from these studies indicate that A. palmeri is more competitive than D. sanguinalis at lower densities, but that similar yield loss can occur when densities greater than 4 plants m−2 of either weed are present.
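Density-dependent yield loss of this kind, with a fitted "maximum yield loss estimate", is commonly summarized with the rectangular-hyperbola model of Cousens: YL = I·d / (1 + I·d/A), where d is weed density, I is the percent loss per plant at low density, and A is the asymptotic maximum loss. The abstract does not state which model was fit, so the sketch below, with hypothetical densities, losses, and parameter values, is illustrative only; it uses the linearization d/YL = 1/I + d/A:

```python
import numpy as np

def yield_loss(d, I, A):
    """Cousens rectangular hyperbola: percent yield loss at weed density d."""
    return I * d / (1.0 + I * d / A)

# Hypothetical data generated from the model itself (I = 15, A = 50)
density = np.array([1.0, 2.0, 4.0, 8.0, 16.0])  # weeds, plants m^-2
loss = yield_loss(density, 15.0, 50.0)          # percent yield loss

# Linearized fit:  d / YL = 1/I + d / A  -> slope = 1/A, intercept = 1/I
slope, intercept = np.polyfit(density, density / loss, 1)
A_hat = 1.0 / slope      # asymptotic maximum yield loss (%)
I_hat = 1.0 / intercept  # per-plant loss at low density (%)
print(round(I_hat, 1), round(A_hat, 1))  # recovers 15.0 and 50.0
```

In practice the model is usually fit by nonlinear least squares directly on (density, loss) pairs rather than the linearized form, which distorts the error structure.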
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
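For context, the direct estimator amounts to averaging the triple product of visibilities measured on a closed triangle of baselines (u1 + u2 + u3 = 0); for a point source at phase centre every visibility equals the flux density, so the expected bispectrum is real and equal to the flux cubed. A toy numpy sketch under those assumptions, with hypothetical flux and noise levels (not MWA data):

```python
import numpy as np

rng = np.random.default_rng(42)

def direct_bispectrum(v1, v2, v3):
    """Direct estimator: average of the visibility triple product over
    samples, assuming the three baselines close (u1 + u2 + u3 = 0)."""
    return np.mean(v1 * v2 * v3)

# Point source of flux S at phase centre: each visibility is S plus
# independent complex Gaussian noise (hypothetical values).
S, sigma, n = 2.0, 0.1, 5000

def noisy_vis():
    return S + rng.normal(scale=sigma, size=n) + 1j * rng.normal(scale=sigma, size=n)

B = direct_bispectrum(noisy_vis(), noisy_vis(), noisy_vis())
# Expectation is S**3 = 8 + 0j; the noise terms average toward zero.
```

Gridded estimation works analogously but averages products of uv-gridded visibilities, trading the redundancy requirement for gridding systematics.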
Field studies were conducted in 2016 and 2017 in Clinton, NC, to determine the interspecific and intraspecific interference of Palmer amaranth (Amaranthus palmeri S. Watson) or large crabgrass [Digitaria sanguinalis (L.) Scop.] in ‘Covington’ sweetpotato [Ipomoea batatas (L.) Lam.]. Amaranthus palmeri and D. sanguinalis were established 1 d after sweetpotato transplanting and maintained season-long at 0, 1, 2, 4, 8 and 0, 1, 2, 4, 16 plants m−1 of row in the presence and absence of sweetpotato, respectively. Predicted yield loss for sweetpotato was 35% to 76% for D. sanguinalis at 1 to 16 plants m−1 of row and 50% to 79% for A. palmeri at 1 to 8 plants m−1 of row. Weed dry biomass per meter of row increased linearly with increasing weed density. Individual dry biomass of A. palmeri and D. sanguinalis was not affected by weed density when grown in the presence of sweetpotato. When grown without sweetpotato, individual weed dry biomass decreased 71% and 62% from 1 to 4 plants m−1 row for A. palmeri and D. sanguinalis, respectively. Individual weed dry biomass was not affected above 4 plants m−1 row to the highest densities of 8 and 16 plants m−1 row for A. palmeri and D. sanguinalis, respectively.
The experiments reported in this research paper aimed to track the microbiological load of milk throughout a low-heat skim milk powder (SMP) manufacturing process, from farm bulk tanks to final powder, during mid- and late-lactation (spring and winter, respectively). In the milk powder processing plant studied, low-heat SMP was produced using only the milk supplied by the farms involved in this study. Samples of milk were collected from farm bulk tanks (mid-lactation: 67 farms; late-lactation: 150 farms), collection tankers (CTs), whole milk silo (WMS), skim milk silo (SMS), cream silo (CS) and final SMP. During mid-lactation, the raw milk produced on-farm and transported by the CTs had better microbiological quality than the late-lactation raw milk (e.g., total bacterial count (TBC): 3.60 ± 0.55 and 4.37 ± 0.62 log10 cfu/ml, respectively). After pasteurisation, reductions in TBC, psychrotrophic (PBC) and proteolytic (PROT) bacterial counts were of lower magnitude in late-lactation than in mid-lactation milk, while thermoduric (LPC—laboratory pasteurisation count) and thermophilic (THERM) bacterial counts were not reduced in either period. The microbiological quality of the SMP produced was better when using mid-lactation than late-lactation milk (e.g., TBC: 2.36 ± 0.09 and 3.55 ± 0.13 log10 cfu/g, respectively), as mid-lactation raw milk had better quality than late-lactation milk. The bacterial counts of some CTs and of the WMS samples were higher than the upper confidence limit predicted using the bacterial counts measured in the farm milk samples, indicating that the transport conditions or cleaning protocols could have influenced the microbiological load. Therefore, across production seasons, appropriate cow management and hygiene practices (on-farm and within the factory) are necessary to control the numbers of different bacterial groups in milk, as these can influence the effectiveness of thermal treatments and consequently affect final product quality.