Objective:
To understand barriers and facilitators to evidence-based prescribing of antibiotics in the outpatient dental setting.
Design:
Semistructured interviews.
Setting:
Outpatient dental setting.
Participants:
Dentists from 40 Veterans’ Health Administration (VA) facilities across the United States.
Methods:
Dentists were identified based on their prescribing patterns and were recruited to participate in a semistructured interview on perceptions toward prescribing. All interviews were recorded, transcribed, and double-coded for analysis, with high intercoder reliability. We identified general trends using the Theoretical Domains Framework and mapped overarching themes onto the Behavior Change Wheel to identify prospective interventions to improve evidence-based prescribing.
Results:
In total, 90 dentists participated in our study. The following barriers and facilitators to evidence-based prescribing emerged as factors influencing a dentist’s decision to prescribe an antibiotic: access to resources, social influence of peers and other care providers, clinical judgment, beliefs about consequences, local features of the clinic setting, and beliefs about capabilities.
Conclusions:
Findings from this work reveal the need to increase awareness of up-to-date antibiotic prescribing practices in dentistry and may inform antimicrobial stewardship interventions that best support dentists’ ongoing professional development and improve evidence-based prescribing.
Objective:
To evaluate opportunities for assessing penicillin allergies among patients presenting to dental clinics.
Design:
Retrospective cross-sectional study.
Setting:
VA dental clinics.
Patients:
Adult patients with a documented penicillin allergy who received an antibiotic from a dentist between January 1, 2015, and December 31, 2018, were included.
Methods:
Chart reviews were completed on random samples of 100 patients who received a noncephalosporin antibiotic and 200 patients who received a cephalosporin. Each allergy was categorized by severity. These categories were used to determine patient eligibility for 3 testing groups based on peer-reviewed algorithms: (1) no testing, (2) skin testing, and (3) oral test-dose challenge. Descriptive and bivariate statistics were used to compare facility and patient demographics, first among patients with a true penicillin allergy, a pseudo penicillin allergy, or missing allergy documentation, and then between those who did and did not receive a cephalosporin at the dental visit.
Results:
Overall, 19% of patients lacked documentation of the nature of the allergic reaction, 53% were eligible for skin testing, 27% were eligible for an oral test-dose challenge, and 1% had contraindications to testing. Male patients and African American patients were less likely to receive a cephalosporin.
Conclusions:
Most penicillin-allergic patients in the VA receiving an antibiotic from a dentist are eligible for penicillin skin testing or an oral penicillin challenge. Further research is needed to understand the role of dentists and dental clinics in assessing penicillin allergies.
Objective:
To determine the appropriateness of antibiotic prophylaxis prescribed by Veterans’ Affairs (VA) dentists.
Design:
A cross-sectional study of dental visits, 2015–2019.
Methods:
Antibiotics prescribed within 7 days before a visit, in the absence of an oral infection, were included. Appropriate antibiotic prophylaxis was defined as visits with gingival manipulation and was further delineated into narrow and broad definitions based on comorbidities. The primary analysis applied a narrow definition of appropriate prophylaxis: cardiac conditions at the highest risk of an adverse outcome from endocarditis. The secondary analysis applied a broader definition: a cardiac or immunocompromising condition, or tooth extractions and/or implants. Multivariable log-linear Poisson generalized estimating equation (GEE) regression was used to assess the association between covariates and unnecessary prophylaxis prescriptions.
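As a minimal sketch of how such a model might be specified (not the authors' code; Python with statsmodels, and all column names hypothetical), a log-linear Poisson GEE clusters visits within facilities and yields risk ratios when coefficients are exponentiated:

```python
# Hedged sketch: log-linear Poisson GEE for a binary outcome
# (unnecessary prophylaxis), with visits clustered within facilities.
# Column names and the simulated data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1_000
visits = pd.DataFrame({
    "unnecessary_ppx": rng.integers(0, 2, n),  # 1 = guideline-discordant prophylaxis
    "age": rng.integers(25, 90, n),
    "rural": rng.integers(0, 2, n),
    "facility_id": rng.integers(0, 40, n),     # clustering unit
})

model = smf.gee(
    "unnecessary_ppx ~ age + rural",
    groups="facility_id",                      # visits clustered within facilities
    data=visits,
    family=sm.families.Poisson(),              # log link: exp(coef) is a risk ratio
    cov_struct=sm.cov_struct.Exchangeable(),   # within-facility correlation
)
print(model.fit().summary())
```

A Poisson family with a log link is a common choice for estimating risk ratios for a binary outcome, and the GEE working-correlation structure accounts for the non-independence of visits at the same facility.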
Results:
In total, 358,078 visits were associated with 369,102 antibiotics. The median prescription duration was 7 days (IQR, 7–10); only 6.5% were prescribed for 1 day. With the narrow definition, 15% of prophylaxis prescriptions were appropriate, which increased to 72% with the broader definition. Prophylaxis inconsistent with guidelines increased over time. For the narrow definition, Black (vs White) race, Latine (vs non-Latine) ethnicity, and visits located in the West census region were associated with unnecessary prophylaxis. Variables associated with a lower risk were older age, prosthetic joints, immunocompromising condition, and rural location.
Conclusions:
Of every 6 antibiotic prophylaxis prescriptions, 5 were inconsistent with guidelines. Improving prophylaxis appropriateness and shortening duration may have substantial implications for stewardship. Guidelines should state whether antibiotic prophylaxis is indicated for extractions, implants, and immunocompromised patients.
Among 108 US veterans (0.05% of the cohort) with a Clostridioides difficile infection (CDI) within 30 days of a dental antibiotic prescription, 80% received guideline-discordant antibiotics. Half had a chronic gastrointestinal illness that potentially exacerbated their CDI risk. More efforts are needed to improve antibiotic stewardship.
United States dentists prescribe 10% of all outpatient antibiotics. Assessing appropriateness of antibiotic prescribing has been challenging due to a lack of guidelines for oral infections. In 2019, the American Dental Association (ADA) published clinical practice guidelines (CPG) on the management of acute oral infections. Our objective was to describe baseline national antibiotic prescribing for acute oral infections prior to the release of the ADA CPG and to identify patient-level variables associated with an antibiotic prescription.
Design:
Cross-sectional analysis.
Methods:
We performed an analysis of national VA data from January 1, 2017, to December 31, 2017. We identified cases of acute oral infections using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) codes. Antibiotics prescribed by a dentist within ±7 days of a visit were included. Multivariable logistic regression identified patient-level variables associated with an antibiotic prescription.
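To illustrate the cohort-construction step (not the authors' code; the code set below uses standard ICD-10-CM codes for these diagnoses, and the authors' exact code list and file names may differ), the case identification and ±7-day prescription linkage could be sketched as:

```python
# Hedged sketch: identify acute oral infection visits by ICD-10-CM
# code, then link antibiotics prescribed within +/-7 days.
# File names and columns are hypothetical.
import pandas as pd

ORAL_INFECTION_CODES = {
    "K04.02": "irreversible pulpitis",
    "K04.4": "acute apical periodontitis",
    "K04.6": "periapical abscess with sinus",
    "K04.7": "periapical abscess without sinus",
}

visits = pd.read_csv("dental_visits_2017.csv")          # hypothetical extract
cases = visits[visits["icd10cm"].isin(ORAL_INFECTION_CODES)].copy()
cases["diagnosis"] = cases["icd10cm"].map(ORAL_INFECTION_CODES)

rx = pd.read_csv("dental_antibiotic_rx_2017.csv")       # hypothetical extract
merged = cases.merge(rx, on="patient_id")
merged["days_apart"] = (
    pd.to_datetime(merged["rx_date"]) - pd.to_datetime(merged["visit_date"])
).dt.days
linked = merged[merged["days_apart"].abs() <= 7]        # the +/-7-day window
```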
Results:
Of the 470,039 VA dental visits with oral infections coded, 12% of patient visits with irreversible pulpitis, 17% with apical periodontitis, and 28% with acute apical abscess received antibiotics. Although the median days’ supply was 7, prolonged use of antibiotics was frequent (≥8 days, 42%–49%). Patients with high-risk cardiac conditions, prosthetic joints, and endodontic, implant, and oral and maxillofacial surgery dental procedures were more likely to receive antibiotics.
Conclusions:
Most treatments of irreversible pulpitis and apical periodontitis cases were concordant with new ADA guidelines. However, in cases where antibiotics were prescribed, prolonged antibiotic courses >7 days were frequent. These findings demonstrate opportunities for the new ADA guidelines to standardize and improve dental prescribing practices.
Objective:
To characterize postextraction antibiotic prescribing patterns, predictors of antibiotic prescribing, and the incidence of and risk factors for postextraction oral infection.
Design:
Retrospective analysis of a random sample of veterans who received tooth extractions from January 1, 2017, through December 31, 2017.
Setting:
VA dental clinics.
Patients:
Overall, 69,610 patients met inclusion criteria, of whom 404 were randomly selected for analysis. Adjunctive antibiotics were prescribed to 154 patients (38.1%).
Intervention:
Patients who received or did not receive an antibiotic were compared for the occurrence of postextraction infection as documented in the electronic health record. Multivariable logistic regression was performed to identify factors associated with antibiotic receipt.
Results:
There was no difference in the frequency of postextraction oral infection between patients who did and did not receive antibiotics (4.5% vs 3.2%; P = .59). Risk factors for postextraction infection could not be identified due to the low frequency of this outcome. Patients who received antibiotics were more likely to have a greater number of teeth extracted (aOR, 1.10; 95% CI, 1.03–1.18), documentation of acute infection at the time of extraction (aOR, 3.02; 95% CI, 1.57–5.82), molar extraction (aOR, 1.78; 95% CI, 1.10–2.86), and extraction performed by an oral and maxillofacial surgeon (aOR, 2.29; 95% CI, 1.44–3.58) or specialty dentist (aOR, 5.77; 95% CI, 2.05–16.19).
Conclusion:
Infectious complications occurred at a low incidence among veterans undergoing tooth extraction who did and did not receive postextraction antibiotics. These results suggest that antibiotics have a limited role in preventing postprocedural infection; however, future studies are necessary to more clearly define the role of antibiotics for this indication.
Objective:
To determine the usefulness of adjusting antibiotic use (AU) by prevalence of bacterial isolates as an alternative method for risk adjustment beyond hospital characteristics.
Methods:
AU in days of therapy per 1,000 patient-days and microbiologic data from 2015 and 2016 were collected from 26 hospitals. The prevalences of Pseudomonas aeruginosa, extended-spectrum β-lactamase (ESBL)–producing bacteria, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE) were calculated and compared to the average prevalence across all hospitals in the network. This proportion was used to calculate the adjusted AU (a-AU) for various categories of antimicrobials. For example, the a-AU of antipseudomonal β-lactams (APBL) at a hospital was its APBL AU divided by the ratio of that hospital’s P. aeruginosa prevalence to the network-average prevalence. Hospitals were categorized by bed size and ranked by AU and a-AU, and the rankings were compared.
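Written out, the adjustment described above is a single ratio. With notation introduced here for clarity (not in the original), let $AU_h$ be a hospital's use of a drug category, $p_h$ the prevalence of the corresponding organism at that hospital, and $\bar{p}$ the network-average prevalence:

\[
a\text{-}AU_h \;=\; \frac{AU_h}{p_h/\bar{p}} \;=\; AU_h \cdot \frac{\bar{p}}{p_h}.
\]

For example, a hospital using APBL at 300 days of therapy per 1,000 patient-days, with a P. aeruginosa prevalence 1.5 times the network average, would have an a-AU of 300/1.5 = 200; a hospital with below-average prevalence would see its AU adjusted upward.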
Results:
Most hospitals in 2015 and 2016, respectively, moved ≥2 positions in the ranking using a-AU of APBL (15 of 24, 63%; 22 of 26, 85%), carbapenems (14 of 23, 61%; 22 of 25, 88%), anti-MRSA agents (13 of 23, 57%; 18 of 26, 69%), and anti-VRE agents (18 of 24, 75%; 15 of 26, 58%). Use of a-AU resulted in a shift in quartile of hospital ranking for 50% of APBL agents, 57% of carbapenems, 35% of anti-MRSA agents, and 75% of anti-VRE agents in 2015 and 50% of APBL agents, 28% of carbapenems, 50% of anti-MRSA agents, and 58% of anti-VRE agents in 2016.
Conclusions:
The a-AU considerably changes how hospitals compare with one another within a network. Adjusting AU by microbiological burden allows for a more balanced comparison among hospitals with variable baseline rates of resistant bacteria.
A small 33 ± 0.8 Ma lamproite pluton is exposed in the midst of a 23–26 Ma basalt-rhyolite province in Middle Park, NW Colorado. It contains abundant phlogopite phenocrysts in a fine-grained groundmass of analcime pseudomorphs after leucite, biotite, potassic richterite, apatite, ilmenite and accessory diopside. The phlogopite phenocryst cores contain ∼4 wt.% TiO2, 1% Cr2O3 and 0.2% BaO. The smallest groundmass biotites have normal pleochroism but compositions unlike any previously reported, with ∼2% Al2O3, ∼8% TiO2 and F <1.5%. Apart from those elements affected by leucite alteration, both the elemental and isotopic composition of this lamproite are close to those of the Leucite Hills, Wyoming. Its Nd-isotopic model age (TDM = 1.6 Ga) is outside the Leucite Hills range but within that of other Tertiary strongly potassic magmatism in the region underlain by the Wyoming craton. Evidence from both teleseismic tomography and the mantle xenoliths within other western USA mafic ultrapotassic igneous suites shows that the total lithospheric thickness beneath NW Colorado was probably ∼150–200 km at 33 Ma, when the Middle Park lamproite was emplaced. This is an important constraint on tectonomagmatic models for the Cenozoic evolution of this northernmost part of the Rio Grande rift system.
Soyabean (Glycine max) is a relatively new crop for small-scale farmers in Zambia; it has been adopted following the introduction of new cultivars, greater opportunity to obtain credit, easier marketing, and an attractive guaranteed price. However, low yields limit production, partly due to the lack of a planting method that establishes optimal plant populations. The present method is to plough and plant in the same operation, dribbling the seed behind the ox-plough. This often leads to uneven planting depth, and hence to poor seedling emergence and erratic stands. Alternative planting techniques evaluated on farmers' fields for three seasons (1985/86–1987/88) suggest that farmers should replace planting behind the plough with either hand seeding following a plough–harrow operation or the use of a modified ox-drawn planter (Taparia).
We present an in-depth study of metal-poor stars, based on high-resolution spectra combined with newly released astrometric data from Gaia, with special attention to observational uncertainties. The results are compared to those of other studies, including the Gaia benchmark stars. Chemical evolution models are discussed, highlighting a few puzzles that still affect our understanding of stellar nucleosynthesis and of the evolution of our Galaxy.
Horseweed is an increasingly problematic weed in soybean because of the frequent occurrence of glyphosate-resistant (GR) biotypes. The objective of this study was to determine the influence of crop rotation, winter wheat cover crops (WWCC), residual nonglyphosate herbicides, and preplant herbicide application timing on the population dynamics of GR horseweed and crop yield. A field study was conducted at a site with a moderate infestation of GR horseweed (approximately 1 plant m−2), with crop rotation (soybean–corn or soybean–soybean) as main plots and management systems as subplots. Management systems were evaluated by quantifying horseweed plant density, seedbank density, and crop yield. Crop rotation did not influence in-field horseweed or seedbank densities at any census timing. Preplant herbicides applied in the spring were more effective at reducing horseweed plant densities than those applied the previous fall. Spring-applied, residual herbicide systems were the most effective at reducing season-long horseweed densities and protecting crop yield because horseweed in this region behaves primarily as a summer annual weed. Horseweed seedbank densities declined rapidly in the soil, by an average of 76% across all systems over the first 10 mo before new seed rain. Despite this rapid decline in total seedbank density, seed of GR biotypes remained in the seedbank for at least 2 yr. Therefore, to reduce the presence of GR horseweed biotypes in a local no-till weed flora, integrated weed management (IWM) systems should be developed to reduce total horseweed populations, based on the knowledge that seeds of GR biotypes are as persistent in the seedbank as those of glyphosate-sensitive (GS) biotypes.
Horseweed is an increasingly common and problematic weed in no-till soybean production in the eastern cornbelt due to the frequent occurrence of biotypes resistant to glyphosate. The objective of this study was to determine the influence of crop rotation, winter wheat cover crops (WWCC), residual non-glyphosate herbicides, and preplant application timing on the population dynamics of glyphosate-resistant (GR) horseweed and crop yield. A field study was conducted from 2003 to 2007 in a no-till field at a site that contained a moderate infestation of GR horseweed (approximately 1 plant m−2). The experiment was a split-plot design with crop rotation (soybean–corn or soybean–soybean) as main plots and management systems as subplots. Management systems were evaluated by quantifying in-field horseweed plant density, seedbank density, and crop yield. Horseweed densities were collected at the time of postemergence applications, 1 mo after postemergence (MAP) applications, and at the time of crop harvest or 4 MAP. Viable seedbank densities were also evaluated from soil samples collected in the fall following seed rain. Soybean–corn crop rotation reduced in-field and seedbank horseweed densities vs. continuous soybean in the third and fourth yr of this experiment. Preplant herbicides applied in the spring were more effective at reducing horseweed plant densities than those applied the previous fall. Spring-applied, residual herbicide systems were the most effective at reducing season-long in-field horseweed densities and protecting crop yields since horseweed in this region behaves primarily as a summer annual. Management systems also influenced the GR and glyphosate-susceptible (GS) biotype population structure after 4 yr of management. The most dramatic shift was from the initial GR:GS ratio of 3:1 to a ratio of 1:6 after 4 yr of residual preplant herbicide use followed by non-glyphosate postemergence herbicides.
Investigations by one of the authors in connection with the design of a fan for a blower-type wind tunnel showed that regular and repeatable dust patterns occurred on the blades of a one-quarter-scale model fan of 18 inches diameter. Dust was deposited on the fan blades along the leading edge and on the suction surface over an area thought to be the turbulent region of the boundary layer. The introduction of isolated protuberances on the dust-free area of a blade gave rise to turbulence wedges in which dust was also deposited; this was interpreted as confirmation that the dust deposits coincided with regions of turbulent boundary-layer flow. These deposits showed the existence of a considerable extent of laminar flow on the suction surface of each blade close to the root, a region where high lift coefficients, with associated adverse pressure gradients, would be expected. Two-dimensional wind tunnel experiments were made to confirm the interpretation of the observed dust patterns by comparison with the smoke-filament and volatile-liquid methods of flow visualisation; these are reported in Reference 2.
The passive control of a shock wave-boundary-layer interaction involves placing a porous surface beneath the interaction, allowing high pressure air from the flow downstream of the shock wave to recirculate through a plenum chamber into the low pressure flow upstream of the wave.
The simple case of a normal shock wave at a Mach number of 1.4 interacting with the turbulent boundary layer on a flat wall is investigated both experimentally and numerically. The experimental investigation used holographic interferometry, while the computational part of the investigation used a Navier-Stokes code to derive pressure gradients, boundary-layer properties, and total pressure losses in the interaction region. It is found that the structure of shock wave-boundary-layer interactions with passive control consists of a leading, oblique shock wave followed by a lambda foot. The oblique wave originates from the upstream end of the porous region, and its strength is determined by the magnitude of the local blowing velocities. The shape of the lambda foot depends on the position of the main shock relative to the control region: it resembles an uncontrolled foot when the main shock wave is towards the downstream end of the porosity, but becomes increasingly large as the shock moves upstream, eventually merging with the leading, oblique shock to form a single, large lambda structure.
Improved forms of passive control are suggested based on the findings of this investigation, including the use of passive control systems which incorporate streamwise variations in the level of porosity.
Invariant solutions of shear flows have recently been extended from spatially periodic solutions in minimal flow units to spatially localized solutions on extended domains. One set of spanwise-localized solutions of plane Couette flow exhibits homoclinic snaking, a process by which steady-state solutions grow additional structure smoothly at their fronts when continued parametrically. Homoclinic snaking is well understood mathematically in the context of the one-dimensional Swift–Hohenberg equation. Consequently, the snaking solutions of plane Couette flow form a promising connection between the largely phenomenological study of laminar–turbulent patterns in viscous shear flows and the mathematically well-developed field of pattern-formation theory. In this paper we present a numerical study of the snaking solutions of plane Couette flow, generalizing beyond the fixed streamwise wavelength of previous studies. We find a number of new solution features, including bending, skewing and finite-size effects. We establish the parameter regions over which snaking occurs and show that the finite-size effects of the travelling wave solution are due to a coupling between its fronts and interior that results from its shift-reflect symmetry. A new winding solution of plane Couette flow is derived from a strongly skewed localized equilibrium.
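For reference, homoclinic snaking is most often analysed in the one-dimensional Swift–Hohenberg equation, commonly written in its quadratic–cubic form (a standard form from the pattern-formation literature, not taken from this paper):

\[
\partial_t u = r\,u - \left(1 + \partial_x^2\right)^2 u + b\,u^2 - u^3,
\]

where $r$ is the bifurcation parameter and $b$ sets the strength of the quadratic nonlinearity. In the bistable regime, branches of spatially localized steady states "snake" back and forth in $r$, adding structure at their fronts on each turn, which is the behaviour mirrored by the localized plane Couette flow solutions described above.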
Sixteen lambs were divided into two groups and fed two different diets. Eight lambs were stall-fed a concentrate-based diet (C), and the remaining eight were allowed to graze on Lolium perenne (G). Antioxidant status was measured in liver and plasma samples before and after solid-phase extraction (SPE) to probe the antioxidant effects that grass phenolic compounds may have conferred on the animal tissues. The liver and plasma samples from grass-fed lambs displayed a greater antioxidant capacity than the tissues from the C group, but only if the samples had not been passed through SPE cartridges. Finally, the feed and animal tissues that had been purified by SPE were analysed by liquid chromatography combined with mass spectrometry (LC–MS) to identify the phenolic compounds present in L. perenne and to evaluate the results of the antioxidant assays. It would appear that the improved antioxidant capacity of liver and plasma from lambs fed ryegrass was not related to the direct transfer of phenolic compounds from grass to the animal tissues.
Foods and dietary patterns that enhance satiety may provide benefit to consumers. The aim of the present review was to describe, consider, and evaluate research on the potential benefits of enhanced satiety. The proposal that enhanced satiety could only benefit consumers through a direct effect on food intake should be rejected. Instead, it is proposed that there is a variety of routes through which enhanced satiety could indirectly benefit dietary control or weight-management goals. The review highlights specific potential benefits of satiety, including: providing appetite-control strategies for consumers generally and for those who are highly responsive to food cues; offering the pleasure and satisfaction associated with low-energy/healthier versions of foods without feeling ‘deprived’; reducing the dysphoric mood associated with hunger, especially during energy restriction; and improving compliance with healthy eating or weight-management efforts. There is convincing evidence of short-term satiety benefits, but only probable evidence for longer-term benefits to hunger management, possible evidence of benefits to mood and cognition, inadequate evidence that satiety enhancement can promote weight loss, and no evidence on which consumers would benefit most from satiety enhancement. The appetite-reducing effects of specific foods or diets will be much more subtle than those of pharmaceutical compounds in managing hunger; nevertheless, the experience of pharmacology in producing weight loss via effects on appetite suggests that satiety enhancement from foods incorporated into the diet offers potential benefit to the consumer.