Algorithmic graph theory has been expanding at an extremely rapid rate since the middle of the twentieth century, in parallel with the growth of computer science and the accompanying utilization of computers, where efficient algorithms have been a prime goal. This book presents material on developments in graph algorithms and related concepts that will be of value to both mathematicians and computer scientists, at a level suitable for graduate students, researchers and instructors. The fifteen expository chapters, written by acknowledged international experts on their subjects, focus on the application of algorithms to solve particular problems. All chapters were carefully edited to enhance readability and to standardize the chapter structure as well as the terminology and notation. The editors provide basic background material in graph theory, and a chapter written by the book's Academic Consultant, Martin Charles Golumbic (University of Haifa, Israel), provides background material on algorithms as connected with graph theory.
There is controversy regarding whether the addition of cover gowns offers a substantial benefit over gloves alone in reducing personnel contamination and preventing pathogen transmission.
Simulated patient care interactions.
To evaluate the efficacy of different types of barrier precautions and to identify routes of transmission.
In randomly ordered sequence, 30 personnel each performed 3 standardized examinations of mannequins contaminated with pathogen surrogate markers (cauliflower mosaic virus DNA, bacteriophage MS2, nontoxigenic Clostridioides difficile spores, and fluorescent tracer) while wearing no barriers, gloves, or gloves plus gowns, followed by examination of a noncontaminated mannequin. We compared the frequency and routes of transfer of the surrogate markers to the second mannequin or the environment.
For a composite of all surrogate markers, transfer by hands occurred at significantly lower rates in the gloves-alone group (OR, 0.02; P < .001) and the gloves-plus-gown group (OR, 0.06; P = .002). Transfer by stethoscope diaphragms was common in all groups and was reduced by wiping the stethoscope between simulations (OR, 0.06; P < .001). Compared to the no-barriers group, wearing a cover gown and gloves resulted in reduced contamination of clothing (OR, 0.15; P < .001), but wearing gloves alone did not.
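As a rough illustration of how odds ratios of this kind can be derived from contingency counts of transfer versus no transfer (the study's actual analysis accounted for repeated examinations per participant, so this is only a simplified sketch with hypothetical counts):

```python
# Simplified sketch: odds ratio and p value for marker transfer in the
# gloves-alone group versus the no-barriers group, from a 2x2 table.
# The counts below are hypothetical, not the study data.
from scipy.stats import fisher_exact

#            transfer   no transfer
table = [[ 3, 27],   # gloves alone (hypothetical counts)
         [24,  6]]   # no barriers  (hypothetical counts)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.3f}, p = {p_value:.4f}")
```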
Wearing gloves alone or gloves plus gowns reduces hand transfer of pathogens but may not address transfer by devices such as stethoscopes. Cover gowns reduce the risk of contaminating the clothing of personnel.
The hands of healthcare personnel are the most important source for transmission of healthcare-associated pathogens. The role of contaminated fomites such as portable equipment, stethoscopes, and clothing of personnel in pathogen transmission is unclear.
To study routes of transmission of cauliflower mosaic virus DNA markers from 31 source patients and from environmental surfaces in their rooms.
A 3-month observational cohort study.
A Veterans’ Affairs hospital.
After providing care for source patients, healthcare personnel were observed during interactions with subsequent patients. Putative routes of transmission were identified based on recovery of DNA markers from sites of contact with the patient or environment. To assess the plausibility of fomite-mediated transmission, we compared the frequency of transfer of methicillin-resistant Staphylococcus aureus (MRSA) from the skin of 25 colonized patients via gloved hands versus fomites.
Of 145 interactions involving contact with patients and/or the environment, 41 (28.3%) resulted in transfer of 1 or both DNA markers to the patient and/or the environment. The DNA marker applied to patients’ skin and clothing was transferred most frequently by stethoscopes, hands, and portable equipment, whereas the marker applied to environmental surfaces was transferred only by hands and clothing. The percentages of MRSA transfer from the skin of colonized patients via gloved hands, stethoscope diaphragms, and clothing were 52%, 40%, and 48%, respectively.
Fomites such as stethoscopes, clothing, and portable equipment may be underappreciated sources of pathogen transmission. Simple interventions such as decontamination of fomites between patients could reduce the risk for transmission.
Although infections caused by Acinetobacter baumannii are often healthcare-acquired, difficult to treat, and associated with high mortality, epidemiologic data for this organism are limited. We describe the epidemiology, clinical characteristics, and outcomes for patients with extensively drug-resistant Acinetobacter baumannii (XDRAB).
Retrospective cohort study.
Department of Veterans’ Affairs Medical Centers (VAMCs).
Patients with XDRAB cultures (defined as nonsusceptible to at least 1 agent in all but 2 or fewer classes) at VAMCs between 2012 and 2018.
Microbiology and clinical data were extracted from national VA datasets. We used descriptive statistics to summarize patient characteristics and outcomes, and bivariate analyses to compare outcomes by culture source.
Among 11,546 patients with 15,364 A. baumannii cultures, 408 patients (3.5%) had 667 XDRAB cultures (4.3%). Patients with XDRAB were older (mean age, 68 years; SD, 12.2), with a median Charlson index of 3 (interquartile range, 1–5). Respiratory specimens (n = 244, 36.6%) and urine samples (n = 187, 28%) were the most frequent sources, and the greatest proportion of patients were from the South (n = 162, 39.7%). Most patients had had antibiotic exposures (n = 362, 88.7%) and hospital or long-term care admissions (n = 331, 81%) in the prior 90 days. Polymyxins, tigecycline, and minocycline demonstrated the highest susceptibility. Both 30-day mortality (n = 96, 23.5%) and 1-year mortality (n = 199, 48.8%) were high, with significantly higher mortality in patients with blood cultures.
The proportion of Acinetobacter baumannii in the VA that was XDR was low, but treatment options are extremely limited and clinical outcomes were poor. Prevention of healthcare-associated XDRAB infection should remain a priority, and novel antibiotics for XDRAB treatment are urgently needed.
Reduction in the use of fluoroquinolone antibiotics has been associated with reductions in Clostridioides difficile infections (CDIs) due to fluoroquinolone-resistant strains.
To determine whether facility-level fluoroquinolone use predicts healthcare facility-associated (HCFA) CDI due to fluoroquinolone-resistant 027 strains.
Using a nationwide cohort of hospitalized patients in the Veterans’ Affairs Healthcare System, we identified hospitals that categorized >80% of CDI cases as positive or negative for the 027 strain for at least one-quarter of fiscal years 2011–2018. Within these facilities, we used visual summaries and multilevel logistic regression models to assess the association between facility-level fluoroquinolone use and rates of HCFA-CDI due to 027 strains, controlling for time and facility complexity level, and adjusting for correlated outcomes within facilities.
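For readers unfamiliar with this type of model, the sketch below shows one way a multilevel (random-intercept) logistic regression of 027 status on facility-level fluoroquinolone use could be set up. It is illustrative only, not the authors' code, and the dataset and column names (is_027, fq_use, year, complexity, facility) are assumptions.

```python
# Illustrative sketch: multilevel logistic regression of whether an HCFA-CDI
# case is due to the 027 strain, with fixed effects for facility-level
# fluoroquinolone use, time, and facility complexity, and a random intercept
# per facility to account for correlated outcomes within facilities.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

df = pd.read_csv("hcfa_cdi_cases.csv")  # hypothetical case-level dataset

model = BinomialBayesMixedGLM.from_formula(
    "is_027 ~ fq_use + year + C(complexity)",  # fixed effects
    {"facility": "0 + C(facility)"},           # facility random intercept
    df,
)
result = model.fit_vb()  # variational Bayes estimation
print(result.summary())
```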
Between 2011 and 2018, 55 hospitals met criteria for reporting 027 results, including a total of 5,091 HCFA-CDI cases, with 1,017 infections (20.0%) due to 027 strains. Across these facilities, the use of fluoroquinolones decreased by 52% from 2011 to 2018, with concurrent reductions of 13% in the overall HCFA-CDI rate and of 55% in the proportion of HCFA-CDI cases due to the 027 strain. A multilevel logistic model demonstrated a significant effect of facility-level fluoroquinolone use on the proportion of infections in the facility due to the 027 strain, most noticeably in low-complexity facilities.
Our findings provide support for interventions to reduce the use of fluoroquinolones as a control measure for CDI, particularly in settings where fluoroquinolone use is high and fluoroquinolone-resistant strains are common causes of infection.
Vanadium dioxide (VO2) has been widely studied due to its metal-insulator phase transition at 68 °C, below which it adopts a semiconducting monoclinic phase (P21/c) and above which it adopts a metallic tetragonal phase (P42/mnm). Substituting vanadium with transition metals allows the transition temperature to be tuned. An accelerated microwave-assisted synthesis for VO2 and 5d tungsten-substituted VO2, presented herein, decreased synthesis time by three orders of magnitude while maintaining phase purity, particle size, and transition character. The amount of tungsten substitution was determined using inductively coupled plasma-optical emission spectroscopy. Differential scanning calorimetry, superconducting quantum interference device measurements, and in situ heating and cooling experiments monitored through synchrotron X-ray diffraction (XRD) confirmed that the transition temperature decreased with increased tungsten substitution. Scanning electron microscopy images, analyzed using the line-intercept method, gave an average particle size of 3–5 μm. Phase purity of the average and local structures was determined through Rietveld analysis of synchrotron XRD data and least-squares refinement of pair distribution function data.
This chapter reviews the spread of irrigation technology across the Sahara in antiquity, and its effects on settlement, agriculture and the movement of people. Recent work has stressed the close connections between the introduction of foggara technology and the rise of Garamantian civilisation, which featured intensive agriculture and incipient urbanism. However, many oases achieved substantial size through the use of well technologies, artesian springs or a combination of technologies. Another key question relates to the effects of the eventual decline and failure of these irrigation systems in terms of population movement and the fragmentation of states such as the Garamantes. After presenting new AMS dating evidence for Garamantian foggaras, the chapter advances the discussion by examining the wider picture of foggara distribution within a survey of the evidence for irrigation technologies across the Sahara, asking whether and to what extent the distribution of foggaras beyond the core Garamantian heartlands might be seen as an indication of Garamantian control or influence. It explores what foggaras, wells and new crop introductions might suggest about agricultural intensification and organisation in the ancient Sahara. Finally, it considers the causes and possible effects of irrigation failure and, in some cases, collapse.
CHDs can be complicated by renal injury, which worsens morbidity and mortality. Urinary neutrophil gelatinase-associated lipocalin, a sensitive and specific biomarker of renal tubular injury, has not been studied in children with uncorrected CHDs. This study evaluated renal injury in children with uncorrected CHDs using this biomarker.
The patients were children with uncorrected CHDs and significant shunts confirmed on echocardiography, with normal renal ultrasound scans, attending the paediatric cardiology clinic of a tertiary hospital. The controls were age-matched healthy children recruited from general practice clinics. Information on bio-data and socio-demographics was collected, and urine was obtained for measurement of urinary neutrophil gelatinase-associated lipocalin levels.
A total of 65 children with uncorrected CHDs aged 2 to 204 months were recruited. Thirty-one (47.7%) were male, while 36 (55.4%) had acyanotic CHDs. The median urinary neutrophil gelatinase-associated lipocalin level of patients (26.10 ng/ml) was significantly higher than that of controls (16.90 ng/ml) (U = 1624.50, p = 0.023). The median urinary neutrophil gelatinase-associated lipocalin levels of patients with cyanotic and acyanotic CHDs were 30.2 ng/ml and 22.60 ng/ml, respectively (Mann–Whitney U = 368.50, p = 0.116). The prevalence of renal injury, using the 95th percentile cut-off value of urinary neutrophil gelatinase-associated lipocalin, was 16.9%. The median age of patients with renal injury was 16 (4–44) months.
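As a purely illustrative sketch of the Mann–Whitney U comparison reported above (the values below are made up, not the study data):

```python
# Compare urinary NGAL between two independent groups with a Mann-Whitney U
# test; the values are hypothetical examples in ng/ml, not the study data.
from scipy.stats import mannwhitneyu

ngal_patients = [26.1, 31.4, 18.7, 40.2, 22.5, 35.0, 28.9, 24.3]
ngal_controls = [16.9, 14.2, 20.1, 12.8, 18.3, 15.5, 17.0, 13.6]

u_stat, p_value = mannwhitneyu(ngal_patients, ngal_controls, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```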
Children with uncorrected CHDs have renal injury detectable as early as infancy. The use of urinary neutrophil gelatinase-associated lipocalin for early detection of renal injury in these children may enable early intervention, with consequent prevention of morbidity and reduction in mortality.
Obtaining objective, dietary exposure information from individuals is challenging because of the complexity of food consumption patterns and the limitations of self-reporting tools (e.g., FFQ and diet diaries). This hinders research efforts to associate intakes of specific foods or eating patterns with population health outcomes.
Dietary exposure can be assessed by the measurement of food-derived chemicals in urine samples. We aimed to develop methodologies for urine collection that minimised impact on the day-to-day activities of participants but also yielded samples that were data-rich in terms of targeted biomarker measurements.
Urine collection methodologies were developed within home settings.
Different cohorts of free-living volunteers.
Home collection of urine samples using vacuum transfer technology was deemed highly acceptable by volunteers. Statistical analysis of both metabolome and selected dietary exposure biomarkers in spot urine collected and stored using this method showed that they were compositionally similar to urine collected using a standard method with immediate sample freezing. Even without chemical preservatives, samples can be stored under different temperature regimes without any significant impact on the overall urine composition or concentration of forty-six exemplar dietary exposure biomarkers. Importantly, the samples could be posted directly to analytical facilities, without the need for refrigerated transport and involvement of clinical professionals.
This urine sampling methodology appears to be suitable for routine use and may provide a scalable, cost-effective means to collect urine samples and to assess diet in epidemiological studies.
Dialysis patients may not have access to conventional renal replacement therapy (RRT) following disasters. We hypothesized that improvised renal replacement therapy (ImpRRT) would be comparable to continuous renal replacement therapy (CRRT) in a porcine acute kidney injury model.
Following bilateral nephrectomies and 2 hours of caudal aortic occlusion, 12 pigs were randomized to 4 hours of ImpRRT or CRRT. In the ImpRRT group, blood was circulated through a dialysis filter using a rapid infuser to collect the ultrafiltrate. Improvised replacement fluid, made with stock solutions, was infused pre-pump. In the CRRT group, commercial replacement fluid was used. During RRT, animals received isotonic crystalloids and norepinephrine.
There were no differences in serum creatinine, calcium, magnesium, or phosphorus concentrations. While there was a difference between groups in serum potassium concentration over time (P < 0.001), significance was lost in pairwise comparisons at specific time points. Replacement fluid and ultrafiltrate flows did not differ between groups. There were no differences in lactate concentration, isotonic crystalloid requirement, or norepinephrine doses. No difference was found in electrolyte concentrations between the commercial and improvised replacement solutions.
The ImpRRT system achieved similar performance to CRRT and may represent a potential option for temporary RRT following disasters.
For patients with methicillin-resistant Staphylococcus aureus (MRSA) colonization, a traditional fist-bump greeting did not significantly reduce MRSA transfer in comparison to a handshake. However, transfer was reduced with a modified fist bump that minimized the surface area of contact and when hand hygiene was performed before the handshake.
Marine-terminating glaciers, such as those along the coastline of Greenland, often release meltwater into the ocean in the form of subglacial discharge plumes. Though these plumes can dramatically alter the mass loss along the front of a glacier, the conditions surrounding their genesis remain poorly constrained. In particular, little is known about the geometry of subglacial outlets and the extent to which seawater may intrude into them. Here, the latter is addressed by exploring the dynamics of an arrested salt wedge – a steady-state, two-layer flow system where salty water partially intrudes a channel carrying fresh water. Building on existing theory, we formulate a model that predicts the length of a non-entraining salt wedge as a function of the Froude number, the slope of the channel and coefficients for interfacial and wall drag. In conjunction, a series of laboratory experiments were conducted to observe a salt wedge within a rectangular channel. For experiments conducted with laminar flow (low Reynolds number), good agreement with theoretical predictions is obtained when the drag coefficients are modelled as being inversely proportional to the Reynolds number. However, for fully turbulent flows on geophysical scales, these drag coefficients are expected to asymptote toward finite values. Adopting reasonable drag coefficient estimates for this flow regime, our theoretical model suggests that typical subglacial channels may permit seawater intrusions of the order of several kilometres. While crude, these results indicate that the ocean has a strong tendency to penetrate subglacial channels and potentially undercut the face of marine-terminating glaciers.
This study aimed to evaluate risk factors associated with shedding of pathogenic Leptospira species in urine at animal and herd levels. In total, 200 dairy farms were randomly selected from the DairyNZ database. Urine samples were taken from 20 lactating, clinically normal cows in each herd between January and April 2016 and tested by real-time polymerase chain reaction (PCR) using gyrB as the target gene. Overall, 26.5% of 200 farms had at least one PCR-positive cow and 2.4% of 4000 cows were shedding Leptospira in the urine. Using a questionnaire, information about risk factors at cow and farm level was collected via face-to-face interviews with farm owners and managers. Animals on all but one farm had been vaccinated against Hardjo and Pomona, and cows on 54 of 200 (27%) farms had also been vaccinated against Copenhageni in at least one age group (calves, heifers and cows). Associations found to be statistically significant in univariate analysis (at P < 0.2) were assessed by multivariable logistic regression. Factors associated with shedding included cattle age (odds ratio (OR) 0.82, 95% confidence interval (CI) 0.71–0.95), keeping sheep (OR 5.57, 95% CI 1.46–21.25) or dogs (OR 1.45, 95% CI 1.07–1.97) and managing milking cows in a single as opposed to multiple groups (OR 0.45, 95% CI 0.20–0.99). We conclude that younger cattle were more likely to be shedding Leptospira than older cattle and that the presence of sheep and dogs was associated with an increased risk of shedding in cows. Larger herds were at higher risk of having Leptospira shedders. However, none of the environmental risk factors assessed (e.g. access to standing water, drinking-water source), wildlife abundance on-farm, or pasture was associated with shedding, possibly due to low statistical power, given the low overall shedding rate.
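To make the reported effect sizes concrete, the sketch below shows how odds ratios and 95% confidence intervals of this kind are typically obtained from a multivariable logistic regression. It is not the study's code: the dataset and variable names (shedding, age, sheep, dogs, single_group, herd_size) are assumptions, and for simplicity the fit ignores clustering of cows within herds.

```python
# Illustrative sketch: multivariable logistic regression for Leptospira urinary
# shedding, reporting odds ratios and 95% CIs for each predictor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("leptospira_cows.csv")  # hypothetical cow-level dataset

fit = smf.logit("shedding ~ age + sheep + dogs + single_group + herd_size",
                data=df).fit()

odds_ratios = np.exp(fit.params)  # odds ratio per unit change in each predictor
conf_int = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```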
There is a requirement in some beef markets to slaughter bulls at under 16 months of age. This requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences between longissimus thoracis (LT) muscle from conventionally reared 16-month-old bulls and from 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to extend this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum with slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC), to examine the effect of age per se, and the cheaper alternative for 19-month bulls described above (19-GC). The results indicate that muscles from 19-CC were more red, had more intramuscular fat and higher cook loss than those from 16-C. No differences in muscle objective texture or sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of the effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction, based on meat acceptability, in commercial suckler bull production.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
It is now well established that CBT for chronic insomnia is as efficacious as hypnotic medication and is also likely to be better at maintaining improved sleep. Most studies have looked at the use of individual CBT; there have been only a few studies looking at CBT for insomnia given in a group format.
For nearly ten years the Bristol Insomnia Group has offered cognitive behavioural management and support for people with chronic insomnia.
The seven group sessions are led by up to three members of a team consisting of a doctor (sleep specialist), an occupational therapist and a research sleep scientist. Components of the group intervention include education about sleep science, information on insomnia medication, sleep hygiene, relaxation, and cognitive therapy. To assess efficacy, participants complete sleep diaries, a quality-of-life scale (SF-36) and the Dysfunctional Beliefs and Attitudes Scale (DBAS) pre- and post-group.
Sleep diaries (n=68) showed significant differences in Total Sleep Time (TST), Sleep Onset Latency (SOL) and Sleep Quality (SQ). Approximately half of the participants had clinically significant improvements in their TST (increased by 30 minutes), and about a third had a clinically significant decrease (by 30 minutes) in their SOL. SF-36 scores showed statistically significant improvements in all nine domains, and DBAS scores showed statistically significant decreases post-group.
These results demonstrate promising improvements in sleep parameters and quality of life after attendance at the group. CBT for insomnia is a clinically effective and cost-effective approach to the treatment of chronic insomnia.
Ecstasy (3,4-methylenedioxymethamphetamine, MDMA) is an amphetamine derivative that is used recreationally and is now being tested in clinical trials for treatment of posttraumatic stress disorder. Ecstasy can damage serotonin neurones in the brains of experimental animals; however, the relevance of these findings to humans is debated.
To measure, by positron emission tomography (PET), levels of binding to the serotonin transporter (SERT), a marker of serotonin neurones, in the brains of chronic ecstasy users and matched controls.
An estimate of brain SERT levels was obtained, using the PET tracer 11C-DASB, in 50 chronic ecstasy users (use confirmed by hair drug testing; mean age, 26 years; mean duration of drug use, 3.9 years; median drug withdrawal time, 38 days) and 50 drug-hair-negative control subjects (mean age, 26 years).
SERT binding levels in the ecstasy group were significantly decreased by 22 to 46% in frontal, temporal, cingulate, insular and occipital cortices, and by 23% in hippocampus. However, concentrations were distinctly normal in the SERT-rich caudate, putamen, ventral striatum and thalamus.
Our imaging data suggest that cerebral cortical SERT concentration is below normal in some ecstasy users for at least one month after last use of the drug. However, it remains to be established whether low SERT might have preceded drug use, reflects actual loss of brain serotonin neurones, or is causally related to any functional impairment in the ecstasy users. (Supported by US NIH NIDA DA017301).
Pregabalin is indicated for the treatment of GAD in adults in Europe. The efficacy and safety of pregabalin for the treatment of adults and elderly patients with GAD have been demonstrated in 6 of 7 short-term (4- to 8-week) clinical trials.
To characterise the long-term efficacy and safety of pregabalin in subjects with GAD.
Subjects were randomised to double-blind treatment with either high-dose pregabalin (450-600 mg/d), low-dose pregabalin (150-300 mg/d), or lorazepam (3-4 mg/d) for 3 months. Treatment was extended with drug or blinded placebo for a further 3 months.
At 3 months, mean changes from baseline in Hamilton Anxiety Rating Scale (HAM-A) scores for high- and low-dose pregabalin and for lorazepam ranged from -16.0 to -17.4. Mean changes from baseline in Clinical Global Impression-Severity (CGI-S) scores ranged from -2.1 to -2.3, and mean CGI-Improvement (CGI-I) scores were 1.9 for each active treatment group. At 6 months, improvement was retained for all 3 active drug groups, even when switched to placebo. HAM-A and CGI-S change-from-baseline scores ranged from -14.9 to -19.0 and from -2.0 to -2.5, respectively. Mean CGI-I scores ranged from 1.5 to 2.3. The most frequently reported adverse events were insomnia, fatigue, dizziness, headache, and somnolence.
Efficacy was observed at 3 months, with maintained improvement in anxiety symptoms over 6 months of treatment. These results are consistent with previously reported efficacy and safety trials of shorter duration with pregabalin and lorazepam in subjects with GAD.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. When pregabalin is discontinued, a 1-week (minimum) taper is recommended to prevent potential discontinuation symptoms.
To evaluate whether a 1-week pregabalin taper, after 3 or 6 months of treatment, is associated with the development of discontinuation symptoms (including rebound anxiety) in subjects with GAD.
Subjects were randomised to double-blind treatment with low- (150-300 mg/d) or high-dose pregabalin (450-600 mg/d) or lorazepam (3-4 mg/d) for 3 months. After 3 months ~25% of subjects in each group (per the original randomisation) underwent a double-blind, 1-week taper, with substitution of placebo. The remaining subjects continued on active treatment for another 3 months and underwent the 1-week taper at 6 months.
Discontinuation after 3 months was associated with low mean changes in Physician Withdrawal Checklist (PWC) scores (range: +1.4 to +2.3) and Hamilton Anxiety Rating Scale (HAM-A) scores (range: +0.9 to +2.3) for each pregabalin dose and lorazepam. Discontinuation after 6 months was associated with low mean changes in PWC scores (range: -1.0 to +3.0) and HAM-A scores (range: -0.8 to +3.0) for all active drugs and placebo. Incidence of rebound anxiety during pregabalin taper was low and did not appear related to treatment dose or duration.
A 1-week taper following 3 or 6 months of pregabalin treatment was not associated with clinically meaningful discontinuation symptoms as evaluated by changes in the PWC and HAM-A rating scales.