Objective:
To describe the epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled nursing facility (SNF), and the strategies that controlled transmission.
Design, Setting, and Participants:
Cohort study during March 22–May 4, 2020 of all staff and residents at a 780-bed SNF in San Francisco, California.
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPS) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2; whole genome sequencing (WGS) characterized viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff member who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and recommending PPE (isolation gown, gloves, N95 respirator, and eye protection) for clinical interactions in units with confirmed cases.
Results:
Of 725 staff and residents tested through targeted testing and serial PPS, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen (71%) were linked to a single unit. Targeted testing identified 17 (81%) cases; PPS identified 4 (19%). Most cases (71%) were identified before IPC intervention. WGS was performed on SARS-CoV-2 isolates from four staff and four residents; five were of Santa Clara County lineage and the other three were of distinct lineages.
Conclusions:
Early implementation of targeted testing, serial PPS, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
The criteria for objective memory impairment in mild cognitive impairment (MCI) are vaguely defined. Aggregating the number of abnormal memory scores (NAMS) is one way to operationalise memory impairment, which we hypothesised would predict progression to Alzheimer’s disease (AD) dementia.
As part of the Australian Imaging, Biomarkers and Lifestyle Flagship Study of Ageing, 896 older adults who did not have dementia were administered a psychometric battery including three neuropsychological tests of memory, yielding 10 indices of memory. We calculated the number of memory scores corresponding to z ≤ −1.5 (i.e., NAMS) for each participant. Incident diagnosis of AD dementia was established by consensus of an expert panel after 3 years.
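The NAMS measure described above reduces to counting how many of a participant's memory indices fall at or below z = −1.5. A minimal sketch of that calculation (the index values below are invented for illustration, not data from the study):

```python
def count_abnormal_memory_scores(z_scores, cutoff=-1.5):
    """Return the number of abnormal memory scores (NAMS):
    how many indices fall at or below the z-score cutoff."""
    return sum(1 for z in z_scores if z <= cutoff)

# Hypothetical participant with 10 memory indices expressed as z-scores
participant = [-0.3, -1.6, -2.1, 0.4, -1.5, -0.9, -1.7, 0.1, -0.2, -1.1]
print(count_abnormal_memory_scores(participant))  # 4 scores at or below -1.5
```

Note that the cutoff is inclusive (z ≤ −1.5), matching the criterion stated in the abstract.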
Of the 722 (80.6%) participants who were followed up, 54 (7.5%) developed AD dementia. There was a strong correlation between NAMS and probability of developing AD dementia (r = .91, p = .0003). Each abnormal memory score conferred an additional 9.8% risk of progressing to AD dementia. The area under the receiver operating characteristic curve for NAMS was 0.87 [95% confidence interval (CI) .81–.93, p < .01]. The odds ratio for NAMS was 1.67 (95% CI 1.40–2.01, p < .01) after correcting for age, sex, education, estimated intelligence quotient, subjective memory complaint, Mini-Mental State Exam (MMSE) score and apolipoprotein E ϵ4 status.
Aggregation of abnormal memory scores may be a useful way of operationalising objective memory impairment, predicting incident AD dementia and providing prognostic stratification for individuals with MCI.
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess the impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, the same treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent herbicide rates, indicating that injury in the combination is attributable mostly to 2,4-D. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of 2,4-D applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be cause for concern for sweetpotato producers. In some cases, yield reduction of U.S. No. 1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
A major concern of sweetpotato producers is the potential for negative effects from herbicide drift or sprayer contamination when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects on sweetpotato of reduced rates of the N,N-Bis-(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or each salt in combination with glyphosate, evaluated in separate trials. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rate of each dicamba formulation at 0.56 kg ha−1, glyphosate at 1.12 kg ha−1, and a combination of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and at storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury in the combination is attributable mostly to dicamba. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of dicamba applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern for sweetpotato producers.
However, in some cases yield reduction of U.S. No. 1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× rates of dicamba alone or with glyphosate applied at storage root development.
Increasing weed control costs and limited herbicide options threaten vegetable crop profitability. Traditional interrow mechanical cultivation is very effective at removing weeds between crop rows, but weed control within the crop rows is also necessary to establish the crop and prevent yield loss. Currently, many vegetable crops require hand weeding to remove weeds within the row that remain after traditional cultivation and herbicide use. Intelligent cultivators have come into commercial use to remove intrarow weeds and reduce the cost of hand weeding. Intelligent cultivators currently on the market, such as the Robovator, use pattern recognition to detect the crop row; they do not differentiate between crops and weeds and do not work well in high weed populations. One approach to differentiating weeds is to place a machine-detectable mark or signal on the crop (i.e., the crop has the mark and the weed does not), thereby facilitating weed/crop differentiation. Lettuce and tomato plants were marked with labels and topical markers, then cultivated with an intelligent cultivator programmed to identify the markers. In field trials on marked tomato and lettuce, the intelligent cultivator removed 90% more weeds from tomato and 66% more weeds from lettuce than standard cultivators, without reducing yields. The accurate crop and weed differentiation described here resulted in a 45% to 48% reduction in hand-weeding time per hectare.
Field studies were conducted in 2016 and 2017 in Clinton, NC, to determine the interspecific and intraspecific interference of Palmer amaranth (Amaranthus palmeri S. Watson) or large crabgrass [Digitaria sanguinalis (L.) Scop.] in ‘Covington’ sweetpotato [Ipomoea batatas (L.) Lam.]. Amaranthus palmeri and D. sanguinalis were established 1 d after sweetpotato transplanting and maintained season-long at 0, 1, 2, 4, 8 and 0, 1, 2, 4, 16 plants m−1 of row in the presence and absence of sweetpotato, respectively. Predicted yield loss for sweetpotato was 35% to 76% for D. sanguinalis at 1 to 16 plants m−1 of row and 50% to 79% for A. palmeri at 1 to 8 plants m−1 of row. Weed dry biomass per meter of row increased linearly with increasing weed density. Individual dry biomass of A. palmeri and D. sanguinalis was not affected by weed density when grown in the presence of sweetpotato. When grown without sweetpotato, individual weed dry biomass decreased 71% and 62% from 1 to 4 plants m−1 of row for A. palmeri and D. sanguinalis, respectively. Individual weed dry biomass was not affected above 4 plants m−1 of row up to the highest densities of 8 and 16 plants m−1 of row for A. palmeri and D. sanguinalis, respectively.
Syndromic surveillance generates information for public health action by collecting, analysing and interpreting routine health-related data on symptoms and clinical signs reported by patients and clinicians, rather than relying on microbiologically or clinically confirmed cases. In England, a suite of national real-time syndromic surveillance systems (SSS) has been developed over the last 20 years, utilising data from a variety of health care settings (a telehealth triage system, general practice and emergency departments). The real-time systems in England have been used for early detection (e.g. of seasonal influenza), for situational awareness (e.g. describing the size and demographics of the impact of a heatwave) and for reassurance about the lack of impact of mass gatherings on population health (e.g. the London 2012 Olympic and Paralympic Games). We highlight the lessons learnt from running SSS for nearly two decades and propose questions and issues still to be addressed. We feel that syndromic surveillance is an example of the use of ‘big data’, but contend that the focus for sustainable and useful systems should be on their added value and on the importance of people working together to maximise the public health value of syndromic surveillance services.
We describe the motivation and design details of the ‘Phase II’ upgrade of the Murchison Widefield Array radio telescope. The expansion doubles to 256 the number of antenna tiles deployed in the array. The new antenna tiles enhance the capabilities of the Murchison Widefield Array in several key science areas. Seventy-two of the new tiles are deployed in a regular configuration near the existing array core. These new tiles enhance the surface brightness sensitivity of the array and will improve the ability of the Murchison Widefield Array to estimate the slope of the Epoch of Reionisation power spectrum by a factor of ∼3.5. The remaining 56 tiles are deployed on long baselines, doubling the maximum baseline of the array and improving the array u, v coverage. The improved imaging capabilities will provide an order of magnitude improvement in the noise floor of Murchison Widefield Array continuum images. The upgrade retains all of the features that have underpinned the Murchison Widefield Array’s success (large field of view, snapshot image quality, and pointing agility) and boosts the scientific potential with enhanced imaging capabilities and by enabling new calibration strategies.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
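The abstract states that per-cohort linear-model results were combined by fixed-effects meta-analysis. The standard inverse-variance form of that step can be sketched as follows; the per-cohort slopes and standard errors below are invented for illustration, not the study's values.

```python
import math

def fixed_effects_meta(betas, ses):
    """Inverse-variance weighted fixed-effects meta-analysis.
    Each cohort estimate is weighted by 1/SE^2; returns the pooled
    estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Illustrative per-cohort slopes (ml FEV1 per nmol/l 25(OH)D) -- made up
betas = [1.0, 1.2, 0.9]
ses = [0.2, 0.3, 0.25]
est, se = fixed_effects_meta(betas, ses)
print(round(est, 3), round(se, 3))
```

Cohorts with smaller standard errors dominate the pooled estimate, which is why large cohorts carry most of the weight in a fixed-effects model.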
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and β-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median time of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, during radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT, most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
This paper summarises developments in understanding sea level change during the Quaternary in Scotland since the publication of the Quaternary of Scotland Geological Conservation Review volume in 1993. We present a review of progress in methodology, particularly in the study of sediments in isolation basins and estuaries as well as in techniques in the field and laboratory, which have together disclosed greater detail in the record of relative sea level (RSL) change than was available in 1993. However, progress in determining the record of RSL change varies between areas. Studies of sediments and stratigraphy offshore on the continental shelf have increased greatly, but the record of RSL change there remains patchy. Studies onshore have improved knowledge of rock shorelines, including the processes by which they are formed, but much remains to be understood. Studies of Late Devensian and Holocene RSLs around present coasts have improved knowledge of both the extent and age range of the evidence. The record of RSL change on the W and NW coasts has revealed a much longer dated RSL record than was available before 1993, possibly with evidence of Meltwater Pulse 1A, while studies in estuaries on the E and SW coasts have revealed widespread and consistent fluctuations in Holocene RSLs. Evidence for the meltwater pulse associated with the Early Holocene discharge of Lakes Agassiz–Ojibway in N America has been found on both E and W coasts. The impact of storminess, in particular in the form of cliff-top storm deposits, has been widely identified. Further information on the Holocene Storegga Slide tsunami has enabled a better understanding of the event, but evidence for other tsunami events on Scottish coasts remains uncertain.
Methodological developments have led to new reconstructions of RSL change for the last 2000 years, utilising state-of-the-art GIA models alongside coastal biostratigraphy to determine trends for comparison with modern tide gauge and documentary evidence. Developments in GIA modelling have provided valuable information on patterns of land uplift during and following deglaciation. The studies undertaken raise a number of research questions that will need to be addressed in future work.
Combinatorial auctions enhance our ability to efficiently allocate multiple resources in complex economic environments. They explicitly allow buyers and sellers of goods and services to bid on packages of items with related values or costs. For example, “I bid $10 to buy 1 unit of item A and 2 units of item B, but I won't pay anything unless I get everything.” They also allow buyers, sellers and the auctioneer to impose logical constraints that limit the feasible set of auction allocations. For example, “I bid $12 to buy 2 units of item C OR $15 to buy 3 units of item D, but I don't want both.” Finally, they can handle functional relationships amongst bids or allocations, such as budget constraints or aggregation limits that allow many bids to be connected together. For example, “I won't spend more than a total of $35 on all my bids” or “This auction will allocate no more than a total of 7 units of items F, G and H.”
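As an illustration of the bid structures described above (all-or-nothing package bids, XOR constraints between bids, and supply limits), here is a minimal brute-force winner-determination sketch. All bids, supplies, and constraints are invented; real combinatorial auctions solve this as an integer program rather than by enumeration.

```python
from itertools import combinations

# Hypothetical package bids: (price, {item: quantity}), all-or-nothing
bids = [
    (10, {"A": 1, "B": 2}),   # "$10 for 1 unit of A and 2 units of B"
    (12, {"C": 2}),
    (15, {"D": 3}),
    (9,  {"A": 1}),
]
supply = {"A": 1, "B": 2, "C": 2, "D": 3}
xor_groups = [{1, 2}]  # bids 1 and 2 are mutually exclusive ("not both")

def feasible(chosen):
    """A set of winning bids is feasible if no item's supply is
    exceeded and no XOR group contributes more than one winner."""
    used = {}
    for i in chosen:
        for item, qty in bids[i][1].items():
            used[item] = used.get(item, 0) + qty
            if used[item] > supply[item]:
                return False
    return all(len(group & set(chosen)) <= 1 for group in xor_groups)

# Enumerate all feasible bid subsets and pick the revenue-maximising one
best = max(
    (subset for r in range(len(bids) + 1)
     for subset in combinations(range(len(bids)), r)
     if feasible(subset)),
    key=lambda s: sum(bids[i][0] for i in s),
)
print(best, sum(bids[i][0] for i in best))
```

Here bids 0 and 3 compete for the single unit of A, and the XOR group forces a choice between bids 1 and 2, so the optimal allocation takes bids 0 and 2 for total revenue 25.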
There are several reasons to prefer to have the bidding message space expanded beyond the simple space used for traditional single commodity auctions. As Bykowsky et al. (2000) point out, when values have strong complementarities, there is a danger of ‘financial exposure’ that results in losses to bidders if combinatorial bidding is not allowed. For example, in the case of complementary items such as airport take-off and landing times, the ability to reduce uncertainty to the bidder by allowing him to precisely declare his object of value, a cycle of slots for an entire daily flight pattern, is obvious: one component slot not acquired ruins the value of the flight cycle. In the same situation substitution possibilities would also be important to consider: if flight cycle A is not won, cycle B may be an appropriate though less valuable substitute for the crew and equipment available. Allocation inefficiencies due to financial exposure in noncombinatorial auctions have been frequently demonstrated in experiments beginning with Rassenti et al. (1982) (see also Porter (1999), Banks et al. (1989), Ledyard et al. (2002) and Kwasnika et al. (1998)).
The brain-derived neurotrophic factor (BDNF) Val66Met polymorphism Met allele exacerbates amyloid-β (Aβ)-related decline in episodic memory (EM) and hippocampal volume (HV) over 36–54 months in preclinical Alzheimer's disease (AD). However, the extent to which Aβ+ status and BDNF Val66Met are related to circulating markers of BDNF (e.g. in serum) is unknown. We aimed to determine the effect of Aβ and the BDNF Val66Met polymorphism on levels of serum mature BDNF (mBDNF), EM, and HV at baseline and over 18 months.
Non-demented older adults (n = 446) underwent Aβ neuroimaging and BDNF Val66Met genotyping. EM and HV were assessed at baseline and 18 months later. Fasted blood samples were obtained from each participant at baseline and at 18-month follow-up. Aβ PET neuroimaging was used to classify participants as Aβ– or Aβ+.
At baseline, Aβ+ adults showed worse EM impairment and lower serum mBDNF levels relative to Aβ– adults. The BDNF Val66Met polymorphism did not affect serum mBDNF, EM, or HV at baseline. Over 18 months, Aβ+ Val homozygotes showed significant decline in EM and HV compared with Aβ– Val homozygotes, but no change in serum mBDNF. Similarly, Aβ+ Met carriers showed significant decline in EM and HV over 18 months compared with Aβ+ Val homozygotes, but no change in serum mBDNF.
While allelic variation in BDNF Val66Met may influence Aβ+ related neurodegeneration and memory loss over the short term, this is not related to serum mBDNF. Longer follow-up intervals may be required to further determine any relationships between serum mBDNF, EM, and HV in preclinical AD.
We present low-frequency spectral energy distributions of 60 known radio pulsars observed with the Murchison Widefield Array telescope. We searched the GaLactic and Extragalactic All-sky Murchison Widefield Array survey images for 200-MHz continuum radio emission at the position of all pulsars in the Australia Telescope National Facility (ATNF) pulsar catalogue. For the 60 confirmed detections, we have measured flux densities in 20 × 8 MHz bands between 72 and 231 MHz. We compare our results to existing measurements and show that the Murchison Widefield Array flux densities are in good agreement.
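The abstract does not describe the spectral fitting procedure, but pulsar spectral energy distributions of this kind are commonly summarised by a power-law spectral index (S ∝ ν^α), fitted by least squares in log–log space. The following sketch uses invented flux densities that follow an exact power law, so the fit recovers the index exactly; it is illustrative only, not the survey's method.

```python
import math

def spectral_index(freqs_mhz, fluxes_jy):
    """Least-squares slope of log10(S) against log10(nu),
    i.e. the power-law spectral index alpha in S = k * nu^alpha."""
    xs = [math.log10(f) for f in freqs_mhz]
    ys = [math.log10(s) for s in fluxes_jy]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Invented band fluxes following S = 5 * (nu/100 MHz)^-1.8 exactly
freqs = [76, 107, 138, 169, 200, 231]   # MHz, spanning the 72-231 MHz range
fluxes = [5.0 * (f / 100.0) ** -1.8 for f in freqs]
print(round(spectral_index(freqs, fluxes), 2))  # -1.8 recovered
```

With real measurements, per-band flux uncertainties would normally enter as weights in the fit rather than the unweighted slope shown here.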
Approximately half of the variation in wellbeing measures overlaps with variation in personality traits. Studies of non-human primate pedigrees and human twins suggest that this is due to common genetic influences. We tested whether personality polygenic scores for the NEO Five-Factor Inventory (NEO-FFI) domains and for item response theory (IRT) derived extraversion and neuroticism scores predict variance in wellbeing measures. Polygenic scores were based on published genome-wide association (GWA) results in over 17,000 individuals for the NEO-FFI and in over 63,000 for the IRT extraversion and neuroticism traits. The NEO-FFI polygenic scores were used to predict life satisfaction in 7 cohorts, positive affect in 12 cohorts, and general wellbeing in 1 cohort (maximal N = 46,508). Meta-analysis of these results showed no significant association between NEO-FFI personality polygenic scores and the wellbeing measures. IRT extraversion and neuroticism polygenic scores were used to predict life satisfaction and positive affect in almost 37,000 individuals from UK Biobank. Significant positive associations (effect sizes <0.05%) were observed between the extraversion polygenic score and wellbeing measures, and a negative association was observed between the polygenic neuroticism score and life satisfaction. Furthermore, using GWA data, genetic correlations of -0.49 and -0.55 were estimated between neuroticism with life satisfaction and positive affect, respectively. The moderate genetic correlation between neuroticism and wellbeing is in line with twin research showing that genetic influences on wellbeing are also shared with other independent personality domains.
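Polygenic score prediction of the kind described above is, at its core, a weighted allele count: each SNP's published GWA effect size multiplied by the individual's allele dosage (0, 1, or 2), summed over SNPs. A minimal sketch with invented SNP identifiers, weights, and genotypes:

```python
def polygenic_score(genotypes, weights):
    """Weighted allele-count polygenic score: sum over SNPs of the
    GWA effect size times the individual's allele dosage (0, 1, or 2)."""
    return sum(weights[snp] * dose for snp, dose in genotypes.items())

# Invented effect sizes and genotypes for three hypothetical SNPs
weights = {"rs0001": 0.02, "rs0002": -0.01, "rs0003": 0.005}
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(round(polygenic_score(person, weights), 4))  # 0.03
```

In practice these scores are built from thousands to millions of SNPs and then entered as predictors in regression models of the target trait (here, the wellbeing measures).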
This paper explores the multi-faceted nature of the spiritual geopolitics that shaped Cromwellian foreign policy in relation to the Western Design of 1654-5. It stresses the central importance of Protestant religion as a motivating force and argues that the failure of the Design was interpreted in religious terms just as much as its original aims had been. A number of motives combined to drive Cromwell into launching the Western Design. These included: the ‘Elizabethan’ tradition of English anti-Spanish policy; the pursuit of England’s imperial/colonial interests in the Caribbean; an attempt to strengthen England financially by weakening the Spanish economy; the search for security within Europe by allying with France against Spain; and, underpinning all these, the launching of a Protestant crusade against a power that Cromwell regarded as England’s ‘providential enemy’. The failure of the Design in the summer of 1655 was perceived in similarly religious terms. Just as recent scholarship on Britain’s internal conflicts of the 1640s has emphasised the central role of religion and its inseparability from other issues, so the same phenomenon is evident not only in the motives for the Western Design but also in how its defeat was perceived and interpreted.