To conduct a pilot study implementing combined genomic and epidemiologic surveillance for hospital-acquired multidrug-resistant organisms (MDROs) to predict transmission between patients and to estimate the local burden of MDRO transmission.
Pilot prospective multicenter surveillance study.
The study was conducted in 8 university hospitals (2,800 beds total) in Melbourne, Australia (population 4.8 million), including 4 acute-care, 1 specialist cancer care, and 3 subacute-care hospitals.
All clinical and screening isolates from hospital inpatients (April 24 to June 18, 2017) were collected for 6 MDROs: vanA VRE, MRSA, ESBL Escherichia coli (ESBL-Ec) and Klebsiella pneumoniae (ESBL-Kp), and carbapenem-resistant Pseudomonas aeruginosa (CRPa) and Acinetobacter baumannii (CRAb). Isolates were analyzed and reported routinely by hospital laboratories, underwent whole-genome sequencing at the central laboratory, and were analyzed using open-source bioinformatic tools. MDRO burden and transmission were assessed using combined genomic and epidemiologic data.
In total, 408 isolates were collected from 358 patients; 47.5% were screening isolates. ESBL-Ec was most common (52.5%), then MRSA (21.6%), vanA VRE (15.7%), and ESBL-Kp (7.6%). Most MDROs (88.3%) were isolated from patients with recent healthcare exposure.
Combining genomics and epidemiology identified that at least 27.1% of MDROs were likely acquired in a hospital; most of these transmission events would not have been detected without genomics. The highest proportion of transmission occurred with vanA VRE (88.4% of patients).
Genomic and epidemiologic data from multiple institutions can feasibly be combined prospectively, providing substantial insights into the burden and distribution of MDROs, including in-hospital transmission. This analysis enables infection control teams to target interventions more effectively.
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000, varying with study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
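The trade-off described here can be sketched as a simple break-even comparison. Every number below (coordinator wage, registry setup cost, marginal cost per registry element) is a hypothetical assumption for illustration, not a parameter from the study's Markov models:

```python
# A minimal break-even sketch of the registry-vs-manual-abstraction
# trade-off. All cost figures are illustrative assumptions, not the
# study's model inputs.
import math

def manual_cost(n_elements, seconds_per_field, wage_per_hour=30.0):
    """Cost of a coordinator manually abstracting n_elements fields."""
    return n_elements * seconds_per_field / 3600.0 * wage_per_hour

def registry_cost(n_elements, setup_cost=1000.0, per_element_cost=0.01):
    """Up-front extraction/linkage work plus a small marginal cost."""
    return setup_cost + n_elements * per_element_cost

def break_even_elements(seconds_per_field, wage_per_hour=30.0,
                        setup_cost=1000.0, per_element_cost=0.01):
    """Smallest element count at which the registry design is cheaper."""
    saving_per_element = (seconds_per_field / 3600.0 * wage_per_hour
                          - per_element_cost)
    if saving_per_element <= 0:
        return None  # registry never pays off under these assumptions
    return math.ceil(setup_cost / saving_per_element)
```

With these toy defaults, the function returns the element count beyond which registry extraction is cheaper; the thresholds reported in the abstract came from far richer simulation models.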
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
Traumatic brain injury (TBI) may lead to persistent depression symptoms. We conducted several pilot studies to examine the efficacy of mindfulness-based interventions to address this issue; all showed strong effect sizes. The logical next step was to conduct a randomized controlled trial (RCT).
We sought to determine the efficacy of mindfulness-based cognitive therapy for people with depression symptoms post-TBI (MBCT-TBI).
Using a multi-site RCT design, participants (mean age = 47) were randomized to intervention or control arms. Treatment participants received a group-based, 10-week intervention; control participants waited. Outcome measures, administered pre- and post-intervention, and after three months, included: Beck Depression Inventory-II (BDI-II), Patient Health Questionnaire-9 (PHQ-9), and Symptom Checklist-90-Revised (SCL-90-R). The Philadelphia Mindfulness Scale (PHLMS) captured present moment awareness and acceptance.
BDI-II scores decreased from 25.47 to 18.84 in treatment groups, while they remained relatively stable in control groups (27.13 to 25.00, respectively; p = .029). We did not find statistically significant differences on the PHQ-9 and SCL-90-R post-treatment. However, after three months, all scores were statistically significantly lower than at baseline (ps < .01). Increases in mindfulness were associated with decreases in BDI-II scores (r = -.401, p = .025).
MBCT-TBI may alleviate depression symptoms up to three months post-intervention. Greater mindfulness may have contributed to the reduction in depression symptoms although the association does not confirm causality. More work is required to replicate these findings, identify subgroups that may better respond to the intervention, and refine the intervention to maximize its effectiveness.
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water-only (WAT), or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % ten repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
The effect of transportation and lairage on the faecal shedding and post-slaughter contamination of carcasses with Escherichia coli O157 and O26 in young calves (4–7-day-old) was assessed in a cohort study at a regional calf-processing plant in the North Island of New Zealand, following 60 calves as cohorts from six dairy farms to slaughter. Multiple samples from each animal at pre-slaughter (recto-anal mucosal swab) and carcass at post-slaughter (sponge swab) were collected and screened using real-time PCR and culture isolation methods for the presence of E. coli O157 and O26 (Shiga toxin-producing E. coli (STEC) and non-STEC). Genotype analysis of E. coli O157 and O26 isolates provided little evidence of faecal–oral transmission of infection between calves during transportation and lairage. Increased cross-contamination of hides and carcasses with E. coli O157 and O26 between co-transported calves was confirmed at pre-hide removal and post-evisceration stages but not at pre-boning (at the end of dressing prior to chilling), indicating that good hygiene practices and application of an approved intervention effectively controlled carcass contamination. This study was the first of its kind to assess the impact of transportation and lairage on the faecal carriage and post-harvest contamination of carcasses with E. coli O157 and O26 in very young calves.
Introduction: The ECG diagnosis of acute coronary occlusion (ACO) in the setting of ventricular paced rhythm (VPR) is purported to be impossible. However, VPR has a similar ECG morphology to LBBB. The validated Smith-modified Sgarbossa criteria (MSC) have high sensitivity (Sens) and specificity (Spec) for ACO in LBBB. MSC consist of ≥1 of the following in ≥1 lead: concordant ST elevation (STE) ≥1 mm, concordant ST depression ≥1 mm in V1-V3, or ST/S ratio <−0.25 (in leads with ≥1 mm STE). We hypothesized that the MSC would have higher Sens for diagnosis of ACO in VPR when compared with the original Sgarbossa criteria. We report preliminary findings of the Paced Electrocardiogram Requiring Fast Emergency Coronary Therapy (PERFECT) study. Methods: The PERFECT study is a retrospective, multicenter, international investigation of ED patients from 1/2008 to 12/2016 with VPR on the ECG and symptoms suggestive of acute coronary syndrome (e.g. chest pain or shortness of breath). Data from four sites are presented. Acute myocardial infarction (AMI) was defined by the Third Universal Definition of AMI. A blinded cardiologist adjudicated ACO, defined as a Thrombolysis in Myocardial Infarction score of 0 or 1 on coronary angiography; a pre-defined subgroup of ACO patients with peak cardiac troponin (cTn) >100 times the 99th percentile upper reference limit (URL) of the cTn assay was also analyzed. Another blinded physician measured all ECGs. Statistics were by Mann-Whitney U, chi-square, and McNemar's tests. Results: The ACO and No-AMI groups consisted of 15 and 79 encounters, respectively. For the ACO and No-AMI groups, median age was 78 [IQR 72-82] vs. 70 [61-75] years, and 13 (86%) vs. 48 (61%) patients were male. The median peak cTn ratio (cTn/URL) was 260 [33-663] vs. 0.5 [0-1.3] for ACO vs. No-AMI. The Sens and Spec for the MSC and the original Sgarbossa criteria were 67% (95% CI 39-87) vs. 46% (22-72; p=0.25) and 99% (92-100) vs. 99% (92-100; p=0.5).
In pre-defined subgroup analysis of ACO patients with peak cTn >100 times the URL (n=10), the Sens was 90% (54-100) for the MSC vs. 60% (27-86) for the original Sgarbossa criteria (p=0.25). Conclusion: ACO in VPR is an uncommon condition. The MSC showed good Sens for diagnosis of ACO in the presence of VPR, especially among patients with high peak cTn, and Spec was excellent. These methods and results are consistent with studies that have used the MSC to diagnose ACO in LBBB.
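As a rough illustration, the three MSC rules quoted in the abstract can be encoded as a single per-lead check. The `Lead` structure, lead names, and measurement conventions below are assumptions of this sketch, not part of the PERFECT study's methods, and this is not a clinical tool:

```python
# Hedged sketch of the modified Sgarbossa criteria (MSC) as quoted in
# the abstract. Measurement conventions (mm of ST deviation at the
# J-point, signed dominant QRS deflection) are assumptions.
from dataclasses import dataclass

@dataclass
class Lead:
    name: str       # e.g. "V2"
    st_mm: float    # ST deviation, mm (+ elevation, - depression)
    qrs_mm: float   # dominant QRS deflection, mm (+ R wave, - S wave)

def concordant(lead):
    """ST deviation in the same direction as the dominant QRS."""
    return lead.st_mm * lead.qrs_mm > 0

def meets_msc(leads):
    """True if any single lead satisfies one of the three MSC rules."""
    for lead in leads:
        # 1. Concordant ST elevation >= 1 mm in any lead.
        if lead.st_mm >= 1 and concordant(lead):
            return True
        # 2. Concordant ST depression >= 1 mm in V1-V3.
        if (lead.name in ("V1", "V2", "V3")
                and lead.st_mm <= -1 and concordant(lead)):
            return True
        # 3. Excessively discordant STE: ST/S ratio < -0.25 in a lead
        #    with >= 1 mm of ST elevation and a dominant S wave.
        if (lead.st_mm >= 1 and lead.qrs_mm < 0
                and lead.st_mm / lead.qrs_mm < -0.25):
            return True
    return False
```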
Effective methods to increase awareness of preventable infectious diseases are key components of successful control programmes. Rabies is an example of a disease with significant impact, where public awareness is variable. A recent awareness campaign in a rabies endemic region of Azerbaijan provided a unique opportunity to assess the efficacy of such campaigns. A cluster cross-sectional survey concerning rabies was undertaken following the awareness campaign in 600 households in 38 randomly selected towns, in districts covered by the campaign and matched control regions. This survey demonstrated that the relatively simple awareness campaign was effective at improving knowledge of rabies symptoms and vaccination schedules. Crucially, those in the awareness campaign group were also 1·4 times more likely to report that they had vaccinated their pets, an essential component of human rabies prevention. In addition, low knowledge of appropriate post-exposure treatment and animal sources of rabies provide information useful for future public awareness campaigns in the region and other similar areas.
On 27 April 2015, Washington health authorities identified Escherichia coli O157:H7 infections associated with dairy education school field trips held in a barn 20–24 April. Investigation objectives were to determine the magnitude of the outbreak, identify the source of infection, prevent secondary illness transmission and develop recommendations to prevent future outbreaks. Case-finding, hypothesis generating interviews, environmental site visits and a case–control study were conducted. Parents and children were interviewed regarding event activities. Odds ratios (OR) and 95% confidence intervals (CI) were computed. Environmental testing was conducted in the barn; isolates were compared to patient isolates using pulsed-field gel electrophoresis (PFGE). Sixty people were ill, 11 (18%) were hospitalised and six (10%) developed haemolytic uremic syndrome. Ill people ranged in age from <1 year to 47 years (median: 7), and 20 (33%) were female. Twenty-seven case-patients and 88 controls were enrolled in the case–control study. Among first-grade students, handwashing (i.e. soap and water, or hand sanitiser) before lunch was protective (adjusted OR 0.13; 95% CI 0.02–0.88, P = 0.04). Barn samples yielded E. coli O157:H7 with PFGE patterns indistinguishable from patient isolates. This investigation provided epidemiological, laboratory and environmental evidence for a large outbreak of E. coli O157:H7 infections from exposure to a contaminated barn. The investigation highlights the often overlooked risk of infection through exposure to animal environments as well as the importance of handwashing for disease prevention. Increased education and encouragement of infection prevention measures, such as handwashing, can prevent illness.
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees to the area began to amass. Historically, the Dallas area is one familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
To assess relationships between mothers’ feeding practices (food as a reward, food for emotion regulation, modelling of healthy eating) and mothers’ willingness to purchase child-marketed foods and fruits/vegetables (F&V) requested by their children during grocery co-shopping.
Cross-sectional. Mothers completed an online survey that included questions about feeding practices and willingness (i.e. intentions) to purchase child-requested foods during grocery co-shopping. Feeding practices scores were dichotomized at the median. Foods were grouped as nutrient-poor or nutrient-dense (F&V) based on national nutrition guidelines. Regression models compared mothers with above-the-median v. at-or-below-the-median feeding practices scores on their willingness to purchase child-requested food groupings, adjusting for demographic covariates.
Participants completed an online survey generated at a public university in the USA.
Mothers (n 318) of 2- to 7-year-old children.
Mothers who scored above-the-median on using food as a reward were more willing to purchase nutrient-poor foods (β=0·60, P<0·0001), mothers who scored above-the-median on use of food for emotion regulation were more willing to purchase nutrient-poor foods (β=0·29, P<0·0031) and mothers who scored above-the-median on modelling of healthy eating were more willing to purchase nutrient-dense foods (β=0·22, P<0·001) than were mothers with at-or-below-the-median scores, adjusting for demographic covariates.
Mothers who reported using food to control children’s behaviour were more willing to purchase child-requested, nutrient-poor foods. Parental feeding practices may facilitate or limit children’s foods requested in grocery stores. Parent–child food consumer behaviours should be investigated as a route that may contribute to children’s eating patterns.
To determine the scope, source, and mode of transmission of a multifacility outbreak of extensively drug-resistant (XDR) Acinetobacter baumannii.
SETTING AND PARTICIPANTS
Residents and patients in skilled nursing facilities, long-term acute-care hospital, and acute-care hospitals.
A case was defined as the incident isolate from clinical or surveillance cultures of XDR Acinetobacter baumannii resistant to imipenem or meropenem and nonsusceptible to all but 1 or 2 antibiotic classes in a patient in an Oregon healthcare facility during January 2012–December 2014. We queried clinical laboratories, reviewed medical records, oversaw patient and environmental surveillance surveys at 2 facilities, and recommended interventions. Pulsed-field gel electrophoresis (PFGE) and molecular analysis were performed.
We identified 21 cases, highly related by PFGE or healthcare facility exposure. Overall, 17 patients (81%) were admitted to long-term acute-care hospital A (n=8), skilled nursing facility A (n=8), or both (n=1) prior to XDR A. baumannii isolation. Interfacility communication of patient or resident XDR status was not performed during transfer between facilities. The rare plasmid-encoded carbapenemase gene blaOXA-237 was present in 16 outbreak isolates. Contact precautions, chlorhexidine baths, enhanced environmental cleaning, and interfacility communication were implemented for cases to halt transmission.
Interfacility transmission of XDR A. baumannii carrying the rare blaOXA-237 was facilitated by transfer of affected patients without communication to receiving facilities.
Calcium-based renal calculi demonstrated significant heterogeneity in the structure, density, mineral composition, and material hardness not elucidated by routine clinical testing. Mineral density distributions within calcium oxalate stones revealed differential areas of low (590±80 mg/cc), medium (840±140 mg/cc), and high (1100±200 mg/cc) densities. Apatite stones also contained regions of low (700±200 mg/cc), medium (1100±200 mg/cc), and high (1400±140 mg/cc) densities within layers extending from single or multiple nucleation sites. Despite having lower average mineral density, calcium oxalate (CaOx) stones demonstrated higher material hardness compared to apatite stones, suggesting other chemical components might be involved in determining stone hardness properties. Carbon concentrated sites were identified between morphologic layers in CaOx stones and in stratified layers of apatite stones. Elemental analyses revealed numerous additional trace elements in both stone types. Despite the widespread assumption that stone mineral density is an indicator of susceptibility to lithotripsy, calcium stone mineral density estimates do not directly correlate with actual ex vivo stone hardness. Underlying stone heterogeneity in both structure and mineral density could explain why historical approaches have failed in accurately predicting response of stones to lithotripsy.
Salmonella is a leading cause of bacterial foodborne illness. We report the collaborative investigative efforts of US and Canadian public health officials during the 2013–2014 international outbreak of multiple Salmonella serotype infections linked to sprouted chia seed powder. The investigation included open-ended interviews of ill persons, traceback, product testing, facility inspections, and trace forward. Ninety-four persons infected with outbreak strains from 16 states and four provinces were identified; 21% were hospitalized and none died. Fifty-four (96%) of 56 persons who consumed chia seed powder, reported 13 different brands that traced back to a single Canadian firm, distributed by four US and eight Canadian companies. Laboratory testing yielded outbreak strains from leftover and intact product. Contaminated product was recalled. Although chia seed powder is a novel outbreak vehicle, sprouted seeds are recognized as an important cause of foodborne illness; firms should follow available guidance to reduce the risk of bacterial contamination during sprouting.
The Dark Energy Survey is undertaking an observational programme imaging 1/4 of the southern hemisphere sky with unprecedented photometric accuracy. In the process of observing millions of faint stars and galaxies to constrain the parameters of the dark energy equation of state, the Dark Energy Survey will obtain pre-discovery images of the regions surrounding an estimated 100 gamma-ray bursts over 5 yr. Once gamma-ray bursts are detected by, e.g., the Swift satellite, the DES data will be extremely useful for follow-up observations by the transient astronomy community. We describe a recently commissioned suite of software that listens continuously for automated notices of gamma-ray burst activity, collates information from archival DES data, and disseminates relevant data products back to the community in near-real-time. Of particular importance are the opportunities that non-public DES data provide for relative photometry of the optical counterparts of gamma-ray bursts, as well as for identifying key characteristics (e.g., photometric redshifts) of potential gamma-ray burst host galaxies. We provide the functional details of the DESAlert software, and its data products, and we show sample results from the application of DESAlert to numerous previously detected gamma-ray bursts, including the possible identification of several heretofore unknown gamma-ray burst hosts.
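As a toy illustration of one step a service like this performs (matching an incoming burst position against archival catalog sources), here is a minimal positional cone search. The catalog schema and search radius are hypothetical; this is not DESAlert's actual implementation:

```python
# Illustrative cone search: find archival catalog sources within a
# given angular radius of an alert position. Catalog rows here are a
# made-up schema, not DES data products.
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((dec2 - dec1) / 2) ** 2
         + math.cos(dec1) * math.cos(dec2)
         * math.sin((ra2 - ra1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def cone_search(catalog, ra, dec, radius_deg):
    """Return catalog entries within radius_deg of the alert position."""
    return [src for src in catalog
            if angular_sep_deg(src["ra"], src["dec"], ra, dec) <= radius_deg]
```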
Fontan survivors have depressed cardiac index that worsens over time. Serum biomarker measurement is minimally invasive, rapid, widely available, and may be useful for serial monitoring. The purpose of this study was to identify biomarkers that correlate with lower cardiac index in Fontan patients.
Methods and results
This study was a multi-centre case series assessing the correlations between biomarkers and cardiac magnetic resonance-derived cardiac index in Fontan patients ≥6 years of age with biochemical and haematopoietic biomarkers obtained ±12 months from cardiac magnetic resonance. Medical history and biomarker values were obtained by chart review. Spearman’s rank correlation assessed associations between biomarker z-scores and cardiac index. Biomarkers with significant correlations had receiver operating characteristic curves and area under the curve estimated. In total, 97 cardiac magnetic resonances in 87 patients met inclusion criteria: median age at cardiac magnetic resonance was 15 (6–33) years. Significant correlations were found between cardiac index and total alkaline phosphatase (−0.26, p=0.04), estimated creatinine clearance (0.26, p=0.02), and mean corpuscular volume (−0.32, p<0.01). Area under the curve for the three individual biomarkers was 0.63–0.69. Area under the curve for the three-biomarker panel was 0.75. Comparison of cardiac index above and below the receiver operating characteristic curve-identified cut-off points revealed significant differences for each biomarker (p<0.01) and for the composite panel (median cardiac index 2.17 L/minute/m² for the higher-risk group versus 2.96 L/minute/m² for the lower-risk group; p<0.01).
Higher total alkaline phosphatase and mean corpuscular volume as well as lower estimated creatinine clearance identify Fontan patients with lower cardiac index. Using biomarkers to monitor haemodynamics and organ-specific effects warrants prospective investigation.