Cannabis use has been linked to poorer episodic memory. However, little is known about whether depression and sex may interact as potential moderators of this association, particularly among adolescents. The current study addresses this by examining interactions between depression symptoms and sex on the association between cannabis use and episodic memory in a large sample of adolescents.
Cross-sectional data from 360 adolescents (Mage = 17.38 years, SD = 0.75) were analyzed at the final assessment wave of a two-year longitudinal study. We used the Drug Use History Questionnaire to assess lifetime cannabis use and the Computerized Diagnostic Interview Schedule for Children, Fourth Edition to assess the number of depression symptoms in the past year. Subtests from the Wechsler Memory Scale, Fourth Edition and the California Verbal Learning Test, Second Edition were used to assess episodic memory performance.
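For readers who want to see the shape of such a moderation analysis, the sketch below fits an ordinary least squares model with the three-way interaction the study describes. It is a minimal illustration on simulated data; all variable names, distributions, and details beyond the interaction structure are assumptions, not the authors' code.

```python
# Hedged sketch of the moderation model: episodic memory regressed on
# cannabis use, depression symptoms, sex, and their interactions.
# All data and column names below are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 360
df = pd.DataFrame({
    "cannabis_use": rng.poisson(5, n),   # lifetime use frequency (hypothetical)
    "depression": rng.poisson(2, n),     # past-year symptom count (hypothetical)
    "sex": rng.choice(["M", "F"], n),
    "memory": rng.normal(100, 15, n),    # episodic memory composite (hypothetical)
})

# The three-way interaction term; lower-order terms are expanded automatically.
fit = smf.ols("memory ~ cannabis_use * depression * C(sex)", data=df).fit()
print(fit.summary())
```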
The three-way interaction among cannabis use, depression symptoms, and sex did not significantly predict episodic memory performance. However, follow-up analyses revealed a significant two-way interaction between cannabis use and depression symptoms, such that the association between cannabis use and episodic memory was significant only at lower and average levels of depression symptoms.
Contrary to our hypotheses, we found that as depression symptoms increased, the negative association between cannabis use and episodic memory diminished. Given the use of a predominantly subsyndromic sample, future studies should attempt to replicate findings among individuals with more severe depression.
In difficult-to-treat depression (DTD), the outcome metrics historically used to evaluate treatment effectiveness may be suboptimal. Metrics based on remission status and on single end-point (SEP) assessment may be problematic given infrequent symptom remission, temporal instability, and poor durability of benefit in DTD.
Self-report and clinician assessment of depression symptom severity were regularly obtained over a 2-year period in a chronic and highly treatment-resistant registry sample (N = 406) receiving treatment as usual, with or without vagus nerve stimulation. Twenty alternative metrics for characterizing symptomatic improvement were evaluated, contrasting SEP metrics with integrative (INT) metrics that aggregated information over time. Metrics were compared in effect size and discriminating power when contrasting groups that did (N = 153) and did not (N = 253) achieve a threshold level of improvement in end-point quality-of-life (QoL) scores, and in their association with continuous QoL scores.
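As a concrete reading of what an integrative (INT) metric aggregates, the sketch below computes the proportion of assessments at partial response or better for one patient's score series. The 25% improvement threshold is a common convention assumed here for illustration; the registry's exact definitions may differ.

```python
# Illustrative INT metric: fraction of the observation period spent at
# partial response or better, from repeated symptom-severity scores.
import numpy as np

def proportion_in_partial_response(severity, baseline, threshold=0.25):
    """Fraction of assessments with >= 25% improvement from baseline
    (a common partial-response convention; the study's rule may differ)."""
    severity = np.asarray(severity, dtype=float)
    improvement = (baseline - severity) / baseline
    return float(np.mean(improvement >= threshold))

# One patient's hypothetical symptom scores over the observation period.
scores = [28, 26, 20, 18, 22, 19, 15, 14, 16, 13, 12, 12]
print(proportion_in_partial_response(scores, baseline=28))  # -> 0.75
```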
Metrics based on remission status had smaller effect size and poorer discrimination of the binary QoL outcome and weaker associations with the continuous end-point QoL scores than metrics based on partial response or response. The metrics with the strongest performance characteristics were the SEP measure of percentage change in symptom severity and the INT metric quantifying the proportion of the observation period in partial response or better. Both metrics contributed independent variance when predicting end-point QoL scores.
Revision is needed in the metrics used to quantify symptomatic change in DTD with consideration of INT time-based measures as primary or secondary outcomes. Metrics based on remission status may not be useful.
The purpose of this investigation was to expand upon the limited existing research examining the test–retest reliability, cross-sectional validity and longitudinal validity of a sample of bioelectrical impedance analysis (BIA) devices as compared with a laboratory four-compartment (4C) model. Seventy-three healthy participants aged 19–50 years were assessed by each of fifteen BIA devices, with resulting body fat percentage estimates compared with a 4C model utilising air displacement plethysmography, dual-energy X-ray absorptiometry and bioimpedance spectroscopy. A subset of thirty-seven participants returned for a second visit 12–16 weeks later and were included in an analysis of longitudinal validity. The sample of devices included fourteen consumer-grade and one research-grade model in a variety of configurations: hand-to-hand, foot-to-foot and bilateral hand-to-foot (octapolar). BIA devices demonstrated high reliability, with precision error ranging from 0·0 to 0·49 %. Cross-sectional validity varied, with constant error relative to the 4C model ranging from −3·5 (sd 4·1) % to 11·7 (sd 4·7) %, standard error of the estimate values of 3·1–7·5 % and Lin’s concordance correlation coefficients (CCC) of 0·48–0·94. For longitudinal validity, constant error ranged from −0·4 (sd 2·1) % to 1·3 (sd 2·7) %, with standard error of the estimate values of 1·7–2·6 % and Lin’s CCC of 0·37–0·78. While performance varied widely across the sample investigated, select models of BIA devices (particularly octapolar and select foot-to-foot devices) may hold potential utility for the tracking of body composition over time, particularly in contexts in which the purchase or use of a research-grade device is infeasible.
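Lin's concordance correlation coefficient, reported above as a validity metric, can be computed directly from its standard formula; the sketch below does so on invented body-fat estimates, purely to show how the statistic is formed.

```python
# Lin's CCC: 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2),
# using population (biased) moments. Example data are made up.
import numpy as np

def lins_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]  # population covariance
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

bia = [22.1, 30.5, 18.2, 27.9, 35.0]     # hypothetical BIA body-fat %
four_c = [24.0, 31.2, 19.5, 29.1, 33.8]  # hypothetical 4C-model body-fat %
print(round(lins_ccc(bia, four_c), 3))
```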
Disruptive behavior disorders (DBD) are heterogeneous at both the clinical and the biological level. Our aims were therefore to dissect the heterogeneous neurodevelopmental deviations in the affective brain circuitry and to integrate these differences across modalities.
We combined two novel approaches. First, we used normative modeling to map individual-level deviations from the typical age-related pattern in (i) brain activity during emotion matching and (ii) anatomical images, derived from DBD cases (n = 77) and controls (n = 52) aged 8–18 years from the EU-funded Aggressotype and MATRICS consortia. Second, we used linked independent component analysis to integrate the subject-specific deviations from both modalities.
While cases, compared with controls, on average exhibited higher activity during face processing than would be expected for their age in regions such as the amygdala, these positive deviations were widespread at the individual level. A multimodal integration of all functional and anatomical deviations explained 23% of the variance in the clinical DBD phenotype. Most notably, the top marker, encompassing the default mode network (DMN) and subcortical regions such as the amygdala and the striatum, was related to aggression across the whole sample.
Overall, increased age-related deviations in the amygdala in DBD suggest a maturational delay, which remains to be validated in future studies. Furthermore, integrating individual deviation patterns from multiple imaging modalities allowed us to dissect some of the heterogeneity of DBD and identified the DMN, the striatum, and the amygdala as neural signatures associated with aggression.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
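The PGS associations above are, in essence, linear regressions of AAO on a standardized polygenic score. The sketch below shows that general form on simulated data; the covariates, names, and effect sizes are placeholders rather than the consortium's pipeline.

```python
# Hedged sketch of a PGS association test: age at onset (AAO) regressed
# on a standardized polygenic score, adjusting for cohort. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
pgs = rng.normal(size=n)  # already standardized score
df = pd.DataFrame({
    "aao": 25 - 0.35 * pgs + rng.normal(0, 8, n),  # earlier onset with higher PGS
    "pgs_z": pgs,
    "cohort": rng.choice(["A", "B", "C"], n),
})

fit = smf.ols("aao ~ pgs_z + C(cohort)", data=df).fit()
print(f"beta = {fit.params['pgs_z']:.2f} years, s.e. = {fit.bse['pgs_z']:.2f}")
```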
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Introduced mammalian predators are responsible for the decline and extinction of many native species, with rats (genus Rattus) being among the most widespread and damaging invaders worldwide. In a naturally fragmented landscape, we demonstrate the multi-year effectiveness of snap traps in the removal of Rattus rattus and Rattus exulans from lava-surrounded forest fragments ranging in size from <0.1 to >10 ha. Relative to other studies, we observed low levels of fragment recolonization. Larger rats were the first to be trapped, and the average size of trapped rats decreased over time. Rat removal led to distinct shifts in the foraging height and location of mongooses and mice, emphasizing the need to focus control efforts on multiple invasive species at once. Furthermore, thanks to a specially designed trap casing, we observed low non-target capture rates, suggesting that on Hawai‘i and similar islands lacking native rodents, the risk of killing non-target species may be lower with snap traps than with rodenticides, which have the potential to contaminate food webs. These efforts demonstrate that targeted snap-trapping is an effective removal method for invasive rats in fragmented habitats and that, where it is used, monitoring of recolonization should be included as part of a comprehensive biodiversity management strategy.
Perceived discrimination is associated with worse mental health. Few studies have assessed whether perceived discrimination (i) is associated with the risk of psychotic disorders and (ii) contributes to an increased risk among minority ethnic groups relative to the ethnic majority.
We used data from the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions Work Package 2, a population-based case–control study of incident psychotic disorders in 17 catchment sites across six countries. We calculated odds ratios (OR) and 95% confidence intervals (95% CI) for the associations between perceived discrimination and psychosis using mixed-effects logistic regression models. We used stratified and mediation analyses to explore differences for minority ethnic groups.
Reports of any perceived experience of major discrimination (e.g. unfair treatment by police, not getting hired) were more common in cases than controls (41.8% v. 34.2%), as were pervasive experiences of discrimination (≥3 types; 11.3% v. 5.5%). In fully adjusted models, the odds of psychosis were 1.20 (95% CI 0.91–1.59) for any discrimination and 1.79 (95% CI 1.19–1.59) for pervasive discrimination, compared with no discrimination. In stratified analyses, the magnitude of association for pervasive experiences of discrimination appeared stronger for minority ethnic groups (OR = 1.73, 95% CI 1.12–2.68) than for the ethnic majority (OR = 1.42, 95% CI 0.65–3.10). In exploratory mediation analysis, pervasive discrimination explained only a small part of the excess risk among minority ethnic groups (5.1%).
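For orientation, the sketch below shows the general form of such an adjusted odds-ratio estimate. It is deliberately simplified: the study used mixed-effects logistic regression with catchment site as a random effect, whereas here site enters as a fixed effect for brevity, and all data are simulated.

```python
# Simplified logistic-regression sketch: odds of case status given
# pervasive discrimination, adjusting for site (fixed effect here,
# random effect in the actual study). Simulated data throughout.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
disc = rng.binomial(1, 0.08, n)                  # pervasive discrimination
site = rng.choice(["s1", "s2", "s3"], n)
logit_p = -1.0 + 0.6 * disc                      # true log-odds (invented)
case = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df = pd.DataFrame({"case": case, "pervasive_disc": disc, "site": site})

fit = smf.logit("case ~ pervasive_disc + C(site)", data=df).fit(disp=False)
or_est = np.exp(fit.params["pervasive_disc"])
lo, hi = np.exp(fit.conf_int().loc["pervasive_disc"])
print(f"OR = {or_est:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```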
Pervasive experiences of discrimination are associated with slightly increased odds of psychotic disorders and may minimally help explain excess risk for minority ethnic groups.
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.
Background: Candidemia is associated with high morbidity and mortality. Although risk factors for candidemia and other bloodstream infections (BSIs) overlap, little is known about patient characteristics and the outcomes of polymicrobial infections. We used data from the CDC Emerging Infections Program (EIP) candidemia surveillance to describe polymicrobial candidemia infections and to assess clinical differences compared with Candida-only BSIs.

Methods: During January–December 2017, active, population-based candidemia surveillance was conducted in 45 counties in 9 states, covering ~6% of the US population, through the CDC EIP. A case was defined as a blood culture with Candida spp in a surveillance-area resident; a blood culture >30 days from the initial culture was considered a second case. Demographic and clinical characteristics were abstracted from medical records by trained EIP staff. We examined characteristics of polymicrobial cases, in which Candida and ≥1 non-Candida organism were isolated from a blood specimen on the same day, and compared them with Candida-only cases using logistic regression or t tests in SAS v9.4.

Results: Of the 1,221 candidemia cases identified during 2017, 215 (10.2%) were polymicrobial. Among polymicrobial cases, 50 (23%) involved ≥3 organisms. The most common non-Candida organisms were Staphylococcus epidermidis (n = 30, 14%), Enterococcus faecalis (n = 26, 12%), Enterococcus faecium (n = 17, 8%), and Staphylococcus aureus, Klebsiella pneumoniae, and Stenotrophomonas maltophilia (n = 15 each, 7%). Patients with polymicrobial cases were significantly younger than those with Candida-only cases (54.3 vs 60.7 years; P < .0004). Healthcare exposures commonly associated with candidemia, like total parenteral nutrition (relative risk [RR], 0.82; 95% CI, 0.60–1.13) and surgery (RR, 0.99; 95% CI, 0.77–1.29), were similar between the 2 groups. Polymicrobial cases had a shorter median time from admission to positive culture (1 vs 4 days; P < .001), were more commonly associated with injection drug use (RR, 1.95; 95% CI, 1.46–2.61), and were more likely to be community-onset healthcare-associated (RR, 1.91; 95% CI, 1.50–2.44). Polymicrobial cases were also associated with shorter hospitalization (14 vs 17 days; P = .031), less ICU care (RR, 0.7; 95% CI, 0.51–0.83), and lower mortality (RR, 0.7; 95% CI, 0.50–0.92).

Conclusions: One in 10 candidemia cases was polymicrobial, with nearly one-quarter of those involving ≥3 organisms. Lower mortality among polymicrobial cases is surprising but may reflect the younger age and lower infection severity of this population. Greater injection drug use, central venous catheter use, and long-term care exposures among polymicrobial cases suggest that injection or catheter practices play a role in these infections and may guide prevention opportunities.
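The relative risks quoted above come from standard 2×2-table calculations; a hedged sketch follows, using the usual log-normal approximation for the confidence interval and invented counts rather than the surveillance data.

```python
# Relative risk with a 95% CI from a 2x2 table (log-normal approximation).
# Counts are invented for illustration only.
import numpy as np

def relative_risk(a, b, c, d):
    """a, b: outcome yes/no in the exposed group;
    c, d: outcome yes/no in the unexposed group."""
    rr = (a / (a + b)) / (c / (c + d))
    se = np.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    return rr, rr * np.exp(-1.96 * se), rr * np.exp(1.96 * se)

print(relative_risk(40, 175, 100, 906))  # hypothetical counts -> (RR, lo, hi)
```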
Few studies have examined burnout in psychosocial oncology clinicians. The aim of this systematic review was to summarize what is known about the prevalence and severity of burnout in psychosocial clinicians who work in oncology settings and the factors believed to contribute to or protect against it.
Articles on burnout (including compassion fatigue and secondary trauma) in psychosocial oncology clinicians were identified by searching PubMed/MEDLINE, EMBASE, PsycINFO, the Cumulative Index to Nursing and Allied Health Literature, and the Web of Science Core Collection.
Thirty-eight articles were reviewed at the full-text level, and of those, nine met study inclusion criteria. All were published between 2004 and 2018 and included data from 678 psychosocial clinicians. Quality assessment revealed relatively low risk of bias and high methodological quality. Study composition and sample size varied greatly, and the majority of clinicians were aged between 40 and 59 years. Across studies, 10 different measures were used to assess burnout, secondary traumatic stress, and compassion fatigue, in addition to factors that might impact burnout, including work engagement, meaning, and moral distress. When compared with other medical professionals, psychosocial oncology clinicians endorsed lower levels of burnout.
Significance of results
This systematic review suggests that psychosocial clinicians are not at increased risk of burnout compared with other health care professionals working in oncology or in mental health. Although the data are quite limited, several factors appear to be associated with less burnout in psychosocial clinicians, including exposure to patient recovery, discussing traumas, less moral distress, and finding meaning in their work. More research using standardized measures of burnout with larger samples of clinicians is needed to examine both prevalence rates and how the experience of burnout changes over time. By virtue of their training, psychosocial clinicians are well placed to support each other and their nursing and medical colleagues.
Prolonged survival of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on environmental surfaces and personal protective equipment may lead to these surfaces transmitting this pathogen to others. We sought to determine the effectiveness of a pulsed-xenon ultraviolet (PX-UV) disinfection system in reducing the load of SARS-CoV-2 on hard surfaces and N95 respirators.
Chamber slides and N95 respirator material were directly inoculated with SARS-CoV-2 and were exposed to different durations of PX-UV.
For hard surfaces, disinfection for 1, 2, and 5 minutes resulted in 3.53 log10, >4.54 log10, and >4.12 log10 reductions in viral load, respectively. For N95 respirators, disinfection for 5 minutes resulted in >4.79 log10 reduction in viral load. PX-UV significantly reduced SARS-CoV-2 on hard surfaces and N95 respirators.
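A log10 reduction of this kind is simply the difference of the base-10 logarithms of the viral titres before and after exposure; the sketch below illustrates the arithmetic with invented numbers.

```python
# Log10 reduction in viral load between paired titres (invented values).
import math

def log10_reduction(titre_before, titre_after):
    return math.log10(titre_before) - math.log10(titre_after)

# e.g. 5.0e6 PFU/mL before PX-UV exposure, 1.5e3 PFU/mL after
print(round(log10_reduction(5.0e6, 1.5e3), 2))  # -> 3.52 log10 reduction
```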
With the potential to rapidly disinfect environmental surfaces and N95 respirators, PX-UV devices are a promising technology for reducing environmental and personal protective equipment bioburden and enhancing both healthcare worker and patient safety by reducing the risk of exposure to SARS-CoV-2.
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also known as looting or graverobbing) refers to unauthorized damage, removal, or trafficking in materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators’ motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology, the use of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology’s potent battery of analytic methods.
To identify potential participants for clinical trials, electronic health records (EHRs) are typically searched at candidate sites. As an alternative, we investigated whether medical devices used for real-time diagnostic decisions could identify patients for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph, creating the e-ACI-TIPI, developed and tested on the same data sets used for the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance against cohorts identified from EHR data at the hospitals.
Receiver-operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than EHR-based search did.
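As a pointer to how such screening performance is typically quantified, the sketch below computes an ROC area and applies the trial's > 75% probability eligibility cut-off to placeholder predictions; it is not the ACI-TIPI model itself.

```python
# ROC area and eligibility screening on hypothetical predictions.
from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1, 0, 1, 0, 1]                  # hypothetical ACS outcomes
y_prob = [0.1, 0.3, 0.8, 0.7, 0.2, 0.9, 0.4, 0.6]  # hypothetical model output
print(roc_auc_score(y_true, y_prob))               # area under the ROC curve

eligible = [p > 0.75 for p in y_prob]  # the trial's >75% ACS-probability screen
print(eligible)
```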
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
To summarize and discuss the logistical and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study, and the lessons learned that are pertinent to future use of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with the assigned disinfection strategy, barriers to implementation, and the perceptions of nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices poses unique challenges, including time pressure from bed control personnel, the need to identify eligible rooms efficiently, negative perceptions from nurse managers, and high discharge volume. In the course of the BETR Disinfection Study, we used several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication among EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking compliance and providing feedback. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during the trial (median per hospital, 89%; IQR, 86%–92%).
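As a quick arithmetic check of the deployment figures above:

```python
# Overall compliance implied by the reported counts.
deployed, eligible = 16_220, 18_411
print(f"{deployed / eligible:.0%}")  # -> 88%
```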
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins, respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
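The zygosity contrast above amounts to a regression of education years on zygosity; the sketch below reproduces that form on simulated data calibrated to the reported quarter-year MZ-DZ difference, with all other details (sample size, variances) assumed.

```python
# Sketch of the zygosity comparison: education years regressed on
# zygosity, with DZ as the reference category. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
zyg = rng.choice(["MZ", "DZ"], n, p=[0.39, 0.61])
edu = 12 + 0.26 * (zyg == "MZ") + rng.normal(0, 3, n)
df = pd.DataFrame({"edu_years": edu, "zygosity": zyg})

fit = smf.ols("edu_years ~ C(zygosity, Treatment('DZ'))", data=df).fit()
print(fit.params)  # MZ coefficient ~ difference in years of education
```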
Adélie penguins (Pygoscelis adeliae) are responding to ocean–climate variability throughout the marine ecosystem of the western Antarctic Peninsula (WAP), where some breeding colonies have declined by 80%. Nuclear and mitochondrial DNA (mtDNA) markers were used to understand historical population genetic structure and gene flow, given relatively recent and continuing reductions in sea ice habitats and changes in numbers of breeding adults at colonies throughout the WAP. Genetic diversity, spatial genetic structure, genetic signatures of fluctuations in population demography, and gene flow were assessed in four regional Adélie penguin colonies. The analyses indicated little genetic structure overall based on bi-parentally inherited microsatellite markers (FST = −0.006 to 0.004). No significant variance was observed in overall haplotype frequency (mtDNA ΦST = 0.017; P = 0.112). Some comparisons with Charcot Island were significant, suggestive of female-biased philopatry. Estimates of gene flow based on a two-population coalescent model were asymmetrical, from the species’ regional core to its northern range. Breeding Adélie penguins of the WAP are a panmictic population and hold adequate genetic diversity and dispersal capacity to be resilient to environmental change.
Field experiments were conducted to determine the effect of shattercane interference on corn grain yield and nitrogen uptake in central Missouri. A glyphosate-resistant corn variety was planted, and atrazine was used to control all weeds except shattercane. Glyphosate was applied when shattercane was 8, 15, 23, 31, 38, or 46 cm tall, and plots were hand weeded weekly thereafter. Season-long shattercane interference resulted in an 85% yield loss in 1999 and a 43% yield loss in 2000. Yield reductions occurred when shattercane was allowed to remain with corn until it was 31 cm tall. In both years, late-season corn biomass N content was highly correlated (r = 0.95 and 0.84, respectively) with corn yield. When shattercane was allowed to reach the maximum recommended height for nicosulfuron or primisulfuron application (31 cm), significant yield losses occurred, and shattercane accumulated 10 and 20 kg N/ha, whereas corn accumulated 10 and 16 kg N/ha, respectively, in 1999 and 2000. Corn grain yield was reduced 0.66% (r² = 0.71) for each day of interference before a postemergence (POST) application of glyphosate.
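Taking the final sentence at face value (a fitted line through zero loss at zero days of interference, which the abstract's phrasing suggests but does not state), the reported slope implies losses such as:

```python
# Yield loss implied by the reported slope of 0.66% per day of
# interference; these points are evaluations of that fitted line,
# not the experiment's observations.
import numpy as np

days = np.array([0, 10, 20, 30, 40, 50])
loss_pct = 0.66 * days
print(dict(zip(days.tolist(), loss_pct.round(1).tolist())))
```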
Herbicides are the foundation of weed control in commercial crop-production systems. However, herbicide-resistant (HR) weed populations are evolving rapidly as a natural response to selection pressure imposed by modern agricultural management activities. Mitigating the evolution of herbicide resistance depends on reducing selection through diversification of weed control techniques, minimizing the spread of resistance genes and genotypes via pollen or propagule dispersal, and eliminating additions of weed seed to the soil seedbank. Effective deployment of such a multifaceted approach will require shifting from the current concept of basing weed management on single-year economic thresholds.
Cores and exposed cliff sections in salt marshes around Ho Bugt, a tidal embayment in the northernmost part of the Danish Wadden Sea, were subjected to ¹⁴C dating and litho- and biostratigraphical analyses to reconstruct paleoenvironmental changes and to establish a late Holocene relative sea-level history. Four stages in the late Holocene development of Ho Bugt can be identified: (1) groundwater-table rise and growth of basal peat (from at least 2300 BC to AD 0); (2) salt-marsh formation (0 to AD 250); (3) a freshening phase (AD 250 to AD 1600?), culminating in the drying out of the marshes and producing a distinct black horizon followed by an aeolian phase with sand deposition; and (4) renewed salt-marsh deposition (AD 1600? to present). From 16 calibrated AMS radiocarbon ages on fossil plant fragments and 4 calibrated conventional radiocarbon ages on peat, we reconstructed a local relative sea-level history that shows a steady sea-level rise of 4 m since 4000 cal yr BP. Contrary to suggestions made in the literature, the relative sea-level record of Ho Bugt does not contain a late Holocene highstand. Relative sea-level changes at Ho Bugt are controlled by glacio-isostatic subsidence and can be duplicated by a glacial isostatic adjustment model in which no water is added to the world's oceans after ca. 5000 cal yr BP.
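The steady 4 m rise over roughly 4000 years reported above implies an average rate, computed here as a quick check:

```python
# Mean late Holocene rise rate implied by the reconstruction.
rise_m, years = 4.0, 4000
print(f"{rise_m / years * 1000:.1f} mm/yr")  # -> 1.0 mm/yr
```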