As part of a project to implement antimicrobial dashboards at select facilities, we assessed physician attitudes and knowledge regarding antibiotic prescribing.
An online survey explored attitudes toward antimicrobial use and assessed respondents’ management of four clinical scenarios: cellulitis, community-acquired pneumonia, non–catheter-associated asymptomatic bacteriuria, and catheter-associated asymptomatic bacteriuria.
This study was conducted across 16 Veterans’ Affairs (VA) medical centers in 2017.
Participants were physicians working in inpatient settings who specialized in infectious diseases (ID), hospital medicine, or non-ID/hospitalist internal medicine.
Scenario responses were scored by assigning +1 for answers most consistent with guidelines, 0 for less guideline-concordant but acceptable answers, and −1 for guideline-discordant answers. Scores were normalized to a scale ranging from 100% guideline concordant to 100% guideline discordant across all questions within a scenario, and mean scores were calculated across respondents by specialty. Differences in mean score per scenario were tested using analysis of variance (ANOVA).
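The scoring and normalization described above can be sketched in a few lines; the answer categories and the example responses below are hypothetical placeholders, not items from the actual survey.

```python
# Sketch of the scenario scoring described above. Answer categories and
# the example responses are hypothetical, not taken from the survey.
SCORE = {"most_concordant": 1, "acceptable": 0, "discordant": -1}

def scenario_score(responses):
    """Mean per-question score, rescaled so +100 means fully guideline
    concordant and -100 means fully guideline discordant."""
    raw = [SCORE[r] for r in responses]
    return 100 * sum(raw) / len(raw)

# One hypothetical respondent answering a four-question scenario:
print(scenario_score(["most_concordant", "acceptable",
                      "most_concordant", "discordant"]))  # → 25.0
```

Per-specialty means are then simple averages of these per-respondent scores, which is what the ANOVA in the abstract compares.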
Overall, 139 physicians completed the survey (19 ID physicians, 62 hospitalists, and 58 other internists). Attitudes were similar across the 3 groups. We detected a significant difference in cellulitis scenario scores (concordance: ID physicians, 76%; hospitalists, 58%; other internists, 52%; P = .0087). Scores were numerically but not significantly different across groups for community-acquired pneumonia (concordance: ID physicians, 75%; hospitalists, 60%; other internists, 56%; P = .0914), for non–catheter-associated asymptomatic bacteriuria (concordance: ID physicians, 65%; hospitalists, 55%; other internists, 40%; P = .322), and for catheter-associated asymptomatic bacteriuria (ID physicians, 27% concordant; hospitalists, 8% discordant; other internists, 13% discordant; P = .12).
Significant differences in performance regarding management of cellulitis and low overall performance regarding asymptomatic bacteriuria point to these conditions as being potentially high-yield targets for stewardship interventions.
Developmental adversities early in life are associated with later psychopathology. Clustering may be a useful approach to group multiple diverse risks together and study their relation with psychopathology. We aimed to generate risk clusters of children, adolescents, and young adults, based on adverse environmental exposure and developmental characteristics, and to examine the association of risk clusters with manifest psychopathology. Participants (n = 8300) between 6 and 23 years were recruited from seven sites in India. We administered questionnaires to elicit history of previous exposure to adverse childhood environments, family history of psychiatric disorders in first-degree relatives, and a range of antenatal and postnatal adversities. We used these variables to generate risk clusters. Mini-International Neuropsychiatric Interview-5 was administered to evaluate manifest psychopathology. Two-step cluster analysis revealed two clusters designated as high-risk cluster (HRC) and low-risk cluster (LRC), comprising 4197 (50.5%) and 4103 (49.5%) participants, respectively. HRC had higher frequencies of family history of mental illness, antenatal and neonatal risk factors, developmental delays, history of migration, and exposure to adverse childhood experiences than LRC. There were significantly higher risks of any psychiatric disorder [relative risk (RR) = 2.0, 95% CI 1.8–2.3], externalizing (RR = 4.8, 95% CI 3.6–6.4) and internalizing disorders (RR = 2.6, 95% CI 2.2–2.9), and suicidality (RR = 2.3, 95% CI 1.8–2.8) in HRC. Social-environmental and developmental factors could classify Indian children, adolescents and young adults into homogeneous clusters at high or low risk of psychopathology. These biopsychosocial determinants of mental health may have practice, policy and research implications for people in low- and middle-income countries.
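The relative risks quoted above follow the standard two-group risk-ratio calculation; a minimal sketch is below, using hypothetical case counts (the cluster sizes are taken from the abstract, but the event counts are invented for illustration).

```python
import math

# Illustrative relative-risk calculation of the kind reported above.
# Event counts below are hypothetical, not the study's actual data.
def relative_risk(a, n1, c, n2):
    """RR of an outcome in an exposed group (a cases out of n1) versus an
    unexposed group (c cases out of n2), with an approximate 95% CI
    computed on the log scale."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of log(RR)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# Hypothetical: 400 cases among the 4197 HRC participants versus
# 200 cases among the 4103 LRC participants.
rr, lo, hi = relative_risk(400, 4197, 200, 4103)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```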
OBJECTIVES/GOALS: This project has two major objectives. The first is to repeat and confirm previous observations that cytosolic Line1 DNA is elevated in the cytoplasm of cells derived from old mice compared to young mice. The second is to identify which Line1s in the genome contribute to this free DNA and to test whether targeting them rescues the age-related phenotype. METHODS/STUDY POPULATION: This project will focus on data collected from both tissues and primary cells derived from multiple tissues. Using cellular/tissue fractionation kits, we isolate material specifically from the cytoplasm; this specificity is confirmed by western blotting. Line1 levels are measured by quantitative PCR. These cytoplasmic samples are then sent for sequencing to quantify the length of the free DNA in the cytoplasm and to identify the Line1 genomic families from which the cytosolic DNA originates. Additionally, FISH is used to visualize Line1 DNA in the cytoplasm of aged versus young cells. RESULTS/ANTICIPATED RESULTS: We anticipate that this research will confirm the hypothesis that extranuclear Line1 DNA accumulates with age in both tissues and primary fibroblasts. Additionally, we expect to determine which specific families of genomic Line1 are driving this extranuclear DNA, which would indicate the active retrotransposable elements directly involved in this aging-related phenotype. Assuming successful identification of such families, we can then target and silence these specific elements to determine not only whether cytoplasmic Line1 in aged mice decreases, but also whether the healthspan and/or lifespan of these mice improves. DISCUSSION/SIGNIFICANCE: Derepressed Line1s have been shown to be involved in detrimental phenotypes, including autoimmune disease, cancer, and inflammaging.
Targeting retrotransposons, either directly through degradation of the transcriptional products of LINE1s or indirectly by improving the function of their regulators, will be crucial in ablating aging phenotypes.
Negative symptoms are one of the most incapacitating features of Schizophrenia, but their pathophysiology remains unclear. They have been linked to alterations in grey matter in several brain regions, but findings have been inconsistent. This may reflect the investigation of relatively small patient samples, and the confounding effects of chronic illness and exposure to antipsychotic medication. We sought to address these issues by concurrently investigating grey matter volumes (GMV) and cortical thickness (CTh) in a large sample of antipsychotic-naïve or minimally treated patients with First-Episode Schizophrenia (FES).
T1-weighted structural MRI brain scans were acquired from 180 antipsychotic-naïve or minimally treated patients recruited as part of the OPTiMiSE study. The sample was stratified into subgroups with (N = 88) or without (N = 92) Prominent Negative Symptoms (PMN), based on PANSS ratings at presentation. Regional GMV and CTh in the two groups were compared using Voxel-Based Morphometry (VBM) and FreeSurfer (FS). Between-group differences were corrected for multiple comparisons via Family-Wise Error (FWE) and Monte Carlo z-field simulation respectively at p < 0.05 (2-tailed).
The presence of PMN symptoms was associated with larger left inferior orbitofrontal volume (p = 0.03) and greater CTh in the left lateral orbitofrontal gyrus (p = 0.007), but reduced CTh in the left superior temporal gyrus (p = 0.009).
The findings highlight the role of orbitofrontal and temporal cortices in the pathogenesis of negative symptoms of Schizophrenia. As they were evident in generally untreated FES patients, the results are unlikely to be related to effects of previous treatment or illness chronicity.
To identify the impact of universal masking on COVID-19 incidence and putative SARS-CoV-2 transmissions events among children’s hospital healthcare workers (HCWs).
The study was conducted at a single academic, free-standing children’s hospital.
We performed whole-genome sequencing of SARS-CoV-2 PCR-positive samples collected from HCWs 3 weeks before and 6 weeks after implementing a universal masking policy. Phylogenetic analyses were performed to identify clusters of clonally related SARS-CoV-2 indicative of putative transmission events. We measured COVID-19 incidence, SARS-CoV-2 test positivity rates, and frequency of putative transmission events before and after the masking policy was implemented.
HCW COVID-19 incidence and test positivity declined from 14.3 to 4.3 cases per week, and from 18.4% to 9.0%, respectively. Putative transmission events were only identified prior to universal masking.
A universal masking policy was associated with reductions in HCW COVID-19 infections and occupational acquisition of SARS-CoV-2.
We use Navier–Stokes-based linear models for wall-bounded turbulent flows to estimate large-scale fluctuations at different wall-normal locations from their measurements at a single wall-normal location. In these models, we replace the nonlinear term by a combination of a stochastic forcing term and an eddy dissipation term. The stochastic forcing term plays a role in energy production by the large scales, and the eddy dissipation term plays a role in energy dissipation by the small scales. Based on the results in channel flow, we find that the models can estimate large-scale fluctuations with reasonable accuracy only when the stochastic forcing and eddy dissipation terms vary with wall distance and with the length scale of the fluctuations to be estimated. The dependence on the wall distance ensures that energy production and energy dissipation are not concentrated close to the wall but are evenly distributed across the near-wall and logarithmic regions. The dependence on the length scale of the fluctuations ensures that lower wavelength fluctuations are not excessively damped by the eddy dissipation term and hence that the dominant scales shift towards lower wavelengths towards the wall. This highlights that, on the one hand, energy extraction in wall turbulence is predominantly linear and thus physics-based linear models give reasonably accurate results. On the other hand, the absence of linearly unstable modes in wall turbulence means that the nonlinear term still plays an essential role in energy extraction and thus the modelled terms should include the observed wall distance and length scale dependencies of the nonlinear term.
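In symbols, the modelling approach described above can be sketched as follows; the notation here is assumed for illustration and is not taken verbatim from the study.

```latex
% Sketch of the linear model described above: Navier--Stokes dynamics
% linearized about the turbulent mean, with the nonlinear term replaced
% by a stochastic forcing f and an eddy dissipation term built from an
% eddy viscosity \nu_t. Notation is assumed, not quoted from the paper.
\[
  \partial_t \mathbf{u}' \;=\;
  \mathcal{L}\!\left(\nu + \nu_t(y,\lambda)\right)\mathbf{u}'
  \;+\; \mathbf{f}(y,\lambda),
\]
% where y is the wall-normal distance and \lambda the length scale
% (wavelength) of the fluctuations being estimated. The text argues that
% both modelled terms must depend on (y, \lambda) for accurate estimates.
```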
Psychosis is a major mental illness with first onset in young adults. The prognosis is poor in around half of the people affected, and difficult to predict. The few tools available to predict prognosis have major weaknesses which limit their use in clinical practice. We aimed to develop and validate a risk prediction model of symptom non-remission in first-episode psychosis.
Our development cohort consisted of 1027 patients with first-episode psychosis recruited between 2005 and 2010 from 14 early intervention services across the National Health Service in England. Our validation cohort consisted of 399 patients with first-episode psychosis recruited between 2006 and 2009 from a further 11 English early intervention services. The one-year non-remission rate was 52% and 54% in the development and validation cohorts, respectively. Multivariable logistic regression was used to develop a risk prediction model for non-remission, which was externally validated.
The prediction model showed good discrimination (C-statistic 0.74; 95% CI 0.72–0.76) and adequate calibration, with intercept alpha of 0.13 (0.03, 0.23) and slope beta of 0.99 (0.87, 1.12). Our model improved the net benefit by 16% at a risk threshold of 50%, equivalent to 16 more detected non-remitted first-episode psychosis individuals per 100 without incorrectly classifying remitted cases.
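The net-benefit figure quoted above follows the standard decision-curve formula, NB = TP/n − (FP/n) · p_t/(1 − p_t), where p_t is the risk threshold. A minimal sketch is below; the confusion-matrix counts are hypothetical, chosen only to show the arithmetic of "16 per 100" at a 50% threshold.

```python
# Decision-curve net benefit. The counts used below are hypothetical,
# not the study's actual confusion matrix.
def net_benefit(tp, fp, n, pt):
    """Net benefit of a prediction model at risk threshold pt,
    given tp true positives and fp false positives among n patients."""
    return tp / n - (fp / n) * pt / (1 - pt)

# At pt = 0.5 the weight pt/(1 - pt) equals 1, so NB = (TP - FP)/n.
# Hypothetically, 30 true positives and 14 false positives per 100
# patients give a net benefit of 0.16, i.e. 16 net detections per 100.
print(round(net_benefit(30, 14, 100, 0.5), 2))  # → 0.16
```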
Once prospectively validated, our first-episode psychosis prediction model could help identify patients at increased risk of non-remission at initial clinical contact.
It is widely acknowledged that we are in the midst of an extinction crisis and habitat loss is generally considered the primary driver. However, providing accurate estimates of extinction rates has proven to be problematic and a range of extinction estimates have been published. Arguably, the most commonly used method for predicting extinctions resulting from habitat loss has been application of the species–area relationship (SAR). The purpose of this chapter is to provide a review of the many ways in which the SAR has been used to predict the number of extinctions resulting from habitat loss. By doing so, we highlight the pitfalls of using the SAR in such a way and discuss how the SAR has been argued to both over-predict and under-predict extinctions. We also provide examples of the myriad ways in which studies have extended and built on standard SAR models and approaches to better model and predict extinctions. We conclude by arguing that there is a need to recognize that any approach based on a single variable (i.e. area), such as the SAR, is unlikely to provide a perfect extinction prediction, regardless of the specific details.
ABSTRACT IMPACT: This work seeks to improve the diagnostic accuracy of urinary tract infection among hospitalized older adults and mitigate antibiotic overuse in this population. OBJECTIVES/GOALS: Primary objective: To determine the diagnostic accuracy of serum procalcitonin (PCT) for the diagnosis of symptomatic urinary tract infection (UTI) in hospitalized older adults. Secondary objectives: (1) To develop a predictive model for the diagnosis of UTI; (2) To determine the ability of PCT to discriminate between lower and upper UTI. METHODS/STUDY POPULATION: We performed a prospective observational cohort study of 228 participants from a single institution. The study population included older adults (age 65 or older) who were hospitalized on the general medicine wards with a possible or suspected UTI. After informed consent was obtained, serum PCT was measured on remnant blood samples collected from the emergency department. We performed additional data collection through the electronic health record to obtain demographic information, clinical characteristics, and other laboratory and imaging results. Clinicians were surveyed for the diagnosis of UTI, and charts were independently adjudicated by infectious diseases experts reviewing the medical record to determine the primary endpoint of symptomatic UTI. RESULTS/ANTICIPATED RESULTS: We anticipate that serum PCT will predict the presence of symptomatic UTI, demonstrating an area under the receiver operating characteristic curve of at least 0.85. We expect that a predictive model developed in our cohort for the diagnosis of symptomatic UTI will be improved by the addition of serum PCT. Finally, we anticipate that serum PCT will accurately discriminate between upper and lower UTI.
DISCUSSION/SIGNIFICANCE OF FINDINGS: Diagnosis of symptomatic UTI in hospitalized older adults is challenging and may lead to overuse of antibiotics and the development of antibiotic resistance in this vulnerable patient population. Serum procalcitonin offers a novel diagnostic strategy in the diagnosis of symptomatic UTI to enable more appropriate antibiotic therapy.
A retrospective study was conducted to describe the impact of a molecular assay to detect the most common carbapenemase genes in carbapenem-resistant Enterobacterales isolates recovered in culture. Carbapenemases were detected in 69% of isolates, and assay results guided treatment modifications or epidemiologic investigation in 20% and 4% of cases, respectively.
The COVID-19 pandemic and mitigation measures are likely to have a marked effect on mental health. It is important to use longitudinal data to improve inferences.
To quantify the prevalence of depression, anxiety and mental well-being before and during the COVID-19 pandemic. Also, to identify groups at risk of depression and/or anxiety during the pandemic.
Data were from the Avon Longitudinal Study of Parents and Children (ALSPAC) index generation (n = 2850, mean age 28 years) and parent generation (n = 3720, mean age 59 years), and Generation Scotland (n = 4233, mean age 59 years). Depression was measured with the Short Mood and Feelings Questionnaire in ALSPAC and the Patient Health Questionnaire-9 in Generation Scotland. Anxiety and mental well-being were measured with the Generalised Anxiety Disorder Assessment-7 and the Short Warwick Edinburgh Mental Wellbeing Scale.
Depression during the pandemic was similar to pre-pandemic levels in the ALSPAC index generation, but the proportion experiencing anxiety had almost doubled, at 24% (95% CI 23–26%) compared with a pre-pandemic level of 13% (95% CI 12–14%). In both studies, anxiety and depression during the pandemic were greater in younger members, women, those with pre-existing mental/physical health conditions, and individuals in socioeconomic adversity, even when controlling for pre-pandemic anxiety and depression.
These results provide evidence for increased anxiety in young people that is coincident with the pandemic. Specific groups are at elevated risk of depression and anxiety during the COVID-19 pandemic. This is important for planning current mental health provisions and for long-term impact beyond this pandemic.
A survey of acute-care hospitals found that rapid molecular diagnostic tests (RMDTs) have been widely adopted. Although many hospitals use their antimicrobial stewardship team and/or guidelines to help clinicians interpret results and optimize treatment, opportunities to more fully achieve the potential benefits of RMDTs remain.
An observational study was conducted to characterize high-touch surfaces in emergency departments and hemodialysis facilities. Certain surfaces were touched with much greater frequency than others. A small number of surfaces accounted for the majority of touch episodes. Prioritizing disinfection of these surfaces may reduce pathogen transmission within healthcare environments.
Background: Candida dubliniensis is a worldwide fungal opportunistic pathogen, closely related to C. albicans. Originally identified in patients infected with HIV in Dublin, Ireland, C. dubliniensis has emerged as a pathogen in other immunocompromised individuals, including patients receiving chemotherapy and transplant recipients. Pediatric epidemiological data for this organism are limited. Methods: We report a descriptive review of C. dubliniensis isolates recovered between January 2018 and June 2019 at a large tertiary-care pediatric institution in Columbus, Ohio. Results: C. dubliniensis was identified in 48 patients during the 18-month review period. In total, 67 positive cultures were collected from these patients with the following distribution of sources: 44 sputum (66%), 11 bronchoalveolar lavage fluid (16%), 4 blood (6%), 3 wound (4%), 2 esophageal (3%), 2 peritoneal fluid (3%), and 1 vaginal (1%). Of the 48 patients in whom C. dubliniensis was identified, 35 (73%) had cystic fibrosis. Also, 8 patients (17%) were considered to have clinical infections and received antifungal therapy: 3 patients with pneumonia, 2 patients with esophagitis, 1 patient with peritonitis, 1 patient with a catheter-related bloodstream infection, and 1 patient with disseminated candidiasis. The remaining 40 patients (83%) were considered colonized. Conclusions: We report an 18-month descriptive series of clinical C. dubliniensis isolates recovered at a pediatric institution. Most isolates were identified as colonizing strains in patients with cystic fibrosis. C. dubliniensis was a rare cause of invasive disease in our institution, with only 8 cases identified.
Background: In recent years, several rapid molecular diagnostic tests (RMDTs) for infectious diseases diagnostics, such as bloodstream infections (BSIs), have become available for clinical use. The extent to which RMDTs have been adopted and how the results of these tests have been incorporated into clinical care are currently unknown. Methods: We surveyed members of the Society for Healthcare Epidemiology of America Research Network to characterize utilization of RMDT in hospitals and antimicrobial stewardship program (ASP) involvement in result communication and interpretation. The survey was administered using Qualtrics software, and data were analyzed using Stata and Excel software. Results: Overall, 57 responses were received (response rate, 59%), and 72% were from academic hospitals; 50 hospitals (88%) used at least 1 RMDT for BSI (Fig. 1). The factors most commonly reported to have been important in the decision to adopt RMDT were improvements in antimicrobial usage (82%), clinical outcomes (74%), and laboratory efficiency (52%). Among 7 hospitals that did not use RMDT for BSI, the most common reason was cost of new technology. In 50 hospitals with RMDT for BSI, 54% provided written guidelines for optimization or de-escalation of antimicrobials based upon RMDT results. In 40 hospitals (80%), microbiology laboratories directly notified a healthcare worker of the RMDT results: 70% provided results to a physician, nurse practitioner, or physician assistant; 48% to the ASP team; and 33% to a nurse. Furthermore, 11 hospitals (22%) had neither guidelines nor ASP intervention. In addition, 24 hospitals (48%) reported performing postimplementation evaluation of RMDT impact. Reported findings included reduction in time to antibiotic de-escalation (75%), reduction in length of stay (25%), improved laboratory efficiency (20%), and reduction in mortality and overall costs (12%). 
Among the 47 hospitals with both RMDT and ASP, 79% reported that the ASP team routinely reviewed blood culture RMDT results, and 53.2% used clinical decision support software to do so. Finally, 53 hospitals (93%) used 1 or more RMDT for non–bloodstream infections (Fig. 1). Fewer than half of hospitals provided written guidelines to assist clinicians in interpreting these RMDT results. Conclusions: RMDTs have been widely adopted by participating hospitals and are associated with positive self-reported clinical, logistic, and financial outcomes. However, nearly 1 in 4 hospitals did not have guidelines or ASP interventions to assist clinicians with optimization of antimicrobial prescribing based on RMDT results for BSI. Also, most hospitals did not have guidelines for RMDT results for non-BSI. These findings suggest that opportunities exist to further enhance the potential benefits of RMDT.
Background: The healthcare environment can serve as a reservoir for many microorganisms and, in the absence of appropriate cleaning and disinfection, can contribute to pathogen transmission. Identification of high-touch surfaces (HTS) in hospital patient rooms has allowed the recognition of surfaces that represent the greatest transmission risk and prioritization of cleaning and disinfection resources for infection prevention. HTS in other healthcare settings, including high-volume and high-risk settings such as emergency departments (EDs) and hemodialysis facilities (HDFs), have not been well studied or defined. Methods: Observations were conducted in 2 EDs and 3 HDFs using structured observation tools. All touch episodes, defined as hand-to-surface contact regardless of hand hygiene and/or glove use, were recorded. Touches by healthcare personnel, patients, and visitors were included. Surfaces were classified as being allocated to individual patients or shared among multiple patients. The number of touch episodes per hour was calculated for each surface to rank surfaces by frequency of touch. Results: In total, 28 hours of observation (14 hours each in EDs and HDFs) were conducted, and 1,976 touch episodes were observed among 62 surfaces. On average, more touch episodes were observed per hour in HDFs than in EDs (89 vs 52, respectively). The most frequently touched surfaces in EDs included stretcher rails, privacy curtains, visitor chair arm rests and seats, and patient bedside tables, which together accounted for 68.8% of all touch episodes in EDs (Fig. 1). Frequently touched surfaces in HDFs included both shared and single-patient surfaces: 27.8% and 72.2% of HDF touch episodes, respectively. The most frequently touched surfaces in HDFs were supply cart drawers, dialysis machine control panels and keyboards, handwashing faucet handles, bedside work tables, and bed rail or dialysis chair armrests, which accounted for 68.4% of all touch episodes recorded.
Conclusions: To our knowledge, this is the first quantitative study to identify HTSs in EDs and HDFs. Our observations reveal that certain surfaces within these environments are subject to a substantially greater frequency of hand contact than others and that a relatively small number of surfaces account for most touch episodes. Notably, whereas HTSs in EDs were primarily single-patient surfaces, HTSs in HDFs included surfaces shared in the care of multiple patients, which may represent an even greater risk of patient-to-patient pathogen transmission than single-patient surfaces. The identification of HTSs in EDs and HDFs contributes to a better understanding of the risk of environment-related pathogen transmission in these settings and may allow prioritization and optimization of cleaning and disinfection resources within facilities.
Background: Pseudomonas aeruginosa is an important nosocomial pathogen associated with intrinsic and acquired resistance mechanisms to major classes of antibiotics. To better understand clinical risk factors for drug-resistant P. aeruginosa infection, decision-tree models for the prediction of fluoroquinolone- and carbapenem-resistant P. aeruginosa were constructed and compared to multivariable logistic regression models using performance characteristics. Methods: In total, 5,636 patients admitted to 4 hospitals within a New York City healthcare system from 2010 to 2016 with blood, respiratory, wound, or urine cultures growing P. aeruginosa were included in the analysis. Presence or absence of drug resistance was defined using the first culture of any source positive for P. aeruginosa during each hospitalization. To train and validate the prediction models, cases were randomly split (60/40) into training and validation datasets. Clinical decision-tree models for both fluoroquinolone and carbapenem resistance were built from the training dataset using 21 clinical variables of interest, and multivariable logistic regression models were built using the 16 clinical variables associated with resistance in bivariate analyses. Decision-tree models were optimized using K-fold cross validation, and performance characteristics between the 4 models were compared. Results: From 2010 through 2016, the prevalence of fluoroquinolone and carbapenem resistance was 32% and 18%, respectively. For fluoroquinolone resistance, the logistic regression algorithm attained a positive predictive value (PPV) of 0.57 and a negative predictive value (NPV) of 0.73 (sensitivity, 0.27; specificity, 0.90), and the decision-tree algorithm attained a PPV of 0.65 and an NPV of 0.72 (sensitivity, 0.21; specificity, 0.95).
For carbapenem resistance, the logistic regression algorithm attained a PPV of 0.53 and an NPV of 0.85 (sensitivity, 0.20; specificity, 0.96), and the decision-tree algorithm attained a PPV of 0.59 and an NPV of 0.84 (sensitivity, 0.22; specificity, 0.96). The decision-tree partitioning algorithm identified prior fluoroquinolone resistance, skilled nursing facility (SNF) stay, sex, and length of stay as the variables of greatest importance for fluoroquinolone resistance, compared to prior carbapenem resistance, age, and length of stay for carbapenem resistance. The highest-performing decision tree for fluoroquinolone resistance is illustrated in Fig. 1. Conclusions: Supervised machine-learning techniques may facilitate prediction of P. aeruginosa resistance and of the risk factors driving resistance patterns in hospitalized patients. Such techniques may be applied to readily available clinical information from hospital electronic health records to aid clinical decision making.
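The performance characteristics compared above all derive from a 2×2 confusion matrix; a minimal sketch is below. The counts used are hypothetical illustrations, not the study's actual validation results.

```python
# Performance characteristics from a 2x2 confusion matrix, of the kind
# compared above for the two models. Counts below are hypothetical.
def classification_metrics(tp, fp, fn, tn):
    """PPV, NPV, sensitivity, and specificity from true/false
    positive and negative counts."""
    return {
        "ppv": tp / (tp + fp),          # precision among predicted resistant
        "npv": tn / (tn + fn),
        "sensitivity": tp / (tp + fn),  # recall of truly resistant isolates
        "specificity": tn / (tn + fp),
    }

# Hypothetical counts chosen for illustration:
m = classification_metrics(tp=65, fp=35, fn=28, tn=72)
print(m["ppv"], m["npv"])  # → 0.65 0.72
```

Because resistance prevalence differs between the two outcomes (32% vs 18%), PPV and NPV are not directly comparable across them even at similar sensitivity and specificity.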
Antarctica's ice shelves modulate the grounded ice flow, and weakening of ice shelves due to climate forcing will decrease their ‘buttressing’ effect, causing a response in the grounded ice. While the processes governing ice-shelf weakening are complex, uncertainties in the response of the grounded ice sheet are also difficult to assess. The Antarctic BUttressing Model Intercomparison Project (ABUMIP) compares ice-sheet model responses to decrease in buttressing by investigating the ‘end-member’ scenario of total and sustained loss of ice shelves. Although unrealistic, this scenario enables gauging the sensitivity of an ensemble of 15 ice-sheet models to a total loss of buttressing, hence exhibiting the full potential of marine ice-sheet instability. All models predict that this scenario leads to multi-metre (1–12 m) sea-level rise over 500 years from present day. West Antarctic ice sheet collapse alone leads to a 1.91–5.08 m sea-level rise due to the marine ice-sheet instability. Mass loss rates are a strong function of the sliding/friction law, with plastic laws causing further destabilization of the Aurora and Wilkes Subglacial Basins, East Antarctica. Improvements to marine ice-sheet models have greatly reduced variability between modelled ice-sheet responses to extreme ice-shelf loss, e.g. compared to the SeaRISE assessments.