The physiology of mesophotic Scleractinia varies with depth in response to environmental change. Previous research has documented trends in heterotrophy and photosynthesis with depth, but has not addressed between-site variation for a single species. Environmental differences between sites at a local scale, and heterogeneous microhabitats arising from differences in irradiance and food availability, are likely important factors in explaining the occurrence and physiology of Scleractinia. Here, 108 colonies of Agaricia lamarcki were sampled from two locations off the coast of Utila, Honduras, distributed evenly down the observed 50 m depth range of the species. We found that depth alone was not sufficient to fully explain physiological variation. Pulse-amplitude modulation fluorometry and stable isotope analyses revealed that trends in photochemical and heterotrophic activity with depth varied markedly between sites. Our isotope analyses do not support an obligate link between photosynthetic activity and heterotrophic subsidy with increasing depth. We found that A. lamarcki colonies at the bottom of the species’ depth range can be physiologically similar to those nearer the surface. As a potential explanation, we hypothesize that sites with high topographical complexity, and therefore varied microhabitats, may provide more physiological niches distributed across a larger depth range. Varied microhabitats with depth may reduce the dominance of depth as a physiological determinant. Thus, A. lamarcki may ‘avoid’ changes in environment with depth by instead existing in a subset of favourable niches. Our observations correlate with site-specific depth ranges, advocating for linking physiology and abiotic profiles when defining the distribution of mesophotic taxa.
Generalized Algebraic Data Types, or simply GADTs, can encode non-trivial properties in the types of the constructors. Once such properties are encoded in a datatype, however, all code manipulating that datatype must provide proof that it maintains these properties in order to typecheck. In this paper, we take a step toward gradualizing these obligations. We introduce a tool, Ghostbuster, that produces simplified versions of GADTs which elide selected type parameters, thereby weakening the guarantees of the simplified datatype in exchange for reducing the obligations necessary to manipulate it. Like ornaments, these simplified datatypes preserve the recursive structure of the original, but unlike ornaments, we focus on information-preserving bidirectional transformations. Ghostbuster generates type-safe conversion functions between the original and simplified datatypes, which we prove are the identity function when composed. We evaluate a prototype tool for Haskell against thousands of GADTs found on the Hackage package database, generating simpler Haskell'98 datatypes and round-trip conversion functions between the two.
Epstein–Barr virus (EBV) infects 95% of the global population and is associated with up to 2% of cancers globally. Immunoglobulin G (IgG) antibody levels to EBV have been shown to be heritable and associated with developing malignancies. We therefore performed a pilot genome-wide association analysis of anti-EBV IgG traits in an African population, using a combined approach including array genotyping, whole-genome sequencing and imputation to a panel with African sequence data. In 1562 Ugandans, we identify a variant in human leukocyte antigen (HLA)-DQA1, rs9272371 (p = 2.6 × 10⁻¹⁷), associated with anti-EBV nuclear antigen-1 responses. Trans-ancestry meta-analysis and fine-mapping with European-ancestry individuals suggest the presence of distinct HLA class II variants driving associations in Uganda. In addition, we identify four putative, novel, very rare African-specific loci with preliminary evidence of association with anti-viral capsid antigen IgG responses, which will require replication for validation. These findings reinforce the need for the expansion of such studies in African populations with relevant datasets to capture genetic diversity.
Introduction: Active substance use and unstable housing are both associated with increased emergency department (ED) utilization. This study examined ED health care costs among a cohort of substance-using and/or homeless adults following an index ED visit, relative to a control ED population. Methods: Consecutive patients presenting to an inner-city ED between August 2010 and November 2011 who reported unstable housing and/or who had a chief presenting complaint related to acute or chronic substance use were evaluated. Controls were enrolled in a 1:4 ratio. Participants’ health care utilization was tracked via electronic medical record for six months after the index ED visit. Costing data across all EDs in the region were obtained from Alberta Health Services and calculated to include physician billing and the cost of an ED visit excluding investigations. The cost impact of ED utilization was estimated by multiplying the derived ED cost per visit by the median number of visits, with interquartile ranges (IQR), for each group during follow-up. Proportions were compared using non-parametric tests. Results: From 4679 patients screened, 209 patients were enrolled (41 controls, 46 substance-using, 91 unstably housed, 31 both unstably housed and substance-using (UHS)). Median costs (IQR) per group over the six-month period were $0 ($0-$345.42) for control, $345.42 ($0-$1139.89) for substance-using, $345.42 ($0-$1381.68) for unstably housed and $1381.68 ($690.84-$4248.67) for unstably housed and substance-using patients (p<0.05). Conclusion: The intensity of excess ED costs was greatest in patients who were both unstably housed and presenting with a chief complaint related to substance use. This group had a significantly larger impact on health care expenditure relative to ED users who were not unstably housed or who presented with a substance use-related complaint. Further research into how care or connection to community resources in the ED can reduce these costs is warranted.
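The cost-impact estimate described in the Methods (derived ED cost per visit multiplied by each group's median number of visits) can be sketched as follows. This is an illustrative reconstruction, not the study's code: the per-visit cost and the median visit counts are back-calculated from the figures reported in the abstract.

```python
# Illustrative reconstruction of the cost-impact estimate:
# group cost impact = derived ED cost per visit x median ED visits in follow-up.
COST_PER_VISIT = 345.42  # derived ED cost per visit implied by the reported medians

# Median number of ED visits per group during the six-month follow-up
# (reconstructed from the reported group medians).
median_visits = {
    "control": 0,
    "substance using": 1,
    "unstably housed": 1,
    "unstably housed and substance using": 4,
}

cost_impact = {group: round(COST_PER_VISIT * n, 2) for group, n in median_visits.items()}
# e.g. the combined group: 4 visits x $345.42 = $1381.68
```

The reconstruction is consistent with the reported group medians: $0, $345.42, $345.42 and $1381.68 correspond to 0, 1, 1 and 4 median visits at $345.42 per visit.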
Introduction: Data regarding adverse events (AEs) (unintended harm to the patient from health care provided) among children seen in the emergency department (ED) are scarce despite the high-risk setting and population. The objective of our study was to estimate the risk and type of AEs, and their preventability and severity, among children treated in pediatric EDs. Methods: Our prospective cohort study enrolled children <18 years of age presenting for care during 21 randomized 8-hour shifts at 9 pediatric EDs from November 2014 to October 2015. Exclusion criteria included unavailability for follow-up or insurmountable language barrier. Research assistants (RAs) collected demographic, medical history, ED course, and systems-level data. On days 7, 14 and 21, an RA administered a structured telephone interview to all patients to identify flagged outcomes (e.g. repeat ED visits, worsening or new symptoms). A validated trigger tool was used to screen admitted patients’ health records. For any patients with a flagged outcome or trigger, 3 ED physicians independently determined if an AE occurred. Primary outcome was the proportion of patients with an AE related to ED care within 3 weeks of their ED visit. Results: We enrolled 6377 (72.0%) of 8855 eligible patients; 545 (8.5%) were lost to follow-up. Median age was 4.4 years (range 3 months to 17.9 years). Eight hundred and seventy-seven (13.8%) were triaged as CTAS 1 or 2, 2638 (41.4%) as CTAS 3, and 2839 (44.7%) as CTAS 4 or 5. Top entrance complaints were fever (11.2%) and cough (8.8%). Flagged outcomes/triggers were identified for 2047 (32.1%) patients. While 252 (4.0%) patients suffered at least one AE within 3 weeks of the ED visit, 163 (2.6%) suffered an AE related to ED care. In total, patients suffered 286 AEs, most (67.9%) being preventable. The most common AE types were management issues (32.5%) and procedural complications (21.9%). The need for a medical intervention (33.9%) and another ED visit (33.9%) were the most frequent clinical consequences. In univariate analysis, older age, chronic conditions, hospital admission, initial location in a high-acuity area of the ED, having >1 ED MD or a consultant involved in care (all p<0.001), and longer length of stay (p<0.01) were associated with AEs. Conclusion: While our multicentre study found a lower risk of AEs among pediatric ED patients than reported among pediatric inpatients and adult ED patients, a high proportion of these AEs were preventable.
A range of wheat cultivars, including elite cultivars, older cultivars and some preferred by organic growers, were trialled under high and low nitrogen (N) conventional and organic conditions to determine whether cultivars that yield highly under organic conditions have the same relative yield under conventional conditions. A range of cultivar mixtures was also assessed to see whether these gave yield advantages or superiority in either farming system. The conventional trials were grown with and without full fungicide programmes, which largely controlled disease. Amongst the cultivars, Alchemy showed superior yield under organic conditions as did Pegassos, but under conventional conditions Pegassos was always one of the low-ranking cultivars. Under conventional conditions the more recent cultivars Alchemy, Glasgow and Istabraq yielded highly, while an older one, Consort, yielded highly under low fertilizer conditions, and both Ambrosia and Deben also yielded highly generally. Fungicide and high N favoured the disease-susceptible, high-yield cultivars such as Glasgow whereas Consort, an older susceptible cultivar, was favoured by fungicide and low N. Together this demonstrates that whilst the yield characteristics of some elite germplasm are also expressed under organic conditions, at least one cultivar that yielded poorly under conventional conditions showed adaptation towards the organic conditions of these trials. Other cultivars yielding poorly under conventional conditions also gave poor yield under organic conditions. The equal proportion mixtures of cultivars grown under conventional conditions showed no evidence of differences in yield from the mean of the component cultivars grown separately, but combinations of Glasgow, Alchemy and Istabraq gave consistently high yield.
Background: The psychological literature suggests that therapist perfectionism is common and potentially detrimental to client recovery. Little is known about the relationship between therapist perfectionism and client outcomes. Aims: This study aimed to measure perfectionism in High Intensity Cognitive Behavioural therapists, and establish any relationships between dimensions of therapist perfectionism, client outcomes and drop-out rates in treatment. Method: Thirty-six therapists took part in the study; levels of perfectionism were measured using a self-report questionnaire and these were analysed in relation to the clinical outcomes from a sample of their clients. Results: The results indicated that therapist perfectionism may be less common than previously suggested. Overall, a number of significant negative associations were observed between aspects of therapist perfectionism (e.g. having high standards for others), treatment efficacy and client retention in treatment. Conclusions: Therapist perfectionism is associated with CBT treatment outcomes; tentative recommendations for therapists managing their own schema as part of their clinical practice have been made, although further investigation is required.
No existing models of alcohol prevention concurrently adopt universal and selective approaches. This study aims to evaluate the first combined universal and selective approach to alcohol prevention.
A total of 26 Australian schools with 2190 students (mean age: 13.3 years) were randomized to receive: universal prevention (Climate Schools); selective prevention (Preventure); combined prevention (Climate Schools and Preventure; CAP); or health education as usual (control). Primary outcomes were alcohol use, binge drinking and alcohol-related harms at 6, 12 and 24 months.
Climate, Preventure and CAP students demonstrated significantly lower growth in their likelihood to drink and binge drink, relative to controls over 24 months. Preventure students displayed significantly lower growth in their likelihood to experience alcohol harms, relative to controls. While adolescents in both the CAP and Climate groups demonstrated slower growth in drinking compared with adolescents in the control group over the 2-year study period, CAP adolescents demonstrated faster growth in drinking compared with Climate adolescents.
Findings support universal, selective and combined approaches to alcohol prevention. Particularly novel are the findings of no advantage of the combined approach over universal or selective prevention alone.
Vibrio alginolyticus causes soft tissue and bloodstream infection; little systematically collected clinical and epidemiological information is available. In the USA, V. alginolyticus infections are reported to the Cholera and Other Vibrio Illness Surveillance system. Using data from 1988 to 2012, we categorised infections using specimen source and exposure history, analysed case characteristics, and calculated incidence rates using US Census Bureau data. Most (96%) of the 1331 V. alginolyticus infections were from coastal states. Infections of the skin and ear were most frequent (87%); ear infections occurred more commonly in children, lower extremity infections more commonly in older adults. Most (86%) infections involved water activity. Reported incidence of infections increased 12-fold over the study period, although the extent of diagnostic or surveillance bias is unclear. Prevention efforts should target waterborne transmission in coastal areas and provider education to promote more rapid diagnosis and prevent complications.
Krystyna K. Matusiak, Assistant Professor in the Library and Information Science Program (LIS) at the University of Denver, Colorado.
Padma Polepeddi, Public Services Manager of Lakewood and Edgewater Libraries in Jefferson County, Colorado.
Allison Tyler, 2016 graduate of the Library and Information Science Program at the University of Denver, Colorado.
Catherine Newton, 2015 graduate of the Library and Information Science Program at the University of Denver, Colorado.
Julianne Rist, Assistant Director of Public Services for the Jefferson County Public Library, who oversees the Jeffco Stories project.
ORAL HISTORIES PROVIDE a unique opportunity for the preservation of cultural heritage and community engagement by collecting life stories, documenting shared experience and giving voice to community members. Digital technologies have improved the process of recording oral histories and offer innovative methods for their organization, dissemination and archiving (Boyd, 2011). New means of discovery and distribution are available, granting new levels of access not only to born-digital recordings but also to older oral histories recorded on analogue media. Digitization both expands access to valuable first-hand accounts recorded in the past and addresses preservation concerns, since many analogue recordings were originally created on tapes that deteriorate and become fragile with time.
The digitization of oral history collections has been undertaken primarily by university and audiovisual archives with well-established digital infrastructures for hosting and preserving digital assets (Daniels, 2009; Stevens and Latham, 2009; Weig, Terry and Lybarger, 2007). Public libraries and small cultural heritage institutions engage less frequently in building digital collections of oral histories, due to their relative lack of technical expertise and the challenge of acquiring affordable content management systems and digital preservation solutions. Partnering with other institutions and collaborating with researchers and community stakeholders is one way of expanding access to oral history collections (McKether and Jeter, 2011).
This chapter presents a case study of Jeffco Stories, a collection of digitized oral histories created by the Jefferson County Public Library in Colorado, USA. The project was created in collaboration with local historical societies and with the assistance of faculty and graduate students from the Library and Information Science (LIS) programme at the University of Denver, located in Denver, Colorado. The online collection, Jeffco Stories, currently includes 163 oral histories. It was recently migrated to Omeka, an open source content management system, and is available at http://jeffcostories.omeka.net/. This chapter contributes to research on the digitization of oral histories at small and mid-sized cultural heritage institutions and is particularly relevant to librarians and archivists who are exploring access and preservation solutions for digital collections.
Oral history background
Oral history is a systematic way of collecting personal life stories and memories of historical and community events through recorded interviews (Ritchie, 2003). It is a recognized research method in history and the social sciences that acknowledges spoken memories as primary source materials.
With the changing distribution of infectious diseases, and an increase in the burden of non-communicable diseases, low- and middle-income countries, including those in Africa, will need to expand their health care capacities to respond effectively to these epidemiological transitions. The interrelated risk factors for chronic infectious and non-communicable diseases, and the need for long-term disease management, argue for combined strategies to understand their underlying causes and to design strategies for effective prevention and long-term care. Through multidisciplinary research and implementation partnerships, we advocate an integrated approach for research and healthcare for chronic diseases in Africa.
Toxigenic strains of Vibrio cholerae serogroups O1 and O139 have caused cholera epidemics, but other serogroups – such as O75 or O141 – can also produce cholera toxin and cause severe watery diarrhoea similar to cholera. We describe 31 years of surveillance for toxigenic non-O1, non-O139 infections in the United States and map these infections to the state where the exposure probably originated. While serogroups O75 and O141 are closely related pathogens, they differ in how and where they infect people. Oysters were the main vehicle for O75 infection. The vehicles for O141 infection include oysters, clams, and freshwater in lakes and rivers. The patients infected with serogroup O75 who had food traceback information available ate raw oysters from Florida. Patients infected with O141 ate oysters from Florida and clams from New Jersey, and those who only reported being exposed to freshwater were exposed in Arizona, Michigan, Missouri, and Texas. Improving the safety of oysters, specifically, should help prevent future illnesses from these toxigenic strains and similar pathogenic Vibrio species. Post-harvest processing of raw oysters, such as individual quick freezing, heat-cool pasteurization, and high hydrostatic pressurization, should be considered.
Layers of volcanic ash, or tephra, form widespread chronostratigraphic marker horizons which are important because of their distinctive characteristics and rapid deposition over large areas. Absolute dating of prehistoric layers effectively depends upon ¹⁴C analysis. We focus here on Icelandic tephra layers at both proximal and distal sites and consider three strategies to obtain age estimates: 1) the conventional dating of individual profiles; 2) high-precision multisample techniques or “wiggle-matching” using stratigraphic sequences of peat; and 3) a combination of routine analyses from multiple sites. The first approach is illustrated by the dating of a peat profile in Scotland containing tephra from the AD 1510 eruption of Hekla. This produced a ¹⁴C age compatible with AD 1510, independently derived by geochemical correlation with historically dated Icelandic deposits. In addition, the ca. 2100 BP date for the Glen Garry tephra in Scotland, determined by a series of dates on a peat profile in Caithness, is supported by its stratigraphic position within ¹⁴C-dated profiles in Sutherland, and may be applied over a very large area of Scotland. More precise dates for individual tephras may be produced by “wiggle-matching”, although this approach could be biased by changes in peat-bog stratigraphy close to the position of the tephra fall. As appropriate sites for “wiggle-match” exercises may be found only for a few Icelandic tephras, we also consider the results of a spatial approach to ¹⁴C dating of tephra layers. We combined dates on peat underlying the same layer at several sites to estimate the age of the tephra: 3826 ± 12 BP for the Hekla-4 tephra and 2879 ± 34 BP for the Hekla-3 tephra. This approach is effective in terms of cost, the need for widespread applicability to Icelandic tephra stratigraphy and the production of ages of a useful resolution. We stress the need for accurate identification of tephra deposits, without which the conclusions drawn from subsequent ¹⁴C dating will be fundamentally flawed.
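The multi-site strategy described above, combining radiocarbon dates on peat underlying the same tephra layer at several sites, amounts to pooling independent age estimates. A standard way to do this is an inverse-variance weighted mean; the sketch below uses hypothetical dates, not the study's measurements, and simplifies away calibration and quality screening.

```python
import math

# Pool independent radiocarbon dates on peat under the same tephra layer
# using an inverse-variance weighted mean (hypothetical input values).
def pooled_age(dates):
    """dates: iterable of (age_bp, one_sigma) pairs; returns (mean, sigma)."""
    weights = [1.0 / sigma**2 for _, sigma in dates]          # weight = 1/sigma^2
    total_w = sum(weights)
    mean = sum(age * w for (age, _), w in zip(dates, weights)) / total_w
    sigma = math.sqrt(1.0 / total_w)                          # pooled uncertainty
    return mean, sigma

# Hypothetical dates from three sites under the same tephra layer:
mean, sigma = pooled_age([(3810, 30), (3840, 25), (3825, 35)])
```

Because the pooled uncertainty shrinks as sites are added, several routine-precision dates can yield a combined age with an uncertainty smaller than any single measurement, which is the economy the spatial approach exploits.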
Introduction: Substance use and unstable housing are associated with heavy use of the Emergency Department (ED). This study examined the impact of substance use and unstable housing on the probability of future ED use. Methods: Case-control study of patients presenting to an urban ED. Patients were eligible if they were unstably housed for the past 30 days, and/or if their chief complaint was related to substance use. Following written informed consent, patients completed a baseline survey and health care use was tracked via electronic medical records for the next six months. Controls were enrolled in a 1:4 ratio. More than 2 ED visits during follow-up was pre-specified as a measure of excess ED use. Descriptive analyses included proportions and medians with interquartile ranges (IQR). Binomial logistic regression models were used to estimate the impact of housing status, high-risk alcohol use (AUDIT) and drug use (DUDIT), and combinations of these factors on subsequent acute care system contacts (ED visits + admissions). We controlled for age, gender, comorbidities at baseline, and baseline presenting acuity. Results: 41 controls, 46 substance-using, 91 unstably housed, and 31 both unstably housed and substance-using patients were enrolled (n = 209). Median ED visits during follow-up were 0 (IQR: 0-1.0) for controls, 1.0 (IQR: 0-3.3) for substance-using, 1.0 (IQR: 0-4.0) for unstably housed and 4 (IQR: 2-12.3) for unstably housed and substance-using patients. The median acute care system contacts over the same period was 1.0 (IQR: 0-2.0) for controls, 1.0 (IQR: 0-4.0) for substance-using, 1.0 (IQR: 0-5.0) for unstably housed and 4.5 (IQR: 2.8-14.3) for unstably housed and substance-using patients. Being unstably housed was the factor most strongly associated with having >2 ED visits (b=3.288, p<0.005), followed by high-risk alcohol and drug use (b=2.149, p<0.08); high-risk alcohol use alone was not significantly associated with ED visits (b=1.939, p<0.1). The number of comorbidities present at baseline was a small but statistically significant additional risk factor (b=0.478, p<0.05). The model correctly predicted 70.1% of patients’ ED utilization status. Conclusion: Unstable housing is a substantial risk factor for ED use; high-risk alcohol and drug use, and comorbidities at baseline, increased this risk. The intensity of excess ED use was greatest in patients who were unstably housed and substance-using.
Introduction: Diabetes mellitus affects over 2.7 million Canadians, with 90% being Type-2 diabetes (CDA 2010). Complications of diabetes are major causes of emergency department (ED) visits, adversely affecting patients’ health and costing the health system. Improving diabetes self-management can lead to avoidance of ED visits and revisits after discharge. Recent developments in mobile health (mHealth), such as home health monitoring with sensors, social media, and text messaging, have shown promise in supporting patients in chronic disease self-management. This project, mDAWN, tested the feasibility of these tools to support self-management for people with type-2 diabetes. Methods: Forty-three people with type-2 diabetes took part in a three-month program that provided: health information via text messages, online access to curated resources and a facilitated discussion board, and access to wireless monitoring devices. Participants were outfitted with a wireless blood pressure monitor and weight scale, a standard blood glucose monitor, and online access to their physiological data. Data collected included pre- and post-program self-reported health measures, tracking of physiological changes, website and discussion board use, a cost survey, and interviews. Results: Participants reported significantly less health distress and an increase in diabetes empowerment. HbA1c levels decreased from an average of 7.41 to 6.77. Average weight and blood glucose also decreased over the study period. Interview and cost survey findings revealed most participants felt mDAWN provided good value; 78% expressed interest in continuing all or parts of the program. Interview findings revealed that participants developed self-management routines, and experienced increased self-awareness of, and ownership over, their health achievements. Conclusion: mHealth tools provided participants with their own physiologic information, connection with peers, and evidence-informed advice. Participants highly valued this combination and improved their self-management and health outcomes. Equipping patients with similar tools for self-management post ED discharge holds great promise for decreasing revisits and improving health outcomes. This study has stimulated a clinical trial now underway to evaluate the effectiveness of home monitoring in facilitating the transition of patients between acute care and community settings.
The main objective of our target article was to sketch the empirical case for the importance of selection at the level of groups on cultural variation. Such variation is massive in humans, but modest or absent in other species. Group selection processes acting on this variation provide a framework for developing explanations of the unusual level of cooperation between non-relatives found in our species. Our case for cultural group selection (CGS) followed Darwin's classic syllogism regarding natural selection: if variation exists at the level of groups, if this variation is heritable, and if it plays a role in the success or failure of competing groups, then selection will operate at the level of groups. We outlined the relevant domains where such evidence can be sought and characterized the main conclusions of work in those domains. Most commentators agree that CGS plays some role in human evolution, although some were considerably more skeptical. Some contributed additional empirical cases. Some raised issues of the scope of CGS explanations versus competing ones.
In western Canada, more money is spent on herbicides to control wild oat than on those targeting any other weed species, and wild oat resistance to herbicides is the most widespread resistance issue. A direct-seeded field experiment was conducted from 2010 to 2014 at eight Canadian sites to determine the effects of combinations of crop life cycle, crop species, crop seeding rate, crop usage, and herbicide rate on wild oat management and canola yield. Combining 2× seeding rates of early-cut barley silage with 2× seeding rates of winter cereals, and excluding wild oat herbicides for 3 of 5 yr (2011 to 2013), often led to wild oat density, aboveground wild oat biomass, wild oat seed density in the soil, and canola yield similar to those of a repeated canola–wheat rotation under a full wild oat herbicide rate regime. Wild oat was similarly well managed after 3 yr of perennial alfalfa without wild oat herbicides. Forgoing wild oat herbicides in only 2 of 5 yr of exclusively summer annual crop rotations resulted in higher wild oat density, biomass, and seed banks. Management systems that effectively combine diverse and optimal cultural practices against weeds, and limit herbicide use, reduce selection pressure for weed resistance to herbicides and prolong the utility of threatened herbicide tools.
Most empirical studies into the covariance structure of psychopathology have been confined to adults. This work is not developmentally informed as the meaning, age-of-onset, persistence and expression of disorders differ across the lifespan. This study investigates the underlying structure of adolescent psychopathology and associations between the psychopathological dimensions and sex and personality risk profiles for substance misuse and mental health problems.
This study analyzed data from 2175 adolescents (mean age 13.3 years). Five dimensional models were tested using confirmatory factor analysis, and external validity was examined using a multiple-indicators multiple-causes model.
A modified bifactor model, with three correlated specific factors (internalizing, externalizing, thought disorder) and one general psychopathology factor, provided the best fit to the data. Females reported higher mean levels of internalizing, and males reported higher mean levels of externalizing. No significant sex differences emerged in liability to thought disorder or general psychopathology. Liability to internalizing, externalizing, thought disorder and general psychopathology was characterized by a number of differences in personality profiles.
This study is the first to identify a bifactor model including a specific thought disorder factor. The findings highlight the utility of transdiagnostic treatment approaches and the importance of restructuring psychopathology in an empirically based manner.
The paper by Dritschel et al. (J. Fluid Mech., vol. 783, 2015, pp. 1–22) describes the long-time behaviour of inviscid two-dimensional fluid dynamics on the surface of a sphere. At issue is whether the flow settles down to an equilibrium or whether, for generic (random) initial conditions, the long-time solution is periodic, quasi-periodic or chaotic. While it might be surprising that this issue is not settled in the literature, it is important to keep in mind that the Euler equations form a dissipationless Hamiltonian system, hence the set of equations only redistributes the initial vorticity, generating smaller and smaller scales, while keeping kinetic energy, angular impulse and an infinite family of vorticity moments (Casimirs) intact. While special solutions that never settle down to an equilibrium state can be constructed using point vortices, vortex patches and other distributions, the fate of random initial conditions is a trickier problem. Previous statistical theories indicate that the long-time state should be a stationary large-scale distribution of vorticity. By carrying out careful numerical simulations using two different methods, the authors make a compelling case that the generic long-time state resembles a large-scale oscillating quadrupolar vorticity field, surrounded by persistent small-scale vortices. While numerical simulations can never conclusively settle this issue, the results might help guide future theories that seek to prove the existence of such an interesting dynamical long-time state.