Long understood as purifying the church by rejecting worship routine and devotional ceremony, pious New England settlers in fact observed formal and informal rituals that defined lived religion within the Reformed tradition. Given that access to the vernacular word was central to puritan self-definition, literacy and reading became intensely ritualized. Thus, along with life-cycle rites (birth, marriage, death), annual and occasional ceremonies (fast days, thanksgiving days, election sermons, artillery sermons), and sabbath customs (the sacraments, the public confession, the audition of preaching), ritual was derived from the experience of books. The chapter demonstrates this experience by looking at moments of cross-cultural contact during Metacom’s War, where reading seeks to stabilize tradition. It studies reader annotations of devotional works as a means to understand the meditative, recursive, and extractive practices that grounded and routinized lay piety. And it examines the visual iconography of illustrations within devotional manuals, illustrations that idealize and demonize kinds of identity for the proper pilgrim reader. Ritual, routine, and iconography are not typically associated with puritan worship, but with an ear and eye to reading habits, we better understand experiential religion in early New England.
The principal aim of this study was to optimize the diagnosis of canine neuroangiostrongyliasis (NA). In total, 92 cases were seen between 2010 and 2020. Dogs were aged from 7 weeks to 14 years (median 5 months), with 73/90 (81%) less than 6 months and 1.7 times as many males as females. The disease became more common over the study period. Most cases (86%) were seen between March and July. Cerebrospinal fluid (CSF) was obtained from the cisterna magna in 77 dogs, the lumbar cistern in 5, and both sites in 3. Nucleated cell counts for 84 specimens ranged from 1 to 146 150 cells μL⁻¹ (median 4500). Percentage eosinophils varied from 0 to 98% (median 83%). When both cisternal and lumbar CSF were collected, inflammation was more severe caudally. Seventy-three CSF specimens were subjected to enzyme-linked immunosorbent assay (ELISA) testing for antibodies against Angiostrongylus cantonensis; 61 (84%) tested positive, titres ranging from <100 to ⩾12 800 (median 1600). Sixty-one CSF specimens were subjected to real-time quantitative polymerase chain reaction (qPCR) testing using a new protocol targeting a bioinformatically informed repetitive genetic target; 53/61 samples (87%) tested positive, CT values ranging from 23.4 to 39.5 (median 30.0). For 57 dogs, it was possible to compare CSF ELISA serology and qPCR. Both tests were positive in 40 dogs; in 5 dogs the ELISA was positive while the qPCR was negative; in 9 dogs the qPCR was positive but the ELISA was negative; and in 3 dogs both tests were negative. NA is an emerging infectious disease of dogs in Sydney, Australia.
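The ELISA/qPCR concordance reported above forms a 2 × 2 agreement table. As an illustrative sketch (not part of the original analysis), the counts given in the abstract can be used to compute raw agreement and Cohen's kappa:

```python
# Agreement between CSF ELISA and qPCR in the 57 dogs with both tests,
# using the counts reported in the abstract.
a, b, c, d = 40, 5, 9, 3  # both+, ELISA+/qPCR-, ELISA-/qPCR+, both-
n = a + b + c + d

observed = (a + d) / n  # raw (overall) agreement

# Chance-expected agreement, from the marginal positivity rates.
p_elisa = (a + b) / n
p_qpcr = (a + c) / n
expected = p_elisa * p_qpcr + (1 - p_elisa) * (1 - p_qpcr)

kappa = (observed - expected) / (1 - expected)
print(f"n = {n}, raw agreement = {observed:.2f}, kappa = {kappa:.2f}")
```

The two assays agree on about 75% of specimens, but because both are usually positive, chance-corrected agreement (kappa) is modest.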
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality with clinically relevant traits. One variant, rs9489328, located in AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false-positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). Results suggest that common variants of large effect do not influence septic shock susceptibility, mortality and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
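A polygenic risk score of the kind used above is, in essence, a weighted sum of risk-allele dosages. A minimal sketch with made-up effect sizes and genotypes (the actual GUIDANCE variants and weights are not given in the abstract):

```python
import numpy as np

# Hypothetical per-variant effect sizes (betas) from GWAS summary statistics
# for a trait such as C-reactive protein level.
betas = np.array([0.12, -0.05, 0.30, 0.08])

# Hypothetical genotype dosages (0, 1 or 2 effect alleles) for 3 individuals.
dosages = np.array([
    [0, 1, 2, 1],
    [1, 0, 0, 2],
    [2, 2, 1, 0],
])

# Each individual's polygenic risk score is the dosage-weighted sum of betas.
prs = dosages @ betas
print(prs)  # one score per individual
```

In practice the scores are standardized and then tested for association with the clinical outcome (here, susceptibility or 28-day mortality).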
Presently, evidence guiding clinicians on the optimal approach to safely screening patients for coronavirus disease 2019 (COVID-19) prior to a nonemergent hospital procedure is scarce. In this report, we describe our experience in screening for SARS-CoV-2 prior to semiurgent and urgent hospital procedures.
Retrospective case series.
A single tertiary-care medical center.
Our study cohort included patients ≥18 years of age who had semiurgent or urgent hospital procedures or surgeries.
Overall, 625 patients were screened for SARS-CoV-2 using a combination of phone questionnaire (7 days prior to the anticipated procedure), RT-PCR and chest computed tomography (CT) between March 1, 2020, and April 30, 2020.
Of the 625 patients, 520 scans (83.2%) were interpreted as normal; 1 (0.16%) had typical features of COVID-19; 18 scans (2.88%) had indeterminate features of COVID-19; and 86 (13.76%) had atypical features of COVID-19. In total, 640 RT-PCRs were performed, with 1 positive result (0.15%) in a patient whose CT scan yielded an atypical finding. Of the 18 patients with chest CTs categorized as indeterminate, 5 underwent repeat RT-PCR nasopharyngeal swab testing 1 week after their initial swab; all were negative. Also, 1 patient with a chest CT categorized as typical had a follow-up RT-PCR that was negative, indicating that the chest CT was likely a false positive. After surgery, none of the patients developed signs or symptoms suggestive of COVID-19 that would indicate the need for a repeated RT-PCR or CT scan.
In our experience, chest CT scanning did not provide valuable information in detecting asymptomatic cases of SARS-CoV-2 (COVID-19) in our low-prevalence population.
OBJECTIVES/GOALS: We use a tissue-engineered, biomimetic, 3D model to study the pathogenesis of breast implant-associated anaplastic large cell lymphoma (BIA-ALCL) by comparing the effect of silicone implant shell on proliferation of patient-derived BIA-ALCL to its precursor T cells within the breast microenvironment. METHODS/STUDY POPULATION: Patient-derived breast tissue was processed for component adipocytes, ductal organoids, and stromal vascular fraction. These were suspended within 50 µL of 0.3% type I collagen matrix, to which was added 200,000 cells/mL of either patient-derived BIA-ALCL cells or T progenitor cells. These were then plated into 6 mm wells. As a control, both BIA-ALCL cells and T progenitor cells were suspended within type I collagen alone at the same seeding density without breast components. Before plating, wells were lined circumferentially with either textured, smooth, or no implant shell; these were 1 cm × 2 cm pieces dissected from the whole implant. Wells were imaged using confocal microscopy over 8 days. RESULTS/ANTICIPATED RESULTS: Unstimulated T progenitor cell count showed no significant increase in any of the conditions tested. The change in cell count over 8 days was 3.85% in each condition (p = 0.3352). A Tukey’s multiple comparison test comparing each condition revealed no significant increase in cell count over 8 days for all six conditions. Notably, our previous studies have shown proliferation of BIA-ALCL cells to be significantly more robust in the biomimetic platform compared to collagen-only groups, regardless of implant shell type (p < 0.01). BIA-ALCL cells grew nearly 30% faster in textured and smooth shell biomimetic groups compared to biomimetic wells lacking implant shell. DISCUSSION/SIGNIFICANCE OF IMPACT: Towards elucidating BIA-ALCL’s etiopathology, we show that silicone implant shell has a significant effect on proliferation of BIA-ALCL cells, but not their precursor T cells.
If breast implant silicone shell is not a sufficient stimulus for T cell proliferation, co-stimulatory factors are required.
In 2018 Pearson et al. published a new sequence of annual radiocarbon (14C) data derived from oak (Quercus sp.) trees from Northern Ireland and bristlecone pine (Pinus longaeva) from North America across the period 1700–1500 BC. The study indicated that the more highly resolved shape of an annually based calibration dataset could improve the accuracy of 14C calibration during this period. This finding had implications for the controversial dating of the eruption of Thera in the Eastern Mediterranean. To test for interlaboratory variation and improve the robustness of the annual dataset for calibration purposes, we have generated a replicate sequence from the same Irish oaks at ETH Zürich. These data are compatible with the Irish oak 14C dataset previously produced at the University of Arizona and are used (along with additional data) to examine inter-tree and interlaboratory variation in multiyear annual 14C time-series. The results raise questions about regional 14C offsets at different scales and demonstrate the potential of annually resolved 14C for refining subdecadal and larger scale features for calibration, solar reconstruction, and multiproxy synchronization.
The Genomics Used to Improve DEpression Decisions (GUIDED) trial assessed outcomes associated with combinatorial pharmacogenomic (PGx) testing in patients with major depressive disorder (MDD). Analyses used the 17-item Hamilton Depression (HAM-D17) rating scale; however, studies demonstrate that the abbreviated, core depression symptom-focused HAM-D6 rating scale may have greater sensitivity for detecting differences between treatment and placebo. Yet the sensitivity of the HAM-D6 has not been tested in a comparison of two active treatment arms. Here, we evaluated the sensitivity of the HAM-D6 scale, relative to the HAM-D17 scale, when assessing outcomes for actively treated patients in the GUIDED trial.
Outpatients (N=1,298) diagnosed with MDD and an inadequate treatment response to >1 psychotropic medication were randomized into treatment as usual (TAU) or combinatorial PGx-guided (guided-care) arms. Combinatorial PGx testing was performed on all patients, though test reports were only available to the guided-care arm. All patients and raters were blinded to study arm until after week 8. Medications on the combinatorial PGx test report were categorized based on the level of predicted gene-drug interactions: ‘use as directed’, ‘moderate gene-drug interactions’, or ‘significant gene-drug interactions.’ Patient outcomes were assessed by arm at week 8 using HAM-D6 and HAM-D17 rating scales, including symptom improvement (percent change in scale), response (≥50% decrease in scale), and remission (HAM-D6 ≤4 and HAM-D17 ≤7).
At week 8, the guided-care arm demonstrated statistically significant symptom improvement over TAU using the HAM-D6 scale (Δ=4.4%, p=0.023), but not using the HAM-D17 scale (Δ=3.2%, p=0.069). The response rate increased significantly for guided-care compared with TAU using both HAM-D6 (Δ=7.0%, p=0.004) and HAM-D17 (Δ=6.3%, p=0.007). Remission rates were also significantly greater for guided-care versus TAU using both scales (HAM-D6 Δ=4.6%, p=0.031; HAM-D17 Δ=5.5%, p=0.005). Patients taking medication(s) predicted to have gene-drug interactions at baseline showed further increased benefit over TAU at week 8 using HAM-D6 for symptom improvement (Δ=7.3%, p=0.004), response (Δ=10.0%, p=0.001) and remission (Δ=7.9%, p=0.005). Comparatively, the magnitude of the differences in outcomes between arms at week 8 was lower using HAM-D17 (symptom improvement Δ=5.0%, p=0.029; response Δ=8.0%, p=0.008; remission Δ=7.5%, p=0.003).
Combinatorial PGx-guided care achieved significantly better patient outcomes compared with TAU when assessed using the HAM-D6 scale. These findings suggest that the HAM-D6 scale is better suited than the HAM-D17 for evaluating change in randomized, controlled trials comparing active treatment arms.
Neighbourhood greenness or vegetative presence has been associated with indicators of health and well-being, but its relationship to depression in older adults has been less studied. Understanding the role of environmental factors in depression may inform and complement traditional depression interventions, including both prevention and treatment.
This study examines the relationship between neighbourhood greenness and depression diagnoses among older adults in Miami-Dade County, Florida, USA.
Analyses examined 249 405 beneficiaries enrolled in Medicare, a USA federal health insurance programme for older adults. Participants were 65 years and older, living in the same Miami location across 2 years (2010–2011). Multilevel analyses assessed the relationship between neighbourhood greenness, assessed by average block-level normalised difference vegetative index via satellite imagery, and depression diagnosis using USA Medicare claims data. Covariates were individual age, gender, race/ethnicity, number of comorbid health conditions and neighbourhood median household income.
Over 9% of beneficiaries had a depression diagnosis. Higher levels of greenness were associated with lower odds of depression, even after adjusting for demographics and health comorbidities. When compared with individuals residing in the lowest tertile of greenness, individuals from the middle tertile (medium greenness) had 8% lower odds of depression (odds ratio 0.92; 95% CI 0.88, 0.96; P = 0.0004) and those from the high tertile (high greenness) had 16% lower odds of depression (odds ratio 0.84; 95% CI 0.79, 0.88; P < 0.0001).
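The percentage reductions quoted above follow directly from the odds ratios; as a quick arithmetic restatement (illustrative only, not part of the original analysis), an odds ratio below 1 converts to "percent lower odds" as (1 − OR) × 100:

```python
# Restate an odds ratio below 1 as a percentage reduction in odds.
def pct_lower_odds(odds_ratio: float) -> float:
    return (1 - odds_ratio) * 100

# Figures from the abstract, each vs the lowest greenness tertile.
print(round(pct_lower_odds(0.92)))  # medium greenness tertile -> 8
print(round(pct_lower_odds(0.84)))  # high greenness tertile   -> 16
```

Note this is a reduction in *odds*, not in risk; with a ~9% baseline prevalence the two are close but not identical.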
Higher levels of greenness may reduce depression odds among older adults. Increasing greenery – even to moderate levels – may enhance individual-level approaches to promoting wellness.
White mold, caused by the fungus Sclerotinia sclerotiorum, is a devastating disease of soybean (Glycine max) and other leguminous crops, including dry bean (Phaseolus vulgaris). Previous research has demonstrated that no-till planting soybean into rolled–crimped cereal rye residue can enhance weed management, improve soil health and reduce labor requirements in organic production. However, there are limited data on the effects of cereal rye residue on white mold suppression in no-till planted soybean and dry bean. Two field trials were conducted in 2016–2017 (Year 1) and repeated in 2017–2018 (Year 2) to evaluate the potential of cereal rye cover crop residue to suppress white mold in these crops. In each trial (soybean and dry bean), the experimental design was a randomized complete block with two treatments: (1) rolled–crimped cereal rye residue and (2) no cover crop control. Treatment effects on plant population, biomass and yield components varied between the main crops. Compared with the control treatment, cereal rye residue reduced the incidence of white mold in soybean in both years and in dry bean in Year 2. The reduction in white mold in cereal rye residue plots was due to a combination of (1) decreased sclerotial germination (no stipes formed) and (2) increased nonfunctional sclerotial germination, defined here as sclerotia that germinated but produced stipes without the expanded cup where asci containing ascospores are formed. Weed density and biomass were lower in cereal rye residue plots in soybean and dry bean, except in Year 1 in soybean when weed biomass was low in both treatments. Our findings indicate that cereal rye residue could help organic and conventional farmers manage white mold in no-till planted soybean and dry bean. Germination of sclerotia resulting in nonfunctional apothecia could potentially exhaust soilborne inoculum in the upper soil profile and reduce infections in subsequent crops.
Debussy and counterpoint: the statement initially sounds preposterous. Debussy is normally remembered as the quintessential harmonist, who flaunted the conventional rules of voice leading. When quizzed by his teacher Ernest Guiraud in 1889 or 1890, he famously denied that there was any need to resolve French sixth chords or any reason to avoid parallel triads. According to him: “pleasure is the law.” And when he addressed the same issues later in his career, Debussy went so far as to denounce the conventional distinction between so-called perfect and imperfect chords: “Nothing is more mysterious than a perfect chord! Despite all theories, both old and new, we are still not sure, first why it is perfect, and second, why the other chords have to bear the stigma of being imperfect. Music ought therefore to free itself as quickly as possible from these little rituals with which the conservatories insist on encumbering it.”
When Debussy denounced traditional rules of counterpoint, he challenged an area of music theory whose history dates back many centuries. Vast in scope, contrapuntal theory deals with the conditions under which voices can be stacked above or below one another. Traditional species counterpoint focuses on three main issues: how individual voices proceed from one note to the next (e.g., the predominance of steps over leaps); how stacked voices proceed in relation to one another (e.g., the role of oblique, contrary, similar, or parallel motion); and how far apart those voices should be stacked (e.g., the classification and treatment of consonances and dissonances). Learned counterpoint then considers the ways in which specific melodic lines can be transformed (e.g., transposition, inversion, retrograde, retrograde inversion, augmentation and diminution), restacked (e.g., invertible or double counterpoint at the octave, tenth, and twelfth), and staggered (e.g., strict canonic imitation, free fugal imitation, and stretto).
Although contrapuntal theory was originally developed to explain modal polyphony from the Middle Ages and Renaissance, it was subsequently adapted to cover tonal works from the common-practice period, including the scholastic fugue. These adaptations were necessary because the individual voices now operated within the context of functional harmony. In styles where the textures are controlled harmonically rather than intervallically, the individual lines can contain more frequent and more extreme leaps; in so-called compound melodies such leaps are created by shifting from one chord tone to another or from one “latent voice” to another.
If 2015 saw the largest wave of migration to Europe since the Second World War, 2016 might be defined as the year of migration policy responses. While the mass movement of human beings across the Mediterranean Sea and the southern borders of Europe garnered unprecedented global attention in 2015, many analysts turned their attention to the ensuing financial costs, not so much for the countries migrants left, but for the various Southern European and North Atlantic nations to which they fled. The extent of the financial crisis was particularly evident in the success of a popular referendum (colloquially called “Brexit”), which called for the United Kingdom to leave the European Union in order to gain greater control over its borders. Donald J. Trump echoed the rhetoric of crisis in his proposal to build a wall along the United States’ southern border—and then bill Mexico for its construction—as did UK Prime Minister David Cameron in his proposal to expel immigrants whose income was deemed inadequate for contributing to national prosperity. Gestures like these imply that the economic underdevelopment of the global South, the outbreak of civil war in relatively poor countries, and the rise of international terror organizations like the so-called Islamic State are problems for the world's largest economies to the degree that they strain national budgets, which are meant to serve national citizens. In the rhetoric of crisis, migrants are depicted as liabilities more often than fellow citizens of the world, and they are rarely depicted as potential new laborers, taxpayers, or innovators in the global North. As an article in the Atlantic suggested, the “crisis” was less about migration and more about the status of the modern welfare state and its relationship to the increasingly global movement of labor and the bodies that perform it. 
Indeed, at the heart of the Brexit decision, and the rise of other nationalist policy positions around the world, is an important question about who is responsible for ensuring that ordinary people have access to work and social security in an increasingly interconnected world. Are welfare states responsible only to their natural-born citizens? What about the external workers and resources upon which the economies of welfare states rely? The 2016 immigration crisis raises a critical and productive question for social and cultural theory: What is the relationship between the concept of the welfare state and the concept of borders?
Each year, Emergency Medical Services (EMS) personnel respond to over 30 million calls for assistance in the United States alone. These EMS personnel have a rate of occupational fatality comparable to firefighters and police, and a rate of non-fatal injuries that is higher than the rates for police and firefighters and much higher than the national average for all workers. In Australia, no occupational group has a higher injury or fatality rate than EMS personnel. Emergency Medical Services personnel in the US have a rate of occupational violence injuries that is about 22 times higher than the average for all workers. On average, more than one EMS provider in the US is killed every year in an act of violence.
The objective of this epidemiological study was to identify the risks and factors associated with work-related physical violence against EMS personnel internationally.
An online survey, based on a tool developed by the World Health Organization (WHO; Geneva, Switzerland), collected responses from April through November 2016.
There were 1,778 EMS personnel respondents from 13 countries; 69% were male and 54% were married. Around 55% described their primary EMS work location as “urban.” Approximately 68% described their employer as a “public provider.” The majority of respondents were from the US.
When asked “Have you ever been physically attacked while on-duty?”, 761 (65%) of the 1,172 respondents who answered the question said “Yes.” In almost 10% (67) of those incidents, the perpetrator used a weapon. Approximately 90% of the perpetrators were patients and around 5% were patient family members. The influence of alcohol and drugs was prevalent. Overall, men experienced more assaults than women, and younger workers experienced more assaults than older workers.
In order to develop and implement measures to increase safety, EMS personnel must be involved with the research and implementation process. Furthermore, EMS agencies must work with university researchers to quantify agency-level risks and to develop, test, and implement interventions in such a way that they can be reliably evaluated and the results published in peer-reviewed journals.
Maguire BJ, Browne M, O’Neill BJ, Dealy MT, Clare D, O’Meara P. International Survey of Violence Against EMS Personnel: Physical Violence Report. Prehosp Disaster Med. 2018;33(5):526–531.
Utilising routine surveillance data, this study presents a method for generating a baseline comparison that can be used in future foodborne outbreak investigations following a case–case methodology. Salmonella and Campylobacter cases (2012–2015) from Maricopa County, AZ were compared to determine differences in risk factors, symptoms and demographics. For foods and other risk factors, adjusted odds ratios were developed using Campylobacter as the reference. Comparisons were also made for three major Salmonella subtypes, Typhimurium, Enteritidis and Poona, as compared with Campylobacter. Salmonella cases were younger, while Campylobacter cases were more likely to be Hispanic and female. Campylobacter cases more often reported consuming peppers, sprouts, poultry, queso fresco, eggs and raw nuts, and more often reported contact with animal products, contact with birds, visiting a farm or dairy, owning a pet, having a sick pet, swimming in a river, lake or pond, or handling multiple raw meats. Salmonella cases more often reported visiting a petting zoo and contact with a reptile. There were significant variations by Salmonella subtype in both foods and exposures. We recommend departments conduct this analysis to generate a baseline comparison and a running average of relevant odds ratios, allowing staff to focus on trace-back of contaminated food items earlier in the outbreak investigation process.
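For a single exposure, the case–case comparison described above reduces to an odds ratio computed with Campylobacter cases as the reference group. A minimal sketch with hypothetical cell counts (the abstract does not report raw counts), including a Wald 95% confidence interval on the log odds ratio:

```python
import math

def odds_ratio_ci(exp_case, unexp_case, exp_ref, unexp_ref, z=1.96):
    """Odds ratio and Wald 95% CI for an exposure in cases (e.g. Salmonella)
    versus the reference group (e.g. Campylobacter), from a 2x2 table."""
    or_ = (exp_case * unexp_ref) / (unexp_case * exp_ref)
    se = math.sqrt(1/exp_case + 1/unexp_case + 1/exp_ref + 1/unexp_ref)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for one exposure (e.g. reptile contact):
# 30 of 400 Salmonella cases exposed, 10 of 500 Campylobacter cases exposed.
or_, lo, hi = odds_ratio_ci(30, 370, 10, 490)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study's adjusted odds ratios would come from a logistic regression with covariates rather than this crude calculation, but the interpretation is the same: a CI excluding 1 flags an exposure that distinguishes the two pathogens.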
Recent modelling estimates up to two-thirds of new HIV infections among men who have sex with men occur within partnerships, indicating the importance of dyadic HIV prevention efforts. Although new interventions are available to promote dyadic health-enhancing behaviours, minimal research has examined what factors influence partners’ mutual engagement in these behaviours, a critical component of intervention success. Actor-partner interdependence modelling was used to examine associations between relationship characteristics and several dyadic outcomes theorised as antecedents to health-enhancing behaviours: planning and decision making, communication, and joint effort. Among 270 male-male partnerships, relationship satisfaction was significantly associated with all three outcomes for actors (p = .02, .02, .06 respectively). Latino men reported poorer planning and decision making (actor p = .032) and communication (partner p = .044). Alcohol use was significantly and negatively associated with all outcomes except actors’ planning and decision making (actors: p = .11, .038, .004 respectively; partners: p = .03, .056, .02 respectively). Having a sexual agreement was significantly associated with actors’ planning and decision making (p = .007) and communication (p = .008). Focusing on interactions between partners produces a more comprehensive understanding of male couples’ ability to engage in health-enhancing behaviours. This knowledge further identifies new and important foci for the tailoring of dyadic HIV prevention and care interventions.
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H0) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 years, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 106 galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 106 galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline.
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
Archaeological fieldwork preceding housing development revealed a Mesolithic site in a primary context. A central hearth was evident from a cluster of calcined flint and bone, the latter producing a modelled date for the start of occupation at 8220–7840 cal bc and ending at 7960–7530 cal bc (95% probability). The principal activity was the knapping of bladelets, the blanks for microlith production. Impact-damaged microliths indicated the re-tooling of hunting weaponry, while microwear analysis of other tools demonstrated hide working and butchery activity at the site. The lithics can be classified as a Honey Hill assemblage type on the basis of distinctive leaf-shaped microlithic points with inverse basal retouch.
Such assemblages have a known concentration in central England and are thought to be temporally intermediate between the conventional British Early and Late Mesolithic periods. The lithic assemblage is compared to other Honey Hill type and related Horsham type assemblages from south-eastern England. Both assemblage types are termed Middle Mesolithic and may be seen as part of wider developments in the late Preboreal and Boreal periods of north-west Europe. Rapid climatic warming at this time saw the northward expansion of deciduous woodland into north-west Europe. Emerging new ecosystems presented changes in resource patterns and the Middle Mesolithic lithic typo-technological developments reflect novel foraging strategies as adaptations to the new opportunities of Boreal forest conditions. While Honey Hill-type assemblages are seen as part of such wider processes their distinctive typological signature attests to autochthonous, regional developments of human groups infilling the landscape. Such cultural insularity may reflect changing social boundaries with reduction in mobility range and physical isolation caused by rising sea level and the creation of the British archipelago.
This article draws from “big-data” analysis of Netflix’s usage, which suggests that what spectators tend to like about films is inherently generic. Moreover, the process of liking serves as a metaphor, over and above the process of taking pleasure, for the ways in which spectators make texts meaningful rather than deriving meaning from them. The article then discusses some examples of African cultural production in order to focus attention on the category of analysis at stake in theorizing genre—a discussion that helps to distinguish genre’s thematic ontology from its material, formal, and stylistic features. Finally, at the intersection of spectator agency and theme, genre appears to be an “ideological impulse,” a way of relating to and encoding experience that begins with people and that they distribute over texts. This way of understanding genre, the article argues, may help scholars write more productively about the social nature of the concept.