Chapter 1 provides some background on the shift in emphasis from Universal Grammar (UG) to third factors and gives a description of selected third factors, e.g. the Inclusiveness Condition and the Extension Condition. The main emphasis is on the Labeling Algorithm and the Principle of Determinacy. Generative models focus on the faculty of language as represented in the mind/brain. UG is the “system of principles, conditions, and rules” that all languages share through biological necessity. However, although UG has received a great deal of attention, principles “grounded in physical law” and the general “capacity to acquire knowledge” have recently been given more emphasis. This chapter also introduces the two main causes of language change that are responsible for the linguistic cycle: changes driven by economy and changes driven by innovation.
In this pioneering study, a world-renowned generative syntactician explores the impact of phenomena known as 'third factors' on syntactic change. Generative syntax has in recent times incorporated third factors – factors not specific to the language faculty – into its framework, including minimal search, labelling, determinacy and economy. Van Gelderen's study applies these principles to language change, arguing that change is a cyclical process, and that third factor principles must combine with linguistic information to fully account for the cyclical development of 'optimal' language structures. Third factor principles also account for language variation involving that-trace phenomena, CP-deletion, and the presence of expletives and Verb-second. By linking insights from recent theoretical advances in generative syntax to phenomena from language variation and change, this book provides a unique perspective, making it essential reading for academic researchers and students in syntactic theory and historical linguistics.
Chapter 3 provides an overview of a number of theoretical proposals that have been put forward in the literature to account for language variation. It elaborates on models that combine formal generative theorizing and quantitative sociolinguistic methodology, in line with current minimalist analyses (Adger & Smith 2005; Sessarego & Gutiérrez-Rexach 2011; Sessarego 2014a). This chapter also stresses the importance of embracing a perspective of mutual complementation – rather than mutual exclusion – between these two fields, especially when the varieties under study consist of stigmatized vernaculars, for which it may be hard to obtain reliable grammaticality judgments and that may be characterized by high levels of inter- and intra-speaker speech variability (Cornips & Poletto 2005).
In this paper I first worry that Rorty’s attack on various conceptions of “the world” has an alarming tendency to veer from opposition to the kind of realism that he associates with various philosophers, such as Plato, Descartes, or even Kant, into skepticism about ordinary activities including those of observing things and referring to them. I try to uncover the roots of this slide in various semantic doctrines, and explore the distinction between minimalist or deflationist theories of truth, and any wider, and less plausible general doctrine of semantic minimalism.
Laura Dean's creative output in minimalist art spans interconnected work in dance, music, and drawing. Throughout the early 1970s, Dean represented her compositional structures as works on paper, which present an expanded visualization of her artistic experimentation with color, symmetry, repetition, and form. Dean rejects the reconstruction of her performance works; instead, she advances a notion of dances as impermanent. Situating Dean in the context of serial and conceptual art, in which the material art object is deemphasized in favor of communicating compositional logic, I argue that Dean presents a choreographic legacy premised on the intentional disappearance of her work in favor of perpetuating ideation and concept.
This chapter begins (1.1) by looking at prescriptive and descriptive approaches to grammar, and at different sources of linguistic data. It goes on to discuss the approach to syntax in traditional grammar, looking at grammatical categories (1.2) and grammatical functions (1.3). 1.4 considers aspects of syntax which are potentially universal before going on to consider the nature of universals, the architecture of grammars, and the Strong Minimalist Thesis. 1.5 examines parameters of variation between languages, before turning to consider the role of parameter-setting in language acquisition, and outlining Principles and Parameters Theory (1.6). The chapter concludes with a summary (1.7), and a set of bibliographical notes (1.8). There is a free-to-download Students’ Workbook that includes a separate set of exercise material for each core section and a Students’ Answerbook. The free-to-download Teachers’ Answerbook provides detailed written answers for every exercise example. The free-to-download PowerPoints provide a more vivid and visual representation of the material in each core section of the chapter.
This new edition of Andrew Radford's outstanding resource for students is a step-by-step, practical introduction to English syntax and syntactic principles, written by a globally-renowned expert in the field. Assuming little or no prior background in syntax, Radford outlines key concepts and how they can be used to describe various aspects of English sentence structure. Each chapter contains core modules focusing on a specific topic, a summary recapitulating the main points of the chapter, and a bibliographical section providing references to original source material. This edition has been extensively updated, with new analyses, exercise materials, references and a brand-new chapter on adjuncts. Students will benefit from the online workbook, which contains a vast amount of exercise material for each module, including self-study materials and a student answerbook for these. Teachers will value the extensive PowerPoints outlining module contents and the comprehensive teacher answerbook, which covers all workbook and PowerPoint exercises.
The problem of creeping minimalism concerns how to tell the difference between metaethical expressivism and its rivals given contemporary expressivists’ acceptance of minimalism about truth and related concepts. Explanationism finds the difference in what expressivists use to explain why ethical language and thought has the content it does. I argue that two recent versions of explanationism are unsatisfactory and offer a third version, subject matter explanationism. This view, I argue, captures the advantages of previous views without their disadvantages and gives us a principled and general characterisation of non-representational views about all kinds of language and thought.
The chapter begins by delineating the separate tasks of truthmaker theory and theories of truth. The two kinds of theories can be separated, and so are in principle distinct. However, history has not always treated them that way. It is proposed that one way of understanding the distinction between substantive and deflationary theories of truth is in terms of their contrasting relationship to truthmaking. It is then argued that truthmaking cannot be put to work in a theory of truth. Consequently, truthmaking motivates the rejection of substantive accounts of the property of truth. (It ultimately remains neutral regarding the substance of the concept of truth.) As a result, it is shown how correspondence theorists – traditional allies of the notion of truthmaking – are threatened by this book’s approach to truthmaking, whereas deflationists – who frequently see an opponent in the truthmaker theorist – have found a friend.
Does a bilingual person have two separate lexicons and two separate grammatical systems? Or should bilingual linguistic competence be regarded as an integrated system? This book explores this issue, which is central to current debate in the study of bilingualism, and argues for an integrated hypothesis: the linguistic competence of an individual is a single cognitive faculty, and the bilingual mind should not be regarded as fundamentally different from the monolingual one. This conclusion is backed up with a variety of empirical data, in particular code-switching, drawn from a variety of bilingual pairs. The book introduces key notions in minimalism and distributed morphology, making them accessible to readers with different scholarly foci. This book is of interest to those working in linguistics and psycholinguistics, especially bilingualism, code-switching, and the lexicon.
This chapter introduces MacSwan’s (1999) model. It is a minimalist framework with separationist assumptions, in that the bilingual is claimed to have two lexicons and two PFs. The chapter also presents two empirical challenges to this model: mixed selection and noun classes. Mixed selection refers to the empirical fact that an item from “one lexicon” may select for an item in “the other” lexicon. Under the label “noun classes,” I show that an English noun can be inserted into a Swahili discourse, in the process acquiring a noun class. Both these well-known features of code-switching are empirical problems for any theory that posits separate lexicons. The chapter ends with a brief discussion of Multiple Grammar Theory, highlighting the points of contact and divergence with the integrationist approach.
This chapter is prompted by Coetzee’s longstanding interest in stories and storytelling, an interest that is registered across his critical essays and reviews, and thematized in several of his works. Focusing on In the Heart of the Country, The Master of Petersburg, and The Childhood of Jesus, as well as the computer poem ‘Hero and Bad Mother in Epic’, the chapter charts the relationship between the kinds of story that Coetzee has told – generally limited in the scope of their plots and the number of their principal characters – and the forms of narration he has adopted, which vary from the first-person character narration of certain of his early and middle fictions, to the tightly focalized external narration of his later works, to the dialogue-heavy and somewhat affectless narration of the Jesus novels. In each case, it is suggested that the particular form of narration is related to the particular truth with which the work in question seeks to confront its readers.
This chapter introduces the theoretical assumptions that ground the analyses in later chapters. I refer to this model as MDM: Minimalist Distributed Morphology. It presents a minimalist syntax with emphasis on phases as cycles of syntactic derivation. Roots and categories are separated as distinct syntactic nodes, and roots are reanalyzed as indices that link an Encyclopedia item with an exponent. Morphology is realizational, with an important role for impoverishment rules and vocabulary insertion rules. Code-switching data are used to present these assumptions. The third module of the model is the Encyclopedia, where minimal syntactic structures find conceptual meaning.
Cormac McCarthy is the foremost American novelist to have simultaneously inspired and crafted screenplays which are successful in their own right. Beginning with his early script for The Gardener’s Son (1977) and continuing through both the Coen brothers’ film adaptation of No Country for Old Men (2007) and his own screenplay for The Counselor (2013), McCarthy has emerged as a formidable figure both on the page and on screen. Yet the intriguing aspect of this dual career is how fully his cinematic efforts have altered the trajectory of his novelistic creations, with an early verbal style that culminated in the famously baroque Blood Meridian honed decades later into a more elliptical, streamlined novelistic strain. This becomes most clearly apparent in reviewing the successes (and failures) involved in adapting McCarthy’s astringent novelistic vision for the cinema. The review of seven distinct original screenplays and adaptations suggests that the arc of McCarthy’s novels cannot be adequately understood independent of his strong, developing commitment to cinematic possibilities, which has progressively altered a vision that began as exclusively verbal and has become increasingly tempered by visual and filmic considerations.
Hobbes’s On the Citizen discussed religion and church-state relations less fully than his later Leviathan. In Leviathan, he trenchantly attacked theories which granted the clergy power independent of that of the state and its sovereign. In On the Citizen, he expressed his views with greater moderation and circumspection. Modern scholars debate whether Hobbes changed his ideas or just his tone between the two books. This chapter discusses the evidence for and against the claim that On the Citizen put forward relatively conventional views on the relationship between the powers of the state and the church, and that it was only in Leviathan that he abandoned a theory that was close to orthodox Anglicanism, and characteristic of royalists at the time of the English Civil War. The chapter examines what Hobbes said in On the Citizen, and also discusses the ideas of some of his contemporaries. It notes that the book soon encountered criticism for its contentions concerning religion and church-state relations, and especially for granting the sovereign too much power over the church and the clergy. It argues that the theory presented in On the Citizen is not so very distant from that which Hobbes espoused in Leviathan.
The introduction discusses the changes in consumers' preferences and the symbolic and practical decline of ownership. Young consumers, especially millennials, prefer experiences over things, and comfort and ease of use over ownership. It then presents the sharing economy phenomenon, renames it "the access economy," and introduces the main argument of the book: the sharing economy pushes for a mobile and flexible vision of engaging with possessions and, as a result, with other people. It then places this argument within the broader context of property discourse.
One of the main problems that Paul Horwich’s Minimalist theory of truth must face is the generalization problem, which shows that Minimalism is too weak to have the fundamental explanatory role Horwich claims it has. In this paper, I defend Horwich’s response to the generalization problem from an objection raised by Bradley Armour-Garb. I also argue that, given my response to Armour-Garb, Horwich’s proposal to cope with the generalization problem can be simplified.
In this paper I discuss the so-called problem of creeping minimalism, the problem of distinguishing metaethical expressivism from its rivals once expressivists start accepting minimalist theories about truth, representation, belief, and similar concepts. I argue that Dreier's ‘explanation’ explanation is almost correct, but by critically examining it we not only get a better solution, but also draw out some interesting results about expressivism and non-representationalist theories of meaning more generally.
Traditional expressivists want to preserve a contrast between the representational use of declarative sentences in descriptive domains and the non-representational use of declarative sentences in other areas of discourse (in particular, normative speech). However, expressivists have good reasons to endorse minimalism about representational notions, and minimalism seems to threaten the existence of such a bifurcation. Thus, there are pressures for expressivists to become global anti-representationalists. In this paper I discuss how to reconstruct in non-representationalist terms the sort of bifurcation traditional expressivists were after. My proposal is that the relevant bifurcation can be articulated by appeal to the contrast between relativistic and non-relativistic assertoric practices. I argue that this contrast, which can be specified without appeal to representational notions, captures the core intuitions behind the expressivist bifurcation (in particular, it captures the anti-realist intuitions motivating many expressivist proposals).
Economists are notoriously averse to paternalism. Happiness-driven economics (HDE) has been widely accused of paternalism, particularly by friends of minimalism. The question of paternalism in HDE raises broader questions about the potential for paternalism in economic policy analysis. This chapter begins with a characterization of minimalism. Then, using a broad definition of paternalism, the chapter examines the anti-paternalist credentials of minimalism and finds them wanting. It considers how policy-makers might avoid, or at least minimize, paternalism, arguing that HDE should be part of a less paternalistic approach to policy analysis. Minimalist cost-benefit analysis (MCBA)-based policy threatens paternalism not just for farmers and fishermen, but for those subject to any policy with highly disruptive effects on people's lives. The central charge against minimalist economics is moral incoherence, specifically where that framework extends to the policy realm for weighing the costs and benefits of policy options.