Presents practical tips to help clinicians go from good to great in their approach to older patients. Reviews key skills, knowledge, and attitudes about older people that they probably didn’t learn in their training, to add to their approach for better outcomes. The tips: treat the person, not the person’s age; sit down; talk slowly, not loudly; think broadly, not algorithmically; drugs, drugs, drugs; go for a walk; be a team player; learn about frailty.
An effective, parsimonious way to treat patients who present with comorbid conditions and other complexities is to use process-based, generic CBT employing case conceptualization. This approach allows therapists to assess and target the patient’s maladaptive processes in functioning that may underlie several areas of diagnostic concern, and whose remediation may produce multiple clinical benefits. The case conceptualization serves as a road map to understand the patient’s subjective phenomenology, thus facilitating well-targeted interventions and abetting the therapeutic relationship. The case of Zina demonstrates how the patient’s avoidance strategies and maladaptive schemas played roles in her mood disorder (with suicidality), anxiety, eating disorder, substance use, and purging – and how all of these interfered with her life goals. The therapist prioritized Zina’s safety and attended closely to the therapeutic relationship. The case conceptualization helped illuminate ways to enhance Zina’s participation in treatment (including pharmacotherapy). Eighty sessions produced positive results.
Cognitive behavioral therapy (CBT) and exposure and response prevention (E/RP) remain the treatments of choice for OCD, but for many clients outcomes are suboptimal. In the first part of this chapter, we present the CBT approach to OCD, alongside four promising areas to inform and refine current interventions, namely: optimizing E/RP with inhibitory learning principles; understanding complexity in OCD; addressing differences in disgust and harm avoidance; and using imagery rescripting for clients with intrusive images. In the second part, we provide an updated CBT and E/RP approach to OCD that integrates these areas into the standard assessment and treatment protocol. The approach emphasizes the importance of a clear developmental formulation with links of past relevant experiences to current OCD, and understanding the context, function, and unintended consequences of obsessions and compulsions. OCD measures and screening tools are introduced.
It is important to be able to name the plants and animals in one’s environment, but knowing the names does not in and of itself advance the study of ecology. Frank Rigler argued that the species-oriented approach to studying ecology is intractable simply because of the time it would take to obtain enough information on each species to generalize to the community scale. Life on Earth can be named (or classified) in two complementary ways, using phylogeny and functional traits. Trait matrices provide the raw material for trait-based ecology. Compilations and screening are two distinct sources of data for trait matrices. Compilation of traits across studies is an important way of generating data for global-scale synthesis. Screening traits of local communities in the field or under standard conditions is the most effective way of generating quality data for local communities.
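A trait matrix is simple enough to sketch in a few lines of code. The example below is a hypothetical species-by-trait matrix (all species names and trait values are invented for illustration) showing how trait-based ecology operates on the columns, ignoring which species is which:

```python
# Hypothetical species-by-trait matrix (all names and values invented):
# rows are species, columns are functional traits.
trait_matrix = {
    #  species         (leaf_area_cm2, seed_mass_mg, max_height_m)
    "Quercus_robur":   (45.0, 3500.0, 35.0),
    "Betula_pendula":  (12.0,    0.2, 25.0),
    "Urtica_dioica":   (30.0,    0.1,  1.5),
}

# Trait-based ecology works on the columns: e.g., a community-level
# mean of a trait, computed without reference to species identity.
def trait_mean(matrix, trait_index):
    values = [row[trait_index] for row in matrix.values()]
    return sum(values) / len(values)

print(f"mean max height: {trait_mean(trait_matrix, 2):.1f} m")  # 20.5 m
```

A screening study fills such a matrix by measuring every species of a local community under standard conditions; a compilation fills it by harvesting values from published studies.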
How do actors construct complexity? This chapter examines the ways in which, and the extent to which, epistemic communities and communities of practice have shaped the global agenda on environmental governance, in the shadow of concerns about complexity. Employing Emanuel Adler’s new theoretical framework, it argues that environmental governance has emerged through a process of cognitive evolution heavily shaped by epistemic communities.
Research on complementizer selection has shown that the presence of a negative particle in a subordinate complement clause influences complement choice, leading to a relatively higher proportion of finite complementation patterns by increasing the complexity of the syntactic environment. Studies have also shown that different types of negation, namely not- and no-negation, increase the tendency towards more explicit complementation options (Rohdenburg 2015). The current study focuses on the effect of not- and no-negation on the complementation profile of the verb regret, which allows variation between finite that/zero-complement clauses and nonfinite (S) -ing clauses. The GloWbE corpus was used to create a data set of more than 4,000 examples from 16 varieties of English. The results of the analysis support previous findings that the presence of a negative marker in the complement clause increases the preference for finite patterns, especially in L2 varieties of English. However, contrary to the expectations of this study, no-negation was found to have a stronger effect on complement choice than not-negation.
Chapter 7 introduces the subject matter of artificial complexity. First, it presents examples of artificial complexity by means of cellular automata. It presents one-dimensional cellular automata following Wolfram’s rules, and a two-dimensional cellular automaton in the form of a spatial evolutionary game. Then it introduces the concepts of complexity and emergence, as used in the science of complexity, and discusses some issues related to their definition and measurement. Finally, it discusses the scope and controversies around the application of the concepts and models of the science of complexity in economics.
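The one-dimensional cellular automata mentioned above are easy to reproduce. The following is a minimal sketch of an elementary cellular automaton under one of Wolfram's rules (Rule 110 is chosen here purely for illustration), using periodic boundary conditions:

```python
def step(cells, rule=110):
    """Apply one update of an elementary cellular automaton.

    Each cell's next state depends on itself and its two neighbours
    (wrapping at the edges); the 8 possible three-cell neighbourhoods
    index into the bits of the integer `rule`.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and evolve for a few generations.
cells = [0] * 15
cells[7] = 1
for _ in range(5):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Simple as it is, this update rule is exactly the kind of system the chapter uses to illustrate emergence: global patterns arise that are not stated anywhere in the local rule.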
Climate is an emergent system with many interacting processes and components. Complexity is essential to accurately model the system and make quantitative predictions. But this complexity obscures the different compensating errors inherent in climate models. The Anna Karenina principle, which assumes that these compensating errors are random, is introduced. By using models with different formulations for small-scale processes to make predictions and then averaging them, we can expect to cancel out the random errors. This multimodel averaging can increase the skill of climate predictions, provided the models are sufficiently diverse. Climate models tend to borrow formulations from each other, which can lead to “herd mentality” and reduce model diversity. The need to preserve the diversity of models works against the need for replicability of results from those models. A compromise between these two conflicting goals becomes essential.
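The error-cancelling logic of multimodel averaging can be illustrated with a toy numerical experiment (not a climate model: the "truth" and noise levels below are invented). If each model's error is independent random noise, the ensemble mean lands markedly closer to the truth than a single model does:

```python
# Toy illustration of multimodel averaging: independent random errors
# largely cancel in the ensemble mean.
import random
import statistics

random.seed(0)
truth = 15.0                      # hypothetical "true" quantity
n_models, n_trials = 10, 2000

single_errors, ensemble_errors = [], []
for _ in range(n_trials):
    # Each model = truth + independent noise (diverse, uncorrelated errors).
    predictions = [truth + random.gauss(0, 1.0) for _ in range(n_models)]
    single_errors.append(abs(predictions[0] - truth))
    ensemble_errors.append(abs(statistics.mean(predictions) - truth))

print(f"mean |error|, single model:     {statistics.mean(single_errors):.3f}")
print(f"mean |error|, 10-model average: {statistics.mean(ensemble_errors):.3f}")
```

If the models instead shared formulations, their errors would be correlated and far less would cancel, which is the "herd mentality" problem described above.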
Strategic decision makers interpret information and translate it into organizational action through the lens of strategic schemas. How should firms realize high performance with various strategic schemas? Cognitive content and structure have been shown to underlie strategic schemas, but few studies have considered them together. This study employs aggregation analysis to clarify the interaction between cognitive content (technology orientation, market orientation) and structure (complexity, centrality) in affecting the firm performance (FP) of ‘hidden champion’ companies, identified by the Economy and Information Technology Department of Zhejiang Province, China. The empirical method applies fuzzy-set qualitative comparative analysis to generate strategic schema profiles for high FP. This exploratory study fills a gap in the literature on managerial cognition and provides key lessons from ‘hidden champion’ companies in China and their paths for small- and medium-sized enterprises to grow.
Samira Farwaneh interrogates the largely unquestioned and untested assumption in linguistics that languages are all equally complex. Again, the indelible mark of diglossia in Arabic on theorizing about language colours the analysis, specifically with the notion that the formal Arabic of writing and declamation is necessarily more complex than the natively spoken varieties of the language. Starting from the assumption, shared by native speakers of Arabic and many linguists studying Arabic alike, that spoken varieties are simplifications of a more structurally complex and presumably chronologically older Arabic, represented by the Arabic of classical writing and its modern written descendant, she demonstrates that Arabic dialects are in some ways more structurally complex than the Arabic of writing, specifically in the tense, mood, and aspect systems of spoken Arabic; in the manifestations of indefinite noun constructs and object marking, notably the so-called ‘dialectal tanwin’; in co-referential and ethical dative marking; and in negation.
This contribution surveys various large-scale quantitative techniques that have been utilized in the literature on varieties and dialects of English to determine their typological relatedness: (a) aggregative measures of distance or similarity, based on atlas or survey data; (b) typological profiling, a technique that draws on naturalistic text corpora to calculate usage- and frequency-based measures of grammatical analyticity and syntheticity; (c) a corpus-based method, inspired by work in information theory, that is designed to map out varieties based on how they differ in terms of language/dialect complexity; and (d) an approach to calculate distances between varieties as a function of the extent to which grammatical variation patterns in usage data are dissimilar.
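Typological profiling, as in technique (b), can be caricatured in a few lines. The marker inventories below are tiny invented stand-ins for the grammatical annotation a real study would use; the sketch only shows the shape of the calculation, i.e. frequencies of free grammatical markers versus bound ones, normalized per 1,000 tokens:

```python
# Toy sketch of typological profiling: count analytic markers (free
# grammatical words) and synthetic markers (inflected word forms),
# then express each as a rate per 1,000 tokens. These inventories are
# illustrative stand-ins, not a real tagger.
ANALYTIC = {"the", "a", "of", "to", "will", "have", "be", "not"}
SYNTHETIC_SUFFIXES = ("ed", "ing", "s", "er", "est")

def profile(tokens):
    n = len(tokens)
    analytic = sum(1 for t in tokens if t in ANALYTIC)
    synthetic = sum(
        1 for t in tokens
        if t not in ANALYTIC and t.endswith(SYNTHETIC_SUFFIXES)
    )
    return {"analyticity": 1000 * analytic / n,
            "syntheticity": 1000 * synthetic / n}

text = "the dogs barked loudly and the cats were not going to listen".split()
print(profile(text))
```

A variety scoring high on analyticity and low on syntheticity relies more on free grammatical words than on inflection; plotting varieties in this two-dimensional space is what lets the technique compare them at scale.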
Chapter 11 shows how figurative messaging can be used in the most optimal way in advertising campaigns and branding exercises. It provides recommendations designed to help practitioners to make informed choices about the use of figurative messaging in their campaigns and to anticipate possible outcomes and pitfalls. It covers issues such as the way in which metaphor works as a ‘disruptor’, how people experience and interact with metaphor, the different ways in which figurative messaging can be used creatively, how this relates to fast and slow thinking, and the ways in which figurative messaging can be made to appeal to different audiences.
This chapter challenges the binary contrast between ‘myth’ and rational account (logos), reviewing the negative impact of the application of that dichotomy when used to draw contrasts between properly scientific modes of discourse and those to be dismissed as irrational. Ethnographic reports show that there is often no equivalent to our term ‘myth’ in indigenous vocabularies, at least not one that carries similar pejorative undertones. The arguments of Lévi-Strauss that systems of myth may convey ‘concrete science’ have the merit of taking those systems seriously, but still imply a pejorative binary judgement.
This chapter examines the varying roles that definitions may play in scientific investigations. Obviously they may laudably aim at clarifying the problem to be explored, but the demand and search for univocal definitions can have a limiting effect on the inquiry subsequently pursued. When a definition is presented as the goal of an investigation, for example of the characteristics of an animal species, that may have the effect of obscuring some of the complexities that may be uncovered along the way. The problem of the role of definitions in an axiomatic system such as Euclid’s lies in their presumed self-evidence.
Schizophrenia is a complex mental disorder, which has been recently conceptualized as a neurodevelopmental disease. This conceptualization has changed the psychopathological approach to schizophrenia, which is now described as lying on a continuum from mild psychotic experiences to frank psychotic episodes. According to this theory, the presence of psychotic symptoms would represent the final pathway of a complex dysregulation and interaction of different genetic and environmental risk factors. As regards genetic liability, recent genome-wide association studies have identified a total of 108 loci containing common risk alleles that meet genome-wide significance. As regards environmental factors, higher rates of schizophrenia have been found in ethnic minority groups, in persons who are heavy cannabis smokers, in those who suffered from severe childhood traumas, and in persons who have been reared in highly deprived settings. The identification of risk factors associated with vulnerability to psychosis is essential for improving our understanding and early detection of vulnerable individuals, and for proposing tailored and timely interventions for sufferers. There is a need for an interdisciplinary approach to schizophrenia which includes screening procedures for individuals reporting specific vulnerabilities and treatment strategies tailored to patients’ needs.
Attention deficit hyperactivity disorder (ADHD) is a common childhood neurodevelopmental disorder, and methylphenidate (MPH) is a first-line therapeutic option for treating ADHD. However, how brain complexity and entropy change with methylphenidate treatment, and the clinical implications of possible changes in entropy, have yet to be studied.
This study aimed to reveal how MPH treatment affects complexity in the brains of children with ADHD by entropy-based qEEG analysis. Whether the neurophysiological changes detected are related to clinical variables, and how, are two further questions this study sought to answer.
During eyes-open resting, EEG signals were recorded from 25 boys with ADHD-combined type before MPH administration and at the end of the first month of treatment. Approximate entropy (ApEn), sample entropy (SampEn), and permutation entropy (PermEn) were used to analyse the recordings.
A statistically significant decrease in entropy level was found with MPH treatment in the F4 channel according to ApEn and SampEn analysis (p < 0.05). In addition, according to PermEn analysis, the decrease in entropy with MPH treatment in the regions indicated by the F3, F4, P4, T3, T6, and O2 channels was found to be statistically significant (p < 0.05).
This is the first study to investigate how MPH treatment affects the complexity in the brain of children with ADHD. Entropy-based qEEG analysis may be a new method that can be used in diagnostic, clinical and prognostic predictions in ADHD.
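Of the entropy measures used in the study, permutation entropy is the simplest to sketch. The following is a hedged illustration based on Bandt and Pompe's ordinal-pattern definition; the parameters (order 3, delay 1) are illustrative choices, not those of the study:

```python
# Sketch of permutation entropy: the Shannon entropy of the ordinal
# patterns (rank orderings) found in sliding windows over a signal.
import math
from collections import Counter

def permutation_entropy(signal, order=3, delay=1, normalize=True):
    """Entropy of ordinal patterns of length `order` in `signal`."""
    patterns = Counter()
    for i in range(len(signal) - (order - 1) * delay):
        window = signal[i : i + order * delay : delay]
        # Ordinal pattern: indices of the samples sorted by value.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    # Normalizing by log2(order!) maps the result into [0, 1].
    return h / math.log2(math.factorial(order)) if normalize else h

print(permutation_entropy([1, 2, 3, 4, 5, 6, 7, 8]))      # 0.0
print(permutation_entropy([4, 7, 9, 10, 6, 11, 3, 5, 8]))
```

A regular (monotone) signal uses a single ordinal pattern and scores 0, while an irregular signal spreads over many patterns and scores near 1; a decrease in such a score after treatment is the kind of change the study reports.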
To deal with the complexity caused by a constantly increasing need for product customization, many companies have adopted a product modularization strategy. Product modularization has the potential to yield benefits in product design, in manufacturing, and in the supply chain. However, it poses great challenges in its implementation, which includes complex decision-making that affects the whole value chain. The purpose of this paper is to describe how a game-based approach can be used for academic education and management training, with the aim of improving decision-making in product modularization by visualizing, and practising, the complex interplay between product, manufacturing, and supply chain architecture. The paper describes the development of the LEGO Exploratorium game set-up, based on LEGO minifigures, and how it has been used both for teaching engineering students and in company workshops. Using this game set-up will increase companies' possibilities to develop modularized products that are designed for both efficient manufacturing and supply chain management.
Engineering designers seek to explore ‘real’ problems that must be solved across design processes. This exploration can be challenging in complex problem situations. An effective way of encouraging design exploration is conjecture-based problem exploration—informing problem re-interpretation by potential solutions. However, little evidence indicates how this process unfolds, especially in complex problem situations. This study addresses this gap by articulating the underlying cognitive mechanism of conjecture-based problem exploration. Situated in a creative design practice that tackles real-world, complex problem situations, we employ grounded theory to conduct qualitative coding of interview transcripts and documents elicited from ten multidisciplinary graduate students. We developed a three-phase process model to explain conjecture-based problem exploration: (1) triggering through analogizing, inspiring, evaluating, and questioning; (2) transitioning to problem space expansion; and (3) resulting in problem focus adjustment, incrementally or radically. Our explanation contributes to design theory building and encourages engineering designers to embrace a dynamic view of design problems when addressing complexity.