To determine the burden of skin and soft tissue infections (SSTI), the nature of antimicrobial prescribing and factors contributing to inappropriate prescribing for SSTIs in Australian aged care facilities, SSTI and antimicrobial prescribing data were collected via a standardised national survey. The proportion of residents prescribed ⩾1 antimicrobial for presumed SSTI and the proportion whose infections met McGeer et al. surveillance definitions were determined. Antimicrobial choice was compared to national prescribing guidelines and prescription duration analysed using a negative binomial mixed-effects regression model. Of 12 319 surveyed residents, 452 (3.7%) were prescribed an antimicrobial for a SSTI and 29% of these residents had confirmed infection. Topical clotrimazole was most frequently prescribed, often for unspecified indications. Where an indication was documented, antimicrobial choice was generally aligned with recommendations. Duration of prescribing (in days) was associated with use of an agent for prophylaxis (rate ratio (RR) 1.63, 95% confidence interval (CI) 1.08–2.52), PRN orders (RR 2.10, 95% CI 1.42–3.11) and prescription of a topical agent (RR 1.47, 95% CI 1.08–2.02), while documentation of a review or stop date was associated with reduced duration of prescribing (RR 0.33, 95% CI 0.25–0.43). Antimicrobial prescribing for SSTI is frequent in aged care facilities in Australia. Methods to enhance appropriate prescribing, including clinician documentation, are required.
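The abstract reports rate ratios from a negative binomial mixed-effects regression. As a minimal illustration of how such estimates are read, the sketch below converts a regression coefficient (a log rate ratio) and its standard error into a rate ratio with a Wald 95% confidence interval. The coefficient and standard error here are hypothetical values, chosen only so that the output resembles the reported PRN estimate (RR 2.10, 95% CI 1.42–3.11); they are not the study's raw model output.

```python
import math

def rate_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a regression coefficient (log rate ratio) and its
    standard error into a rate ratio with a Wald 95% CI."""
    rr = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return rr, lo, hi

# Hypothetical coefficient for a 'PRN order' term (illustrative only):
rr, lo, hi = rate_ratio_ci(beta=0.742, se=0.20)
print(f"RR {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The same exponentiation applies to the protective effect of a documented review or stop date: a negative coefficient yields a rate ratio below 1.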
We provide detailed contextual information on 25 ¹⁴C dates for unusually well-preserved archaeological and paleontological remains from Daisy Cave. Paleontological materials, including faunal and floral remains, have been recovered from deposits spanning roughly the past 16,000 yr, while archaeological materials date back to ca. 10,500 BP. Multidisciplinary investigations at the site provide a detailed record of environmental and cultural changes on San Miguel Island during this time period. This record includes evidence for the local or regional extinction of a number of animal species, as well as some of the earliest evidence for the human use of boats and other maritime activities in the Americas. Data from Daisy Cave contribute to a growing body of evidence that Paleoindians had adapted to a wide variety of New World environments prior to 10,000 BP. Analysis of shell–charcoal pairs, along with isotopic analysis of associated marine shells, supports the general validity of marine shell dating, but also provides evidence for temporal fluctuations in the reservoir effect within the Santa Barbara Channel region.
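The reservoir effect mentioned above offsets marine shell ¹⁴C ages relative to contemporaneous terrestrial samples, which is why shell–charcoal pairs are informative. A minimal sketch of the correction arithmetic follows; the global reservoir age R and local offset ΔR used here are purely hypothetical placeholders (the study derives region-specific values from the paired dates themselves).

```python
def reservoir_corrected_age(shell_age_bp: float,
                            r_global: float = 400.0,
                            delta_r: float = 225.0) -> float:
    """Subtract the global marine reservoir age (R) plus a local
    offset (Delta-R) from a marine shell 14C age, making it
    comparable with a paired terrestrial charcoal date.
    Both default values are hypothetical, for illustration only."""
    return shell_age_bp - (r_global + delta_r)

# Under these assumed offsets, a shell date of 11,125 BP would be
# comparable with a charcoal date of 10,500 BP:
corrected = reservoir_corrected_age(11125.0)
```

Temporal fluctuation in the reservoir effect means ΔR itself varies through time, which is exactly what the shell–charcoal comparisons are used to detect.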
The need for higher energy density batteries has spawned recent renewed interest in alternatives to lithium ion batteries, including multivalent chemistries that theoretically can provide twice the volumetric capacity if two electrons can be transferred per intercalating ion. Initial investigations of these chemistries have been limited to date by the lack of understanding of the compatibility between intercalation electrode materials, electrolytes, and current collectors. This work describes the utilization of hybrid cells to evaluate multivalent cathodes, consisting of high surface area carbon anodes and multivalent nonaqueous electrolytes that are compatible with oxide intercalation electrodes. In particular, electrolyte and current collector compatibility was investigated, and it was found that the carbon and active material play an important role in determining the compatibility of PF6-based multivalent electrolytes with carbon-based current collectors. Through the exploration of electrolytes that are compatible with the cathode, new cell chemistries and configurations can be developed, including a magnesium-ion battery with two intercalation host electrodes, which may expand the known Mg-based systems beyond the present state-of-the-art sulfide-based cathodes with organohalide–magnesium-based electrolytes.
Persons who develop tuberculosis (TB) may have subtle immune defects that could predispose to other intracellular bacterial infections (ICBIs). We obtained data on TB and five ICBIs (Chlamydia trachomatis, Salmonella spp., Shigella spp., Yersinia spp., Listeria monocytogenes) reported to the Tennessee Department of Health, USA, 2000–2011. Incidence rate ratios (IRRs) comparing ICBIs in persons who developed TB and ICBIs in the Tennessee population, adjusted for age, sex, race and ethnicity, were estimated. IRRs were not significantly elevated for all ICBIs combined [IRR 0·87, 95% confidence interval (CI) 0·71–1·06]. The C. trachomatis rate was lowest in the year post-TB diagnosis (IRR 0·17, 95% CI 0·04–0·70). More Salmonella infections occurred in extrapulmonary TB compared to pulmonary TB patients (IRR 14·3, 95% CI 1·67–122); however, this appeared to be related to HIV co-infection. TB was not associated with an increased risk of other ICBIs. In fact, fewer C. trachomatis infections occurred after recent TB diagnosis. Reasons for this association, including reduced exposure, protection conferred by anti-TB drugs or macrophage activation by Mycobacterium tuberculosis infection, warrant further investigation.
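The study's IRRs are adjusted estimates from the full surveillance data; as a simpler illustration of the quantity itself, the sketch below computes a crude incidence rate ratio from case counts and person-time, with a log-normal (Wald) 95% confidence interval. The counts are hypothetical, not the Tennessee data.

```python
import math

def incidence_rate_ratio(cases1: int, pt1: float,
                         cases0: int, pt0: float, z: float = 1.96):
    """Crude incidence rate ratio comparing an exposed group
    (cases1 events over pt1 person-time) with a reference group,
    with a Wald 95% CI computed on the log scale."""
    irr = (cases1 / pt1) / (cases0 / pt0)
    se_log = math.sqrt(1 / cases1 + 1 / cases0)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical counts: 12 infections over 1000 person-years vs.
# 480 over 40,000 person-years (identical rates, so IRR = 1):
irr, lo, hi = incidence_rate_ratio(12, 1000.0, 480, 40000.0)
```

A CI spanning 1, as here, corresponds to the study's "not significantly elevated" finding for all ICBIs combined.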
Common sources of Shiga toxin-producing Escherichia coli (STEC) O157 infection have been identified by investigating outbreaks and by case-control studies of sporadic infections. We conducted an analysis to attribute STEC O157 infections ascertained in 1996 and 1999 by the Foodborne Diseases Active Surveillance Network (FoodNet) to sources. Multivariable models from two case-control studies conducted in FoodNet and outbreak investigations that occurred during the study years were used to calculate the annual number of infections attributable to six sources. Using the results of the outbreak investigations alone, 27% and 15% of infections were attributed to a source in 1996 and 1999, respectively. Combining information from both data sources, 65% of infections in 1996 and 34% of infections in 1999 were attributed. The results suggest that methods to incorporate data from multiple surveillance systems and over several years are needed to improve estimation of the number of illnesses attributable to exposure sources.
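One common building block for attributing sporadic infections to sources from case-control data is Miettinen's case-based attributable fraction, computed from the odds ratio and the proportion of cases exposed. The sketch below shows that formula with hypothetical inputs; it is offered as background on how attribution from case-control studies works in general, not as the exact method this study used.

```python
def attributable_fraction(p_cases_exposed: float, odds_ratio: float) -> float:
    """Miettinen's case-based attributable fraction: the share of
    cases attributable to an exposure, given the proportion of
    cases exposed and the case-control odds ratio (used as a
    rate-ratio approximation for a rare disease)."""
    return p_cases_exposed * (odds_ratio - 1.0) / odds_ratio

# Hypothetical source: 40% of cases exposed, odds ratio 2.5,
# giving 24% of cases attributable to that source:
af = attributable_fraction(0.40, 2.5)
```

Multiplying such a fraction by the surveillance case count gives an annual number of infections attributable to a source; summing over sources (and adding outbreak-confirmed cases) yields combined attribution percentages like the 65% and 34% reported above.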
There is increasing emphasis on the need for effective ways of sharing knowledge to enhance environmental management and sustainability. Knowledge exchange (KE) comprises processes that generate, share and/or use knowledge through various methods appropriate to the context, purpose, and participants involved. KE includes concepts such as sharing, generation, coproduction, comanagement, and brokerage of knowledge. This paper elicits the expert knowledge of academics involved in research and practice of KE from different disciplines and backgrounds to review research themes, identify gaps and questions, and develop a research agenda for furthering understanding about KE. Results include 80 research questions prefaced by a review of research themes. Key conclusions are: (1) there is a diverse range of questions relating to KE that require attention; (2) there is a particular need for research on understanding the process of KE and how KE can be evaluated; and (3) given the strong interdependency of research questions, an integrated approach to understanding KE is required. To improve understanding of KE, action research methodologies and embedding evaluation as a normal part of KE research and practice need to be encouraged. This will foster more adaptive approaches to learning about KE and enhance effectiveness of environmental management.
Carbon nanotubes (CNTs) offer great potential for advanced sensing devices due to their unique electronic transport properties. However, a significant obstacle to the realization of practical CNT devices is the formation of controlled, reliable and reproducible CNT-to-metal contacts. In this work, a procedure for the deposition and alignment of CNTs onto metallic electrodes using chemically functionalized lithographic patterns is reported. This method uses photolithography and electron-beam lithography to pattern simple Cr/Au thin film circuits on oxidized Si substrates. The circuits are then re-patterned with a self-assembled monolayer (SAM) of 3-aminopropyltriethoxysilane (APTES) to specify desired CNT locations between electrodes. The application of an electric field to the metallic contacts during the deposition of solution-suspended single-walled CNTs causes alignment of the CNTs in the field direction. This method consistently produces aligned CNTs in the defined locations.
A dual ion beam deposition system was used to deposit thin films of CNx from a carbon target. A 1 keV nitrogen ion beam from a 3 cm Kaufman source was used to sputter carbon from a graphite target, and a second nitrogen ion beam of 50 eV, from an RF ion source, was used to bombard the growing film with nitrogen ions. Using this technique, rather than direct ion beam deposition from methane, it is possible to reduce the amount of hydrogen in these films to less than 5% (atomic), and to boost the nitrogen content to over 30%. These films were then subjected to isochronal heating up to 900°C to determine the stability of the films as compared to those with much higher concentrations of hydrogen.
CNx is a material that is difficult to fabricate without the inclusion of large amounts of hydrogen. A high hydrogen content has the tendency to make the material sensitive to property changes as it is heated over 200°C. Concomitant with a loss of hydrogen is the loss of nitrogen.
In the films that had lower amounts of hydrogen, it was found that the loss of nitrogen during heating was delayed until higher temperatures were reached. However, instead of hydrogen being evolved during heating, the amount of hydrogen in the film increased, reaching a maximum concentration of ∼45% at 800°C.
Unbalanced magnetron sputtering deposition of CNxHy films has been performed with various levels of negative substrate bias and with different flow rates of nitrogen and hydrogen. Argon was used as a sputtering gas and formed the majority of the gas in the plasma. The elemental concentrations of the films were measured in samples deposited on glassy carbon, using a 2.2 MeV He beam to perform simultaneous RBS and ERS. Argon was found to be trapped in the non-hydrogenated films at levels of up to ∼4.6%. The concentration of argon increased for the films deposited under higher negative bias. With the introduction of hydrogen, argon trapping was first minimized and then completely eliminated, even at higher bias conditions. This suggests that the softness of the films brought on by hydrogenation prevented them from trapping argon during growth, and thus that argon stability depends on burial below a surface with particular structural properties.
Several conditions that allow the preservation, storage and rapid, efficient recovery of viable Acanthamoeba castellanii organisms were investigated. The viability of trophozoites (as determined by time to confluence) significantly declined over a period of 12 months when stored at −70°C using dimethyl sulfoxide (DMSO; 5 or 10%) as cryopreservant. As A. castellanii are naturally capable of encystment, studies were undertaken to determine whether induced encystment might improve the viability of organisms under a number of storage conditions. A. castellanii cysts stored in the presence of Mg2+ at 4°C remained viable over the study period, although time to confluence was increased from approximately 8 days to approximately 24 days over the 12-month period. Storage of cysts at −70°C with DMSO (5 or 10%) or 40% glycerol (but not 80% glycerol) as cryopreservants increased their viability over the 12-month study period compared with those stored at room temperature. Continued presence of Mg2+ in medium during storage had no adverse effects and generally improved recovery of viable organisms. The present study demonstrates that A. castellanii can be stored as a non-multiplicative form inexpensively, without a need for cryopreservation, for at least 12 months, but viability is increased by storage at −70°C.
What biological factors make human communication possible? How do we process and understand language? How does brain damage affect these mechanisms, and what can this tell us about how language is organized in the brain? The field of neurolinguistics seeks to answer these questions, which are crucial to linguistics, psychology and speech pathology alike. This textbook, first published in 2007, introduces the central topics in neurolinguistics: speech recognition, word and sentence structure, meaning, and discourse - in both 'normal' speakers and those with language disorders. It moves on to provide a balanced discussion of key areas of debate such as modularity and the 'language areas' of the brain, 'connectionist' versus 'symbolic' modelling of language processing, and the nature of linguistic and mental representations. Making accessible over half a century of scientific and linguistic research, and containing extensive study questions, it will be welcomed by all those interested in the relationship between language and the brain.
In the previous chapter we outlined two opposing theories of the role that syntactic processing plays in sentence comprehension. According to one view – the modular theory, inspired by early psycholinguistic attempts to apply Chomsky's generative grammar – a specialized syntactic parser assigns grammatical structure to an input sentence, yielding an intermediate representation which strongly constrains the assignment of meaning, but which needs to be further operated upon by interpretive (semantic and pragmatic) processes to yield the full meaning of the utterance. According to the opposing view, dubbed the interactive model, sentence meanings are assigned incrementally to word sequences as soon as they are identified, making maximal use of whatever constraints can be applied from the speakers' tacit knowledge of the grammar of their language, pragmatic knowledge and expectations, or even collocational restrictions on word usage (such as habitual phrases or idioms). Sometimes these cues will conflict, in which case constraints may compete to produce local ambiguities which are usually resolved by further input.
In principle, it should be possible to decide between these opposing models (or some intermediate theory between the two) if we had some means of observing changes in state of the language processor as it steps through the input sentence in real time. We may never fully achieve this privileged perspective, but over the past two or three decades a variety of ‘on-line’ techniques, based initially upon behavioural reaction time measurements and latterly upon functional neural imaging techniques, have been devised, which arguably enable us to observe local fluctuations in ‘processing load’, as sentences are judged or comprehended in real time.
Thus far, we have not entirely neglected but certainly down-played the role of the lexicon in speech perception. In chapters 5 and 6 we sought to make a case that speech recognizers must be able to build phonological representations of possible word forms, purely on the basis of acoustic phonetic input. Otherwise, it is difficult to account for the robustness and flexibility of our ‘bottom-up’ speech recognition capabilities. But it is also true that the goal of speech recognition is to identify words in the service of understanding whole utterances, and that there are a host of ‘top-down’ lexical, semantic and discourse effects that arise as a consequence of lexical retrieval mechanisms. Such effects express themselves in (a) the different ways that we respond perceptually to words (e.g. kelp) versus non-words (whether pronounceable like klep – a possible word – or phonotactically illegal, like tlep), (b) neighbourhood effects, arising from the fact that particular words vary in the number of phonologically near neighbours that compete for matching to the acoustic signal, and (c) other effects, such as phoneme restoration (see below), which may or may not be lexical in origin, but nevertheless require explanation.
The account given in previous chapters has characterized speech perception as an active process whereby phonological forms are constructed from speech-specific (phonetic) features in the acoustic signal, via the application of specialized perceptual analysers that exploit tacit knowledge of the sound pattern of the language and the sound production constraints of the human vocal tract.
In the two preceding chapters, we have explored in a preliminary way two different paths to understanding the human ‘language faculty’ (Chomsky, 1965; Jackendoff, 1997) or our capacity for spoken language communication. The linguistic approach seeks to isolate and describe the elements of a system of spoken communication by studying varieties of linguistic expressions in the world's languages and human language in general. The neuropathological approach examines types of language breakdown in response to brain damage of various kinds. It is hoped that the search for parallels or correspondences in these two very different domains will yield empirical constraints on a theory of language that could not otherwise be discovered if these two strands of inquiry were conducted in isolation from one another. For example, a fundamental distinction that grammarians draw between lexis and rule in the architecture of the language faculty may turn out to have a correspondence – or not – in the classification of language pathologies, reflecting the organization of language capacities in the human brain. We have already provided you with some classical findings from these two domains, which provide at least a foundation for speculation and further inquiry.
However, it is time to draw some critical methodological distinctions in the interests of making our search for correspondences and a cross-disciplinary theory of language more precise. The distinctions that we draw here will anticipate issues discussed more fully in subsequent chapters.
In the previous chapter, we described language as ‘the most complex of human artefacts’. In this chapter we shall flesh out this claim with an overview of the major components of the linguistic system from the perspective of the linguist. Broadly speaking, any language may be viewed from three complementary perspectives: (a) as an internalized body of ‘tacit’ knowledge, (b) as a social construction or set of conventions shared by a language community or (c) as a natural object ‘out there’ in the external world (the ‘E-language’). The internal view of language (or ‘I-language’, as it is sometimes referred to by Chomskian linguistics) is clearly the most relevant perspective for the concerns of this book. The I-language consists of the personal knowledge base that each speaker of the language carries around in his/her head as to how meanings or intentions may be encoded in linguistic expressions. Language users rely on their I-language to interpret or decode other speakers' linguistic expressions and to encode their own meanings in the expressions that they produce. The I-language is usually thought of as a personal dictionary of word meanings and rules for utterance construction; an internal grammar, which we know how to use but cannot easily describe.
Because each speaker acquires language under unique circumstances, I-language grammars may vary somewhat from one speaker to another, in ways that are probably mostly inconsequential for communication between members of a speech community.
As we indicated in the previous chapter, a breakdown at the discourse level of language comprehension would be expected to reveal itself in difficulties of reference retrieval and failure to successfully construct and maintain a mental model that serves the interlocutors engaged in a particular discourse. Discourse construction, insofar as it involves formulating communicative intentions, reference management and taking account of the listener's perspective, places high demands on working memory and attentional resources. Deficits in these higher cognitive abilities are likely to result in violations of the Gricean pragmatic felicity conditions mentioned in the previous chapter. The spoken language which results from poor discourse model construction or management may manifest itself in incoherent or bizarre speech that is likely to be characterized as ‘thought disordered’ in the psychiatric literature (Andreasen, 1982).
Thought disorder is traditionally clinically characterized in terms of either ‘looseness or bizarreness of association’ between ideas, or as an absence of appropriate expressions which enable the listener to construct a coherent model of what the speaker is talking about. The term formal thought disorder is often used specifically to indicate that what is being referred to is the ‘form’ of thought or its overt expression, and not necessarily a pathology of an underlying cognitive process or condition, which might nevertheless be responsible for the production of thought disordered speech.
There has been much debate about the underlying cognitive pathology of thought disordered speech. The symptom is most closely identified with schizophrenia in its acute phase.
This chapter seeks to ‘let the brain do the talking’ about how it organizes itself for language. Our approach is consistent with the co-evolution hypothesis of chapter 1, and a long-established principle that biological systems evolve new capabilities by reconfiguring or adding an emergent layer of control upon systems already evolved to serve more basic and often quite unrelated biological functions. Thus, three functionally distinct systems for breathing, coughing (expelling foreign bodies from the windpipe) and deglutition (chewing and swallowing food) were harnessed into a single co-ordinated system for controlling the airstream, voicing and articulation mechanisms for the emergent function of speech production. Similarly, human language capabilities most likely emerged as a reconfiguration of pre-linguistic (or pre-symbolic) systems of perceptual representation, memory and response planning, which in turn evolved from more primitive sensory-motor (stimulus–response) control systems.
Of course, the brain cannot speak for itself, so we are obliged to adopt the next best course and view our subject matter from the perspective of those whose principal concern was/is the understanding of the brain and who were bold (or foolish) enough to extend their inquiries to the question of how the brain represents language. We begin by reviewing the classical clinical findings from the history of aphasiology to acquaint the reader with the major symptom clusters of speech and language disorder and to provide a first-approximation model of how language may be represented in the brain.