Hala Sultan Tekke is a large Bronze Age city on the southeastern littoral of Cyprus. According to the archaeological evidence, the city flourished from approximately 1650 to 1150 BC. Since 2010, Swedish excavations have exposed four new city quarters (CQ1–4) with three occupational phases, the 14C dating of which is of great importance for other contemporaneous cultures as well. The finds demonstrate vast intercultural connections across the Mediterranean and even with southern Scandinavia. In 2014, roughly 500 m east of CQ1, one of the richest cemeteries on the island was discovered. The finds from the city date mainly to the 13th and 12th centuries BC; however, many of the wealthy tombs and offering pits in the cemetery are considerably older, the oldest finds dating to the 16th century BC. This raises the question of where the city quarters contemporary with the oldest cemetery finds are situated. Because of the numerous imports from a vast area, the radiocarbon (14C) dates from Hala Sultan Tekke strongly influence the dating of related sites. We present here new 14C data obtained in the course of the current excavations, which add to the already existing datasets.
We report on a case study of radiocarbon (14C) measurements applied to three pairs of tusks from African elephants, reportedly hunted in the 1960s in Tanzania and/or Kenya. The 14C results of 1.40 to 1.60 F14C fall within the range of 14C bomb-peak values between 1960 and 1975, thus confirming the suspected hunting period. Since trade in ivory from African elephants killed after 1989 is banned under the international CITES convention, the investigated tusks are not affected by this ban.
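To illustrate why an F14C value in this range pins the date to the bomb-peak era, here is a minimal sketch of matching a measurement against the atmospheric bomb curve. The curve points below are coarse, illustrative approximations of Northern Hemisphere atmospheric F14C, not a real calibration record; actual work would use a published dataset (e.g., Hua et al.) and dedicated calibration software.

```python
import numpy as np

# (year, F14C) -- illustrative points only, NOT a real calibration dataset
bomb_curve = np.array([
    (1955, 1.00), (1960, 1.25), (1963, 1.90), (1965, 1.80),
    (1970, 1.55), (1975, 1.40), (1980, 1.28), (1990, 1.15),
])

def candidate_years(f14c, curve=bomb_curve):
    """Return calendar years at which the curve crosses the measured F14C.

    The bomb curve is double-valued: a single F14C measurement typically
    matches one year on the rising limb and one on the falling limb.
    """
    years = []
    for (y0, f0), (y1, f1) in zip(curve[:-1], curve[1:]):
        lo, hi = sorted((f0, f1))
        if lo <= f14c <= hi and f0 != f1:
            # linear interpolation within the bracketing segment
            years.append(y0 + (y1 - y0) * (f14c - f0) / (f1 - f0))
    return years

for value in (1.40, 1.60):
    print(value, [round(y, 1) for y in candidate_years(value)])
# F14C between 1.40 and 1.60 brackets dates in the early 1960s (rising
# limb) or roughly 1969-1975 (falling limb), consistent with the abstract.
```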
Hydrilla is an invasive aquatic plant that has spread rapidly through many inland water bodies across the globe by outcompeting native aquatic plants. The negative impacts of hydrilla invasion have become a concern for water resource management authorities, power companies, and environmental scientists. Early detection of hydrilla infestation is very important for reducing the costs associated with control and removal of this invasive species. Therefore, in this study, we aimed to develop a tool for rapid, frequent, and large-scale monitoring and prediction of the spatial extent of hydrilla habitat. This was achieved by integrating in situ and Landsat 8 Operational Land Imager satellite data for Lake J. Strom Thurmond, the largest US Army Corps of Engineers lake east of the Mississippi River, located on the Georgia–South Carolina border. The predictive model for the presence of hydrilla incorporated radiometric and physical measurements, including remote-sensing reflectance, Secchi disk depth (SDD), light-attenuation coefficient (Kd), maximum depth of colonization (Zc), and percentage of light available through the water column (PLW). The model predicted ideal habitat for hydrilla to feature high SDD, Zc, and PLW values and low Kd values. Monthly analyses based on satellite images showed that hydrilla starts growing in April, reaches peak coverage around October, retreats over the following months, and disappears in February. Analysis of physical and meteorological factors (i.e., water temperature, surface runoff, net inflow, precipitation) revealed that these parameters are closely associated with hydrilla extent. Management agencies can use these results not only to plan removal efforts but also to evaluate and adapt their current mitigation efforts.
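As a concrete illustration of how the optical variables in the model relate to one another, here is a minimal sketch using standard first-order approximations; the Poole-Atkins-type constant and the light-requirement threshold are assumptions for illustration, not values from the study.

```python
import math

def light_attenuation(sdd_m: float) -> float:
    """Kd (1/m) from Secchi disk depth via the classical
    approximation Kd ~ 1.7 / SDD (an assumed constant)."""
    return 1.7 / sdd_m

def plw(kd: float, depth_m: float) -> float:
    """Percentage of surface light remaining at depth (Beer-Lambert)."""
    return 100.0 * math.exp(-kd * depth_m)

def max_depth_of_colonization(kd: float, light_req_pct: float = 5.0) -> float:
    """Zc: depth at which light falls to the (assumed) minimum
    fraction a macrophyte needs in order to colonize."""
    return -math.log(light_req_pct / 100.0) / kd

sdd = 2.0                    # hypothetical Secchi disk depth, m
kd = light_attenuation(sdd)
print(f"Kd  = {kd:.2f} 1/m")
print(f"PLW at 2 m = {plw(kd, 2.0):.1f} %")
print(f"Zc  = {max_depth_of_colonization(kd):.1f} m")
# Clearer water (higher SDD) gives lower Kd, deeper Zc, and higher PLW:
# exactly the combination the model identifies as ideal hydrilla habitat.
```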
In many optimization problems arising from scientific, engineering, and artificial intelligence applications, objective and constraint functions are available only as the output of a black-box or simulation oracle that does not provide derivative information. Such settings necessitate the use of methods for derivative-free, or zeroth-order, optimization. We provide a review of and perspectives on these methods, with an emphasis on recent developments and on the unified treatment of such problems in the non-linear optimization and machine learning literature. We categorize methods based on the assumed properties of the black-box functions, as well as on features of the methods themselves. We first give an overview of the primary setting: deterministic methods applied to unconstrained, non-convex optimization problems where the objective function is defined by a deterministic black-box oracle. We then discuss developments in randomized methods, methods that assume additional structure in the objective (including convexity, separability, and general non-smooth compositions), methods for problems where the output of the black-box oracle is stochastic, and methods for handling different types of constraints.
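To give a feel for the simplest class of such methods, here is a minimal sketch of compass (coordinate) search, a classical deterministic direct-search method of the kind surveyed in this setting; it is a toy illustration, not code from the review.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Minimize f using only zeroth-order (function-value) information.

    Polls the 2n coordinate directions and shrinks the step size when
    no direction improves -- the standard globalization strategy for
    direct-search methods.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for d in np.vstack([np.eye(len(x)), -np.eye(len(x))]):
            trial = x + step * d
            f_trial = f(trial)
            if f_trial < fx:          # accept the first improving poll point
                x, fx, improved = trial, f_trial, True
                break
        if not improved:
            step *= 0.5               # no improvement: refine the mesh
            if step < tol:
                break
    return x, fx

# Usage on a smooth black-box objective (no derivatives supplied):
rosenbrock = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
x_best, f_best = compass_search(rosenbrock, [-1.2, 1.0])
print(x_best, f_best)   # slowly approaches the minimizer at (1, 1)
```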
Objectives: The Tower of London (TOL) test has arguably become the most widely used task for assessing planning ability in clinical and experimental settings. Since its introduction, efforts have been made to provide a task version with adequate psychometric properties, but extensive normative data have not been publicly available until now. The computerized TOL-Freiburg Version (TOL-F) was developed on the basis of theory-grounded task analyses, and its psychometric adequacy has been repeatedly demonstrated, though often with small and selective samples. Method: In the present study, we report reliability estimates and normative data for the TOL-F stratified for age, sex, and education from a large population-representative sample collected in the Gutenberg Health Study in Mainz, Germany (n=7703; 40–80 years). Results: The present data confirm previously reported adequate indices of reliability (>.70) for the TOL-F. We also provide normative data for the TOL-F stratified for age (5-year intervals), sex, and education (low vs. high). Conclusions: Together, its adequate reliability and the representative age-, sex-, and education-fair normative data render the computerized TOL-F a suitable diagnostic instrument for assessing planning ability. (JINS, 2019, 25, 520–529)
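As an illustration of how stratified norms of this kind are typically derived, here is a minimal sketch that builds percentile tables by age band, sex, and education. The column names and the simulated scores are hypothetical placeholders, not the Gutenberg Health Study data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 7703
df = pd.DataFrame({
    "age": rng.integers(40, 81, n),
    "sex": rng.choice(["f", "m"], n),
    "education": rng.choice(["low", "high"], n),
    "tol_score": rng.normal(20, 4, n).round(),  # toy planning-accuracy scores
})

# 5-year age bands, as in the reported norms
df["age_band"] = pd.cut(df["age"], bins=range(40, 86, 5), right=False)

norms = (
    df.groupby(["age_band", "sex", "education"], observed=True)["tol_score"]
      .quantile([0.05, 0.25, 0.50, 0.75, 0.95])
      .unstack()
)
print(norms.head())
# An individual's raw score is then located in the row matching their
# age band, sex, and education to read off a percentile rank.
```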
Paramphistomosis, caused by Calicophoron daubneyi, is an emerging infection of ruminants throughout Western Europe. Despite its prevalence, many questions remain regarding the basic biology of this parasite and how it interacts with its host. Consequently, there is a need to develop methods for studying C. daubneyi in vitro to improve our understanding of rumen fluke biology. To this end, we aimed to identify a suitable protocol for in vitro excystment of C. daubneyi metacercariae. Six methods that have been used to excyst metacercariae of a number of trematode species were tested with C. daubneyi metacercariae. Three of these achieved an average of >50% excystment, whilst one method, which included an acid-pepsin treatment, incubation under reducing conditions, and an alkaline/bile salt solution to activate the larvae, consistently gave >80% excystment. The latter protocol also showed no detrimental effect on the motility of newly excysted juvenile (NEJ) parasites observed for up to 24 h in RPMI 1640 medium post-excystment. The successful production of C. daubneyi NEJs in vitro is a significant step forward that will enable the discovery of infective-stage-specific parasite antigens and facilitate drug-screening trials, aiding the development of much-needed diagnostic and therapeutic options for paramphistomosis.
Zinc is an essential trace element necessary for the activity of numerous enzymes. Supplementing the diets of ruminant livestock with zinc is routine practice to ensure that requirements are met. Although zinc deficiency is not generally recognised in the UK, there is considerable evidence that such supplemental zinc is beneficial. The aim of this study was to investigate the effect of partially replacing zinc oxide with a zinc proteinate in the diet of ewes in late pregnancy and lactation on the performance and health of ewes and lambs.
There are many rationing models used commercially for evaluating diets fed to dairy cows. A new model, BioParaMilk, uses a unique protein degradation model to determine microbial protein synthesis, based upon the in vitro gas production technique (IVGPT). Optigen®, a slow-release, blended, non-protein nitrogen source, can partially replace soyabean meal (SBM) in a dairy diet. The partial replacement of soyabean meal and rapeseed meal with Optigen® has been shown to increase fibre digestion and may improve volatile fatty acid (VFA) and microbial nitrogen (N) flow in the rumen (Sinclair et al., 2008). The purpose of this study was to evaluate the protein degradation curve of Optigen® compared with that of SBM, using the IVGPT, for use in this new model.
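For context, degradation/gas-production curves from the IVGPT are commonly summarized by fitting an exponential model of the Ørskov-and-McDonald type, p(t) = a + b(1 − e^(−ct)). Here is a minimal sketch of such a fit; the data points are synthetic placeholders, not trial measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def gas_curve(t, a, b, c):
    """Cumulative gas production (ml/g) at incubation time t (h)."""
    return a + b * (1.0 - np.exp(-c * t))

# hypothetical incubation times (h) and cumulative gas volumes (ml/g)
t_obs = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)
gas_obs = np.array([8, 15, 27, 36, 52, 63, 66], dtype=float)

(a, b, c), _ = curve_fit(gas_curve, t_obs, gas_obs, p0=(5.0, 60.0, 0.05))
print(f"a = {a:.1f} ml/g  (rapidly fermentable fraction)")
print(f"b = {b:.1f} ml/g  (slowly fermentable fraction)")
print(f"c = {c:.3f} /h    (fractional degradation rate)")
# Comparing fitted parameters for Optigen and SBM incubations would
# quantify the difference in their degradation kinetics.
```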
The introduction of the Single Farm Payment support system sees a change from headage to area payments. The removal of the Beef Special Premium for steers is likely to see a move towards either 12-15 month intensive finishing systems or low-input, extensive, grass-based 24-30 month finishing systems. Late-maturing breed types reared on the latter system may, however, require a 2-3 month intensive finishing period to achieve adequate fat cover. With falling cereal prices there is increased interest in their use in beef cattle rations. Antibiotic-based feed additives, e.g. monensin sodium, have been used successfully for over 40 years to manipulate microbial activity and improve beef cattle performance. The use of monensin sodium will be banned from January 2006, and there is therefore a requirement to find alternative ‘natural’ products that can improve the efficiency of beef production on intensive cereal-based rations. Yeast cultures are composed of yeast (Saccharomyces cerevisiae) and the medium on which it was grown. These products are dried in a manner that preserves the fermenting activity of the yeast. It is suggested that the production responses associated with live yeast culture supplements in ruminants may be related to their stimulatory effects on specific groups of micro-organisms in the rumen. The objective of this study was to determine the effect of feeding a live yeast culture (Yea-Sacc1026) on the performance of cereal-fed beef cattle.
There has been growing public concern with regard to the effects of heavy metals excreted by animals on the environment. Heavy metals can be lost from manure and slurry to ground and surface waters, and subsequently affect water quality (Van Horn et al., 1996). In regions of high domestic livestock production, farmers are therefore seeking means to reduce the quantity of heavy metals excreted by animals and subsequently deposited onto the land (Van Horn et al., 1996). The objective of this study was to establish the effect of two levels of dietary Zn inclusion, in two different forms (organically chelated vs. inorganic), on apparent absorption and retention in dairy cattle.
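For reference, here is a minimal sketch of the balance-trial arithmetic behind apparent absorption and retention of a mineral such as Zn; the figures are hypothetical placeholders, not results from this study.

```python
def zn_balance(intake_mg, faecal_mg, urinary_mg, milk_mg):
    """Daily Zn balance for a lactating cow (all values mg/day)."""
    apparent_absorption = intake_mg - faecal_mg
    absorption_coeff = apparent_absorption / intake_mg
    retention = intake_mg - faecal_mg - urinary_mg - milk_mg
    return apparent_absorption, absorption_coeff, retention

# hypothetical daily values for one cow
absorbed, coeff, retained = zn_balance(
    intake_mg=1500, faecal_mg=1200, urinary_mg=30, milk_mg=120)
print(f"apparently absorbed: {absorbed} mg/d ({coeff:.0%} of intake)")
print(f"retained:            {retained} mg/d")
# Comparing these quantities across the chelated and inorganic Zn
# treatments at each inclusion level is the comparison the study set up.
```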
Yeasts are the first micro-organisms to become active in silage upon exposure to air, using the residual sugars and lactic acid to produce carbon dioxide. Maize silage is particularly prone to spoilage because it tends to have a higher concentration of water-soluble carbohydrates, which are considered a better substrate for micro-organisms than volatile fatty acids (Auerbach et al., 1998). The aim of this experiment was to measure the effect of inoculating maize silage with Maize-all GS (inoculant) and Sil-all Fireguard (inoculant and preservative) on aerobic stability.
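Aerobic stability is conventionally quantified as the time taken for the silage to heat a fixed margin (commonly 2°C) above ambient temperature after opening. Here is a minimal sketch of that calculation; the temperature series are invented for illustration.

```python
def aerobic_stability_h(silage_temp_c, ambient_c, margin_c=2.0, interval_h=6):
    """Hours of aerobic exposure before heating exceeds the margin."""
    for i, temp in enumerate(silage_temp_c):
        if temp > ambient_c + margin_c:
            return i * interval_h
    return len(silage_temp_c) * interval_h  # stable for the whole trial

# hypothetical 6-hourly readings for untreated vs. inoculated silage
ambient = 20.0
untreated  = [20.1, 20.3, 20.9, 22.5, 25.8, 30.2]
inoculated = [20.0, 20.1, 20.2, 20.4, 20.9, 21.3]
print("untreated :", aerobic_stability_h(untreated, ambient), "h")
print("inoculated:", aerobic_stability_h(inoculated, ambient), "h")
# A longer time before heating indicates better aerobic stability.
```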
Major depression and anxiety disorders are known to negatively influence cognitive performance, and there is evidence for greater cognitive decline in older adults with generalized anxiety disorder. Outside clinical studies, however, the relation between complex executive planning functions and subclinical levels of anxiety has not been examined in a population-based sample with a broad age range.
Planning performance was assessed using the Tower of London task in a population-based sample of 4240 participants aged 40–80 years from the Gutenberg Health Study (GHS) and related to self-reported anxiety and depression by means of multiple linear regression analysis.
Higher anxiety ratings were associated with lower planning performance (β = −0.20; p < 0.0001) independent of age (β = 0.03; p = 0.47). When directly comparing the predictive value of depression and anxiety on cognition, only anxiety attained significance (β = −0.19; p = 0.0047), whereas depression did not (β = −0.01; p = 0.71).
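As an illustration of the analysis reported above, here is a minimal sketch of a multiple linear regression with standardized coefficients; the variable names and simulated data are placeholders, not the GHS dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import zscore

rng = np.random.default_rng(1)
n = 4240
df = pd.DataFrame({
    "anxiety": rng.normal(size=n),
    "depression": rng.normal(size=n),
    "age": rng.uniform(40, 80, n),
})
# toy outcome in which anxiety, not depression, carries the signal
df["planning"] = -0.2 * df["anxiety"] + rng.normal(size=n)

# z-standardizing the outcome and predictors makes the OLS slopes
# directly comparable standardized betas, as reported in the abstract
z = df.apply(zscore)
X = sm.add_constant(z[["anxiety", "depression", "age"]])
model = sm.OLS(z["planning"], X).fit()
print(model.params.round(3))
print(model.pvalues.round(4))
```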
Subclinical levels of anxiety, but not of depression, showed negative associations with cognitive functioning independent of age. Our results demonstrate that associations observed in clinical groups might differ from those in population-based samples, also with regard to the trajectory across the life span. Further studies are needed to uncover the causal interrelations of anxiety and cognition proposed in the literature, in order to develop interventions aimed at reducing this negative affective state and improving executive functioning.
It has now been some years since I completed the first draft of the first edition of this book. In this time, I have learned much from many collaborators and I am grateful to them. During the past few years, Mario Berta, Nilanjana Datta, Saikat Guha, and Andreas Winter have strongly shaped my thinking about quantum information theory, and Mario and Nilanjana in particular have influenced my technical writing style, which is reflected in the new edition of the book. Also, the chance to work with them and others has led me to new research directions in quantum information theory that I never would have imagined on my own.
I am also thankful to Todd Brun, Paul Cuff, Ludovico Lami, Ciara Morgan, and Giannicola Scarpa for using the book as the main text in their graduate courses on quantum information theory and for their feedback. One can try as much as possible to avoid typos in a book, but inevitably they seem to show up in unexpected places. I am grateful to many people for pointing out typos or errors and for suggesting how to fix them, including Todd Brun, Giulio Chiribella, Paul Cuff, Dawei (David) Ding, Will Matthews, Milan Mosonyi, David Reeb, and Marco Tomamichel. I also thank Corsin Pfister for helpful discussions about unique linear extensions of quantum physical evolutions. I am grateful to David Tranah and the editorial staff at Cambridge University Press for their help with publishing the second edition.
So what's new in the second edition? Suffice it to say that every page of the book has been rewritten and there are over 100 pages of new material! I formulated many thoughts about the revision during fall 2013 while teaching a graduate course on quantum information at LSU, and I then formulated many more thoughts and made the actual changes during fall 2015 (when teaching it again). In that regard, I am thankful to both the Department of Physics and Astronomy and the Center for Computation and Technology at LSU for providing a great environment and support. I also thank the graduate students at LSU who gave feedback during and after lectures. There are many little changes throughout that will probably go unnoticed.
We cannot overstate the importance of Shannon's contribution to modern science. His introduction of the field of information theory and his solutions to its two main problems demonstrate that his ideas on communication were far ahead of the other prevailing ideas in this domain around 1948.
In this chapter, our aim is to discuss Shannon's two main contributions in a descriptive fashion. The goal of this high-level discussion is to build up intuition for the problem domain of information theory and to understand its main concepts before we delve into the analogous quantum information-theoretic ideas. We avoid going into deep technical detail in this chapter, leaving such details for later chapters where we formally prove both classical and quantum Shannon-theoretic coding theorems. We do use some mathematics from probability theory (namely, the law of large numbers).
We will be delving into the technical details of this chapter's material in later chapters (specifically, Chapters 10, 13, and 14). Once you have reached later chapters that develop some more technical details, it might be helpful to turn back to this chapter to get an overall flavor for the motivation of the development.
We first discuss the problem of data compression. Those who are familiar with the Internet have used several popular data formats such as JPEG, MPEG, ZIP, GIF, etc. All of these file formats have corresponding algorithms for compressing the output of an information source. A first glance at the compression problem might lead one to believe that it is possible to compress the output of the information source to an arbitrarily small size, but Shannon proved that this is not the case: there is a fundamental limit on compressibility. This result is the content of Shannon's first noiseless coding theorem.
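For reference, a compact statement of that limit in its standard form (not a quotation from the text): for an i.i.d. source X, reliable compression is possible at any rate R > H(X) bits per symbol and impossible at any rate R < H(X), where the Shannon entropy is

\[ H(X) = -\sum_{x} p_X(x) \log_2 p_X(x). \]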
An Example of Data Compression
We begin with a simple example that illustrates the concept of an information source. We then develop a scheme for coding this source so that it requires fewer bits to represent its output faithfully.
Suppose that Alice is a sender and Bob is a receiver. Suppose further that a noiseless bit channel connects Alice to Bob—a noiseless bit channel is one that transmits information perfectly from sender to receiver, e.g., Bob receives “0” if Alice transmits “0” and Bob receives “1” if Alice transmits “1.”
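To make the coding idea concrete, here is a small sketch with a hypothetical four-symbol source; the probabilities and codewords are illustrative choices, not necessarily those used in the text's own example.

```python
from math import log2

# a hypothetical skewed source: frequent symbols get short codewords
probs = {"a": 1/2, "b": 1/4, "c": 1/8, "d": 1/8}
code  = {"a": "0", "b": "10", "c": "110", "d": "111"}  # prefix-free

entropy = -sum(p * log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)

print(f"entropy        : {entropy:.2f} bits/symbol")  # 1.75
print(f"average length : {avg_len:.2f} bits/symbol")  # 1.75
# A naive fixed-length encoding of four symbols costs 2 bits each;
# this variable-length code achieves 1.75 bits per symbol on average,
# matching the entropy, i.e., Shannon's compression limit for this source.
```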
We have now seen in Chapters 20–22 how Alice can communicate classical or quantum information to Bob, perhaps even with the help of shared entanglement. One might argue that these communication tasks are the most fundamental tasks in quantum Shannon theory, given that they have furthered our understanding of the nature of information transmission over quantum channels. However, when discussing the communication of classical information, we made no stipulation as to whether this classical information should be public, so that any third party might have partial or full access to it, or private, so that no third party has access.
This chapter establishes the private classical capacity theorem, which gives the maximum rate at which Alice can communicate classical information privately to Bob without anyone else in the universe knowing what she sent to him. A variation of the information-processing task corresponding to this theorem was one of the earliest studied in quantum information theory, with the Bennett–Brassard-84 quantum key distribution protocol being the first proposed protocol for exploiting quantum mechanics to establish a shared secret key between two parties. The private classical capacity theorem is important for quantum key distribution because it establishes the maximum rate at which two parties can generate a shared secret key.
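For orientation, the quantity at the heart of this theorem is the private information of the channel, which in its standard form reads

\[ P^{(1)}(\mathcal{N}) = \max_{\rho_{XA'}} \left[ I(X;B)_{\sigma} - I(X;E)_{\sigma} \right], \]

where the maximization is over classical-quantum inputs, B is Bob's output, and E is the share of the channel's environment (Eve); the private classical capacity is obtained by regularizing this quantity over many uses of the channel. (This is the standard formula; the chapter itself develops the precise definitions.)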
Another equally important, but less obvious utility of private classical communication is in establishing a protocol for quantum communication at the coherent information rate. Section 22.2 demonstrated a somewhat roundabout way of arriving at the conclusion that it is possible to communicate quantum information reliably at the coherent information rate—recall that we “coherified” the entanglement-assisted classical capacity theorem and then exploited the coherent communication identity and catalytic use of entanglement. Establishing achievability of the coherent information rate via private classical coding is another way of arriving at the same result, with the added benefit that the resulting protocol does not require the catalytic use of entanglement.
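For reference, the coherent information rate mentioned here has the standard form

\[ Q(\mathcal{N}) \geq \max_{\rho} \left[ H(B)_{\sigma} - H(E)_{\sigma} \right], \]

where σ is the state resulting from sending ρ through an isometric extension of the channel; the private-coding route establishes the achievability of this rate without the catalytic use of entanglement.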
The intuition for quantum communication via privacy arises from the no-cloning theorem. Suppose that Alice is able to communicate private classical messages to Bob, so that the channel's environment (Eve) is not able to distinguish which message Alice is transmitting to Bob.
I began working on this book in the summer of 2008 in Los Angeles, with much time to spare in the final months of dissertation writing. I had a strong determination to review quantum Shannon theory, a beautiful area of quantum information science that Igor Devetak had taught me three years earlier at USC in fall 2005. I was carefully studying a manuscript entitled “Principles of Quantum Information Theory,” a text that Igor had initiated in collaboration with Patrick Hayden and Andreas Winter. I read this manuscript many times, and many parts of it I understood well, though other parts I did not.
After a few weeks of reading and rereading, I decided, “if I can write it out myself from scratch, perhaps I would then understand it!”, and thus began the writing of the chapters on the packing lemma, the covering lemma, and quantum typicality. I knew that Igor's (now former) students Min-Hsiu Hsieh and Zhicheng Luo knew the topic well because they had already written several quality research papers with him, so I asked if they could meet with me weekly for an hour to review the fundamentals. They kindly agreed and helped me quite a bit in understanding the packing and covering techniques.
After graduating, I began collaborating with Min-Hsiu on a research project that Igor had suggested to the both of us: “find the triple trade-off capacity formulas of a quantum channel.” This was perhaps the best starting point for me to learn quantum Shannon theory because proving this theorem required an understanding of most everything that had already been accomplished in the area. After a month of effort, I continued to work with Min-Hsiu on this project while joining Andreas Winter's Singapore group for a two-month visit. As I learned more, I added more to the notes, and they continued to grow.
After landing a job in the DC area for January 2009, I realized that I had almost enough material for teaching a course, and so I contacted local universities in the area to see if they would be interested. Can Korman, formerly chair of the Electrical Engineering Department at George Washington University, was excited about the possibility.
In these first few chapters, our aim is to establish a firm grounding so that we can address some fundamental questions regarding information transmission over quantum channels. This area of study has become known as “quantum Shannon theory” in the broader quantum information community, in order to distinguish this topic from other areas of study in quantum information science. In this text, we will use the terms “quantum Shannon theory” and “quantum information theory” somewhat interchangeably. We will begin by briefly overviewing several fundamental aspects of the quantum theory. Our study of the quantum theory, in this chapter and future ones, will be at an abstract level, without giving preference to any particular physical system such as a spin-1/2 particle or a photon. This approach will be more beneficial for the purposes of our study, but, here and there, we will make some reference to actual physical systems to ground us in reality.
You may be wondering: what is quantum Shannon theory, and why do we name this area of study as such? In short, quantum Shannon theory is the study of the ultimate capability of noisy physical systems, governed by the laws of quantum mechanics, to preserve information and correlations. Quantum information theorists have chosen the name quantum Shannon theory to honor Claude Shannon, who single-handedly founded the field of classical information theory with a groundbreaking paper (Shannon, 1948). In particular, the name refers to the asymptotic theory of quantum information, which is the main topic of study in this book. Information theorists since Shannon have dubbed him the “Einstein of the information age.” The name quantum Shannon theory is fitting for this area of study because we often use quantum versions of Shannon's ideas to prove some of its main theorems.
We prefer the name “quantum Shannon theory” over such names as “quantum information science” or just “quantum information.” These other names are too broad, encompassing subjects as diverse as quantum computation, quantum algorithms, quantum complexity theory, quantum communication complexity, entanglement theory, quantum key distribution, quantum error correction, and even the experimental implementation of quantum protocols.