Following parturition, contamination of the uterine lumen by bacteria is ubiquitous, and uterine health is impaired in cattle because infection persists in 10% to 15% of animals as endometritis. Endometritis causes infertility for the duration of infection, and subfertility persists even after apparently successful resolution of the disease. Escherichia coli is the pathogenic bacterium most frequently isolated from the post partum uterus, and is associated with increased concentrations of peripheral plasma acute phase proteins and fetid vaginal mucus. The presence of E. coli is also associated with slower growth of the first post partum dominant follicle and perturbed oestradiol secretion. Furthermore, in animals that ovulate the first dominant follicle, the corpus luteum is smaller and secretes less progesterone. The endotoxin lipopolysaccharide (LPS), which is released from E. coli, can pass from the uterine lumen to the peripheral circulation, and LPS concentrations are increased in cows with uterine infection. Infusion of E. coli LPS into the uterine lumen suppresses the pre-ovulatory luteinising hormone surge and disrupts ovulation in heifers. In vitro, endometrial explants produce prostaglandins in response to LPS. Addition of LPS or E. coli to stromal or epithelial cells increases cyclooxygenase-2 mRNA expression, and stimulates the production of prostaglandin E2 and prostaglandin F2α. Furthermore, uterine and ovarian cells express mRNA of the molecules required for recognition of LPS, Toll-like receptor-4 and CD14. In summary, E. coli is a common cause of infertility involving the perturbation of the hypothalamus, pituitary and ovary in dairy cows.
Two fundamentally different views of how proteins fold are now being debated. Do proteins fold through multiple unpredictable routes directed only by the energetically downhill nature of the folding landscape, or do they fold through specific intermediates in a defined pathway that systematically puts predetermined pieces of the target native protein into place? It has now become possible to determine the structure of protein folding intermediates, evaluate their equilibrium and kinetic parameters, and establish their pathway relationships. Results obtained for many proteins have serendipitously revealed a new dimension of protein structure. Cooperative structural units of the native protein, called foldons, unfold and refold repeatedly even under native conditions. Much evidence obtained by hydrogen exchange and other methods now indicates that cooperative foldon units, and not individual amino acids, account for the unit steps in protein folding pathways. The formation of foldons and their ordered pathway assembly systematically puts native-like foldon building blocks into place, guided by a sequential stabilization mechanism in which prior native-like structure templates the formation of incoming foldons with complementary structure. Thus the same propensities and interactions that specify the final native state, encoded in the amino-acid sequence of every protein, determine the pathway for getting there. Experimental observations that have been interpreted differently, in terms of multiple independent pathways, appear to be due to chance misfolding errors that cause different population fractions to block at different pathway points, populate different pathway intermediates, and fold at different rates. This paper summarizes the experimental basis for these three determining principles and their consequences. Cooperative native-like foldon units and the sequential stabilization process together generate predetermined stepwise pathways. Optional misfolding errors are responsible for 3-state and heterogeneous kinetic folding.
Arnold and Bowie (2003) attempt to derive ethical constraints on the actions of the managers of multinational enterprises (MNEs), or the MNEs themselves, from a Kantian perspective. We contest Arnold and Bowie's claims regarding MNE duties, in particular that MNEs have a duty to pay a subsistence wage above market levels. We conclude that even within Arnold and Bowie's Kantian framework such a duty does not properly emerge. In addition, we argue that the account of coercion used by Arnold and Bowie does not serve their purposes. Arnold and Bowie address consequentialist issues by arguing that their conclusions are not undercut by economic considerations regarding unemployment. We argue that Arnold and Bowie have misread the economic literature in this regard.
The effect of dietary intake of flavonols (predominantly quercetin) on oxidative DNA damage was studied in thirty-six healthy human subjects (sixteen men, twenty women). The study was a randomised crossover study, comprising two 14 d treatments of either a low-flavonol (LF) or high-flavonol (HF) diet with a 14 d wash-out period between treatments. Subjects were asked to avoid foods containing flavonols, flavones and flavanols during the LF dietary treatment period, and to consume one 150 g onion (Allium cepa) cake (containing 89·7 mg quercetin) and one 300 ml cup of black tea (containing 1·4 mg quercetin) daily during the HF dietary treatment. A 7 d food diary was kept during each dietary period and blood samples were taken after each dietary treatment. Products of oxidative damage to DNA bases were measured in DNA from leucocytes. The study had more than 95 % power to detect a change of 20 % in DNA damage products. Plasma vitamin C and plasma quercetin concentrations were also measured. No significant differences in intake of macronutrients or assessed micronutrients, measured DNA base damage products, or plasma vitamin C were found between the HF and LF dietary treatments. The plasma quercetin concentration was significantly higher after the HF dietary treatment period (228·5 (SEM 34·7) nmol/l) than after the LF dietary treatment period (less than the limit of detection, i.e. <66·2 nmol/l). These findings do not support the hypothesis that dietary quercetin intake substantially affects oxidative DNA damage in leucocytes.
Temporalis fascia, placed as an underlay graft, is commonly used to repair tympanic membrane perforations. Graft failure, however, is a well recognized complication. Grafts are often allowed to dry out during the procedure and, therefore, are often positioned in a dry or partially dehydrated state and only become fully rehydrated after placement. This study looked at how the size of the temporalis fascia alters with its state of hydration. The size of 20 temporalis fascia grafts was measured when fresh, after flattening and allowing them to dry, and finally after rehydrating the grafts with 0.9 per cent saline solution. Significant shrinkage was demonstrated. It is therefore proposed that a cause of increased failure rates, particularly in anterior myringoplasties, is loss of underlay due to graft rehydration and shrinkage. Thus graft shrinkage should be considered when positioning the graft.
Atomic profiles (SIMS) and cross-section TEM images of selectively etched, annealed profiles were studied for boron energies from 200 eV to 10 keV and RTP anneals at 900, 975 and 1050 °C. Consistent variations of dopant depth were obtained over this process range. TEM images showed evidence of lateral dopant variation near the edges of poly-Si gate structures, perhaps an effect of lateral straggling and reflection of ions from the poly mask.
We describe a rare case of acardius in a triplet pregnancy terminated by Caesarean section at 32 weeks' gestation. Morphological and chromosomal abnormalities of the fetus, as well as structural abnormalities of the placenta, are presented. Cytogenetic analysis and examination of the single-disc triplet placenta provide evidence for the two major theories of pathogenesis of acardius, the twin reversed arterial perfusion (TRAP) sequence and the genetic theory, which we believe are not necessarily mutually exclusive.
A large body of mathematics consists of facts that can be presented and described much like any other natural phenomenon. These facts, at times explicitly brought out as theorems, at other times concealed within a proof, make up most of the applications of mathematics, and are the most likely to survive changes of style and of interest.
This ENCYCLOPEDIA will attempt to present the factual body of all mathematics. Clarity of exposition, accessibility to the non-specialist, and a thorough bibliography are required of each author. Volumes will appear in no particular order, but will be organized into sections, each one comprising a recognizable branch of present-day mathematics. Numbers of volumes and sections will be reconsidered as times and needs change.
It is hoped that this enterprise will make mathematics more widely used where it is needed, and more accessible in fields in which it can be applied but where it has not yet penetrated because of insufficient information.
Thirty years ago, Claude Shannon published a paper with the title “A mathematical theory of communication”. In this paper, he defined a quantity, which he called entropy, that measures the uncertainty associated with random phenomena. The effects of this paper on communications in both theory and practice are still being felt, and his entropy function has been applied very successfully to several areas of mathematics. In particular, an extension of it to dynamical situations by A. N. Kolmogorov and Ja. G. Sinai led to a complete solution of a long-unsolved problem in ergodic theory, to a new invariant for differentiable dynamical systems, and to more precision in certain concepts in classical statistical mechanics.
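Shannon's entropy function referred to here is, in its standard finite form (the normalization and logarithm base used later in this book may differ), the quantity

```latex
H(p_1,\dots,p_n) \;=\; -\sum_{i=1}^{n} p_i \log p_i ,
\qquad \text{with the convention } 0 \log 0 := 0 ,
```

for a random phenomenon whose outcomes occur with probabilities \(p_1,\dots,p_n\). It vanishes exactly when some \(p_i = 1\) (no uncertainty) and attains its maximum \(\log n\) at the uniform distribution.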
Our intent in this book is to give a rather complete and self-contained development of the entropy function and its extension that is understandable to a reader with a knowledge of abstract measure theory as it is taught in most first-year graduate courses and to indicate how it has been applied to the subjects of information theory, ergodic theory, and topological dynamics. We have made no attempt to give a comprehensive treatment of these subjects; rather we have restricted ourselves to just those parts of the subject which have been influenced by Shannon's entropy and the Kolmogorov-Sinai extension of it. Thus, our purpose is twofold: first, to give a self-contained treatment of all the major properties of entropy and its extension, with rather detailed proofs, and second, to give an exposition of its uses in those areas of mathematics where it has been applied with some success.
Entropy is a subject which has played a central role in a number of areas such as statistical mechanics and information theory. The connections between the various applications of entropy have become clearer in recent years by the introduction of probability theory into its foundations. It is now possible to see a number of what were previously isolated results in various disciplines as part of a more general mathematical theory of entropy.
This volume presents a self-contained exposition of the mathematical theory of entropy. Those parts of probability theory which are necessary for an understanding of the central topics concerning entropy have been included. In addition, carefully chosen examples are given in order that the reader may omit proofs of some of the theorems and yet by studying these examples and discussion obtain insight into the theorems.
The last four chapters give a description of those parts of information theory, ergodic theory, statistical mechanics, and topological dynamics which are most affected by entropy. These chapters may be read independently of each other. The examples show how ideas originating in one area have influenced other areas. Chapter III contains a brief description of how entropy as a measure of information flow has affected information theory and complements the first part of The Theory of Information and Coding by R. J. McEliece (volume 3 of this ENCYCLOPEDIA). Recent applications of entropy to statistical mechanics and topological dynamics are given in chapters V and VI.
In this chapter we shall give a mathematical definition of the information in a random event and the entropy of experiments with a countable number of outcomes. We shall also indicate how the entropy is a measure of uncertainty and then give the main properties satisfied by both information and entropy. Then the definition of entropy will be extended to include experiments with an arbitrary number of outcomes, and the properties of entropy will be proven for this case. Finally, we give the definitions of the rate of information generation and the entropy of a dynamical system and derive their most important properties. We conclude with several examples and a brief discussion of two useful extensions of these definitions.
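As a preview, in the standard formulation (the chapter's own notation may differ), the entropy of a measure-preserving transformation \(T\) is built from the entropy \(H\) of finite partitions by

```latex
h(T,\alpha) \;=\; \lim_{n\to\infty} \frac{1}{n}\,
  H\!\Bigl(\,\bigvee_{i=0}^{n-1} T^{-i}\alpha\Bigr),
\qquad
h(T) \;=\; \sup_{\alpha}\, h(T,\alpha),
```

where the supremum is taken over all finite measurable partitions \(\alpha\); the first quantity is the rate at which the iterated experiment generates information, and the second is the entropy of the dynamical system.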
Information and Uncertainty of Events
Let (Ω, F, P) be a Lebesgue space and E an event in F. Thinking of the Lebesgue space as being a mathematical model of some random experiment, suppose an outcome of this experiment results in the event E. We have gained some information because we know that E occurred. The purpose of this first section is to define a function I on the events in a Lebesgue space so that I(E) will give a quantitative measure of the information gained if the event E results from the outcome of the experiment.
Before the experiment is performed, the uncertainty of its outcome resulting in the event E should equal the information we have gained if the outcome does result in E.
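A sketch of the standard definition that meets this requirement (the section's own treatment may arrive at it axiomatically) is

```latex
I(E) \;=\; -\log P(E),
```

so that a certain event carries no information, \(I(\Omega) = -\log 1 = 0\), and for independent events \(E\) and \(F\), since \(P(E \cap F) = P(E)\,P(F)\), the information is additive: \(I(E \cap F) = I(E) + I(F)\).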
In this preliminary chapter we shall give an exposition of certain topics in probability theory which are necessary to understand and interpret the definition and properties of entropy. We have tried to write the chapter in such a way that a reader with a knowledge of measure theory as given in Ash, Halmos, or any other basic measure theory text can follow the arguments and understand the examples. We introduce just those parts of probability theory which are necessary for the subsequent chapters and attempt to make them meaningful by use of very simple examples. We also restrict the discussion to “nice” probability spaces, so that conditional expectation and conditional probability are more intuitive and hopefully easier to understand. These “nice” spaces also make it possible to use partitions as models for random experiments, even those experiments which are limits of sequences of experiments.
Entropy is a quantitative measurement of uncertainty associated with random phenomena. In order to define this quantity precisely, it is necessary to have a mathematical model for random phenomena which is general enough to include many different physical situations and which has enough structure to allow us to use mathematical reasoning to answer questions about the phenomena.
Such a model is given by a mathematical structure called a probability space, which is nothing more than a measure space in which the measure of the universe set is 1.
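As a concrete instance (an illustrative example, not taken from the text), a single toss of a fair coin is modeled by the probability space

```latex
\Omega = \{H, T\}, \qquad
\mathcal{F} = \bigl\{\varnothing,\ \{H\},\ \{T\},\ \Omega\bigr\}, \qquad
P(\{H\}) = P(\{T\}) = \tfrac{1}{2},
```

which satisfies \(P(\Omega) = 1\) as required.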
Information theory is concerned with constructing a mathematical model for systems which transmit information and then analyzing this model. The object of this analysis is to develop techniques to reproduce at one point (the destination) a message, or an adequate approximation to a message, which has been chosen at another point (the source) and transmitted over a channel.
Fundamental to this problem is a measure of the information transmitted by such a system. It is only in terms of quantity or rate of information processed by a system that one can judge the effectiveness of the system. Central to the notion of quantity of information is an interpretation of the idea of entropy. This interpretation equates the removal of uncertainty with the gain in information. Since entropy is a numerical measure of uncertainty, this interpretation of information gives entropy its central position. The interpretation and the resulting theories were initiated by C. E. Shannon in his fundamental paper “A Mathematical Theory of Communication”, published in 1948.
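In standard notation (which may differ from the book's), this equation of information gained with uncertainty removed is expressed by the mutual information between the source \(X\) and the received signal \(Y\):

```latex
I(X;Y) \;=\; H(X) \;-\; H(X \mid Y),
```

the uncertainty \(H(X)\) about the source before transmission, minus the uncertainty \(H(X \mid Y)\) that remains after the output of the channel has been observed.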
In this chapter we shall describe a standard model of an information system and show how the notion of entropy provides a quantitative basis for measuring the information processed by a system. We shall focus on the construction of the various models, and the definitions and theorems necessary to understand these models.