Book contents
- Frontmatter
- Contents
- Chapter dependencies
- Preface
- 1 Introduction to probability
- 2 Introduction to discrete random variables
- 3 More about discrete random variables
- 4 Continuous random variables
- 5 Cumulative distribution functions and their applications
- 6 Statistics
- 7 Bivariate random variables
- 8 Introduction to random vectors
- 9 Gaussian random vectors
- 10 Introduction to random processes
- 11 Advanced concepts in random processes
- 12 Introduction to Markov chains
- 13 Mean convergence and applications
- 14 Other modes of convergence
- 15 Self similarity and long-range dependence
- Bibliography
- Index
3 - More about discrete random variables
Published online by Cambridge University Press: 05 June 2012
Summary
This chapter develops more tools for working with random variables. The probability generating function is the key tool for working with sums of independent, nonnegative, integer-valued random variables. When random variables are only uncorrelated, we can work with averages (normalized sums) by using the weak law of large numbers. We emphasize that the weak law makes the connection between probability theory and the everyday practice of using averages of observations to estimate probabilities of real-world measurements. The last two sections introduce conditional probability and conditional expectation. The three important tools here are the law of total probability, the law of substitution, and, for independent random variables, "dropping the conditioning."
The foregoing concepts are developed here for discrete random variables, but they will all be extended to more general settings in later chapters.
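The connection the weak law makes between averages and probabilities can be illustrated with a small simulation. The sketch below (an illustration not taken from the text; the die-roll event and sample sizes are chosen only for demonstration) estimates a probability by averaging i.i.d. indicator observations:

```python
import random

def estimate_prob(event, sample, n):
    """Estimate P(event) by the average of n i.i.d. indicator observations.

    By the weak law of large numbers, this average converges in
    probability to the true probability as n grows.
    """
    return sum(event(sample()) for _ in range(n)) / n

random.seed(0)

# Illustrative example: P(die roll >= 5) = 1/3, estimated from simulated rolls.
roll = lambda: random.randint(1, 6)
for n in (100, 10_000, 100_000):
    print(n, estimate_prob(lambda x: x >= 5, roll, n))
```

As `n` increases, the printed averages cluster ever more tightly around 1/3, which is exactly the practice of estimating a probability by the relative frequency of observations.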
Probability generating functions
In many problems we have a sum of independent random variables, and we would like to know the probability mass function of their sum. For example, in an optical communication system, the received signal might be Y = X + W, where X is the number of photoelectrons due to incident light on a photodetector, and W is the number of electrons due to dark current noise in the detector. An important tool for solving these kinds of problems is the probability generating function. The name derives from the fact that it can be used to compute the probability mass function.
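Since the probability generating function of a sum of independent random variables is the product of their individual generating functions, and multiplying polynomials in z multiplies coefficients by convolution, the pmf of the sum is the convolution of the individual pmfs. A minimal sketch of this for the optical-communication example above, assuming for illustration that X and W are Poisson with rates 3 and 1 (the rates and truncation point are arbitrary choices, not from the text):

```python
import math

def poisson_pmf(lam, n_max):
    """Truncated Poisson pmf: p(k) = e^(-lam) lam^k / k!, k = 0..n_max."""
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(n_max + 1)]

def pmf_of_sum(p, q):
    """pmf of Y = X + W for independent X, W with pmfs p and q.

    G_Y(z) = G_X(z) * G_W(z), and multiplying the polynomial coefficients
    amounts to convolving the two pmfs.
    """
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

# X: photoelectrons from incident light; W: dark-current electrons.
p_x = poisson_pmf(3.0, 30)
p_w = poisson_pmf(1.0, 30)
p_y = pmf_of_sum(p_x, p_w)

# Sanity check: a sum of independent Poissons is Poisson with the summed rate.
direct = poisson_pmf(4.0, 60)
print(max(abs(a - b) for a, b in zip(p_y, direct)))
```

The final check works because the product of Poisson generating functions, exp(lam1(z-1)) * exp(lam2(z-1)), is again a Poisson generating function with rate lam1 + lam2.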
- Publisher: Cambridge University Press. Print publication year: 2006