6 - Jointly distributed random variables
Published online by Cambridge University Press: 05 June 2012
Summary
PREVIEW
In chapter 5 we looked at probability distributions of single random variables. But of course we often wish to consider the behaviour of two or more random variables together. This chapter extends the ideas of chapters 4 and 5, so that we can make probability statements about collections and sequences of random variables.
The most important instrument in this venture is the joint probability distribution, which we meet in section 6.2. We also define the concept of independence for random variables, and explore some consequences. Jointly distributed random variables have joint moments, and we look at the important ideas of covariance and correlation. Finally, we consider conditional distributions and conditional expectation in this new setting.
Prerequisites. We shall use one new technique in this chapter; see appendix 5.11 on double integrals.
JOINT DISTRIBUTIONS
A random variable X(ω) is a real-valued function on Ω. Often there will be several random variables of interest defined on Ω, and it may be important and useful to examine their joint behaviour. For example:
(i) A meteorological station may record the wind speed and direction, the air pressure, and the air temperature.
(ii) Your physician may record your height, weight, blood pressure, cholesterol level, and more.
[…]
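For concreteness, here is a standard formulation of the section's central object (the notation F_{X,Y} and f_{X,Y} is conventional rather than quoted from this chapter). The joint distribution function of two random variables X and Y is

\[
F_{X,Y}(x, y) = P(X \le x,\ Y \le y), \qquad x, y \in \mathbb{R},
\]

and when X and Y are jointly continuous with joint density f_{X,Y}, it is recovered by the double integrals of appendix 5.11:

\[
F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\, dv\, du .
\]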
- Type: Chapter
- Information: Probability and Random Variables: A Beginner's Guide, pp. 238–308. Publisher: Cambridge University Press. Print publication year: 1999.