Introduction
In recent years, linear algebra has become as fundamental a mathematical tool as calculus. Its role in statistics is so prominent that matrix computations and the solution of linear equations are fundamental to statistical computing. Hence the treatment in this chapter is rather traditional. The study of one particular statistical problem, regression, is postponed, and some problems arising in time-series analysis are discussed in the next chapter.
Numerical analysts always talk about the solution of a system of equations, Ax = b, for the thought of computing an inverse is considered (for reasons often unstated) naive and gauche. Although the tone is haughty, the reasoning is sound: while the mathematics of A⁻¹B speaks of inverses, its computation means solving systems of equations with several right-hand sides. To emphasize: although the algebra may be written in terms of inverses, carefully converting the computations to the solution of systems of equations with many right-hand sides may lead to substantial savings in computing time.
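The point above can be sketched in a few lines; the following is a minimal illustration (using NumPy, which is not part of the original text) of computing A⁻¹B by solving the system AX = B with several right-hand sides, rather than forming the inverse explicitly:

```python
import numpy as np

# Hypothetical data: a square, nonsingular A and three right-hand sides.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 3))

# Preferred: one factorization of A, reused for every column of B.
X = np.linalg.solve(A, B)

# Naive alternative: form the inverse explicitly, then multiply.
X_naive = np.linalg.inv(A) @ B

# Both yield A^{-1} B, but solve() avoids forming the inverse and is
# both cheaper and numerically safer for this purpose.
assert np.allclose(X, X_naive)
assert np.allclose(A @ X, B)
```

Internally, solve() factors A once (an LU factorization with pivoting) and then performs one cheap forward/back substitution per column of B, which is exactly the "many right-hand sides" strategy the text recommends.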
The systems of equations to be treated here will always be square and consistent – that is, they will always have a solution. When this assumption is violated, the problem of solving a system of equations changes its nature, sometimes to a regression problem (discussed in Chapter 5) or to an eigenproblem (Chapter 6).
The first topic to be covered is an introduction to the computational and storage tricks that are so useful in matrix computations.