
5 - Regression Computations

Published online by Cambridge University Press:  01 June 2011

John F. Monahan
Affiliation:
North Carolina State University
Publisher: Cambridge University Press
Print publication year: 2011


References

Beaton, Albert E. (1977), “Comment on ‘More on Computational Accuracy in Regression’,” Journal of the American Statistical Association 72: 600.
Beaton, Albert E., Rubin, Donald B., and Barone, John L. (1976), “The Acceptability of Regression Solutions: Another Look at Computational Accuracy,” Journal of the American Statistical Association 71: 158–68.
Belsley, D. A., Kuh, E., and Welsch, R. E. (1980), Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. New York: Wiley.
Björck, Åke (1967), “Solving Least Squares Problems by Gram–Schmidt Orthogonalization,” BIT 7: 1–21.
Brownlee, K. A. (1965), Statistical Theory and Methodology in Science and Engineering, 2nd ed. New York: Wiley.
Chan, Tony F., Golub, Gene H., and LeVeque, Randall J. (1983), “Algorithms for Computing the Sample Variance,” American Statistician 37: 242–7.
Clarke, M. R. B. (1981), “AS 163: A Givens Algorithm for Moving from One Linear Model to Another without Going Back to the Data,” Applied Statistics 30: 198–203.
Cook, R. D. and Weisberg, S. (1982), Residuals and Influence in Regression. New York: Chapman & Hall.
Dahlquist, Germund and Björck, Åke (1974), Numerical Methods (trans. by Anderson, N.). Englewood Cliffs, NJ: Prentice-Hall.
Doolittle, M. H. (1878), United States Coast and Geodetic Survey Report, vol. 115.
Efron, Bradley, Hastie, Trevor, Johnstone, Iain, and Tibshirani, Robert (2004), “Least Angle Regression,” Annals of Statistics 32: 407–99.
Freund, R. J. (1979), “Multicollinearity etc., Some ‘New’ Examples,” in Proceedings of the Statistical Computing Section, pp. 111–12. Washington, DC: American Statistical Association.
Furnival, George M. (1971), “All Possible Regressions with Less Computation,” Technometrics 13: 403–8.
Furnival, George M. and Wilson, Robert W., Jr. (1974), “Regressions by Leaps and Bounds,” Technometrics 16: 499–511.
Gentleman, W. Morven (1973), “Least Squares Computations by Givens Transformations without Square Roots,” Journal of the Institute of Mathematics and Its Applications 12: 329–36.
Gill, Philip E. and Murray, Walter (1978), “Numerically Stable Methods for Quadratic Programming,” Mathematical Programming 14: 349–72.
Golub, Gene, Klema, Virginia, and Peters, Stephen C. (1980), “Rules and Software for Detecting Rank Degeneracy,” Journal of Econometrics 12: 41–8.
Golub, Gene H. and Van Loan, Charles (1984), Matrix Computations. Baltimore: Johns Hopkins University Press.
Goodnight, James H. (1979), “A Tutorial on the SWEEP Operator,” American Statistician 33: 149–58.
Huber, Peter J. (1964), “Robust Estimation of a Location Parameter,” Annals of Mathematical Statistics 35: 73–101.
Kim, Jinseog, Kim, Yuwon, and Kim, Yongdai (2008), “A Gradient-Based Optimization Algorithm for LASSO,” Journal of Computational and Graphical Statistics 17: 994–1009.
Lawson, Charles L. and Hanson, Richard J. (1974), Solving Least Squares Problems. Englewood Cliffs, NJ: Prentice-Hall.
Longley, James W. (1967), “An Appraisal of Least Squares Programs for the Electronic Computer from the Point of View of the User,” Journal of the American Statistical Association 62: 819–41.
McIntosh, Allen (1982), Fitting Linear Models: An Application of Conjugate Gradient Algorithms. New York: Springer-Verlag.
Myers, Raymond H. (1989), Classical and Modern Regression with Applications, 2nd ed. Boston: PWS-Kent.
Osborne, Michael R., Presnell, Brett, and Turlach, Berwin A. (2000), “On the LASSO and Its Dual,” Journal of Computational and Graphical Statistics 9: 319–37.
Portnoy, Stephen and Koenker, Roger (1997), “The Gaussian Hare and the Laplacian Tortoise: Computability of Squared-Error versus Absolute-Error Estimators,” Statistical Science 12: 279–300.
Schatzoff, M., Tsao, R., and Fienberg, S. (1968), “Efficient Calculation of All Possible Regressions,” Technometrics 10: 769–79.
Searle, Shayle R. (1973), Linear Models. New York: Wiley.
Snedecor, George W. and Cochran, William G. (1967), Statistical Methods, 6th ed. Ames: Iowa State University Press.
Steel, R. G. D. and Torrie, J. H. (1980), Principles and Procedures of Statistics, 2nd ed. New York: McGraw-Hill.
Stewart, G. W. (1973), Introduction to Matrix Computations. New York: Academic Press.
Stewart, G. W. (1976), “The Economic Storage of Plane Rotations,” Numerische Mathematik 25: 137–8.
Tibshirani, Robert (1996), “Regression Shrinkage and Selection via the Lasso,” Journal of the Royal Statistical Society, Series B 58: 267–88.
Velleman, Paul F. and Welsch, Roy E. (1981), “Efficient Computing of Regression Diagnostics,” American Statistician 35: 234–42.
Wampler, Roy H. (1980), “Test Procedures and Test Problems for Least Squares Algorithms,” Journal of Econometrics 12: 3–22.
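Several of the references above concern numerical stability in statistical computation (Chan, Golub, and LeVeque on the sample variance; Björck, and Lawson and Hanson, on least squares). As a hedged illustration of the kind of issue these papers study — not a reproduction of any particular algorithm from them — the sketch below contrasts the "textbook" sum-of-squares variance formula, which suffers catastrophic cancellation when the mean is large relative to the spread, with a stable Welford-style one-pass update:

```python
def variance_textbook(x):
    # "Textbook" formula: (sum of squares - n * mean^2) / (n - 1).
    # Subtracts two nearly equal large numbers, so it can lose most
    # of its accuracy when |mean| >> standard deviation.
    n = len(x)
    s = sum(x)
    ss = sum(v * v for v in x)
    return (ss - s * s / n) / (n - 1)

def variance_welford(x):
    # One-pass updating algorithm (Welford-style): maintains a running
    # mean and a running sum of squared deviations about that mean.
    mean = 0.0
    m2 = 0.0
    for k, v in enumerate(x, start=1):
        d = v - mean
        mean += d / k
        m2 += d * (v - mean)
    return m2 / (len(x) - 1)

# A large mean with a tiny spread exposes the difference: the true
# variance of (4, 7, 13, 16) is 30, and shifting by 1e8 should not change it.
data = [1e8 + v for v in (4.0, 7.0, 13.0, 16.0)]
print(variance_welford(data))   # close to the true value, 30.0
print(variance_textbook(data))  # noticeably off due to cancellation
```

The shifted data set is a standard device in this literature: both formulas are algebraically exact, and only their floating-point behavior differs.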

  • Regression Computations
  • John F. Monahan, North Carolina State University
  • Book: Numerical Methods of Statistics
  • Online publication: 01 June 2011
  • Chapter DOI: https://doi.org/10.1017/CBO9780511977176.007