
Bagging predictors

  • Published: August 1996
  • Machine Learning, Volume 24, pages 123–140 (1996)
  • Leo Breiman 1

Abstract

Bagging predictors is a method for generating multiple versions of a predictor and using these to get an aggregated predictor. The aggregation averages over the versions when predicting a numerical outcome and does a plurality vote when predicting a class. The multiple versions are formed by making bootstrap replicates of the learning set and using these as new learning sets. Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy. The vital element is the instability of the prediction method. If perturbing the learning set can cause significant changes in the predictor constructed, then bagging can improve accuracy.



Author information

Authors and Affiliations

  1. Statistics Department, University of California, Berkeley, CA 94720

    Leo Breiman



About this article

Cite this article

Breiman, L. Bagging predictors. Mach Learn 24, 123–140 (1996). https://doi.org/10.1007/BF00058655


  • Received: 19 September 1994

  • Accepted: 02 January 1995

  • Issue date: August 1996

  • DOI: https://doi.org/10.1007/BF00058655


Keywords

  • Aggregation
  • Bootstrap
  • Averaging
  • Combining
