Mastering Machine Learning for Penetration Testing

Improving classification with ensemble learning 

In many cases, when you build a machine learning model, you end up with low accuracy and disappointing results. To improve the results, we can use ensemble learning techniques, which combine several machine learning models into a single predictive model.
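As a minimal illustration of combining several models into one predictive model, the following sketch uses scikit-learn's VotingClassifier to merge a logistic regression, a decision tree, and a k-nearest neighbors classifier by majority vote. The synthetic dataset and all parameter values here are placeholder assumptions, not taken from the book:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Placeholder synthetic dataset; replace with your own features and labels.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Combine three different models into one predictive model (majority vote).
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("knn", KNeighborsClassifier()),
    ],
    voting="hard",  # hard voting = majority vote over predicted classes
)
ensemble.fit(X_train, y_train)
print("Ensemble accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```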

We can categorize ensemble learning techniques into two categories:

  • Parallel ensemble methods: The base learners are built independently of one another (in parallel), and their predictions are then combined, as in bagging.
  • Sequential ensemble methods: The base learners are built one after another, with each new learner focusing on the errors made by the previous ones, as in boosting. The sketch after this list contrasts the two approaches.
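To make the two categories more concrete, here is a rough sketch that contrasts a parallel method (bagging, via scikit-learn's BaggingClassifier) with a sequential method (AdaBoost, via AdaBoostClassifier). The dataset and parameter values are illustrative assumptions only:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

# Placeholder synthetic dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Parallel: each base learner (a decision tree by default) is trained
# independently on a bootstrap sample, so they could be built in parallel.
bagging = BaggingClassifier(n_estimators=50, random_state=42)

# Sequential: each new learner is trained after the previous one and gives
# more weight to the samples that earlier learners misclassified.
boosting = AdaBoostClassifier(n_estimators=50, random_state=42)

print("Bagging (parallel)   :", cross_val_score(bagging, X, y, cv=5).mean())
print("Boosting (sequential):", cross_val_score(boosting, X, y, cv=5).mean())
```

In the parallel case the base learners do not depend on each other, while in the sequential case each learner depends on the results of the one before it.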

The following are the three most widely used ensemble learning techniques:

  • Bootstrap aggregating (bagging): This involves building several separate models, each on a bootstrap sample of the training data, and combining them by using model averaging techniques, such as a weighted average or a majority vote.
  • Boosting: This is a sequential ensemble learning technique, in which each new model attempts to correct the errors of the models built before it. Gradient boosting is one of the most widely used boosting techniques (see the sketch after this list).
  • Stacking: This is similar to boosting, but it trains a new model (a meta-model) to learn how to combine the predictions of the submodels (also shown in the sketch after this list).
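The following sketch shows the boosting and stacking techniques from the list above, using scikit-learn's GradientBoostingClassifier and StackingClassifier (bagging was already shown in the earlier sketch). As before, the dataset and parameter values are placeholder assumptions, not a definitive implementation:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Placeholder synthetic dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Boosting: gradient boosting builds trees sequentially, each one fitting
# the errors of the ensemble built so far.
gboost = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, random_state=42)

# Stacking: a meta-model (here, logistic regression) learns how to combine
# the predictions of the base models.
stack = StackingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("svc", SVC(probability=True, random_state=42)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)

print("Gradient boosting:", cross_val_score(gboost, X, y, cv=5).mean())
print("Stacking         :", cross_val_score(stack, X, y, cv=5).mean())
```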