Gradient boosting

Gradient boosting is another boosting algorithm. It is a more general boosting framework than AdaBoost, which also makes it more complicated and math-intensive. Instead of emphasizing problematic instances by assigning weights and resampling the dataset, gradient boosting builds each base learner on the previous learner's errors. Furthermore, gradient boosting uses decision trees of varying depths. In this section, we present gradient boosting without delving deeply into the math involved. Instead, we cover the basic concepts, along with a custom Python implementation.
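The core idea described above, where each new learner fits the errors (residuals) of the ensemble built so far, can be sketched in a few lines. This is only an illustrative sketch, not the chapter's actual implementation; the toy data, the number of rounds, the learning rate, and the tree depth are all assumed values chosen for demonstration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data (hypothetical, for illustration only)
rng = np.random.RandomState(0)
X = np.linspace(0, 10, 100).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.1, 100)

n_learners = 50      # number of boosting rounds (assumed value)
learning_rate = 0.1  # shrinkage applied to each learner's contribution

# Start from a constant prediction (the mean of the targets); each new
# tree is then trained on the residuals of the current ensemble.
base_prediction = y.mean()
current = np.full_like(y, base_prediction)
trees = []
for _ in range(n_learners):
    residuals = y - current                    # errors of the ensemble so far
    tree = DecisionTreeRegressor(max_depth=3)  # a shallow base learner
    tree.fit(X, residuals)
    current += learning_rate * tree.predict(X)
    trees.append(tree)

def predict(X_new):
    """Sum the base prediction and every tree's scaled contribution."""
    pred = np.full(X_new.shape[0], base_prediction)
    for tree in trees:
        pred += learning_rate * tree.predict(X_new)
    return pred

train_mse = np.mean((predict(X) - y) ** 2)
```

With each round, the residuals shrink, so the training error of the ensemble steadily decreases; the learning rate controls how aggressively each tree's correction is applied.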
