202407200332
Status: #idea
Tags: Deep Learning, Tree-Based Methods
State: #nascent
Gradient Boosting Machines (Trees)
Related to Random Forests.
There are two main ensembling strategies in machine learning:
- Bagging: train multiple models, then average their predictions. (This is what a random forest does.)
- Boosting: make a prediction with a simple tree. Then fit a second tree to the residuals (errors) of the first. Then fit a third tree to the residuals left by the first two, and so on. Instead of averaging the predictions, you sum them. That is gradient boosting trees (or machines) in a nutshell.
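The residual-fitting loop above can be sketched in a few lines. This is a minimal illustration, not a full GBM implementation: it assumes scikit-learn's `DecisionTreeRegressor` as the weak learner, squared-error loss (so the residual is simply `y - pred`), and a small learning rate to shrink each tree's contribution.

```python
# Minimal gradient-boosting sketch (squared-error loss).
# Each new tree fits the residuals of the running prediction;
# the final prediction is the shrunken SUM of all trees, not a mean.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

n_trees, lr = 50, 0.1          # illustrative hyperparameters
pred = np.full_like(y, y.mean())  # start from the mean prediction
trees = []
for _ in range(n_trees):
    residual = y - pred           # error of the current ensemble
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    pred += lr * tree.predict(X)  # add (don't average) the new tree
    trees.append(tree)

mse = np.mean((y - pred) ** 2)
```

Note that `lr` (the learning rate) and `max_depth` control how fast the ensemble fits the training data, which is exactly where the overfitting risk mentioned below comes from.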
Advantages:
- Usually more accurate than random forests (though not necessarily by much)
Cons:
- Unlike random forests, they can overfit.
Best seen as a possible next step when random forests are not good enough, or when your goal is to get the highest accuracy possible without resorting to neural networks.
Relevant Links
| File | Folder | Last Modified |
|---|---|---|
| Boosting | 1. Cosmos | 9:15 AM - January 14, 2026 |
| Classification Trees | 1. Cosmos | 8:53 AM - January 14, 2026 |