202601131004
Status: #reference
Tags: Machine Learning, Financial Machine Learning
State: #nascent
Ensembling
A technique often used in machine learning that follows the idea of the wisdom of the crowds. While any single model may be weak, if the models' errors are uncorrelated, aggregating their predictions can produce quite a strong model. This is surprising at first (not really if you know statistics, but still interesting).
Depending on how this is done, you can often drastically increase the accuracy of your ensembled model and reduce its variance as well, which seems to spit in the face of the Bias-Variance Tradeoff, where we observe that higher accuracy on a dataset (generally due to the model's lower bias) is usually associated with higher variance.
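The variance-reduction part of this can be sketched numerically: if each model is an unbiased predictor with uncorrelated noise, averaging n of them shrinks the variance by roughly a factor of n. Everything below (the true value, noise level, model counts) is a made-up toy setup, not from any real dataset:

```python
import random
import statistics

random.seed(0)

TRUTH = 1.0     # hypothetical true value each model tries to predict
NOISE = 0.5     # per-model error standard deviation (assumed)
N_MODELS = 25   # size of the ensemble
N_TRIALS = 2000

def model_prediction():
    # one "weak" model: unbiased but noisy, independent of the others
    return TRUTH + random.gauss(0.0, NOISE)

# variance of a single model vs. variance of the ensemble average
singles = [model_prediction() for _ in range(N_TRIALS)]
ensembles = [
    statistics.fmean(model_prediction() for _ in range(N_MODELS))
    for _ in range(N_TRIALS)
]

var_single = statistics.variance(singles)      # ≈ NOISE**2
var_ensemble = statistics.variance(ensembles)  # ≈ NOISE**2 / N_MODELS

print(f"single model variance: {var_single:.4f}")
print(f"ensemble variance:     {var_ensemble:.4f}")
```

With correlated errors the reduction is smaller, which is why the "uncorrelated" condition above matters so much.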
My personal favorite example of ensembling is the Random Forest, which uses a specific form of ensembling called Bagging (Bootstrap Aggregating).
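A minimal sketch of bagging itself: each model is trained on a bootstrap resample (sampling with replacement) of the data, and predictions are aggregated by majority vote. The models here are one-split decision stumps on a made-up 1-D dataset; all numbers and names are hypothetical, and a real Random Forest would use full decision trees plus random feature subsets:

```python
import random

random.seed(1)

# toy 1-D dataset: label is 1 when x > 5, with ~15% label noise
X = [random.uniform(0, 10) for _ in range(200)]
y = [int(x > 5) ^ (random.random() < 0.15) for x in X]

def fit_stump(xs, ys):
    # weak learner: pick the threshold that best separates the sample
    best_t, best_err = 0.0, float("inf")
    for t in xs:
        err = sum(int(x > t) != label for x, label in zip(xs, ys))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_stumps(n_models=25):
    # Bagging: each stump sees its own bootstrap resample of the data
    thresholds = []
    for _ in range(n_models):
        idx = [random.randrange(len(X)) for _ in range(len(X))]
        thresholds.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return thresholds

def predict(thresholds, x):
    # aggregate the weak learners by majority vote
    votes = sum(int(x > t) for t in thresholds)
    return int(votes * 2 >= len(thresholds))

ensemble = bagged_stumps()
accuracy = sum(predict(ensemble, x) == label for x, label in zip(X, y)) / len(X)
print(f"bagged stump accuracy: {accuracy:.2f}")
```

The bootstrap resampling is what decorrelates the stumps: each one fits a slightly different dataset, so their errors are not identical and the vote averages them out.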
Relevant Links
| File | Folder | Last Modified |
|---|---|---|
| Decision Trees | 1. Cosmos | 8:46 AM - January 14, 2026 |
| When to burst and when to be careful | 1. Cosmos | 3:32 PM - January 11, 2026 |