Random forest is a popular machine learning algorithm that belongs to the family of supervised learning techniques.
It can be used for both classification and regression, and it is based on the concept of ensemble learning, i.e. the process of
combining multiple classifiers to solve a complex problem and improve the performance of the model.
Random forest makes use of the bagging technique.
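Bagging means that each model is trained on a bootstrap sample of the data, i.e. a sample of the same size drawn with replacement. The snippet below is a minimal sketch of bootstrap sampling, assuming NumPy and a tiny made-up dataset; the variable names and values are illustrative only.

```python
import numpy as np

# Hypothetical toy dataset: 6 samples with 2 features each (illustration only)
X = np.array([[1, 2], [3, 4], [5, 6], [7, 8], [9, 10], [11, 12]])
y = np.array([0, 0, 1, 1, 1, 0])

rng = np.random.default_rng(seed=0)

# A bootstrap sample draws len(X) row indices *with replacement*,
# so some rows appear more than once and others are left out ("out-of-bag").
idx = rng.integers(0, len(X), size=len(X))
X_boot, y_boot = X[idx], y[idx]

print("sampled indices:", idx)   # repeated indices show the resampling
```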
In a random forest, the models M1, M2, ..., Mn are nothing but decision trees. There are two phases:

- Creating the random forest by combining N decision trees.
- Making predictions from each tree and aggregating the results.
In more detail, the algorithm proceeds as follows:

1. Start by selecting random samples from the given dataset using the bootstrap technique.
2. Construct a decision tree for every sample and obtain a prediction result from each tree.
3. Perform voting over every predicted result.
4. Select the most voted prediction as the final prediction result (a minimal sketch of these steps is given below).
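Putting the two phases together, here is a minimal from-scratch sketch of the procedure, assuming scikit-learn's DecisionTreeClassifier as the base learner and a synthetic dataset from make_classification; the number of trees and the other parameters are arbitrary choices for illustration.

```python
import numpy as np
from collections import Counter
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_classification

# Hypothetical synthetic dataset, used only to make the sketch runnable
X, y = make_classification(n_samples=200, n_features=8, random_state=42)

N_TREES = 25
rng = np.random.default_rng(42)
trees = []

# Phase 1: build the forest — one decision tree per bootstrap sample
for _ in range(N_TREES):
    idx = rng.integers(0, len(X), size=len(X))          # bootstrap: rows drawn with replacement
    tree = DecisionTreeClassifier(max_features="sqrt")  # random feature subset at each split
    tree.fit(X[idx], y[idx])
    trees.append(tree)

# Phase 2: get a prediction from every tree, then take a majority vote per sample
def predict(forest, X_new):
    all_preds = np.array([t.predict(X_new) for t in forest])  # shape (n_trees, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in all_preds.T])

print("forest predictions:", predict(trees, X[:5]))
print("true labels:       ", y[:5])
```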
- For a Random Forest Classifier, the final result is the majority vote of the individual trees.
- For a Random Forest Regressor, the final result is the mean of the results of all the decision trees.
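In practice, one would normally use a library implementation rather than building the forest by hand. The sketch below uses scikit-learn's RandomForestClassifier and RandomForestRegressor on synthetic data; the datasets and hyperparameters are placeholder assumptions. Note that scikit-learn's classifier aggregates trees by averaging predicted class probabilities, which generally behaves like a soft majority vote.

```python
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.datasets import make_classification, make_regression
from sklearn.model_selection import train_test_split

# Classification: the forest's prediction acts as a majority vote over the trees
Xc, yc = make_classification(n_samples=300, n_features=10, random_state=0)
Xc_tr, Xc_te, yc_tr, yc_te = train_test_split(Xc, yc, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xc_tr, yc_tr)
print("classifier accuracy:", clf.score(Xc_te, yc_te))

# Regression: the forest's prediction is the mean of the individual trees' outputs
Xr, yr = make_regression(n_samples=300, n_features=10, noise=5.0, random_state=0)
Xr_tr, Xr_te, yr_tr, yr_te = train_test_split(Xr, yr, random_state=0)
reg = RandomForestRegressor(n_estimators=100, random_state=0).fit(Xr_tr, yr_tr)
print("regressor R^2:", reg.score(Xr_te, yr_te))
```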