```python
# Load libraries
from sklearn.ensemble import AdaBoostClassifier
from sklearn import datasets
```
Load Iris Flower Dataset
```python
# Load data
iris = datasets.load_iris()
X = iris.data
y = iris.target
```
Create Adaboost Classifier
The most important parameters are:

- `base_estimator` is the learning algorithm used to train the weak models. This will almost never need to be changed because by far the most common learner to use with AdaBoost is a decision tree, which is this parameter's default argument.
- `n_estimators` is the number of models to iteratively train.
- `learning_rate` is the contribution of each model to the weights and defaults to 1. Reducing the learning rate means the weights will be increased or decreased to a smaller degree, forcing the model to train slower (but sometimes resulting in better performance scores).
- `loss` is exclusive to `AdaBoostRegressor` and sets the loss function to use when updating weights. This defaults to a linear loss function, but can be changed to `square` or `exponential`.
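To make the effect of these parameters concrete, here is a minimal sketch (not part of the original tutorial) comparing two configurations on the iris data: the default `learning_rate=1.0` versus a smaller learning rate compensated by more boosting rounds.

```python
# Sketch: illustrating n_estimators and learning_rate on the iris data.
from sklearn.ensemble import AdaBoostClassifier
from sklearn import datasets
from sklearn.model_selection import cross_val_score

iris = datasets.load_iris()
X, y = iris.data, iris.target

# Default learning rate with 50 boosting rounds
clf_default = AdaBoostClassifier(n_estimators=50, learning_rate=1.0,
                                 random_state=0)

# A smaller learning rate shrinks each model's contribution to the weights,
# so training proceeds more slowly; more rounds can compensate.
clf_slow = AdaBoostClassifier(n_estimators=200, learning_rate=0.1,
                              random_state=0)

for name, clf in [("default", clf_default), ("slow", clf_slow)]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

Whether the smaller learning rate helps depends on the dataset; cross-validation, as above, is the usual way to decide between configurations.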
```python
# Create adaboost-decision tree classifier object
clf = AdaBoostClassifier(n_estimators=50, learning_rate=1, random_state=0)
```
Train Adaboost Classifier
```python
# Train model
model = clf.fit(X, y)
```
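Once fitted, the model can be used like any scikit-learn classifier. A self-contained sketch (repeating the setup above so it runs on its own) showing prediction and the ensemble's probability estimates:

```python
# Sketch: fit the classifier as above, then use it for prediction.
from sklearn.ensemble import AdaBoostClassifier
from sklearn import datasets

iris = datasets.load_iris()
X, y = iris.data, iris.target

clf = AdaBoostClassifier(n_estimators=50, learning_rate=1, random_state=0)
model = clf.fit(X, y)

# Predicted class labels for the first few observations
print(model.predict(X[:3]))

# Class-membership probabilities from the weighted ensemble vote,
# one column per class
print(model.predict_proba(X[:3]))
```

`predict` returns the class with the highest weighted vote across the boosted trees, while `predict_proba` exposes the underlying vote weights as probabilities.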