  1. A decision tree is a supervised learning algorithm that classifies data by recursively splitting it into subsets based on feature values until reaching a decision at a leaf node. It works like a flowchart: internal nodes represent feature-based tests, branches represent outcomes of those tests, and leaves represent class predictions.

    Core working principle:

    1. Start at the root node with the full dataset.

    2. Select the best feature to split the data using a criterion such as Information Gain (ID3, C4.5) or Gini Impurity (CART).

    3. Split into subsets based on feature values.

    4. Repeat recursively for each subset until all samples in a node belong to the same class or no further split is possible.

    5. Assign class labels to leaf nodes.
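    The recursive procedure above can be sketched as a toy implementation. This is a simplified illustration (not the CART code scikit-learn actually uses): it greedily picks the feature/threshold pair with the lowest weighted Gini impurity, recurses on each side, and stops at pure nodes or a depth limit.

    ```python
    from collections import Counter

    def gini(labels):
        # Gini impurity of a node: 1 - sum of squared class proportions
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    def majority(labels):
        # Most common class label in a node (used for leaves)
        return Counter(labels).most_common(1)[0][0]

    def build_tree(rows, labels, depth=0, max_depth=3):
        # Stop when the node is pure or the depth limit is reached -> leaf
        if len(set(labels)) == 1 or depth == max_depth:
            return {"leaf": majority(labels)}
        best = None
        for f in range(len(rows[0])):           # try every feature...
            for t in {r[f] for r in rows}:      # ...and every observed value as a threshold
                left = [i for i, r in enumerate(rows) if r[f] <= t]
                right = [i for i in range(len(rows)) if i not in left]
                if not left or not right:
                    continue
                # Score the split by weighted Gini impurity (lower is better)
                score = sum(len(s) / len(rows) * gini([labels[i] for i in s])
                            for s in (left, right))
                if best is None or score < best[0]:
                    best = (score, f, t, left, right)
        if best is None:
            return {"leaf": majority(labels)}
        _, f, t, left, right = best
        return {
            "feature": f, "threshold": t,
            "left": build_tree([rows[i] for i in left], [labels[i] for i in left],
                               depth + 1, max_depth),
            "right": build_tree([rows[i] for i in right], [labels[i] for i in right],
                                depth + 1, max_depth),
        }

    # Toy dataset: one feature, two classes, perfectly separable at x <= 2.0
    rows = [[1.0], [2.0], [3.0], [4.0]]
    labels = [0, 0, 1, 1]
    tree = build_tree(rows, labels)
    print(tree)  # one internal node splitting at 2.0, with two pure leaves
    ```

    Real implementations such as CART also handle categorical features, regression targets, and pruning, but the control flow is the same recursive split-and-conquer shown here.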

    Key splitting criteria:

    • Information Gain: Measures reduction in entropy after a split. Higher gain means better separation.

    • Gini Impurity: Measures probability of incorrect classification. Lower Gini means purer nodes.
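    These two criteria can be computed directly from class proportions. A minimal sketch (not taken from the sources above) for a binary split:

    ```python
    import numpy as np

    def entropy(labels):
        # Shannon entropy: -sum(p_i * log2(p_i)) over class proportions
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))

    def gini(labels):
        # Gini impurity: 1 - sum(p_i^2) over class proportions
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return float(1.0 - np.sum(p ** 2))

    def information_gain(parent, left, right):
        # Entropy of the parent minus the size-weighted entropy of the children
        n = len(parent)
        weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        return entropy(parent) - weighted

    parent = np.array([0, 0, 1, 1])        # perfectly mixed node
    left, right = parent[:2], parent[2:]   # a split that separates the classes

    print(gini(parent))                          # 0.5 (maximally impure for 2 classes)
    print(entropy(parent))                       # 1.0 bit
    print(information_gain(parent, left, right)) # 1.0 (all uncertainty removed)
    ```

    A perfect split drives both child impurities to zero, which is why information gain is maximal (and weighted Gini minimal) for it.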

    Example in Python (CART with scikit-learn):

    from sklearn.tree import DecisionTreeClassifier, export_text
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split

    # Load dataset
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

    # Train Decision Tree
    clf = DecisionTreeClassifier(criterion='gini', max_depth=3, random_state=42)
    clf.fit(X_train, y_train)

    # Display tree rules
    print(export_text(clf, feature_names=load_iris().feature_names))

    # Evaluate
    print("Accuracy:", clf.score(X_test, y_test))
  1. Decision Trees for Classification — Complete Example

    Jan 1, 2023 · In this article, we discussed a simple but detailed example of how to construct a decision tree for a classification problem and how it can be used to …

  2. 1.10. Decision Trees — scikit-learn 1.8.0 documentation

    Decision Trees (DTs) are a non-parametric supervised learning method used for classification and regression. The goal is to create a model that predicts the …

  3. Decision Tree Classification in Python Tutorial - DataCamp

    1. Decision trees are easy to interpret and visualize.
    2. They can easily capture non-linear patterns.
    3. They require less data preprocessing from the user; for example, there is no need to normalize columns.
    4. They can be used for feature engineering, such as predicting missing values, and are suitable for variable selection.
  4. Decision Tree in Machine Learning: How They Work

    Oct 29, 2025 · This example illustrates how decision trees naturally segment populations into meaningful groups, creating interpretable rules that loan officers …

  5. Decision Tree Algorithm: Interpretable Classification …

    Learn everything about the Decision Tree Algorithm: an interpretable classification method in machine learning. Step-by-step explanation with examples, visuals, …

  6. Decision Tree Classifier, Explained: A Visual Guide with …

    Aug 30, 2024 · A Decision Tree classifier creates an upside-down tree to make predictions, starting at the top with a question about an important feature in your …

  7. Let’s consider an example of using the CART algorithm for a binary classification problem. Suppose we have a dataset of patients with information about their age, gender, blood pressure, and cholesterol …

  8. Decision Tree Tutorials & Notes | Machine Learning

    Detailed tutorial on Decision Tree to improve your understanding of Machine Learning. Also try practice problems to test & improve your skill level.

  9. Decision Trees for Classification - Example

    Dec 19, 2023 · Decision Trees are a powerful, yet simple Machine Learning Model. An advantage of their simplicity is that we can build and understand them step …