
Week 4: Decision trees

This week, you'll learn about a practical and very commonly used learning algorithm: the decision tree. You'll also learn about variations of the decision tree, including random forests and boosted trees (XGBoost).

Learning Objectives

  • See what a decision tree looks like and how it can be used to make predictions

  • Learn how a decision tree learns from training data

  • Learn the "entropy" impurity metric and how it is used when building a decision tree (see the short sketch after this list)

  • Learn how to use multiple trees ("tree ensembles"), such as random forests and boosted trees

  • Learn when to use decision trees or neural networks
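
As a rough illustration of the entropy objective above (and of the "Measuring purity" and "Choosing a split: Information Gain" videos below), here is a minimal NumPy sketch, not part of the course materials, that computes the entropy of a node and the information gain of a candidate split:

```python
import numpy as np

def entropy(p):
    """Entropy of a node, where p is the fraction of positive examples."""
    if p == 0 or p == 1:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def information_gain(y_parent, y_left, y_right):
    """Reduction in entropy obtained by splitting a node into two branches."""
    w_left = len(y_left) / len(y_parent)
    w_right = len(y_right) / len(y_parent)
    return entropy(y_parent.mean()) - (
        w_left * entropy(y_left.mean()) + w_right * entropy(y_right.mean())
    )

# Ten labelled examples, split into a left branch (first 4) and a right branch (last 6).
y = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
print(f"information gain = {information_gain(y, y[:4], y[4:]):.3f}")
```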

Decision trees

Decision tree model - Video • Duration: 7 min

Learning Process - Video • Duration: 11 min

Practice quiz: Decision trees

Measuring purity - Video • Duration: 7 min

Choosing a split: Information Gain - Video • Duration: 11 min

Putting it together - Video • Duration: 9 min

Using one-hot encoding of categorical features - Video • Duration: 5 min

Continuous valued features - Video • Duration: 6 min
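
The two videos above cover how categorical features are one-hot encoded and how splits are chosen on continuous-valued features. As a hedged sketch of that idea, here is a toy example using pandas and scikit-learn (library choices that are assumptions here, not part of the course outline):

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Toy cat-classification data (made up for illustration): one categorical
# feature and one continuous-valued feature.
df = pd.DataFrame({
    "ear_shape": ["pointy", "floppy", "oval", "pointy", "floppy", "oval"],
    "weight_lbs": [8.4, 15.0, 7.2, 9.1, 11.0, 12.5],
    "is_cat": [1, 0, 1, 1, 0, 0],
})

# One-hot encoding turns each category into its own 0/1 column.
X = pd.get_dummies(df[["ear_shape", "weight_lbs"]], columns=["ear_shape"])
y = df["is_cat"]

# For the continuous feature, the tree picks a threshold (weight_lbs <= t)
# that maximizes information gain, just as with binary features.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=2).fit(X, y)
print(tree.predict(X))
```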

Regression Trees (optional) - Video • Duration: 9 min

Practice quiz: Decision tree learning

Using multiple decision trees - Video • Duration: 3 min

Sampling with replacement - Video • Duration: 3 min

Random forest algorithm - Video • Duration: 6 min

XGBoost - Video • Duration: 7 min
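
To make the ensemble ideas above concrete, here is a minimal sketch, assuming scikit-learn and the xgboost package are available (neither is referenced by this outline), that trains a random forest and a boosted-tree model on synthetic data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier  # requires the xgboost package

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))             # synthetic features
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # synthetic binary labels

# Random forest: many trees, each trained on a bootstrap sample of the data
# (sampling with replacement), with randomized feature choices at each split.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Boosted trees (XGBoost): trees are built sequentially, with each new tree
# focusing on the examples the previous trees got wrong.
booster = XGBClassifier(n_estimators=100, learning_rate=0.1).fit(X, y)

print(forest.score(X, y), booster.score(X, y))
```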

When to use decision trees - Video • Duration: 6 min

Practice quiz: Tree ensembles

Practice Lab: Decision Trees

Acknowledgements

Acknowledgements - Reading • Duration: 10 min