Tag Archives: Parameter Tuning

ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 13

XGBoost parameter tuning – Part 2/2. This is the second part of the series on XGBoost parameter tuning. In the first part, we tuned the first parameter, ‘eta’. Now we will explore tuning ‘max_depth’ and ‘min_child_weight’. Finally, we’ll train the final model. Tuning max_depth (max_depth=6): now that we’ve set ‘eta’ to 0.1, which we determined… Continue reading “ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 13”
ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 12
XGBoost parameter tuning – Part 1/2. This part is about XGBoost parameter tuning. It’s the first of a two-part series, where we begin by tuning the initial parameter, ‘eta’. The subsequent article will explore tuning ‘max_depth’ and ‘min_child_weight’, and then we’ll train the final model. Let’s start tuning the… Continue reading “ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 12”
ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 9
Ensemble and random forest. This article discusses Random Forest as a technique for combining multiple decision trees. Before diving into Random Forest, we’ll explore the concept of ensemble modeling, where multiple models act as a ‘board of experts.’ The final part of the article covers the process of tuning a Random… Continue reading “ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 9”
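The ‘board of experts’ idea — many trees, each trained on a different bootstrap sample, voting together — is what scikit-learn's RandomForestClassifier implements. A minimal sketch on synthetic data (the n_estimators grid is illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the article's dataset (an assumption).
X, y = make_classification(n_samples=2000, random_state=1)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=1)

# Each tree is one 'expert'; the forest averages their predicted probabilities.
aucs = {}
for n_estimators in [10, 50, 100, 200]:
    rf = RandomForestClassifier(n_estimators=n_estimators, random_state=1, n_jobs=-1)
    rf.fit(X_train, y_train)
    aucs[n_estimators] = roc_auc_score(y_val, rf.predict_proba(X_val)[:, 1])
```

Validation AUC typically improves quickly with the first tens of trees and then flattens out, which is the usual basis for choosing n_estimators.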
ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 8
Decision trees parameter tuning. Part 8 of the ‘Decision Trees and Ensemble Learning’ section is dedicated to tuning decision tree parameters. Parameter tuning involves selecting the best parameters for training. In this context, ‘tuning’ means choosing parameters so as to maximize or minimize a chosen performance metric (such as AUC or RMSE) on the… Continue reading “ML Zoomcamp 2023 – Decision Trees and Ensemble Learning – Part 8”
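In code, this kind of tuning is a loop over parameter candidates, scoring each trained tree on the validation set with the chosen metric (AUC here). A minimal sketch on synthetic data, with illustrative parameter grids:

```python
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the post's dataset (an assumption).
X, y = make_classification(n_samples=2000, random_state=1)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=1)

# Score every (max_depth, min_samples_leaf) pair on validation AUC.
results = {}
for max_depth in [1, 3, 5, None]:
    for min_samples_leaf in [1, 5, 50]:
        dt = DecisionTreeClassifier(max_depth=max_depth,
                                    min_samples_leaf=min_samples_leaf,
                                    random_state=1)
        dt.fit(X_train, y_train)
        results[(max_depth, min_samples_leaf)] = roc_auc_score(
            y_val, dt.predict_proba(X_val)[:, 1])

best_params = max(results, key=results.get)
```

For a regression tree the same loop would use RMSE instead of AUC, minimizing rather than maximizing.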
ML Zoomcamp 2023 – Evaluation metrics for classification – Part 7
Cross-Validation: evaluating the same model on different subsets of data. In this article, I’ll discuss parameter tuning, which involves selecting the optimal parameters. Typically, we start by splitting the entire dataset into three parts: training, validation, and testing. We use the validation dataset to determine the best parameters for the formula g(xi), essentially finding the… Continue reading “ML Zoomcamp 2023 – Evaluation metrics for classification – Part 7”
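The three-way split described in this excerpt is commonly done in proportions like 60/20/20 (an assumption here, since the excerpt doesn't state them). A minimal sketch using scikit-learn's train_test_split applied twice:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy data standing in for the full dataset.
X = np.arange(1000).reshape(-1, 1)
y = np.arange(1000) % 2

# First carve off 20% for the test set, then split the remaining 80%
# into 75%/25%, giving a 60/20/20 train/validation/test split overall.
X_full_train, X_test, y_full_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=1)
X_train, X_val, y_train, y_val = train_test_split(
    X_full_train, y_full_train, test_size=0.25, random_state=1)
```

Parameters are chosen on the validation part; the test part is held back until the very end for an unbiased estimate of the final model g(xi).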