Hyperparameter tuning with Optuna - tutorial

By Hans Tierens

You all know that datarootsians are excellent data athletes. Olympic athletes train with weights; we evolved past that mere display of physical strength and started training the weights instead. That is how Machine Learning Engineers train their models to achieve optimal performance on whatever task they are given. That's how we shine!

In this tutorial, we explain how state-of-the-art hyperparameter optimization techniques work and when to apply them, using the Optuna library.
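To give a flavour of what the tutorial builds up to, here is a minimal sketch of an Optuna study: you define an objective function that samples hyperparameters from a search space, train a model with them, and return a score for Optuna to optimise. The choice of dataset and model below (iris and a random forest) is just for illustration and is not taken from the tutorial itself.

```python
import optuna
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def objective(trial):
    # Sample hyperparameters from the search space defined via the trial object.
    n_estimators = trial.suggest_int("n_estimators", 50, 300)
    max_depth = trial.suggest_int("max_depth", 2, 16)

    # Train and evaluate a model with the sampled hyperparameters.
    clf = RandomForestClassifier(
        n_estimators=n_estimators, max_depth=max_depth, random_state=42
    )
    X, y = load_iris(return_X_y=True)

    # The returned value is what Optuna tries to maximise.
    return cross_val_score(clf, X, y, cv=3).mean()


# Run the optimisation loop: Optuna proposes new hyperparameter combinations
# based on the results of previous trials.
study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```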

The code

GitHub - datarootsio/tutorial-hyperparameter-optimization: Tutorial for Rootlabs@Lunch: Practical Hyperparameter Optimisation
