Getting started with KerasTuner
Authors: Luca Invernizzi, James Long, Francois Chollet, Tom O'Malley, Haifeng Jin
Date created: 2019/05/31
Last modified: 2021/10/27
Description: The basics of using KerasTuner to tune model hyperparameters.
Introduction
KerasTuner is a general-purpose hyperparameter tuning library. It has strong integration with Keras workflows, but it isn't limited to them: you could use it to tune scikit-learn models, or anything else. In this tutorial, you will see how to tune model architecture, training process, and data preprocessing steps with KerasTuner. Let's start with a simple example.
Tune the model architecture
The first thing we need to do is write a function that returns a compiled Keras model. It takes an argument `hp` for defining the hyperparameters while building the model.
Define the search space
In the following code example, we define a Keras model with two `Dense` layers. We want to tune the number of units in the first `Dense` layer. To do so, we define an integer hyperparameter with `hp.Int('units', min_value=32, max_value=512, step=32)`, whose range is from 32 to 512 inclusive. When sampling from it, the minimum step size for walking through the interval is 32.
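Here is a sketch of such a `build_model` function. The surrounding architecture choices (a `Flatten` input layer, a 10-way softmax output, the `adam` optimizer and categorical crossentropy loss) are illustrative assumptions for an image-classification setup; the part that matters for tuning is the `hp.Int(...)` call.

```python
import keras
from keras import layers


def build_model(hp):
    model = keras.Sequential()
    model.add(layers.Flatten())
    model.add(
        layers.Dense(
            # Tunable integer hyperparameter: 32 to 512 inclusive, step 32.
            units=hp.Int("units", min_value=32, max_value=512, step=32),
            activation="relu",
        )
    )
    # Illustrative output layer: 10 classes with softmax.
    model.add(layers.Dense(10, activation="softmax"))
    model.compile(
        optimizer="adam",
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```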
You can quickly test if the model builds successfully.
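For example, calling the function with a fresh `HyperParameters` object builds the model with the default value of each hyperparameter; if this returns without an error, the definition is sound.

```python
import keras_tuner

# Build once with default hyperparameter values as a sanity check.
build_model(keras_tuner.HyperParameters())
```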
You can print a summary of the search space:
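Printing the summary requires a tuner instance. The sketch below wraps `build_model` in a `RandomSearch` tuner; the specific settings (`max_trials=3`, `executions_per_trial=2`, and the `my_dir/helloworld` results directory) are illustrative values rather than requirements.

```python
tuner = keras_tuner.RandomSearch(
    hypermodel=build_model,
    objective="val_accuracy",
    max_trials=3,
    executions_per_trial=2,
    overwrite=True,
    directory="my_dir",
    project_name="helloworld",
)

# List every registered hyperparameter and its search range.
tuner.search_space_summary()
```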
Then, start the search for the best hyperparameter configuration. All the arguments passed to `search` are passed to `model.fit()` in each execution. Remember to pass `validation_data` to evaluate the model.
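For a self-contained run, the sketch below fabricates a tiny random dataset whose shapes (28x28 inputs, 10 one-hot classes) match the assumed model above; in practice you would substitute your real training and validation splits.

```python
import numpy as np

# Tiny random dataset, purely for illustration.
x_train = np.random.rand(100, 28, 28, 1)
y_train = keras.utils.to_categorical(np.random.randint(0, 10, 100), num_classes=10)
x_val = np.random.rand(20, 28, 28, 1)
y_val = keras.utils.to_categorical(np.random.randint(0, 10, 20), num_classes=10)
```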
Again, we can do a quick check to see if the code works correctly.
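Here the check is simply a short search over the random data, assuming the tuner and arrays defined above; a couple of lines of the resulting progress output are shown below.

```python
tuner.search(x_train, y_train, epochs=2, validation_data=(x_val, y_val))
```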
1/4 ━━━━━━━━━━━━━━━━━━━━ 0s 279ms/step - accuracy: 0.0000e+00 - loss: 12.2230
1/4 ━━━━━━━━━━━━━━━━━━━━ 0s 276ms/step - accuracy: 0.1250 - loss: 12.0090
Finally, you can run the search with a custom objective instead of a built-in metric name, for example a custom metric tracked during training.
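As a stand-in for a genuinely custom metric, the sketch below recompiles the model with an extra mean-absolute-error metric (a hypothetical, illustrative choice) and tells the tuner to minimize its validation value via `keras_tuner.Objective`.

```python
def build_model_with_extra_metric(hp):
    # Hypothetical variant of `build_model` that also tracks MAE under the
    # name "mae", so its validation value appears in the logs as "val_mae".
    model = build_model(hp)
    model.compile(
        optimizer="adam",
        loss="categorical_crossentropy",
        metrics=["accuracy", keras.metrics.MeanAbsoluteError(name="mae")],
    )
    return model


tuner = keras_tuner.RandomSearch(
    hypermodel=build_model_with_extra_metric,
    # Optimize the custom objective; `direction` says whether lower or
    # higher values are better.
    objective=keras_tuner.Objective("val_mae", direction="min"),
    max_trials=3,
    overwrite=True,
    directory="my_dir",
    project_name="custom_objective",
)
tuner.search(x_train, y_train, epochs=2, validation_data=(x_val, y_val))
```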