Neptune-Keras Tuner Integration¶
What will you get with this integration?¶
Keras Tuner is an open-source hyperparameter optimization framework that enables hyperparameter search on Keras models.
With the Neptune integration, you can:
see charts of logged metrics for every trial,
see the parameters tried at every trial,
see hardware consumption during the search,
log the best parameters after training,
log the hyperparameter search space,
log the Keras Tuner project directory with information for all the trials.
Note
This integration was tested with keras-tuner==1.0.2, neptune-client==0.4.133, and neptune-contrib==0.26.0.
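If you want to confirm that the installed versions match the tested ones, a quick check with the standard library (Python 3.8+) could look like the sketch below; the package names are the ones listed in the note above.

```python
# Optional sanity check of installed package versions.
# importlib.metadata is in the standard library from Python 3.8.
from importlib.metadata import version, PackageNotFoundError

for pkg in ('keras-tuner', 'neptune-client', 'neptune-contrib'):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, 'is not installed')
```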
Quickstart¶
This quickstart will show you how to:
Install the necessary neptune packages
Connect Neptune to your Keras Tuner hyperparameter search code and create the first experiment
Log metrics, parameters, and artifacts from your Keras Tuner sweep to Neptune
Monitor hardware consumption and search performance during a sweep
Explore them in the Neptune UI.
Before you start¶
You have Python 3.x and the following libraries installed:
neptune-client and neptune-contrib. See the neptune-client installation guide.
keras-tuner. See how to install Keras Tuner.
You also need minimal familiarity with Keras Tuner. Have a look at the Keras Tuner guide to get started.
pip install --quiet keras-tuner neptune-client "neptune-contrib[monitoring]"
Step 1: Initialize Neptune¶
Run the code below:
import neptune
neptune.init(api_token='ANONYMOUS', project_qualified_name='shared/keras-tuner-integration')
Tip
You can also use your personal API token. Read more about how to securely set the Neptune API token.
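For example, one common pattern is to read a personal token from the NEPTUNE_API_TOKEN environment variable and fall back to the public anonymous token used in this guide. This is a sketch, assuming you export the variable in your shell:

```python
import os

# Sketch: prefer a personal token from the NEPTUNE_API_TOKEN environment
# variable, falling back to the public 'ANONYMOUS' token from this guide.
api_token = os.getenv('NEPTUNE_API_TOKEN', 'ANONYMOUS')
print(api_token)
```

You can then pass this `api_token` value to `neptune.init()` instead of hard-coding it in your script.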
Step 2: Create an Experiment¶
Run the code below to create a Neptune experiment:
neptune.create_experiment('bayesian-sweep')
This also creates a link to the experiment. Open the link in a new tab. The charts will currently be empty, but keep the window open. You will be able to see live metrics once logging starts.
Step 3: Pass Neptune Logger to Keras Tuner¶
Import NeptuneLogger() from neptunecontrib and pass it to the tuner.
import neptunecontrib.monitoring.kerastuner as npt_utils
from kerastuner.tuners import BayesianOptimization

tuner = BayesianOptimization(
    build_model,
    objective='val_accuracy',
    max_trials=10,
    num_initial_points=3,
    executions_per_trial=3,
    project_name='bayesian-sweep',
    logger=npt_utils.NeptuneLogger())
This will log the following after every trial:
run parameters under the ‘hyperparameters/values’ text log,
loss and all the metrics defined when compiling the Keras model,
hardware consumption (CPU, GPU, and memory) during the search.
Note
You can use NeptuneLogger() with all Keras Tuner tuners: BayesianOptimization, Hyperband, RandomSearch, and Sklearn.
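The search space that your build_model function defines with hp.Int and hp.Choice can be pictured in plain Python. The stdlib sketch below is illustrative only, not keras-tuner code; the parameter names and ranges are hypothetical examples.

```python
import random

# Illustrative stand-in for a keras-tuner search space:
# 'units' mimics hp.Int('units', 32, 256, step=32),
# 'learning_rate' mimics hp.Choice('learning_rate', [1e-2, 1e-3, 1e-4]).
space = {
    'units': list(range(32, 257, 32)),
    'learning_rate': [1e-2, 1e-3, 1e-4],
}

def sample_trial(rng):
    # One trial = one concrete value picked for every hyperparameter.
    return {name: rng.choice(values) for name, values in space.items()}

trial = sample_trial(random.Random(0))
print(trial)
```

In a real sweep the tuner chooses these values with its own strategy (Bayesian optimization here, rather than uniform sampling), and NeptuneLogger() records each trial's chosen values under ‘hyperparameters/values’.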
Step 4: Run the search and monitor it in Neptune¶
Now you can switch to the Neptune tab you opened earlier and watch the optimization live!
tuner.search(x=x, y=y,
epochs=5,
validation_data=(val_x, val_y))
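The x, y, val_x, and val_y above are your own training and validation data. As a toy illustration (hypothetical data, not part of the guide), an 80/20 split producing those four names could look like:

```python
import random

# Hypothetical toy data: 100 samples of 4 features, labels in 0..9.
rng = random.Random(42)
data = [([rng.random() for _ in range(4)], rng.randrange(10))
        for _ in range(100)]

# 80/20 train/validation split, matching the x/y and val_x/val_y names
# used in tuner.search() above.
split = int(len(data) * 0.8)
x, y = zip(*data[:split])
val_x, val_y = zip(*data[split:])
print(len(x), len(val_x))  # 80 20
```

With real Keras models you would typically pass NumPy arrays rather than tuples, but the split logic is the same.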
Check out this example experiment.

Step 5: Log additional sweep information after the sweep¶
Log more information from the Keras Tuner object to Neptune with log_tuner_info():
npt_utils.log_tuner_info(tuner)
This will log:
best score (‘best_score’ metric),
best parameters (‘best_parameters’ property),
score for every run (‘run_score’ metric),
tuner project directory (‘TUNER_PROJECT_NAME’ artifact),
parameter space (‘hyperparameters/space’ text log),
name of the metric/loss used as the objective (‘objective/name’ property),
direction of the metric/loss used as the objective (‘objective/direction’ property),
tuner ID (‘tuner_id’ property),
best trial ID (‘best_trial_id’ property).
Check out this example experiment.

How to ask for help?¶
Please visit the Getting help page; everything regarding support is there.
Other pages you may like¶
You may also find the following pages useful: