tfdf.tuner.RandomSearch

Tuner using random hyperparameter values.

Inherits From: Tuner

The candidate hyper-parameter values can be evaluated independently and in parallel.

Args
num_trials Number of random hyperparameter values to evaluate.
use_predefined_hps If true, the space of hyper-parameters explored by the tuner is configured automatically. In this case, configuring the hyper-parameters manually (e.g. calling "choice(...)" on the tuner) is not necessary.
trial_num_threads Number of threads used to train the models in each trial. This parameter is different from the num_threads parameter of the model constructor, which indicates how many threads to use for the overall training and, possibly, tuning. For example, with trial_num_threads=2 and num_threads=5, 5 models will be trained in parallel during tuning, and each of those models will be trained with 2 threads. Conversely, if you want to run at most 100 threads globally, make sure that trial_num_threads * num_threads does not exceed 100.
trial_maximum_training_duration_seconds Maximum training duration of an individual trial, expressed in seconds. This parameter is different from the maximum_training_duration_seconds parameter of the model constructor, which defines the maximum training and tuning duration.
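
The following is a minimal sketch of how a tuner configured with these arguments is typically passed to a TF-DF Keras model; the dataset path, DataFrame, and label column name are placeholders.

import pandas as pd
import tensorflow_decision_forests as tfdf

# Placeholder dataset: any pandas DataFrame with a "label" column works.
train_df = pd.read_csv("train.csv")
train_ds = tfdf.keras.pd_dataframe_to_tf_dataset(train_df, label="label")

# Random search over the predefined hyper-parameter space: 20 trials,
# each trial training its model with 4 threads.
tuner = tfdf.tuner.RandomSearch(
    num_trials=20,
    use_predefined_hps=True,
    trial_num_threads=4,
)

# The tuner is given to the model constructor; the trials run during fit().
model = tfdf.keras.GradientBoostedTreesModel(tuner=tuner)
model.fit(train_ds)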

Methods

choice

Adds a hyperparameter with a list of possible values.

Args
key Name of the hyper-parameter.
values List of possible values for the hyperparameter.
merge If false (default), raises an error if the hyper-parameter already exists. If true, adds the values to the hyper-parameter if it already exists.

Returns
The conditional SearchSpace corresponding to the values in "values".
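
As a sketch of configuring a search space manually (the hyper-parameter names and values below are illustrative), independent and conditional hyper-parameters can be combined: the SearchSpace returned by choice(...) accepts further choice(...) calls that only apply when the parent value is selected.

import tensorflow_decision_forests as tfdf

tuner = tfdf.tuner.RandomSearch(num_trials=50)

# Independent hyper-parameters.
tuner.choice("min_examples", [2, 5, 7, 10])
tuner.choice("categorical_algorithm", ["CART", "RANDOM"])

# Conditional hyper-parameters: "max_depth" only applies when
# growing_strategy=LOCAL, and "max_num_nodes" only when
# growing_strategy=BEST_FIRST_GLOBAL. merge=True extends the existing
# "growing_strategy" hyper-parameter instead of raising an error.
local_space = tuner.choice("growing_strategy", ["LOCAL"])
local_space.choice("max_depth", [3, 4, 5, 6, 8])

global_space = tuner.choice("growing_strategy", ["BEST_FIRST_GLOBAL"], merge=True)
global_space.choice("max_num_nodes", [16, 32, 64, 128, 256])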

set_base_learner

Sets the base learner key.
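
A short sketch, assuming the method takes a YDF learner key as a string; in typical usage the Keras model wrapper sets the base learner automatically when the tuner is passed to its constructor.

import tensorflow_decision_forests as tfdf

tuner = tfdf.tuner.RandomSearch(num_trials=20)
# Hypothetical manual call; normally done for you by the model wrapper.
tuner.set_base_learner("GRADIENT_BOOSTED_TREES")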

train_config

YDF training configuration for the hyper-parameter optimizer.
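
A minimal sketch, assuming train_config() takes no arguments and returns the YDF training configuration proto built from the tuner settings; printing it can help verify the configured search space.

import tensorflow_decision_forests as tfdf

tuner = tfdf.tuner.RandomSearch(num_trials=20, use_predefined_hps=True)
# Inspect the training configuration generated from the tuner settings.
config = tuner.train_config()
print(config)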