

A wrapper for tfc.run that allows running concurrent CloudTuner jobs.

This method takes the same parameters as tfc.run and allows duplicating a job multiple times to enable running parallel tuning jobs with CloudTuner. All jobs are identical except that each has a unique KERASTUNER_TUNER_ID environment variable set in the cluster to enable tuning-job concurrency. This feature is only supported in Notebooks and Colab.
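Inside each submitted job, the tuner worker can read its identity from the environment. A minimal sketch of that read (the example ID values are illustrative; the actual values are assigned by the wrapper):

```python
import os

# Each cloned job receives a distinct KERASTUNER_TUNER_ID (e.g. "tuner0",
# "tuner1", ...). Workers with different IDs coordinate through the same
# Keras Tuner oracle, so trials are distributed rather than duplicated.
tuner_id = os.environ.get("KERASTUNER_TUNER_ID", "tuner0")  # fallback for local runs
print(tuner_id)
```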

Args:
num_jobs: Number of concurrent jobs to be submitted to AI Platform Training. Note that these are clones of the same job that execute independently. Setting this value to 1 is identical to calling tfc.run directly.
**kwargs: Keyword arguments passed through to tfc.run.

Returns:
A dictionary with two keys: 'job_ids', a list of training job IDs, and 'docker_image', the Docker image generated for the training job.
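The clone-and-submit behavior and the shape of the return value can be illustrated with a toy sketch (run_concurrent_jobs, submit_job, and the image URI are hypothetical stand-ins, not the library's internals):

```python
from typing import Any, Callable, Dict, List


def run_concurrent_jobs(
    num_jobs: int,
    submit_job: Callable[[Dict[str, str]], str],
    docker_image: str = "gcr.io/example-project/tuner:latest",  # hypothetical URI
) -> Dict[str, Any]:
    """Submit num_jobs identical jobs, each with a unique
    KERASTUNER_TUNER_ID, and collect the resulting job IDs."""
    job_ids: List[str] = []
    for i in range(num_jobs):
        # The only difference between the clones is this environment variable.
        env = {"KERASTUNER_TUNER_ID": f"tuner{i}"}
        job_ids.append(submit_job(env))
    return {"job_ids": job_ids, "docker_image": docker_image}
```

For example, with a fake submitter that echoes the tuner ID, `run_concurrent_jobs(3, lambda env: "job_" + env["KERASTUNER_TUNER_ID"])` yields `{'job_ids': ['job_tuner0', 'job_tuner1', 'job_tuner2'], 'docker_image': 'gcr.io/example-project/tuner:latest'}`.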