tf.test.is_gpu_available

tf.test.is_gpu_available(
    cuda_only=False,
    min_cuda_compute_capability=None
)

Defined in tensorflow/python/framework/test_util.py.

See the guide: Testing > Utilities

Returns whether TensorFlow can access a GPU.

Args:

  • cuda_only: limit the search to CUDA GPUs.
  • min_cuda_compute_capability: a (major, minor) pair indicating the minimum CUDA compute capability required, or None if no requirement.

Returns:

True iff a GPU device of the requested kind is available.
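A minimal sketch of both calling conventions. The compute-capability value (3, 5) below is an arbitrary illustration, not a requirement of the API; the function returns a plain Python bool either way.

```python
import tensorflow as tf

# Check whether TensorFlow can access any GPU at all.
any_gpu = tf.test.is_gpu_available()

# Restrict the check to CUDA GPUs with compute capability >= 3.5
# (the (3, 5) threshold here is just an example value).
cuda_gpu = tf.test.is_gpu_available(
    cuda_only=True,
    min_cuda_compute_capability=(3, 5),
)

print(any_gpu, cuda_gpu)
```

Note that in TensorFlow 2.x this function is deprecated; `tf.config.list_physical_devices('GPU')` is the recommended way to enumerate available GPUs.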