Returns an execution context backed by C++ runtime.
```
tff.backends.native.create_async_local_cpp_execution_context(
    default_num_clients: int = 0,
    max_concurrent_computation_calls: int = -1,
    stream_structs: bool = False
) -> ...
```
This execution context starts a C++ worker assumed to be at path `binary_path`, serving on `port`, and constructs a Python remote execution context to talk to this worker.
| Raises |
| :--- |
| If an internal C++ worker binary cannot be found. |