

Constructs an executor factory to execute computations locally.

Args

default_num_clients The number of clients to run by default if cardinality cannot be inferred from arguments.
max_fanout The maximum fanout at any point in the aggregation hierarchy. If num_clients > max_fanout, the constructed executor stack will consist of multiple levels of aggregators. The height of the stack will be on the order of log(num_clients) / log(max_fanout).
clients_per_thread Integer number of clients that each of TFF's threads runs in sequence. Increasing clients_per_thread therefore reduces the concurrency of the TFF runtime, which can be useful if client work is very lightweight, or if models are so large that multiple copies cannot fit in memory.
server_tf_device A tf.config.LogicalDevice on which to place the server and any other computation without an explicit TFF placement.
client_tf_devices List or tuple of tf.config.LogicalDevice on which to place clients for simulation; these can be accelerators returned by tf.config.list_logical_devices().
reference_resolving_clients Boolean indicating whether executors representing clients must be able to handle unplaced TFF lambdas.
support_sequence_ops Boolean indicating whether this executor supports sequence ops (currently False by default).
leaf_executor_fn A function that constructs leaf-level executors. Default is the eager TF executor (other possible options: XLA, IREE). Should accept the device keyword argument if the executor is to be configured with explicitly chosen devices.
local_computation_factory An instance of LocalComputationFactory used to construct the local computations that parameterize certain federated operators (e.g., tff.federated_sum). Defaults to a factory that generates TensorFlow computations.
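The relationship between max_fanout and the height of the aggregation hierarchy described above can be illustrated with a small sketch in plain Python. The helper name is mine, not part of the TFF API; it only mirrors the stated log(num_clients) / log(max_fanout) estimate.

```python
import math

def aggregation_depth(num_clients: int, max_fanout: int) -> int:
    """Estimate the levels of aggregators needed so that no node in the
    hierarchy has more than max_fanout children.

    Mirrors the documented estimate: height on the order of
    log(num_clients) / log(max_fanout).
    """
    if max_fanout < 2:
        raise ValueError("max_fanout must be at least 2.")
    if num_clients <= max_fanout:
        # A single aggregator can fan in all clients directly.
        return 1
    return math.ceil(math.log(num_clients) / math.log(max_fanout))

# With 10,000 clients and max_fanout=100, two levels suffice:
# 100 aggregators of 100 clients each, plus one root over those 100.
```

This is only a back-of-the-envelope model of the executor stack's shape; the actual stack construction is handled internally by the factory.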

Returns

An instance of executor_factory.ExecutorFactory encapsulating the executor construction logic specified above.

Raises

ValueError: If the number of clients is specified and is less than one.
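A minimal usage sketch follows, assuming a TFF release that exposes tff.framework.local_executor_factory, tff.framework.ExecutionContext, and tff.framework.set_default_context (these names have moved between releases, so treat this as illustrative rather than definitive):

```python
import tensorflow_federated as tff

# Build a factory that runs 10 clients locally whenever the number of
# clients cannot be inferred from a computation's arguments.
factory = tff.framework.local_executor_factory(default_num_clients=10)

# Install the resulting executors as the default execution context, so
# that subsequently invoked federated computations run against them.
context = tff.framework.ExecutionContext(executor_fn=factory)
tff.framework.set_default_context(context)
```

In many workflows the same effect is achieved through a higher-level backend helper rather than by wiring the context manually; the factory form shown here is useful when you need to customize parameters such as max_fanout or client_tf_devices.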