tf.estimator.BaselineRegressor

A regressor that can establish a simple baseline.

Inherits From: Estimator

This regressor ignores feature values and will learn to predict the average value of each label.

Example:


# Build BaselineRegressor
regressor = tf.estimator.BaselineRegressor()

# Input builders
def input_fn_train():
  # Returns a tf.data.Dataset of (x, y) tuples, where y is the regression
  # target (label) value.
  pass

def input_fn_eval():
  # Returns a tf.data.Dataset of (x, y) tuples, where y is the regression
  # target (label) value.
  pass

# Fit model.
regressor.train(input_fn=input_fn_train)

# Evaluate squared loss between the eval targets and the predicted values.
loss = regressor.evaluate(input_fn=input_fn_eval)["loss"]

# predict outputs the mean label value seen during training.
predictions = regressor.predict(input_fn=input_fn_eval)
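
For concreteness, here is a minimal self-contained sketch of the pattern above; the feature name "x", the toy data, and the model_dir path are illustrative choices, not required by the API:

import numpy as np
import tensorflow as tf

# Toy regression data; the baseline model ignores the feature values.
features = {"x": np.array([[1.0], [2.0], [3.0], [4.0]], dtype=np.float32)}
labels = np.array([[10.0], [20.0], [30.0], [40.0]], dtype=np.float32)

def input_fn_train():
  # (features, labels) tuples; repeat so training runs for several steps.
  return tf.data.Dataset.from_tensor_slices((features, labels)).batch(2).repeat(100)

def input_fn_predict():
  # Labels are not needed for prediction.
  return tf.data.Dataset.from_tensor_slices(features).batch(2)

baseline = tf.estimator.BaselineRegressor(model_dir="/tmp/baseline_example")
baseline.train(input_fn=input_fn_train)

# Each prediction approaches the mean of the training labels (25.0 here).
for pred in baseline.predict(input_fn=input_fn_predict):
  print(pred["predictions"])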

The input to train and evaluate should have the following features, otherwise there will be a KeyError:

  • if weight_column is not None, a feature with key=weight_column whose value is a Tensor.
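
For example, if the constructor is called with weight_column="w", the features returned by input_fn must include a "w" entry holding per-example weights. A minimal sketch (the key "w" and the values below are illustrative assumptions, not API requirements):

import tensorflow as tf

regressor = tf.estimator.BaselineRegressor(weight_column="w")

def input_fn_train():
  # "w" carries per-example weights; examples with larger weights pull the
  # learned mean toward their labels.
  features = {"x": [[1.0], [2.0]], "w": [[1.0], [3.0]]}
  labels = [[10.0], [20.0]]
  return tf.data.Dataset.from_tensor_slices((features, labels)).repeat(100).batch(2)

regressor.train(input_fn=input_fn_train)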

Args:

  • model_dir: Directory in which to save model parameters, the graph, and so on. This can also be used to load checkpoints from the directory into an estimator to continue training a previously saved model.
  • label_dimension: Number of regression targets per example. This is the size of the last dimension of the labels and logits Tensor objects (typically, these have shape [batch_size, label_dimension]).
  • weight_column: A string or a _NumericColumn created by tf.feature_column.numeric_column defining the feature column representing weights. It is multiplied by the loss of the example.
  • optimizer: String, tf.keras.optimizers.* object, or callable that creates the optimizer to use for training. If not specified, Ftrl is used as the default optimizer.
  • config: A RunConfig object to configure the runtime settings.
  • loss_reduction: One of tf.losses.Reduction except NONE. Describes how to reduce training loss over the batch. Defaults to SUM_OVER_BATCH_SIZE.
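
As a concrete illustration of these arguments, the sketch below constructs a fully specified regressor; the directory path and the specific values are placeholders, not defaults:

import tensorflow as tf

regressor = tf.estimator.BaselineRegressor(
    model_dir="/tmp/baseline_model",   # checkpoints and the graph are written here
    label_dimension=2,                 # labels/logits have shape [batch_size, 2]
    weight_column="w",                 # per-example weights read from features["w"]
    optimizer="Ftrl",                  # string name, optimizer object, or callable
    config=tf.estimator.RunConfig(save_checkpoints_steps=100),
    loss_reduction=tf.losses.Reduction.SUM_OVER_BATCH_SIZE)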

Eager Compatibility

Estimators can be used while eager execution is enabled. Note that input_fn and all hooks are executed inside a graph context.