Acquisition function for reducing over batch dimensions.
Inherits From: AcquisitionFunction
tfp.experimental.bayesopt.acquisition.MCMCReducer(
    predictive_distribution,
    observations,
    seed=None,
    acquisition_class=None,
    reduce_dims=None,
    **acquisition_kwargs
)
`MCMCReducer` evaluates a base acquisition function and takes the mean of the function values over the dimensions indicated by `reduce_dims`. `MCMCReducer` is useful, for example, for marginalizing over an MCMC sample of GP kernel hyperparameters.
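The reduction itself is just a mean over the indicated batch axes. A minimal NumPy sketch of that semantics (the `mcmc_reduce` helper and the shapes below are illustrative, not part of the TFP API):

```python
import numpy as np

def mcmc_reduce(base_acquisition_values, reduce_dims=0):
    # `base_acquisition_values` has shape
    # [num_hyperparameter_samples, num_points]; averaging over
    # `reduce_dims` marginalizes the MCMC hyperparameter batch.
    return np.mean(base_acquisition_values, axis=reduce_dims)

# 32 hyperparameter samples, 6 candidate points.
values = np.random.uniform(size=[32, 6])
reduced = mcmc_reduce(values, reduce_dims=0)  # Has shape [6].
```

Each of the 6 reduced values is the base acquisition value at that candidate point, averaged over the 32 hyperparameter samples.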
Examples
Build and evaluate an acquisition function that computes Gaussian Process Expected Improvement and then marginalizes over the leftmost batch dimension.
import numpy as np
import tensorflow_probability as tfp

tfd = tfp.distributions
tfpk = tfp.math.psd_kernels
tfp_acq = tfp.experimental.bayesopt.acquisition

# Sample 10 20-dimensional index points and associated observations.
index_points = np.random.uniform(size=[10, 20])
observations = np.random.uniform(size=[10])

# The kernel and GP have batch shape [32], representing a sample of
# hyperparameters that we want to marginalize over.
kernel_amplitudes = np.random.uniform(size=[32])

# Build a (batched) Gaussian Process regression model.
dist = tfd.GaussianProcessRegressionModel(
    kernel=tfpk.MaternFiveHalves(amplitude=kernel_amplitudes),
    observation_index_points=index_points,
    observations=observations)

# Define an `MCMCReducer` with GP Expected Improvement.
mcmc_ei = tfp_acq.MCMCReducer(
    predictive_distribution=dist,
    observations=observations,
    acquisition_class=tfp_acq.GaussianProcessExpectedImprovement,
    reduce_dims=0)

# Evaluate the acquisition function at a new set of index points,
# marginalizing over the hyperparameter batch.
pred_index_points = np.random.uniform(size=[6, 20])
acq_fn_vals = mcmc_ei(pred_index_points)  # Has shape [6].
Methods
__call__
__call__(
    **kwargs
)
Call self as a function.