Takes an Edward probabilistic program and returns its log joint function.
Args:
  model: Python callable which executes the generative process of a
    computable probability distribution using `ed.RandomVariable`s.
Returns:
  A log-joint probability function. Its inputs are `model`'s original inputs
  and random variables which appear during the program execution. Its output
  is a scalar tf.Tensor.
Below we define Bayesian logistic regression as an Edward program,
representing the model's generative process. We apply `make_log_joint_fn` in
order to represent the model in terms of its joint probability function.
```python
import tensorflow as tf
from tensorflow_probability import edward2 as ed

def logistic_regression(features):
  # Standard Normal prior over one coefficient per feature column.
  coeffs = ed.Normal(loc=0., scale=1.,
                     sample_shape=features.shape[1], name="coeffs")
  # Bernoulli likelihood with logits given by the linear predictor.
  outcomes = ed.Bernoulli(logits=tf.tensordot(features, coeffs, [[1], [0]]),
                          name="outcomes")
  return outcomes

log_joint = ed.make_log_joint_fn(logistic_regression)

features = tf.random_normal([3, 2])
coeffs_value = tf.random_normal([2])
outcomes_value = tf.round(tf.random_uniform([3]))
output = log_joint(features, coeffs=coeffs_value, outcomes=outcomes_value)
```
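To make concrete what the returned function computes, here is a minimal NumPy analogue for the logistic-regression program above: the log joint is the sum of the standard Normal log-density of `coeffs` and the Bernoulli log-likelihood of `outcomes` given logits `features @ coeffs`. The helper `log_joint_numpy` and the sample inputs are illustrative, not part of Edward2.

```python
import numpy as np

def log_joint_numpy(features, coeffs, outcomes):
    # log p(coeffs): sum of independent Normal(0, 1) log-densities.
    log_prior = np.sum(-0.5 * coeffs**2 - 0.5 * np.log(2.0 * np.pi))
    # log p(outcomes | features, coeffs): Bernoulli log-likelihood,
    # using log(1 + exp(l)) = logaddexp(0, l) for numerical stability.
    logits = features @ coeffs
    log_lik = np.sum(outcomes * logits - np.logaddexp(0.0, logits))
    return log_prior + log_lik

features = np.array([[1.0, -0.5], [0.3, 0.8], [-1.2, 0.4]])
coeffs = np.array([0.5, -0.25])
outcomes = np.array([1.0, 0.0, 1.0])
print(log_joint_numpy(features, coeffs, outcomes))  # scalar log joint
```

As with the Tensor returned by `log_joint`, the result is a single scalar: the log probability of one complete execution of the program with the given values bound to its named random variables.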