Replaces `tf.Variable` initializers so they load from a checkpoint file.
```python
tf.compat.v1.train.init_from_checkpoint(
    ckpt_dir_or_file, assignment_map
)
```
Values are not loaded immediately, but when the initializer is run
(typically by running a `tf.compat.v1.global_variables_initializer` op).
The assignment map supports the following syntax:

* `'checkpoint_scope_name/': 'scope_name/'` - will load all variables in the current `scope_name` from `checkpoint_scope_name` with matching tensor names.
* `'checkpoint_scope_name/some_other_variable': 'scope_name/variable_name'` - will initialize the `scope_name/variable_name` variable from `checkpoint_scope_name/some_other_variable`.
* `'scope_variable_name': variable` - will initialize the given `tf.Variable` object with the tensor 'scope_variable_name' from the checkpoint.
* `'scope_variable_name': list(variable)` - will initialize a list of partitioned variables with the tensor 'scope_variable_name' from the checkpoint.
* `'/': 'scope_name/'` - will load all variables in the current `scope_name` from the checkpoint's root (i.e. no scope).
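To make the scope-prefix semantics above concrete, here is a small illustrative sketch (not TensorFlow's actual implementation) of how such a map could resolve checkpoint tensor names to variable names; the `resolve_assignment` helper and all names are hypothetical:

```python
def resolve_assignment(assignment_map, checkpoint_tensor_names):
    """Map each checkpoint tensor name to the variable name it initializes."""
    resolved = {}
    for ckpt_key, target in assignment_map.items():
        if ckpt_key.endswith('/') or ckpt_key == '/':
            # Scope-to-scope entry: every tensor under the prefix is matched,
            # and the prefix is swapped for the target scope. '/' means the
            # checkpoint root, i.e. an empty prefix.
            prefix = '' if ckpt_key == '/' else ckpt_key
            for name in checkpoint_tensor_names:
                if name.startswith(prefix):
                    resolved[name] = target + name[len(prefix):]
        else:
            # One-to-one entry: a single tensor initializes a single variable.
            resolved[ckpt_key] = target
    return resolved

ckpt_names = ['old_scope_1/var1', 'old_scope_1/var2', 'old_scope_2/var3']
print(resolve_assignment({'old_scope_1/': 'new_scope_1/'}, ckpt_names))
# {'old_scope_1/var1': 'new_scope_1/var1', 'old_scope_1/var2': 'new_scope_1/var2'}
```

Note that `old_scope_2/var3` is left untouched, since no entry in the map covers it.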
Supports loading into partitioned variables, which are represented as `'<variable>/part_<part #>'`.
```python
# Say, '/tmp/model.ckpt' has the following tensors:
#  -- name='old_scope_1/var1', shape=[20, 2]
#  -- name='old_scope_1/var2', shape=[50, 4]
#  -- name='old_scope_2/var3', shape=[100, 100]

# Create the new model's variables.
with tf.compat.v1.variable_scope('new_scope_1'):
  var1 = tf.compat.v1.get_variable('var1', shape=[20, 2],
                                   initializer=tf.compat.v1.zeros_initializer())
with tf.compat.v1.variable_scope('new_scope_2'):
  var2 = tf.compat.v1.get_variable('var2', shape=[50, 4],
                                   initializer=tf.compat.v1.zeros_initializer())

# Specify which variables to initialize from the checkpoint.
tf.compat.v1.train.init_from_checkpoint('/tmp/model.ckpt', {
    'old_scope_1/var1': 'new_scope_1/var1',
    'old_scope_1/var2': 'new_scope_2/var2',
})
```