'checkpoint_scope_name/': 'scope_name/' - will load all variables in
current scope_name from checkpoint_scope_name with matching tensor
names.
'checkpoint_scope_name/some_other_variable': 'scope_name/variable_name' -
will initialize scope_name/variable_name variable
from checkpoint_scope_name/some_other_variable.
'scope_variable_name': variable - will initialize the given tf.Variable
object with tensor 'scope_variable_name' from the checkpoint.
'scope_variable_name': list(variable) - will initialize a list of
partitioned variables with tensor 'scope_variable_name' from the checkpoint.
'/': 'scope_name/' - will load all variables in current scope_name from
the checkpoint's root (i.e. no scope).
Supports loading into partitioned variables, which are represented as
'<variable>/part_<part #>'.
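The scope-prefix form of the assignment map can be illustrated with a small
pure-Python sketch (the helper name `remap_scope` is hypothetical; this is
not TensorFlow's actual implementation): each checkpoint tensor whose name
starts with the checkpoint scope prefix is paired with the current variable
that has the same suffix under the new scope.

```python
def remap_scope(ckpt_tensor_names, ckpt_scope, current_scope):
    """Sketch: resolve an assignment-map entry {ckpt_scope: current_scope}
    into a per-tensor mapping (hypothetical helper, for illustration only)."""
    mapping = {}
    for name in ckpt_tensor_names:
        # Only tensors under the checkpoint scope prefix participate.
        if name.startswith(ckpt_scope):
            mapping[name] = current_scope + name[len(ckpt_scope):]
    return mapping

# With a checkpoint containing 'old_scope_1/var1' and 'old_scope_1/var2',
# the entry {'old_scope_1/': 'new_scope_1/'} pairs them with
# 'new_scope_1/var1' and 'new_scope_1/var2'; 'old_scope_2/var3' is skipped.
print(remap_scope(
    ['old_scope_1/var1', 'old_scope_1/var2', 'old_scope_2/var3'],
    'old_scope_1/', 'new_scope_1/'))
```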
Example:
# Say, '/tmp/model.ckpt' has the following tensors:
# -- name='old_scope_1/var1', shape=[20, 2]
# -- name='old_scope_1/var2', shape=[50, 4]
# -- name='old_scope_2/var3', shape=[100, 100]

# Create new model's variables
with tf.compat.v1.variable_scope('new_scope_1'):
  var1 = tf.compat.v1.get_variable('var1', shape=[20, 2],
                                   initializer=tf.compat.v1.zeros_initializer())
with tf.compat.v1.variable_scope('new_scope_2'):
  var2 = tf.compat.v1.get_variable('var2', shape=[50, 4],
                                   initializer=tf.compat.v1.zeros_initializer())
  # Partition into 5 variables along the first axis.
  var3 = tf.compat.v1.get_variable(name='var3', shape=[100, 100],
                                   initializer=tf.compat.v1.zeros_initializer(),
                                   partitioner=lambda shape, dtype: [5, 1])

# Initialize all variables in `new_scope_1` from `old_scope_1`.
init_from_checkpoint('/tmp/model.ckpt', {'old_scope_1/': 'new_scope_1'})

# Use names to specify which variables to initialize from checkpoint.
init_from_checkpoint('/tmp/model.ckpt',
                     {'old_scope_1/var1': 'new_scope_1/var1',
                      'old_scope_1/var2': 'new_scope_2/var2'})

# Or use tf.Variable objects to identify what to initialize.
init_from_checkpoint('/tmp/model.ckpt',
                     {'old_scope_1/var1': var1,
                      'old_scope_1/var2': var2})

# Initialize partitioned variables using variable's name
init_from_checkpoint('/tmp/model.ckpt',
                     {'old_scope_2/var3': 'new_scope_2/var3'})

# Or specify the list of tf.Variable objects.
init_from_checkpoint('/tmp/model.ckpt',
                     {'old_scope_2/var3': var3._get_variable_list()})
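As a rough illustration of the part naming used for partitioned variables
('<variable>/part_<part #>' above), the helper below is hypothetical and not
part of the TensorFlow API; it merely enumerates the per-part tensor names a
partitioned variable would occupy in a checkpoint.

```python
def part_names(base_name, num_parts):
    """Sketch: list the per-part tensor names for a partitioned variable,
    following the '<variable>/part_<part #>' convention described above."""
    return ['%s/part_%d' % (base_name, i) for i in range(num_parts)]

# var3 above is partitioned into 5 pieces along the first axis, so its
# parts would be stored under:
print(part_names('new_scope_2/var3', 5))
# ['new_scope_2/var3/part_0', 'new_scope_2/var3/part_1',
#  'new_scope_2/var3/part_2', 'new_scope_2/var3/part_3',
#  'new_scope_2/var3/part_4']
```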
Args
  ckpt_dir_or_file: Directory with checkpoint file(s), or path to a
    checkpoint.
  assignment_map: Dict, where keys are names of variables in the
    checkpoint and values are current variables or names of current
    variables (in the default graph).
Raises
  ValueError: If variables are missing from the current graph, or if the
    checkpoint or the requested tensors in the checkpoint are missing.
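The ValueError behaviour can be sketched in plain Python (the helper
`check_assignment_map` is hypothetical, not the library's code): before any
assignment is wired up, each non-scope checkpoint key in the map must name a
tensor that actually exists in the checkpoint.

```python
def check_assignment_map(assignment_map, ckpt_tensor_names):
    """Sketch: raise ValueError when a checkpoint key in the map names a
    tensor the checkpoint does not contain (illustration only)."""
    for ckpt_name in assignment_map:
        # Scope entries ('some_scope/' or '/') are prefixes, not tensors.
        if ckpt_name == '/' or ckpt_name.endswith('/'):
            continue
        if ckpt_name not in ckpt_tensor_names:
            raise ValueError(
                'Tensor %s is not found in checkpoint' % ckpt_name)

# All keys present: passes silently.
check_assignment_map({'old_scope_1/var1': 'new_scope_1/var1'},
                     ['old_scope_1/var1', 'old_scope_1/var2'])
```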
tf.train.init_from_checkpoint(
    ckpt_dir_or_file, assignment_map
)

tf.train.init_from_checkpoint replaces tf.Variable initializers so they
load from a checkpoint file. Values are not loaded immediately, but when
the initializer is run (typically by running a
tf.compat.v1.global_variables_initializer op).

Note: This overrides the default initialization ops of the specified
variables and redefines their dtype.

Compat alias for migration: tf.compat.v1.train.init_from_checkpoint

View source on GitHub:
https://github.com/tensorflow/tensorflow/blob/v1.15.0/tensorflow/python/training/checkpoint_utils.py#L203-L291

Last updated 2020-10-01 UTC.