tf.contrib.framework.init_from_checkpoint

Using assignment map initializes current variables with loaded tensors.

tf.contrib.framework.init_from_checkpoint(
    checkpoint_dir, assignment_map
)

Note: This overrides the default initialization ops of the specified
variables and redefines their dtype.

Assignment map supports the following syntax:

'checkpoint_scope_name/': 'scope_name/' - will load all variables in the
current scope_name from checkpoint_scope_name with matching variable names.
'checkpoint_scope_name/some_other_variable': 'scope_name/variable_name' -
will initialize the scope_name/variable_name variable from
checkpoint_scope_name/some_other_variable.
'scope_variable_name': variable - will initialize the given tf.Variable
object with the variable from the checkpoint.
'scope_variable_name': list(variable) - will initialize a list of
partitioned variables with the variable from the checkpoint.
'/': 'scope_name/' - will load all variables in the current scope_name from
the checkpoint's root (i.e. no scope).
Supports loading into partitioned variables, which are represented as
'<variable>/part_<part #>'.
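To make the scope-prefix and exact-name rules above concrete, here is a minimal pure-Python sketch (no TensorFlow required) of how an assignment map could resolve checkpoint variable names to current-graph names. `resolve_assignment_map` is a hypothetical helper written for illustration; it is not part of the TensorFlow API, and it simplifies the real matching logic.

```python
# Hypothetical sketch of assignment-map resolution, for illustration only.
def resolve_assignment_map(assignment_map, checkpoint_vars):
    """Return the {checkpoint_name: graph_name} pairs implied by the map.

    assignment_map: dict using the 'ckpt_scope/': 'scope/' (prefix) or
        'ckpt_name': 'graph_name' (exact) forms described above.
    checkpoint_vars: iterable of variable names present in the checkpoint.
    """
    resolved = {}
    for ckpt_key, graph_key in assignment_map.items():
        if ckpt_key.endswith('/'):
            # Scope form: remap every checkpoint variable under the prefix;
            # '/' means the checkpoint's root (empty prefix).
            prefix = '' if ckpt_key == '/' else ckpt_key
            for name in checkpoint_vars:
                if name.startswith(prefix):
                    resolved[name] = graph_key + name[len(prefix):]
        else:
            # Exact form: one checkpoint tensor -> one graph variable.
            resolved[ckpt_key] = graph_key
    return resolved

ckpt = ['test/my_var', 'test2/my_var', 'part_var']
print(resolve_assignment_map({'test/': 'some_scope/'}, ckpt))
# {'test/my_var': 'some_scope/my_var'}
print(resolve_assignment_map({'part_var': 'some_var_from_ckpt'}, ckpt))
# {'part_var': 'some_var_from_ckpt'}
```

Note that `'test/'` matches only names under that exact scope (`test2/my_var` is untouched), which mirrors the prefix semantics described above.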
Example:

# Create variables.
with tf.compat.v1.variable_scope('test'):
  m = tf.compat.v1.get_variable('my_var')
with tf.compat.v1.variable_scope('test2'):
  var2 = tf.compat.v1.get_variable('my_var')
var3 = tf.compat.v1.get_variable(name="my1", shape=[100, 100],
                                 partitioner=lambda shape, dtype: [5, 1])
...
# Specify which variables to initialize from checkpoint.
init_from_checkpoint(checkpoint_dir, {
    'some_var': 'test/my_var',
    'some_scope/': 'test2/'})
...
# Or use `Variable` objects to identify what to initialize.
init_from_checkpoint(checkpoint_dir, {
    'some_scope/var2': var2,
})
# Initialize partitioned variables.
init_from_checkpoint(checkpoint_dir, {
    'some_var_from_ckpt': 'part_var',
})
# Or specify the list of `Variable` objects.
init_from_checkpoint(checkpoint_dir, {
    'some_var_from_ckpt': var3._get_variable_list(),
})
...
# Initialize variables as usual.
session.run(tf.compat.v1.global_variables_initializer())
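The '&lt;variable&gt;/part_&lt;part #&gt;' naming convention for partitioned variables can be illustrated with a short pure-Python sketch. `partition_names` is a hypothetical helper, not a TensorFlow function; the `5` corresponds to the `[5, 1]` partitioner used for `var3` in the example above, which splits the variable into five parts along axis 0.

```python
# Hypothetical sketch of partitioned-variable part naming, for illustration.
def partition_names(var_name, num_parts):
    """List the per-part names ('<variable>/part_<part #>') of a
    partitioned variable with num_parts slices."""
    return ['%s/part_%d' % (var_name, i) for i in range(num_parts)]

print(partition_names('my1', 5))
# ['my1/part_0', 'my1/part_1', 'my1/part_2', 'my1/part_3', 'my1/part_4']
```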
Args
checkpoint_dir
Directory with checkpoint files or path to a checkpoint.
assignment_map
Dict, where keys are names of the variables in the
checkpoint and values are current variables or names of current variables
(in the default graph).
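The function raises tf.errors.OpError when checkpoints or tensors in checkpoints are missing, and ValueError when variables are missing in the current graph. The sketch below mimics that error split in pure Python; `check_assignment_map` is a hypothetical illustration, not the real implementation, and it only checks the exact-name form of the map.

```python
# Hypothetical sketch of the two failure modes, for illustration only.
def check_assignment_map(assignment_map, checkpoint_vars, graph_vars):
    """Validate exact-name entries of an assignment map.

    checkpoint_vars: set of tensor names present in the checkpoint.
    graph_vars: set of variable names present in the current graph.
    """
    for ckpt_name, graph_name in assignment_map.items():
        if ckpt_name.endswith('/'):
            continue  # scope-prefix entries are matched lazily; skipped here
        if ckpt_name not in checkpoint_vars:
            # init_from_checkpoint raises tf.errors.OpError in this case.
            raise IOError('Tensor %s not found in checkpoint' % ckpt_name)
        if graph_name not in graph_vars:
            # init_from_checkpoint raises ValueError in this case.
            raise ValueError('Variable %s not found in graph' % graph_name)

check_assignment_map({'test/my_var': 'some_var'},
                     {'test/my_var'}, {'some_var'})  # passes silently
```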
Raises
tf.errors.OpError
If checkpoints or tensors in checkpoints are missing.
ValueError
If variables are missing in the current graph.

View source on GitHub: https://github.com/tensorflow/tensorflow/blob/v1.15.0/tensorflow/contrib/framework/python/framework/checkpoint_utils.py#L154-L302
Last updated 2020-10-01 UTC.