tf.contrib.layers.bow_encoder
Maps a sequence of symbols to a vector per example by averaging embeddings.
tf.contrib.layers.bow_encoder(
ids, vocab_size, embed_dim, sparse_lookup=True, initializer=None,
regularizer=None, trainable=True, scope=None, reuse=None
)
Args

  ids: [batch_size, doc_length] Tensor or SparseTensor of type int32 or int64 with symbol ids.
  vocab_size: Integer number of symbols in the vocabulary.
  embed_dim: Integer number of dimensions for the embedding matrix.
  sparse_lookup: bool; if True, converts ids to a SparseTensor and performs a sparse embedding lookup. This is usually faster, but not desirable if padding tokens should have an embedding. Empty rows are assigned a special embedding.
  initializer: An initializer for the embeddings; if None, the default for the current scope is used.
  regularizer: Optional regularizer for the embeddings.
  trainable: If True, also adds the variables to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
  scope: Optional string specifying the variable scope for the op; required if reuse=True.
  reuse: If True, variables inside the op will be reused.

Returns

  Encoding Tensor of shape [batch_size, embed_dim] produced by averaging embeddings.

Raises

  ValueError: If embed_dim or vocab_size are not specified.
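A minimal usage sketch, assuming a TensorFlow 1.x environment (tf.contrib was removed in TensorFlow 2.x); the vocabulary size, embedding dimension, scope name, and token ids below are illustrative, not part of the API:

```python
import tensorflow as tf  # requires TensorFlow 1.x; tf.contrib is unavailable in 2.x

# Batch of two "documents", each padded to doc_length = 4.
ids = tf.constant([[3, 7, 7, 2],
                   [5, 1, 0, 0]], dtype=tf.int64)

# Encode each document as the average of its token embeddings.
# Shape of `encoding` is [batch_size, embed_dim] = [2, 16].
encoding = tf.contrib.layers.bow_encoder(
    ids, vocab_size=100, embed_dim=16, scope="bow")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(encoding).shape)  # (2, 16)
```

With sparse_lookup=False, the result is roughly equivalent to looking up a [vocab_size, embed_dim] embedding matrix for every id and taking tf.reduce_mean over the doc_length axis, so padding ids contribute their own embeddings to the average; with sparse_lookup=True, ids are first converted to a SparseTensor and averaged via a sparse lookup instead.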