# tf.contrib.training.resample_at_rate
Given `inputs` tensors, stochastically resamples each at a given rate.

    tf.contrib.training.resample_at_rate(
        inputs, rates, scope=None, seed=None, back_prop=False
    )
For example, if the inputs are `[[a1, a2], [b1, b2]]` and the `rates` tensor contains `[3, 1]`, then the return value may look like `[[a1, a2, a1, a1], [b1, b2, b1, b1]]`. However, many other outputs are possible, since this is stochastic: averaged over many repeated calls, each input element should appear in the output `rate` times the number of invocations.
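The stochastic resampling behavior can be sketched in plain NumPy. This is not the TF op itself, only an illustration of its contract: each batch element's expected count equals its rate, and the same selection is applied across all tensors in `inputs`. The floor-plus-Bernoulli scheme below is one plausible way to achieve that expectation; the actual op's sampling scheme may differ.

```python
import numpy as np

def resample_at_rate_np(inputs, rates, rng=None):
    """NumPy sketch of the resampling semantics (not the TF op).

    Batch element i is emitted floor(rates[i]) times, plus one extra
    time with probability rates[i] - floor(rates[i]), so its expected
    count equals rates[i].  The same selection indices are applied to
    every tensor in `inputs`.
    """
    rng = np.random.default_rng() if rng is None else rng
    rates = np.asarray(rates, dtype=float)
    floors = np.floor(rates).astype(int)
    # Bernoulli draw on the fractional part of each rate.
    extras = rng.random(rates.shape) < (rates - floors)
    counts = floors + extras.astype(int)
    # Repeat index i counts[i] times, then gather from every tensor.
    idx = np.repeat(np.arange(len(rates)), counts)
    return [np.asarray(x)[idx] for x in inputs]
```

With integer rates the fractional part is zero, so the output is deterministic: `resample_at_rate_np([["a1", "a2"], ["b1", "b2"]], [3, 1])` yields three copies of `a1`/`b1` and one of `a2`/`b2`, matching the counts in the example above (element order may differ from the TF op's output).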
| Args | |
|-------------|----------------------------------------------------------------------------------|
| `inputs` | A list of tensors, each of which has a shape of `[batch_size, ...]`. |
| `rates` | A tensor of shape `[batch_size]` containing the resampling rates for each input. |
| `scope` | Scope for the op. |
| `seed` | Random seed to use. |
| `back_prop` | Whether to allow back-propagation through this op. |

| Returns | |
|---|---|
| Selections from the input tensors. ||
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.