tensorflow::ops::ApplyAdadelta

#include <training_ops.h>

Update '*var' according to the Adadelta scheme.

Summary

accum = rho() * accum + (1 - rho()) * grad.square();
update = (accum_update + epsilon()).sqrt() * (accum + epsilon()).rsqrt() * grad;
accum_update = rho() * accum_update + (1 - rho()) * update.square();
var -= update;
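
The snippet below is an illustrative sketch of a single Adadelta step on plain scalars, mirroring the pseudocode above; every concrete value (rho, epsilon, the gradient, and the starting accumulators) is made up for the example and is not taken from the op itself.

// Illustrative only: one scalar Adadelta step, mirroring the pseudocode above.
// All concrete values (rho, epsilon, grad, accumulators) are made-up examples.
#include <cmath>
#include <cstdio>

int main() {
  float rho = 0.95f, epsilon = 1e-6f;
  float var = 1.0f, accum = 0.0f, accum_update = 0.0f;
  float grad = 0.5f;

  // accum tracks a decayed average of squared gradients.
  accum = rho * accum + (1 - rho) * grad * grad;
  // The step is scaled by the ratio of the two running RMS values.
  float update = std::sqrt(accum_update + epsilon) / std::sqrt(accum + epsilon) * grad;
  // accum_update tracks a decayed average of squared updates.
  accum_update = rho * accum_update + (1 - rho) * update * update;
  var -= update;

  std::printf("var = %f\n", var);
  return 0;
}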

Args:

  • scope: A Scope object
  • var: Should be from a Variable().
  • accum: Should be from a Variable().
  • accum_update: Should be from a Variable().
  • lr: Scaling factor. Must be a scalar.
  • rho: Decay factor. Must be a scalar.
  • epsilon: Constant factor. Must be a scalar.
  • grad: The gradient.

Optional attributes (see Attrs):

  • use_locking: If True, updating of the var, accum and update_accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.

Returns:

  • ::tensorflow::Output: Same as "var".

Constructors and Destructors

ApplyAdadelta(const ::tensorflow::Scope & scope, ::tensorflow::Input var, ::tensorflow::Input accum, ::tensorflow::Input accum_update, ::tensorflow::Input lr, ::tensorflow::Input rho, ::tensorflow::Input epsilon, ::tensorflow::Input grad)
ApplyAdadelta(const ::tensorflow::Scope & scope, ::tensorflow::Input var, ::tensorflow::Input accum, ::tensorflow::Input accum_update, ::tensorflow::Input lr, ::tensorflow::Input rho, ::tensorflow::Input epsilon, ::tensorflow::Input grad, const ApplyAdadelta::Attrs & attrs)
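
A minimal graph-construction sketch follows, assuming the usual C++ client headers (tensorflow/cc/client/client_session.h, tensorflow/cc/ops/standard_ops.h) are available; the shapes, initial values, and hyperparameters are made up and only show how the inputs listed above are wired into the op.

// Sketch only: wiring ApplyAdadelta into a graph with the C++ client API.
// Shapes, initial values, and hyperparameters below are illustrative.
#include <iostream>
#include <vector>
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;

  Scope root = Scope::NewRootScope();

  // State: the parameters plus the two Adadelta accumulators.
  auto var = Variable(root, {2}, DT_FLOAT);
  auto accum = Variable(root, {2}, DT_FLOAT);
  auto accum_update = Variable(root, {2}, DT_FLOAT);

  auto init_var = Assign(root, var, Const(root, {1.0f, 2.0f}));
  auto init_accum = Assign(root, accum, Const(root, {0.0f, 0.0f}));
  auto init_accum_update = Assign(root, accum_update, Const(root, {0.0f, 0.0f}));

  // One Adadelta step for a fixed example gradient.
  auto grad = Const(root, {0.5f, -0.5f});
  auto apply = ApplyAdadelta(root, var, accum, accum_update,
                             /*lr=*/Const(root, 1.0f),
                             /*rho=*/Const(root, 0.95f),
                             /*epsilon=*/Const(root, 1e-6f),
                             grad);

  ClientSession session(root);
  TF_CHECK_OK(session.Run({init_var, init_accum, init_accum_update}, nullptr));

  std::vector<Tensor> outputs;
  TF_CHECK_OK(session.Run({apply.out}, &outputs));  // out is the updated var.
  std::cout << outputs[0].DebugString() << std::endl;
  return 0;
}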

Public attributes

operation
out

Public functions

::tensorflow::Node * node() const
operator::tensorflow::Input() const
operator::tensorflow::Output() const

Public static functions

Attrs UseLocking(bool x)

Structs

tensorflow::ops::ApplyAdadelta::Attrs

Optional attribute setters for ApplyAdadelta.

Public attributes

operation

Operation operation

out

::tensorflow::Output out

Public functions

ApplyAdadelta

 ApplyAdadelta(
  const ::tensorflow::Scope & scope,
  ::tensorflow::Input var,
  ::tensorflow::Input accum,
  ::tensorflow::Input accum_update,
  ::tensorflow::Input lr,
  ::tensorflow::Input rho,
  ::tensorflow::Input epsilon,
  ::tensorflow::Input grad
)

ApplyAdadelta

 ApplyAdadelta(
  const ::tensorflow::Scope & scope,
  ::tensorflow::Input var,
  ::tensorflow::Input accum,
  ::tensorflow::Input accum_update,
  ::tensorflow::Input lr,
  ::tensorflow::Input rho,
  ::tensorflow::Input epsilon,
  ::tensorflow::Input grad,
  const ApplyAdadelta::Attrs & attrs
)

node

::tensorflow::Node * node() const 

operator::tensorflow::Input

 operator::tensorflow::Input() const 

operator::tensorflow::Output

 operator::tensorflow::Output() const 
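
Because of these conversion operators, the op object itself can be passed wherever an Input or Output is expected. The fragment below continues the earlier sketch (root and apply are the names assumed from that example) and also shows node() for inspecting the underlying graph node.

// The ApplyAdadelta object converts implicitly to its `out` Output,
// so it can feed other ops or be fetched directly (names from the sketch above).
auto scaled = tensorflow::ops::Mul(root, apply, tensorflow::ops::Const(root, 2.0f));
tensorflow::Node* n = apply.node();  // underlying graph node, e.g. for inspection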

Public static functions

UseLocking

Attrs UseLocking(
  bool x
)
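
As a sketch, the Attrs value returned by UseLocking can be passed straight to the second constructor overload; var, accum, accum_update, lr, rho, epsilon, and grad are assumed to have been built as in the earlier example.

// Sketch: requesting locked updates via the optional attribute
// (inputs are the same placeholders used in the earlier example).
auto apply_locked = tensorflow::ops::ApplyAdadelta(
    root, var, accum, accum_update, lr, rho, epsilon, grad,
    tensorflow::ops::ApplyAdadelta::UseLocking(true));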