NonuniformTrainingEpochs

public final class NonuniformTrainingEpochs<
  Samples: Collection,
  Entropy: RandomNumberGenerator
>: Sequence, IteratorProtocol

An infinite sequence of collections of sample batches suitable for training a DNN when samples are not uniformly sized.

The batches in each epoch:

  • all have exactly the same number of samples.
  • are formed from samples of similar size.
  • start with a batch whose maximum sample size is the maximum size over all samples used in the epoch.
  • Creates an instance drawing samples from samples into batches of size batchSize.

    Declaration

    public init(
      samples: Samples,
      batchSize: Int,
      entropy: Entropy,
      batchesPerSort: Int? = nil,
      areInAscendingSizeOrder:
        @escaping (Samples.Element, Samples.Element) -> Bool
    )

    Parameters

    entropy

a source of randomness used to shuffle sample ordering. It will be stored in self, so if it is only pseudorandom and has value semantics, the sequence of epochs is deterministic and not dependent on other operations.

    batchesPerSort

the number of batches across which to group sample sizes similarly, or nil to indicate that the implementation should choose a number. Choosing a value that is too high can undo the effects of sample shuffling in many training schemes, leading to poor results; choosing one that is too low reduces the similarity of sizes within a given batch, leading to inefficiency.

    areInAscendingSizeOrder

    a predicate that returns true iff the size of the first parameter is less than that of the second.

  • The type of each epoch, a collection of batches of samples.

    Declaration

    public typealias Element = Slices<
      Sampling<Samples, Array<Samples.Index>.SubSequence>
    >
  • Returns the next epoch in sequence.

    Declaration

    public func next() -> Element?
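
    Because the sequence is infinite, next() never returns nil; a training loop therefore bounds iteration with prefix. The sketch below illustrates the pattern with a hypothetical stand-in type (EpochCounter is not part of the library; NonuniformTrainingEpochs is consumed the same way):

    ```swift
    // Stand-in for an infinite Sequence such as NonuniformTrainingEpochs:
    // next() always produces a value, so the caller bounds the loop.
    struct EpochCounter: Sequence, IteratorProtocol {
      var epoch = 0
      mutating func next() -> Int? {  // never returns nil
        defer { epoch += 1 }
        return epoch
      }
    }

    var seen: [Int] = []
    for epoch in EpochCounter().prefix(3) {  // train for exactly 3 epochs
      seen.append(epoch)
    }
    // seen == [0, 1, 2]
    ```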
  • Creates an instance drawing samples from samples into batches of size batchSize.

    Declaration

    public convenience init(
      samples: Samples,
      batchSize: Int,
      batchesPerSort: Int? = nil,
      areInAscendingSizeOrder:
        @escaping (Samples.Element, Samples.Element) -> Bool
    )

    Parameters

    batchesPerSort

the number of batches across which to group sample sizes similarly, or nil to indicate that the implementation should choose a number. Choosing a value that is too high can undo the effects of sample shuffling in many training schemes, leading to poor results; choosing one that is too low reduces the similarity of sizes within a given batch, leading to inefficiency.

    areInAscendingSizeOrder

    a predicate that returns true iff the size of the first parameter is less than that of the second.
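
    The strategy controlled by batchSize, batchesPerSort, and areInAscendingSizeOrder can be sketched with a standard-library-only helper. This is a hypothetical makeEpoch function written for illustration, not the library's implementation: it shuffles sample indices, sorts sizes within windows of batchesPerSort * batchSize elements, cuts equal-sized batches, and moves the batch containing the largest sample to the front of the epoch.

    ```swift
    // Hypothetical sketch of one epoch's batching, assuming samples are
    // described only by their sizes. Not the library implementation.
    func makeEpoch<RNG: RandomNumberGenerator>(
      sampleSizes: [Int],      // size of each sample, indexed by position
      batchSize: Int,
      batchesPerSort: Int,
      using rng: inout RNG
    ) -> [[Int]] {             // batches of sample indices
      var indices = Array(sampleSizes.indices).shuffled(using: &rng)
      // Drop the remainder so every batch has exactly batchSize samples.
      indices.removeLast(indices.count % batchSize)

      // Within each sorting window, order samples by ascending size so
      // that batches cut from the window hold similarly sized samples.
      let window = batchesPerSort * batchSize
      var start = 0
      while start < indices.count {
        let end = min(start + window, indices.count)
        indices[start..<end].sort { sampleSizes[$0] < sampleSizes[$1] }
        start = end
      }

      // Cut the reordered indices into batches.
      var batches: [[Int]] = stride(from: 0, to: indices.count, by: batchSize)
        .map { Array(indices[$0..<($0 + batchSize)]) }

      // Move the batch holding the largest sample to the front, so the
      // first batch's maximum sample size is the epoch-wide maximum.
      if let largest = batches.indices.max(by: { a, b in
        batches[a].map { sampleSizes[$0] }.max()!
          < batches[b].map { sampleSizes[$0] }.max()!
      }) {
        batches.swapAt(0, largest)
      }
      return batches
    }
    ```

    Grouping size-sorting into windows rather than sorting all samples globally preserves most of the shuffle's randomness while keeping each batch's padding overhead low, which is the trade-off the batchesPerSort parameter exposes.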