public final class NonuniformTrainingEpochs<
  Samples: Collection,
  Entropy: RandomNumberGenerator
>: Sequence, IteratorProtocol
An infinite sequence of collections of sample batches suitable for training a DNN when samples are not uniformly sized.
The batches in each epoch:
- all have exactly the same number of samples.
- are formed from samples of similar size.
- start with a batch whose maximum sample size is the maximum size over all samples used in the epoch.
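For concreteness, here is a minimal sketch of the intended usage pattern, under several assumptions not stated in this page: that the API ships in the TensorFlow module, that the entropy-free initializer (documented below) defaults to the system random generator, and that samples are variable-length [Int32] token sequences. The padding scheme is illustrative, not part of the API.

import TensorFlow  // assumption: the module that ships the Epochs API

// Hypothetical variable-length samples: token sequences of differing lengths.
let samples: [[Int32]] = (0..<1024).map { i in
  Array(repeating: Int32(i), count: 8 + i % 56)
}

let epochs = NonuniformTrainingEpochs(
  samples: samples,
  batchSize: 32
) { $0.count < $1.count }  // areInAscendingSizeOrder: order samples by length

for epoch in epochs.prefix(5) {  // the sequence is infinite; take 5 epochs
  for batch in epoch {
    // Every batch holds exactly 32 samples of similar length. Because the
    // epoch's first batch contains its largest sample, buffers sized for
    // that batch are big enough for the rest of the epoch.
    let maxLength = batch.lazy.map { $0.count }.max()!
    let padded = batch.map { $0 + Array(repeating: 0, count: maxLength - $0.count) }
    _ = padded  // ... stack into a tensor and feed the model ...
  }
}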
Creates an instance drawing samples from `samples` into batches of size `batchSize`.

Declaration

public init(
  samples: Samples,
  batchSize: Int,
  entropy: Entropy,
  batchesPerSort: Int? = nil,
  areInAscendingSizeOrder: @escaping (Samples.Element, Samples.Element) -> Bool
)
Parameters
entropy
a source of randomness used to shuffle sample ordering. It will be stored in `self`, so if it is only pseudorandom and has value semantics, the sequence of epochs is deterministic and not dependent on other operations.

batchesPerSort
the number of batches across which to group sample sizes similarly, or `nil` to indicate that the implementation should choose a number. Choosing too high a value can destroy the effects of sample shuffling in many training schemes, leading to poor results; choosing too low a value will reduce the similarity of sizes in a given batch, leading to inefficiency.

areInAscendingSizeOrder
a predicate that returns `true` iff the size of the first parameter is less than that of the second.
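As a sketch of the determinism note on `entropy` above: any seedable generator with value semantics yields a reproducible epoch sequence. The SplitMix64 generator below is defined locally purely for illustration and is not part of the library; `samples` is carried over from the earlier sketch.

// A small value-semantic PRNG (SplitMix64), defined here only for illustration.
struct SplitMix64: RandomNumberGenerator {
  var state: UInt64
  init(seed: UInt64) { state = seed }
  mutating func next() -> UInt64 {
    state &+= 0x9E37_79B9_7F4A_7C15
    var z = state
    z = (z ^ (z >> 30)) &* 0xBF58_476D_1CE4_E5B9
    z = (z ^ (z >> 27)) &* 0x94D0_49BB_1331_11EB
    return z ^ (z >> 31)
  }
}

// Because the generator is stored in `self` and has value semantics, two
// instances seeded identically produce identical epoch sequences.
let deterministicEpochs = NonuniformTrainingEpochs(
  samples: samples,
  batchSize: 32,
  entropy: SplitMix64(seed: 42),
  batchesPerSort: 6
) { $0.count < $1.count }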
Returns the next epoch in sequence.
Declaration
public func next() -> Element?
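Because the class itself conforms to `IteratorProtocol`, epochs can also be drawn manually with `next()`. A small sketch, continuing from the previous example and assuming each epoch is a collection (so `count` is available, per the summary above):

// The sequence is infinite, so `next()` is expected to always succeed;
// each call reshuffles the samples and produces a fresh epoch.
for step in 0..<3 {
  guard let epoch = deterministicEpochs.next() else { break }
  print("epoch \(step): \(epoch.count) batches")
}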
Creates an instance drawing samples from `samples` into batches of size `batchSize`.

Declaration

public init(
  samples: Samples,
  batchSize: Int,
  batchesPerSort: Int? = nil,
  areInAscendingSizeOrder: @escaping (Samples.Element, Samples.Element) -> Bool
)
Parameters
batchesPerSort
the number of batches across which to group sample sizes similarly, or `nil` to indicate that the implementation should choose a number. Choosing too high a value can destroy the effects of sample shuffling in many training schemes, leading to poor results; choosing too low a value will reduce the similarity of sizes in a given batch, leading to inefficiency.

areInAscendingSizeOrder
a predicate that returns `true` iff the size of the first parameter is less than that of the second.
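A sketch of this overload, which takes no entropy parameter and so presumably falls back on the system's random generator (an assumption; only the parameters listed above come from this page). The `samples` value is carried over from the first sketch.

// System entropy: each run of the program shuffles differently.
let systemEpochs = NonuniformTrainingEpochs(
  samples: samples,
  batchSize: 64,
  batchesPerSort: 4,  // lower: better shuffling; higher: more uniform
                      // sample sizes within each batch
  areInAscendingSizeOrder: { $0.count < $1.count }
)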