I am experimenting with autoencoders on the MNIST dataset. I can successfully train and test the whole model by evaluating the L2 loss on the image reconstruction. For prediction I use a predictor:

```scala
val predictor: Predictor[NDArray, NDArray] = model.newPredictor(translator)
```

and my translator is:

```scala
val translator = new Translator[NDArray, NDArray]:

  override def processInput(ctx: TranslatorContext, input: NDArray): NDList =
    // Reshape to the expected input size
    NDList(input.reshape(Shape(1, Mnist.IMAGE_HEIGHT * Mnist.IMAGE_WIDTH)))

  override def processOutput(ctx: TranslatorContext, output: NDList): NDArray =
    // Reshape to the expected output size
    output.head.reshape(Shape(Mnist.IMAGE_HEIGHT * Mnist.IMAGE_WIDTH))

  override def getBatchifier(): Batchifier =
    // The Batchifier describes how to combine a batch.
    // Stacking, the most common batchifier, combines N [X1, X2, ...] arrays
    // into a single [N, X1, X2, ...] array.
    Batchifier.STACK
```
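The stacking behaviour described in that comment can be sketched without DJL. In the sketch below, `Nd` and `stack` are hypothetical stand-ins for `NDArray` and `Batchifier.STACK`, purely to illustrate the shape arithmetic:

```scala
// Minimal sketch (plain Scala, no DJL): `stack` mimics what Batchifier.STACK
// does conceptually: N arrays of shape [X1, X2, ...] become one array of
// shape [N, X1, X2, ...].
object StackSketch:
  // A toy "NDArray": flat data plus a shape
  final case class Nd(data: Vector[Float], shape: Vector[Int])

  // Stack N same-shaped arrays into a single batched array
  def stack(items: Seq[Nd]): Nd =
    require(items.nonEmpty, "cannot stack an empty sequence")
    require(items.forall(_.shape == items.head.shape), "shapes must match")
    Nd(items.flatMap(_.data).toVector, items.length +: items.head.shape)
```

For example, stacking two `(784)` images yields a `(2, 784)` batch; because any number of items can be stacked, a final partial batch needs no special treatment at this level.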
The batchifier generates the batches, which I then consume like this during testing:

```scala
// Get the batch
val images = batch.getData.head
val labels = batch.getLabels.head

// Get the first element of the batch
val image = images.get(0).squeeze(0)
val label = labels.get(0).squeeze(0)

// Predict (reconstruction): label and image are the same
val predicted = predictor.predict(label)
val deltaLoss = loss.evaluate(NDList(image), NDList(predicted)).getFloat()
```

In other words, I take a batch of 256 images, sample the first one of them, and predict on that single image. Now what I would like to do is take the full batch and predict on all 256 images at once.

So the first issue is that the translator reshapes its input to `Shape(1, Mnist.IMAGE_HEIGHT * Mnist.IMAGE_WIDTH)`, i.e. a single flattened image, which cannot accommodate a whole batch. I have also attempted to use a translator that does the following: it takes the full batch and reshapes it to batch size x flattened image, without success. Finally, how does one handle the last batch, which may not have the required length of 256?

TIA
-
To use the predictor with batches, just call

```scala
predictor.batchPredict(...)
```

instead. In this case, you would want to use a translator that translates a single image at a time, together with the `StackBatchifier`. It batches any size, so no modifications are necessary for the last partial batch.
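As a rough sketch of those semantics (plain Scala, no DJL; `predictOne` is a dummy model standing in for the real predictor, and `batchPredict` only mimics how inputs are grouped and results flattened):

```scala
// Sketch (plain Scala, no DJL) of batchPredict-style semantics: inputs are
// grouped into fixed-size batches, each batch is run through the model, and
// the per-image results come back as one flat sequence. The final group may
// be smaller than batchSize, and nothing special is needed to handle it.
object BatchPredictSketch:
  // Dummy "model": doubles every pixel of one flattened image
  def predictOne(image: Vector[Float]): Vector[Float] = image.map(_ * 2f)

  // Group, predict per batch, and flatten back to one prediction per input
  def batchPredict(images: Seq[Vector[Float]], batchSize: Int): Seq[Vector[Float]] =
    images.grouped(batchSize)          // last group may be shorter
      .flatMap(batch => batch.map(predictOne))
      .toSeq
```

With 1000 inputs and a batch size of 256, the last group simply contains the remaining 232 images, and every input still yields exactly one prediction.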