We study embedded Binarized Neural Networks (eBNNs) with the aim of allowing current binarized neural networks (BNNs) in the literature to perform feedforward inference efficiently on small embedded devices. We focus on minimizing the required memory footprint, given that these devices often have memory as small as tens of kilobytes (KB). Beyond minimizing the memory required to store weights, as in a BNN, we show that it is essential to minimize the memory used for temporaries which hold intermediate results between layers in feedforward inference. To accomplish this, eBNN reorders the computation of inference while preserving the original BNN structure, and uses just a single floating-point temporary for the entire neural network. All intermediate results from a layer are stored as binary values, as opposed to floating-points used in current BNN implementations, leading to a 32x reduction in required temporary space. We provide empirical evidence that our proposed eBNN approach allows efficient inference (10s of ms) on devices with severely limited memory (10s of KB). For example, eBNN achieves 95% accuracy on the MNIST dataset running on an Intel Curie with only 15 KB of usable memory with an inference runtime of under 50 ms per sample.
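To make the 32x figure concrete, the following sketch (illustrative only, not code from this repository) compares the temporary storage needed for a layer's intermediate activations when they are kept as 32-bit floats versus packed one bit per activation, as eBNN does:

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

#define N_ACTIVATIONS 1024  /* hypothetical number of intermediate activations in one layer */

int main(void)
{
    /* Conventional BNN implementations hold intermediate layer outputs as 32-bit floats. */
    size_t float_temp_bytes = N_ACTIVATIONS * sizeof(float);

    /* eBNN-style storage packs each intermediate result into a single bit
       (8 activations per uint8_t), giving the 32x reduction described above. */
    size_t packed_temp_bytes = (N_ACTIVATIONS + 7) / 8;

    printf("float temporaries:  %zu bytes\n", float_temp_bytes);
    printf("binary temporaries: %zu bytes (%.0fx smaller)\n",
           packed_temp_bytes,
           (double)float_temp_bytes / packed_temp_bytes);
    return 0;
}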
This repository contains code to train a neural network and generate C/Arduino code for embedded devices such as the Arduino 101.
Clone and enter the repository.
git clone git@github.com:kunglab/ebnn.git
cd ebnn
Set up a virtualenv (read more here if interested).
virtualenv env
source env/bin/activate
Now the bash prompt should have (env) in front of it.
Install required packages. Note: this project uses Chainer 2.0, which has a few differences from Chainer 1.0.
pip install -r requirements.txt
This library has two components: a Python module that trains the eBNN and generates a C header file, and a C library that uses the generated header file and is compiled on the target device to perform inference. A simple example of network training is located in examples/simple.py.
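Run the example script to train a small network and emit the header (this assumes the script needs no additional command-line arguments; check examples/simple.py for available options):

python examples/simple.py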
This will generate the simple_mnist.h header file, which requires the ebnn.h file. Both files should be included in the C/Arduino code. The C library can then be used as follows, as in simple_mnist.c:
#include <stdio.h>
#include <stdint.h>

#include "simple_mnist.h"  /* generated network header (requires ebnn.h) */
#include "mnist_data.h"    /* sample MNIST images (train_data) and labels (train_labels) */

int main()
{
    uint8_t output[1];

    /* Run inference on the first 20 samples (1 channel x 28 x 28 pixels each). */
    for (int j = 0; j < 20; ++j) {
        ebnn_compute(&train_data[1*28*28*j], output);
        printf("actual: %d, predicted: %d\n", (int)train_labels[j], output[0]);
    }

    return 0;
}
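On a desktop machine, the example can be built with a standard C compiler, assuming simple_mnist.c, the generated simple_mnist.h, ebnn.h, and mnist_data.h are all available in the include path:

gcc -std=c99 -O2 simple_mnist.c -o simple_mnist
./simple_mnist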
For examples of generated networks, see the c/tests folder.
Our paper is available here.
If you use this model or codebase, please cite:
@article{mcdanelembedded,
  title={Embedded Binarized Neural Networks},
  author={McDanel, Bradley and Teerapittayanon, Surat and Kung, HT},
  journal={Proceedings of the 2017 International Conference on Embedded Wireless Systems and Networks},
  year={2017}
}