From ac7c399af24d100423efd614dcd6e930ae649d14 Mon Sep 17 00:00:00 2001
From: Deepak Mangla
Date: Wed, 3 Jul 2019 23:45:13 +0530
Subject: [PATCH] Update link path

---
 docs/overall.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/overall.md b/docs/overall.md
index ab62f1d..442a7db 100644
--- a/docs/overall.md
+++ b/docs/overall.md
@@ -8,4 +8,4 @@ BNNs can save 10X+ memory, and several times as fast as float NNs. What's more,
 
 1. The BNN models can be trained by any frameworks which support ONNX. Note that binary convs are custom operations, please check out [onnx2bnn.md](docs/onnx2bnn.md) for how to make the model comptabile with dabnn.
 
-2. For the implementation of binary convolutions, please check out [bconv.md](docs/bconv.md).
+2. For the implementation of binary convolutions, please check out [bconv.md](bconv.md).