Backlog (To Do)
gnawice edited this page Jul 8, 2018
- center loss (prototype)
- separate activation from layer
- speedups with dropout or drop connect (skip some convolutions)
- documentation / Doxygen
- functions to lock/remove/modify/clean a layer
- move logging code into network class
- drop connect
- batch norm
- reproduce resnet results
- activation layers - don't seem to be needed right now; the resize layer can be used instead.
- finish fractional max pool (the correct way)
- finish support for non-square elements (off the table, though 1D convolutions should still be handled)
- conv stride > 1 - implemented but broken; needs fixing
- sliding window code with full image network processing
- LeCun-like multi-dimensional embedding
- C wrapper with a Python example