Inserting an LSTM layer raises a shape error #122
If you use
Thanks. Got this to work with the following code:

```python
with nengo.Network(seed=seed) as net:
    nengo_dl.configure_settings(
        trainable=None, stateful=False, keep_history=False,
    )
    inp = nengo.Node(np.zeros(np.prod(train_images.shape[1:])))
    h = nengo_dl.Layer(tf.keras.layers.LSTM(units=128))(
        inp, shape_in=(train_images.shape[1], 1))
    out = nengo_dl.Layer(tf.keras.layers.Dense(units=10))(h)
    p = nengo.Probe(out)
```

and using
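The key to the fix is that `shape_in` must multiply out to the node's flat size, so the LSTM receives a `(timesteps, features)` sequence rather than a flat vector. A minimal NumPy sketch of that bookkeeping, assuming a hypothetical batch of 32 pre-flattened 28×28 images (the real notebook's `train_images` may differ):

```python
import numpy as np

# Hypothetical stand-in for the notebook's data: a batch of 32
# already-flattened 28x28 images, shape (32, 784).
train_images = np.zeros((32, 784))

n_in = int(np.prod(train_images.shape[1:]))  # 784: size of the nengo.Node
shape_in = (train_images.shape[1], 1)        # (784, 1): 784 timesteps, 1 feature

# shape_in must tile the flat node output exactly, or a shape error is raised
assert n_in == np.prod(shape_in)

# Conceptually, the reshape applied before the signal reaches the LSTM:
lstm_input = train_images.reshape((-1,) + shape_in)
print(lstm_input.shape)  # (32, 784, 1)
```

With a mismatched `shape_in` (e.g. `(28, 1)` against a 784-element node), the product check above fails, which is the shape error reported in this issue.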
Marking this as resolved since the fix above works!
Steps to reproduce:
docs/examples/lmu.ipynb

I have also tried adding `unroll=True` to the LSTM, and/or configuring `stateful=True` and/or `keep_history=True` under `nengo_dl.configure_settings`.