BiLSTM implementation question #31
Unanswered
Matsumotorise asked this question in Q&A
Hello,
I was wondering whether the following is an implementation bug or not. Does applying the same LSTM to x_pre and x_post mean that they share the same weights for the forward and backward passes? I believe Keras's Bidirectional wrapper keeps the weights distinct between the forward and backward operations. Would this be to reduce model size?
Thank you!
Replies: 1 comment · 1 reply
-
Good question! This is by design: the weights are shared between the forward and backward directions, following https://arxiv.org/abs/1902.08661
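For readers who want to see the difference concretely, here is a minimal Keras sketch, not this repository's code; the variable names x_pre and x_post and the layer sizes are illustrative assumptions taken from the question. It contrasts reusing one LSTM instance for both directions (shared weights) with Keras's Bidirectional wrapper (separate weights per direction):

```python
import numpy as np
from tensorflow.keras import layers

units, timesteps, features = 8, 5, 3
x_pre = np.random.rand(1, timesteps, features).astype("float32")  # forward-ordered context (illustrative)
x_post = x_pre[:, ::-1, :].copy()                                  # the same sequence reversed in time

# Shared-weight variant: one LSTM instance applied to both directions,
# so the forward and backward passes reuse the same kernel, recurrent
# kernel, and bias.
shared_lstm = layers.LSTM(units)
h_fwd = shared_lstm(x_pre)
h_bwd = shared_lstm(x_post)
print(len(shared_lstm.weights))  # 3 variables: one set of LSTM weights

# Keras's Bidirectional wrapper clones the wrapped layer, so the backward
# direction gets its own, independently trained copy of the weights.
bidir = layers.Bidirectional(layers.LSTM(units))
h_both = bidir(x_pre)
print(len(bidir.weights))        # 6 variables: two independent sets of LSTM weights
```

The shared variant keeps a single set of recurrent parameters, which is the model-size saving the question alludes to.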