Hello, I've been trying to test the autoencoder using the pre-trained model and the examples in the README. I pre-processed both the 50k and 500k datasets and get the same error with both when running `python sample.py data/processed_500k.h5 data/model_500k.h5 --target autoencoder`. Has anyone else run into this error?
```
Traceback (most recent call last):
  File "sample.py", line 97, in <module>
    main()
  File "sample.py", line 90, in main
    autoencoder(args, model)
  File "sample.py", line 44, in autoencoder
    model.load(charset, args.model, latent_rep_size = latent_dim)
  File "/Users/brentkuenzi/Documents/GitHub/keras-molecules/molecules/model.py", line 95, in load
    self.create(charset, weights_file = weights_file, latent_rep_size = latent_rep_size)
  File "/Users/brentkuenzi/Documents/GitHub/keras-molecules/molecules/model.py", line 50, in create
    self.autoencoder.load_weights(weights_file)
  File "/Users/brentkuenzi/anaconda2/lib/python2.7/site-packages/keras/engine/network.py", line 1180, in load_weights
    f, self.layers, reshape=reshape)
  File "/Users/brentkuenzi/anaconda2/lib/python2.7/site-packages/keras/engine/saving.py", line 916, in load_weights_from_hdf5_group
    reshape=reshape)
  File "/Users/brentkuenzi/anaconda2/lib/python2.7/site-packages/keras/engine/saving.py", line 675, in preprocess_weights_for_loading
    weights[0] = np.transpose(weights[0], (3, 2, 0, 1))
  File "/Users/brentkuenzi/anaconda2/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 575, in transpose
    return _wrapfunc(a, 'transpose', axes)
  File "/Users/brentkuenzi/anaconda2/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 52, in _wrapfunc
    return getattr(obj, method)(*args, **kwds)
ValueError: axes don't match array
```
Hi, I'm having the same problem right now. Were you able to solve it? I suspect it may be caused by the charset of the dataset used for training differing from the charset of the dataset the encoder is being run on.
If someone has solved this problem, I would really appreciate an answer!
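To see why a charset mismatch would produce this exact traceback: the charset length determines the width of the one-hot input, and therefore the shape of the first layer's weights, so weights saved with one charset cannot be loaded into a model built from another. A minimal way to check whether two files disagree is to compare their charsets directly. This is a sketch with made-up charsets, not the repo's code; in practice you would pass in the charsets read from the two processed HDF5 files:

```python
def compare_charsets(train_charset, sample_charset):
    """Report differences between two charsets (lists of characters).

    If the sampling dataset was preprocessed with a different charset than
    the training dataset, the one-hot width no longer matches the saved
    weights, and load_weights can fail with the transpose error above.
    """
    train, sample = set(train_charset), set(sample_charset)
    return {
        'same_length': len(train_charset) == len(sample_charset),
        'same_order': list(train_charset) == list(sample_charset),
        'only_in_train': sorted(train - sample),
        'only_in_sample': sorted(sample - train),
    }

# Hypothetical charsets for illustration:
report = compare_charsets([' ', 'C', 'N', 'O', '('], [' ', 'C', 'N', 'S', '('])
print(report['only_in_sample'])  # ['S'] -- characters the model never saw
```

Note that even identical character *sets* are not enough: the ordering must also match, or each one-hot column will mean a different character than it did at training time.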
Hi,
I solved this by saving the charset from the training data and forcing the dataset being passed through the autoencoder to use that same charset.
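A minimal sketch of that workaround, assuming the training charset is available as a list of characters: encode the new SMILES strings against the *training* charset so the tensor shape matches the saved weights. The helper name, the space-padding convention, and the example charset are assumptions for illustration, not the repo's exact code:

```python
import numpy as np

def one_hot_encode(smiles_list, charset, max_len=120):
    """One-hot encode SMILES strings against a FIXED charset.

    Reusing the training charset (same characters, same order) keeps the
    input width equal to what the saved weights expect; re-deriving a
    charset from the new data can change the shape and break loading.
    """
    char_to_idx = {c: i for i, c in enumerate(charset)}
    encoded = np.zeros((len(smiles_list), max_len, len(charset)), dtype=np.float32)
    for i, smi in enumerate(smiles_list):
        padded = smi.ljust(max_len)  # pad with spaces; charset must include ' '
        for j, c in enumerate(padded[:max_len]):
            encoded[i, j, char_to_idx[c]] = 1.0
    return encoded

# Hypothetical charset saved from the training run (includes the pad char):
train_charset = [' ', 'C', 'N', 'O', '(', ')', '=', '1']
X = one_hot_encode(['CC(=O)N', 'C1CC1'], train_charset)
print(X.shape)  # (2, 120, 8)
```

Any character in the new dataset that is missing from the training charset will raise a `KeyError` here; those molecules cannot be represented by the trained model and need to be filtered out.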