
ValueError: cannot copy sequence with size 11 to array axis with dimension 10 #18

Open
huiyang865 opened this issue Aug 9, 2018 · 2 comments


@huiyang865

Hi,
when I train the model with version 1.2 of the Visual Genome dataset, this error appears:

```
Traceback (most recent call last):
  File "/home/XXX/Documents/densecap-tensorflow/tools/train_net.py", line 215, in <module>
    main()
  File "/home/XXX/Documents/densecap-tensorflow/tools/train_net.py", line 211, in main
    max_iters=args.max_iters)
  File "/home/XXX/Documents/densecap-tensorflow/lib/dense_cap/train.py", line 485, in train_net
    sw.train_model(sess, max_iters)
  File "/home/XXX/Documents/densecap-tensorflow/lib/dense_cap/train.py", line 356, in train_model
    blobs = self.data_layer.forward()
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/layer.py", line 99, in forward
    blobs = self._get_next_minibatch()
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/layer.py", line 95, in _get_next_minibatch
    return get_minibatch(minibatch_db)
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/minibatch.py", line 54, in get_minibatch
    blobs['gt_phrases'] = _process_gt_phrases(roidb[0]['gt_phrases'])
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/minibatch.py", line 77, in _process_gt_phrases
    gt_phrases[ix, :l] = phra
ValueError: cannot copy sequence with size 11 to array axis with dimension 10
```

Some bounding-box descriptions are longer than 10 words. I updated `__C.MAX_WORDS` to 15, but a similar error still appears. How should I set the `__C.MAX_WORDS` parameter? Has anyone else met this problem?
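For reference, the failure is easy to reproduce with a few lines of NumPy: the buffer row has only `MAX_WORDS` slots, NumPy clips the out-of-range slice to 10 elements, and assigning an 11-token phrase into it raises exactly this `ValueError`. A minimal sketch (the array and variable names mirror the traceback but this is not the repo's actual code):

```python
import numpy as np

MAX_WORDS = 10                    # plays the role of __C.MAX_WORDS
phrase = list(range(11))          # an 11-token phrase from the dataset

gt_phrases = np.zeros((1, MAX_WORDS), dtype=np.int64)
try:
    # the slice :11 is clipped to the 10 available columns,
    # but the right-hand side still has 11 elements
    gt_phrases[0, :len(phrase)] = phrase
except ValueError as e:
    print(e)   # cannot copy sequence with size 11 to array axis with dimension 10
```

Truncating the phrase first (`gt_phrases[0, :] = phrase[:MAX_WORDS]`) avoids the crash, which is why the suggestions below focus on making the preprocessing limit and the training-time limit agree.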

@InnerPeace-Wu
Owner

What was MAX_WORDS when you preprocessed the dataset with preprocess.py? One solution may be to preprocess the dataset again with a proper MAX_WORDS setting.
Best,

@princeamitlali

The problem is that a caption can contain any number of words, but the model can only use a certain limit of words, namely MAX_WORDS. I increased the word limit, but the same error appeared for some other caption, so you can do one of two things:

  1. Increase the limit (to 25 or 30, say), since no caption has more than 20 words; but this is computationally expensive and can also affect the results.
  2. During preprocessing, limit captions to 8 words, because SOS and EOS are two additional tokens.

I took the second approach and my results were the same.
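The second option above can be sketched as a preprocessing-time clip: keep at most `MAX_WORDS - 2` content tokens so the SOS/EOS markers still fit in the row. A small illustration (the helper name and token strings are hypothetical, not from preprocess.py):

```python
MAX_WORDS = 10            # total slots per phrase, matching __C.MAX_WORDS
SOS, EOS = '<SOS>', '<EOS>'

def clip_caption(tokens, max_words=MAX_WORDS):
    """Keep at most max_words - 2 content tokens, then wrap in SOS/EOS."""
    content = tokens[:max_words - 2]
    return [SOS] + content + [EOS]

caption = "a man riding a brown horse on a sunny beach near water".split()
clipped = clip_caption(caption)
print(len(clipped))   # 10: SOS + 8 content words + EOS
```

Clipping during preprocessing keeps every stored phrase within the buffer, so the `gt_phrases[ix, :l] = phra` assignment in minibatch.py can never overflow.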
