Hi,
When I train the model with version 1.2 of the Visual Genome dataset, an error appears:
Traceback (most recent call last):
  File "/home/XXX/Documents/densecap-tensorflow/tools/train_net.py", line 215, in <module>
    main()
  File "/home/XXX/Documents/densecap-tensorflow/tools/train_net.py", line 211, in main
    max_iters=args.max_iters)
  File "/home/XXX/Documents/densecap-tensorflow/lib/dense_cap/train.py", line 485, in train_net
    sw.train_model(sess, max_iters)
  File "/home/XXX/Documents/densecap-tensorflow/lib/dense_cap/train.py", line 356, in train_model
    blobs = self.data_layer.forward()
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/layer.py", line 99, in forward
    blobs = self._get_next_minibatch()
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/layer.py", line 95, in _get_next_minibatch
    return get_minibatch(minibatch_db)
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/minibatch.py", line 54, in get_minibatch
    blobs['gt_phrases'] = _process_gt_phrases(roidb[0]['gt_phrases'])
  File "/home/XXX/Documents/densecap-tensorflow/lib/fast_rcnn/minibatch.py", line 77, in _process_gt_phrases
    gt_phrases[ix, :l] = phra
ValueError: cannot copy sequence with size 11 to array axis with dimension 10
Some bounding-box descriptions are longer than 10 words. I then updated __C.MAX_WORDS to 15, but a similar error appeared. How should I set the __C.MAX_WORDS parameter? Has anyone else met this problem?
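For context, this ValueError is the generic NumPy error raised when a Python sequence longer than the target slice is assigned into a fixed-width array. A minimal sketch, with names chosen to mirror the traceback (not the repository's actual code):

```python
import numpy as np

MAX_WORDS = 10                        # mirrors the default __C.MAX_WORDS
gt_phrases = np.zeros((1, MAX_WORDS), dtype=np.int32)

phra = list(range(11))                # an 11-token caption, one token too long

err = None
try:
    # The slice is clipped to width 10, so the 11-element list cannot fit
    gt_phrases[0, :len(phra)] = phra
except ValueError as e:
    err = e
print(err)
```

Any caption tokenized to more than MAX_WORDS tokens triggers this, which is why raising the limit only moves the failure to the next longer caption.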
What was MAX_WORDS when you preprocessed the dataset with preprocess.py? One solution may be to preprocess the dataset again with a proper MAX_WORDS setting.
best,
The problem is that a caption can contain any number of words, but the model can only use up to MAX_WORDS of them. I increased the word limit, but the error remained for some other caption, so you can do one of two things:
1. Increase the limit (e.g. 25 or 30), since no caption has more than 20 words. But this is computationally expensive and also affects the results.
2. During preprocessing, limit captions to 8 words, because SOS and EOS are two additional tokens.
I took the second approach and my results were the same.
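The second approach can be sketched as a clipping step applied before the phrases are copied into the fixed-width array. The function names here are illustrative, not the repository's actual code; only `_process_gt_phrases` and `MAX_WORDS` come from the traceback and config above:

```python
import numpy as np

MAX_WORDS = 10  # assumed value of __C.MAX_WORDS

def clip_phrase(phra, max_words=MAX_WORDS):
    """Truncate a tokenized caption so it fits the fixed-width array,
    reserving two slots for the SOS and EOS tokens."""
    return phra[:max_words - 2]

def process_gt_phrases(phrases, max_words=MAX_WORDS):
    """Illustrative version of _process_gt_phrases with clipping applied."""
    gt_phrases = np.zeros((len(phrases), max_words), dtype=np.int32)
    for ix, phra in enumerate(phrases):
        phra = clip_phrase(phra, max_words)
        gt_phrases[ix, :len(phra)] = phra   # now always fits
    return gt_phrases
```

With this clipping in place, an 11-token caption is cut to 8 tokens and the copy no longer overflows the array, regardless of how long the longest caption in the dataset is.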