Error when exporting the model to ONNX #504
When exporting to ONNX with `python tools/export_onnx.py -c path/to/config -r path/to/checkpoint --check`, I get an error about the shapes of intermediate tensors. Looking through the comments, I seem to be the only one hitting this. Does anyone have a solution? :)
Did you change anything? Are you using the latest code? Please paste the version of this file that your run actually uses: https://github.com/lyuwenyu/RT-DETR/blob/main/rtdetrv2_pytorch/src/zoo/rtdetr/rtdetrv2_decoder.py
OK, thanks, I'll give it a try :)
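To narrow this down independently of export_onnx.py, the exported file can also be validated directly with the stock ONNX tooling. The sketch below is a minimal standalone check; it assumes the exported model is saved as model.onnx (a placeholder name, use whatever path export_onnx.py actually wrote) and that the --check step essentially performs this kind of validation:

```python
# Hedged sketch: validate an already-exported ONNX file outside the export script.
# "model.onnx" is a placeholder path, not necessarily the name export_onnx.py produces.
import onnx
import onnxruntime as ort

model = onnx.load("model.onnx")

# Structural validation of the protobuf.
onnx.checker.check_model(model)

# Strict shape inference: this is the pass that reports a Where node whose
# operands have mismatched element types (e.g. float vs. double).
onnx.shape_inference.infer_shapes(model, strict_mode=True)

# Building an ONNX Runtime session confirms whether the model is actually executable.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
print([i.name for i in sess.get_inputs()])
```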
There should be another one below.
Load PResNet50 state_dict
/mnt/2/leejq/RTDETRV2/rtdetrv2_pytorch/onnx/../src/zoo/rtdetr/rtdetr_decoder.py:465: TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
valid_WH = torch.tensor([w, h]).to(dtype)
/mnt/2/leejq/RTDETRV2/rtdetrv2_pytorch/onnx/../src/zoo/rtdetr/rtdetr_decoder.py:122: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if reference_points.shape[-1] == 2:
/mnt/2/leejq/RTDETRV2/rtdetrv2_pytorch/onnx/../src/zoo/rtdetr/rtdetr_decoder.py:129: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
elif reference_points.shape[-1] == 4:
/home/leejq/.conda/envs/RTDETR/lib/python3.8/site-packages/torch/onnx/utils.py:1703: UserWarning: The exported ONNX model failed ONNX shape inference. The model will not be executable by the ONNX Runtime. If this is unintended and you believe there is a bug, please report an issue at https://github.com/pytorch/pytorch/issues. Error reported by strict ONNX shape inference: [ShapeInferenceError] (op_type:Where, node name: /model/decoder/Where): Y has inconsistent type tensor(double) (Triggered internally at ../torch/csrc/jit/serialization/export.cpp:1484.)
_C._check_onnx_proto(proto)
Check export onnx model done...
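The decisive message in the log above is the strict shape-inference error on the node /model/decoder/Where: its Y input is tensor(double) while the other operands have a different float type. A general pattern that avoids this kind of mismatch (offered as a sketch, not as a confirmed diagnosis of rtdetrv2_decoder.py) is to build any constant branch of torch.where with the same dtype and device as the tensor branch, so the tracer never records a float64 constant:

```python
import torch

# Hypothetical helper for illustration only; it is NOT taken from rtdetr_decoder.py.
def mask_fill_export_safe(x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    # Create the fill constant in x's dtype and on x's device so every operand
    # of the exported Where node shares one element type.
    fill = torch.zeros_like(x)
    return torch.where(mask, x, fill)

# Dummy usage with placeholder shapes, unrelated to the model's real tensors.
x = torch.randn(2, 4)
mask = x > 0
y = mask_fill_export_safe(x, mask)
```

The same idea applies to the TracerWarning about `valid_WH = torch.tensor([w, h]).to(dtype)`: constructing the tensor with an explicit dtype, e.g. `torch.tensor([w, h], dtype=dtype)`, makes the traced constant's type unambiguous, although that particular warning can usually be ignored, as its own message says.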