Why does the pretrained model give low accuracy? #7
Comments
That's right. You need more epochs.
But as I keep training, the accuracy always stays too low. Why?
169:[lfw][2000]Accuracy-Flip: 0.98717+-0.00563
170:[lfw][2000]Accuracy-Flip: 0.99117+-0.00472
Maybe you can try --margin-s 64 first!
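For context, --margin-s is the feature-scale s in the ArcFace loss (--loss-type 4), and the ArcFace paper uses s=64, which is likely why 64 is suggested over 128. A minimal NumPy sketch of where s enters the loss; names and shapes here are illustrative, not the repo's actual code:

```python
import numpy as np

def arcface_logits(embeddings, weights, labels, s=64.0, m=0.5):
    # embeddings: (batch, emb_size), L2-normalized features
    # weights:    (num_classes, emb_size), L2-normalized class weights
    # labels:     (batch,) integer class ids
    # s: feature scale (--margin-s), m: angular margin (--margin-m)
    cos_theta = embeddings @ weights.T               # cosine similarity to each class
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    rows = np.arange(len(labels))
    logits = s * cos_theta                           # scale all logits by s
    # add the additive angular margin m only on the target class
    logits[rows, labels] = s * np.cos(theta[rows, labels] + m)
    return logits  # fed into softmax cross-entropy
```

A larger s sharpens the softmax distribution, so s=128 combined with a small batch can make early fine-tuning less stable, which matches the batch-size remark below.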
It may be because your batch size is small!
What training dataset did you use? @qidiso
Hello, training from your pre-trained model with --margin-s 128 and --margin-s 64, my training accuracy has stayed around 99.3x. Can you give some advice? Also, could you share your ArcFace training model, even one that is not the highest accuracy? We'll retrain on it. I look forward to hearing from you. Thank you.
Why do I get low initial accuracy when training from your pretrained model? Am I doing something wrong?
cmd: CUDA_VISIBLE_DEVICES='0' python -u train_softmax.py --network y1 --ckpt 2 --loss-type 4 --lr 0.01 --lr-steps 40000,60000,70000 --wd 0.00004 --fc7-wd-mult 10 --emb-size 512 --per-batch-size 90 --margin-s 128 --data-dir ../datasets/faces_ms1m_112x112 --pretrained ../models/MF/model-y1-softmax12,31 --prefix ../models/MF/model-y1-arcface >>& file.txt &
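Note that --pretrained takes a "prefix,epoch" pair. Under MXNet this corresponds to loading the saved symbol/params checkpoint, roughly as sketched below (a hedged sketch; the actual fine-tuning logic lives in train_softmax.py):

```python
import mxnet as mx

# "--pretrained ../models/MF/model-y1-softmax12,31" is a prefix,epoch pair:
# MXNet reads model-y1-softmax12-symbol.json and model-y1-softmax12-0031.params.
prefix, epoch = '../models/MF/model-y1-softmax12', 31
sym, arg_params, aux_params = mx.model.load_checkpoint(prefix, epoch)
# The new ArcFace run starts from these softmax-pretrained weights, so the
# first evaluations reflect the pretrained features; the ArcFace-trained
# accuracy only emerges after further epochs.
```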
first test:
testing verification..
(12000, 512)
infer time 154.665504
[lfw][2000]XNorm: 22.648476
[lfw][2000]Accuracy-Flip: 0.98717+-0.00563
testing verification..
(14000, 512)
infer time 175.866512
[cfp_fp][2000]XNorm: 19.110292
[cfp_fp][2000]Accuracy-Flip: 0.83271+-0.01969
testing verification..
(12000, 512)
infer time 157.715533
[agedb_30][2000]XNorm: 22.118053
[agedb_30][2000]Accuracy-Flip: 0.90683+-0.01981
saving 1
INFO:root:Saved checkpoint to "../models/MF/model-y1-arcface-0001.params"
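For reference on how lines like "[lfw][2000]Accuracy-Flip: 0.98717+-0.00563" are produced: verification accuracy is computed with K-fold cross-validation over labeled image pairs, choosing a distance threshold on the training folds and reporting mean ± std of test-fold accuracy ("Flip" indicates each image's embedding is combined with that of its horizontally flipped copy). A simplified sketch under those assumptions:

```python
import numpy as np

def verification_accuracy(emb1, emb2, same, n_folds=10):
    # emb1/emb2: (n_pairs, emb_size) embeddings of the two images in each pair;
    # for "Accuracy-Flip", each image's embedding is assumed to be combined
    # with that of its horizontal flip before this step.
    # same: (n_pairs,) bool array, True if the pair shares an identity.
    e1 = emb1 / np.linalg.norm(emb1, axis=1, keepdims=True)
    e2 = emb2 / np.linalg.norm(emb2, axis=1, keepdims=True)
    dist = np.sum((e1 - e2) ** 2, axis=1)   # squared L2 distance per pair

    folds = np.array_split(np.arange(len(same)), n_folds)
    thresholds = np.arange(0.0, 4.0, 0.01)
    accs = []
    for k in range(n_folds):
        test_idx = folds[k]
        train_idx = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        # pick the threshold that maximizes accuracy on the training folds
        train_accs = [np.mean((dist[train_idx] < t) == same[train_idx])
                      for t in thresholds]
        best_t = thresholds[int(np.argmax(train_accs))]
        accs.append(np.mean((dist[test_idx] < best_t) == same[test_idx]))
    return np.mean(accs), np.std(accs)      # reported as mean +- std
```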