Accuracy drop after simplify #13
Comments
@ZhuLIAO001 Hi, thanks for opening the issue. I'll take a look ASAP.
@ZhuLIAO001 I'm unable to run the script. Is it possible that the file with the model class is missing?
Hi. I have updated the code. When you try to reproduce it, please download all the files into one folder; then I think it will work.
@ZhuLIAO001 I did not forget, but I'm swamped with NeurIPS; I'll get back to this when that is over. Thanks for your patience.
@ZhuLIAO001, we noticed that tests were failing. The issue now seems to be fixed (all tests pass). Can you try using this branch? https://github.com/EIDOSLAB/simplify/tree/development
Hi guys, thank you for your library. It is quite cool.
However, I ran into a problem when using Simplify with a ResNet-18 on the CIFAR-10 dataset. I applied global_unstructured pruning in PyTorch with the sparsity set to 0.875. I then set the weights of a list of neurons in layer4.1.conv1 (the penultimate layer) to zero, and finally applied Simplify. The test accuracy after "pruning" is 92.6%, after "pruning + zeroing the neurons' weights" it is 92.24%, and after "pruning + zeroing the neurons' weights + simplify" it is 92.21%.
My understanding is that Simplify should not change test accuracy at all, since it only removes structurally dead units. Could you please help me understand what happens here? I hope I have described my problem well.
If you want to reproduce, here is the link of my code and required files.
https://partage.imt.fr/index.php/s/9jemn7WfkBWS5tx