LPIPS Loss producing negative values #72
Comments
Is there a good workaround for this?
If the code is installed and the weights are loaded properly (and weren't changed by accidentally fine-tuning them, for example), it is not possible to get negative values. Check that the weights are all non-negative by doing the following:
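A minimal sketch of such a check, assuming the pip-installed `lpips` package (where the learned linear layers are registered under names containing `lin`; the exact attribute layout may differ between versions):

```python
import lpips

# Load the metric with its shipped weights.
loss_fn = lpips.LPIPS(net='alex')

# Inspect only the learned 1x1 linear layers; the backbone (e.g. AlexNet)
# naturally contains negative weights, so it is excluded from the check.
for name, param in loss_fn.named_parameters():
    if 'lin' in name:
        print(f"{name}: min = {param.data.min().item():.6f}")  # expected >= 0
```

If any of these minima is negative, the linear weights were altered (for example by fine-tuning), which can make the reported distance negative.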
Thank you, this makes perfect sense.
For those trapped in this same problem: if you load LPIPS with `lpips = lpips.LPIPS(net='vgg')`, remember to freeze the gradients! Do the following instead: `lpips = lpips.LPIPS(net='vgg').requires_grad_(False)`
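A short usage sketch of that suggestion (the image tensors and sizes here are placeholders; `lpips` expects inputs scaled to [-1, 1]):

```python
import torch
import lpips

# Freeze the metric so an optimizer attached to the training graph
# can never update its weights by accident.
loss_fn = lpips.LPIPS(net='vgg').requires_grad_(False).eval()

# Placeholder images in [-1, 1], shape (N, 3, H, W).
img0 = torch.rand(1, 3, 64, 64) * 2 - 1
img1 = torch.rand(1, 3, 64, 64) * 2 - 1

d = loss_fn(img0, img1)
print(d.item())  # non-negative with the original (non-negative) linear weights
```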
Hi,

While running the LPIPS loss based on AlexNet, I obtained a negative value.

While looking at the values contained in `res` (defined in the `forward()`), I have noticed that the implementation does not match Eq. 1 from the paper. Here's Eq. 1:

d(x, x0) = Σ_l 1/(H_l W_l) Σ_{h,w} || w_l ⊙ (ŷ^l_hw − ŷ^l_0,hw) ||²_2

whereas what is implemented squares the per-channel feature differences first (`diffs[kk] = (feats0[kk] - feats1[kk]) ** 2`) and only then applies the learned linear weights.

The square operation `** 2` at line 94 should be removed and instead applied to `self.lins[kk].model(diffs[kk])` (at lines 98 and 100), and to `diffs[kk]` (at lines 103 and 105).

Thanks in advance,
Guillaume
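To make the discrepancy concrete, here is a hedged sketch (with made-up tensors, not the library's code) contrasting Eq. 1 with the computation described above. Both are sums of non-negative terms, so neither can produce a negative value as long as the learned per-channel weights stay non-negative, which is the point of the reply above:

```python
import torch

C, H, W = 8, 4, 4
feat0 = torch.randn(1, C, H, W)   # stand-in for normalized features ŷ^l of x
feat1 = torch.randn(1, C, H, W)   # stand-in for normalized features ŷ^l_0 of x0
w = torch.rand(C)                 # stand-in for the learned weights w_l (>= 0)

diff = feat0 - feat1

# Eq. 1: weight the difference, take the squared L2 norm over channels,
# then average spatially.
eq1 = ((w.view(1, C, 1, 1) * diff) ** 2).sum(dim=1).mean()

# What the issue says is implemented: square first, then apply the weights
# (effectively using w instead of w**2 per channel).
impl = (w.view(1, C, 1, 1) * diff ** 2).sum(dim=1).mean()

print(eq1.item(), impl.item())    # both >= 0
```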