
Is the handling of alpha incorrect? #2

Open
woofyzhao opened this issue Apr 27, 2018 · 5 comments

Comments

@woofyzhao

Looking at the original paper, alpha should, like p_t, apply different weights to positive and negative examples.
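The alpha-balancing described in the focal loss paper can be sketched as follows (a minimal NumPy sketch of the paper's formula, not this repo's code): positives are weighted by alpha, negatives by (1 - alpha), mirroring how p_t flips per class.

```python
import numpy as np

def binary_focal_loss(y_true, p, gamma=2.0, alpha=0.25, eps=1e-7):
    """Alpha-balanced focal loss per example.

    Positives get weight alpha; negatives get (1 - alpha),
    the same construction as p_t in the paper."""
    p = np.clip(p, eps, 1.0 - eps)                 # avoid log(0)
    p_t = np.where(y_true == 1, p, 1.0 - p)        # prob of the true class
    alpha_t = np.where(y_true == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)
```

An implementation that multiplies the whole loss by a single scalar alpha, as this issue suggests, drops the alpha_t branch above.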

@alyato

alyato commented Jun 6, 2018

@woofyzhao
@zhezh

I have a small question about this code:

func_fl = focal_loss(labels, model_out, fl_gamma, fl_alpha)

What are labels and model_out?
From the explanation, model_out is said to be the probability after softmax.
Is there a simple example, say one that runs on the MNIST dataset?
Thanks.

@zhezh
Owner

zhezh commented Jun 6, 2018

@alyato
labels is the ground truth y; model_out is the logit, which here is really the output probability. An epsilon is added to avoid log(0).
For example, with three classes, if an image belongs to the second class, then labels = (0, 1, 0); if the model outputs probabilities (0.1, 0.6, 0.3), then model_out ≈ logit = (0.1, 0.6, 0.3).

@alyato

alyato commented Jun 6, 2018

@zhezh
Thanks.
That is exactly what confuses me.
How is the model's output probability model_out obtained?
labels can be fed in directly.
(I'm using Keras with a TensorFlow backend and focal_loss as the loss function. During training, how do I get the model_out value? It feels like it is only available at prediction time.)
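With a Keras custom loss you never fetch model_out yourself: the framework calls loss(y_true, y_pred) on every training batch, and y_pred is the model's forward-pass output. A sketch of that closure pattern, using NumPy as a stand-in for the backend ops (names hypothetical):

```python
import numpy as np

def focal_loss_fn(gamma=2.0, alpha=0.25, eps=1e-7):
    # Keras invokes loss(y_true, y_pred) itself at each step;
    # y_pred *is* model_out (the softmax output of the forward pass).
    def loss(y_true, y_pred):
        p = np.clip(y_pred, eps, 1.0 - eps)
        weight = y_true * (1.0 - p) ** gamma
        return np.sum(-alpha * weight * np.log(p), axis=-1)
    return loss

# With real Keras you would swap np for keras.backend ops and write:
# model.compile(optimizer="adam", loss=focal_loss_fn())
```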

@zheLim

zheLim commented Nov 9, 2019

Logits should be the input of the softmax, not the probability (the output of the softmax).
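If the model emits raw scores (true logits) rather than probabilities, the loss must apply softmax itself before the focal term. A sketch of that variant (my own illustration, not this repo's code):

```python
import numpy as np

def focal_loss_from_logits(labels, logits, gamma=2.0, alpha=0.25):
    # logits are the softmax *inputs*; convert them to probabilities
    # first, shifting by the row max for numerical stability.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    p = np.exp(z) / np.sum(np.exp(z), axis=-1, keepdims=True)
    weight = labels * (1.0 - p) ** gamma
    return np.sum(-alpha * weight * np.log(p + 1e-9), axis=-1)
```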

@yzlnew

yzlnew commented Oct 11, 2021

It's wrong.
