AdaBelief Optimizer: Adapting Stepsizes by the Belief in Observed Gradients #19

TL;DR

A variant of Adam that improves generalization performance, convergence rate, and training stability.

Paper link

https://arxiv.org/pdf/2010.07468.pdf

Authors / Affiliations

Juntang Zhuang, Tommy Tang, Yifan Ding, Sekhar Tatikonda, Nicha Dvornek, Xenophon Papademetris, James S. Duncan
(Yale University, University of Illinois at Urbana-Champaign, University of Central Florida)

Submission date (yyyy/MM/dd)

2020/12/20

Overview

(Screenshot omitted.)

Novelty / Differences

The step size is adapted based on the difference between the predicted gradient and the actually observed gradient (see the update rule sketched below).
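
Concretely, a sketch of the update rule as stated in the paper (using Adam's notation); the only change from Adam is the second-moment term:

```latex
% Adam's second moment: an EMA of squared gradients
v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2
% AdaBelief replaces it with the squared deviation of the observed gradient g_t
% from its EMA prediction m_t (the "belief" in the gradient)
m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t, \qquad
s_t = \beta_2 s_{t-1} + (1 - \beta_2)\,(g_t - m_t)^2
% Parameter update with bias-corrected \hat{m}_t, \hat{s}_t (same form as Adam)
\theta_t = \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{s}_t} + \epsilon}
```

When the observed gradient stays close to its prediction, the denominator is small and the effective step size grows; when it deviates strongly, the step size shrinks.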

Method

(Screenshot omitted.)
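
As a concrete reference, below is a minimal NumPy sketch of a single AdaBelief update step, written from the paper's algorithm as I read it; the function name `adabelief_step`, the `state` dict layout, and the toy example are my own, not an official implementation.

```python
import numpy as np

def adabelief_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """Apply one AdaBelief update to `param` in place and return it."""
    state["t"] += 1
    t = state["t"]

    # First moment: EMA of gradients (same as Adam); acts as the "predicted" gradient.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad

    # Second moment: EMA of the squared deviation between the observed gradient
    # and its prediction m_t (Adam would use grad**2 here instead).
    diff = grad - state["m"]
    state["s"] = beta2 * state["s"] + (1 - beta2) * diff * diff + eps

    # Bias correction, as in Adam.
    m_hat = state["m"] / (1 - beta1 ** t)
    s_hat = state["s"] / (1 - beta2 ** t)

    # Step size shrinks when the observed gradient deviates from the prediction.
    param -= lr * m_hat / (np.sqrt(s_hat) + eps)
    return param

# Usage example on a toy quadratic loss f(x) = 0.5 * ||x||^2 (gradient is x).
x = np.array([1.0, -2.0])
state = {"m": np.zeros_like(x), "s": np.zeros_like(x), "t": 0}
for _ in range(100):
    x = adabelief_step(x, grad=x.copy(), state=state, lr=0.1)
print(x)  # should be close to the minimizer [0, 0]
```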

Results

(Result screenshots omitted.)

Comments
