
Gradient Descent with Early Stopping is Provably Robust to Label Noise for Overparameterized Neural Networks #17

Open
nocotan opened this issue Jan 5, 2021 · 0 comments
nocotan commented Jan 5, 2021

In a nutshell

Shows that an overparameterized DNN trained by gradient descent, with early stopping as the stopping rule, is robust to label noise.

Paper link

AISTATS2020
http://proceedings.mlr.press/v108/li20j/li20j.pdf

Authors / Affiliations

Mingchen Li (University of California), Mahdi Soltanolkotabi (University of Southern California), Samet Oymak (University of California)

Submission date (yyyy/MM/dd)

2019/03/27

Overview

As shown in the figure below, an overparameterized DNN

  1. behaves as if it ignores label noise in the early iterations, and
  2. begins to overfit (memorize the noisy labels) in the later iterations.

(Screenshot: figure from the paper illustrating the two training phases)
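To make the two phases concrete, here is a minimal toy sketch (my own illustration, not the paper's experiment): overparameterized linear regression trained by gradient descent on heavily noisy labels. The distance to the true parameter first shrinks (the signal is fit early) and then grows as the model starts interpolating the noise, so early stopping beats running to convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparameterized regression: more features (p) than samples (n).
n, p = 50, 200
X = rng.standard_normal((n, p))
w_star = rng.standard_normal(p)
w_star /= np.linalg.norm(w_star)               # unit-norm ground-truth signal
y = X @ w_star + 3.0 * rng.standard_normal(n)  # labels with strong noise

# Plain gradient descent on the squared loss, starting from zero.
w = np.zeros(p)
lr = 1.0 / np.linalg.norm(X, 2) ** 2           # stable step size
param_err = []                                 # ||w_t - w*||^2, a proxy for test error
for t in range(3000):
    w -= lr * X.T @ (X @ w - y)
    param_err.append(np.linalg.norm(w - w_star) ** 2)

best_t = int(np.argmin(param_err))
print(f"best (early-stop) iteration: {best_t}, error {param_err[best_t]:.3f}")
print(f"final iteration error: {param_err[-1]:.3f}")
```

The paper's analysis concerns DNNs with a first-hidden-layer parameterization; the linear case above only mirrors the qualitative phase behavior in the figure.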

Novelty / Differences

Method

(Screenshot: method figure from the paper)

Results

(Screenshots: result figures from the paper)

Comments

This also seems somewhat at odds with the recent Deep Double Descent phenomenon.
https://arxiv.org/abs/1912.02292

@nocotan nocotan self-assigned this Jan 5, 2021