Statistical Theory of Overtraining - Is Cross-Validation Asymptotically Effective?

  • S. Amari
  • N. Murata
  • K. R. Müller
  • M. Finke
  • H. Yang

Research output: Chapter in book/report/conference proceeding › Conference contribution › peer-reviewed

45 Citations (Scopus)

Abstract

A statistical theory of overtraining is proposed. The analysis treats realizable stochastic neural networks trained with Kullback-Leibler loss in the asymptotic case. It is shown that the asymptotic gain in generalization error from early stopping is small, even if we have access to the optimal stopping time. Considering cross-validation stopping, we answer the question of in what ratio the examples should be divided into training and testing sets in order to obtain optimum performance. In the non-asymptotic region, cross-validated early stopping always decreases the generalization error. Our large-scale simulations, done on a CM5, are in good agreement with our analytical findings.
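The procedure the abstract analyzes can be illustrated with a minimal sketch: split the examples between a training set and a testing (validation) set according to some ratio, train by gradient descent, and stop once the validation error stops improving. This is a generic illustration, not the paper's exact experimental setup; the function name, the linear model, and all parameter values (`split_ratio`, `lr`, `patience`) are illustrative assumptions.

```python
import numpy as np

def early_stopping_fit(X, y, split_ratio=0.8, lr=0.1, max_epochs=500, patience=5):
    """Linear regression via gradient descent with cross-validated early stopping.

    split_ratio controls how the examples are divided between the training
    and testing sets -- the quantity whose optimal value the paper analyzes.
    All names and defaults here are illustrative, not taken from the paper.
    """
    n = len(X)
    n_train = int(n * split_ratio)
    X_tr, y_tr = X[:n_train], y[:n_train]      # training set
    X_va, y_va = X[n_train:], y[n_train:]      # testing (validation) set

    w = np.zeros(X.shape[1])
    best_w, best_err, bad_epochs = w.copy(), np.inf, 0
    for _ in range(max_epochs):
        # Full-batch gradient step on the training squared error.
        grad = X_tr.T @ (X_tr @ w - y_tr) / n_train
        w -= lr * grad
        val_err = np.mean((X_va @ w - y_va) ** 2)
        if val_err < best_err:
            best_w, best_err, bad_epochs = w.copy(), val_err, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:  # validation error stopped improving
                break
    return best_w, best_err
```

On synthetic data, e.g. `X` drawn i.i.d. Gaussian and `y = X @ true_w + noise`, the returned weights approximate `true_w` and `best_err` sits near the noise floor; the non-asymptotic benefit described in the abstract corresponds to stopping before the training error is fully minimized.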

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 8, NIPS 1995
Editors: D. Touretzky, M.C. Mozer, M. Hasselmo
Publisher: Neural Information Processing Systems Foundation
Pages: 176-182
Number of pages: 7
ISBN (electronic): 0262201070, 9780262201070
Publication status: Published - 1995
Externally published: Yes
Event: 8th Advances in Neural Information Processing Systems, NIPS 1995 - Denver, United States
Duration: 27 Nov 1995 → 30 Nov 1995

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 8
ISSN (print): 1049-5258
