Node perturbation learning without noiseless baseline

Tatsuya Cho, Kentaro Katahira, Kazuo Okanoya, Masato Okada

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

Node perturbation learning is a stochastic gradient descent method for neural networks. It estimates the gradient by comparing the evaluation of the perturbed output with that of the unperturbed output, which we call the baseline. Node perturbation learning has primarily been investigated without taking noise on the baseline into account. In real biological systems, however, neural activities are intrinsically noisy, so the baseline is likely contaminated with noise. In this paper, we propose an alternative learning method that does not require such a noiseless baseline. Our method uses a "second perturbation", which is generated with noise different from that of the first perturbation. The network weights are updated by comparing the evaluation of the outcome under the first perturbation with that under the second perturbation. We show that the learning speed decreases only linearly with the variance of the second perturbation. Moreover, using the second perturbation can reduce the residual error below that obtained with the noiseless baseline.
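The abstract does not spell out the update rule, so the following is a minimal sketch of the idea under stated assumptions: it follows the standard node-perturbation form, but replaces the noiseless baseline evaluation with a second, independently perturbed evaluation. The linear teacher-student task and all names (sigma1, sigma2, eta, W_teacher) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: a single linear layer trained to match a
# fixed teacher map. Everything below is a demonstration setup.
n_in, n_out = 10, 5
W_teacher = rng.standard_normal((n_out, n_in))
W = np.zeros((n_out, n_in))

eta = 0.01      # learning rate (assumed value)
sigma1 = 0.1    # std of the first (gradient-probing) perturbation
sigma2 = 0.1    # std of the second perturbation (replaces the baseline)

def error(y, t):
    # Quadratic evaluation of an output against the target.
    return 0.5 * np.sum((y - t) ** 2)

for step in range(20000):
    x = rng.standard_normal(n_in)
    t = W_teacher @ x                  # target output
    y = W @ x                          # unperturbed network output

    xi1 = sigma1 * rng.standard_normal(n_out)  # first perturbation
    xi2 = sigma2 * rng.standard_normal(n_out)  # second perturbation

    # Compare two perturbed evaluations instead of a noiseless baseline.
    dE = error(y + xi1, t) - error(y + xi2, t)

    # Correlate the error difference with the first perturbation to form
    # a stochastic gradient estimate, then take a descent step.
    W -= eta * (dE / sigma1**2) * np.outer(xi1, x)

x = rng.standard_normal(n_in)
print("final per-sample error:", error(W @ x, W_teacher @ x))
```

Because the second perturbation is drawn independently of the first, its evaluation is uncorrelated with xi1 and the gradient estimate stays unbiased; the extra variance it injects is consistent with the abstract's observation that the learning speed decreases only linearly with the variance of the second perturbation.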

Original language: English
Pages (from-to): 267-272
Number of pages: 6
Journal: Neural Networks
Volume: 24
Issue number: 3
DOI
Publication status: Published - April 2011
Externally published: Yes
