Re: [Call for Articles] Self-Normalizing Neural Networks
Thanks for the detailed explanation, MXNet.
I'd like to ask MXNet about something that has puzzled me for a long time:
SELU is supposed to "make feed-forward great again",
but does it also have the self-normalizing effect when used in convolution layers?
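(For reference, a minimal NumPy sketch of SELU as defined in the paper; the
alpha/lambda constants are the fixed-point values from Klambauer et al. 2017:)

import numpy as np

# SELU constants from the paper, chosen so that zero mean /
# unit variance is a fixed point of the activation's mapping
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    # scaled ELU: lambda*x for x > 0, lambda*alpha*(exp(x)-1) otherwise
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1))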
Judging from the experience of the author of this post, who used DCGAN:
https://ajolicoeur.wordpress.com/cats/
“All my initial attempts at generating cats in 128 x 128 with DCGAN failed.
However, simply by replacing the batch normalizations and ReLUs with SELUs,
I was able to get slow (6+ hours) but steady convergence with the same learning
rates as before.
SELUs are self-normalizing and thus remove the need for batch normalization.”
It looks like SELU can also be used in convolution layers and still self-normalize,
but I wonder whether the math supports this as well?
The derivation in the SELU paper assumes a feed-forward setting, doesn't it?
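(For concreteness, a hypothetical PyTorch sketch of the swap the blog post
describes: BatchNorm + ReLU replaced by SELU in one DCGAN-style generator
block. The channel and kernel sizes here are made up for illustration; note
the paper also pairs SELU with lecun-normal weight initialization:)

import torch.nn as nn

# before: ConvTranspose2d -> BatchNorm2d -> ReLU
# after:  ConvTranspose2d -> SELU (batch norm dropped entirely)
block = nn.Sequential(
    nn.ConvTranspose2d(256, 128, kernel_size=4, stride=2, padding=1, bias=False),
    nn.SELU(inplace=True),
)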
--
※ Origin: 批踢踢實業坊 (ptt.cc), From: 140.112.25.100
※ Article URL: https://www.ptt.cc/bbs/DataScience/M.1531115157.A.8F5.html
※ Edited: PyTorch (140.112.25.100), 07/09/2018 13:48:18