Re: [Call for Posts] Self-Normalizing Neural Networks

Board: DataScience / Author: PyTorch (PY火炬) / Date: 2018/07/09 13:45 / Thread 2 of 4
Thanks to MXNet for the detailed explanation. I'd like to ask MXNet about something that has puzzled me for a while: SELU is supposed to "make feed-forward networks great again", but does it still have a self-normalizing effect when applied to convolutional layers?

Judging from this post author's experience with DCGAN:
https://ajolicoeur.wordpress.com/cats/

"All my initial attempts at generating cats in 128 x 128 with DCGAN failed. However, simply by replacing the batch normalizations and ReLUs with SELUs, I was able to get slow (6+ hours) but steady convergence with the same learning rates as before. SELUs are self-normalizing and thus remove the need for batch normalization."

It looks as though SELU also works in convolutional layers and still self-normalizes. Does the math actually support this? The derivation in the SELU paper assumes a feed-forward (fully connected) network, doesn't it?
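For anyone who wants to poke at this empirically, here is a minimal PyTorch sketch (my own toy example, not code from the linked post): a conv block with BatchNorm+ReLU replaced by SELU plus lecun_normal-style initialization, as the SELU paper recommends, and a loop that prints activation statistics layer by layer. All layer sizes here are made up for illustration.

```python
import torch
import torch.nn as nn

# A conv block in the style of the linked DCGAN experiment: the usual
# Conv2d + BatchNorm2d + ReLU is replaced by Conv2d + SELU alone.
# Channel sizes are arbitrary, chosen only for illustration.
def selu_conv_block(in_ch, out_ch):
    conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=1, padding=1)
    # The SELU fixed-point derivation assumes lecun_normal-style weights
    # (zero mean, variance 1/fan_in); kaiming_normal_ with gain 1
    # ('linear' nonlinearity) gives exactly that.
    nn.init.kaiming_normal_(conv.weight, nonlinearity='linear')
    nn.init.zeros_(conv.bias)
    return nn.Sequential(conv, nn.SELU())

# Empirical version of the question in this post: push unit-Gaussian
# input through stacked conv blocks and watch whether activations stay
# near zero mean / unit variance without any BatchNorm.
if __name__ == "__main__":
    torch.manual_seed(0)
    net = nn.Sequential(selu_conv_block(3, 32),
                        selu_conv_block(32, 64),
                        selu_conv_block(64, 64))
    x = torch.randn(8, 3, 32, 32)  # zero-mean, unit-variance input
    for i, block in enumerate(net):
        x = block(x)
        print(f"block {i}: mean={x.mean():+.3f}, std={x.std():.3f}")
```

One hedged intuition for why the feed-forward derivation might carry over: each output unit of a convolution is still a weighted sum over its fan_in (receptive field × input channels), so the same central-limit-style argument per unit should apply, although the paper's independence assumptions are only approximately satisfied for overlapping convolution windows. I'd still like to hear whether this holds up mathematically.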
--
※ Posted from: PTT (ptt.cc), from: 140.112.25.100
※ Article URL: https://www.ptt.cc/bbs/DataScience/M.1531115157.A.8F5.html
※ Edited: PyTorch (140.112.25.100), 07/09/2018 13:48:18
Article ID (AID): #1RGlQLZr (DataScience)