[Info] Pinned index of DataScience resources
Since pinned-post space is limited, the moderator has compiled this post as a reference index.
Everyone is welcome to contribute resources and suggestions in posts or push comments; the moderator will update this list from time to time.
# ---------------------------------------------------
# Info posts from board members
Author: MLLAB  [Info] ML resources #1Qcx4QMU (DataScience)
Author: aa155495  [Info] Mobile Deep Learning Resources #1QcywWWC (DataScience)
Author: ruthertw  [Repost] The most complete machine-learning self-study guide ever... #5QWh1oWM (DataScience)
Author: aaaba  [Repost] Machine learning with TensorFlow... #5Qb-tSvF (DataScience)
Author: RumiManiac  [Question] Applications of machine learning to anime #1Q-Gv3pO (DataScience)
Since the board still has few posts, part of their content is listed directly below.
# MLLAB [Info] ML resources #1Qcx4QMU (DataScience)
Prof. Hsuan-Tien Lin (林軒田), NTU CSIE
Machine Learning Foundations
Machine Learning Techniques
https://www.csie.ntu.edu.tw/~htlin/mooc/
Prof. Hung-yi Lee (李宏毅), NTU EE
Machine Learning (ML)
Machine Learning and Having It Deep and Structured (MLDS)
http://speech.ee.ntu.edu.tw/~tlkagk/courses.html
Prof. Hung-yi Lee (NTU EE) & Prof. Yun-Nung Chen (陳縕儂, NTU CSIE)
Applied Deep Learning x Machine Learning and Having It Deep and Structured
(ADLxMLDS)
https://www.csie.ntu.edu.tw/~yvchen/f106-adl/syllabus.html
Stanford Andrew Ng
Machine Learning
https://www.coursera.org/learn/machine-learning
University of Oxford
Machine Learning
https://www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/
David Silver
RL course
https://goo.gl/BGwF63
Stanford
CNN for Visual Recognition
http://cs231n.stanford.edu/syllabus.html
Ian Goodfellow
Deep Learning
https://www.youtube.com/playlist?list=PLkISDyMVw2Htm42P0eTVEKyz7scxZ4V-O
UIUC Dan Roth
Machine Learning
https://goo.gl/124noX
Prof. Yuh-Jye Lee (李育杰), NCTU Applied Mathematics
Machine Learning
http://ocw.nctu.edu.tw/course_detail-v.php?bgid=1&gid=1&nid=563
Data-science-related courses,
covering everything from calculus, linear algebra, probability, and statistics up to machine learning
https://goo.gl/mKlq8r
CS-related courses
https://www.ptt.cc/bbs/studyabroad/M.1511862466.A.D02.html
Deadlines for AI/ML conferences
https://aideadlin.es/
Papers
https://openreview.net/
https://arxiv.org/list/stat.ML/recent
https://www.aaai.org/Library/conferences-library.php
CV-related papers: http://openaccess.thecvf.com/menu.py
GAN-related papers: https://deephunt.in/the-gan-zoo-79597dc8c347
TensorFlow-related resources (a minimal usage sketch follows at the end of this section)
Tutorials
https://www.tensorflow.org/tutorials/
Code examples
https://github.com/aymericdamien/TensorFlow-Examples
Forums
https://www.reddit.com/r/MachineLearning/
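To make the TensorFlow entries above a little more concrete, here is a minimal sketch of the kind of model the official tutorials start with. It is my own illustration, not code copied from the linked material, and it assumes TensorFlow 2.x with the bundled Keras API; the dataset, layer sizes, and epoch count are arbitrary choices.

```python
import tensorflow as tf

# Load MNIST digits and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small feed-forward network: flatten -> hidden ReLU layer -> 10-way logits.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10),
])

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

model.fit(x_train, y_train, epochs=2)      # short run, just to demonstrate the API
model.evaluate(x_test, y_test, verbose=2)  # report test loss/accuracy
```

The same Sequential / compile / fit pattern is the starting point for most of the introductory tutorials linked above.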
# aa155495 [Info] Mobile Deep Learning Resources #1QcywWWC (DataScience)
Survey paper
A Survey of Model Compression and Acceleration for Deep Neural Networks
[arXiv '17]
https://arxiv.org/abs/1710.09282
--------------------------------------------------------
Lightweight models (a toy depthwise-separable-convolution sketch follows this list)
1. MobileNetV2: Inverted Residuals and Linear Bottlenecks: Mobile Networks for Classification, Detection and Segmentation [arXiv '18, Google]
https://arxiv.org/pdf/1801.04381.pdf
2. NASNet: Learning Transferable Architectures for Scalable Image Recognition [arXiv '17, Google]
Note: this is the Google AutoML paper
https://arxiv.org/pdf/1707.07012.pdf
3. DeepRebirth: Accelerating Deep Neural Network Execution on Mobile Devices [AAAI '18, Samsung]
https://arxiv.org/abs/1708.04728
4. ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices [arXiv '17, Megvii]
https://arxiv.org/abs/1707.01083
5. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications [arXiv '17, Google]
https://arxiv.org/abs/1704.04861
6. CondenseNet: An Efficient DenseNet using Learned Group Convolutions [arXiv '17]
https://arxiv.org/abs/1711.09224
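Several of the architectures above (MobileNets, MobileNetV2, ShuffleNet) are built around depthwise separable convolutions. The sketch below is my own toy illustration of that building block in tf.keras, not code from any of the papers; the input size, channel counts, and strides are arbitrary.

```python
import tensorflow as tf

def separable_block(x, out_channels, stride=1):
    """Depthwise 3x3 conv + BN + ReLU, then pointwise 1x1 conv + BN + ReLU.
    Replaces a full 3x3 conv, cutting parameters and multiply-adds roughly by
    a factor of 9 * C_out / (9 + C_out)."""
    x = tf.keras.layers.DepthwiseConv2D(3, strides=stride, padding="same", use_bias=False)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.ReLU()(x)
    x = tf.keras.layers.Conv2D(out_channels, 1, padding="same", use_bias=False)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    return tf.keras.layers.ReLU()(x)

# A tiny MobileNet-flavored stack, just to show how the block composes.
inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.layers.Conv2D(32, 3, strides=2, padding="same", use_bias=False)(inputs)
x = separable_block(x, 64)
x = separable_block(x, 128, stride=2)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(1000)(x)
model = tf.keras.Model(inputs, outputs)
model.summary()  # compare parameter counts against a plain Conv2D baseline
```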
------------------------------------------------------------
System
1. DeepMon: Mobile GPU-based Deep Learning Framework for Continuous Vision Applications [MobiSys '17]
https://www.sigmobile.org/mobisys/2017/accepted.php
2. DeepEye: Resource Efficient Local Execution of Multiple Deep Vision Models using Wearable Commodity Hardware [MobiSys '17]
http://fahim-kawsar.net/papers/Mathur.MobiSys2017-Camera.pdf
3. MobiRNN: Efficient Recurrent Neural Network Execution on Mobile GPU [EMDL '17]
https://arxiv.org/abs/1706.00878
4. DeepSense: A GPU-based deep convolutional neural network framework on commodity mobile devices [WearSys '16]
http://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=4278&context=sis_research
5. DeepX: A Software Accelerator for Low-Power Deep Learning Inference on Mobile Devices [IPSN '16]
http://niclane.org/pubs/deepx_ipsn.pdf
6. EIE: Efficient Inference Engine on Compressed Deep Neural Network [ISCA '16]
https://arxiv.org/abs/1602.01528
7. MCDNN: An Approximation-Based Execution Framework for Deep Stream Processing Under Resource Constraints [MobiSys '16]
http://haneul.github.io/papers/mcdnn.pdf
8. DXTK: Enabling Resource-efficient Deep Learning on Mobile and Embedded Devices with the DeepX Toolkit [MobiCASE '16]
9. Sparsification and Separation of Deep Learning Layers for Constrained Resource Inference on Wearables [SenSys '16]
10. An Early Resource Characterization of Deep Learning on Wearables, Smartphones and Internet-of-Things Devices [IoT-App '15]
11. CNNdroid: GPU-Accelerated Execution of Trained Deep Convolutional Neural Networks on Android [MM '16]
12. fpgaConvNet: A Toolflow for Mapping Diverse Convolutional Neural Networks on Embedded FPGAs [NIPS '17]
--------------------------------------------------------------
Quantization (Model compression) (a toy sketch follows this list)
1. The ZipML Framework for Training Models with End-to-End Low Precision: The Cans, the Cannots, and a Little Bit of Deep Learning [ICML '17]
2. Compressing Deep Convolutional Networks using Vector Quantization [arXiv '14]
3. Quantized Convolutional Neural Networks for Mobile Devices [CVPR '16]
4. Fixed-Point Performance Analysis of Recurrent Neural Networks [ICASSP '16]
5. Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations [arXiv '16]
6. Loss-aware Binarization of Deep Networks [ICLR '17]
7. Towards the Limit of Network Quantization [ICLR '17]
8. Deep Learning with Low Precision by Half-wave Gaussian Quantization [CVPR '17]
9. ShiftCNN: Generalized Low-Precision Architecture for Inference of Convolutional Neural Networks [arXiv '17]
10. Training and Inference with Integers in Deep Neural Networks [ICLR '18]
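As a rough illustration of what the quantization papers above are optimizing, here is a toy NumPy sketch of symmetric per-tensor 8-bit weight quantization. It is my own simplification, not the method of any specific paper; real schemes add per-channel scales, calibration data, and quantization-aware training.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor quantization of a float array to int8 plus a scale."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor for inference-time matmuls."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(scale=0.05, size=(256, 256)).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)

# 4x smaller storage (int8 vs float32) at the cost of a small reconstruction error.
print("max abs error:", np.max(np.abs(w - w_hat)))
print("bytes: float32 =", w.nbytes, " int8 =", q.nbytes)
```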
------------------------------------------------------------
Pruning (Model compression) (a toy sketch follows this list)
1. Learning both Weights and Connections for Efficient Neural Networks [NIPS '15]
2. Pruning Filters for Efficient ConvNets [ICLR '17]
3. Pruning Convolutional Neural Networks for Resource Efficient Inference [ICLR '17]
4. Soft Weight-Sharing for Neural Network Compression [ICLR '17]
5. Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding [ICLR '16]
6. Dynamic Network Surgery for Efficient DNNs [NIPS '16]
7. Designing Energy-Efficient Convolutional Neural Networks using Energy-Aware Pruning [CVPR '17]
8. ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression [ICCV '17]
9. To prune, or not to prune: exploring the efficacy of pruning for model compression [ICLR '18]
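For intuition about the pruning papers above, here is a toy NumPy sketch of unstructured magnitude pruning. This is my own simplification rather than any paper's exact procedure; published methods typically prune gradually and fine-tune between steps, and several of the papers prune whole filters instead of individual weights.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Return (pruned_weights, mask) with roughly `sparsity` of the weights zeroed."""
    k = int(np.round(sparsity * w.size))
    if k == 0:
        return w.copy(), np.ones_like(w, dtype=bool)
    # Threshold = k-th smallest absolute value; everything at or below it is pruned.
    threshold = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    mask = np.abs(w) > threshold
    return w * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(512, 512)).astype(np.float32)
w_pruned, mask = magnitude_prune(w, sparsity=0.9)

print("fraction of weights kept:", mask.mean())  # roughly 0.1
# During fine-tuning one would re-apply the mask after every gradient step, e.g.
#   w = (w - lr * grad) * mask
```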
---------------------------------------------------------------
Approximation (a toy low-rank sketch follows this list)
1. Efficient and Accurate Approximations of Nonlinear Convolutional Networks [CVPR '15]
2. Accelerating Very Deep Convolutional Networks for Classification and Detection (extended version of the one above)
3. Convolutional neural networks with low-rank regularization [arXiv '15]
4. Exploiting Linear Structure Within Convolutional Networks for Efficient Evaluation [NIPS '14]
5. Compression of Deep Convolutional Neural Networks for Fast and Low Power Mobile Applications [ICLR '16]
6. High performance ultra-low-precision convolutions on mobile devices [NIPS '17]
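The approximation papers above mostly exploit low-rank structure in weight tensors. Below is a toy NumPy sketch of the basic idea for a fully connected layer using truncated SVD; the matrix size and rank are arbitrary choices of mine, and real methods factor convolutional kernels and fine-tune the network afterwards.

```python
import numpy as np

def low_rank_factor(w, rank):
    """Truncated SVD: W (m x n) ~ A @ B with A (m x rank) and B (rank x n)."""
    u, s, vt = np.linalg.svd(w, full_matrices=False)
    a = u[:, :rank] * s[:rank]   # absorb singular values into A
    b = vt[:rank, :]
    return a, b

rng = np.random.default_rng(0)
w = rng.normal(size=(1024, 1024)).astype(np.float32)
a, b = low_rank_factor(w, rank=64)

x = rng.normal(size=(1, 1024)).astype(np.float32)
y_full = x @ w
y_low = (x @ a) @ b   # m*r + r*n multiplies per row instead of m*n

rel_err = np.linalg.norm(y_full - y_low) / np.linalg.norm(y_full)
# The error is large for a random W; trained layers are far more compressible.
print("relative error:", rel_err)
```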
Other recommendations from board members
1. Udacity's free deep-learning course, taught by Google scientists
https://www.udacity.com/course/deep-learning--ud730
(taught entirely in English, so it should also be a good resource for anyone planning to study abroad)
2. Stanford's course on machine learning for natural language processing
https://www.youtube.com/watch?v=OQQ-W_63UgQ&list=PL3FW7Lu3i5Jsnh1rnUwq_TcylNr7EkRe6
3. Deep Learning (English-language textbook)
http://www.deeplearningbook.org/
4. Two courses to continue with after finishing Prof. Hsuan-Tien Lin's courses
https://goo.gl/HV39mG
https://goo.gl/JK2esy
--
The most beautiful songs are the most despairing songs;
some immortal verses are nothing but pure tears.
    -- Alfred de Musset
※ Posted from: PTT (ptt.cc), from: 1.163.147.50
※ Article URL: https://www.ptt.cc/bbs/DataScience/M.1521727605.A.4DF.html
推 03/23 20:37, 2F: Prof. Hsuan-Tien Lin's course materials seem to be quite well received
推 04/11 14:08, 7F: Updated
→ 04/17 15:14, 11F: Updated
※ Edited: st1009 (1.163.154.100), 05/16/2018 23:26:55