This document records my learning path in machine learning.
To Do
- [ ] RNN and LSTM
- [ ] Introduce Li Hongyi's LSTM lecture here
- [ ] Residual Network
- [ ] Linear Regression vs Logistic Regression: how the activation layer is expressed in Keras
- [ ] Logistic Regression vs a no-hidden-layer NN, with and without an activation
- [ ] L1 regularization vs L2 regularization
- [ ] Is `linear` activation in Keras the same as `None` activation? If they are the same, why does stacking several linear layers perform noticeably better, when stacked linear layers should be equivalent to a single linear layer?
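On the last question above, the algebra says stacked linear (affine) layers with no nonlinearity in between do collapse into a single affine map. A minimal NumPy sketch (layer sizes and weights are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # a batch of 4 inputs with 3 features

# Two "linear layers" (affine maps) stacked with no activation in between.
W1, b1 = rng.normal(size=(3, 5)), rng.normal(size=5)
W2, b2 = rng.normal(size=(5, 2)), rng.normal(size=2)
stacked = (x @ W1 + b1) @ W2 + b2

# An equivalent single affine map: (xW1 + b1)W2 + b2 = x(W1 W2) + (b1 W2 + b2).
W, b = W1 @ W2, b1 @ W2 + b2
single = x @ W + b

print(np.allclose(stacked, single))  # True
```

So if stacking `linear`-activated layers in Keras genuinely helps, the gain should come from something other than extra representational power (e.g. optimization dynamics), which is worth verifying experimentally.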
- Linear algebra online course https://www.bilibili.com/video/av15463995/?p=1
- Kaggle https://www.kaggle.com/
- MIT https://www.bilibili.com/video/av15463995
- 李宏毅 (Li Hongyi) https://www.bilibili.com/video/av22727915?from=search&seid=8427486326617068898
- Linear Regression vs Logistic Regression
  - Linear regression is a true regression problem: the output y is a continuous value.
  - Logistic regression is actually a classification problem (see Andrew Ng's introduction to logistic regression). The "regression" in its name is a historical artifact, so although logistic regression carries "regression" in the name, it is in essence a classification method.
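The contrast above can be sketched in a few lines of NumPy: the same affine score `w*x + b` is the final output for linear regression, while logistic regression passes it through a sigmoid to get a probability and thresholds it into a class label. The weights here are arbitrary placeholders, not fitted values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 1-D inputs; w and b are arbitrary "fitted" parameters for illustration.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
w, b = 1.5, 0.2

linear_out = w * x + b              # linear regression: continuous, unbounded output
logistic_out = sigmoid(w * x + b)   # logistic regression: probability in (0, 1)
labels = (logistic_out >= 0.5).astype(int)  # threshold at 0.5 -> binary class labels

print(linear_out)
print(logistic_out)
print(labels)   # [0 0 1 1 1]
```

Note that thresholding at probability 0.5 is the same as checking the sign of `w*x + b`, which is why logistic regression learns a linear decision boundary even though its output is nonlinear in x.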