We use data from the sklearn library (the face dataset needs to be downloaded separately), and the IDE is Sublime Text 3. Most of the code comes from the book: https://www.goodreads.com/book/show/32439431-introduction-to-machine-learning-with-python?from_search=true
Please read this first: https://charleshsliao.wordpress.com/2017/04/14/identify-arguments-of-h2o-deep-learning-model-with-tuned-auto-encoder-in-r-with-mnist/ Following the autoencoder-tuned arguments from the last article, and the sample FNN model at the end of that article, we can build a full FNN model for MNIST.
Several regularization methods help reduce overfitting in a neural network model. 1. The L1 penalty is also known as the Least Absolute Shrinkage and Selection Operator (lasso). The penalty term uses the sum of the absolute weights, so the penalty scales in direct proportion to each weight, neither disproportionately smaller nor larger for small or large weights. People are more familiar… Continue reading Regularization in Neural Network, with MNIST and Deepnet of R
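The post itself uses Deepnet in R, but the L1-versus-L2 contrast can be shown in a few lines of Python; the weight vector and lambda below are made-up illustrative values, not from the post:

```python
import numpy as np

# Hypothetical weight vector from one layer of a trained network.
w = np.array([0.5, -2.0, 0.01, 3.0])
lam = 0.1  # illustrative regularization strength

# L1 (lasso) penalty: lambda times the sum of absolute weights,
# so the penalty grows linearly with each weight's magnitude.
l1_penalty = lam * np.sum(np.abs(w))

# For comparison, the L2 (ridge) penalty squares the weights,
# so large weights are punished disproportionately more.
l2_penalty = lam * np.sum(w ** 2)

print(l1_penalty)  # 0.1 * (0.5 + 2.0 + 0.01 + 3.0) = 0.551
print(l2_penalty)  # 0.1 * (0.25 + 4.0 + 0.0001 + 9.0) = 1.32501
```

Note how the single large weight (3.0) contributes 0.9 of the L2 penalty but only 0.3 of the L1 penalty; this proportional treatment is also what lets lasso drive small weights exactly to zero.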
Googled MLP and so many "My Little Pony" results popped up. LOL. 🙂 Generally speaking, a deep learning model means a neural network model with more than one hidden layer. Whether a deep learning model succeeds depends largely on how its parameters are tuned. The Multi-layer Perceptron (MLP) provided by the R package "RSNNS"… Continue reading Tune Multi-layer Perceptron (MLP) in R with MNIST
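The post tunes an MLP in R; a rough Python analogue of the same idea (grid-searching hidden layer sizes and learning rate) can be sketched with sklearn's MLPClassifier. The digits dataset here is a small 8x8 stand-in for MNIST, and the parameter grid is illustrative, not the post's:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier

# Small 8x8 digit images as a quick stand-in for MNIST.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Tune the parameters that matter most: layer sizes and learning rate.
param_grid = {
    "hidden_layer_sizes": [(64,), (64, 32)],  # one vs. two hidden layers
    "learning_rate_init": [0.001, 0.01],
}
search = GridSearchCV(
    MLPClassifier(max_iter=300, random_state=42), param_grid, cv=3
)
search.fit(X_train, y_train)
print(search.best_params_, search.score(X_test, y_test))
```

The two-layer configuration is what makes this "deep" in the sense above; cross-validated grid search picks between it and the single-layer baseline.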
It is generally acknowledged that the SVM algorithm is relatively slow to train, even with tuned parameters such as cost and kernel. The general way to boost speed is to apply the packages "parallel", "doParallel", and "doSNOW" together with the foreach function. Data and background: https://charleshsliao.wordpress.com/2017/02/24/svm-tuning-based-on-mnist/ It is not ensured that we can increase… Continue reading Quick Example of Parallel Computation in R for SVM/Random Forest, with MNIST and Credit Data
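The post does this with doSNOW and foreach in R; the same pattern (fan candidate cost values out to worker processes) can be sketched in Python with joblib. The cost grid below is illustrative:

```python
from joblib import Parallel, delayed
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

def score_cost(cost):
    """Cross-validate an SVM at one cost (C) value."""
    return cost, cross_val_score(SVC(C=cost), X, y, cv=3).mean()

# n_jobs=-1 uses all cores; each cost value is evaluated in parallel,
# analogous to registering a doSNOW cluster and looping with foreach.
results = Parallel(n_jobs=-1)(delayed(score_cost)(c) for c in [0.1, 1.0, 10.0])
print(results)
```

As with the R version, the speedup is not guaranteed: process startup and data transfer add overhead, so parallelism pays off mainly when each individual fit is expensive.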
This is quite like the article using C5.0 to conduct classification: https://charleshsliao.wordpress.com/2017/03/04/a-quick-classification-example-with-c5-0-in-r/ We tried to use more mature and powerful algorithms with cross validation and parameter tuning. 1. First, we preprocess the data. 2. We can start with a basic logistic regression model. The ROC chart below shows the Area Under Curve (AUC) value as a metric… Continue reading Credit Analysis with ROC evaluation in Neural Network and Random Forest
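The baseline step (preprocess, fit logistic regression, evaluate with AUC) can be sketched in Python; the synthetic dataset below is a made-up stand-in for the credit data, not the post's data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic binary-classification data standing in for the credit set.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Step 1: preprocess (scale). Step 2: basic logistic regression.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# AUC summarizes the whole ROC curve in one number:
# 0.5 is random guessing, 1.0 is a perfect ranking.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(round(auc, 3))
```

The same `roc_auc_score` call works unchanged for the neural network and random forest models the post compares against, which is what makes AUC a convenient common metric.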