Though it is more convenient to work with the TensorFlow framework in Python, we have also talked about how to apply TensorFlow in R here: https://charleshsliao.wordpress.com/tag/tensorflow/ We will talk about how to apply a recurrent neural network in TensorFlow in both Python and R. An RNN might not be the best algorithm for MNIST, but this can be… Continue reading RNN in TensorFlow in Python&R, with MNIST

# Tag: R

## Recommenders in R, Comparing Multiple Algorithms

We know several essential recommendation methods. If we want to recommend a book to ourselves, we can do it: 1. Based on our own experience 2. Based on our friends' experience 3. Based on the catalog of the library 4. Based on the search engine's results We already talked a little about the first method… Continue reading Recommenders in R, Comparing Multiple Algorithms
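The collaborative-filtering methods the linked post compares can be sketched with the recommenderlab package. This is a minimal illustration, not the post's actual code; the MovieLense ratings data ships with the package and stands in for whatever dataset the post uses.

```r
# Sketch: user-based vs. item-based collaborative filtering with recommenderlab
library(recommenderlab)
data(MovieLense)                                   # example ratings matrix from the package

rec_ubcf <- Recommender(MovieLense[1:900], method = "UBCF")  # user-based CF
rec_ibcf <- Recommender(MovieLense[1:900], method = "IBCF")  # item-based CF

pred <- predict(rec_ubcf, MovieLense[901:905], n = 5)        # top-5 items for held-out users
as(pred, "list")
```

Swapping the `method` string is all it takes to compare algorithms on the same data, which is what makes a side-by-side evaluation straightforward.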

## Train Deep Learning Model with R Studio in AWS EC2

AWS provides us with approachable GPU-based cloud computing capability at minimal cost. We will talk about the steps to take advantage of AWS EC2 to build GPU computing for our model training in R. 1. Register an AWS account... 2. Find the EC2 service 3. Click "launch instance" and go for the one labeled Free tier… Continue reading Train Deep Learning Model with R Studio in AWS EC2

## CNN/DNN of KeRas in R, Backend Tensorflow, for MNIST

Keras is a high-level neural network library that can use TensorFlow as its backend, and both are developed in Python. We can access both libraries from R after installing the corresponding packages. Of course, we first need to install TensorFlow and Keras from the terminal (I am using a Mac), and they function best with Python 2.7.… Continue reading CNN/DNN of KeRas in R, Backend Tensorflow, for MNIST
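Once the packages are installed, a DNN for MNIST in R looks much like its Python counterpart. A minimal sketch with the keras R interface (layer sizes are illustrative, not the post's exact architecture):

```r
# Sketch: a small dense network for MNIST via the keras R package
library(keras)

model <- keras_model_sequential() %>%
  layer_dense(units = 256, activation = "relu", input_shape = c(784)) %>%  # 28x28 flattened
  layer_dropout(rate = 0.4) %>%
  layer_dense(units = 10, activation = "softmax")                          # 10 digit classes

model %>% compile(
  loss      = "categorical_crossentropy",
  optimizer = optimizer_rmsprop(),
  metrics   = "accuracy"
)
```

The `%>%` pipe style is idiomatic for the R interface; under the hood the calls are dispatched to the Python Keras/TensorFlow backend.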

## A H2O FNN Model for MNIST

Please read this first: https://charleshsliao.wordpress.com/2017/04/14/identify-arguments-of-h2o-deep-learning-model-with-tuned-auto-encoder-in-r-with-mnist/ Following the auto-encoder results for the arguments in the last article, and the sample FNN model at the end of that article, we can build a full FNN model for MNIST.

## Auto encoder with R, MNIST in Deep Learning

Auto-encoders are trained to reproduce or predict their inputs--the hidden layers and neurons are not maps between an input and some other outcome, but are self (auto)-encoding. We can use auto-encoders for dimensionality reduction, to alleviate overfitting, and so on. We will talk about this in the next article. The h2o package of R provides… Continue reading Auto encoder with R, MNIST in Deep Learning
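The h2o approach the post builds on can be sketched in a few lines: set `autoencoder = TRUE` in `h2o.deeplearning` and extract the hidden-layer activations as the reduced representation. The file path and layer sizes below are placeholders, not the post's actual values.

```r
# Sketch: an h2o auto-encoder on MNIST, used for dimensionality reduction
library(h2o)
h2o.init()

mnist <- h2o.importFile("mnist_train.csv")          # placeholder path to the MNIST data

ae <- h2o.deeplearning(
  x              = 1:784,        # pixel columns
  training_frame = mnist,
  autoencoder    = TRUE,         # train to reconstruct the inputs
  hidden         = c(50),        # 784 -> 50 bottleneck
  epochs         = 10
)

# 50-dimensional encoded features from the bottleneck layer
features <- h2o.deepfeatures(ae, mnist, layer = 1)
```

The bottleneck forces the network to learn a compressed encoding, which is exactly why the same model can feed both dimensionality reduction and downstream classifiers.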

## Regularization in Neural Network, with MNIST and Deepnet of R

Several regularization methods are helpful for reducing overfitting in a neural network model. 1. The L1 penalty is also known as the Least Absolute Shrinkage and Selection Operator (lasso). The penalty term uses the sum of the absolute weights, so the degree of penalty is no smaller or larger for small or large weights. People are more familiar… Continue reading Regularization in Neural Network, with MNIST and Deepnet of R
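The lasso behavior described above--a penalty on the sum of absolute weights that drives some of them exactly to zero--is easiest to see in isolation with glmnet. This is an illustration of the L1 penalty itself, not the deepnet model from the post.

```r
# Sketch: the L1 (lasso) penalty shrinking coefficients exactly to zero
library(glmnet)

set.seed(42)
x <- matrix(rnorm(100 * 20), nrow = 100, ncol = 20)  # 20 candidate predictors
y <- rnorm(100)

fit <- glmnet(x, y, alpha = 1)   # alpha = 1 selects the pure L1 (lasso) penalty
coef(fit, s = 0.1)               # at this penalty strength, many coefficients are 0
```

The same idea carries over to neural networks: an L1 term on the weights pushes uninformative connections toward zero, which is why it both regularizes and performs a kind of feature selection.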

## Deep Neural Networking for MNIST Hand Writing Digits Recognition, to Minimize Feature of Metrics

Deep learning is a powerful tool for solving image recognition problems. Background: https://charleshsliao.wordpress.com/2017/02/24/svm-tuning-based-on-mnist/ Denote the percentage number for the training data as P1 and the error rate as P2. The final FOM = P1/2 + P2. With the parameters above we obtain a FOM of 9.8733%
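The figure of merit above is simple arithmetic, so it can be written as a one-line R function. The input values below are placeholders chosen only to show the calculation, not the post's actual P1 and P2.

```r
# FOM as defined above: half the training-data percentage plus the error rate
fom <- function(p1, p2) p1 / 2 + p2

fom(p1 = 10, p2 = 4.8733)   # placeholder percentages; 10/2 + 4.8733 = 9.8733
```

Because P1 enters at half weight, the formula rewards models that reach a low error rate while using less of the training data.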

## Bank Loan Estimation with SVM and Logistic Regression

We use the bank marketing dataset from the UCI Machine Learning Repository (https://archive.ics.uci.edu/ml/datasets/Bank+Marketing). There is no single best C or gamma value for an SVM, since the data and the problems we try to solve differ. From the observations above, a higher gamma value results in slightly better accuracy, but the cost would not… Continue reading Bank Loan Estimation with SVM and Logistic Regression
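Since the best C and gamma are data-dependent, the usual approach is a grid search. A minimal sketch with e1071's `tune.svm`, using the built-in iris data as a stand-in for the bank marketing dataset:

```r
# Sketch: grid search over gamma and cost for an SVM with e1071
library(e1071)
data(iris)   # stand-in dataset; the post uses the UCI bank marketing data

tuned <- tune.svm(Species ~ ., data = iris,
                  gamma = 10^(-3:0),   # candidate gamma values
                  cost  = 10^(0:2))    # candidate cost (C) values

tuned$best.parameters   # the (gamma, cost) pair with the lowest CV error
```

`tune.svm` cross-validates every combination on the grid, which is exactly why the "best" pair changes from one dataset to the next.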

## Quick Example of Parallel Computation in R for SVM/Random Forest, with MNIST and Credit Data

It is generally acknowledged that the SVM algorithm is relatively slow to train, even with tuned parameters such as cost and kernel. The general way to boost the speed is to apply the packages "parallel", "doParallel", and "doSNOW" together with the foreach function. Data and background: https://charleshsliao.wordpress.com/2017/02/24/svm-tuning-based-on-mnist/ It is not guaranteed that we can increase… Continue reading Quick Example of Parallel Computation in R for SVM/Random Forest, with MNIST and Credit Data
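The parallel pattern the post describes can be sketched with doParallel and foreach: register a cluster, then farm independent model fits out with `%dopar%`. The cost grid and iris data below are placeholders, not the post's setup.

```r
# Sketch: fitting several SVMs in parallel with foreach/doParallel
library(doParallel)
library(foreach)
library(e1071)

cl <- makeCluster(detectCores() - 1)   # leave one core free
registerDoParallel(cl)

costs <- c(0.1, 1, 10)                 # placeholder grid of cost values
models <- foreach(c_val = costs, .packages = "e1071") %dopar% {
  svm(Species ~ ., data = iris, cost = c_val)   # each fit runs on its own worker
}

stopCluster(cl)
```

Each SVM fit is independent, so the loop parallelizes cleanly; as the post notes, though, a speedup is not guaranteed once cluster-setup and data-copying overhead are counted.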