There are multiple ways to build machine learning models in Python, and this article serves as a simple summary of the essential steps, from data loading to final visualization. You can find the data here: http://archive.ics.uci.edu/ml/datasets/Ionosphere More details can be found in Robert Layton's book here: https://www.goodreads.com/book/show/26019855-learning-data-mining-with-python?from_search=true

# Category: Tune

## Pipeline Steps in Python

We use the data from the sklearn library (the faces dataset needs to be downloaded separately), and the IDE is Sublime Text 3. Most of the code comes from the book: https://www.goodreads.com/book/show/32439431-introduction-to-machine-learning-with-python?from_search=true
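A minimal sketch of a sklearn `Pipeline` illustrating the idea: preprocessing and model are chained so the scaler is fit only on the training data. The built-in breast cancer dataset is used here as a stand-in, since the faces data mentioned above requires a separate download.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Chain the scaler and the classifier; fit() runs them in order
pipe = Pipeline([("scaler", StandardScaler()), ("svm", SVC())])
pipe.fit(X_train, y_train)
print("test accuracy: {:.3f}".format(pipe.score(X_test, y_test)))
```

Because the two steps live in one estimator, the pipeline can be passed directly to cross-validation or grid search without leaking test data into the scaler.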

## ROC and Confusion Matrix for Classifier in Python

We use the data from the sklearn library (the faces dataset needs to be downloaded separately), and the IDE is Sublime Text 3. Most of the code comes from the book: https://www.goodreads.com/book/show/32439431-introduction-to-machine-learning-with-python?from_search=true
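A short sketch of the two evaluation tools named in the title, again on the built-in breast cancer data as a stand-in: the confusion matrix works on hard class predictions, while the ROC curve needs continuous scores from `decision_function` (or `predict_proba`).

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=5000).fit(X_train, y_train)

# Confusion matrix: rows are true classes, columns are predicted classes
cm = confusion_matrix(y_test, clf.predict(X_test))
print(cm)

# ROC curve: sweep the decision threshold over continuous scores
scores = clf.decision_function(X_test)
fpr, tpr, thresholds = roc_curve(y_test, scores)
print("AUC: {:.3f}".format(roc_auc_score(y_test, scores)))
```

Plotting `fpr` against `tpr` (e.g. with matplotlib) gives the usual ROC curve; the AUC summarizes it as a single number.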

## Quick Cross Validation and Grid Search of Parameters in Python

Cross-validation is a way to reduce overfitting while training a model, and we have also applied the grid search method in both Python and R: https://charleshsliao.wordpress.com/2017/05/20/logistic-regression-in-python-to-tune-parameter-c/ https://charleshsliao.wordpress.com/2017/04/24/cnndnn-of-keras-in-r-backend-tensorflow-for-mnist/ We will focus on how to use both methods to identify the best parameters, model, and score without overfitting. We use the data from the sklearn library, and the IDE… Continue reading Quick Cross Validation and Grid Search of Parameters in Python
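The two methods can be sketched in a few lines with sklearn (using the built-in iris data as a stand-in): `cross_val_score` gives one score per fold, and `GridSearchCV` runs that same cross-validation for every parameter combination, keeping a held-out test set untouched for the final score.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, cross_val_score, train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Quick cross-validation: five scores, one per fold
print(cross_val_score(SVC(), X, y, cv=5))

# Grid search: inner 5-fold CV on the training set only
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
param_grid = {"C": [0.01, 0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1, 10]}
grid = GridSearchCV(SVC(), param_grid, cv=5)
grid.fit(X_train, y_train)

print("best params:", grid.best_params_)
print("test score: {:.3f}".format(grid.score(X_test, y_test)))
```

Splitting off the test set before the grid search is what keeps the reported score honest: the best parameters are chosen on the folds, never on the test data.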

## Features Selection in Python

We talked about feature selection based on Lasso (https://charleshsliao.wordpress.com/2017/04/11/regularization-in-neural-network-with-mnist-and-deepnet-of-r/) and autoencoders. More features make a model more complex, so it can be a good idea to reduce the number of features to only the most useful ones and discard the rest. There are three basic strategies: univariate statistics, model-based selection, and iterative selection. We use the… Continue reading Features Selection in Python
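The three strategies can be sketched side by side with sklearn. As an assumed setup, noise features are appended to the built-in breast cancer data so a good selector has something to discard:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE, SelectFromModel, SelectPercentile

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.RandomState(42)
# Append 20 pure-noise features to the 30 real ones
X_noisy = np.hstack([X, rng.normal(size=(len(X), 20))])

# 1) Univariate statistics: keep the 50% of features with the best F-scores
univ = SelectPercentile(percentile=50).fit(X_noisy, y)
# 2) Model-based: keep features with above-median random forest importance
model = SelectFromModel(
    RandomForestClassifier(n_estimators=50, random_state=0),
    threshold="median").fit(X_noisy, y)
# 3) Iterative: recursive feature elimination down to 25 features
rfe = RFE(RandomForestClassifier(n_estimators=50, random_state=0),
          n_features_to_select=25).fit(X_noisy, y)

for name, sel in [("univariate", univ), ("model-based", model), ("RFE", rfe)]:
    kept = sel.get_support()
    print(name, "kept", kept.sum(), "features,",
          kept[:30].sum(), "of the 30 real ones")
```

`get_support()` returns a boolean mask over the columns; `transform(X_noisy)` would then produce the reduced feature matrix for any of the three selectors.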

## Multi Layer Perceptrons in Python

You can see more about MLPs in R here: https://charleshsliao.wordpress.com/2017/04/10/tune-multi-layer-perceptron-mlp-in-r-with-mnist/ Generally speaking, a deep learning model means a neural network model with more than one hidden layer. Whether a deep learning model will be successful depends largely on the parameters tuned. We use the data from the sklearn library, and the IDE is sublime text3.… Continue reading Multi Layer Perceptrons in Python
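A minimal sketch of an MLP in sklearn, on the built-in breast cancer data as a stand-in. The hidden-layer sizes and the L2 penalty `alpha` are the main parameters to tune, and because MLPs are sensitive to feature scale, the inputs are standardized first:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize: fit the scaler on training data only
scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Two hidden layers of 10 units each; alpha is the L2 regularization strength
mlp = MLPClassifier(hidden_layer_sizes=(10, 10), alpha=1.0,
                    max_iter=1000, random_state=0)
mlp.fit(X_train_s, y_train)
print("test accuracy: {:.3f}".format(mlp.score(X_test_s, y_test)))
```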

## Basic SVM in Python

In Python we can build an SVM model for classification with the sklearn library, using either the basic LinearSVC or SVC, which has more parameters to tune. We use the data from the sklearn library, and the IDE is sublime text3. Most of the code comes from the book: https://www.goodreads.com/book/show/32439431-introduction-to-machine-learning-with-python?from_search=true
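A short sketch contrasting the two sklearn SVM classes on the built-in iris data as a stand-in: `LinearSVC` learns a linear boundary with essentially only `C` to tune, while `SVC` with an RBF kernel adds the kernel width `gamma` as a second knob.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, LinearSVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear SVM: fast, linear decision boundary, regularized by C
linear = LinearSVC(C=1.0, max_iter=10000).fit(X_train, y_train)

# Kernelized SVM: nonlinear boundary, with both C and gamma to tune
rbf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

print("LinearSVC: {:.3f}".format(linear.score(X_test, y_test)))
print("SVC (RBF): {:.3f}".format(rbf.score(X_test, y_test)))
```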

## Logistic Regression in Python to Tune Parameter C

The trade-off parameter of logistic regression that determines the strength of the regularization is called C, and higher values of C correspond to less regularization (we can also specify which regularization function to use). C is the inverse of the regularization strength (lambda). We use the data from the sklearn library, and the IDE is sublime text3. Most of the… Continue reading Logistic Regression in Python to Tune Parameter C
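The effect of C can be sketched by sweeping it over a few orders of magnitude (on the built-in breast cancer data as a stand-in) and comparing training and test accuracy:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C is the inverse of the regularization strength lambda:
# small C -> strong regularization, large C -> weak regularization
for C in [0.01, 1, 100]:
    lr = LogisticRegression(C=C, max_iter=5000).fit(X_train, y_train)
    print("C={:>6}: train {:.3f}, test {:.3f}".format(
        C, lr.score(X_train, y_train), lr.score(X_test, y_test)))
```

A widening gap between training and test accuracy as C grows is the usual sign of overfitting; the `penalty` parameter of `LogisticRegression` is where the regularization function itself can be chosen.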

## CNN/DNN of KeRas in R, Backend Tensorflow, for MNIST

Keras is a high-level neural network library that can use TensorFlow as its backend, and both are developed in Python. We can access both libraries in R after we install the corresponding packages. Of course, we need to install TensorFlow and Keras first from the terminal (I am using a Mac), and they function best with Python 2.7.… Continue reading CNN/DNN of KeRas in R, Backend Tensorflow, for MNIST

## A H2O FNN Model for MNIST

Please read this first: https://charleshsliao.wordpress.com/2017/04/14/identify-arguments-of-h2o-deep-learning-model-with-tuned-auto-encoder-in-r-with-mnist/ Following the autoencoder results and arguments from the last article, and the sample FNN model at the end of that article, we can build a full FNN model for MNIST.