Feature Selection in Python

We talked about feature selection based on Lasso (https://charleshsliao.wordpress.com/2017/04/11/regularization-in-neural-network-with-mnist-and-deepnet-of-r/) and on autoencoders. More features make the model more complex, so it can be a good idea to reduce the number of features to only the most useful ones and discard the rest. There are three basic strategies: univariate statistics, model-based selection, and iterative selection. We use the… Continue reading Feature Selection in Python
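The excerpt does not include the post's own code. As a rough sketch of the three strategies in Python with scikit-learn (the dataset, k, and penalty values below are illustrative assumptions, not taken from the post):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest
from sklearn.linear_model import LogisticRegression

# Placeholder data set; any labeled feature matrix X, y works the same way.
X, y = load_breast_cancer(return_X_y=True)

# 1. Univariate statistics: score each feature on its own (ANOVA F-test by
#    default) and keep the k highest-scoring ones.
X_uni = SelectKBest(k=10).fit_transform(X, y)

# 2. Model-based selection: fit a model that assigns importances (here the
#    coefficients of an L1-penalized logistic regression) and keep the
#    features above the default threshold.
l1_model = LogisticRegression(penalty="l1", solver="liblinear")
X_model = SelectFromModel(l1_model).fit_transform(X, y)

# 3. Iterative selection: recursive feature elimination repeatedly refits the
#    model and drops the weakest feature until the requested number remains.
X_iter = RFE(LogisticRegression(max_iter=5000),
             n_features_to_select=10).fit_transform(X, y)

print(X.shape, X_uni.shape, X_model.shape, X_iter.shape)
```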


Auto Encoder to Detect Anomalous Cases in Smartphone Actimetry Data

We use a deep autoencoder model to analyze actimetry data from smartphones. You can find the data here: http://archive.ics.uci.edu/ml/datasets/Human+Activity+Recognition+Using+Smartphones. Why should we do this? An autoencoder can be useful for excluding unknown or unusual activities, rather than incorrectly classifying them, by examining whether any of the activities tend to have more or less anomalous values. We… Continue reading Auto Encoder to Detect Anomalous Cases in Smartphone Actimetry Data
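The post's model is not shown in this excerpt. As a minimal sketch of the idea in Python with Keras, assuming the 561-feature vectors of the UCI HAR data scaled to [0, 1] (the layer sizes, epochs, and 99th-percentile cutoff are illustrative choices, not the post's):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder for the (n_samples, 561) HAR feature matrix scaled to [0, 1];
# loading the actual files from the UCI archive is omitted here.
X = np.random.rand(1000, 561).astype("float32")

# Small symmetric autoencoder: 561 -> 64 -> 8 -> 64 -> 561.
autoencoder = keras.Sequential([
    layers.Input(shape=(561,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(8, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(561, activation="sigmoid"),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(X, X, epochs=20, batch_size=64, verbose=0)

# Per-sample reconstruction error; cases far above the typical error are
# the candidates for unknown or unusual activities.
recon = autoencoder.predict(X, verbose=0)
errors = np.mean((X - recon) ** 2, axis=1)
threshold = np.percentile(errors, 99)
anomalous_idx = np.where(errors > threshold)[0]
print(len(anomalous_idx), "flagged cases")
```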

Auto encoder with R, MNIST in Deep Learning

Autoencoders are trained to reproduce or predict their inputs: the hidden layers and neurons do not map an input to some other outcome, but encode the input itself (hence "auto"). We can use autoencoders for dimensionality reduction, to reduce overfitting, and so on. We will talk about that in the next article. The h2o package for R provides… Continue reading Auto encoder with R, MNIST in Deep Learning
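The post itself uses R's h2o package; purely as an illustration of the same idea in Python with Keras (the architecture and training settings below are arbitrary assumptions), an autoencoder trained to reproduce MNIST digits and then reused for dimensionality reduction might look like this:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Flatten MNIST digits to 784-dimensional vectors in [0, 1].
(x_train, _), (x_test, _) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# The network is trained to reproduce its own input: 784 -> 32 -> 784.
inputs = keras.Input(shape=(784,))
code = layers.Dense(32, activation="relu")(inputs)
outputs = layers.Dense(784, activation="sigmoid")(code)
autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
autoencoder.fit(x_train, x_train, epochs=5, batch_size=256,
                validation_data=(x_test, x_test), verbose=0)

# The encoder half alone maps each digit to a 32-dimensional code,
# i.e. a learned dimensionality reduction.
encoder = keras.Model(inputs, code)
codes = encoder.predict(x_test, verbose=0)
print(codes.shape)  # (10000, 32)
```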

K-means, Hierarchical, and Feature Selection Methods

We will use the well-known iris data set for some quick clustering. (site: http://archive.ics.uci.edu/ml/datasets/Iris) (data: http://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data) (description: http://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.names). We do hierarchical clustering with basic functions and adjust the distance method accordingly, since the K-means function does not let the user choose the distance metric. Now comes the interesting part. We know that only some… Continue reading K-means, Hierarchical, and Feature Selection Methods
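The excerpt cuts off before the post's code. As a rough Python counterpart of the point being made (three clusters, Manhattan distance, and average linkage are illustrative choices, not the post's settings):

```python
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X = load_iris().data  # the same four iris measurements as in the UCI files

# K-means is tied to (squared) Euclidean distance; only k can be chosen.
km_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Hierarchical clustering lets us swap both the distance metric (here
# Manhattan / city-block) and the linkage method.
dists = pdist(X, metric="cityblock")
tree = linkage(dists, method="average")
hc_labels = fcluster(tree, t=3, criterion="maxclust")

print(km_labels[:10])
print(hc_labels[:10])
```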