*Neural Networks and Deep Learning*: Single-hidden-layer neural networks are implemented in package nnet (shipped with base R). Package RSNNS offers an interface to the Stuttgart Neural Network Simulator (SNNS). An interface to the FCNN library allows user-extensible artificial neural networks in package FCNN4R. rnn implements recurrent neural networks. Packages implementing deep learning flavours of neural networks include darch (restricted Boltzmann machine, deep belief network), deepnet (feed-forward neural network, restricted Boltzmann machine, deep belief network, stacked autoencoders), RcppDL (denoising autoencoder, stacked denoising autoencoder, restricted Boltzmann machine, deep belief network) and h2o (feed-forward neural network, deep autoencoders). An interface to tensorflow is available in tensorflow.

*Recursive Partitioning*: Tree-structured models for regression, classification and survival analysis, following the ideas in the CART book, are implemented in rpart (shipped with base R) and tree. Package rpart is recommended for computing CART-like trees. A rich toolbox of partitioning algorithms is available in Weka; package RWeka provides an interface to this implementation, including the J4.8-variant of C4.5 and M5. The Cubist package fits rule-based models (similar to trees) with linear regression models in the terminal leaves, instance-based corrections and boosting. The C50 package can fit C5.0 classification trees, rule-based models, and boosted versions of these.
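As an illustrative sketch (not part of the task view; the built-in iris data and default settings are assumptions for the example), fitting a CART-like classification tree with rpart might look like:

```r
# Fit a CART-like classification tree with rpart (shipped with base R)
library(rpart)

fit <- rpart(Species ~ ., data = iris, method = "class",
             control = rpart.control(cp = 0.01, minsplit = 20))

print(fit)                                   # text display of the splits
pred <- predict(fit, newdata = iris, type = "class")
acc  <- mean(pred == iris$Species)           # resubstitution accuracy
```

`printcp(fit)` shows the complexity-parameter table used for cost-complexity pruning with `prune()`.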

Two recursive partitioning algorithms with unbiased variable selection and a statistical stopping criterion are implemented in package party. Function `ctree()` is based on non-parametric conditional inference procedures for testing independence between the response and each input variable, whereas `mob()` can be used to partition parametric models. Extensible tools for visualizing binary trees and node distributions of the response are available in package party as well.
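A minimal sketch of a conditional inference tree with party's `ctree()` (the iris data is an assumption for illustration):

```r
# Conditional inference tree via party::ctree()
library(party)

ct <- ctree(Species ~ ., data = iris)
print(ct)                        # splits chosen by conditional inference tests
plot(ct)                         # tree with node distributions of the response
tab <- table(predict(ct), iris$Species)
```

Because split selection is based on formal independence tests, `ctree()` needs no pruning step.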

Tree-structured varying coefficient models are implemented in package vcrpart.

For problems with binary input variables the package LogicReg implements logic regression. Graphical tools for the visualization of trees are available in package maptree.

Trees for modelling longitudinal data by means of random effects are offered by package REEMtree. Partitioning of mixture models is performed by RPMM.

Computational infrastructure for representing trees and unified methods for prediction and visualization is implemented in partykit. This infrastructure is used by package evtree to implement evolutionary learning of globally optimal trees. Survival trees are available in various packages; LTRCtrees allows for left-truncation and interval-censoring in addition to right-censoring.

*Random Forests*: The reference implementation of the random forest algorithm for regression and classification is available in package randomForest. Package ipred has bagging for regression, classification and survival analysis as well as bundling, a combination of multiple models via ensemble learning. In addition, a random forest variant for response variables measured at arbitrary scales based on conditional inference trees is implemented in package party. randomForestSRC implements a unified treatment of Breiman's random forests for survival, regression and classification problems. Quantile regression forests, available in quantregForest, allow regressing quantiles of a numeric response on explanatory variables via a random forest approach. For binary data, LogicForest is a forest of logic regression trees (package LogicReg). The varSelRF and Boruta packages focus on variable selection by means of random forest algorithms. In addition, packages ranger and Rborist offer R interfaces to fast C++ implementations of random forests. Reinforcement Learning Trees, featuring splits in variables which will be important down the tree, are implemented in package RLT. wsrf implements an alternative variable weighting method for variable subspace selection in place of the traditional random variable sampling.

*Regularized and Shrinkage Methods*: Regression models with some constraint on the parameter estimates can be fitted with the lasso2 and lars packages.
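Returning to random forests, a brief sketch with the reference randomForest implementation (the iris data and the settings shown are assumptions for illustration):

```r
# Random forest with the reference implementation in randomForest
library(randomForest)

set.seed(42)
rf <- randomForest(Species ~ ., data = iris, ntree = 500, importance = TRUE)
print(rf)               # OOB error estimate and confusion matrix
imp <- importance(rf)   # permutation and Gini importance per variable
```

The out-of-bag (OOB) error printed by `print(rf)` gives an honest error estimate without a separate validation set.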
Lasso with simultaneous updates for groups of parameters (groupwise lasso) is available in package grplasso; the grpreg package implements a number of other group penalization models, such as group MCP and group SCAD. The L1 regularization path for generalized linear models and Cox models can be obtained from functions available in package glmpath; the entire lasso or elastic-net regularization path (also in elasticnet) for linear regression, logistic and multinomial regression models can be obtained from package glmnet. The penalized package provides an alternative implementation of lasso (L1) and ridge (L2) penalized regression models (both GLM and Cox models). Package biglasso fits Gaussian and logistic linear models under L1 penalty when the data cannot be stored in RAM. Package RXshrink can be used to identify and display TRACEs for a specified shrinkage path and to determine the appropriate extent of shrinkage. Semiparametric additive hazards models under lasso penalties are offered by package ahaz. A generalisation of the Lasso shrinkage technique for linear regression, called the relaxed lasso, is available in package relaxo. Fisher's LDA projection with an optional LASSO penalty to produce sparse solutions is implemented in package penalizedLDA. The shrunken centroids classifier and utilities for gene expression analyses are implemented in package pamr. An implementation of multivariate adaptive regression splines is available in package earth. Variable selection through clone selection in SVMs in penalized models (SCAD or L1 penalties) is implemented in package penalizedSVM. Various forms of penalized discriminant analysis are implemented in packages hda, rda, and sda. Package LiblineaR offers an interface to the LIBLINEAR library. The ncvreg package fits linear and logistic regression models under the SCAD and MCP regression penalties using a coordinate descent algorithm.
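A minimal sketch of computing a lasso path and cross-validating the penalty with glmnet (the mtcars data and the predictor subset are assumptions for illustration; glmnet expects a numeric matrix of predictors):

```r
# Lasso / elastic-net regularization path with glmnet
library(glmnet)

x <- as.matrix(mtcars[, c("cyl", "disp", "hp", "wt", "qsec")])
y <- mtcars$mpg

fit <- glmnet(x, y, alpha = 1)          # alpha = 1: lasso; alpha = 0: ridge
cv  <- cv.glmnet(x, y, alpha = 1)       # cross-validate over the lambda path
beta <- coef(cv, s = "lambda.min")      # sparse coefficients at the best lambda
```

`plot(fit)` displays the full coefficient path as the penalty decreases.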
High-throughput ridge regression (i.e., penalization with many predictor variables) and heteroskedastic effects models are the focus of the bigRR package. An implementation of bundle methods for regularized risk minimization is available from package bmrm. The Lasso under non-Gaussian and heteroscedastic errors is estimated by hdm; inference on low-dimensional components of Lasso regression and of estimated treatment effects in a high-dimensional setting are also contained. Package SIS implements sure independence screening in generalised linear and Cox models.

*Boosting and Gradient Descent*: Various forms of gradient boosting are implemented in package gbm (tree-based functional gradient descent boosting). Package xgboost implements tree-based boosting using efficient trees as base learners for several built-in as well as user-defined objective functions. The hinge loss is optimized by the boosting implementation in package bst. Package GAMBoost can be used to fit generalized additive models by a boosting algorithm. An extensible boosting framework for generalized linear, additive and nonparametric models is available in package mboost. Likelihood-based boosting for Cox models is implemented in CoxBoost and for mixed models in GMMBoost. GAMLSS models can be fitted using boosting by gamboostLSS. An implementation of various learning algorithms based on gradient descent for dealing with regression tasks is available in package gradDescent.

*Support Vector Machines and Kernel Methods*: The function `svm()` from e1071 offers an interface to the LIBSVM library, and package kernlab implements a flexible framework for kernel learning (including SVMs, RVMs and other kernel learning algorithms). An interface to the SVMlight implementation (only for one-against-all classification) is provided in package klaR. The relevant dimension in kernel feature spaces can be estimated using rdetools, which also offers procedures for model selection and prediction. Package gmum.r offers an R interface to LIBSVM and SVMLight.

*Bayesian Methods*: Bayesian Additive Regression Trees (BART), where the final model is defined in terms of the sum over many weak learners (not unlike ensemble methods), are implemented in package BayesTree. Bayesian nonstationary, semiparametric nonlinear regression and design by treed Gaussian processes, including Bayesian CART and treed linear models, are made available by package tgp. MXM implements variable selection based on Bayesian networks.

*Optimization using Genetic Algorithms*: Packages rgp and rgenoud offer optimization routines based on genetic algorithms. The package Rmalschains implements memetic algorithms with local search chains, a special type of evolutionary algorithm combining a steady-state genetic algorithm with local search for real-valued parameter optimization.

*Association Rules*: Package arules provides both data structures for efficient handling of sparse binary data as well as interfaces to implementations of Apriori and Eclat for mining frequent itemsets, maximal frequent itemsets, closed frequent itemsets and association rules. Package opusminer provides an interface to the OPUS Miner algorithm (implemented in C++) for efficiently finding the key associations in transaction data, in the form of self-sufficient itemsets, using either leverage or lift.

*Fuzzy Rule-based Systems*: Package frbs implements a host of standard methods for learning fuzzy rule-based systems from data for regression and classification.
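The `svm()` interface from e1071 mentioned above can be sketched as follows (the iris data and the kernel/cost settings are assumptions for illustration):

```r
# SVM classification via e1071::svm(), an interface to LIBSVM
library(e1071)

model <- svm(Species ~ ., data = iris, kernel = "radial", cost = 1)
pred  <- predict(model, newdata = iris)
acc   <- mean(pred == iris$Species)
```

The `cost` and kernel parameters usually need tuning; `e1071::tune()` (see the model selection section) automates a grid search over such ranges.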
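Mining association rules with arules can be sketched as below (the Groceries transaction data shipped with arules and the support/confidence thresholds are assumptions for the example):

```r
# Frequent itemsets and association rules with arules::apriori()
library(arules)

data("Groceries")  # example transaction data shipped with arules
rules <- apriori(Groceries,
                 parameter = list(supp = 0.01, conf = 0.5))
inspect(head(sort(rules, by = "lift"), 3))  # strongest rules by lift
```

Lowering `supp` finds rarer itemsets at the cost of many more candidate rules.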
Package RoughSets provides comprehensive implementations of rough set theory (RST) and fuzzy rough set theory (FRST) in a single package.

*Model selection and validation*: Package e1071 has function `tune()` for hyperparameter tuning, and function `errorest()` (ipred) can be used for error rate estimation. The cost parameter C for support vector machines can be chosen utilizing the functionality of package svmpath. Functions for ROC analysis and other visualisation techniques for comparing candidate classifiers are available from package ROCR. Packages hdi and stabs implement stability selection for a range of models; hdi also offers other inference procedures in high-dimensional models.

*Other procedures*: Evidential classifiers quantify the uncertainty about the class of a test pattern using a Dempster-Shafer mass function in package evclass. The OneR (One Rule) package offers a classification algorithm with enhancements for sophisticated handling of missing values and numeric data together with extensive diagnostic functions. spa combines feature-based and graph-based data for prediction of some response.

*Meta packages*: Package caret provides miscellaneous functions for building predictive models, including parameter tuning and variable importance measures. The package can be used with various parallel implementations (e.g. MPI, NWS etc). In a similar spirit, package mlr offers a high-level interface to various statistical and machine learning packages. Package SuperLearner implements a similar toolbox. The h2o package implements a general purpose machine learning platform that has scalable implementations of many popular algorithms such as random forest, GBM, GLM (with elastic net regularization), and deep learning (feedforward multilayer networks), among others.

*Elements of Statistical Learning*: Data sets, functions and examples from the book The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Trevor Hastie, Robert Tibshirani and Jerome Friedman have been packaged and are available as ElemStatLearn.

*GUI*: rattle is a graphical user interface for data mining in R.

*Visualisation (initially contributed by Brandon Greenwell)*: The `stats::termplot()` function can be used to plot the terms in a model whose predict method supports `type="terms"`. The effects package provides graphical and tabular effect displays for models with a linear predictor (e.g., linear and generalized linear models). Friedman's partial dependence plots (PDPs), low-dimensional graphical renderings of the prediction function, are implemented in a few packages. gbm, randomForest and randomForestSRC provide their own functions for displaying PDPs, but these are limited to models fit with those packages (the function `partialPlot` from randomForest is more limited since it only allows for one predictor at a time). Packages pdp, plotmo, and ICEbox are more general and allow for the creation of PDPs for a wide variety of machine learning models (e.g., random forests, support vector machines, etc.); both pdp and plotmo support multivariate displays (plotmo is limited to two predictors while pdp uses trellis graphics to display PDPs involving three predictors). By default, plotmo fixes the background variables at their medians (or first level for factors), which is faster than constructing PDPs but incorporates less information. ICEbox focuses on constructing individual conditional expectation (ICE) curves, a refinement of Friedman's PDPs. ICE curves, as well as centered ICE curves, can also be constructed with the `partial()` function from the pdp package. ggRandomForests provides ggplot2-based tools for the graphical exploration of random forest models (e.g., variable importance plots and PDPs) from the randomForest and randomForestSRC packages.
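A partial dependence plot with pdp's `partial()` can be sketched as follows (the mtcars data and the random forest fit are assumptions for illustration):

```r
# Partial dependence via pdp::partial() applied to a random forest
library(randomForest)
library(pdp)

set.seed(1)
rf <- randomForest(mpg ~ ., data = mtcars)

pd <- partial(rf, pred.var = "wt", train = mtcars)  # PDP for one predictor
plotPartial(pd)                                     # lattice-based display
head(pd)                                            # grid of wt vs. yhat
```

Averaging predictions over the training data at each grid value of `wt` is what distinguishes a PDP from plotmo's faster fix-at-the-median display.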