An implementation of extensions to Freund and Schapire’s AdaBoost algorithm and Friedman’s gradient boosting machine. Includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart).

Home: http://code.google.com/p/gradientboostedmodels/
Versions: 2.1.1, 2.1.3
License: GPL (>= 2)
Recipe: https://github.com/bioconda/bioconda-recipes/tree/master/recipes/r-gbm


With the Bioconda channel activated (see 2. Set up channels), install with:

conda install r-gbm

and update with:

conda update r-gbm
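
Installing into a dedicated conda environment, rather than the base environment, is a common pattern (a suggestion on our part, not a requirement of the recipe; the environment name below is arbitrary):

    conda create -n gbm-env r-gbm
    conda activate gbm-env

This keeps the package and its R dependencies isolated, so it can be removed cleanly with `conda env remove -n gbm-env`.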


A Docker container is available at https://quay.io/repository/biocontainers/r-gbm.
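
A hypothetical usage sketch for the container (the `<tag>` below is a placeholder; substitute an actual tag from the quay.io repository listing, e.g. one matching the versions above):

    # Pull the BioContainers image for r-gbm.
    docker pull quay.io/biocontainers/r-gbm:<tag>

    # Start R inside the container and load the package.
    docker run -it --rm quay.io/biocontainers/r-gbm:<tag> R -e 'library(gbm)'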