XGBoost is an optimized distributed gradient boosting library
designed to be highly efficient, flexible and portable. It implements
machine learning algorithms under the Gradient Boosting framework.
XGBoost provides parallel tree boosting (also known as GBDT or GBM)
that solves many data science problems in a fast and accurate way.
The same code runs on major distributed environments (Hadoop, SGE,
MPI) and can scale to problems with billions of examples.
By default the package is built without parallelization. To enable
OpenMP, set the environment variable OMP=yes.
To enable MPI, set MPI=yes; this requires an MPI implementation.
Use mpich (openmpi does not work with this build).
To enable GPU support, set CUDA=yes; this requires the CUDA toolkit.
To build the Python module, set PYTHON=yes (note: Python 3 only).
To run the test suite, set TESTS=yes; this requires gtest.
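For example, a minimal invocation that enables OpenMP and builds the
Python 3 module (run as root from the directory containing the
SlackBuild; MPI, CUDA, and the tests keep their defaults):

    # Enable OpenMP parallelization and build the Python 3 module;
    # MPI, CUDA, and the test suite remain disabled (the defaults).
    OMP=yes PYTHON=yes ./xgboost.SlackBuild

The finished package lands where the SlackBuild writes its output
(/tmp by convention) and can be installed with installpkg as usual.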
Maintained by: William PC
Keywords: distributed gradient boosting, machine learning, gradient boosting
ChangeLog: xgboost
Homepage:
https://xgboost.ai
Download SlackBuild:
xgboost.tar.gz
xgboost.tar.gz.asc
(the SlackBuild does not include the source)
Individual Files:
README
slack-desc
xgboost.SlackBuild
xgboost.info