
In this example I show how Extreme Gradient Boosting can be applied to a survival-analysis dataset. At the end, the importance of the variables is shown.


DiegoVallarino/Extreme-Gradient-Boosting


Extreme-Gradient-Boosting

XGBoost (Extreme Gradient Boosting) is a supervised predictive algorithm based on the boosting principle. The idea behind boosting is to generate multiple "weak" prediction models sequentially, each one taking the results of the previous model in order to produce a "stronger" model with better predictive power and greater stability in its results. To obtain this stronger model, an optimization algorithm is used, in this case gradient descent: during training, the parameters of each weak model are fitted iteratively, seeking the minimum of an objective function, which can be the classification error rate, the area under the curve (AUC), the root mean square error (RMSE), or some other metric.
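The boosting principle described above can be sketched in plain Python. This is not XGBoost itself, only an illustrative toy: each "weak" model is a decision stump fitted to the negative gradient of the squared-error objective (i.e. the current residuals), and the ensemble is built sequentially with a learning rate. The function names (`fit_stump`, `boost`, `predict`) are hypothetical, chosen just for this sketch.

```python
def fit_stump(x, residuals):
    """Weak model: find the threshold split on x that best reduces squared error."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue  # skip degenerate splits
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def predict(base, stumps, lr, xi):
    """Ensemble prediction: initial constant plus scaled stump corrections."""
    return base + lr * sum(s(xi) for s in stumps)

def boost(x, y, n_rounds=20, lr=0.3):
    """Sequentially add stumps, each one fitted to the current residuals,
    which are the negative gradient of the squared-error objective."""
    base = sum(y) / len(y)  # initial "model": the mean of y
    stumps = []
    for _ in range(n_rounds):
        pred = [predict(base, stumps, lr, xi) for xi in x]
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stumps.append(fit_stump(x, residuals))
    return base, stumps
```

Each round shrinks the remaining error by a constant factor, which is why the sequential ensemble is "stronger" than any single stump. Real XGBoost adds regularization, second-order gradient information, and full trees instead of stumps.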
