HSE University and Sber Researchers Increase Speed of Gradient Boosting Algorithm

Now machine learning will work ten times faster

A group of researchers from the HSE Faculty of Computer Science and the Sber AI Lab has increased the speed of gradient boosting, one of the most efficient machine learning algorithms. The proposed approach will make it possible to solve classification and regression problems faster. The results of the work were presented at the NeurIPS conference.

Most tasks in data analysis come down to making predictions from available data. This can be a classification task (determining whether an object belongs to a certain class) or a regression task (predicting a numeric value). In practice, the number of classes or the dimensionality of the regression target can be very large.

In such situations, researchers apply gradient boosting, an advanced machine learning algorithm that solves classification and regression tasks. It builds a prediction model as an ensemble of weak learners, typically shallow decision trees: combined step by step, many weak learners add up to a single strong one.
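
For illustration, here is a minimal Python sketch of this idea for a regression task. It is a generic textbook version of gradient boosting, not the researchers' implementation: each new tree is fitted to the residual errors of the ensemble built so far.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=500)

learning_rate = 0.1
trees = []
prediction = np.full_like(y, y.mean())  # start from a constant model

for _ in range(100):
    residual = y - prediction            # gradient of the squared loss
    tree = DecisionTreeRegressor(max_depth=3).fit(X, residual)
    trees.append(tree)
    prediction += learning_rate * tree.predict(X)

print("final training MSE:", np.mean((y - prediction) ** 2))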

The operation of the gradient boosting algorithm is similar to golf: to get the ball into the hole, the golfer hits it repeatedly, adjusting each stroke based on the previous one. Before each new stroke, the golfer looks at the distance between the ball and the hole and tries to shorten it. Boosting is built in much the same way: each new model seeks to reduce the error of the already built ensemble of models.

Leonid Iosipoi
Co-author of the paper, expert at the Continuing Education Centre of the HSE Faculty of Computer Science

The problem with gradient boosting is that in classification with a very large number of classes, training the model can become prohibitively slow. When solving a classification task, the algorithm estimates the probability of each object belonging to each possible class. The more classes the objects are divided into, the more outputs the algorithm must produce at every step, and the computational cost grows accordingly.
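
This effect is easy to observe in a standard implementation. In scikit-learn, for example, the classic gradient boosting classifier fits one regression tree per class at every boosting round, so the total number of trees, and with it the training cost, grows linearly with the number of classes. The dataset below is synthetic and its sizes are arbitrary, chosen only for demonstration:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=15, n_classes=10,
                           random_state=0)

model = GradientBoostingClassifier(n_estimators=50).fit(X, y)

# estimators_ has shape (n_boosting_rounds, n_classes):
# 50 rounds x 10 classes = 500 individual trees.
print(model.estimators_.shape)  # (50, 10)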

Our researchers have developed a unique framework that expands the applicability of gradient boosting. The new algorithm shows better results in a number of tasks where previously only neural network approaches were used. The proposed approach is based on data compression before the most time-consuming stage: the search for the optimal tree structure. This solution opens up new opportunities for studying machine learning models and improving AI technologies.
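
As a hedged illustration of the compression idea (the exact reduction used by the researchers may differ), one can project the per-class gradient matrix down to a much smaller number of columns before the split search, so that the most expensive stage operates on far less data:

import numpy as np

rng = np.random.default_rng(0)
n_samples, n_classes, k = 10_000, 1_000, 20  # k << n_classes

# Full gradient matrix: one gradient value per object per class.
gradients = rng.normal(size=(n_samples, n_classes))

# Random projection to k dimensions: the tree-structure search now
# works with an (n_samples, k) matrix instead of (n_samples, n_classes).
projection = rng.normal(size=(n_classes, k)) / np.sqrt(k)
sketched = gradients @ projection

print(gradients.shape, "->", sketched.shape)  # (10000, 1000) -> (10000, 20)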

Gleb Gusev
Managing director, Sber AI Lab

IQ

July 12, 2023