Catalyst: Combining Co-training and Active Learning for Lifelong Classification

Maxim Ryndin, Denis Turdakov, Sergey Kuznetsov


Modern supervised algorithms assume that the dataset used for training has the same distribution as the data to be processed. However, real-world data changes continuously. This leads to gradual degradation of supervised machine learning algorithms in production systems and increases maintenance costs.

To solve this problem, we focus on domain adaptation of machine learning algorithms in a lifelong manner. We assume that unlabelled real-world data arrives continuously. For this setting, we propose a method for detecting changes in data distributions, as well as for updating supervised algorithms.

The idea behind the method is to process a portion of the data and create a new labelled dataset for training a supervised model. The trained model becomes part of an ensemble used to select a strategy for dealing with new examples: assign the label automatically using co-training, or manually with the aid of active learning. This method is independent of the specific architecture of the model and can be used with any modern supervised algorithm, including artificial neural networks.
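The labelling strategy described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function name `route_example`, the `predict` interface, the confidence threshold, and the `ConstantModel` stub are all assumptions introduced here. The core idea is that confident ensemble agreement yields an automatic (co-training style) label, while disagreement or low confidence routes the example to a human annotator (active learning).

```python
def route_example(example, ensemble, min_confidence=0.8):
    """Decide how to label one unlabelled example.

    `ensemble` is a list of previously trained models, each assumed to
    expose predict(example) -> (label, confidence). Returns either
    ('auto', label) for co-training-style self-labelling, or
    ('manual', None) to request a human annotation (active learning).
    """
    votes = [model.predict(example) for model in ensemble]
    labels = {label for label, _ in votes}
    confident = all(conf >= min_confidence for _, conf in votes)
    if len(labels) == 1 and confident:
        # Co-training branch: the ensemble agrees with high confidence,
        # so the label is assigned automatically.
        return "auto", labels.pop()
    # Active-learning branch: models disagree or are unsure,
    # so the example is sent to a human annotator.
    return "manual", None


class ConstantModel:
    """Toy stand-in for a trained classifier (for illustration only)."""

    def __init__(self, label, confidence):
        self.label, self.confidence = label, confidence

    def predict(self, example):
        return self.label, self.confidence
```

For example, an ensemble of two models that both confidently predict the same class would auto-label the example, whereas a disagreeing ensemble would defer to a human; each newly trained model is then appended to the ensemble, which is how domain knowledge accumulates over time.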

Our research also confirms two findings. First, adding a small portion of data with reliable labels to a self-labelled dataset improves the model's performance, even if this amount is too small to train a model from scratch. Second, accumulating domain knowledge by continuously adding newly trained models to the ensemble used for labelling reduces the amount of labelled data required while maintaining the high performance of the adapted model.


Keywords: co-training, active learning, lifelong learning, feature drift, concept drift


2020 Ivannikov Ispras Open Conference (ISPRAS)

Research Group: Information Systems
