Abstract

Classification is increasingly important in contemporary machine learning research and applications, such as disease detection and user analysis. However, the efficacy of traditional classification algorithms is often degraded by challenges such as class imbalance and the processing of large-scale dynamic data. To address these issues, inspired by the cost-sensitive learning strategy (Pinball loss) and the instance-level loss function (Focal loss), this study constructs instance-level cost-sensitive loss functions by extending the sparse and robust Hinge and Ramp losses. The new loss functions better discern differences between classes and between individual samples. Integrated into an online SVM classifier and optimized with the online gradient descent (OGD) algorithm, they can effectively handle classification on class-imbalanced data streams. Numerical experiments on UCI benchmark datasets validate their effectiveness.
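To illustrate the general idea described above, the following is a minimal sketch of an instance-level, cost-sensitive hinge-type loss trained with OGD on a linear model. The specific weighting form (a focal-style factor on the margin combined with per-class costs), the function names, and all parameter values are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def focal_hinge_subgrad(w, x, y, class_cost, gamma=2.0):
    """Subgradient of a focal-weighted, cost-sensitive hinge loss (a sketch).

    The (1 - sigmoid(margin))**gamma factor down-weights easy, confidently
    classified instances (the Focal-loss idea); class_cost[y] up-weights the
    minority class (the cost-sensitive idea). Treating the focal factor as a
    constant per-instance multiplier is a simplifying assumption.
    """
    m = y * np.dot(w, x)                  # signed margin
    if m >= 1.0:                          # correct with sufficient margin: zero loss
        return np.zeros_like(w)
    weight = class_cost[y] * (1.0 - 1.0 / (1.0 + np.exp(-m))) ** gamma
    return -weight * y * x                # re-weighted hinge subgradient

def ogd_train(stream, dim, eta=0.1, gamma=2.0, class_cost=None):
    """Single pass of online gradient descent over a stream of (x, y) pairs."""
    if class_cost is None:
        class_cost = {+1: 1.0, -1: 1.0}   # equal costs by default
    w = np.zeros(dim)
    for x, y in stream:
        w -= eta * focal_hinge_subgrad(w, x, y, class_cost, gamma)
    return w
```

Each instance is seen once and immediately discarded, which is what makes the scheme suitable for large-scale data streams; imbalance is handled by assigning the minority class a larger entry in `class_cost`.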