Top-k Supervise Feature Selection via ADMM for Integer Programming
conference contribution
posted on 2024-11-03, 14:47, authored by Mingyu Fan, Xiaojun Chang, Xiaoqin Zhang, Di Wang, Liang Du
Recently, structured sparsity-inducing feature selection has become a hot topic in machine learning and pattern recognition. Most sparsity-inducing feature selection methods rank all features by a certain criterion and then select the k top-ranked features, where k is an integer. However, the k individually top-ranked features are usually not the best subset of k features, so the result may be suboptimal. In this paper, we propose a novel supervised feature selection method that directly identifies the top k features. The new method is formulated as a classic regularized least squares regression model with two groups of variables. The subproblem with respect to one group of variables turns out to be a 0-1 integer program, which is generally considered very hard to solve. To address this, we use an efficient optimization scheme that first replaces the discrete 0-1 constraints with two equivalent continuous constraints and then applies the alternating direction method of multipliers (ADMM) to the resulting problem. The obtained result is the best subset of k features under the proposed criterion rather than the set of k individually top-ranked features. Experiments on benchmark data sets demonstrate the effectiveness of the proposed method.
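The abstract only outlines the optimization. As a rough illustration (not the authors' exact algorithm), the sketch below shows the general structure such an approach could take: a relaxed 0-1 selection vector is optimized jointly with ridge-regression weights, and ADMM-style splitting keeps a constrained copy of the selection variable. The box-plus-cardinality relaxation, the variable names, and the lam/rho values are all assumptions made for illustration.

```python
import numpy as np

# Rough sketch only: a relaxed 0-1 selection vector s is optimized together with
# ridge-regression weights w, with ADMM-style splitting between s and a copy z
# that is projected onto {0 <= z <= 1, sum(z) = k}. The exact constraints and
# updates in the paper may differ; lam and rho below are illustrative values.

rng = np.random.default_rng(0)
n, d, k = 200, 50, 5
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:k] = rng.standard_normal(k)
y = X @ w_true + 0.1 * rng.standard_normal(n)

lam, rho = 1e-2, 1.0            # ridge penalty and ADMM penalty (assumed)
s = np.full(d, k / d)           # relaxed feature-selection variable
z = s.copy()                    # constrained ADMM copy of s
u = np.zeros(d)                 # scaled dual variable

def project_box_cardinality(v, k, iters=60):
    """Project v onto {z : 0 <= z <= 1, sum(z) = k} by bisection on a shift."""
    lo, hi = v.min() - 1.0, v.max()
    for _ in range(iters):
        tau = 0.5 * (lo + hi)
        if np.clip(v - tau, 0.0, 1.0).sum() > k:
            lo = tau
        else:
            hi = tau
    return np.clip(v - 0.5 * (lo + hi), 0.0, 1.0)

for _ in range(100):
    # w-step: ridge regression on the s-weighted features X * s
    Xs = X * s
    w = np.linalg.solve(Xs.T @ Xs + lam * np.eye(d), Xs.T @ y)
    # s-step: the loss is quadratic in s because X diag(s) w = (X * w) s
    A = X * w
    s = np.linalg.solve(2 * A.T @ A + rho * np.eye(d),
                        2 * A.T @ y + rho * (z - u))
    # z-step: project back toward the box-and-cardinality feasible set
    z = project_box_cardinality(s + u, k)
    # dual update
    u += s - z

selected = np.argsort(-z)[:k]    # indices of the k selected features
print("selected features:", np.sort(selected))
```

The bisection projection here is just one way to handle the relaxed constraint set; the paper's two equivalent continuous constraints may be formulated differently, so the sketch is meant only to convey the structure of the alternating updates.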
History
Start page
1646
End page
1653
Total pages
8
Outlet
Proceedings of the 26th International Joint Conference on Artificial Intelligence (IJCAI 2017)
Name of conference
IJCAI 2017
Publisher
International Joint Conferences on Artificial Intelligence