Abstract
Broad Learning System (BLS) is an emerging network paradigm that has received considerable attention in regression and classification. However, two deficiencies seriously hinder its deployment in real-world applications. First, the internal correlations among samples are not fully considered in the modeling process. Second, the strict binary label matrix used in BLS leaves little flexibility for classification. To address these issues, we propose to impose group-sparsity constraints on the class-specific transformed features and on the label error terms, respectively. In this way, more appropriate margins between samples are preserved, and the learned label space becomes flexible enough for recognition. As a result, the learned projection matrix exhibits stronger discriminative ability. Furthermore, we employ the alternating direction method of multipliers (ADMM) to solve the resulting optimization problem. Extensive experiments and analyses on diverse benchmark databases confirm the superiority of the proposed model over competing classification methods.
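As a rough illustration only (not the paper's actual update rules), the sketch below shows the row-wise soft-thresholding step that a standard ADMM treatment of a group-sparsity penalty would involve, assuming the penalty is the common ℓ2,1 norm; the function name `prox_l21` and this concrete choice of norm are illustrative assumptions rather than details taken from the abstract.

```python
import numpy as np

def prox_l21(M, tau):
    """Proximal operator of tau * ||M||_{2,1} (row-wise group soft-thresholding).

    Each row of M is shrunk toward zero by tau in Euclidean norm; rows whose
    norm falls below tau are set exactly to zero, which is what induces
    group (row-wise) sparsity in ADMM subproblems of this form.
    """
    row_norms = np.linalg.norm(M, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(row_norms, 1e-12), 0.0)
    return scale * M

# Toy usage: rows with small norm are zeroed out, the "active" row survives.
rng = np.random.default_rng(0)
E = rng.normal(scale=0.1, size=(5, 3))
E[0] *= 10  # make one row clearly dominant
print(prox_l21(E, tau=0.5))
```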