Support vector machines (SVMs) have drawn wide attention over the last two decades due to their extensive applications, and a vast body of work has developed optimization algorithms to solve SVMs with various soft-margin losses. In contrast to that body of work, this paper aims at solving an ideal soft-margin loss SVM: the $L_{0/1}$ soft-margin loss SVM (dubbed $L_{0/1}$-SVM). Many of the existing (non)convex soft-margin losses can be viewed as surrogates of the $L_{0/1}$ soft-margin loss. Despite its discrete nature, we establish the optimality theory for the $L_{0/1}$-SVM, including the existence of optimal solutions and their relationship to P-stationary points. These results not only enable us to deliver a rigorous definition of $L_{0/1}$ support vectors but also allow us to define a working set. Integrating such a working set, we then propose a fast alternating direction method of multipliers (ADMM) whose limit point is a locally optimal solution to the $L_{0/1}$-SVM. Finally, numerical experiments demonstrate that the proposed method outperforms some leading classification solvers from the SVM community, in terms of both computational speed and number of support vectors. The larger the data size, the more evident its advantage.
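For concreteness, the $L_{0/1}$-SVM discussed above is commonly formulated as follows (a sketch based on the standard formulation in this literature, not quoted from the paper itself):

$$\min_{\mathbf{w},\,b}\ \frac{1}{2}\|\mathbf{w}\|^2 + C\sum_{i=1}^{m} \ell_{0/1}\!\left(1 - y_i(\mathbf{w}^\top \mathbf{x}_i + b)\right), \qquad \ell_{0/1}(t) = \begin{cases} 1, & t > 0,\\ 0, & t \le 0, \end{cases}$$

where $\ell_{0/1}$ counts each margin violation exactly once. The discrete loss, together with the hinge loss of the classical soft-margin SVM as one example of the convex surrogates the abstract mentions, can be sketched in a few lines (function names here are illustrative, not from the paper):

```python
import numpy as np

def l01_loss(margins):
    """L_{0/1} soft-margin loss: 1 for each margin violation
    (1 - y_i * f(x_i) > 0), 0 otherwise. Discrete and non-convex."""
    return (1.0 - margins > 0).astype(float)

def hinge_loss(margins):
    """Hinge loss max(0, 1 - y_i * f(x_i)): a convex surrogate of the
    L_{0/1} soft-margin loss, used by the classical soft-margin SVM."""
    return np.maximum(0.0, 1.0 - margins)

# margins y_i * f(x_i): negative = misclassified, in (0, 1) = inside margin
margins = np.array([-0.5, 0.0, 0.5, 1.0, 2.0])
print(l01_loss(margins))    # [1. 1. 1. 0. 0.]
print(hinge_loss(margins))  # [1.5 1.  0.5 0.  0. ]
```

Note that the hinge loss penalizes violations proportionally to their depth, whereas $\ell_{0/1}$ treats every violation equally, which is what makes the $L_{0/1}$-SVM discrete and motivates the P-stationarity analysis and working-set ADMM summarized in the abstract.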