We present a follow-up method based on supervised machine learning (ML) to improve the performance of searches for gravitational-wave (GW) bursts from core-collapse supernovae (CCSNe) with the coherent WaveBurst (cWB) pipeline. The ML model discriminates noise from signal events using as features a set of reconstruction parameters provided by cWB. Events classified as noise are discarded, reducing the false alarm rate (FAR) and the false alarm probability (FAP) and thus enhancing the statistical significance of candidate events. We tested the proposed method using strain data from the first half of the third observing run of Advanced LIGO, together with CCSNe GW signals extracted from 3D simulations. The ML model is trained on a dataset of noise and signal events and then used to identify and discard noise events in cWB analyses. Noise and signal reduction levels were examined for single-detector networks (L1 and H1) and for the two-detector network (L1H1). The FAR was reduced by a factor of $\sim10$ to $\sim100$, the statistical significance was enhanced by $\sim1$ to $\sim2\sigma$, and there was no impact on detection efficiencies.
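The workflow described above (train a supervised classifier on reconstruction features, then discard background events it labels as noise) can be sketched minimally as follows. This is an illustrative toy, not the paper's actual model: the feature names (`cc`, a network-correlation-like quantity, and `rho`, an SNR-like quantity), the synthetic event distributions, and the single-threshold "decision stump" classifier are all assumptions standing in for cWB's real reconstruction parameters and the real ML model.

```python
# Toy sketch of an ML follow-up veto: train on labeled events, then
# discard background events classified as noise to lower the FAR.
# All names and distributions here are hypothetical illustrations.
import random

random.seed(0)

def make_events(n, is_signal):
    # Synthetic cWB-like reconstruction features: a correlation-like
    # statistic (cc) and an SNR-like statistic (rho). Signal events are
    # drawn from distributions shifted toward higher values.
    events = []
    for _ in range(n):
        if is_signal:
            cc = random.uniform(0.6, 1.0)
            rho = random.uniform(6.0, 20.0)
        else:
            cc = random.uniform(0.0, 0.8)
            rho = random.uniform(4.0, 9.0)
        events.append({"cc": cc, "rho": rho, "signal": is_signal})
    return events

# Training set: labeled signal and noise events.
train = make_events(500, True) + make_events(500, False)

def train_stump(events):
    # Simplest possible "supervised model": choose the cc threshold
    # that best separates signal from noise on the training set.
    best_t, best_acc = 0.0, 0.0
    for t in (i / 100 for i in range(101)):
        acc = sum((e["cc"] >= t) == e["signal"] for e in events) / len(events)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

threshold = train_stump(train)

# Follow-up step: apply the trained model to a background (noise-only)
# set; events predicted as noise are discarded, so fewer background
# events survive at a given ranking statistic, reducing the FAR.
background = make_events(1000, False)
surviving = [e for e in background if e["cc"] >= threshold]
print(f"threshold={threshold:.2f}, background kept: {len(surviving)}/1000")
```

In practice the same trained model is also applied to the signal set to verify that few true signals are rejected, which is the sense in which detection efficiency is left unchanged.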