We revisit and adapt the extended sequential quadratic method (ESQM) in [3] for solving a class of difference-of-convex optimization problems whose constraints are defined as the intersection of level sets of Lipschitz differentiable functions and a simple compact convex set. In particular, for this class of problems, we develop a variant of ESQM, called ESQM with extrapolation (ESQM$_e$), which incorporates Nesterov's extrapolation techniques for empirical acceleration. Under standard constraint qualifications, we show that the sequence generated by ESQM$_e$ clusters at a critical point if the extrapolation parameters are uniformly bounded above by a certain threshold. Convergence of the whole sequence and the convergence rate are established by assuming the Kurdyka-Łojasiewicz (KL) property of a suitable potential function and imposing additional differentiability assumptions on the objective and constraint functions. In addition, when the objective and constraint functions are all convex, we show that linear convergence can be established if a certain exact penalty function is known to be a KL function with exponent $\frac12$; we also discuss how the KL exponent of such an exact penalty function can be deduced from that of the original extended objective (i.e., the sum of the objective and the indicator function of the constraint set). Finally, we perform numerical experiments to demonstrate the empirical acceleration of ESQM$_e$ over a basic version of ESQM, and illustrate its effectiveness by comparing it with the natural competing algorithm SCP$_{ls}$ from [35].