In this paper, we study a regression model with a break in the trend regressor, in which the model errors are assumed to be mildly integrated. To be precise, we suppose the model errors are generated by an AR(1) process with autoregressive coefficient $\rho_T = 1 + c/k_T$, where $T$ is the sample size, $c$ is a negative constant, and $\{k_T, T \in \mathbb{N}\}$ is a sequence of positive constants diverging to infinity such that $k_T = o(T)$. We estimate the break date/break fraction and the other parameters of the model by least squares. We examine the asymptotic properties of the estimates, including consistency, rates of convergence, and limiting distributions. The results derived in this paper bridge the findings of Perron and Zhu (Journal of Econometrics 129:65–119, 2005), who estimated the break date/break fraction in the trend regressor under I(0) and I(1) model errors. We also show that a phase transition in the estimation error of the least squares estimate of the break date occurs when $k_T$ is of the same order of magnitude as $T^{1/2}$. Monte Carlo simulations and an empirical study illustrate the finite-sample performance of the estimates.
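As a minimal sketch, not the paper's implementation, the following Python code illustrates the setup described above: a broken linear trend with mildly integrated AR(1) errors, $\rho_T = 1 + c/k_T$, and a least squares break-date estimator that minimizes the sum of squared residuals over candidate break dates. All concrete values (sample size, break fraction, trend slopes, $c$, and the choice $k_T = T^{1/2}$) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the paper)
T = 500                              # sample size
true_tau = 0.5                       # true break fraction
mu, beta1, beta2 = 1.0, 0.5, 1.5     # intercept, pre-break slope, slope change
c, kT = -1.0, np.sqrt(T)             # mildly integrated: rho_T = 1 + c/k_T, k_T = o(T)
rho = 1.0 + c / kT

rng = np.random.default_rng(0)

# AR(1) errors: u_t = rho_T * u_{t-1} + e_t
e = rng.standard_normal(T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = rho * u[t - 1] + e[t]

# Break-in-trend regression: y_t = mu + beta1*t + beta2*(t - T0)*1{t > T0} + u_t
t_idx = np.arange(1, T + 1)
T0 = int(true_tau * T)
y = mu + beta1 * t_idx + beta2 * np.maximum(t_idx - T0, 0) + u

def ssr_at(k):
    """Sum of squared residuals from OLS with candidate break date k."""
    X = np.column_stack([np.ones(T), t_idx, np.maximum(t_idx - k, 0)])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return resid @ resid

# Least squares break-date estimate: the candidate minimizing the SSR,
# with the ends trimmed so the break fraction stays in the interior of (0, 1)
trim = int(0.05 * T)
k_hat = min(range(trim, T - trim), key=ssr_at)
print(f"estimated break fraction: {k_hat / T:.3f} (true: {true_tau})")
```

Replaying this sketch across many seeds, and varying $k_T$ from constant order up to order $T$, is one way to visualize how the estimation error of the break date behaves on either side of the $T^{1/2}$ phase transition discussed above.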