As modern Electric Connected Vehicles (ECVs) become more intelligent and offer richer entertainment services, the Multi-access Edge Computing (MEC) servers deployed near Road Side Units (RSUs) are expected not only to improve computing performance but also to alleviate the ECVs' battery burden. However, the uneven spatial and temporal distribution of vehicle arrivals causes the MEC servers in busy areas to become overloaded, which degrades computation performance and limits the energy savings for ECVs. In this paper, we consider task distribution among nearby MEC servers and propose a novel partial offloading strategy in which multiple blocks of a task are cooperatively processed by MEC servers along the vehicles' moving directions. We formulate a mathematical model that jointly optimizes the computing latency and energy consumption of ECVs, and then propose a Deep Reinforcement Learning (DRL) based partial task offloading strategy in which a Deep Q-Network (DQN) makes the offloading decisions. Simulation results show that, compared with conventional methods, the proposed strategy significantly reduces energy consumption and improves computation performance.
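The abstract does not specify the DQN architecture or the state/action encoding, so the following is only a minimal illustrative sketch of the decision step: an epsilon-greedy choice over candidate MEC servers (or local computation) using a tiny linear Q approximator in place of the full DQN. All dimensions, the cost weights, and the load model are hypothetical assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SERVERS = 3   # hypothetical: candidate MEC servers along the moving direction
N_BLOCKS = 4    # hypothetical: task blocks available for offloading

STATE_DIM = N_SERVERS + 1   # per-server load + remaining task fraction
N_ACTIONS = N_SERVERS + 1   # offload next block to server i, or compute locally

# Tiny linear Q approximator standing in for the DQN: Q(s) = s @ W + b.
W = rng.normal(scale=0.1, size=(STATE_DIM, N_ACTIONS))
b = np.zeros(N_ACTIONS)

def q_values(state: np.ndarray) -> np.ndarray:
    return state @ W + b

def choose_action(state: np.ndarray, epsilon: float = 0.1) -> int:
    """Epsilon-greedy offloading decision, as used during DQN training."""
    if rng.random() < epsilon:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(q_values(state)))

def cost(state: np.ndarray, action: int) -> float:
    """Illustrative joint cost: weighted latency plus ECV energy.
    Offloading to a lightly loaded server is cheap in energy; local
    computation avoids transmission delay but drains the battery."""
    remaining = state[-1]
    if action < N_SERVERS:              # offload one block
        latency = 0.2 + state[action]   # queueing grows with server load
        energy = 0.05                   # transmission energy only
    else:                               # compute locally
        latency = 0.5
        energy = 0.4                    # battery burden
    return 0.6 * latency * remaining + 0.4 * energy

# Example state: loads of 3 servers plus the full task still remaining.
state = np.array([0.2, 0.8, 0.5, 1.0])
a = choose_action(state, epsilon=0.0)   # purely greedy decision
print("chosen action:", a, "cost:", round(cost(state, a), 3))
```

In a full DQN implementation the linear map would be replaced by a neural network trained from replayed transitions, with the negative of a cost like the one above serving as the reward.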