Laser-plasma acceleration of electrons is characterized by ultra-high gradients, showing promise for reducing the cost and size of next-generation electron linacs. GeV electron bunches have been obtained in a 3 cm plasma channel, motivating near-term goals of demonstrating 10 GeV and even 100 GeV beams. Simulations have played a key role in supporting these efforts, but more than a million processor hours are required to accurately simulate even 1 GeV of acceleration. Simulation run time must be reduced by many orders of magnitude so that present experiments can be accurately analyzed and future experiments can be designed in advance. Time-explicit particle-in-cell (PIC) and fluid simulations provide the most complete description of the laser-plasma interaction and electron acceleration, but the enormous ratio of interaction time to laser oscillation period makes this approach unacceptably slow. Conducting such simulations in an optimally chosen Lorentz frame can potentially reduce this ratio by orders of magnitude, depending on a variety of physical parameters, without making any of the approximations required by ponderomotive guiding center or quasi-static algorithms. We present results from a Phase I SBIR project investigating this boosted-frame approach.
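The scale of the potential savings can be sketched with a standard back-of-the-envelope estimate from the boosted-frame literature: in a frame moving with Lorentz factor gamma_b along the laser axis, the laser wavelength dilates by (1 + beta)*gamma_b while the plasma column contracts by 1/gamma_b, so the number of laser oscillations spanning the interaction, which sets the explicit time-step count, shrinks by roughly (1 + beta)*gamma_b**2. The specific parameter values below (a 3 cm channel and a 0.8 um laser wavelength) are illustrative assumptions, not results from the project:

```python
import math

def boosted_frame_speedup(gamma_b):
    """Approximate reduction in the number of laser periods that must be
    resolved when simulating in a frame boosted with Lorentz factor
    gamma_b: wavelength dilation (1 + beta)*gamma_b times plasma length
    contraction gamma_b gives ~(1 + beta)*gamma_b**2 overall."""
    beta = math.sqrt(1.0 - 1.0 / gamma_b**2)
    return (1.0 + beta) * gamma_b**2

# Illustrative lab-frame numbers: 3 cm channel, 0.8 um laser wavelength
L, lam0 = 0.03, 0.8e-6
periods_lab = L / lam0   # laser periods spanning the interaction in the lab frame
for gamma_b in (1, 5, 10, 20):
    print(f"gamma_b = {gamma_b:2d}: "
          f"{periods_lab / boosted_frame_speedup(gamma_b):10.0f} periods to resolve")
```

Even a moderate boost (gamma_b ~ 10) cuts the number of resolved oscillations by about two orders of magnitude, consistent with the claim that frame choice alone can yield large run-time reductions without the approximations of reduced models.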