The mass-loss rates of planets undergoing core-powered escape are usually modeled with an isothermal Parker-type wind at the equilibrium temperature, $T_\mathrm{eq}$. However, the upper atmospheres of sub-Neptunes may not be isothermal if there are significant differences between the opacity to incident visible radiation and the opacity to outgoing infrared radiation. We model bolometrically driven escape with aiolos, a hydrodynamic radiative-transfer code that incorporates double-gray opacities, to investigate the dependence of this process on the visible-to-infrared opacity ratio, $\gamma$. For $\gamma \approx 1$, we find that the resulting mass-loss rates are well approximated by a Parker-type wind with an isothermal temperature $T = T_\mathrm{eq}/2^{1/4}$, the skin temperature of an irradiated atmosphere at low optical depth. However, we show that over a range of physically plausible values of $\gamma$, the mass-loss rates can vary by orders of magnitude, from $10^{-5} \times$ the isothermal rate at low $\gamma$ to $10^{5} \times$ the isothermal rate at high $\gamma$. These differences are largest for small planet radii; for large planet radii, the mass-loss rates become nearly independent of $\gamma$ and approach the isothermal approximation. Incorporating these opacity-dependent mass-loss rates into a self-consistent planetary mass and energy evolution model, we show that lower (higher) values of $\gamma$ lead to more (less) hydrogen being retained after core-powered mass loss. In some cases, the choice of opacities determines whether a planet can retain a significant primordial hydrogen atmosphere at all. The dependence of the escape rate on the opacity ratio may allow atmospheric-escape observations to directly constrain a planet's opacities, and therefore its atmospheric composition.
Comment: 24 pages, 10 figures. Submitted to ApJ
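To make the isothermal baseline concrete, here is a minimal Python sketch of the Parker-type wind mass-loss rate evaluated at the abstract's $\gamma \approx 1$ temperature, $T = T_\mathrm{eq}/2^{1/4}$. This is an illustration under stated assumptions, not the paper's implementation: the mean molecular weight, the base density `rho_p`, and the example planet parameters are hypothetical, and the density is extrapolated hydrostatically from the wind base to the sonic point, the standard approximation in core-powered mass-loss estimates.

```python
import numpy as np

# Physical constants (cgs)
G = 6.674e-8     # gravitational constant [cm^3 g^-1 s^-2]
k_B = 1.381e-16  # Boltzmann constant [erg K^-1]
m_H = 1.673e-24  # hydrogen atom mass [g]

def parker_mass_loss(M_p, R_p, rho_p, T_eq, mu=2.35):
    """Isothermal Parker-type wind mass-loss rate at T = T_eq / 2**0.25.

    M_p   : planet mass [g]
    R_p   : radius of the wind base [cm]
    rho_p : gas density at the wind base [g cm^-3] (assumed value)
    T_eq  : equilibrium temperature [K]
    mu    : mean molecular weight (2.35 ~ H/He mix; assumed value)
    """
    T = T_eq / 2**0.25                 # effective wind temperature
    cs2 = k_B * T / (mu * m_H)         # isothermal sound speed squared
    r_s = G * M_p / (2.0 * cs2)        # sonic (critical) radius
    # Hydrostatic extrapolation of density from R_p out to the sonic
    # point; valid for r_s > R_p (bound, subsonic wind base).
    rho_s = rho_p * np.exp(G * M_p / cs2 * (1.0 / r_s - 1.0 / R_p))
    # Mass flux through the sonic surface, where the flow speed equals
    # the sound speed: Mdot = 4 pi r_s^2 rho_s c_s  [g s^-1]
    return 4.0 * np.pi * r_s**2 * rho_s * np.sqrt(cs2)

# Example with hypothetical sub-Neptune parameters (4 M_Earth, 2 R_Earth,
# T_eq = 1000 K, base density 1e-9 g cm^-3):
M_earth, R_earth = 5.972e27, 6.371e8
print(f"{parker_mass_loss(4*M_earth, 2*R_earth, 1e-9, 1000.0):.2e} g/s")
```

Because the rate depends exponentially on $GM_p/(c_s^2 R_p)$, even the factor-of-$2^{1/4}$ temperature shift relative to $T_\mathrm{eq}$ changes $\dot{M}$ substantially, which is why the non-isothermal, $\gamma$-dependent temperature structures studied in the paper can move the mass-loss rate by orders of magnitude.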