Hardware-level security mechanisms, represented by Intel SGX, have become the mainstream approach to protecting cloud tenants' data during processing. However, different protection strategies impose very different performance penalties on tenant applications, so striking an effective balance between protection cost and security enhancement has become the key problem in choosing hardware-based security protection for data processing. To address this problem, we study a method, based on multi-objective optimization, that effectively balances the performance overhead of SGX applications against the security enhancement they gain. Because many factors contribute to the performance loss of SGX applications, the performance of CPU-intensive and concurrent programs is difficult to estimate accurately. We therefore use deep learning to analyze the hidden relationship between these performance factors and the resulting performance loss, and build a high-precision estimation model of SGX application performance loss that adapts to different application scenarios.
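To make the trade-off concrete, the sketch below illustrates the kind of multi-objective decision the abstract describes. It is not the authors' algorithm: the strategy names, overhead percentages, and security scores are all hypothetical placeholders. Each candidate SGX protection strategy is scored on estimated performance overhead (lower is better) and security level (higher is better), and a simple Pareto filter keeps only the non-dominated strategies among which a tenant would choose.

```python
# Illustrative sketch only (assumed data, not the paper's method): keep the
# Pareto-optimal SGX protection strategies under two objectives.

def pareto_front(candidates):
    """Return the names of strategies not dominated by any other.

    Strategy a dominates b if a's overhead <= b's and a's security >= b's,
    with at least one strict inequality.
    """
    front = []
    for name_a, ov_a, sec_a in candidates:
        dominated = any(
            ov_b <= ov_a and sec_b >= sec_a and (ov_b < ov_a or sec_b > sec_a)
            for _, ov_b, sec_b in candidates
        )
        if not dominated:
            front.append(name_a)
    return front

# Hypothetical strategies: (name, estimated overhead %, security score)
strategies = [
    ("no-enclave",    0.0, 0.0),
    ("data-only",    15.0, 0.6),
    ("full-enclave", 40.0, 0.9),
    ("full+paging",  70.0, 0.9),  # dominated by full-enclave: same security, more overhead
]

print(pareto_front(strategies))  # → ['no-enclave', 'data-only', 'full-enclave']
```

In the paper's setting, the overhead values would come from the deep-learning performance-loss estimator rather than being fixed constants, which is why an accurate estimation model is a prerequisite for the optimization step.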