The complexity and computational resource demands of large-scale virtual-world simulations are growing with advances in CPU and GPU computing power, the availability of broadband internet connectivity, and falling hardware costs. Against this backdrop, cloud-based services are gaining popularity for hosting simulations because of their cost-effectiveness, ease of resource management and scaling, and data-center availability in virtually every region of the world. The next phase of cloud-based simulation hosting will address the problems of over- and under-utilization of compute resources by incorporating big-data analytics and processing, software containerization, automated allocation schemes, and load balancers. To meet these demands, this study performs the preliminary work of profiling virtual-world simulation loads and then determines which form of analysis (linear or non-linear methods) produces the lowest-error prediction of resource needs. As a result of this study, future automated load balancers will be able to estimate upcoming simulation demands and adjust current resource allocations to better match actual needs.
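The linear-versus-non-linear comparison described above can be illustrated with a minimal sketch. This is not the study's actual pipeline; it assumes a synthetic, sinusoidal daily CPU-load trace and uses NumPy's least-squares polynomial fitting (degree 1 for the linear model, degree 3 as a stand-in non-linear model), ranking the two by root-mean-square prediction error on a random holdout set.

```python
# Hypothetical illustration: compare a linear fit against a simple
# non-linear (cubic polynomial) fit on a synthetic simulation-load trace.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "simulation load" profile: a daily-cycle-like curve plus noise.
t = np.linspace(0, 24, 200)                          # hours in one day
load = 40 + 25 * np.sin(t * np.pi / 12) + rng.normal(0, 3, t.size)

# Random 75/25 train/test split over the sampled points.
idx = rng.permutation(t.size)
split = int(0.75 * t.size)
t_train, y_train = t[idx[:split]], load[idx[:split]]
t_test, y_test = t[idx[split:]], load[idx[split:]]

def rmse(y, y_hat):
    """Root-mean-square error between observed and predicted load."""
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

# Fit each candidate model by least squares and score it on held-out points.
results = {}
for name, degree in [("linear", 1), ("non-linear (cubic)", 3)]:
    coeffs = np.polyfit(t_train, y_train, degree)
    results[name] = rmse(y_test, np.polyval(coeffs, t_test))

# Report models from lowest to highest prediction error.
for name, err in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{name}: RMSE = {err:.2f}")
```

On this cyclic trace the cubic model tracks the daily curve that a straight line cannot, so it yields the lower error; a load balancer would select whichever model scores best on the profiled workload.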