Many critical scientific, societal, and engineering fields contain large-scale multiobjective optimization problems (LSMOPs), which comprise many decision variables. As the number of decision variables increases, however, optimization algorithms face exponentially larger search spaces, and their performance degrades. Nonetheless, LSMOPs whose optimal solutions correspond to sparse variable vectors can be solved more efficiently by evolutionary multiobjective optimization (EMO) algorithms. Despite recent strides in developing generic EMO algorithms for sparse LSMOPs, there is still room for improvement: algorithms still struggle to find convergent and diverse Pareto fronts in an acceptable amount of time when solving sparse LSMOPs with thousands of decision variables. To better solve sparse LSMOPs, we propose a novel set of evolutionary operators that adapt small-scale EMO algorithms to sparse LSMOPs. These simple and effective operators are varied striped sparse population sampling (VSSPS), sparse simulated binary crossover (S-SBX), and sparse polynomial mutation (S-PM). Combined with the nondominated sorting genetic algorithm II (NSGA-II), these operators form the proposed S-NSGA-II algorithm. S-NSGA-II runs faster than existing methods in nearly all cases on problems with up to 6400 decision variables, while matching or exceeding contemporary sparse LSMOP algorithms in hypervolume, especially on problems with more than 5000 decision variables.
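To illustrate the sparsity-preserving idea behind operators such as S-SBX, the sketch below shows a hypothetical sparse-aware variant of standard simulated binary crossover in Python. The specific rule used here — recombining only positions where both parents are nonzero, so zero entries stay zero and offspring sparsity is preserved — is an assumption made for illustration, not the abstract's exact definition of S-SBX; the function name `sparse_sbx` and its parameters are likewise hypothetical.

```python
import random

def sparse_sbx(p1, p2, eta=15.0, cx_prob=0.5):
    """Hypothetical sparse-aware SBX sketch.

    Standard SBX is applied per variable, but only at positions where
    BOTH parents are nonzero; all other positions are copied unchanged,
    so zero (sparse) entries remain zero in the offspring.
    """
    c1, c2 = list(p1), list(p2)
    for i, (x1, x2) in enumerate(zip(p1, p2)):
        # Preserve sparsity: skip positions where either parent is zero.
        if x1 == 0.0 or x2 == 0.0:
            continue
        # Per-variable crossover probability, as in standard SBX.
        if random.random() > cx_prob:
            continue
        # Spread factor beta from the SBX polynomial distribution.
        u = random.random()
        if u <= 0.5:
            beta = (2.0 * u) ** (1.0 / (eta + 1.0))
        else:
            beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
        # Children are symmetric about the parents' midpoint.
        c1[i] = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
        c2[i] = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return c1, c2
```

Note that, as in standard SBX, each recombined pair of child values preserves the parents' sum; the sparse restriction only decides which positions participate.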