Training artificial neural networks (ANNs) is a complex and important task in supervised learning. Evolutionary algorithms (EAs) are widely used as global search techniques for optimization in scientific and engineering problems, and they have been applied to ANNs for tasks such as connection weight training and architecture design. Recently, a novel optimization algorithm called the Group Search Optimizer (GSO) was introduced, inspired by animal searching behaviour and group living theory. In this paper, we present two new hybrid GSO approaches: one based on opposite populations, and the other based on opposite populations combined with a modified Differential Evolution (DE) strategy. We also apply the Weight Decay (WD) heuristic to enhance the generalization power of the networks. Experimental results show that the proposed GSO approaches achieve better generalization performance than Levenberg-Marquardt (LM), Opposite Differential Evolution (ODE) and traditional GSO on real benchmark datasets.
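As a rough illustration of the opposition-based idea shared by ODE and the proposed hybrids, the sketch below initializes a population, reflects each candidate across the midpoint of its search bounds, and keeps the fittest members of the union. The function names, the toy sphere objective, and the minimization assumption are illustrative choices, not the paper's actual implementation.

```python
import random

def opposite(x, low, high):
    # Opposition-based point: reflect each coordinate across the
    # midpoint of its bounds, x_opp = low + high - x.
    return [lo + hi - xi for xi, lo, hi in zip(x, low, high)]

def opposition_init(pop_size, low, high, fitness, rng=random):
    # Sample a random population, add each member's opposite, and keep
    # the pop_size best candidates of the union (assumes minimization).
    dim = len(low)
    pop = [[rng.uniform(low[d], high[d]) for d in range(dim)]
           for _ in range(pop_size)]
    union = pop + [opposite(x, low, high) for x in pop]
    union.sort(key=fitness)
    return union[:pop_size]

if __name__ == "__main__":
    random.seed(0)
    sphere = lambda x: sum(v * v for v in x)  # toy objective
    pop = opposition_init(10, [-5.0] * 3, [5.0] * 3, sphere)
    print(len(pop))
```

In a hybrid optimizer, the same best-of-point-and-opposite selection can also be applied periodically during the search (a "generation jumping" step), not only at initialization.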