While machine learning systems achieve superhuman performance in tasks such as language processing and image and video recognition, they do so by training on extremely large datasets and consuming vast amounts of power. The brain, in contrast, remains superior in several cognitively challenging tasks while operating on roughly the energy of a small lightbulb. To explore how biology achieves such efficiency, we built a biologically constrained spiking neural network model and assessed its learning capacity on a range of discrimination tasks. We found that a form of structural plasticity, namely the ability of synapses to be continuously formed and eliminated, yields higher accuracy and faster learning across all scenarios tested. These improvements are largest under more difficult learning conditions, such as when the number of trainable parameters is halved or when task difficulty is increased. This study highlights the important role of structural plasticity in optimizing learning in biological circuits and opens new avenues for exploring the applicability of such biological plasticity rules in machine learning.
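The structural plasticity described above, continuous elimination of weak synapses paired with formation of new ones, can be illustrated with a minimal sketch. This is not the study's implementation: the function name, pruning threshold, initial weight, and network size are all illustrative assumptions.

```python
import random

def structural_plasticity_step(weights, prune_threshold=0.05, n_new=2, rng=None):
    """One hypothetical rewiring step: eliminate weak synapses, form new ones.

    `weights` maps (pre, post) neuron index pairs to synaptic strengths.
    Threshold and counts are illustrative, not taken from the study.
    """
    rng = rng or random.Random(0)
    # Eliminate synapses whose strength has fallen below the threshold.
    surviving = {pair: w for pair, w in weights.items() if abs(w) >= prune_threshold}
    # Form new synapses at randomly chosen unused sites with a small nascent weight.
    neurons = range(10)  # hypothetical network size
    candidates = [(i, j) for i in neurons for j in neurons
                  if i != j and (i, j) not in surviving]
    for pair in rng.sample(candidates, min(n_new, len(candidates))):
        surviving[pair] = 0.1  # small initial weight for a newly formed synapse
    return surviving

# Toy connectivity: the (1, 2) synapse is weak and will be eliminated.
weights = {(0, 1): 0.5, (1, 2): 0.01, (2, 3): 0.3}
rewired = structural_plasticity_step(weights)
```

In a training loop, a step like this would run periodically between weight updates, so that connectivity itself, not just synaptic strength, is shaped by learning.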