This paper presents a novel technique for linearizing the energy spectra of radiation detectors in commercial radioisotope identification devices. Based on a few spectrum measurements with standard radionuclide sources, the method allows individual nonlinear calibration functions to be generated at minimal expense during routine instrument setup. Instead of fitting peak positions, the measured raw data are compared with simulated spectrum templates, and the local gain factors providing the best correspondence are taken as reference points for the calibration function. This approach avoids the problem of fitting multiple peaks whose intensity ratios are influenced by absorbing layers, and ensures an accuracy of 1% over the energy range of 30 keV to 3 MeV.
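A minimal sketch of the template-matching idea summarized above, not the authors' implementation: for each spectral window, trial gain factors stretch a simulated template against the measured raw counts, the best-matching gain becomes a reference point, and a smooth nonlinear calibration is fitted through those points. The function names, the Poisson-weighted chi-square match criterion, the window layout, and the polynomial calibration model are all illustrative assumptions.

    import numpy as np

    def best_local_gain(measured, template, window, gains):
        """Scan trial gain factors over one spectral window and return the
        gain whose stretched template matches the measured counts best."""
        lo, hi = window
        channels = np.arange(lo, hi)
        best_gain, best_cost = gains[0], np.inf
        for g in gains:
            # Stretch the template's channel axis by the trial gain factor.
            stretched = np.interp(channels * g,
                                  np.arange(len(template)), template)
            # Normalize intensity so only the spectral shape is compared;
            # this sidesteps intensity ratios altered by absorbing layers.
            scale = measured[lo:hi].sum() / max(stretched.sum(), 1e-12)
            # Poisson-weighted chi-square as an assumed match criterion.
            cost = np.sum((measured[lo:hi] - scale * stretched) ** 2
                          / np.maximum(measured[lo:hi], 1.0))
            if cost < best_cost:
                best_gain, best_cost = g, cost
        return best_gain

    def calibration_from_gains(window_centers, local_gains, degree=3):
        """Fit a smooth nonlinear calibration E(channel) through the
        reference points (channel, channel * local_gain)."""
        centers = np.asarray(window_centers, dtype=float)
        energies = centers * np.asarray(local_gains, dtype=float)
        coeffs = np.polyfit(centers, energies, degree)
        return np.poly1d(coeffs)

    # Hypothetical use: 'measured' and 'template' are same-binned arrays
    # (raw detector counts and a simulated source response, respectively).
    #   gains = np.linspace(0.95, 1.05, 101)
    #   refs = [best_local_gain(measured, template, w, gains) for w in windows]
    #   energy_of_channel = calibration_from_gains(
    #       [0.5 * (lo + hi) for lo, hi in windows], refs)

Because each window contributes only one gain reference point, no individual peak fit is required, which is what lets the calibration tolerate intensity distortions from absorbing layers.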