A new method is proposed for the empirical characterization of the nonlinear thermal resistance in GaN HEMTs. Low-frequency (LF) dispersion due to self-heating is used as the sensing parameter of the channel-temperature changes caused by variations in dissipated power. Since GaN HEMTs are also affected by LF dispersion from charge-trapping phenomena, the thermal resistance description is embedded into a full electrothermal model, and the model parameters are extracted by best-fitting the measured data. The method involves only multi-bias small-signal S-parameter and DC I/V measurements at different baseplate temperatures. It is non-invasive and requires neither special-purpose device geometries/structures nor measurements in the conduction region of the gate junction. Preliminary validation of the method is based on comparisons with measured electrothermal data, with data from the literature on similar devices, and with results provided by a large-signal RF device model embedding the thermal resistance description.
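To illustrate the kind of parameter extraction described above, the following is a minimal sketch of best-fitting a nonlinear thermal-resistance model to channel-temperature-rise data. The model form Rth(P) = R0·(1 + k·P), the parameter values, and the synthetic "measured" data are all hypothetical assumptions for illustration, not taken from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed nonlinear thermal-resistance model: Rth increases with dissipated power,
# so the channel-temperature rise is dT = Rth(P) * P = R0 * (1 + k * P) * P.
def delta_t(p_diss, r0, k):
    return r0 * (1.0 + k * p_diss) * p_diss

# Synthetic "measured" channel-temperature rise (K) versus dissipated power (W);
# the values below are illustrative only.
p = np.linspace(0.1, 3.0, 15)        # dissipated power, W
true_r0, true_k = 40.0, 0.08         # assumed K/W and 1/W
rng = np.random.default_rng(0)
dt_meas = delta_t(p, true_r0, true_k) + rng.normal(0.0, 0.5, p.size)

# Best-fit of the model parameters (R0, k) to the data
popt, pcov = curve_fit(delta_t, p, dt_meas, p0=[30.0, 0.0])
r0_fit, k_fit = popt
print(f"R0 = {r0_fit:.1f} K/W, k = {k_fit:.3f} 1/W")
```

In the paper's actual method the fitted quantity is not a directly measured temperature but the LF dispersion observed in multi-bias S-parameters and DC I/V curves at several baseplate temperatures, with trapping effects handled by the full electrothermal model; the sketch only conveys the best-fit extraction step.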