Connecting Si-IGBTs and SiC-MOSFETs in parallel to form a hybrid switch offers additional degrees of freedom for optimizing the efficiency of bridge-type power converters and inverters. To shift the switching performance closer to that of the superior MOSFET, a delay is typically introduced between the gate signals of the IGBT and the MOSFET. While previous publications have shown that optimal delay values exist, no comprehensive evaluation has so far studied the influence of current, temperature, and diode selection on the switching behavior and the delay optimum. This paper comprehensively analyzes these factors, derives corresponding design recommendations, and shows that, through proper diode and delay selection, the switching losses can be reduced by more than 46% and by up to 70% compared to an IGBT-only design.