The time-based ADC is an essential block in software-defined radio receivers because it achieves higher speed and lower power than the conventional voltage-domain ADC, especially in scaled CMOS technologies. In a time-based ADC, the input voltage is first converted to a pulse delay by a Voltage-to-Time Converter (VTC) circuit, and this delay is then converted to a digital word by a Time-to-Digital Converter (TDC) circuit. In this paper, an analytical model of the VTC circuit's timing jitter and timing skew, caused by noise and process variations respectively, is proposed. The derived model is verified against Monte Carlo simulations and Eldo transient noise simulations in an industrial 65-nm CMOS technology. This paper also shows how the timing jitter and skew can be reduced through circuit design knobs such as the supply voltage and the load capacitance.
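The two-stage conversion chain described above can be illustrated with a minimal behavioral sketch. This is not the paper's model: it assumes an idealized linear VTC, and the gain, offset, and resolution values are illustrative placeholders chosen for the example.

```python
# Behavioral sketch of a time-based ADC: voltage -> delay (VTC) -> code (TDC).
# All numeric parameters below are illustrative assumptions, not from the paper.

def vtc(v_in, t0=1e-9, k_vt=2e-9):
    """Voltage-to-Time Converter (assumed linear).

    t0   : zero-input pulse delay in seconds (assumed).
    k_vt : voltage-to-time gain in s/V (assumed constant, i.e. ideal linearity).
    """
    return t0 + k_vt * v_in

def tdc(delay, t_lsb=0.25e-9, n_bits=4):
    """Time-to-Digital Converter: quantize a delay into an n-bit code.

    t_lsb : time resolution of one LSB in seconds (assumed).
    Rounding is used here for numerical robustness; a hardware TDC
    typically truncates (floors) to the nearest elapsed LSB.
    """
    code = int(round(delay / t_lsb))
    return max(0, min(code, 2**n_bits - 1))  # clamp to the code range

def time_based_adc(v_in):
    """Full chain: input voltage -> pulse delay -> digital word."""
    return tdc(vtc(v_in))
```

In this sketch, jitter on the VTC delay would translate directly into code errors at the TDC output, which is why the paper's analysis focuses on the VTC's timing jitter and skew.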