Hi,

I am currently designing a two-stage open-loop comparator (differential amplifier + current-sink inverter output stage) in Cadence.

In order to ensure "safe" and error-free operation of the comparator, it is said that the inputs of the device should lie within the defined input common-mode range (ICMR). To my understanding this means: if both inputs are within the ICMR, then all transistors operate in saturation. Saturation specifically, NOT just turned on (i.e. saturation OR linear/triode)? Is that true? Correct me if I am wrong.
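For reference, here is my current understanding of the ICMR limits, assuming an n-channel input pair (M1/M2) with diode-connected p-channel current-mirror loads (M3/M4) and an n-channel tail current sink (M5); these device labels are just my own numbering and may not match your schematic:

% Assumed topology: NMOS input pair M1/M2, diode-connected PMOS load M3, NMOS tail sink M5
% Upper limit: M1 must stay saturated against the drop across the diode-connected load M3
V_{in,\max} = V_{DD} - V_{SG3} + V_{TN1}
% Lower limit: the tail current sink M5 must stay saturated
V_{in,\min} = V_{SS} + V_{DS5,\mathrm{sat}} + V_{GS1}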

Maybe somebody could give a definition of the ICMR with respect to my specific application, i.e. why does it matter here?

Moreover, I encountered an issue when I shifted one input towards the lower end of the ICMR while keeping the other exactly at the center of the range. In the DC simulation, one of my input transistors switched into the linear (triode) operating region. Is this behaviour normal, or did I possibly calculate something wrongly?
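For clarity, the condition I am checking for each transistor at the DC operating point is the standard saturation requirement:

% Device is saturated when the drain-source voltage exceeds the gate overdrive
V_{DS} \ge V_{GS} - V_{T} = V_{ov}
% otherwise the device is in the linear (triode) region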

Your help and hints are much appreciated :)

P.S. How can you "simulate" the ICMR in Cadence and verify your calculated values?
