Hi
I am designing a TIA in Cadence with an inverter-stage amplifier. There are some fundamental questions bugging me about the biasing of the MOSFETs in the amplifier. I found from simulations that the threshold voltage of the MOSFETs is around 0.25 V.
I have two interpretations of the TIA circuit, shown in the attached figure.
If the photodiode is modeled as a DC current source in parallel with its capacitance, and this is connected to the op-amp input, which is the gate of the MOSFETs, how does that current translate into a voltage that biases the MOSFETs in the amplifier? This is shown as Case I in the figure.
An alternative scenario is to use a resistor that converts the photocurrent into a voltage, and this voltage drives the MOSFET stages in the amplifier (Case II). The issue with this approach is that the resistance needs to be large enough to pull the gate voltage above the 0.25 V threshold, which means a large resistor (kilo-ohms) that would limit the bandwidth and make the photodiode node the dominant pole contributor. Also, using a series resistor reduces the significance of the TIA, since it is then the resistor that performs the I-to-V conversion and not the TIA.
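To make that trade-off concrete, here is a rough back-of-envelope sketch of Case II (the photodiode capacitance of 1 pF is just an assumed placeholder, not a value from my design): the resistor needed to reach Vth at 50 µA, and the pole it forms with the photodiode capacitance.

```python
# Rough Case II estimate: series resistor needed to bias the gate at Vth,
# and the RC pole it creates at the photodiode node.
import math

i_pd = 50e-6    # photocurrent, A (from the question)
v_th = 0.25     # MOSFET threshold voltage from simulation, V
c_pd = 1e-12    # ASSUMED photodiode capacitance, F (placeholder value)

r_min = v_th / i_pd                          # resistance to reach Vth at the gate
f_pole = 1 / (2 * math.pi * r_min * c_pd)    # pole formed by R and C_pd

print(f"R needed for V_G = Vth: {r_min / 1e3:.1f} kOhm")
print(f"Resulting input pole:   {f_pole / 1e6:.1f} MHz")
```

With these numbers the resistor comes out around 5 kOhm, and the actual bandwidth penalty depends entirely on the real photodiode capacitance.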
So ultimately I am not able to decide whether the resistor has to be included or not, and how a signal current as small as 50 µA can produce an adequate voltage to bias the MOSFETs in the amplifier.