Hi all, can anyone please advise on how to best estimate the power consumption of a photodetector? Particularly interested in a reliable method for photoconductors and phototransistors. Thanks!
You can find a circuit example here: https://eepower.com/resistor-guide/resistor-types/photo-resistor/#
This circuit will only consume microwatts of power or less as long as the photoresistor remains dark. Once you shine light on the detector, the circuit will sink a few milliamps. Multiplying this by a supply voltage of, say, 5 V gives a few tens of milliwatts of power consumption. Note, however, that most of this power goes into lighting the LED in that circuit. If you are worried about power consumption, a photodiode circuit may be a better choice.
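As a rough back-of-the-envelope sketch of the dark vs. lit supply power for such a divider-plus-LED circuit; the resistance and LED current values below are typical orders of magnitude I am assuming, not numbers from the linked article:

```python
# Rough estimate of the supply power of an LDR divider switching an LED.
# All component values are assumed typical orders of magnitude, not taken
# from the linked circuit.

V_SUPPLY = 5.0   # supply voltage, V
R_DARK = 1e6     # LDR resistance in the dark, Ohm (assumed)
R_LIT = 5e3      # LDR resistance under light, Ohm (assumed)
I_LED = 5e-3     # LED branch current when switched on, A (assumed)

p_dark = V_SUPPLY**2 / R_DARK                    # ~25 uW divider leakage
p_lit = V_SUPPLY**2 / R_LIT + V_SUPPLY * I_LED   # ~30 mW, mostly the LED

print(p_dark, p_lit)
```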
Thank you for your great input on experimental methods for determining the power consumption. However, I was looking for a way to theoretically estimate the power consumption from photodetector figures of merit. Do you have any experience with such estimations?
The figure of merit for a photodiode is simply the quantum efficiency (QE): at best, every absorbed photon yields exactly one electron. If you have a PD with 80% QE and shine, for example, P = 1 mW of light at 620 nm (2 eV) onto this PD, you generate a current of QE times P divided by the photon energy. The photon energy at 620 nm is 2 × 1.6 × 10^-19 J, so you generate a photocurrent of 2.5 × 10^15 electrons per second, which is 400 µA.
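As a quick sanity check, here is the same arithmetic in Python, using only the example values from this post:

```python
# Photocurrent of a photodiode from quantum efficiency:
# I = q * QE * P / E_photon. Inputs are the example values from this post.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
Q = 1.602e-19   # elementary charge, C

def photocurrent(qe, optical_power_w, wavelength_m):
    photon_energy = H * C / wavelength_m                  # J per photon
    electrons_per_second = qe * optical_power_w / photon_energy
    return electrons_per_second * Q                       # current in A

# 80% QE, 1 mW at 620 nm -> about 400 uA, as estimated above
print(photocurrent(0.8, 1e-3, 620e-9))
```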
But this is not what I would refer to as power consumption. It is the opposite: you are using the PD as a photocell and generating electrical power from light.
If you want to use a photoresistor instead, I have to disappoint you: a photoresistor neither generates nor sinks any power all by itself. Also, phototransistors are more complicated to model; they can be treated as a combination of a photodiode and a transistor.
Thank you again, Dr. Steinmeyer, for providing an example of the power generation of a photodiode. I was actually more interested in photodetectors that require a bias or gate voltage, and in how best to model their power consumption as QE/responsivity changes over the NIR-MWIR spectrum, say in an environment where you rely on having sufficient battery capacity to run the detector. In this case, could the power consumption simply be expressed as responsivity × input power × bias voltage?
The above consideration also holds when you reverse bias the PD. Just take a battery and connect it to the PD via a resistor. With a 9 V battery and a 1 kΩ resistor, the 400 µA photocurrent causes a voltage drop of 0.4 V across the resistor, which dissipates 160 µW there; the total power drawn from the battery is 9 V × 400 µA = 3.6 mW, with the remainder dissipated in the reverse-biased PD. Suitable photodiodes exist for wavelengths up to about 2.5 µm (strained-layer InGaAs).
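A short sketch of this estimate, using only the example numbers from this thread; the last lines also evaluate the responsivity × input power × bias voltage expression you asked about, which comes out close to the battery draw whenever the resistor drop is small compared to the bias:

```python
# Sketch of the reverse-biased example above: 9 V battery, 1 kOhm series
# resistor, 400 uA photocurrent. All values are the example numbers from
# this thread, not from a datasheet.

V_BAT = 9.0        # battery voltage, V
R_LOAD = 1e3       # series resistor, Ohm
I_PHOTO = 400e-6   # photocurrent, A (80% QE, 1 mW at 620 nm)

v_resistor = I_PHOTO * R_LOAD        # 0.4 V drop across the resistor
p_resistor = I_PHOTO**2 * R_LOAD     # 160 uW dissipated in the resistor
p_battery = V_BAT * I_PHOTO          # 3.6 mW total drawn from the battery
p_diode = p_battery - p_resistor     # ~3.44 mW dissipated in the PD itself

# The expression from the question: responsivity * optical power * bias.
RESPONSIVITY = 0.4   # A/W at 620 nm for 80% QE
P_OPT = 1e-3         # incident optical power, W
p_approx = RESPONSIVITY * P_OPT * V_BAT   # ~3.6 mW, close to p_battery

print(p_resistor, p_battery, p_approx)
```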
One cannot make such a general statement for a photoconductor without more technical detail on the detector. You need a data sheet that gives you the relationship between resistance and intensity on the detector. Practical use of these devices also typically requires some means of amplification, i.e., a transistor or op-amp, and that stage will draw much more power than the current through the photoconductor itself.
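If you do have such a resistance-vs-illumination curve, the ohmic dissipation in the photoconductor alone is just V²/R. A minimal sketch under that assumption; the sample points below are placeholders, not values for any real part:

```python
# Hedged sketch: bias power in the photoconductor alone, given a
# resistance-vs-illuminance relationship read off a datasheet.
# The sample points are placeholders, not data for any real device.

V_BIAS = 5.0  # applied bias voltage, V (assumed)

# (illuminance in lux, resistance in Ohm) pairs from a hypothetical datasheet
r_vs_lux = [(10, 100e3), (1000, 2e3)]

for lux, resistance in r_vs_lux:
    power = V_BIAS**2 / resistance  # ohmic dissipation in the photoconductor
    print(f"{lux} lux: {power * 1e3:.2f} mW")
```

Keep in mind that the amplifier stage will usually dominate the total budget, as noted above.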
Similarly, a phototransistor is essentially just a combination of a photodiode and a transistor, so you would need to know the characteristics of that transistor to estimate the power consumption, and you have to decide on a circuit topology for the device.
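For a first-order estimate along those lines, here is a minimal sketch treating the phototransistor as a photodiode feeding the base of a transistor; the gain, supply, and load values are assumptions for illustration, not data for any real part:

```python
# Hedged sketch: supply power drawn by a phototransistor in a simple
# common-emitter configuration, modeled as photodiode + transistor.
# All component values are assumptions for illustration.

V_CC = 5.0        # supply voltage, V (assumed)
R_C = 1e3         # collector load resistor, Ohm (assumed)
BETA = 300        # current gain of the internal transistor (assumed)
I_PHOTO = 10e-6   # photocurrent of the equivalent photodiode, A (assumed)

i_collector = min(BETA * I_PHOTO, V_CC / R_C)  # clamp at saturation
p_supply = V_CC * i_collector                  # power drawn from the supply

print(i_collector, p_supply)  # 3 mA, 15 mW for these assumed values
```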