How is the power budget of an optical communication system related to the bit rate? Is there any relation for this? How does the power budget compare for 10 Gbps and 2.5 Gbps?
As the bit rate increases for the same bandwidth-limited system (transmitter, fiber, receiver), the ISI increases and so does the BER. The receiver sensitivity degrades as the data rate increases, so the transmitted optical power has to be increased to achieve the required BER.
Can you please tell me the exact relation? Suppose I require -10 dBm transmitted power at 2.5 Gbps; how much power is required at 10 Gbps for the same performance, if all other factors are the same?
As pointed out by Mohamed, the key parameter is the detector sensitivity Ps, which is the lowest power required to achieve a given BER at a given bit rate B. The relationship between Ps, BER and B is rather easy to derive in the case of shot noise, assuming a Poisson process for the detection. We finally have BER = exp(-Ps / (h·f·B)), where h and f respectively denote the Planck constant and the optical frequency (c/lambda, if you prefer).
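As a rough illustration, here is a minimal Python sketch that inverts that expression to get the sensitivity Ps = -ln(BER)·h·f·B. The 1550 nm wavelength and the 1e-9 BER target are assumed example values, and the result is the idealized quantum limit, far better than any practical receiver.

```python
import math

H = 6.626e-34   # Planck constant (J*s)
C = 3.0e8       # speed of light (m/s)

def shot_noise_sensitivity_dbm(ber, bit_rate_hz, wavelength_m=1.55e-6):
    """Invert BER = exp(-Ps/(h*f*B)) to get Ps = -ln(BER)*h*f*B, then convert to dBm."""
    f = C / wavelength_m                       # optical frequency (Hz)
    ps_watts = -math.log(ber) * H * f * bit_rate_hz
    return 10 * math.log10(ps_watts / 1e-3)    # W -> dBm

# Assumed example: a 1e-9 BER target at the two bit rates under discussion
for b in (2.5e9, 10e9):
    print(f"{b / 1e9:4.1f} Gbps -> Ps ~ {shot_noise_sensitivity_dbm(1e-9, b):.1f} dBm")
```

Under these assumptions the required power at 10 Gbps comes out about 6 dB (4 times) higher than at 2.5 Gbps, which matches the linear scaling of Ps with B.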
Actually, there is no direct relation between power budget and bit rate. A lot of additional parameters have to be taken into account.
Link distance is limited by two factors: power budget and dispersion. Both of these depend on the fiber type, transmitter, receiver and coding.
Dispersion in an OC-192 (10 Gbps) system has a stronger influence on BER than in an OC-48 (2.5 Gbps) system, because of the wider signal bandwidth.
A simple approximation of this relation can be presented as follows:
Link distance = Bandwidth-distance product from dispersion (MHz·km) / Signal bandwidth (MHz), for a given BER = const.
Ex.
Dispersion = 10^-11 s/km, i.e. a bandwidth-distance product of 10^5 MHz·km
For OC-48: Link distance = 10^5 MHz·km / (2.5×10^3 MHz) = 40 km
For OC-192: Link distance = 10^5 MHz·km / (10×10^3 MHz) = 10 km
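For completeness, here is a minimal Python sketch of that back-of-the-envelope calculation, using only the 10 ps/km dispersion value from the example above.

```python
def dispersion_limited_distance_km(dispersion_s_per_km, signal_bandwidth_hz):
    """Link distance from a bandwidth-distance product B*L ~ 1/dispersion."""
    bw_distance_product_hz_km = 1.0 / dispersion_s_per_km   # Hz*km
    return bw_distance_product_hz_km / signal_bandwidth_hz  # km

dispersion = 1e-11  # s/km (10 ps/km, the example value above)
print(dispersion_limited_distance_km(dispersion, 2.5e9))  # OC-48  -> 40 km
print(dispersion_limited_distance_km(dispersion, 10e9))   # OC-192 -> 10 km
```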
After you have found these limits, you should check your power budget. If you need longer links, you can choose a better fiber with lower dispersion, or a lower bit rate. Playing with transmission power and receiver sensitivity will not give you the desired result. The power budget also depends on the signal attenuation in the optical fiber.
Ex.
Link distance = [Tx_power (dBm) - Rx_sensitivity (dBm)] / Fiber attenuation (dB/km)
According to these two examples, the link power budget is high enough for the OC-192 system, because it easily allows a 10 km link. However, for OC-48, where the dispersion limit is 40 km, it is not enough.
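Since the numeric values of this second example did not come through, here is a minimal Python sketch of the same attenuation-limited calculation. The -3 dBm Tx power, -18 dBm Rx sensitivity and 0.5 dB/km attenuation are assumed illustrative values (not from the original post), chosen only so the result falls between the two dispersion limits above.

```python
def power_budget_limited_distance_km(tx_power_dbm, rx_sensitivity_dbm,
                                     fiber_attenuation_db_per_km):
    """Attenuation-limited link distance: power budget (dB) / loss per km."""
    power_budget_db = tx_power_dbm - rx_sensitivity_dbm
    return power_budget_db / fiber_attenuation_db_per_km

# Assumed illustrative values, not from the original example
print(power_budget_limited_distance_km(-3.0, -18.0, 0.5))  # 15 dB budget -> 30 km
```

With these made-up numbers the attenuation limit is 30 km: comfortably more than the 10 km dispersion limit of OC-192, but less than the 40 km dispersion limit of OC-48.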
These are the basic fundamentals of optical link design. Usually there are many more parameters that should be taken into account. Note that optical fibers are designed with a proper balance between all parameters: better fibers have lower attenuation as well as lower dispersion, among other factors.
Therefore, bit rate versus link distance is the main trade-off in optical network design.
Moreover, if you design a long link with EDFA amplifiers, ASE noise should be taken into account too.
I have attached two files that may be useful for this topic; you can find a lot of additional information about this problem there. Good luck!
There is a simple (but often inaccurate) answer to your question.
If you are shot-noise or quantum limited, then the required power at the receiver is directly proportional to the bit rate. If you quadruple the bit rate from 2.5 to 10 Gbit/s, you will need 4 times higher power at the receiver (about 6 dB).
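As a rough worked answer to the original -10 dBm question under this shot-noise-limited assumption (a best case, as noted below):

```python
import math

def scaled_power_dbm(power_dbm_at_ref, ref_bit_rate_hz, new_bit_rate_hz):
    """Shot-noise-limited scaling: required power grows linearly with bit rate."""
    return power_dbm_at_ref + 10 * math.log10(new_bit_rate_hz / ref_bit_rate_hz)

# -10 dBm at 2.5 Gbps -> about -4 dBm at 10 Gbps (a ~6 dB increase)
print(scaled_power_dbm(-10.0, 2.5e9, 10e9))
```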
Unfortunately we are rarely shot noise limited. It is possible to approach the quantum limit with coherent detection, but this is not used in typical 2.5 or 10 Gbps direct modulation systems. More often, performance will be limited by electrical noise in the detector, or by spontaneous emission noise from optical amplifiers.
The simple linear relation between receiver power and bit rate can still hold approximately when certain kinds of electrical and device noise dominate, for example with many APD receivers.
For PIN receivers the relationship breaks down at higher data rates, and sensitivity decreases more rapidly with bit rate than predicted. A more linear relation is likely when an optical pre-amplifier is used.
Taras is correct to point out the importance of dispersion in limiting reach at higher data rates. However the dispersion and attenuation values in his examples are not representative of fibres I am familiar with.
The linear dispersion relationship he gives is similar to that describing modal dispersion in multimode fibre. However OC192 is a SONET format, and such systems typically operate over single mode fibre in which modal dispersion is absent, and chromatic dispersion dominates. Polarisation mode dispersion can be a problem in older fibre, but is less of a concern in more recent systems.
Chromatic dispersion limited reach decreases quadratically rather than inversely with bandwidth, and depends strongly on fibre type and operating wavelength. In standard single mode fibre such as Corning SMF-28, dispersion falls to zero near 1300 nm, rising to 16 ps/nm/km around 1550 nm. For 10 Gbit/s waveforms, dispersion up to at least 1000 ps/nm is often possible (for example 60 km of SMF-28 at 1550 nm). This is for a transmitter using an external modulator.
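A minimal sketch of that reach estimate, using only the figures quoted above (a ~1000 ps/nm tolerance at 10 Gbit/s and 16 ps/nm/km for SMF-28 at 1550 nm); the extrapolation to 2.5 Gbit/s simply applies the quadratic scaling mentioned earlier and ignores every other impairment.

```python
def chromatic_dispersion_reach_km(dispersion_tolerance_ps_nm,
                                  fiber_dispersion_ps_nm_km):
    """Reach limited by accumulated chromatic dispersion."""
    return dispersion_tolerance_ps_nm / fiber_dispersion_ps_nm_km

# Figures quoted above: ~1000 ps/nm tolerance at 10 Gbit/s,
# 16 ps/nm/km for SMF-28 at 1550 nm
reach_10g = chromatic_dispersion_reach_km(1000, 16)
print(reach_10g)                        # ~62 km, roughly the 60 km mentioned

# Quadratic scaling: a 4x lower bit rate tolerates ~16x the accumulated dispersion
reach_2g5 = reach_10g * (10 / 2.5) ** 2
print(reach_2g5)                        # ~1000 km (ignoring all other impairments)
```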
Many 2.5 Gbps transmitters use direct laser modulation, which introduces additional spectral broadening due to laser chirp. The dispersion penalty will be greater than predicted from the modulation rate alone.
Multimode fibres support much shorter distances. OM1 and OM2 fibres have minimum modal bandwidths of 200 to 500 MHz·km, rising to 4700 MHz·km for OM4 "laser optimised" fibre. Even with a carefully controlled VCSEL launch, 10 Gbps propagation is limited to 400 m.
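As a rough cross-check of that multimode figure, the same bandwidth-distance estimate used earlier can be applied to the modal bandwidths quoted above; this is only a ballpark estimate, not the standards-based reach.

```python
def modal_bandwidth_limited_distance_km(modal_bandwidth_mhz_km, signal_bandwidth_mhz):
    """Rough reach estimate from the fibre's modal bandwidth-distance product."""
    return modal_bandwidth_mhz_km / signal_bandwidth_mhz

# OM4 "laser optimised" fibre, ~4700 MHz*km, at a 10 Gbps signal rate
print(modal_bandwidth_limited_distance_km(4700, 10e3))  # ~0.47 km, same ballpark as 400 m
```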
My answer was deliberately simplified, for better understanding. I just showed how the influence of dispersion on the link depends on the bit rate, and also why there is no direct relation between power budget and bit rate. All parameters should be taken into account together, depending on the specific network scenario. Even if dispersion falls to zero, there are many non-linear phenomena that are affected by transmission power and bit rate. Whether the system is single-carrier or WDM also matters in terms of power budget.
Amit, if you describe in more detail which system you are dealing with, we will be able to explain which factors are important in your case. There are a lot of possible scenarios that use specific equipment (e.g. optical transport networks like SONET, GMPLS, or OBS have different equipment and thus different design issues compared to GEPON access networks). An optical transmission system is always designed according to the given circumstances, because of the limited choice of fibers, lasers, detectors, switches, amplifiers, etc.
The power budget is determined by a few parameters.
#1. TX optical power. If we want to keep the same power budget, we need to use higher TX optical power, because #2 and #3 below get worse at a higher bit rate.
#2. Power penalty from transmission. We need to allocate more of the optical power budget to impairments such as chromatic dispersion, PMD, etc. at a higher bit rate.
#3. RX sensitivity. This gets worse at a higher bit rate. In a very simplistic view, we need roughly the same number of photons per pulse to maintain the same signal-to-noise ratio. At a higher bit rate the pulse duration is shorter, so to keep the same number of photons per pulse, the optical power (the pulse energy, i.e. the number of photons, divided by the pulse duration) must be increased.
Power budget = #1 - #2 - #3
In a back-to-back case (#2 = 0) with the same TX power (#1 constant) and a shot-noise-limited RX sensitivity, the power budget decreases by 3 dB for every doubling of the bit rate. This is a little too simplistic, but still a good rule of thumb to start with.
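A minimal sketch of that bookkeeping, assuming the 3 dB-per-doubling rule of thumb above; the 0 dBm TX power and -28 dBm sensitivity at 2.5 Gbps are made-up illustrative numbers.

```python
import math

def rx_sensitivity_dbm(sensitivity_at_ref_dbm, ref_bit_rate_hz, bit_rate_hz):
    """Rule of thumb above: sensitivity degrades ~3 dB per doubling of bit rate."""
    return sensitivity_at_ref_dbm + 10 * math.log10(bit_rate_hz / ref_bit_rate_hz)

def power_budget_db(tx_power_dbm, transmission_penalty_db, rx_sens_dbm):
    """Power budget = #1 (TX power) - #2 (penalties) - #3 (RX sensitivity)."""
    return tx_power_dbm - transmission_penalty_db - rx_sens_dbm

# Assumed illustrative numbers: 0 dBm TX, -28 dBm sensitivity at 2.5 Gbps
sens_2g5 = -28.0
sens_10g = rx_sensitivity_dbm(sens_2g5, 2.5e9, 10e9)   # ~ -22 dBm (6 dB worse)
print(power_budget_db(0.0, 0.0, sens_2g5))   # back-to-back at 2.5 Gbps: 28 dB
print(power_budget_db(0.0, 0.0, sens_10g))   # back-to-back at 10 Gbps: ~22 dB
```

Going from 2.5 to 10 Gbps is two doublings, so the budget shrinks by about 6 dB with the TX power held constant.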
If the link is not limited by dispersion when you go from 2.5 Gb/s to 10 Gb/s (which is a weak assumption), then the analysis is based on the attenuation limit, and receiver noise is very important. Noise power is proportional to the receiver bandwidth. As Toyana and Robinson comment, you need to increase the optical power at the transmitter by about 6 dB. If you have control of the injection current of the laser driver, good.