Hello,
I want to find out the throughput of an Optical Interconnection Network with the help of a simulator like Booksim2 / Sniper.
By definition, throughput is the number of bits (or bytes) per second that can flow through a link.
In the case of an optical fiber, if the modulation rate is 10 GHz (i.e. 10 Gbit/s per wavelength) and the fiber carries 64 wavelengths, the total throughput is 640 Gbit/s.
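To make the arithmetic above explicit, here is a minimal sketch (the variable names are mine, not from any simulator) of how the aggregate WDM throughput is just the per-wavelength line rate times the number of wavelengths:

```python
# Aggregate throughput of a WDM link (illustrative sketch):
# each wavelength carries one modulated channel, and the channels
# are independent, so their rates simply add up.
line_rate_gbps = 10      # modulation rate per wavelength, in Gbit/s
num_wavelengths = 64     # number of WDM channels on the fiber

aggregate_gbps = line_rate_gbps * num_wavelengths
print(aggregate_gbps)    # 640 (Gbit/s)
```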
In many Optical Interconnect papers, I see that losses such as bending loss, insertion loss, etc. are considered. How can losses impact the throughput of an optical fiber? What is the mathematics behind that? If somebody can suggest any material about this, that would be a great help.
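My current (possibly incomplete) understanding of how those losses enter the math is through the link power budget: losses expressed in dB add along the optical path, and the received power must stay above the receiver's sensitivity for the target bit-error rate, otherwise the channel cannot sustain its nominal line rate and the usable throughput drops. A sketch with entirely hypothetical numbers:

```python
# Hypothetical link power budget (all values illustrative, not taken
# from any specific device or paper). Losses in dB accumulate
# additively; the remaining margin tells you whether the wavelength
# can operate at its nominal rate with the target BER.
launch_power_dbm = 0.0        # per-wavelength laser output power
insertion_loss_db = 3.0       # couplers, (de)multiplexers, connectors
bending_loss_db = 1.0         # loss from fiber/waveguide bends
propagation_loss_db = 0.5     # attenuation per unit length * length
rx_sensitivity_dbm = -20.0    # minimum power for the target BER

received_dbm = launch_power_dbm - (insertion_loss_db
                                   + bending_loss_db
                                   + propagation_loss_db)
margin_db = received_dbm - rx_sensitivity_dbm

print(received_dbm)  # -4.5  (dBm at the receiver)
print(margin_db)     # 15.5  (dB of margin above sensitivity)
```

If the margin goes negative (e.g. too many bends or hops accumulate loss), the receiver cannot meet the target BER at the full modulation rate, which is how loss translates into reduced effective throughput.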