I have designed a serial data link in mixed-signal 90nm CMOS using Cadence. I am wondering if there is a way to simulate the BER in Cadence. I did it in Matlab before, but it was a lot of work.
The serial link is composed of a transmitter, a receiver, and a channel. To test such a system by measuring the bit error rate as a function of signal-to-noise ratio, you must be able to vary the transmit power and build a channel model; the simplest is additive white Gaussian noise (AWGN), so you add a noise source to the signal at the transmitter output. You then receive the transmitted signal plus noise and recover the transmitted bits from the received signal. Depending on the signal-to-noise ratio, some bits will be received in error.
These bit errors can be counted by a digital comparator (XOR against the transmitted reference pattern plus an error counter) at the output of the receiver.
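To make that flow concrete, here is a minimal Python/NumPy sketch (not the Cadence testbench itself) that assumes a simple BPSK mapping and a threshold receiver: it adds white Gaussian noise to the transmitted stream, recovers the bits, and counts errors exactly as described above.

```python
import numpy as np

def ber_awgn_bpsk(snr_db, n_bits=100_000, seed=0):
    """Monte Carlo BER estimate for BPSK over an AWGN channel at a given SNR (dB)."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)                # transmitted bit stream
    tx = 2.0 * bits - 1.0                            # BPSK mapping: 0 -> -1, 1 -> +1 (unit signal power)
    snr_lin = 10.0 ** (snr_db / 10.0)
    noise_std = np.sqrt(1.0 / snr_lin)               # noise power = signal power / SNR
    rx = tx + noise_std * rng.normal(size=n_bits)    # channel: add white Gaussian noise
    decided = (rx > 0).astype(int)                   # receiver: simple threshold decision
    errors = np.count_nonzero(decided ^ bits)        # XOR against the reference pattern and count
    return errors / n_bits

# Sweep SNR to build the BER-vs-SNR performance curve
for snr in range(0, 11, 2):
    print(f"SNR = {snr:2d} dB  ->  BER = {ber_awgn_bpsk(snr):.2e}")
```

The same structure carries over to a Cadence testbench: a behavioral noise source in the channel, the actual transistor-level transmitter and receiver in place of the ideal blocks, and an XOR-plus-counter for the error count.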
However, you do not need to re-evaluate the bit error rate of the system if you have already determined this performance curve, especially when the sample resolution is the same: both implementations perform the same mathematical and logical processing.
Based on the signal-to-noise ratio, the Shannon capacity law, C = B·log2(1 + S/N), tells you the achievable data rate of the channel. The actual number of bits in error, however, depends on the modulation scheme you use or have used, such as BPSK, QPSK, 16-QAM, and so on.
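For reference, a short sketch of the standard textbook expressions, assuming Gray coding and the usual nearest-neighbour approximation for 16-QAM (function names here are just for illustration):

```python
from math import erfc, sqrt, log2

def q(x):
    """Gaussian Q-function."""
    return 0.5 * erfc(x / sqrt(2.0))

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * log2(1.0 + snr_linear)

for ebn0_db in range(0, 13, 2):
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    ber_bpsk_qpsk = q(sqrt(2.0 * ebn0))              # BPSK and Gray-coded QPSK, per bit
    ber_16qam = 0.375 * erfc(sqrt(0.4 * ebn0))       # Gray-coded 16-QAM, nearest-neighbour approx.
    print(f"Eb/N0 = {ebn0_db:2d} dB :  BPSK/QPSK {ber_bpsk_qpsk:.2e}   16-QAM {ber_16qam:.2e}")

print(f"Capacity of a 1 GHz channel at 20 dB SNR: {shannon_capacity(1e9, 100.0)/1e9:.2f} Gb/s")
```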