Dear all,

In wideband digital phased array radars, how are the true-time delays realized? Suppose the true-time delays are realized by digital signal processing: for example, the signal is s(n) = [1 2 3 4] and the sample interval is 80 ps (corresponding to a sampling rate of 12.5 GSa/s). Is the minimum time delay we can realize for s(n) then 80 ps, with the delayed signal being s1(n) = [4 1 2 3]? Does this mean that the time-delay precision is determined by the sampling rate? And are there any other parameters that affect the time-delay precision? Thanks very much for answering this question.
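To make clear what I mean by a one-sample delay, here is a minimal sketch of the example above, assuming the shift is modeled as a circular shift of the sampled sequence (the variable names are my own):

```python
import numpy as np

# Parameters from the example in the question.
fs = 12.5e9        # sampling rate, 12.5 GSa/s
Ts = 1.0 / fs      # sample interval, 80 ps
s = np.array([1, 2, 3, 4])

# Delay by one whole sample, i.e. a time delay of Ts = 80 ps,
# implemented here as a circular shift of the sequence.
s1 = np.roll(s, 1)

print(Ts)   # 8e-11 s
print(s1)   # [4 1 2 3]
```

So in this picture the smallest delay step I can see is one sample interval, which is what prompts the question about precision.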
