When I expand the phase mismatch Δβ for three signals interacting nonlinearly (four-wave mixing) in an optical fibre with dispersion slope, Δβ = β(f1) + β(f2) − β(f3) − β(f1 + f2 − f3), where the propagation constant β(f) is expanded in a Taylor series around a reference frequency f0, the result depends on f0. That seems unphysical, because the phase mismatch should not change according to the point about which you expand!
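For concreteness, the standard form of the expansion (assuming terms up to β3 are kept, with β1, β2, β3 evaluated at f0):

β(f) ≈ β0 + β1 · 2π(f − f0) + (β2/2) · [2π(f − f0)]^2 + (β3/6) · [2π(f − f0)]^3

Δβ = β(f1) + β(f2) − β(f3) − β(f1 + f2 − f3)

Note that the β0 and β1 terms cancel exactly in Δβ, because (f1 − f0) + (f2 − f0) − (f3 − f0) − (f1 + f2 − f3 − f0) = 0 for any f0, so any f0 dependence must come from the higher-order terms.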

Has anybody faced the same problem before?

When I plot the FWM mixing product computed with and without the f0 terms, I find a big difference.
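For illustration, a minimal numerical sketch of the kind of comparison I mean (the fibre parameters β2, β3 and the channel frequencies are just example values, and β2, β3 are held fixed while f0 moves):

import numpy as np

# Example fibre parameters (typical SMF order of magnitude), held fixed
beta2 = -21.7e-27   # s^2/m  (~ -21.7 ps^2/km)
beta3 = 1.3e-40     # s^3/m  (~ 0.13 ps^3/km)

def beta_taylor(f, f0):
    # Dispersive part of beta, Taylor-expanded around f0 and truncated after
    # the beta3 term; beta0 and beta1 are omitted since they cancel in Delta-beta.
    dw = 2 * np.pi * (f - f0)
    return 0.5 * beta2 * dw**2 + beta3 * dw**3 / 6.0

def delta_beta(f1, f2, f3, f0):
    # Phase mismatch of the FWM product generated at f4 = f1 + f2 - f3
    f4 = f1 + f2 - f3
    return (beta_taylor(f1, f0) + beta_taylor(f2, f0)
            - beta_taylor(f3, f0) - beta_taylor(f4, f0))

# Three channels near 193 THz (example values)
f1, f2, f3 = 193.0e12, 193.1e12, 193.3e12

for f0 in (193.1e12, 194.1e12):
    print(f"f0 = {f0 / 1e12:.1f} THz -> delta_beta = {delta_beta(f1, f2, f3, f0):.4e} 1/m")

With β2 and β3 kept constant while f0 changes, the two printouts differ, which is the kind of f0 dependence I am asking about.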
