I am implementing a microstrip band-pass filter with high-low impedance (stepped-impedance) sections. The line impedance varies between Zmin = 12 ohm and Zmax = 182 ohm. For the lowest-impedance section (12 ohm) and the other values typically below 100 ohm, when I simulate these individual strip sections in CST, the simulated line impedances are almost the same as required.

For the highest impedance, a 182-ohm transmission line, the minimum strip width I can allow is 0.100 mm. I used LineCalc from ADS to extract the corresponding substrate height h for 182 ohm and w = 0.100 mm, which turns out to be 2.4 mm at 11.75 GHz (the center frequency) with epsilon_r = 9.15. When I simulate this combination (w = 0.100 mm, h = 2.4 mm) in CST to verify the 182 ohm, the simulated line impedance comes out around 90 ohm. The same happens for all Z values higher than 100 ohm.

The problem shows up in the S-parameters as return loss: the required RL is 20 dB, but the simulated RL is around 12 dB. I want to know whether the high-impedance sections (Z > 100 ohm, which are not achieved in simulation) are contributing to the degraded RL, or whether other factors are involved. If the high-impedance sections are the cause, how can I implement them to achieve the desired 182 ohm?
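For reference, here is a quick quasi-static sanity check I run alongside both tools, using the standard Hammerstad closed-form microstrip equations (zero strip thickness, no dispersion, so it is only an approximation and not the model that LineCalc or CST uses internally):

import math

def microstrip_z0(w, h, er):
    # Quasi-static characteristic impedance of a microstrip line
    # (Hammerstad closed form, zero strip thickness, no dispersion).
    # w and h must be in the same units.
    u = w / h
    if u < 1:
        e_eff = (er + 1) / 2 + (er - 1) / 2 * (
            1 / math.sqrt(1 + 12 / u) + 0.04 * (1 - u) ** 2)
        return 60 / math.sqrt(e_eff) * math.log(8 / u + u / 4)
    e_eff = (er + 1) / 2 + (er - 1) / 2 / math.sqrt(1 + 12 / u)
    return (120 * math.pi / math.sqrt(e_eff)
            / (u + 1.393 + 0.667 * math.log(u + 1.444)))

# Geometry from my question: w = 0.100 mm, h = 2.4 mm, er = 9.15
print(microstrip_z0(0.100, 2.4, 9.15))

The point of this check is simply to see whether a textbook quasi-static estimate for such a narrow line on a thick, high-permittivity substrate sides with the LineCalc synthesis or with the CST result.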
