I am trying to simulate an LNOI waveguide (lithium niobate on SiO2). In the simulation I found that as I increase the core thickness (0.5 um to 0.9 um), the effective index (neff) increases. Commercially available LNOI wafers come with LN thicknesses from 500 nm to 900 nm. If a thicker core always gives a higher neff, wouldn't everyone just pick 0.9 um? Then what is the point of offering LNOI with LN thicknesses ranging from 500 nm to 900 nm?
A screenshot of the simulation (capture) is attached for reference.
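For what it's worth, the neff-vs-thickness trend I see can be reproduced with a simple 1D slab-waveguide sketch (a rough stand-in for the real ridge cross-section), solving the TE0 dispersion relation by bisection. The index values here are assumptions on my part (LN ordinary index ~2.21, SiO2 ~1.444, air cladding, at 1550 nm), not taken from my actual simulation:

```python
import math

def slab_neff_te0(d_um, wl_um=1.55, n_core=2.21, n_sub=1.444, n_clad=1.0):
    """TE0 effective index of an asymmetric slab (clad/core/substrate) by bisection."""
    k0 = 2 * math.pi / wl_um

    def resid(neff):
        kappa = k0 * math.sqrt(n_core**2 - neff**2)   # transverse wavenumber in core
        g_sub = k0 * math.sqrt(neff**2 - n_sub**2)    # decay constant in substrate
        g_clad = k0 * math.sqrt(neff**2 - n_clad**2)  # decay constant in cladding
        # TE0 dispersion relation: kappa*d = atan(g_sub/kappa) + atan(g_clad/kappa)
        return kappa * d_um - math.atan(g_sub / kappa) - math.atan(g_clad / kappa)

    lo, hi = n_sub + 1e-9, n_core - 1e-9
    if resid(lo) < 0:
        return None  # no guided TE0 mode at this thickness
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if resid(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for d in (0.5, 0.7, 0.9):
    print(f"d = {d} um -> neff = {slab_neff_te0(d):.4f}")
```

Running this gives a monotonically increasing neff from 0.5 um to 0.9 um, consistent with my full simulation.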
Thanks for reading. Please share your thoughts.