I'm required to make a jig that measures the divergence angle of a collimated beam. The setup is straightforward as shown below.
The single-mode fiber optic cable, collimator & NIR camera are the only components that cannot be changed (https://www.edmundoptics.com/p/nir-ccd-usb-2-camera/29505/)
Using the camera's SDK, my Python script captures the beam-profile image and applies SciPy's curve_fit function to calculate the beam divergence.
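Roughly, the fitting step looks like this (a minimal sketch; the saved file name, the collapse-to-1D step, and the initial guesses are placeholders, and the actual SDK capture call is omitted):

```python
import numpy as np
from PIL import Image
from scipy.optimize import curve_fit

def gaussian(x, amp, x0, sigma, offset):
    """1D Gaussian plus a constant background level."""
    return amp * np.exp(-((x - x0) ** 2) / (2.0 * sigma ** 2)) + offset

# Frame already captured via the SDK and saved to disk (placeholder file name)
frame = np.asarray(Image.open("beam_profile.png").convert("L"), dtype=float)
profile = frame.sum(axis=0)            # collapse rows into a horizontal profile
x = np.arange(profile.size)

# Rough initial guesses: height, peak position, width (px), background
p0 = [profile.max() - profile.min(), float(profile.argmax()), 10.0, profile.min()]
popt, _ = curve_fit(gaussian, x, profile, p0=p0)
amp, x0, sigma, offset = popt

# The 1/e^2 intensity radius of a Gaussian beam is w = 2*sigma
print(f"fitted 1/e^2 radius: {2.0 * abs(sigma):.2f} px")
```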
Currently, I only get the desired beam divergence angle if the spot image (grainy pattern) is taken ~40 mm away from the plano-convex lens's focal point, OR if the image is taken at the focal point but is severely saturated (flat-top profile).
My main problems/concerns are:
I keep getting a grainy beam profile, and I don't think it's reliable for checking beam divergence. I've searched online but have never come across this kind of profile before. At first, we thought the pattern was due to an uneven layer of phosphor coating on the CCD array. However, Edmund Optics said that "The phosphor coatings are usually very even but the emissivity of each phosphor atom would not be same hence is it common to get a grainy image."
Isn't it technically wrong not to take the image at the focal point? It goes against the formula Θ1 = y2/f (using the small-angle approximation and the optical invariant y2Θ2 = y1Θ1).
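For reference, this is the arithmetic I'm relying on (the focal length and spot radius below are placeholder numbers for illustration, not my actual values):

```python
import math

# Placeholder values; substitute the actual lens focal length and the
# spot radius obtained from the Gaussian fit at the focal plane.
f_mm = 100.0    # plano-convex lens focal length
y2_mm = 0.35    # fitted 1/e^2 spot radius at the focal plane

# Small-angle approximation: half-angle divergence of the collimated beam
theta1_rad = y2_mm / f_mm
print(f"half-angle divergence: {theta1_rad * 1e3:.2f} mrad "
      f"({math.degrees(theta1_rad):.3f} deg)")
```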
So, what should I do to get a uniform Gaussian profile?