I am working on continuous-time nonlinear dynamical systems that require a lot of numerical computation. Previously, I used 'for' loops to solve the system over a range of parameter values in order to investigate bifurcations and regions of stability. Then I switched to vectorization, which made my code faster. I have an NVIDIA GTX 1060 6 GB card installed on my PC, but I don't know whether GPU programming will help me get results faster. So I would like your suggestions on whether I should start learning GPU programming, and if the answer is yes, where I should start.
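For context, the kind of vectorized parameter sweep I mean can be sketched as below. This is a hypothetical minimal example (the pitchfork normal form dx/dt = r·x − x³, integrated with RK4 over a grid of r values simultaneously), not my actual system; the point is that the same array code could in principle run on the GPU, e.g. by replacing `numpy` with `cupy` if CuPy is installed:

```python
import numpy as np

# Pitchfork normal form: dx/dt = r*x - x**3 (hypothetical stand-in system).
def f(x, r):
    return r * x - x**3

r = np.linspace(-1.0, 1.0, 2001)   # parameter grid: one value of r per entry
x = np.full_like(r, 0.1)           # one trajectory per parameter value
dt, steps = 0.01, 5000

# Classic RK4, vectorized: each step advances ALL parameter values at once.
for _ in range(steps):
    k1 = f(x, r)
    k2 = f(x + 0.5 * dt * k1, r)
    k3 = f(x + 0.5 * dt * k2, r)
    k4 = f(x + dt * k3, r)
    x += (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# For r < 0 trajectories decay to the origin; for r > 0 they settle near sqrt(r),
# so the final states trace out the bifurcation diagram.
```

Since every array operation here is elementwise across the parameter grid, it maps directly onto GPU arrays, which is why I suspect my workload might benefit.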

Thank you.
