Automatic Differentiation (AD) is a method for calculating derivatives exactly and automatically. Calculus-level languages [e.g. Slang (NASA's Apollo…), PROSE (time-sharing mainframes), & FortranCalculus (PC users)] use AD to calculate the Jacobian and/or Hessian matrices on the fly, to the working precision of whatever computer you're running on ... for more, visit https://en.wikipedia.org/wiki/Automatic_differentiation
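To make the idea concrete, here is a minimal sketch of forward-mode AD in Python using dual numbers. It is illustrative only; it is not the mechanism inside Slang, PROSE, or FortranCalculus, and the function f and the values are made up for the example.

# Forward-mode AD sketch: each Dual carries a value and its derivative w.r.t. x.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u*v)' = u'*v + u*v'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1      # f'(x) = 6x + 2

x = Dual(2.0, 1.0)                    # seed dx/dx = 1
y = f(x)
print(y.value, y.deriv)               # 17.0 14.0 -- derivative is exact, not a finite difference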
Given the Jacobian or Hessian matrix, numerical methods built on it are stored in each compiler's library as solvers, so a user may have a dozen solvers to choose from. Pick one … add its name to a Find statement … compile & execute your problem … change the solver name in the Find statement and rerun. Compare the results after each run; after a few tries you should be sold on one solver over the others for your problem. Repeat this for each new problem: as problems change, so may the best solver.
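A rough analogue of that "try several solvers, compare, keep the best" workflow in Python, using SciPy rather than a FortranCalculus Find statement (the objective, starting point, and solver names here are just examples):

from scipy.optimize import minimize

def objective(p):
    a, b = p
    return (a - 3.0) ** 2 + (b + 1.0) ** 2   # known minimum at (3, -1)

x0 = [0.0, 0.0]
for method in ("Nelder-Mead", "BFGS", "Powell"):
    result = minimize(objective, x0, method=method)
    # Compare the answer, objective value, and cost of each solver
    print(f"{method:12s} x = {result.x}  f = {result.fun:.3e}  evals = {result.nfev}")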
Modeling, Simulation, & Optimization problem-solving is simplified by AD in a Calculus-level language/compiler. For more, visit http://fortrancalculus.info/apps/fc-compiler.html ... Solves Algebraic Equations through Ordinary Differential Equations. Equations may be Implicit, Non-Linear, Boundary Value Problems, etc. … most continuous models can be solved with the use of AD.