We collected data in an experiment and plotted it to obtain a curve on a graph. How do we find the equation of the curve from the graph? (The curve looks similar to the graph of a quadratic function.)
Since you speak about "collected data", I assume these are measurements in some sense, hence containing some kind of noise. If you are willing to make the standard assumption of Gaussian noise with expectation value 0 and constant variance, then least squares is the mathematically right way; practitioners use it anyway, simply because it is so simple. Nevertheless, without some well-founded reasoning about the physical/chemical/whatever background, finding the "true equation" is an illusion. You may well find some form that fits your data to within the order of its noise, and then you can be happy. There is software which holds a collection of models and automatically picks the one that is most plausible in this sense, but that says nothing about the "truth" of the model; overfitting is a danger here. In the simplest case, take a parabola model and fit it with the software you have at hand; there is a lot of least-squares software available in the public domain.
Whether a parabola is adequate could be seen, for example, from the second divided differences of your data: for an exact parabola they are constant, so for noisy data they should only scatter around a constant value.
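A minimal sketch of that divided-difference check (my own illustration, assuming the measurements sit in NumPy arrays; the data below are made up):

```python
import numpy as np

def second_divided_differences(x, y):
    """Second divided differences f[x_i, x_{i+1}, x_{i+2}].
    For data from an exact parabola a*x^2 + b*x + c these all equal
    the leading coefficient a; for noisy data they should scatter
    around a constant value."""
    first = (y[1:] - y[:-1]) / (x[1:] - x[:-1])            # f[x_i, x_{i+1}]
    second = (first[1:] - first[:-1]) / (x[2:] - x[:-2])   # f[x_i, x_{i+1}, x_{i+2}]
    return second

# Example: y = 2x^2 - 3x + 1 plus small noise -> values near 2
x = np.linspace(0.0, 5.0, 11)
y = 2*x**2 - 3*x + 1 + 0.01*np.random.randn(x.size)
print(second_divided_differences(x, y))
```

If the printed values drift systematically instead of scattering around one level, a parabola is probably the wrong model.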
Yes, Peter is right: if you really have data from an experiment, my approach will not help, because of the danger of "unphysical" fitting, i.e. fitting without any physical meaning (my method applies only to purely mathematical data with no connection to a real experiment).
You can always find a curve going through the experimental points, but this is generally unhelpful when there are many points. In that case, try to guess a "physical law" underlying your results and use polynomial regression to find the polynomial of a given degree that fits your data most accurately.
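A minimal sketch of polynomial regression with NumPy (the data here are invented; the degree should come from your physical reasoning, not from the data alone):

```python
import numpy as np

# Hypothetical measured data; replace with your own arrays.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 0.9, 2.8, 7.2, 13.1, 21.0])

degree = 2                          # chosen from prior knowledge
coeffs = np.polyfit(x, y, degree)   # least-squares polynomial fit
model = np.poly1d(coeffs)           # convenient callable polynomial

print("fitted coefficients (highest power first):", coeffs)
print("prediction at x = 2.5:", model(2.5))
```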
Mostly we do least-squares curve fitting. As suggested above by Gruenwald, assume your equation is f(x) = ax² + bx + c and find the values of a, b, and c by forming the normal equations and solving them.
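A sketch of that procedure (my own illustration with made-up data): build the design matrix for f(x) = ax² + bx + c and solve the normal equations AᵀA·p = Aᵀy.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 0.1, 1.1, 4.2, 8.9])

# Design matrix with columns x^2, x, 1 for f(x) = a*x^2 + b*x + c
A = np.column_stack([x**2, x, np.ones_like(x)])

# Normal equations: (A^T A) p = A^T y
p = np.linalg.solve(A.T @ A, A.T @ y)
a, b, c = p
print(f"f(x) = {a:.4f} x^2 + {b:.4f} x + {c:.4f}")
```

In practice `np.linalg.lstsq(A, y, rcond=None)` is numerically safer than forming AᵀA explicitly, but the normal equations show the textbook method directly.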
First, clearly specify your dependent and independent variables. Then propose a mathematical relationship that governs these variables. Usually there is a family of curves or well-known functions that can be matched to certain patterns in the data; this should give you an idea of the order of the curve and of any other constants to add to or remove from your initially specified model. Once you have some candidate models, fit them to the measured data set and estimate the parameters of interest. The quality of the resulting fit tells you how appropriate the model is; here you can use statistics, e.g. R-squared, mean squared error (MSE), etc., to judge the model's goodness of fit to the measured data.
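A minimal sketch of that comparison step (the data and candidate models are illustrative): fit two candidates and score each with R-squared and MSE.

```python
import numpy as np

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat)**2)        # residual sum of squares
    ss_tot = np.sum((y - np.mean(y))**2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

def mse(y, y_hat):
    return np.mean((y - y_hat)**2)

# Illustrative data with some noise
x = np.linspace(1.0, 10.0, 20)
y = 3.0 * x**1.5 + np.random.normal(0.0, 2.0, x.size)

# Candidate 1: straight line; candidate 2: quadratic
for degree in (1, 2):
    y_hat = np.poly1d(np.polyfit(x, y, degree))(x)
    print(f"degree {degree}: R^2 = {r_squared(y, y_hat):.4f}, "
          f"MSE = {mse(y, y_hat):.4f}")
```

Keep in mind that a higher-degree polynomial always scores at least as well on the training data, so these statistics alone do not protect against the overfitting mentioned above.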
Two situations should be distinguished:
1. the governing equation is already known and all we want is to determine the numerical values of its parameters, or
2. we want to guess "the curve" best approximating our data, possibly without any relation to the underlying mechanism generating them. This is, among others, the case of instrument calibration.
In the first case preparing a graph is not necessary at all; we need the numerical data and a proper fitting routine. Of course, it is nice to see the final outcome, i.e. the original data and the fitted curve, as a graph.
In the second case it may be useful to visualize the original data in a few arrangements, say using various scales on the "x" or "y" axis (or both): linear, logarithmic, x → 1/x, x → x − const, and so on. Seeing a straight line is a good hint as to what class of functions would be appropriate to describe (model) the data, be it a power function (including the hyperbolic case), an exponential, a logarithm, or something else; a transformed-scale example is sketched below. One might also consider modeling "calibration" data by rational approximation (a ratio of two polynomials). Splines are not preferable in this case, as we need as small a set of parameters as possible.
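A minimal sketch of the re-scaling idea (the data are invented): for a power law y = A·xᵏ, log y plotted against log x is a straight line with slope k, so a straight-line fit on log-log axes identifies both the function class and its parameters.

```python
import numpy as np

# Illustrative "calibration" data following y = 2.5 * x^1.8 with
# small multiplicative noise
x = np.linspace(1.0, 50.0, 25)
y = 2.5 * x**1.8 * np.exp(np.random.normal(0.0, 0.02, x.size))

# A power law becomes a straight line on log-log axes:
# log y = log A + k * log x
k, logA = np.polyfit(np.log(x), np.log(y), 1)
print(f"estimated exponent k = {k:.3f}, prefactor A = {np.exp(logA):.3f}")
```

The same trick works for exponentials (straight line on semi-log axes) and for the other transformations listed above.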
Landauer may be correct: one fitted equation can correspond to quite different graphs. Please refer to Anscombe's regression data sets, in which the data sets have the same mean and variance of x, the same mean and variance of y, the same fitted regression line, and the same residual sum of squares. Anscombe created these data sets to emphasize the need for the statistician to look carefully at the data.
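A quick way to see this yourself (assuming seaborn is available, since it ships Anscombe's quartet as a sample dataset fetched on first use): compute the summary statistics per set and note that they are nearly identical, while scatter plots of the four sets look completely different.

```python
import numpy as np
import seaborn as sns

# Anscombe's quartet ships with seaborn as a sample dataset
df = sns.load_dataset("anscombe")

for name, g in df.groupby("dataset"):
    slope, intercept = np.polyfit(g["x"], g["y"], 1)
    print(f"set {name}: mean x = {g['x'].mean():.2f}, var x = {g['x'].var():.2f}, "
          f"mean y = {g['y'].mean():.2f}, var y = {g['y'].var():.2f}, "
          f"fit y = {slope:.2f} x + {intercept:.2f}")

# The printed numbers are nearly identical for all four sets, yet the
# plots differ radically: sns.lmplot(data=df, x="x", y="y", col="dataset")
```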
It is recommended to use fractal mathematics to develop the formula.
Fractal mathematics was pioneered by Benoit Mandelbrot. The equation takes the form Z = Z² + C, where C is a constant and Z is an iterated value. Ref: #1) http://www.wakingtimes.com/2015/06/16/using-fractal-math-to-make-sense-of-the-roughness-of-nature/. #2) http://www.fractalcurves.com/
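For what it is worth, a minimal sketch of the iteration named above (purely illustrative; this is the Mandelbrot recurrence itself, not a curve-fitting routine):

```python
def stays_bounded(c, n=20):
    """Iterate z -> z**2 + c from z = 0 and report whether |z| stays bounded."""
    z = 0j
    for _ in range(n):
        z = z*z + c
        if abs(z) > 2.0:   # once |z| exceeds 2 the orbit certainly diverges
            return False
    return True

print(stays_bounded(-0.5 + 0.5j))  # True: bounded orbit
print(stays_bounded(1.0 + 0.0j))   # False: diverges (0, 1, 2, 5, ...)
```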
Curve tracing comprises methods that can be used to get a rough idea of the overall shape of a plane curve from its equation, without computing the large number of points required for an accurate plot.