This is a typical optimization problem, and many learning algorithms can be applied to it, e.g. deep neural networks or logistic regression. I have seen a more complicated case in which TensorFlow is used to solve a wave-propagation problem; the methodology and results are given in the TensorFlow documentation examples. I myself tried using a ConvNet to solve the linear scalar wave equation, with the same method.
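As a minimal illustration of that optimization view (a plain-Python sketch rather than TensorFlow, with the hypothetical quadratic x^2 - 3x + 2 chosen only for the example), a root can be found by gradient descent on the squared residual f(x)^2:

```python
# Hypothetical example: f(x) = x^2 - 3x + 2, whose roots are x = 1 and x = 2.
def f(x):
    return x**2 - 3.0 * x + 2.0

def grad_loss(x):
    # The loss is f(x)^2; its gradient is 2 * f(x) * f'(x), with f'(x) = 2x - 3.
    return 2.0 * f(x) * (2.0 * x - 3.0)

x = 3.0    # initial guess
lr = 0.01  # learning rate
for _ in range(2000):
    x -= lr * grad_loss(x)

root = x  # descent from x = 3 converges to the nearby root x = 2
```

Which root is found depends on the initial guess; this is exactly the local-minimum behaviour any gradient-based learner would exhibit on this loss.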
Check out Section "2. Linear Least Squares" (about learning based on the LMS optimization criterion) from:
[Refer to incremental/recursive methods in contrast to batch ones] Stulp et al., "Many regression algorithms, one unified model - A review", 2015 - https://hal.archives-ouvertes.fr/hal-01162281v2/document
About learning based on the TLS optimisation criterion as an alternative, follow:
Van Huffel et al., "Algebraic Relationships Between Classical Regression and Total Least-Squares Estimation", 1987 - https://core.ac.uk/download/pdf/82422441.pdf
Markovsky et al., "Overview of total least-squares methods", 2007 - http://people.duke.edu/~hpgavin/SystemID/References/Markovsky+VanHuffel-SP-2007.pdf
M. Pesta, "Total Least Squares Approach in Regression Methods", 2008 - https://www.mff.cuni.cz/veda/konference/wds/proc/pdf08/WDS08_115_m4_Pesta.pdf
I have looked at the recommended papers. Just to be more specific: I am looking for a reference in which a quadratic equation has been solved using machine learning, say using TensorFlow, rather than generic papers.
After some investigation, it turns out to be difficult to find practical references in which a quadratic equation is solved using machine learning. A very short introduction using TensorFlow is available as an exercise from http://www.sfs.uni-tuebingen.de/~ddekok/dl4nlp/tensorflow-in-class.
In fact, most authors implement their own learning algorithms in the programming language of their choice. Some concrete examples:
Mishra et al., "An Energy Function Approach for finding Roots of Characteristic Equation", 2011 - http://ictactjournals.in/paper/IJSC_Vol2_Iss1_237_243.pdf
Mourrain et al., "Determining the Number of Real Roots of Polynomials through Neural Networks", 2006 - http://www.math.upatras.gr/~dtas/papers/MourrainPTV2005.pdf
Huang et al., "A Neural Root Finder of Polynomials Based on Root Moments", 2006 - http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.623.9541&rep=rep1&type=pdf
Guo et al., "The Neural-Network Approaches to Solve Nonlinear Equation", 2010 - http://www.jcomputers.us/vol5/jcp0503-11.pdf
Finally, one practical answer to the question "which machine learning can be used to solve a simple quadratic equation with one variable" is multivariate regression-based learning for eigenvalue problem solving.
This suggestion relies on the following interconnections:
1) Linking root-finding to eigen-solving. Follow:
Pan et al., "Root-finding with Eigen-solving", 2007 - https://pdfs.semanticscholar.org/14e9/3345e4980a9c4e4034eb5e8e8c86d594b653.pdf
Dreesen et al., " Back to the Roots: Polynomial System Solving, Linear Algebra, Systems Theory ", 2012 -
Conference Paper Back to the Roots: Polynomial System Solving, Linear Algebra...
Dreesen et al., " Back to the Roots – From Polynomial System Solving to Linear Algebra ", 2011 - ftp://ftp.esat.kuleuven.ac.be/stadius/pdreesen/pdreesenifac11.pdf
Emiris et al., " Algebraic Algorithms ", 2012 - https://hal.inria.fr/hal-00776270/document
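The link in 1) can be made concrete for the quadratic case: the roots of a monic quadratic x^2 + b*x + c are exactly the eigenvalues of its 2x2 companion matrix, so any eigen-solver doubles as a root-finder. A minimal NumPy sketch, using x^2 - 3x + 2 as an assumed example:

```python
import numpy as np

# For a monic quadratic x^2 + b*x + c, the companion matrix is
#   [[0, -c],
#    [1, -b]]
# and its characteristic polynomial is lambda^2 + b*lambda + c,
# so its eigenvalues are exactly the roots of the quadratic.
b, c = -3.0, 2.0            # x^2 - 3x + 2 = (x - 1)(x - 2)
C = np.array([[0.0, -c],
              [1.0, -b]])
roots = np.sort(np.linalg.eigvals(C).real)
# roots -> [1.0, 2.0]
```

This is the same construction `numpy.roots` uses internally for polynomials of any degree.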
2) Linking the eigenvalue problem in machine learning to a sparse least-squares formulation:
Sun et al., "A Least Squares Formulation for a Class of Generalized Eigenvalue Problems in Machine Learning", 2009 - http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.150.6010&rep=rep1&type=pdf
3) Linking the sparse least-squares formulation to LSQR (similar in style to CG, the conjugate gradient method):
Paige et al., "LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares", 1982 - https://web.stanford.edu/class/cme324/paige-saunders2.pdf
Because such a solution relies on linear algebra, numerous related tools and libraries may be used for the numerical implementation.
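As one such numerical sketch (using NumPy's dense `lstsq` as a stand-in for the sparse LSQR of Paige and Saunders, and a synthetic quadratic assumed purely for illustration), the least-squares step looks like:

```python
import numpy as np

# Least-squares recovery of quadratic coefficients from samples of
# y = x^2 - 3x + 2 (a dense stand-in for the sparse LSQR algorithm).
x = np.linspace(-2.0, 4.0, 50)
y = x**2 - 3.0 * x + 2.0

# Design matrix with columns [x^2, x, 1].
A = np.column_stack([x**2, x, np.ones_like(x)])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
# coeffs -> approximately [1.0, -3.0, 2.0]
```

For genuinely sparse systems, `scipy.sparse.linalg.lsqr` implements the actual Paige-Saunders iteration with the same fitting role.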
Let me be even more specific: what do I need to do to solve a quadratic equation using TensorFlow - how many layers, what network structure, and what functions should I use?
The answer is not unique. What really is a network? In linear algebra, it may be a simple matrix (a 2D lattice)...
The structure of the network depends closely on the method adopted (conditioned by your requirements) to solve the problem. Such a structure may be one of those (refer to Fig. 1 & 2) suggested by Huang et al., "Neural Networks with Problem Decomposition for Finding Real Roots of Polynomials", 2001 - http://booksc.org/book/30840919/2d1a03
Additional reference by the same author:
Huang et al., " A Neural Root Finder of Polynomials Based on Root Moments ", 2006 - http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.623.9541&rep=rep1&type=pdf
Another architecture is suggested by Goulianas et al.:
Goulianas et al., "Solving Polynomial Systems Using a Fast Adaptive Back Propagation-type Neural Network Algorithm", 2017 - https://aetos.it.teithe.gr/~gouliana/en/2017-polynomials.pdf
The publications cited above use alternative decomposition structures in their layers (QR decomposition, SVD, LSQR, etc.).
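For a concrete starting point on the structure question, here is a sketch only, in plain NumPy: the single hidden layer of 32 tanh units, the learning rate, and the training length are illustrative assumptions of mine, not prescriptions from the papers above. It trains a small network to map coefficients (b, c) of x^2 + b*x + c to its two real roots, with training pairs generated from Vieta's formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: sample sorted real root pairs, then derive the
# coefficients (b, c) of x^2 + b*x + c via Vieta's formulas.
r = np.sort(rng.uniform(-1.0, 1.0, size=(2000, 2)), axis=1)      # targets
X = np.column_stack([-(r[:, 0] + r[:, 1]), r[:, 0] * r[:, 1]])   # inputs (b, c)

# One hidden layer of 32 tanh units (illustrative choice).
W1 = rng.normal(0.0, 0.5, (2, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 2)); b2 = np.zeros(2)
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

mse0 = float(np.mean((forward(X)[1] - r) ** 2))  # loss before training

# Full-batch gradient descent on the mean squared error.
for _ in range(1000):
    h, out = forward(X)
    err = out - r
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h**2)          # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((forward(X)[1] - r) ** 2))  # loss after training
```

The same architecture transcribes directly to TensorFlow (one `Dense(32, activation='tanh')` layer followed by `Dense(2)`, trained with MSE); the point is that even a very small network suffices for this smooth two-input, two-output mapping.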
A new suggestion, based on an alternative point of view: start from the equivalence between regression trees and multilayered neural networks equipped with particular connection weights, as shown by Biau et al.:
Biau et al., " Neural Random Forests ", 2010 - https://hal.archives-ouvertes.fr/hal-01306340/document
This opens new perspectives towards subdivision-tree search in conformal spaces. Follow:
[Tree-search] Garcia-Zapata et al., " An Adaptive Subdivision Method for Root Finding of Univariate Polynomials ", 2018 - https://booksc.org/book/74283668/8c7223
[Tree-search] Kobel et al., " Computing Real Roots of Real Polynomials ... and Now For Real! ", 2016 - https://arxiv.org/pdf/1605.00410.pdf
[Conformal Mapping] P. Henrici, " Applied and Computational Complex Analysis: Power Series, Integration, Conformal Mapping, Location of Zeros ", 1974 - https://b-ok.cc/book/536061/a0193d
L. Penaranda, " Non-linear Computational Geometry for Planar Algebraic Curves ", PhD Dissertation, 2010 - https://tel.archives-ouvertes.fr/tel-00547829/document
Moreover, recurrent neural network architectures might be helpful as reported by Z. Zhao, " Machine Learning and Real Roots of Polynomials ", 2019 - https://www.math.ucdavis.edu/files/1415/5249/2664/thesis-ZekaiZhao-Final.pdf
Thanks Mohammad Amin Motamedi. In addition, my last post is to be linked with its related one: https://www.researchgate.net/post/Did_somebody_used_machine_learning_to_model_simple_operations_like_summation_division_multiplication_square_root