With a small sample size you will still get an answer, but the goodness-of-fit measure (usually R²) can be unreliable. In such cases, you may prefer Adjusted R², which adjusts for the smaller number of degrees of freedom available for error. (If Adjusted R² is very small, or even negative, then you haven't got a good fit.)
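As a rough illustration, here is a minimal Python sketch of the adjustment; the formula is standard, but the sample values (n = 10, three predictors) are only assumptions for the example.

```python
# Minimal sketch: adjusted R-squared from plain R-squared.
def adjusted_r2(r2: float, n: int, p: int) -> float:
    """Penalize R-squared for the degrees of freedom used by the p predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# With n = 10 observations and p = 3 predictors (illustrative numbers),
# a seemingly decent R-squared shrinks noticeably after adjustment.
print(adjusted_r2(0.80, n=10, p=3))  # ~0.70
print(adjusted_r2(0.40, n=10, p=3))  # ~0.10 -> a poor fit once adjusted
```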
It's worth just looking at your results: create a scatter plot of actual Y values against predicted Y values. If it doesn't look okay, then it's not okay.
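For example, with matplotlib (the toy data here is purely illustrative; substitute your own actual and predicted values):

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy data standing in for your own sample (hypothetical values).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=10)
y_actual = 2.0 * x + rng.normal(0, 2, size=10)

# Simple OLS fit and in-sample predictions.
slope, intercept = np.polyfit(x, y_actual, deg=1)
y_pred = slope * x + intercept

# Actual vs. predicted: points should hug the 45-degree line.
plt.scatter(y_actual, y_pred)
lims = [y_actual.min(), y_actual.max()]
plt.plot(lims, lims, linestyle="--")  # reference line y = x
plt.xlabel("Actual Y")
plt.ylabel("Predicted Y")
plt.show()
```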
Also, you can check the predictive validity of your model using "leave-one-out cross-validation". With this method, you leave out one of your observations, so you have only 9 instead of your original 10. Run the regression and use the resulting model to forecast the value of the observation that was left out. Put that observation back in and then pull out another single observation. Repeat for all observations, so you end up with 10 predicted values. Record the predicted values and correlate them with the actual values. Most statistical packages can run this kind of cross-validation automatically as an option.
(Traditionally, leaving one out means a random selection of single values, but with a small sample you systematically leave out each observation in turn. This is often called a jackknife because you take the top observation off, use it, and then fold it back into the bottom of the data list.)
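In Python, for instance, scikit-learn's LeaveOneOut can automate the whole loop described above; the data below is a made-up stand-in for your own 10 observations:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Toy data: 10 observations (hypothetical values).
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(10, 1))
y = 2.0 * X[:, 0] + rng.normal(0, 2, size=10)

# Leave-one-out: each observation is predicted by a model fit on the other 9.
y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())

# Correlate the 10 held-out predictions with the actual values.
r = np.corrcoef(y, y_loo)[0, 1]
print(f"LOO correlation: {r:.3f}")
```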
Instead of ordinary least squares, you can consider partial least squares (PLS). PLS makes somewhat different assumptions about the nature of the data and can be helpful with small samples; it is also robust when predictors are highly correlated or even outnumber the observations.
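A hedged sketch with scikit-learn's PLSRegression (the data and the choice of two components are assumptions for illustration):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Toy data: more predictors than is comfortable for n = 10 (hypothetical values).
rng = np.random.default_rng(2)
X = rng.normal(size=(10, 5))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.5, size=10)

# PLS projects X onto a few latent components before regressing,
# which is why it copes better than OLS when n is small relative to p.
pls = PLSRegression(n_components=2)
pls.fit(X, y)
print(pls.score(X, y))  # in-sample R-squared
```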
Finally, you can often get smaller standard errors on your parameter estimates by using Bayesian regression.
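For instance, scikit-learn's BayesianRidge shrinks the coefficients toward zero via its priors, which often tightens the parameter estimates relative to OLS on a small sample (toy data again assumed):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Toy data standing in for a small sample (hypothetical values).
rng = np.random.default_rng(3)
X = rng.normal(size=(10, 3))
y = 1.5 * X[:, 0] + rng.normal(0, 0.5, size=10)

model = BayesianRidge()  # priors on the coefficients and the noise variance
model.fit(X, y)

print(model.coef_)                      # shrunken coefficient estimates
print(np.sqrt(np.diag(model.sigma_)))   # posterior std. devs. of the coefficients
```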