If you have a calibration curve, you can use the LINEST function in Excel to determine the uncertainty on the slope and the intercept. Take a look at this video: https://www.youtube.com/watch?v=Y-rfyeutais.
LOD = (Intercept + 3*SD of the intercept)/slope
LOQ = (Intercept + 10*SD of the intercept)/slope
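For anyone who prefers to check this outside Excel, here is a minimal Python sketch of the same calculation. SciPy's linregress reports the same standard errors that LINEST does (the intercept_stderr attribute needs SciPy 1.6 or newer), and the concentration and signal values below are made-up placeholders:

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical calibration data: concentration (x) vs. instrument response (y)
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])            # e.g. mg/L
signal = np.array([0.052, 0.101, 0.198, 0.405, 0.803])

fit = linregress(conc, signal)          # ordinary least-squares fit
sd_intercept = fit.intercept_stderr     # standard error of the intercept (SciPy >= 1.6)

# LOD and LOQ from the formulas above (result in concentration units)
lod = (fit.intercept + 3 * sd_intercept) / fit.slope
loq = (fit.intercept + 10 * sd_intercept) / fit.slope

print(f"slope     = {fit.slope:.4f} +/- {fit.stderr:.4f}")
print(f"intercept = {fit.intercept:.4f} +/- {sd_intercept:.4f}")
print(f"LOD = {lod:.3f}, LOQ = {loq:.3f}")
```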
To put errors on the LOD and LOQ, you have to repeat your calibration curve three times and compute an LOD and LOQ from each curve.
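A possible way to do that, continuing the sketch above (the three sets of replicate signals are again placeholders): fit each curve separately, compute LOD/LOQ per curve, then report the mean and standard deviation.

```python
import numpy as np
from scipy.stats import linregress

def lod_loq(conc, signal):
    """LOD and LOQ from one calibration curve, per the formulas above."""
    fit = linregress(conc, signal)
    lod = (fit.intercept + 3 * fit.intercept_stderr) / fit.slope
    loq = (fit.intercept + 10 * fit.intercept_stderr) / fit.slope
    return lod, loq

# Three replicate calibration curves measured on the same standards (hypothetical data)
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
replicates = [
    np.array([0.052, 0.101, 0.198, 0.405, 0.803]),
    np.array([0.049, 0.103, 0.202, 0.398, 0.810]),
    np.array([0.055, 0.098, 0.195, 0.410, 0.797]),
]

results = np.array([lod_loq(conc, y) for y in replicates])  # one (LOD, LOQ) pair per curve
print("LOD = %.3f +/- %.3f" % (results[:, 0].mean(), results[:, 0].std(ddof=1)))
print("LOQ = %.3f +/- %.3f" % (results[:, 1].mean(), results[:, 1].std(ddof=1)))
```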
One of the keys that Dr. Bou-Maroun touches on is that you MUST have replicates at or near the LOD and LOQ in order to calculate the variance. While you can estimate the LOD (and by extension the LOQ) from a calibration curve, I have always found it preferable to run through a calculation similar to that found in the United States Environmental Protection Agency's 40 CFR 136 Appendix B, which is a statistically based approach to calculating the LOD. If applied correctly, this allows you to at least have a defensible calculation of the error at or near the calculated LOD.

The LOQ, of course, is defined by the LOD and/or defined as your lowest calibration point, so the error about that point is relatively easy to determine (multiple runs of your low standard). If you only have a single analysis at each point on a single calibration curve, it becomes much more difficult to accurately determine the error at that point.
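For reference, here is a rough sketch of the core statistic in the 40 CFR 136 Appendix B approach: the MDL is the sample standard deviation of at least seven replicate low-level spiked samples multiplied by the one-sided Student's t value at 99% confidence with n-1 degrees of freedom. The replicate values below are placeholders, and the regulation adds further requirements (spike-level checks, and blank-based limits in the 2017 revision), so treat this only as the central calculation:

```python
import numpy as np
from scipy import stats

# Hypothetical replicate analyses of a sample spiked near the estimated detection limit
replicates = np.array([0.48, 0.53, 0.45, 0.51, 0.47, 0.55, 0.50])

n = len(replicates)
s = replicates.std(ddof=1)             # sample standard deviation of the replicates
t99 = stats.t.ppf(0.99, df=n - 1)      # one-sided 99% Student's t, n-1 degrees of freedom
mdl = t99 * s                          # MDL-style detection limit

print(f"n = {n}, s = {s:.3f}, t = {t99:.3f}, MDL = {mdl:.3f}")
```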