It is not really clear what you are asking. When drawing error bars in Excel, the program offers options like "standard error", "percentage" and "standard deviation". NONE of these options is sensible for data analysis (*). Instead, you should calculate the measure of precision, uncertainty, or error that you want to show. Then always select the last option ("More options..." or similar) and specify where Excel can find the lengths of the error bars.
(*) Actually, Excel calculates the standard deviation and standard error as a single value from the given Y-values, so it assumes that the Y-values are individual values from one single sample. Showing this standard deviation or error at each of these individual values is not sensible. Typically, the values shown are averages, and the error bars should indicate how much the individual values (or estimates) scatter around each average. In Excel this can only be achieved by calculating these dispersion parameters yourself and telling Excel to use the results as the lengths of the error bars.
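The workflow above (compute the per-group averages and dispersions yourself, then feed them to the chart) can be sketched outside Excel as well. The group names and values here are purely hypothetical, just to show the per-group calculation:

```python
import statistics

# Hypothetical replicate measurements for three groups (names are illustrative).
groups = {
    "control": [4.1, 3.9, 4.4, 4.0],
    "dose_1":  [5.2, 5.6, 5.0, 5.4],
    "dose_2":  [6.8, 7.1, 6.5, 7.0],
}

# For each group: the average is the value to plot, and the sample
# standard deviation is one candidate for the error-bar length.
summary = {}
for name, values in groups.items():
    mean = statistics.mean(values)
    sd = statistics.stdev(values)  # sample SD (n - 1 in the denominator)
    summary[name] = (mean, sd)
    print(f"{name}: mean={mean:.2f}, SD={sd:.2f}")
```

In Excel you would put these computed means and SDs in their own cell ranges and point "Custom" error bars at the SD range.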
First one - STDEV - indicates the standard deviation: the deviation of the observations within a particular group from their mean value, while
Second one - STDEV/SQRT(number of observations) - indicates the SEM, the Standard Error of the Mean: the standard deviation of the sample mean as an estimate of the population mean.
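The distinction between the two can be made concrete with a small calculation on made-up numbers (the sample here is hypothetical):

```python
import math
import statistics

sample = [2.0, 3.0, 5.0, 6.0, 4.0]  # hypothetical observations from one group

sd = statistics.stdev(sample)        # spread of the individual observations
sem = sd / math.sqrt(len(sample))    # uncertainty of the sample mean itself

print(f"SD = {sd:.3f}, SEM = {sem:.3f}")
```

Note that the SEM shrinks as the sample size grows, while the SD estimates a fixed property of the population and does not systematically shrink.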
First, I believe it is important that you decide what type of interval you are interested in: do you want a confidence interval for the mean, a tolerance interval, or a prediction interval? ... The book "Statistical Intervals" by Hahn & Meeker should be helpful.
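As one example of the first type of interval mentioned above, a t-based confidence interval for the mean can be sketched as follows. The data are hypothetical, and the t critical value is hard-coded from a t table since the Python standard library has no t-distribution (a library such as SciPy could compute it via `scipy.stats.t.ppf(0.975, n - 1)`):

```python
import math
import statistics

sample = [2.0, 3.0, 5.0, 6.0, 4.0]   # hypothetical data
n = len(sample)
mean = statistics.mean(sample)
sem = statistics.stdev(sample) / math.sqrt(n)

# 95% confidence interval for the population mean.
# t critical value for df = n - 1 = 4 at the 97.5th percentile,
# taken from a t table (assumption: two-sided 95% coverage).
t_crit = 2.776
ci = (mean - t_crit * sem, mean + t_crit * sem)
print(f"95% CI for the mean: ({ci[0]:.3f}, {ci[1]:.3f})")
```

A tolerance or prediction interval would be wider, because it must cover individual future observations rather than just the mean.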