If you have only two variables in your regression equation (i.e., a single independent variable), then the standardized regression coefficient will be equal to the correlation by definition.
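For completeness, the identity is one line of algebra. In the sketch below, $\hat{b}$ is the metric slope and $s_X$, $s_Y$ are the sample standard deviations:

$$\hat{\beta}_{\mathrm{std}} \;=\; \hat{b}\,\frac{s_X}{s_Y} \;=\; \left(r\,\frac{s_Y}{s_X}\right)\frac{s_X}{s_Y} \;=\; r$$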
I might suggest that you plot your data to see what is going on. It looks like there is no error in your measurements, so what are you plotting? Best, David Booth.
Question: Regarding the statement, "the value of beta is equal to r with standard error of 0" -- where are you getting the assertion that this standard error is 0? I highly doubt that this is the case. Under most circumstances we'd look to the standard error of the metric regression coefficient.
In a simple regression the standardized coefficient is exactly equal to the Pearson correlation between X and Y, so the equality itself carries no standard error: converting the metric regression coefficient to the standardized one is just a matter of algebra, and that conversion introduces no additional uncertainty.
However, there is a non-zero standard error for the estimated relationship between X and Y, as there is for any sample statistic. If we focus on the simple Pearson r, then SE(r) = sqrt((1 - r^2)/(n - 2)).
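To make both points concrete, here is a minimal numerical sketch (assuming numpy and scipy are available; the data and variable names are illustrative, not from the original question):

```python
# Minimal numerical check of the identity and the standard errors discussed above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 50
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(size=n)          # true slope 0.3, plenty of noise

# Metric (unstandardized) regression: slope b with its own standard error
res = stats.linregress(x, y)
print("metric slope b:", res.slope, "SE(b):", res.stderr)

# Standardized regression: z-score both variables; the slope equals Pearson r
zx = (x - x.mean()) / x.std(ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)
beta = stats.linregress(zx, zy).slope
r = stats.pearsonr(x, y)[0]
print("standardized beta:", beta, "Pearson r:", r)   # identical up to rounding

# The correlation still has sampling error: SE(r) = sqrt((1 - r^2) / (n - 2))
se_r = np.sqrt((1 - r**2) / (n - 2))
print("SE(r):", se_r)                                 # non-zero unless |r| = 1
```

Note that linregress on the z-scored variables still reports a non-zero standard error for the standardized slope; the point above is only that the conversion from b to beta itself adds no uncertainty.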
Hence my question -- where did he get the estimated standard error of 0? A standard error of 0 would imply that the correlation was perfect, but then beta would be 1.0 rather than 10%. Something does not compute.