Clearly I'm doing something wrong, and I'm sure it's a really silly mistake, but I can't figure out what it is. I'm using the simplest equation I could find, which is from Wood et al. (1977):
[unknown] = (slope_unk / slope_std) * [std]
(where unk = unknown, std = standard, and [ ] indicates concentration)
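In code form, this is all I'm computing — a minimal Python sketch of my reading of that formula (the function and variable names are mine, purely for illustration, not from the paper):

```python
# My reading of the Wood et al. (1977) slope-ratio formula.
# Names are my own, for illustration only.
def estimate_unknown(slope_unk: float, slope_std: float, std_conc: float) -> float:
    """[unknown] = (slope_unk / slope_std) * [std]"""
    return (slope_unk / slope_std) * std_conc
```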
Before running my actual samples, I wanted to test it using standard solutions. So I made a 10-fold serial dilution of the standard (starting at, say, 10,000 units/mL), measured the diameters, squared them, and plotted them against the concentration (or dilution factor).
I then picked a random dilution (let's say it happened to contain 100 units/mL), pretended I didn't know its concentration (so I'll call it the "unknown"), and repeated the same 10-fold dilution series as above. Plotting it on the same graph gives a parallel line with a lower x-intercept and a lower y-intercept.
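Here's a rough sketch of that setup in Python, with made-up numbers purely to show the shape of the data (the squared diameters below are placeholders, not my real measurements):

```python
import numpy as np

dilution_step = np.arange(4)                 # steps 0-3 of each 10-fold series

# Standard series: 10,000 units/mL at step 0, then 1,000, 100, 10
zone_sq_std = np.array([400.0, 330.0, 260.0, 190.0])   # placeholder diameter^2 values
# "Unknown" series: really 100 units/mL at step 0, then 10, 1, 0.1
zone_sq_unk = np.array([260.0, 190.0, 120.0, 50.0])    # placeholder diameter^2 values

# Fit each line against its own dilution step
slope_std, yint_std = np.polyfit(dilution_step, zone_sq_std, 1)
slope_unk, yint_unk = np.polyfit(dilution_step, zone_sq_unk, 1)

print(slope_std, slope_unk)   # the slopes come out identical
print(yint_std, yint_unk)     # but the "unknown" line sits lower, just like my plot
```

Plotted, those two fits look exactly like the parallel lines I described above.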
HOWEVER... when I plug this into the equation, the slope ratio comes out to 1 (since the slopes are identical), and multiplying that by the known standard concentration (10,000 units/mL) gives 10,000 units/mL as the highest concentration of my "unknown"... not the 100 units/mL it actually is.
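Spelled out with a placeholder slope (any pair of equal slopes behaves the same way):

```python
slope_unk = slope_std = -70.0                 # placeholder value; all that matters is that they're equal
estimate = (slope_unk / slope_std) * 10_000   # ratio is 1, so the standard concentration comes straight back
print(estimate)                               # 10000.0, not the 100 units/mL the tube really held
```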
What am I doing wrong?
It seems to me that if the y-intercepts were the same (or at least more similar), then it should work, but how do I set the y-intercepts equal to each other? How do I shift the "unknown" line to the right so the y-intercepts match? Am I just horribly misunderstanding something?
Please help! I've been stumped for over a week now!