Consider the following cases, where I have put down my understanding.
Notation: 0+ means a quantity tending to zero from the positive side.
{b} = the singleton set containing only the point b.
Now, the real question: is this a matter of probability theory or of set theory?
I found this description of a singleton:
{b} = intersection over all n of the intervals (b - 1/n, b + 1/n]
but according to my understanding (as above), this should still represent a range of real numbers, not a single point, since each interval in the intersection has positive length. For that intersection to collapse to a point, 1/n would have to be exactly zero.
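To make my confusion concrete, here is a quick numerical check I wrote myself (the value b = 2 and the test point x = 2.001 are just examples I picked, not from the description I found):

```python
# Nested intervals (b - 1/n, b + 1/n] around b = 2 for increasing n.
# Each interval still has positive length 2/n, which is why I expect
# the intersection to be a range rather than a single point.
b = 2.0

for n in (1, 10, 1000, 10**6):
    lo, hi = b - 1.0 / n, b + 1.0 / n
    print(f"n = {n:>7}: ({lo}, {hi}], length = {hi - lo}")

# Yet numerically, a point x != b drops out of the intersection as soon
# as 1/n < |x - b| (here, once n > 1000):
x = 2.001
in_all = all(b - 1.0 / n < x <= b + 1.0 / n for n in range(1, 10**4 + 1))
print(f"x = {x} lies in every interval up to n = 10000? {in_all}")  # False

# b itself stays inside every interval:
print(all(b - 1.0 / n < b <= b + 1.0 / n for n in range(1, 10**4 + 1)))  # True
```

So numerically only b survives, yet every finite interval has positive length, and I still don't see how this squares with 1/n never being exactly zero.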
These two descriptions, one from probability theory and the other from calculus, do not seem to agree with each other, as far as I can tell.
Can you please tell me where I am going wrong?
I may have used some terminology carelessly, but I hope you get the point of what I am trying to ask.
Thanks