Isn't that three?
1. infinity
2. and
3. beyond
You're right -- there are infinitely many degrees of infinity, starting with aleph-null and aleph-one.
Yet, according to the IEEE 754 standard,

```
0 11111111 00000000000000000000000 = +Infinity
1 11111111 00000000000000000000000 = -Infinity
```
So infinity on a digital computer is represented by a finite number of binary digits. How can that be? ;-)
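You can peek at those bit patterns yourself; here's a quick sketch using Python's standard `struct` module to reinterpret a 32-bit float as raw bits:

```python
import struct

def float32_bits(x: float) -> str:
    """Return the 32 bits of x, grouped as sign | exponent | mantissa."""
    bits = struct.unpack('>I', struct.pack('>f', x))[0]
    s = f'{bits:032b}'
    return f'{s[0]} {s[1:9]} {s[9:]}'

print(float32_bits(float('inf')))   # 0 11111111 00000000000000000000000
print(float32_bits(float('-inf')))  # 1 11111111 00000000000000000000000
```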
And then if you take the LARGEST 32-bit signed integer (2 to the 31st minus 1) and add 1, you get the SMALLEST one (negative 2 to the 31st), thanks to two's-complement wraparound. How can that be?
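Python's integers are arbitrary-precision and never wrap, but the 32-bit two's-complement wraparound can be simulated with a mask (a sketch; a C `int32_t` does this in hardware):

```python
def wrap_int32(x: int) -> int:
    """Reduce x to the 32-bit two's-complement range [-2**31, 2**31 - 1]."""
    x &= 0xFFFFFFFF                 # keep only the low 32 bits
    return x - 2**32 if x >= 2**31 else x

INT_MAX = 2**31 - 1                 # 2147483647
print(wrap_int32(INT_MAX + 1))      # -2147483648, the smallest int32
```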
Then there's the ubiquitous "Not a Number":

```
0 11111111 00000100000000000000000 = NaN
1 11111111 00100010001001010101010 = NaN
```
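Any bit pattern with an all-ones exponent and a nonzero mantissa counts as NaN, which is why there are millions of them -- and, famously, NaN is not even equal to itself. A quick Python check (the exact NaN bit pattern produced may vary by platform, so we only test the exponent/mantissa rule):

```python
import math
import struct

# Reinterpret a float32 NaN as raw bits and split out the fields.
nan_bits = struct.unpack('>I', struct.pack('>f', float('nan')))[0]
exponent = (nan_bits >> 23) & 0xFF
mantissa = nan_bits & 0x7FFFFF

print(exponent == 0xFF and mantissa != 0)  # True: all-ones exponent, nonzero mantissa
print(float('nan') == float('nan'))        # False: NaN != NaN, by design
print(math.isnan(float('nan')))            # True
```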
;-);-)


love you!!