All numbers in JavaScript, whether integers or decimals, are of type Number. Internally, a Number is a 64-bit floating-point value, equivalent to Java's double type; in other words, every number in JavaScript is a floating-point number. Following the IEEE 754 standard (the floating-point arithmetic standard), the largest magnitude JavaScript can represent is plus or minus 1.7976931348623157 × 10^308, and the nonzero decimal closest to zero is plus or minus 5 × 10^-324. These two boundary values can be read from the MAX_VALUE and MIN_VALUE properties of the Number object, respectively.
For integers, the ECMAScript standard (http://ecma262-5.com/ELS5_HTML.htm#Section_8.5) guarantees exact representation and exact arithmetic only within plus or minus 2^53, that is, from the minimum value -9007199254740992 to the maximum value 9007199254740992. JavaScript will still operate on integers outside this range, but it does not guarantee accurate results. It is also worth noting that for bitwise operations (shifts and the like), JavaScript supports only 32-bit integers, that is, integers from -2147483648 to 2147483647.
Experiment
Display the absolute values of the largest number and the smallest decimal representable in JavaScript:
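The original snippet is not preserved; a minimal sketch reading the two boundary properties is:

```javascript
// The largest representable magnitude and the positive value closest to zero.
console.log(Number.MAX_VALUE); // 1.7976931348623157e+308
console.log(Number.MIN_VALUE); // 5e-324
```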
For integers outside the range of plus or minus 2 raised to the 53rd power, JavaScript cannot give accurate calculation results:
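The original expression is not preserved; one addition that reproduces the result described below (the exact operands are an assumption) is:

```javascript
// The exact sum is 9007199254740995, but that value falls exactly between
// two representable doubles, so IEEE 754 rounds it to the nearest even one.
console.log(9007199254740992 + 3); // 9007199254740996
```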
The correct result is 9007199254740995, but JavaScript returns 9007199254740996. Varying the formula shows that once an integer exceeds 9007199254740992, errors of this kind occur frequently. A small deviation in accuracy may be tolerable in some calculations, but the consequences in the following example are far more serious:
Because of this precision problem, the for statement above falls into an infinite loop.
For bitwise operations, JavaScript only supports 32-bit integers:
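The original snippet is not preserved; a sketch using 256 and 2200000000 (the operands implied by the results quoted below) is:

```javascript
// 256 fits in a signed 32-bit integer, so the shift matches the division.
console.log(256 >> 1);        // 128
console.log(256 / 2);         // 128

// 2200000000 exceeds 2147483647, so it is first truncated to 32 bits
// (wrapping to -2094967296) before the shift is applied.
console.log(2200000000 / 2);  // 1100000000
console.log(2200000000 >> 1); // -1047483648
```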
As the results show, for an integer that fits in 32 bits (256 here), JavaScript performs the bit operation correctly, and the shift agrees with the division (128). For an integer that does not fit in 32 bits (2200000000), JavaScript still divides correctly (1100000000), but the bit operation yields -1047483648, far from the correct result, because the operand is truncated to a signed 32-bit integer before the shift.