Formatting to two decimal places in JavaScript

There are two issues with formatting numbers to two decimal places in JavaScript. The first is not really a problem with JavaScript itself, but with representing floating point numbers in binary.

In decimal, 1/3 has a non-terminating, repeating expansion: the digit '3' repeats forever. Binary has numbers like this too.

Take the floating point binary representation of .1. Its binary expansion is .000110011001100..., where the pattern '0011' repeats forever and supplies the bits of the significand in the floating point representation.

The issue is that there are only so many bits available to hold this non-terminating '0011' pattern. JavaScript gives the significand 53 bits, and everything beyond the 53rd bit is chopped off. That is why .07 - .06 without rounding gives .010000000000000009 instead of the expected .01. Rounding is therefore necessary.
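
You can see the error directly before any rounding is applied; the variable names a and b below are just for illustration:

var a = 0.07;
var b = 0.06;
alert(a - b);    // shows 0.010000000000000009, not 0.01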

The other issue is that JavaScript does not display trailing zeros. So even if you round with something like Math.round(100 * (a - b)) / 100, a result with only one decimal place will be displayed with one decimal place. For example, 5.10 would be output as '5.1'.
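
A quick sketch of that behavior, again with illustrative values:

var a = 5.17;
var b = 0.07;
var rounded = Math.round(100 * (a - b)) / 100;   // 5.1
alert(rounded);                                  // displays "5.1", not "5.10"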

Therefore, after rounding it is still necessary to use the toFixed() method of Number.

var myNum = Number(5.1);    // the Number() wrapper is optional here
alert(myNum.toFixed(2));    // displays "5.10"

At one time there were bugs in the JScript version of toFixed(), but surely by now that is ancient history.
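
Putting the two steps together, a minimal helper might look like the sketch below. The name formatTwoPlaces is made up for the example, and since toFixed() returns a string, it is meant for display only:

function formatTwoPlaces(value) {
    var rounded = Math.round(100 * value) / 100;   // correct the floating point drift
    return rounded.toFixed(2);                     // pad back to two decimal places
}

alert(formatTwoPlaces(0.07 - 0.06));   // "0.01"
alert(formatTwoPlaces(5.1));           // "5.10"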

You can play around with some of these formatting issues, if you like, using the inputs below: type in the two numbers, choose the operation from the drop-down, and click the button. One gotcha: format something like .07 * .01 to two decimal places and the result comes out as 0.00.
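
For example, with illustrative values:

var product = 0.07 * 0.01;    // about 0.0007
alert(product.toFixed(2));    // "0.00" -- the value all but disappears
alert(product.toFixed(4));    // "0.0007"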


