Infinite wisdom of intuent idiots

Ideally, things in software development should be intuitive. As the saying goes, if you need to explain what something does, you’ve already screwed up. However, while hieroglyphs were perfectly intuitive for Ancient Egyptians, it took researchers a lot of time to understand their meaning.

Sometimes relying on intuition is the way to go. Sometimes, it doesn’t work. Photo: Needpix

Let’s start with a simple question: what’s 5 in binary? We all know how this works; it’s 101. How about 12? Yes, it’s 1100.
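You can check these conversions in seconds with Python’s built-ins:

```python
# Integer-to-binary conversions using Python's built-ins.
print(bin(5))          # 0b101
print(bin(12))         # 0b1100
print(int("1100", 2))  # 12 - and back again
```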

Okay, we know binary, but what’s the point?

Once we start dealing with decimal numbers across different numeral systems, we run into one nonintuitive caveat: in most programming languages, 0.1 + 0.2 doesn’t equal 0.3.
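You can verify this in any Python shell, and the result looks the same in JavaScript, Java, C, and most other languages:

```python
>>> 0.1 + 0.2
0.30000000000000004
>>> 0.1 + 0.2 == 0.3
False
```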

Your intuition tells you it should, but it somehow doesn’t. Well, not somehow: this behaviour comes from a simple mathematical principle, combined with the limitations of computer hardware. Converting from a base-a to a base-b numeral system doesn’t necessarily preserve a finite representation of decimal numbers. It’s only guaranteed to preserve it as long as (see the sketch after this list for the general rule):

  • b is a multiple of a, e.g. when you convert from binary to decimal, or
  • a is a power of b, e.g. from hexadecimal to binary.
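Both cases are instances of one rule: a reduced fraction has a finite representation in base-b exactly when every prime factor of its denominator divides b. Here’s a minimal sketch of that check in Python (the helper name is mine, not a standard API):

```python
import math
from fractions import Fraction

def has_finite_representation(value: Fraction, base: int) -> bool:
    # A reduced fraction terminates in a given base iff every prime
    # factor of its denominator also divides the base.
    q = value.denominator
    while q > 1:
        g = math.gcd(q, base)
        if g == 1:
            return False  # q has a prime factor the base lacks
        while q % g == 0:
            q //= g
    return True

print(has_finite_representation(Fraction(1, 10), 10))  # True:  0.1 terminates in base-10
print(has_finite_representation(Fraction(1, 10), 2))   # False: 0.1 repeats forever in base-2
print(has_finite_representation(Fraction(1, 2), 10))   # True:  binary 0.1 is exactly 0.5 in decimal
```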

While your intuition tells you all three decimal numbers have a finite representation with no rounding errors, it assumes a base-10 representation. Without some experience working across numeral systems, it’s hard to account for the fact that neither 0.1, 0.2, nor 0.3 can be represented with a finite number of digits in the base-2 numeral system.
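You don’t have to take this on faith. Python’s decimal module can print the exact value held by the double closest to 0.1:

```python
from decimal import Decimal

# Constructing a Decimal from a float exposes the float's exact stored value:
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625
```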

Since computers operate on binary values and there’s a limit to how many digits can be stored per number, all three numbers are rounded to some binary approximation, and as it turns out, the resulting approximations for 0.1 and 0.2 don’t add up to the approximation of 0.3. In base-10, the same rounding error occurs with numbers such as 1/3, where 1/3 + 1/3 + 1/3 doesn’t equal 1.
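The base-10 version is easy to reproduce by limiting yourself to a handful of significant digits, for example with Python’s decimal module:

```python
from decimal import Decimal, getcontext

getcontext().prec = 4             # pretend our hardware stores 4 decimal digits
third = Decimal(1) / Decimal(3)
print(third)                      # 0.3333 - already a rounded approximation
print(third + third + third)      # 0.9999, not 1
```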

But it works in…

It’s hard to accept that your intuition is wrong; therefore the first response is usually to blame the programming language for such “nonintuitive bullshit”. Then comes denial: “this programming language doesn’t have that problem,” so it’s clearly just a “lazy implementation” and not a hardware or performance limitation.

Come on. Do you really think your intuition is better than the combined knowledge of the people developing a programming language? Do you think your use case is more important than all the other priorities a language has?

The aftermath

Software development isn’t about intuition; it’s about developing optimal solutions within given constraints. Programming languages are designed to balance those constraints and produce a tool that is efficient both for the task at hand and for the developers using it. In the example above, one could simply store numbers in base-10 with more digits available to achieve better precision, and ultimately 0.1 + 0.2 would equal 0.3.
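Python ships exactly such a tool in its decimal module: values are stored in base-10, so the troublesome numbers are represented exactly, at the cost of speed:

```python
from decimal import Decimal

# Base-10 storage represents 0.1, 0.2 and 0.3 exactly:
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```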

But is that efficient for general use? Not really. Standard double values offer precision down to roughly 10⁻¹⁶ of the most significant digit: they can store a number like 1 with absolute precision around 10⁻¹⁶, and a number like 10⁹ with precision around 10⁻⁷, all without significant performance impact.
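Python 3.9+ lets you inspect those gaps directly; math.ulp returns the distance from a value to the next representable double:

```python
import math
import sys

print(sys.float_info.epsilon)  # 2.220446049250313e-16, relative precision of a double
print(math.ulp(1.0))           # 2.220446049250313e-16, gap to the next double after 1.0
print(math.ulp(1e9))           # 1.1920928955078125e-07, gap near one billion
```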

In the end, if something doesn’t suit you, use a different tool or make your own - if nothing else, you’ll learn to better understand why you can’t “have it your way”.