Infinite wisdom of intuitive idiots
Ideally, things in software development should be intuitive. As the saying goes, if you need to explain what something does, you’ve already screwed up. However, while hieroglyphs were perfectly intuitive for Ancient Egyptians, it took researchers a lot of time to understand their meaning.
Let’s start with a simple question: what’s 10? Without knowing the numeral system, you can’t tell: in binary it’s two, in decimal it’s ten, and in hexadecimal it’s sixteen.
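For instance, here’s a quick sketch in Python (any language with base-aware parsing would do) of how the same digit string reads differently in each base:

```python
# The digit string "10" denotes a different value in every base:
print(int('10', 2))   # 2  (binary)
print(int('10', 10))  # 10 (decimal)
print(int('10', 16))  # 16 (hexadecimal)
```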
Okay, we know binary, but what’s the point?
Once we start dealing with decimal fractions across different numeral systems, we run into one nonintuitive caveat: in most programming languages, 0.1 + 0.2 does not equal 0.3.
Your intuition tells you it should, but it somehow doesn’t. Well, not somehow: this behaviour comes from a simple mathematical principle, combined with the limitations of computer hardware. Converting a fraction from base-X to base-Y is lossless when Y is a multiple of X, e.g. when you convert from binary to decimal (10 is a multiple of 2), or when X is a power of Y, e.g. from hexadecimal to binary (16 = 2⁴).
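To see why this matters, here is a small Python sketch (the helper `to_base` is illustrative, not from any library) that expands a fraction digit by digit in a target base and reports whether the expansion ever terminates:

```python
def to_base(numerator, denominator, base, max_digits=20):
    """Expand numerator/denominator (assumed < 1) as fractional
    digits in `base`; return (digits, terminated)."""
    digits = []
    for _ in range(max_digits):
        if numerator == 0:
            return digits, True    # finite representation
        numerator *= base
        digits.append(numerator // denominator)
        numerator %= denominator
    return digits, False           # remainder never hit zero: repeats forever

# 1/2 is finite in binary:
print(to_base(1, 2, 2))    # ([1], True)         i.e. 0.1 in binary
# 1/10 is finite in decimal but repeats forever in binary:
print(to_base(1, 10, 10))  # ([1], True)         i.e. 0.1 in decimal
print(to_base(1, 10, 2))   # (0.000110011..., False)
```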
While your intuition tells you all three decimal numbers have a finite representation with no rounding errors, it assumes base-10 representation. Without some experience working across numeral systems, it’s hard to account for the fact that neither 0.1, 0.2, nor 0.3 has a finite representation in binary.
Since computers operate on binary values and there’s a limit to how many digits can be stored per number, all three numbers are rounded to a binary approximation, and as it turns out, the resulting approximations of 0.1 + 0.2 and 0.3 differ.
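You can see both approximations directly in Python (the same holds in any language using IEEE 754 doubles); `Decimal(float)` reveals the exact binary value actually stored behind each literal:

```python
from decimal import Decimal

print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# The exact values stored for the literals 0.1 and 0.3:
print(Decimal(0.1))  # 0.1000000000000000055511151231257827021181583404541015625
print(Decimal(0.3))  # 0.299999999999999988897769753748434595763683319091796875
```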
But it works in…
It’s hard to accept that your intuition is wrong, so the first response is usually to blame the programming language for such “nonintuitive bullshit”. Then comes the denial: “this programming language doesn’t have that problem,” so it’s clearly just a “lazy implementation” and not a hardware or performance limitation.
Come on. Do you really think your intuition is better than the knowledge of groups of people developing a programming language? Do you think your use case is more important than all other priorities a language has?
The aftermath
Software development isn’t about intuition; it’s about developing optimal solutions within given constraints. Programming languages are made to balance such constraints and provide a tool that is efficient both for the given task and for developers to use. In the example above, one could simply increase the number of available digits to achieve better precision, and ultimately make 0.1 + 0.2 compare equal to 0.3.
But is that efficient for general use? Not really. Standard double values offer precision down to roughly 16 significant decimal digits, fit into 64 bits, and map directly onto hardware floating-point instructions; for the vast majority of programs that is both precise enough and fast.
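As a sketch of the trade-off in Python: the standard library’s `decimal` module stores decimal digits exactly, so the intuitive equality holds, at the cost of doing every operation in software instead of a single hardware instruction:

```python
from decimal import Decimal

# Decimal literals (constructed from strings) are stored exactly,
# so the comparison behaves the way intuition expects:
print(Decimal('0.1') + Decimal('0.2') == Decimal('0.3'))  # True

# The price: each value carries per-digit bookkeeping and the
# arithmetic runs in software, making it considerably slower
# than native double operations.
```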
In the end, if something doesn’t suit you, use a different tool, or make your own; if nothing else, you’ll learn to better understand why you can’t “have it your way”.