this post was submitted on 15 Sep 2024
23 points (79.5% liked)
Why restrict to 54-bit signed integers? Is there some common language I'm not thinking of that has this as its limit?
Edit: Found it myself; it's the range where you can store an integer in a double-precision float without error. I suppose that makes sense for maximum compatibility, but it feels gross if we're already identifying value types. I don't come from a web-dev/JS background, though, so maybe it makes more sense there.
I don't think you realize just how much code is written in JavaScript these days.
Because `number` is a double, and IEEE 754 specifies the mantissa of double-precision numbers as 53 bits plus a sign. That means it's the highest integer precision a double-precision value can express.
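You can check this directly in any modern JS console, for example:

```js
// The largest integer a double can represent exactly, per IEEE 754:
Number.MAX_SAFE_INTEGER        // 9007199254740991, i.e. 2^53 - 1

// Past 2^53, adjacent integers collapse onto the same double:
2 ** 53 === 2 ** 53 + 1        // true -- 2^53 + 1 rounds back to 2^53

Number.isSafeInteger(2 ** 53)  // false
```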
It's not about compatibility. It's because JSON only has a `number` type, which covers both floating point and integers, and `number` is implemented as a double-precision value. If you have to express integers with a double-precision type, once you go beyond 53 bits you start to lose precision, which goes completely against the notion of an integer.
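A quick sketch of how that bites in practice (the `id` field here is just a made-up example):

```js
// A 64-bit-ish id arriving as a JSON number gets silently corrupted:
JSON.parse('{"id": 9007199254740993}').id
// => 9007199254740992 -- off by one, with no error or warning

// The usual workaround is to ship big integers as strings and
// convert them with BigInt on the receiving end:
BigInt(JSON.parse('{"id": "9007199254740993"}').id)
// => 9007199254740993n
```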