Why wouldn't JS Decimal work for you? #52

Open
littledan opened this issue Feb 18, 2020 · 5 comments

@littledan
Member

@erights commented that, if there's a built-in decimal type in JavaScript, Agoric wouldn't use it. I'd like to learn more about why decimal wouldn't work for your application, even if it seems like it would from the outside.

@erights

erights commented Feb 19, 2020

We already have BigInt. It has a semantics that is simple, familiar, and precisely understood by everyone --- the mathematical integer. For money, denominated in the smallest tradable unit, BigInt does everything we want except for conversion to and from its printed form. The complexity of those external conversions is unfortunate, but it is quarantined: it does not make the internal computation any less clear. We need to think clearly about what we're doing when we handle other people's money!
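
A minimal sketch of that division of labor (parseDollars / formatDollars are hypothetical helper names, and the sketch assumes non-negative USD amounts with exactly two decimal places): all arithmetic stays in BigInt cents, and only the edges convert to and from the printed form.

```js
const CENTS_PER_DOLLAR = 100n;

// Parse a printed amount like "19.99" into BigInt cents.
// Assumes a non-negative amount with at most two decimal places.
function parseDollars(text) {
  const [whole, frac = ""] = text.split(".");
  const fracPadded = (frac + "00").slice(0, 2); // pad "9" -> "90", "" -> "00"
  return BigInt(whole) * CENTS_PER_DOLLAR + BigInt(fracPadded);
}

// Render BigInt cents back into the external printed form.
function formatDollars(cents) {
  const whole = cents / CENTS_PER_DOLLAR;
  const frac = (cents % CENTS_PER_DOLLAR).toString().padStart(2, "0");
  return `${whole}.${frac}`;
}

// The internal computation is exact integer arithmetic throughout:
const total = parseDollars("19.99") + parseDollars("0.01"); // 2000n
formatDollars(total); // "20.00"
```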

The only familiar mathematical concept that decimal resembles is real numbers. And it resembles them badly. My experience trying to think precisely about IEEE binary floats is that they are tremendously hazard prone, and it is hard to write tests that cover all the edge cases. A case in point: Caja's initial attempt at SAFE_INTEGER used 2**53 rather than 2**53 - 1. This resulted in an exploitable security hole. Arguably decimal will be less hazard prone. But not less enough.
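
To make that off-by-one concrete, here is a quick check anyone can run against today's Number semantics:

```js
// At 2**53, adjacent mathematical integers collide as binary floats, so an
// "is this integer safe?" bound of 2**53 admits one value too many.
2 ** 53 === 2 ** 53 + 1;                  // true  -- indistinguishable
2 ** 53 - 1 === 2 ** 53 - 2;              // false -- still distinct below the bound
Number.MAX_SAFE_INTEGER === 2 ** 53 - 1;  // true  -- the bound the spec settled on
```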

Aside from the external printed form, why would we consider switching from BigInt to decimal? What's the benefit?

@erights

erights commented Feb 19, 2020

Please note that my argument above has nothing to do with whether decimal is a better form of floating point than binary floating point, for some uses. Probably so. I just don't see the point of using anything but integers for money specifically, aside from the external form.

@erights

erights commented Feb 19, 2020

Caja's initial attempt at SAFE_INTEGER used 2**53 rather than 2**53 - 1. This resulted in an exploitable security hole. Arguably decimal will be less hazard prone. But not less enough.

Blame where due: I made that mistake.

@littledan
Member Author

The only familiar mathematical concept that decimal resembles is real numbers. And it resembles them badly.

This seems to be a misunderstanding of the feature. Decimal doesn't represent real numbers much more than Number does, even if we choose BigDecimal. The README tries to make it clear what Decimal is intended to represent: primarily, human-readable decimal quantities like money.

Aside from the external printed form, why would we consider switching from BigInt to decimal? What's the benefit?

You mentioned being less hazard-prone. I've heard about several kinds of hazard cases, which I hope decimal can help avoid (this list is not exhaustive):

  • People using a Number of dollars rather than a number of cents, supposedly just for a little bit of code around presentation, but resulting in occasional errors; see the sketch after this list. (This would imply that not everyone would adopt BigInt even once it's well supported, since Number already handles most of these cases well. And this misuse of Number seems to be much more common than I would've hoped.)
  • The number of cents being poorly specified and not quite usable that way, e.g.,
    • not accounting for variability in decimal places among currencies (the various databases for this actually differ, and then developers like to build in their own incorrect expectations)
    • fractions of cents that are used in some intermediate calculations
    • to resolve these issues, the exponent sometimes has to be carried around through computations (correct, but repetitive) or isn't (causing bugs)
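
To make the first hazard concrete, a small illustration (not drawn from any particular codebase) contrasting dollars held as a binary-float Number with integer cents:

```js
// Dollars kept as a binary-float Number: familiar rounding surprises.
0.10 + 0.20;              // 0.30000000000000004
0.10 + 0.20 === 0.30;     // false

// The same amounts kept in integer cents stay exact.
10n + 20n === 30n;        // true (BigInt cents)

// But now the caller must remember the exponent (2 for USD, 0 for JPY, 3 for BHD),
// which is exactly the bookkeeping the later bullets describe.
```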

@munrocket

It should fit any finance application, and with this proposal JS could also become a language for scientific computing in the future. If all browsers agree to implement arbitrary-precision numbers, it will be cool.
