
Value out of bounds for Numeric types #474

Open
ghost opened this issue Oct 31, 2016 · 5 comments
Labels
known issue for any issue that has been reported or there's a PR with the fix

Comments

@ghost

ghost commented Oct 31, 2016

I'm running into an issue where a valid numeric value causes my insert statement to fail in tedious. A manual SQL insert into the database with the same numeric value works.

Manual SQL Insert:

INSERT INTO foo (id, foo_id, timestamp, value, value2)  
VALUES ('ff3bd5e2-dbd8-424b-bb4f-455df1fe9f9c', 'b3a0a2c6-3b58-4659-8b4f-9466cede43be',  
        '2016-11-01T00:00:00+00:00', null, 66666.55);

Here's my code:

const bluebird = require('bluebird');
const sql = require('mssql');

const db = require('./database.js');

sql.Promise = bluebird;

const COLUMNS = {
  id: 'id',
  foo_id: 'foo_id',
  timestamp: 'timestamp',
  value: 'value',
  value2: 'value2'
};

exports.create = data => {
  return db.getConnection()
    .then(conn => {
      const ps = new sql.PreparedStatement(conn);
      ps.input(COLUMNS.id, sql.UniqueIdentifier);
      ps.input(COLUMNS.foo_id, sql.UniqueIdentifier);
      ps.input(COLUMNS.timestamp, sql.DateTimeOffset(0));
      ps.input(COLUMNS.value, sql.Numeric(30, 15));
      ps.input(COLUMNS.value2, sql.Numeric(30, 15));

      // Have to convert string to JavaScript date for insert to work
      data.timestamp = new Date(data.timestamp);
      const header = Object.keys(COLUMNS).map(key => COLUMNS[key]).join();

      const query = `INSERT INTO foo (${header}) VALUES (@${COLUMNS.id},` +
        ` @${COLUMNS.foo_id}, @${COLUMNS.timestamp}, @${COLUMNS.value},` +
        ` @${COLUMNS.value2})`;
      return ps.prepare(query)
        .then(() => ps.execute(data)
          // Releases the connection back to the pool
          .then(() => ps.unprepare()))
        .then(() => bluebird.resolve());
    });
};

The tedious code seems to convert 66666.55 to an integer with a value of 66666550000000000000. It then tries to take the remainder and quotient of that value and write it to a buffer using writeUInt32LE.

Why is the tedious code converting floating point numbers to integers and writing them to buffers instead of keeping the floating point number and using the built-in function writeDoubleLE for buffers?

This issue is blocking me from moving forward.
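For scale, the magnitude involved can be checked directly. This is only a sketch of the arithmetic, assuming the driver scales the value by 10^15 because of the Numeric(30, 15) declaration:

```javascript
// Sketch: with Numeric(30, 15) the value must be scaled by 10^15 before it
// is written to the wire as an integer.
const value = 66666.55;
const scale = 15;

const scaled = value * Math.pow(10, scale); // roughly 6.67e19

// JavaScript's Number only represents integers exactly up to 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);          // 9007199254740991
console.log(scaled > Number.MAX_SAFE_INTEGER); // true
```

So the scaled integer is several thousand times larger than the largest integer a Number can hold exactly.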

@pekim
Collaborator

pekim commented Oct 31, 2016

Why is the tedious code converting floating point numbers to integers...

Because that's what the TDS protocol mandates for Numeric values.
2.2.5.5.1.6 Decimal/Numeric

For sql.Numeric(30, 15), p is 30, and s is 15. So the integer will be 66666550000000000000 (66666.55 * 10^15), and should be represented in 16 bytes.

} else {

  ...
  buffer.writeUInt64LE(value);        // 8 bytes
  buffer.writeUInt32LE(0x00000000);   // 4 bytes
  buffer.writeUInt32LE(0x00000000);   // 4 bytes

The first line is the one with the problem.

https://github.com/tediousjs/tedious/blob/master/src/tracking-buffer/writable-tracking-buffer.js#L125
There is no Buffer.writeInt64LE, so it has to be done in two 32 bit chunks.

writeUInt64LE(value) {
  this.writeInt32LE(value & -1);
  return this.writeUInt32LE(Math.floor(value * SHIFT_RIGHT_32));
}

The 32-bit right shift is where the problem seems to manifest itself.
Math.floor(66666550000000000000 * SHIFT_RIGHT_32) yields 15522015746, which is 0x39D2F2A02. That requires 34 bits.

Now I see that the real problem is that 66666550000000000000 is 0x39d2f2a02aed16000, which requires more than 64 bits. But this raises a further issue: not all integers with more than 53 bits can be correctly represented in JavaScript's Number type. Handling this correctly may require some head-scratching, and some new unit tests.

@pekim
Collaborator

pekim commented Oct 31, 2016

As a workaround, using a smaller scale would probably work.
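A quick check of why a smaller scale would help (a sketch of the arithmetic only; e.g. sql.Numeric(19, 2) is just an example type):

```javascript
// With a smaller scale, the integer the driver must write stays well inside
// the exactly-representable range of Number (up to 2^53 - 1).
const value = 66666.55;

console.log(value * 1e15 > Number.MAX_SAFE_INTEGER); // true  (scale 15)
console.log(value * 1e2 > Number.MAX_SAFE_INTEGER);  // false (scale 2)
```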

@ghost
Author

ghost commented Nov 1, 2016

@pekim Thanks for answering my question and providing the information! I decided to write it in Python instead of Node.js to move forward. I'll still be using this library for other work, but it won't involve numeric types. Hopefully a solution can be found so no one else runs into this issue.

@jeteon

jeteon commented Oct 30, 2019

In my case, I had to use a different workaround since the Numeric was being set up by a library on top of Tedious (Sequelize). In essence, I format the parameter such that the decimal conversion happens inside the database rather than in code. For all numbers, I have to use this:

const value_stored = value.toFixed(desired_precision);

This way you don't lose precision, and you also don't trip up anything between my code and the DB that makes assumptions about scale and magnitude that I'm not aware of or don't control. Of course, this is less bandwidth-efficient than using the correct type, but in my case the <30-byte-per-record penalty is insignificant.

@IanChokS IanChokS added the known issue for any issue that has been reported or there's a PR with the fix label Dec 17, 2019
@vitorbertolucci

(quoting @jeteon's workaround above)

    const value_stored = value.toFixed(desired_precision);

That solved the issue for me as well.

Is there a way to apply this globally, so that every time Sequelize tries to insert/update a decimal value it calls .toFixed(2)?
