
cryptography - ECDSA signatures between Node.js and WebCrypto appear to be incompatible?

I'm using the following example for signing + verifying in Node.js: https://github.com/nodejs/node-v0.x-archive/issues/6904. The verification succeeds in Node.js but fails in WebCrypto. Similarly, a message signed using WebCrypto fails to verify in Node.js.

Here's the code I used to verify a signature produced by the Node.js script using WebCrypto - https://jsfiddle.net/aj49e8sj/. Tested in both Chrome 54.0.2840.27 and Firefox 48.0.2.

// From https://github.com/nodejs/node-v0.x-archive/issues/6904
var keys = {
  priv: '-----BEGIN EC PRIVATE KEY-----\n' +
        'MHcCAQEEIF+jnWY1D5kbVYDNvxxo/Y+ku2uJPDwS0r/VuPZQrjjVoAoGCCqGSM49\n' +
        'AwEHoUQDQgAEurOxfSxmqIRYzJVagdZfMMSjRNNhB8i3mXyIMq704m2m52FdfKZ2\n' +
        'pQhByd5eyj3lgZ7m7jbchtdgyOF8Io/1ng==\n' +
        '-----END EC PRIVATE KEY-----\n',
  pub: '-----BEGIN PUBLIC KEY-----\n' +
       'MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEurOxfSxmqIRYzJVagdZfMMSjRNNh\n' +
       'B8i3mXyIMq704m2m52FdfKZ2pQhByd5eyj3lgZ7m7jbchtdgyOF8Io/1ng==\n' +
       '-----END PUBLIC KEY-----\n'
};
var message = (new TextEncoder('UTF-8')).encode('hello');

// Algorithm used in Node.js script is ecdsa-with-SHA1, key generated with prime256v1
var algorithm = {
    name: 'ECDSA',
    namedCurve: 'P-256',
    hash: {
        name: 'SHA-1'
    }
};

// Signature obtained via the Node.js script above
var sig64 = 'MEUCIQDkAtiomagyHFi7dNfxMrzx/U0Gk/ZhmwCqaL3TimvlswIgPgeDqgZNqfR5/FZZASYsczUAhGSXjuycLhWnvk20qKc=';

// Decode base64 string into ArrayBuffer
var b64Decode = (str) => Uint8Array.from(atob(str), x => x.charCodeAt(0));

// Get base64 string from public key
const key64 = keys.pub.split('\n')
    .filter(x => x.length > 0 && !x.startsWith('-----'))
    .join('');

// Convert to buffers
var sig = b64Decode(sig64);
var keySpki = b64Decode(key64);

// Import and verify
// Want 'Verification result: true' but will get 'false'
var importKey = crypto.subtle.importKey('spki', keySpki, algorithm, true, ['verify'])
    .then(key => crypto.subtle.verify(algorithm, key, sig, message))
    .then(result => console.log('Verification result: ' + result));
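
For reference, the signing side in Node.js looks roughly like this. This is a sketch paraphrasing the linked issue (not re-run here), assuming the same keys object as above:

// Node.js, roughly as in nodejs/node-v0.x-archive#6904
var crypto = require('crypto');

var signer = crypto.createSign('ecdsa-with-SHA1');
signer.update('hello');
var sig64 = signer.sign(keys.priv, 'base64');   // base64 signature

var verifier = crypto.createVerify('ecdsa-with-SHA1');
verifier.update('hello');
console.log(verifier.verify(keys.pub, sig64, 'base64'));  // true in Node.js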

Related question with a similar issue using SHA-256 instead of SHA-1: Generating ECDSA signature with Node.js/crypto

Things I've checked:

  • I decoded the Node.js keys and verified they have the same OID as keys generated via WebCrypto. This tells me I'm using the correct curves.
  • SHA-1 is explicitly identified as the hash to use in both locations.
  • ECDSA is explicitly identified in both Node.js and WebCrypto.

How can I successfully verify the signature received from Node.js and vice versa - verify a signature in Node.js produced from WebCrypto? Or are the implementations of the standard subtly different in such a way that makes them incompatible?

Edit:

  • WebCrypto signature (64 bytes): uTaUWTfF+AjN3aPj0b5Z2d1HybUEpV/phv/P9RtfKaGXtcYnbgfO43IRg46rznG3/WnWwJ2sV6mPOEnEPR0vWw==
  • Node.js signature (71 bytes): MEUCIQDkAtiomagyHFi7dNfxMrzx/U0Gk/ZhmwCqaL3TimvlswIgPgeDqgZNqfR5/FZZASYsczUAhGSXjuycLhWnvk20qKc=

Verified that the Node.js signature is DER-encoded and the WebCrypto signature is not.
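
For anyone comparing the two, here is a rough sketch of unwrapping the DER signature into the raw r||s layout before calling crypto.subtle.verify. derToRaw is a made-up helper name, it assumes P-256 (32-byte r and s), and it is not a general-purpose DER parser:

// Convert a DER ECDSA signature, SEQUENCE(INTEGER(r), INTEGER(s)),
// into the 64-byte r || s form WebCrypto expects for P-256.
function derToRaw(der) {
  // der[0] is the SEQUENCE tag (0x30); der[1] is the total length
  // (short form, which holds for P-256 signatures).
  var offset = 2;
  function readInt() {
    offset++;                              // skip the 0x02 INTEGER tag
    var len = der[offset++];               // payload length
    var bytes = der.slice(offset, offset + len);
    offset += len;
    // Drop a leading 0x00 that was only there to keep the sign bit clear.
    if (bytes[0] === 0x00) bytes = bytes.slice(1);
    // Left-pad back to 32 bytes for P-256.
    var out = new Uint8Array(32);
    out.set(bytes, 32 - bytes.length);
    return out;
  }
  var raw = new Uint8Array(64);
  raw.set(readInt(), 0);
  raw.set(readInt(), 32);
  return raw;
}

// e.g. crypto.subtle.verify(algorithm, key, derToRaw(sig), message)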


1 Answer


Having not used either of these libraries I can't say for certain, but one possibility is that they don't use the same encoding type for the signature. For DSA/ECDSA there are two main formats, IEEE P1363 (used by Windows) and DER (used by OpenSSL).

The "Windows" format is to have a preset size (determined by Q for DSA and P for ECDSA (Windows doesn't support Char-2, but if it did it'd probably be M for Char-2 ECDSA)). Then both r and s are left-padded with 0 until they meet that length.

In a too-small-to-be-legal example, with r = 0x305, s = 0x810522, and sizeof(Q) being 3 bytes:

// r
000305
// s
810522
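
The same rule in code, as a toy sketch with the values above (toFixedHex is a made-up helper):

// Left-pad each value to the fixed field width (3 bytes here), then concatenate.
function toFixedHex(hex, sizeBytes) {
  return hex.padStart(sizeBytes * 2, '0');
}

// r = 0x305, s = 0x810522, sizeof(Q) = 3 bytes
console.log(toFixedHex('305', 3) + toFixedHex('810522', 3)); // "000305810522"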

For the "OpenSSL" format it is encoded under the rules of DER as SEQUENCE(INTEGER(r), INTEGER(s)), which looks like

// SEQUENCE
30
  // (length of payload)
  0A
  // INTEGER(r)
  02
    // (length of payload)
    02
    // note the leading 0x00 is omitted
    0305
  // INTEGER(s)
  02
    // (length of payload)
    04
    // Since INTEGER is a signed type and this represents a positive number
    // whose high bit is set, a 0x00 has to be inserted to keep the sign bit clear.
    00810522

or, compactly:

  • Windows: 000305810522
  • OpenSSL: 300A02020305020400810522

The "Windows" format is always even, always the same length. The "OpenSSL" format is usually about 6 bytes bigger, but can gain or lose a byte in the middle; so it's sometimes even, sometimes odd.

Base64-decoding your sig64 value shows that it is using the DER encoding. Generate a couple of signatures with WebCrypto; if any of them don't start with 0x30, then you have the IEEE/DER problem.
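
In code, that check is just the length and the first byte (someBase64Sig below is a placeholder for whichever signature you want to inspect):

var bytes = Uint8Array.from(atob(someBase64Sig), x => x.charCodeAt(0));
// DER: starts with the 0x30 SEQUENCE tag and is roughly 70-72 bytes for P-256.
// IEEE P1363: exactly 64 bytes for P-256 (and only starts with 0x30 by coincidence).
console.log(bytes.length, bytes[0] === 0x30 ? 'looks like DER' : 'looks like raw r||s');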

