
databricks - Spark SQL - Convert SHA1 to BIGINT

I've been tasked with moving a lot of T-SQL into Spark (Databricks). The procedures I'm converting create surrogate keys in a fairly typical way for BI. What I'm trying to figure out is the Spark equivalent of the following T-SQL:

select convert(bigint,hashbytes('sha1', N'570'))
-- Returns: -1488953326447475322
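
For reference, SQL Server documents that converting a binary value longer than the target integer type truncates on the left, so the bigint above is the last 8 bytes of the 20-byte hash read as a signed big-endian value. Illustrating with that tail (the full hash appears below):

select convert(bigint, 0xEB562CD6A1F64586)
-- Returns: -1488953326447475322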

In Spark SQL I can get the same hash bytes by doing:

select sha1(encode('570', 'UTF-16LE'))
--2c14de511f01a8abec0a4f15eb562cd6a1f64586 in Spark
--0x2C14DE511F01A8ABEC0A4F15EB562CD6A1F64586 in T-SQL
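
Case aside, these are the same 20-byte digest; Spark's sha1 just returns it as a 40-character lowercase hex string:

select length(sha1(encode('570', 'UTF-16LE')))
-- Returns: 40 (hex characters, i.e. 20 bytes)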

What I'm struggling to figure out is how to convert the returned hash into a bigint. I know that SHA1 is a 20-byte result and bigint is only 8 bytes, so there is truncation going on, but when I try to force this truncation using CONV, as I've seen suggested, I don't get close to the result I'm after.

select conv(substring(sha1(encode('570', 'UTF-16LE')), 0, 16), 16, 10)
--Returns: 3176408077196961963
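
Two things appear to diverge in that attempt: substring(..., 0, 16) keeps the first 16 hex characters (the first 8 bytes) while T-SQL keeps the last 8, and conv with a positive to_base reads the value as unsigned. Taking the last 16 characters yields the right bit pattern, but still its unsigned reading:

select conv(substring(sha1(encode('570', 'UTF-16LE')), -16, 16), 16, 10)
-- Returns: 16957790747262076294, the unsigned reading of 0xEB562CD6A1F64586;
-- the T-SQL value is this minus 2^64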

Has anyone accomplished this?



1 Answer

Waiting for an expert to reply.
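
In the meantime, here is a sketch that should reproduce the T-SQL value, relying on the documented behavior that a negative to_base makes conv produce a signed result (e.g. conv('-10', 16, -10) returns -16):

-- Keep the last 16 hex characters (the last 8 bytes, matching T-SQL's
-- left truncation), then convert base 16 to base 10 with a negative
-- to_base so the 64-bit pattern is read as signed.
select cast(conv(substring(sha1(encode('570', 'UTF-16LE')), -16, 16), 16, -10) as bigint)
-- Expected: -1488953326447475322

If a given runtime's conv does not honor the negative to_base, the same result can be obtained by subtracting 2^64 from the unsigned reading whenever the leading hex digit is 8 or higher.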
