
Spark Scala: DateDiff of two columns by hour or minute

I have two timestamp columns in a dataframe, and I'd like to get the difference between them in minutes or, alternatively, in hours. Currently I'm able to get the day difference, with rounding, by doing

val df2 = df1.withColumn("time", datediff(df1("ts1"), df1("ts2")))

However, when I looked at the doc page https://issues.apache.org/jira/browse/SPARK-8185, I didn't see any extra parameters to change the unit. Is there a different function I should be using for this?



1 Answer


You can get the difference in seconds by casting both timestamps to long (which gives Unix epoch seconds) and subtracting:

import org.apache.spark.sql.functions._
val diff_secs_col = col("ts1").cast("long") - col("ts2").cast("long")

Then you can do some math to get the unit you want. For example:

val df2 = df1
  .withColumn("diff_secs", diff_secs_col)
  .withColumn("diff_mins", diff_secs_col / 60D)
  .withColumn("diff_hrs",  diff_secs_col / 3600D)
  .withColumn("diff_days", diff_secs_col / (24D * 3600D))

Or, in pyspark:

from pyspark.sql.functions import col

diff_secs_col = col("ts1").cast("long") - col("ts2").cast("long")

df2 = df1 \
  .withColumn("diff_secs", diff_secs_col) \
  .withColumn("diff_mins", diff_secs_col / 60.0) \
  .withColumn("diff_hrs",  diff_secs_col / 3600.0) \
  .withColumn("diff_days", diff_secs_col / (24.0 * 3600.0))
