
apache spark - How to convert column of arrays of strings to strings?

I have a column of type array<string> in Spark tables that I query using SQL. I want to convert the array<string> into a string.

When I use the following syntax:

select cast(rate_plan_code  as string) as new_rate_plan  from
customer_activity_searches group by rate_plan_code

The rate_plan_code column has the following values:

["AAA","RACK","SMOBIX","SMOBPX"] 
["LPCT","RACK"]
["LFTIN","RACK","SMOBIX","SMOBPX"]
["LTGD","RACK"] 
["RACK","LEARLI","NHDP","LADV","LADV2"]

the following values are populated in the new_rate_plan column:

org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@e4273d9f
org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@c1ade2ff
org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@4f378397
org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@d1c81377
org.apache.spark.sql.catalyst.expressions.UnsafeArrayData@552f3317

Cast seems to work when I convert decimal to int or int to double, but not in this case. I am curious why the cast does not work here. I would greatly appreciate your help.


1 Answer


In Spark 2.1+, to concatenate the values of an array column into a single string, you can use any of the following:

  1. concat_ws standard function
  2. map operator
  3. a user-defined function (UDF), sketched at the end of this answer

concat_ws Standard Function

Use the concat_ws function:

concat_ws(sep: String, exprs: Column*): Column
Concatenates multiple input string columns together into a single string column, using the given separator.

import org.apache.spark.sql.functions.concat_ws

// A one-row DataFrame whose `words` column is an array of strings (spark-shell).
val words = Seq(Array("hello", "world")).toDF("words")
val solution = words.withColumn("codes", concat_ws(" ", $"words"))

scala> solution.show
+--------------+-----------+
|         words|      codes|
+--------------+-----------+
|[hello, world]|hello world|
+--------------+-----------+
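
Since the question queries these tables with SQL, note that concat_ws is also available as a Spark SQL function. A minimal sketch using the table and column names from the question (the comma separator is my own choice):

// Sketch only: assumes customer_activity_searches is available as a table or view.
val newRatePlans = spark.sql(
  """SELECT concat_ws(',', rate_plan_code) AS new_rate_plan
    |FROM customer_activity_searches
    |GROUP BY rate_plan_code""".stripMargin)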

map Operator

Use the map operator to have full control over what gets transformed and how.

map[U](func: (T) => U): Dataset[U]
Returns a new Dataset that contains the result of applying func to each element.

// The input from the question, as a one-row DataFrame (spark-shell).
val codes = Seq((0L, Array("AAA", "RACK", "SMOBIX", "SMOBPX"))).toDF("id", "rate_plan_code")

scala> codes.show(false)
+---+---------------------------+
|id |rate_plan_code             |
+---+---------------------------+
|0  |[AAA, RACK, SMOBIX, SMOBPX]|
+---+---------------------------+

val codesAsSingleString = codes.as[(Long, Array[String])]
  .map { case (id, codes) => (id, codes.mkString(", ")) }
  .toDF("id", "codes")

scala> codesAsSingleString.show(false)
+---+-------------------------+
|id |codes                    |
+---+-------------------------+
|0  |AAA, RACK, SMOBIX, SMOBPX|
+---+-------------------------+

scala> codesAsSingleString.printSchema
root
 |-- id: long (nullable = false)
 |-- codes: string (nullable = true)
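
User-Defined Function (UDF)

The third option is to wrap the same mkString logic in a UDF, which also makes it callable from SQL. This is only a minimal sketch; the names mkCodes and array_to_string and the comma separator are illustrative, not from the original answer.

import org.apache.spark.sql.functions.udf

// Joins an array-of-strings column into a single string (illustrative name and separator).
val mkCodes = udf { codes: Seq[String] => codes.mkString(", ") }
val withCodes = codes.withColumn("codes", mkCodes($"rate_plan_code"))

// Register the same logic under a name so it can be used from SQL:
spark.udf.register("array_to_string", (codes: Seq[String]) => codes.mkString(", "))
val fromSql = spark.sql("SELECT array_to_string(rate_plan_code) AS new_rate_plan FROM customer_activity_searches")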

