I have a dataframe df
+-----+--------+----------+-----+
|count|currency| date|value|
+-----+--------+----------+-----+
| 3| GBP|2021-01-14| 4|
| 102| USD|2021-01-14| 3|
| 234| EUR|2021-01-14| 5|
| 28| GBP|2021-01-16| 5|
| 48| USD|2021-01-16| 7|
| 68| EUR|2021-01-15| 6|
| 20| GBP|2021-01-15| 1|
| 33| EUR|2021-01-17| 2|
| 106| GBP|2021-01-17| 10|
+-----+--------+----------+-----+
I have a separate dataframe, exchange_rate, holding the exchange rate against USD:

val exchange_rate = spark.read.format("csv")
  .option("header", "true")      // first row holds the column names shown below
  .option("inferSchema", "true") // read EXCAHNGERATE as a number rather than a string
  .load("/Users/khan/data/exchange_rate.csv")
exchange_rate.show()
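With inferSchema, INSERTTIME may come through as a timestamp or a plain string depending on the file, so casting it explicitly to a date keeps a later join on date unambiguous (a small sketch; the yyyy-MM-dd format is an assumption from the sample below):

import org.apache.spark.sql.functions.{col, to_date}

// Normalize INSERTTIME to DateType so it compares cleanly
// with df("date") when the two frames are joined.
val rates = exchange_rate
  .withColumn("INSERTTIME", to_date(col("INSERTTIME"), "yyyy-MM-dd"))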
+----------+------------+--------+
|INSERTTIME|EXCAHNGERATE|CURRENCY|
+----------+------------+--------+
|2021-01-14|    0.731422|     GBP|
|2021-01-14|    0.784125|     EUR|
|2021-01-15|    0.701922|     GBP|
|2021-01-15|    0.731422|     EUR|
|2021-01-16|    0.851422|     GBP|
|2021-01-16|    0.721128|     EUR|
|2021-01-17|    0.771621|     GBP|
|2021-01-17|    0.751426|     EUR|
+----------+------------+--------+
I want to convert the GBP and EUR values in df to USD, using the rate from the exchange_rate dataframe for the matching date.
My idea: import com.currency_converter.CurrencyConverter from http://xavierguihot.com/currency_converter/#com.currency_converter.CurrencyConverter$

Is there a simpler way to do it?
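For reference, the plain join-based approach I would compare against looks something like this (a sketch, assuming EXCAHNGERATE is quoted as units of currency per USD, so converting to USD means dividing by the rate; USD rows have no entry in exchange_rate and are left unchanged):

import org.apache.spark.sql.functions.{coalesce, col, lit}

// Left-join on date and currency; USD rows find no match,
// so their rate falls back to 1.0 via coalesce.
val converted = df
  .join(
    exchange_rate,
    df("date") === exchange_rate("INSERTTIME") &&
      df("currency") === exchange_rate("CURRENCY"),
    "left")
  .withColumn("value_usd",
    col("value") / coalesce(col("EXCAHNGERATE"), lit(1.0)))
  .drop("INSERTTIME", "EXCAHNGERATE", "CURRENCY")

converted.show()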