
0 votes
621 views
in Technique by (71.8m points)

Spark JDBC write to Teradata: multiple Spark tasks failing with "Transaction ABORTed due to deadlock" error, resulting in stage failure

I am using Spark JDBC write to load data from Hive into a Teradata view. I am using 200 vcores and have partitioned the data into 10,000 partitions.

Spark tasks are failing with the error below, resulting in a stage failure. Sometimes the application finishes successfully, but with some duplicate records:

caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 16.20.00.10] [Error 2631] [SQLState 40001] Transaction ABORTed due to deadlock.

Below is the code I have used:

spark.sql("select * from hive_table").distinct().repartition(10000)
  .write.mode("overwrite").option("truncate", "true")
  .jdbc(url, dbTable, dproperties)

The Teradata view is created with "AS LOCKING ROW FOR ACCESS", and the underlying table has a unique primary index (UPI).

I am unable to figure out why some Spark tasks are failing with a deadlock error. Is there a way to stop the entire Spark application from failing because of these task failures?
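Note: Spark retries a failed task up to spark.task.maxFailures times (4 by default) before failing the stage, and a task that aborts mid-insert and is then retried can re-insert rows it had already written, which is one plausible source of the duplicate records mentioned above. Below is a minimal sketch of raising that limit at session creation; the value 8 and the app name are assumptions for illustration, not recommendations:

import org.apache.spark.sql.SparkSession

// Sketch only: allow more task retries so transient deadlock aborts are
// retried rather than failing the stage. Default is 4; 8 is an assumed
// example value. This must be set before the SparkContext starts.
// Caveat: retried, non-transactional JDBC inserts can add more duplicates.
val spark = SparkSession.builder()
  .appName("hive-to-teradata")             // assumed app name
  .config("spark.task.maxFailures", "8")
  .getOrCreate()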

question from: https://stackoverflow.com/questions/65944454/spark-jdbc-write-to-teradata-multiple-spark-tasks-failing-with-transaction-abor


1 Answer

0 votes
by (71.8m points)

Dozens of sessions trying to insert into the same table concurrently will likely cause a deadlock: with 200 vcores, up to 200 JDBC sessions (one per running task) can be writing at once. Even though the view is defined with an access lock, each session must still obtain a write lock to insert rows into the backing table, so concurrent sessions can deadlock against each other.
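A minimal sketch of one mitigation, assuming the same spark session and the url, dbTable, and dproperties values from the question: write through far fewer partitions so that fewer JDBC sessions insert into the table at once (the count 24 is an assumed example to tune for your cluster and Teradata system):

// Sketch: fewer output partitions => fewer concurrent JDBC writer
// sessions inserting into the same Teradata table, so less lock
// contention and a lower chance of deadlock.
spark.sql("select * from hive_table")   // placeholder table name from the question
  .distinct()
  .coalesce(24)                         // assumed example value; tune as needed
  .write
  .mode("overwrite")
  .option("truncate", "true")
  .jdbc(url, dbTable, dproperties)

Loading into an empty staging table and then issuing a single INSERT ... SELECT on the Teradata side is another common way to sidestep both the deadlocks and the retry-induced duplicates.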

