Manually submitted step jobs failing


We have an Airflow setup that runs EMR jobs on a daily basis. I noticed odd behavior: when I resubmit a job to calculate ad hoc reports, the Spark application fails with the error below. The arguments seem to be causing an issue on the AWS side, as the same job worked before.

"No main class set in JAR; Please specify one with --class"

Vaas
asked 6 months ago · 197 views
1 Answer
Accepted Answer

Hello,

Sorry for the inconvenience. This is a known issue in the EMR Step console for the Spark application step type: the order of the arguments is changed when the step is submitted. An internal team is working on a fix, and you will be able to submit jobs as before once it is released. In the meantime, you can work around the issue in either of the following ways:

  1. Submit the job through command-runner and pass the spark-submit command as an argument:

    a. Choose "Custom JAR" in the Add step dialog

    b. Enter "command-runner.jar" as the JAR location

    c. In the Arguments field, place the entire spark-submit command (example: spark-submit --deploy-mode cluster --class org.apache.spark.examples.SparkPi s3://<s3bucket>/samplejob/spark-examples.jar 10)

  2. Use the CLI:

aws emr add-steps --cluster-id <clusterid> --steps Type=Spark,Name="SparkPi",Args=[--deploy-mode,cluster,--class,org.apache.spark.examples.SparkPi,s3://<s3bucket>/samplejob/spark-examples.jar,10],ActionOnFailure=CONTINUE
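For anyone submitting steps programmatically rather than through the console or CLI, the same command-runner workaround can be expressed with boto3's EMR client. This is a minimal sketch, not from the original answer: the step name and the bucket argument are placeholders you would replace with your own values.

```python
def command_runner_step(bucket: str) -> dict:
    """Build an EMR Step definition that runs spark-submit via command-runner.jar.

    This mirrors workaround 1 above: instead of a Spark-type step, the step is
    a custom JAR step whose JAR is command-runner.jar and whose arguments are
    the full spark-submit command line.
    """
    return {
        "Name": "SparkPiViaCommandRunner",  # placeholder step name
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": [
                "spark-submit",
                "--deploy-mode", "cluster",
                "--class", "org.apache.spark.examples.SparkPi",
                f"s3://{bucket}/samplejob/spark-examples.jar",
                "10",
            ],
        },
    }


# In practice the step dict would be submitted with boto3, e.g.:
#   import boto3
#   emr = boto3.client("emr")
#   emr.add_job_flow_steps(JobFlowId="<clusterid>",
#                          Steps=[command_runner_step("<s3bucket>")])
```

Because the arguments are passed as a plain list to command-runner, their order is preserved exactly as written, which sidesteps the console's reordering issue.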

AWS Support Engineer
answered 6 months ago, reviewed by Experts
  • Thank you very much for the prompt response. I am able to submit the job with the suggested alternatives. I hope this issue will be fixed ASAP.
