Manually submitted step jobs failing


We have an Airflow setup that runs EMR jobs on a daily basis. I noticed odd behavior: when I resubmit a job to calculate ad hoc reports, the Spark application fails with the error below. The arguments seem to be causing an issue on the AWS side, since the same job worked before.

" No main class set in JAR; Please specify one with --class "

Vaas
Asked 6 months ago · 197 views
1 Answer
Accepted Answer

Hello,

Sorry for the inconvenience caused. This is an ongoing issue in the EMR Step console for the Spark application step type: the order of the arguments is changed when the step is submitted. The internal team is working on a fix, and you will be able to submit jobs as before once it is resolved. Meanwhile, to unblock yourself, you can use one of the workarounds below.
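
For context, spark-submit reads its own options only up to the application JAR; everything after the JAR path is passed to the application itself. So a reordering of the kind described above produces exactly this error. A hypothetical illustration (the bucket name is a placeholder):

# Works: --class comes before the application JAR
spark-submit --deploy-mode cluster --class org.apache.spark.examples.SparkPi s3://<s3bucket>/samplejob/spark-examples.jar 10

# Fails with "No main class set in JAR": once the JAR path appears,
# --class is treated as an argument to the application, not to spark-submit
spark-submit --deploy-mode cluster s3://<s3bucket>/samplejob/spark-examples.jar --class org.apache.spark.examples.SparkPi 10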

  1. Submit the job through command-runner and pass the spark-submit command as the argument (a scriptable CLI equivalent is sketched after this list):

    a. Choose "Custom JAR" in Add step

    b. Enter "command-runner.jar" as the JAR location

    c. In the Arguments field, place the entire spark-submit command (example - spark-submit --deploy-mode cluster --class org.apache.spark.examples.SparkPi s3://<s3bucket>/samplejob/spark-examples.jar 10)

  2. Use the CLI method:

aws emr add-steps --cluster-id <clusterid> --steps Type=Spark,Name="SparkPi",Args=[--deploy-mode,cluster,--class,org.apache.spark.examples.SparkPi,s3://<s3bucket>/samplejob/spark-examples.jar,10],ActionOnFailure=CONTINUE
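
If you want to script the first workaround as well, the same command-runner step can be submitted through the CLI. A minimal sketch, assuming the same placeholder cluster ID and bucket as above; the step name is arbitrary:

# Step name is arbitrary; <clusterid> and <s3bucket> are placeholders
aws emr add-steps --cluster-id <clusterid> --steps Type=CUSTOM_JAR,Name="SparkPiViaCommandRunner",Jar="command-runner.jar",Args=[spark-submit,--deploy-mode,cluster,--class,org.apache.spark.examples.SparkPi,s3://<s3bucket>/samplejob/spark-examples.jar,10],ActionOnFailure=CONTINUE

Because command-runner.jar executes the command verbatim, the argument order you pass is preserved end to end, which is what sidesteps the console reordering issue.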

AWS
Support Engineer
Answered 6 months ago
  • Thank you very much for the prompt response. I am able to submit the job with the suggested alternatives. I hope this issue will be fixed ASAP.
