Manually submitted step jobs failing


We have an Airflow setup that runs EMR jobs on a daily basis. I noticed an odd behavior: when I resubmit a job to calculate ad hoc reports, the Spark application fails with the error below. The arguments seem to be causing an issue on the AWS side, as the same job worked before.

" No main class set in JAR; Please specify one with --class "

Vaas
asked 6 months ago · 197 views
1 answer
Accepted Answer

Hello,

Sorry for the inconvenience caused. This is an ongoing issue in the EMR Step console for the Spark application step type: the order of the arguments is changed when the step is submitted. The internal team is working on a fix, and you will be able to submit jobs as before once it is released. In the meantime, you can unblock yourself with either of the workarounds below.

  1. Submit the job through command-runner.jar and pass the entire spark-submit command as the argument:

    a. Choose "Custom JAR" as the step type in Add step

    b. Enter "command-runner.jar" as the JAR location

    c. In the Arguments field, place the entire spark-submit command (example: spark-submit --deploy-mode cluster --class org.apache.spark.examples.SparkPi s3://<s3bucket>/samplejob/spark-examples.jar 10)

  2. Use the AWS CLI:

aws emr add-steps --cluster-id <clusterid> --steps Type=Spark,Name="SparkPi",Args=[--deploy-mode,cluster,--class,org.apache.spark.examples.SparkPi,s3://<s3bucket>/samplejob/spark-examples.jar,10],ActionOnFailure=CONTINUE
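
If your Airflow setup submits steps programmatically rather than through the console, the same command-runner workaround from option 1 can also be expressed with boto3. This is only a minimal sketch, assuming boto3 credentials and region are already configured; the cluster ID and S3 path below are placeholders, not values from this thread.

    import boto3

    # Minimal sketch of workaround 1 submitted programmatically.
    # Assumes boto3 credentials/region are configured; IDs and paths are placeholders.
    emr = boto3.client("emr")

    response = emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXXX",  # placeholder EMR cluster ID
        Steps=[
            {
                "Name": "SparkPi",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    # command-runner.jar executes the command given in Args, so the
                    # full spark-submit invocation is passed as plain arguments.
                    "Jar": "command-runner.jar",
                    "Args": [
                        "spark-submit",
                        "--deploy-mode", "cluster",
                        "--class", "org.apache.spark.examples.SparkPi",
                        "s3://<s3bucket>/samplejob/spark-examples.jar",
                        "10",
                    ],
                },
            }
        ],
    )
    print(response["StepIds"])  # step IDs you can poll for status

The step submitted this way behaves the same as the console "Custom JAR" workaround, since command-runner.jar simply runs the supplied command on the cluster.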

AWS
TECHNICAL SUPPORT ENGINEER
answered 6 months ago
  • Thank you very much for the prompt response. I was able to submit the job with the suggested alternatives. Hope this issue gets fixed as soon as possible.
