1 Answer
Hello,
Sorry for the inconvenience. This is a known issue in the EMR Step console for the spark-application step type: the order of the arguments is changed when the job is submitted. The internal team is working on a fix, and you will be able to submit jobs as before once it is released. In the meantime, you can unblock yourself with either of the workarounds below:
- Submit the job through command-runner and pass the full spark-submit command as an argument:
  a. Choose "Custom JAR" in the Add step dialog
  b. Enter "command-runner.jar" as the JAR location
  c. In the Arguments field, place the entire spark-submit command (example: spark-submit --deploy-mode cluster --class org.apache.spark.examples.SparkPi s3:///samplejob/spark-examples.jar 10)
- Use the CLI and add the step with aws emr add-steps:
  aws emr add-steps --cluster-id <clusterid> --steps Type=Spark,Name="SparkPi",Args=[--deploy-mode,cluster,--class,org.apache.spark.examples.SparkPi,s3://<s3bucket>/samplejob/spark-examples.jar,10],ActionOnFailure=CONTINUE
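The first workaround can also be scripted through the CLI. The sketch below (using a hypothetical cluster ID, step name, and the same `<s3bucket>` placeholder as above) converts a spark-submit command into the comma-separated Args list that a command-runner.jar step expects; it only prints the resulting aws emr add-steps call rather than executing it:

```shell
#!/bin/sh
# Hypothetical values -- substitute your own cluster ID and S3 bucket.
CLUSTER_ID="j-XXXXXXXXXXXXX"
SPARK_CMD="spark-submit --deploy-mode cluster --class org.apache.spark.examples.SparkPi s3://<s3bucket>/samplejob/spark-examples.jar 10"

# add-steps takes Args as a comma-separated list, so replace the spaces.
ARGS=$(printf '%s' "$SPARK_CMD" | tr ' ' ',')

# Print the full CLI call; remove the echo to actually submit the step.
echo "aws emr add-steps --cluster-id $CLUSTER_ID --steps Type=CUSTOM_JAR,Name=SparkPiViaCommandRunner,Jar=command-runner.jar,Args=[$ARGS],ActionOnFailure=CONTINUE"
```

Note that none of the spark-submit arguments may themselves contain commas; if they do, the Args list needs shell quoting instead of a plain tr substitution.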
Thank you very much for the prompt response. I was able to submit the job with the suggested alternatives. I hope this issue will be fixed ASAP.