Hello,
Can you please confirm whether this (my.jar --cfg /local/path/config.cfg) is a Spark configuration? If so, you can pass that Spark configuration through spark-submit.
- The spark-submit script in Spark’s bin directory is used to launch applications on a cluster. [1]
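As a rough sketch, Spark properties are passed with --conf before the application jar, while anything after the jar is forwarded to the application itself as arguments (the class name and property values below are placeholders for illustration):

```shell
# Sketch only: MainClass and the property values are placeholders.
# Options before my.jar are consumed by spark-submit;
# "--cfg /local/path/config.cfg" after the jar goes to the application.
spark-submit \
  --class MainClass \
  --conf spark.executor.memory=2g \
  my.jar --cfg /local/path/config.cfg
```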
Also, please feel free to check out our AWS Integration and Automation GitHub page on EMR on EKS here:
https://github.com/aws-ia/terraform-aws-eks-blueprints/tree/main/examples/analytics/emr-on-eks
Furthermore, information regarding sparkSubmit, along with an example AWS CLI command for EMR Serverless, is available in our AWS docs here:
https://docs.aws.amazon.com/emr/latest/EMR-Serverless-UserGuide/jobs-spark.html
conf = An arbitrary Spark configuration property, which you provide as part of sparkSubmitParameters using --conf.
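For illustration, a sketch of an EMR Serverless job run that passes --conf properties through sparkSubmitParameters, following the pattern in the docs linked above (the application ID, role ARN, and S3 path are placeholders):

```shell
# Sketch only: application-id, execution-role-arn, and the S3 URI are placeholders.
aws emr-serverless start-job-run \
  --application-id <application-id> \
  --execution-role-arn <job-execution-role-arn> \
  --job-driver '{
    "sparkSubmit": {
      "entryPoint": "s3://amzn-s3-demo-bucket/scripts/my-script.py",
      "sparkSubmitParameters": "--conf spark.executor.memory=2g --conf spark.executor.cores=2"
    }
  }'
```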
Additionally, in regards to containers, you can pass config files to a container using ConfigMaps. A ConfigMap is an API object used to store non-confidential data in key-value pairs. Pods can consume ConfigMaps as environment variables, command-line arguments, or as configuration files in a volume. [2]
- A ConfigMap allows you to decouple environment-specific configuration from your container images, so that your applications are easily portable.
- Caution: ConfigMap does not provide secrecy or encryption.
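As a sketch modeled on the Kubernetes docs in [2], a ConfigMap holding a config file can be mounted into a Pod as a volume and then referenced as a command-line argument (the names, image, and mount path below are illustrative):

```yaml
# Illustrative only: names, image, and paths are placeholders.
apiVersion: v1
kind: ConfigMap
metadata:
  name: app-config
data:
  config.cfg: |
    key=value
---
apiVersion: v1
kind: Pod
metadata:
  name: app-pod
spec:
  containers:
    - name: app
      image: my-app:latest
      # The file mounted from the ConfigMap is passed as an app argument.
      args: ["--cfg", "/etc/app/config.cfg"]
      volumeMounts:
        - name: config-volume
          mountPath: /etc/app
  volumes:
    - name: config-volume
      configMap:
        name: app-config
```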
References:
[1] https://spark.apache.org/docs/latest/submitting-applications.html
[2] https://kubernetes.io/docs/concepts/configuration/configmap/