EMR notebook Spark not able to load imported External Jar


I am trying to use an EMR notebook in combination with a custom jar stored in S3.

I can easily include the jar in spark-shell by running:

spark-shell --jars <S3 jar path>

I want to achieve the same thing in an EMR Spark notebook. I have tried the following:

%%configure -f
{
    "conf": {
        "spark.jars": "<S3 jar path>"
    },
    "jars": ["<S3 jar path>"]
}

I can see that the jar is included in the Spark context, but when I then try to import a class from the jar, I get the following error:

<console>:23: error: object ... is not a member of package com
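
For what it's worth, the jar does show up when I list the jars registered with the session, using the SparkContext (sc) that the notebook provides:

sc.listJars().foreach(println)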

Does anyone have any idea how to use a custom jar in an EMR Spark notebook?

cxcxcx
Asked a year ago · 464 views

1 Answer

The configuration looks correct to me. I would suggest looking into your import to see whether there is an issue with the specific package you are importing; you may also need to add it to the classpath before the import. A rough way to check both possibilities is sketched below. I could find a similar issue reported here.
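
As a minimal sketch of that check, run in a notebook cell (com.example.MyClass, com/example/udf/MyUdf.class, and /tmp/my-lib.jar are illustrative placeholders, not names from your setup):

import java.util.jar.JarFile
import scala.collection.JavaConverters._

// Reflectively load a class from the jar.
// ClassNotFoundException => the jar never made it onto the driver
// classpath; success => the failing import likely names a package
// that does not match the jar's actual layout.
Class.forName("com.example.MyClass")

// List the classes the jar really contains, assuming a local copy
// (e.g. after: aws s3 cp <S3 jar path> /tmp/my-lib.jar).
new JarFile("/tmp/my-lib.jar").entries().asScala
  .map(_.getName)
  .filter(_.endsWith(".class"))
  .foreach(println)
// An entry like com/example/udf/MyUdf.class corresponds to
// import com.example.udf.MyUdf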

HTH

AWS
Kashif
Answered a year ago
