EMR notebook: Spark unable to load an imported external JAR


I am trying to use an EMR notebook in combination with a custom JAR stored in S3.

I can easily include the jar in spark-shell by running:

spark-shell --jars <S3 jar path>

I want to achieve the same thing in an EMR Spark notebook. I have tried the following:

%%configure -f
{
  "conf": {
    "spark.jars": "<S3 jar path>"
  },
  "jars": ["<S3 jar path>"]
}

I can see that the jar is included in the Spark context, but when I then try to import a class from the jar, I get the following error:

<console>:23: error: object ... is not a member of package com

Does anyone have an idea how to use a custom JAR in an EMR Spark notebook?

cxcxcx
Asked 1 year ago · 464 views

1 Answer

The configuration looks correct to me. I would suggest looking into your import to see whether there is an issue with importing the specific package; you may also need to add the JAR to the classpath before the import. I could find a similar issue reported here.
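As a sanity check, here is a minimal sketch of what that could look like in a Scala notebook cell. The package and class names below are placeholders, not taken from the question: first list the JARs the Spark context actually loaded, then import the class using its fully qualified name.

// List the JARs registered with the SparkContext; the S3 JAR should appear here.
sc.listJars().foreach(println)

// Import the class by its fully qualified name. com.example.myproject.MyClass
// is a placeholder; replace it with a class that actually exists in the JAR.
import com.example.myproject.MyClass

If the JAR shows up in sc.listJars() but the import still fails, the package path in the import statement likely does not match the package structure inside the JAR; running jar tf on the JAR file locally is one way to confirm the actual package names.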

HTH

AWS
Kashif
Answered 1 year ago
