EMR notebook Spark not able to load imported external JAR

I am trying to use an EMR notebook in combination with a custom JAR stored in S3.

I can easily include the jar in spark-shell by running:

spark-shell --jars <S3 jar path>

I want to achieve the same thing in an EMR Spark notebook. I have tried the solution below:

%%configure -f
{
  "conf": {
    "spark.jars": "<S3 jar path>"
  },
  "jars": ["<S3 jar path>"]
}
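
For reference, here is the same cell with an illustrative path filled in (s3://my-bucket/jars/my-lib.jar is just a placeholder, not my real path):

%%configure -f
{
  "conf": {
    "spark.jars": "s3://my-bucket/jars/my-lib.jar"
  },
  "jars": ["s3://my-bucket/jars/my-lib.jar"]
}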

I can see that the jar is included in the Spark context, but when I then try to import a class from the jar, I get the error below:

<console>:23: error: object ... is not a member of package com
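
For context, the jar does show up when I list what the context has registered (a sketch of the check; the printed entry is illustrative):

// Scala notebook cell: list the jars registered with the SparkContext
spark.sparkContext.listJars().foreach(println)
// the output should include an entry ending in the custom jar's name,
// e.g. .../jars/my-lib.jar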

Does anyone have any idea how to use a custom JAR in an EMR Spark notebook?

cxcxcx
Asked 1 year ago · Viewed 464 times
1 Answer
The configuration looks correct to me. I would suggest looking into your import and checking whether there is an issue with importing that specific package; you may also need to add the jar to the classpath before the import. I could find a similar issue reported here.
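
As a quick sanity check (a sketch; my-lib.jar and com.example.MyClass are placeholders, not names from your setup), confirm that the package path you import is exactly the one declared inside the jar:

// List the fully-qualified class names the jar actually ships
// (run on the master node or locally; the jar path is a placeholder):
//   jar tf my-lib.jar | grep '\.class$'
// Then import using exactly that package path in a notebook cell:
import com.example.MyClass  // placeholder; replace with a class the jar contains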

HTH

AWS
Kashif
Answered 1 year ago
