EMR notebook Spark not able to load imported External Jar


I am trying to use an EMR notebook in combination with a custom jar stored in S3.

I can easily include the jar in spark-shell by running:

spark-shell --jars <S3 jar path>

I want to achieve the same thing in an EMR Spark notebook. I have tried the solution below:

%%configure -f
{
  "conf": {
    "spark.jars": "<S3 jar path>"
  },
  "jars": ["<S3 jar path>"]
}

I can see that the jar is included in the Spark context, but when I then try to import a class from the jar, I get the error below:

<console>:23: error: object ... is not a member of package com

Does anyone have any idea how to use a custom jar in an EMR Spark notebook?

cxcxcx
posted a year ago · 464 views
1 Answer

The configuration looks correct to me. I would suggest looking into your import statement to see whether there is any issue with importing that specific package; you may also need to add the jar to the classpath before importing it. I could find a similar issue reported here.
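
As a rough sketch of what this might look like (an assumption, not a verified fix: the jar path s3://my-bucket/libs/my-lib.jar, the package com.example.udf, and the class MyCustomClass are all hypothetical placeholders). The %%configure cell must be the very first cell run, before the Spark session is created:

%%configure -f
{
  "conf": {
    "spark.jars": "s3://my-bucket/libs/my-lib.jar"
  },
  "jars": ["s3://my-bucket/libs/my-lib.jar"]
}

Then, in a later Scala cell:

// Confirm the jar was actually registered with the Spark context
sc.listJars().foreach(println)

// Import a class from the jar; the package in the import must match the
// jar's actual package structure exactly, otherwise you get the
// "object ... is not a member of package com" error from the question
import com.example.udf.MyCustomClass

If sc.listJars() does not show the jar, the %%configure cell was likely run after the session had already started, and the session needs to be restarted.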

HTH

AWS
Kashif
answered a year ago
