Unable to create Spark context in JupyterHub


When creating the SparkContext, it keeps failing with a fatal error even though I have restarted the kernel multiple times. As I am a data analyst, I am not sure what else to check in terms of resources.

Sri
asked 8 months ago, 214 views
1 Answer
Accepted Answer

Hello,

Since you are getting a fatal error when creating the SparkContext, the issue is most likely in acquiring cluster resources such as YARN memory or CPU. Please check the current memory and CPU availability in CloudWatch and in the YARN ResourceManager UI/logs. If applications are already running and have taken over the cluster's resources, you may not be able to start a new application from the Jupyter notebook.

Also check the HDFS utilization and the MR healthy/unhealthy node counts in the CloudWatch metrics. If no nodes are available to accept new containers, a new application may fail to initialize.

Based on the resource checks above, add more nodes, enable auto scaling, or attach additional EBS volumes if you find that the EMR nodes are resource constrained. Here is the EMR best practices guide for reference.
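As a quick way to run those checks, here is a minimal Python sketch. It is an illustration under assumptions, not a definitive script: it assumes boto3 credentials are configured where you run it and that the YARN ResourceManager on the master node is reachable from your machine. CLUSTER_ID and MASTER_DNS are hypothetical placeholders for your own cluster ID and master node hostname.

```python
# Minimal sketch of the resource checks above.
# Assumptions: boto3 credentials are configured; the EMR master node's
# ResourceManager (port 8088) is reachable; placeholders are filled in.
import boto3
import requests
from datetime import datetime, timedelta, timezone

CLUSTER_ID = "j-XXXXXXXXXXXX"     # placeholder: your EMR cluster (job flow) ID
MASTER_DNS = "<master-node-dns>"  # placeholder: your EMR master node hostname

# 1) CloudWatch: EMR publishes cluster metrics under AWS/ElasticMapReduce.
cw = boto3.client("cloudwatch")
end = datetime.now(timezone.utc)
start = end - timedelta(hours=1)

for metric in ("YARNMemoryAvailablePercentage", "MRUnhealthyNodes", "HDFSUtilization"):
    resp = cw.get_metric_statistics(
        Namespace="AWS/ElasticMapReduce",
        MetricName=metric,
        Dimensions=[{"Name": "JobFlowId", "Value": CLUSTER_ID}],
        StartTime=start,
        EndTime=end,
        Period=300,                # 5-minute buckets
        Statistics=["Average"],
    )
    points = sorted(resp["Datapoints"], key=lambda p: p["Timestamp"])
    latest = f"{points[-1]['Average']:.1f}" if points else "no data"
    print(f"{metric}: {latest}")

# 2) YARN ResourceManager REST API: shows free memory/vcores and whether
#    already-running applications are holding the cluster's resources.
m = requests.get(f"http://{MASTER_DNS}:8088/ws/v1/cluster/metrics").json()["clusterMetrics"]
print(f"available: {m['availableMB']} MB / {m['availableVirtualCores']} vcores")
print(f"apps running: {m['appsRunning']}, pending: {m['appsPending']}")
print(f"unhealthy nodes: {m['unhealthyNodes']}")
```

If YARNMemoryAvailablePercentage is close to zero, or appsPending keeps growing in the YARN metrics, the cluster is saturated, which would explain why a new SparkContext fails to start.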

AWS
SUPPORT ENGINEER
answered 8 months ago
  • Thanks a ton!!! this is helpful
