Unable to create Spark context in JupyterHub


When creating the SparkContext, it keeps failing with a fatal error even though I have restarted the kernel multiple times. As I am a data analyst, I am not sure what else to check in terms of resources.

Sri
asked 8 months ago · 214 views
1 Answer
Accepted Answer

Hello,

As you are getting a fatal error when creating the SparkContext, the issue is likely in acquiring YARN resources such as memory or CPU. A few things to check:

- Check the current memory and CPU availability in CloudWatch and in the YARN ResourceManager UI/logs (a sketch for querying the ResourceManager API follows below). If applications are already running that have taken over the cluster resources, you will not be able to create a new application from the Jupyter notebook.
- Check HDFS utilization and the MR healthy-node status in the CloudWatch metrics. If no nodes are available to take the containers, a new application may fail to initiate.
- Based on the resource checks above, add more nodes, use auto scaling, or attach additional EBS volumes if you find a resource constraint on the EMR nodes.

Here is the EMR best practices guide for reference.
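To check YARN memory, vCore, and node availability without leaving the notebook, you can query the ResourceManager's cluster-metrics REST endpoint. A minimal sketch, assuming the ResourceManager is reachable on the EMR master node at the default port 8088 (the hostname placeholder is yours to fill in):

```python
# Sketch: read cluster-wide resource availability from the YARN RM REST API.
# The host below is a placeholder for your EMR master node's DNS name.
import requests

RM_URL = "http://<emr-master-node>:8088/ws/v1/cluster/metrics"

resp = requests.get(RM_URL, timeout=10)
resp.raise_for_status()
m = resp.json()["clusterMetrics"]

print(f"Memory : {m['availableMB']} MB free of {m['totalMB']} MB")
print(f"vCores : {m['availableVirtualCores']} free of {m['totalVirtualCores']}")
print(f"Nodes  : {m['activeNodes']} active, {m['unhealthyNodes']} unhealthy")
print(f"Apps   : {m['appsRunning']} running, {m['appsPending']} pending")
```

If `availableMB` or `availableVirtualCores` is near zero, or `unhealthyNodes` is non-zero, the SparkContext is failing because YARN cannot grant it containers, not because of anything in the notebook itself.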
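If some capacity is free but smaller than the default container requests, you can ask for smaller containers when creating the SparkContext. A minimal sketch, assuming your notebook kernel creates the SparkContext directly (kernels that go through Livy/sparkmagic set these values with the `%%configure` magic instead); all memory and core values here are illustrative:

```python
# Sketch: request modest executor resources so the application fits
# the capacity YARN currently has available.
from pyspark import SparkConf, SparkContext

conf = (
    SparkConf()
    .setAppName("jupyterhub-test")                    # hypothetical app name
    .set("spark.executor.memory", "2g")               # smaller executor heap
    .set("spark.executor.cores", "2")
    .set("spark.dynamicAllocation.enabled", "false")  # fixed, predictable ask
    .set("spark.executor.instances", "2")             # only 2 containers
)

sc = SparkContext(conf=conf)
print(sc.applicationId)  # if this prints, YARN granted the containers
```

If even a small request like this fails, the cluster genuinely has no headroom and you are back to the remediation options above: more nodes, auto scaling, or additional EBS volumes.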

AWS
SUPPORT ENGINEER
answered 8 months ago
  • Thanks a ton! This is helpful.
