Unable to create Spark context in JupyterHub


When creating the SparkContext, it keeps failing with a fatal error even though I have restarted the kernel multiple times. As I am a data analyst, I am not sure what else to check in terms of resources.

Sri
asked 8 months ago · 214 views
1 answer
Accepted Answer

Hello, since you are getting a fatal error when creating the SparkContext, the issue is likely in acquiring resources such as YARN memory or CPU. Please check the current memory and CPU availability in CloudWatch and in the YARN ResourceManager UI/logs. If applications that are already running have taken over the cluster resources, you may not be able to create a new application from the Jupyter notebook.

Also check the HDFS utilization and the MR healthy-node status in the CloudWatch metrics. If no nodes are available to take the containers, a new application may fail to initialize.

Based on the resource checks above, if you find a resource constraint on the EMR nodes, add more nodes, use auto scaling, or attach additional EBS volumes. Here is the EMR best practice for reference.
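As a quick way to run the checks above, a minimal sketch like the following might help. It assumes boto3 and requests are installed; the cluster ID, region, and ResourceManager address are hypothetical placeholders you would replace, and the YARN REST call assumes it runs on the master node (or through an SSH tunnel to port 8088):

```python
from datetime import datetime, timedelta, timezone

import boto3
import requests

CLUSTER_ID = "j-XXXXXXXXXXXXX"    # placeholder -- your EMR cluster ID
REGION = "us-east-1"              # placeholder -- your cluster's region
RM_URL = "http://localhost:8088"  # assumes this runs on the master node

# 1. EMR cluster metrics published to CloudWatch (AWS/ElasticMapReduce
#    namespace, keyed by the JobFlowId dimension).
cloudwatch = boto3.client("cloudwatch", region_name=REGION)
end = datetime.now(timezone.utc)
start = end - timedelta(hours=1)

for metric in ("YARNMemoryAvailablePercentage",
               "MRUnhealthyNodes",
               "HDFSUtilization"):
    resp = cloudwatch.get_metric_statistics(
        Namespace="AWS/ElasticMapReduce",
        MetricName=metric,
        Dimensions=[{"Name": "JobFlowId", "Value": CLUSTER_ID}],
        StartTime=start,
        EndTime=end,
        Period=300,                # EMR emits cluster metrics every 5 minutes
        Statistics=["Average"],
    )
    points = sorted(resp["Datapoints"], key=lambda p: p["Timestamp"])
    latest = f"{points[-1]['Average']:.1f}" if points else "no data"
    print(f"{metric}: {latest}")

# 2. YARN ResourceManager REST API: what is free for new containers right now.
m = requests.get(f"{RM_URL}/ws/v1/cluster/metrics", timeout=10).json()["clusterMetrics"]
print(f"YARN available: {m['availableMB']} MB, {m['availableVirtualCores']} vcores, "
      f"{m['appsPending']} apps pending")
```

If YARNMemoryAvailablePercentage is near zero, appsPending keeps growing, or MRUnhealthyNodes is non-zero, the SparkContext failure is a cluster resource shortage rather than a notebook problem, and the scaling steps above apply.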

AWS
SUPPORT ENGINEER
answered 8 months ago
  • Thanks a ton!!! This is helpful.
