Unable to launch the Spark history server and view the Spark UI using Docker


I am following the steps in the AWS documentation below to launch the Spark history server locally and view the Spark UI, but I am getting an error when starting the container:

https://docs.aws.amazon.com/glue/latest/dg/monitor-spark-ui-history.html

Did anyone face the same issue? Please help.
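For reference, the docker run command is roughly the one from the guide (a sketch of my setup: glue/sparkui:latest is the tag I built from the guide's Dockerfile, and LOG_DIR plus the AWS credential variables are placeholders for my own values):

# run the Spark history server container, pointing it at the event log location in S3
docker run -itd \
  -e SPARK_HISTORY_OPTS="$SPARK_HISTORY_OPTS \
    -Dspark.history.fs.logDirectory=$LOG_DIR \
    -Dspark.hadoop.fs.s3a.access.key=$AWS_ACCESS_KEY_ID \
    -Dspark.hadoop.fs.s3a.secret.key=$AWS_SECRET_ACCESS_KEY" \
  -p 18080:18080 \
  glue/sparkui:latest \
  "/opt/spark/bin/spark-class org.apache.spark.deploy.history.HistoryServer"

This is the output from the container: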

2023-02-22 17:54:07 Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2023-02-22 17:54:07 23/02/22 22:54:07 INFO HistoryServer: Started daemon with process name: 1@514d84090bb7
2023-02-22 17:54:07 23/02/22 22:54:07 INFO SignalUtils: Registering signal handler for TERM
2023-02-22 17:54:07 23/02/22 22:54:07 INFO SignalUtils: Registering signal handler for HUP
2023-02-22 17:54:07 23/02/22 22:54:07 INFO SignalUtils: Registering signal handler for INT
2023-02-22 17:54:07 23/02/22 22:54:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-02-22 17:54:08 23/02/22 22:54:08 INFO SecurityManager: Changing view acls to: root
2023-02-22 17:54:08 23/02/22 22:54:08 INFO SecurityManager: Changing modify acls to: root
2023-02-22 17:54:08 23/02/22 22:54:08 INFO SecurityManager: Changing view acls groups to: 
2023-02-22 17:54:08 23/02/22 22:54:08 INFO SecurityManager: Changing modify acls groups to: 
2023-02-22 17:54:08 23/02/22 22:54:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2023-02-22 17:54:08 23/02/22 22:54:08 INFO FsHistoryProvider: History server ui acls disabled; users with admin permissions: ; groups with admin permissions: 
2023-02-22 17:54:08 Exception in thread "main" java.lang.reflect.InvocationTargetException
2023-02-22 17:54:08 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2023-02-22 17:54:08 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2023-02-22 17:54:08 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2023-02-22 17:54:08 at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2023-02-22 17:54:08 at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:300)
2023-02-22 17:54:08 at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
2023-02-22 17:54:08 Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3"
2023-02-22 17:54:08 at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3281)
2023-02-22 17:54:08 at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3301)
2023-02-22 17:54:08 at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
2023-02-22 17:54:08 at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
2023-02-22 17:54:08 at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
2023-02-22 17:54:08 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
2023-02-22 17:54:08 at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
2023-02-22 17:54:08 at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:116)
2023-02-22 17:54:08 at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:88)
2023-02-22 17:54:08 ... 6 more
1 Answer
Accepted Answer

Hello all, I found the solution for this.

I was getting the issue because I was passing

-Dspark.history.fs.logDirectory=s3://... instead of an s3a:// URI.
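For anyone hitting the same error, the corrected option looks like this (a sketch; the bucket and prefix are placeholders for your own event log location):

-Dspark.history.fs.logDirectory=s3a://your-bucket/spark-event-logs/

The s3a:// scheme is handled by org.apache.hadoop.fs.s3a.S3AFileSystem from the hadoop-aws connector bundled in the image, while no FileSystem implementation is registered for the plain s3 scheme, which is why the history server fails with "No FileSystem for scheme s3".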

