While running a Hive query using "insert overwrite ... select", we encountered an error related to the sanitization of an XML document destined for the ListObjectsV2Handler class. Here is a snippet of the stack trace:
at org.apache.hadoop.fs.s3a.S3ObjectsListingV2.listNextBatchOfObjects(S3ObjectsListing.java:98)
at org.apache.hadoop.fs.s3a.S3AFileSystem.listPrefixV2(S3AFileSystem.java:1848)
at org.apache.hadoop.fs.s3a.S3AFileSystem.listPrefix(S3AFileSystem.java:1679)
at org.apache.hadoop.fs.s3a.S3AFileSystem.listStatus(S3AFileSystem.java:1912)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1531)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1571)
Caused by: java.lang.OutOfMemoryError: Java heap space
We have already tried increasing the memory and the number of nodes to increase the heap size, but it didn't work.
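For reference, this is roughly the kind of heap increase we tried (these are the standard Hive/MapReduce memory properties; the values shown are only illustrative, not our exact settings):

```
-- run in the Hive session before the failing INSERT OVERWRITE ... SELECT
SET mapreduce.map.memory.mb=8192;          -- container size for map tasks
SET mapreduce.map.java.opts=-Xmx6554m;     -- JVM heap inside the map container
SET mapreduce.reduce.memory.mb=8192;       -- container size for reduce tasks
SET mapreduce.reduce.java.opts=-Xmx6554m;  -- JVM heap inside the reduce container
```

Even with these raised well above the defaults, the same OutOfMemoryError occurs.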
Please suggest a solution to fix this.
Thank you for the swift response. As mentioned in my comment, we have already attempted to increase the heap space, as recommended. Unfortunately, this adjustment did not resolve the issue. I would appreciate your attention to the specific error message: "Failed to sanitize XML document destined for handler class com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser$ListObjectsV2Handler." The error occurs while the ListObjectsV2Handler is parsing the response, which suggests a potential issue with the ListObjectsV2 call itself rather than with the query. Could you offer any insights or suggestions in this context?
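In case it helps narrow things down: since the failure happens while parsing the ListObjectsV2 XML response, we are considering lowering the S3A listing page size so that each response the handler has to parse is smaller. fs.s3a.paging.maximum is the standard Hadoop S3A property for this; the value below is just a guess on our part, not a verified fix:

```
<!-- core-site.xml: request fewer keys per ListObjectsV2 page (Hadoop's default is 5000) -->
<property>
  <name>fs.s3a.paging.maximum</name>
  <value>1000</value>
</property>
```

Would reducing the page size like this be a reasonable direction, or does the error point somewhere else entirely?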