Redshift varchar(max) not enough to store json data type column from Postgres


Redshift's varchar(max) is not large enough to store a JSON column coming from Postgres. Which data type should I use for this column? I am using Glue to perform the ETL and load into Redshift; Postgres has a JSON data type, but in Redshift the values exceed even varchar(max), and the load fails with the following error in the STL_LOAD_ERRORS table: "String length exceeds DDL length".
Is there a workaround or a direct solution? The SUPER data type is not supported in Glue, so I can't map the column to it.
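
A workaround that sometimes comes up when the Glue mapping can't target SUPER directly is to stage the data in S3 and let a Redshift COPY load it into a table whose JSON column is declared SUPER. The sketch below is only illustrative: every bucket, table, cluster, role and column name is a placeholder, and it assumes COPY's SERIALIZETOJSON option for Parquet sources is acceptable for the data.

```python
import boto3
from pyspark.context import SparkContext
from awsglue.context import GlueContext

glue_ctx = GlueContext(SparkContext.getOrCreate())

# Hypothetical source: the Postgres table as crawled into the Glue catalog.
source_dyf = glue_ctx.create_dynamic_frame.from_catalog(
    database="postgres_db", table_name="public_events"
)

# 1) Stage the rows (including the large JSON column) in S3 as Parquet
#    instead of mapping the column straight to a Redshift varchar.
glue_ctx.write_dynamic_frame.from_options(
    frame=source_dyf,
    connection_type="s3",
    connection_options={"path": "s3://my-staging-bucket/events_stage/"},
    format="parquet",
)

# 2) Load the staged files into a target table whose JSON column is SUPER.
#    SERIALIZETOJSON asks COPY to serialize Parquet values into SUPER.
boto3.client("redshift-data").execute_statement(
    ClusterIdentifier="my-cluster",
    Database="dev",
    DbUser="etl_user",
    Sql="""
        COPY target_schema.events
        FROM 's3://my-staging-bucket/events_stage/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/my-copy-role'
        FORMAT AS PARQUET SERIALIZETOJSON;
    """,
)
```

If the payload arrives in Parquet as a plain string rather than a nested type, it may land in the SUPER column as a single string value rather than a navigable structure, but that is still enough to get past the varchar(max) limit.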

2 Answers

Hi,

In Redshift, we have introduced the SUPER data type for storing JSON data.

However, the SUPER data type only supports up to 1MB of data for an individual SUPER field or object. For more information, see Ingesting and querying semistructured data in Amazon Redshift.
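
If it helps, here is a rough PySpark check to spot rows that would exceed that per-value limit before loading; the names are assumptions (a DataFrame df with the JSON column payload), and character length is used only as a proxy for the serialized byte size.

```python
from pyspark.sql import functions as F

ONE_MB = 1024 * 1024  # SUPER's documented per-value limit

# Flag rows whose JSON string is longer than roughly 1 MB.
oversized = (
    df.withColumn("payload_len", F.length(F.col("payload").cast("string")))
      .filter(F.col("payload_len") > ONE_MB)
)
print(oversized.count(), "rows would exceed the 1 MB SUPER limit")
```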

Hope this helps,

AWS
Expert
answered 2 years ago

We are encountering a similar issue: we're using the "super" data type, which has a maximum length of 65K, but the column in the Parquet file we receive has a maximum length of 192K. How should we handle this data? Are there alternative data types we can use to accommodate such large values?
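
One pattern that can work when individual values are simply too large for any Redshift type is to keep the oversized payloads in S3 and load only a pointer into the table. A rough PySpark sketch, with all bucket and column names hypothetical:

```python
from pyspark.sql import functions as F

MAX_CHARS = 65535  # varchar(max) limit we are trying to stay under

# Hypothetical input: df has an "id" column and a large "payload" column.
small = df.filter(F.length("payload") <= MAX_CHARS)
large = df.filter(F.length("payload") > MAX_CHARS)

# Oversized payloads are written to an S3 overflow prefix, partitioned by id ...
large.select("id", "payload").write.mode("overwrite").partitionBy("id").parquet(
    "s3://my-bucket/payload-overflow/"
)

# ... and the row loaded into Redshift keeps only a reference to that location.
pointers = large.withColumn(
    "payload",
    F.concat(
        F.lit("s3://my-bucket/payload-overflow/id="),
        F.col("id").cast("string"),
        F.lit("/"),
    ),
)
to_redshift = small.unionByName(pointers)
```

A downstream consumer then fetches the full document from S3 only when it actually needs it.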

msve
answered a month ago
