Questions tagged with Amazon Redshift

Import tables with large data columns from Redshift federated schema into local tables

Hi,

We have Aurora PostgreSQL databases federated into Redshift and need to copy some of their tables into local Redshift schemas. Some of the Aurora tables have large data columns, including integer arrays. When we try to import one of those tables with a query like this:

```
CREATE TABLE local_schema.accounts AS (
    SELECT tasks
    FROM federated_schema.accounts
);
```

we get this error:

```
[XX000] ERROR: Value of VARCHAR type is too long.
Detail:
-----------------------------------------------
error:    Value of VARCHAR type is too long.
code:     25101
context:  Received VARCHAR len=128599 max=65535. This could be due to multibyte characters exceeding the column size or the value in remote database is larger than the max allowed length (65535) of varchar column in Redshift.
query:    865164
location: federation_fetchers.hpp:146
process:  query12_495_865164 [pid=12824]
-----------------------------------------------
```

The error does not change even if we cast the column, for example with `CAST(tasks AS SUPER)`.

According to https://docs.amazonaws.cn/en_us/redshift/latest/dg/federated-data-types.html, such columns are converted to VARCHAR(64K). How can we handle columns whose values are longer than 64K characters?

This post suggests creating a view on the PostgreSQL side, but we do not have access to the PostgreSQL database: https://repost.aws/questions/QUM_FcRt0eR3uqhTJ6ZDDUJQ/redshift-federated-query-rds-postgres-long-varchar-value-error-code-25101

Is there a way to handle this issue from the Redshift side? Any help is much appreciated!
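For completeness, here is roughly what the cast attempt looked like; the schema, table, and column names are the same placeholders as in the query above, and the comment reflects our assumption about why the cast does not help:

```
-- Sketch of the cast attempt mentioned above (same placeholder names).
-- The same 25101 error is raised, presumably because the federated scan
-- still materializes the remote column as VARCHAR(65535) before the cast
-- to SUPER can be applied.
CREATE TABLE local_schema.accounts AS (
    SELECT CAST(tasks AS SUPER) AS tasks
    FROM federated_schema.accounts
);
```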
1 answer · 0 votes · 21 views

vida · asked 2 months ago