Running SQL query from AWS Glue to BigQuery


I am trying to write data from S3 to BigQuery using Glue, and that part works perfectly fine (I have used the Glue BigQuery connector from the Marketplace). I am able to write the data to the table in BigQuery, but when I try to use a prequery/postquery to run a DELETE SQL statement on the BigQuery table, it does not run. How can I perform the deletion I need on the BigQuery table? Is there a way to do this, or am I making a mistake in how I add the prequery? Below is the code that writes the dynamic frame to BigQuery, where I am adding the prequery as an option:

prequery = f"""DELETE FROM fynd-1088.fynd_warehouse_pre.rds_subscriber_config_dms WHERE id in ({list_id})"""
preact = False if deletedf_count == 0 else True

writedf = glueContext.write_dynamic_frame.from_options(
    frame=target_df,
    connection_type="marketplace.spark",
    connection_options={
        "parentProject": "parentProject",
        "table": "parentProject.dataset_name.table_name",
        "temporaryGcsBucket": "bucket_name",
        "connectionName": "new_bq",
        "prequery": prequery,
        "addPreAction": preact,
    },
    transformation_ctx="writedf",
)

1 Answer

Hi,

To my knowledge, the BigQuery connector does not support the prequery option. The options it does support are listed in the connector documentation here: https://github.com/GoogleCloudDataproc/spark-bigquery-connector/tree/0.22.0 .
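For reference, a write that sticks to options the connector documents might look like the sketch below. It is based on the snippet in your question (project, dataset, table, and bucket names are the same placeholders), with the unsupported prequery/addPreAction options simply dropped:

writedf = glueContext.write_dynamic_frame.from_options(
    frame=target_df,
    connection_type="marketplace.spark",
    connection_options={
        "parentProject": "parentProject",                   # GCP project used for billing
        "table": "parentProject.dataset_name.table_name",   # target BigQuery table
        "temporaryGcsBucket": "bucket_name",                 # GCS staging bucket for the load
        "connectionName": "new_bq",                          # Glue connection for the Marketplace connector
    },
    transformation_ctx="writedf",
)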

The only connector that supports pre and post queries is the Redshift connector, as you mentioned in your comment.
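If the DELETE has to run as part of the same Glue job, one workaround is to issue it directly against BigQuery with the google-cloud-bigquery client library before calling write_dynamic_frame. This is only a sketch under some assumptions: the library is available to the job (for example via the --additional-python-modules job parameter), the job has GCP credentials it can use, and the function and parameter names below are hypothetical.

from google.cloud import bigquery

def delete_ids_from_bq(project_id, table_fqn, id_list):
    """Delete the given ids from a BigQuery table before writing new rows."""
    client = bigquery.Client(project=project_id)
    # Parameterized query: avoids building the IN (...) list by string concatenation.
    query = f"DELETE FROM `{table_fqn}` WHERE id IN UNNEST(@ids)"
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ArrayQueryParameter("ids", "INT64", id_list)]
    )
    client.query(query, job_config=job_config).result()  # wait for the DELETE to finish

# Hypothetical usage before the write_dynamic_frame call:
# if deletedf_count > 0:
#     delete_ids_from_bq("parentProject", "parentProject.dataset_name.table_name", list_id)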

Hope this helps,

AWS
EXPERT
answered 2 years ago
