Running SQL query from AWS Glue to BigQuery


I am trying to write data from S3 to BigQuery using Glue, and that part works fine (**I have used the Glue BQ connector from the Marketplace**). I can write the data to the table in BigQuery, but when I try to use prequery/postquery to run a DELETE SQL statement on the BigQuery table, it does not run. How can I perform the deletion I need on the BigQuery table? Is there any way to do this, or am I making a mistake in how I add the prequery? Below is the code that writes the dynamic frame to BigQuery, with the prequery passed as an option:

prequery = f"""DELETE FROM fynd-1088.fynd_warehouse_pre.rds_subscriber_config_dms WHERE id in ({list_id})"""
preact = False if deletedf_count == 0 else True

writedf = glueContext.write_dynamic_frame.from_options(
    frame=target_df,
    connection_type="marketplace.spark",
    connection_options={
        "parentProject": "parentProject",
        "table": "parentProject.dataset_name.table_name",
        "temporaryGcsBucket": "bucket_name",
        "connectionName": "new_bq",
        "prequery": prequery,
        "addPreAction": preact,
    },
    transformation_ctx="writedf",
)

Asked 2 years ago · Viewed 826 times
1 Answer

Hi,

To my knowledge, the BigQuery connector does not support a prequery option. The options it does support are listed in the connector documentation: https://github.com/GoogleCloudDataproc/spark-bigquery-connector/tree/0.22.0 .
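Since the connector will not run the DELETE for you, one workaround is to execute it directly against BigQuery from within the Glue job, before calling write_dynamic_frame, using the google-cloud-bigquery client library (you can make it available to the job with the --additional-python-modules job parameter). Below is a minimal sketch, assuming credentials are reachable through Application Default Credentials (for example the GOOGLE_APPLICATION_CREDENTIALS environment variable); list_id and deletedf_count stand in for the values your job already computes:

```python
from google.cloud import bigquery

# Placeholders: reuse the values your job already computes.
list_id = "1, 2, 3"   # comma-separated ids to delete
deletedf_count = 3    # number of rows flagged for deletion

# Picks up credentials via Application Default Credentials,
# e.g. the GOOGLE_APPLICATION_CREDENTIALS environment variable.
client = bigquery.Client(project="fynd-1088")

delete_sql = f"""
    DELETE FROM `fynd-1088.fynd_warehouse_pre.rds_subscriber_config_dms`
    WHERE id IN ({list_id})
"""

if deletedf_count > 0:
    # result() blocks until the query job finishes, so the delete
    # completes before the subsequent Glue write starts.
    client.query(delete_sql).result()
```

After this completes, run your existing write_dynamic_frame.from_options call without the prequery and addPreAction options.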

As you mentioned in your comment, the only Glue target that supports pre- and post-query actions out of the box is the Redshift connector.
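For comparison, this is roughly what a pre-action looks like against a Redshift target, where Glue supports it natively through the preactions connection option. A sketch only; the connection name, database, table, and temp bucket below are placeholders:

```python
# Hypothetical Redshift write with a pre-action; all names are placeholders.
writedf = glueContext.write_dynamic_frame.from_jdbc_conf(
    frame=target_df,
    catalog_connection="my_redshift_connection",  # Glue connection to Redshift
    connection_options={
        "database": "dev",
        "dbtable": "public.subscriber_config",
        # SQL that Redshift executes before the data is loaded:
        "preactions": f"DELETE FROM public.subscriber_config WHERE id IN ({list_id});",
    },
    redshift_tmp_dir="s3://bucket_name/tmp/",     # staging area for the load
    transformation_ctx="writedf",
)
```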

Hope this helps,

AWS
Expert
Answered 2 years ago
