Running SQL query from AWS Glue to BigQuery


I am writing data from S3 to BigQuery using Glue, and that part works fine (I used the Glue BigQuery connector from Marketplace). I can write the data to the table in BigQuery, but when I try to use a prequery/postquery to run a DELETE statement on the BigQuery table, it does not run. How can I perform the deletion I need on the BigQuery table? Is there a way to do this, or am I making a mistake when adding the prequery? Below is the code that writes the dynamic_frame to BigQuery, where I add the prequery as an option:

prequery = f"""DELETE FROM fynd-1088.fynd_warehouse_pre.rds_subscriber_config_dms WHERE id in ({list_id})"""
preact = False if deletedf_count == 0 else True

writedf = glueContext.write_dynamic_frame.from_options(
    frame=target_df,
    connection_type="marketplace.spark",
    connection_options={
        "parentProject": "parentProject",
        "table": "parentProject.dataset_name.table_name",
        "temporaryGcsBucket": "bucket_name",
        "connectionName": "new_bq",
        "prequery": prequery,
        "addPreAction": preact,
    },
    transformation_ctx="writedf",
)

1 Answer

Hi,

To my knowledge, the BigQuery connector does not support the prequery option. The options it does support are listed in the connector documentation: https://github.com/GoogleCloudDataproc/spark-bigquery-connector/tree/0.22.0

The only connector that supports pre- and post-queries is the Redshift connector, as you mentioned in your comment.
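Since the connector will not run the DELETE for you, one workaround is to issue it yourself with the `google-cloud-bigquery` client from inside the Glue job, before calling `write_dynamic_frame`. This is a minimal sketch, not the connector's API: the `build_delete_query`/`run_delete` helpers are hypothetical names, the table identifier is illustrative, and it assumes the `google-cloud-bigquery` package has been shipped to the job (e.g. via the `--additional-python-modules` job parameter) along with valid GCP credentials:

```python
# Sketch: run the DELETE directly against BigQuery before writing the frame,
# instead of relying on the unsupported "prequery" connector option.

def build_delete_query(table, ids):
    """Build the DELETE statement that 'prequery' was meant to run."""
    id_list = ", ".join(str(i) for i in ids)
    return f"DELETE FROM `{table}` WHERE id IN ({id_list})"

def run_delete(table, ids):
    # Imported lazily so the rest of the job does not need the package
    # unless a delete is actually required.
    from google.cloud import bigquery

    client = bigquery.Client()
    # query() submits the job; result() blocks until the DELETE completes,
    # so the subsequent write_dynamic_frame call sees the cleaned table.
    client.query(build_delete_query(table, ids)).result()
```

You would call `run_delete(...)` only when `deletedf_count > 0`, mirroring the `preact` flag in the question, and then perform the write with the connector as before.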

Hope this helps,

AWS
Expert
Answered 2 years ago
