Using Pandas in a Glue ETL Job (how to convert a DynamicFrame or PySpark DataFrame to a Pandas DataFrame)


I want to use Pandas in a Glue ETL job. I am reading from S3 and writing to the Data Catalog. I am looking for a basic example where I read from S3 into (or convert to) a Pandas DataFrame, do my manipulations, and then write out to the Data Catalog. It looks like I may need to convert to a DynamicFrame before writing to the Data Catalog. Any examples? I do my ETL today using PySpark but would like to do most of my transformations in Pandas.

bfeeny
Asked 2 years ago · 9,838 views
1 Answer
Accepted Answer

I would suggest converting the DynamicFrame to a Spark DataFrame using the .toDF() method, and then converting the Spark DataFrame to a Pandas DataFrame with .toPandas(), as described here: https://sparkbyexamples.com/pyspark/convert-pyspark-dataframe-to-pandas/
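
As a rough sketch of how the pieces fit together in a Glue job (the S3 path, format, database, and table names below are placeholders you would replace with your own):

```python
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from S3 into a DynamicFrame (path and format are placeholders)
dyf = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://my-bucket/input/"]},
    format="csv",
    format_options={"withHeader": True},
)

# DynamicFrame -> Spark DataFrame -> Pandas DataFrame
pandas_df = dyf.toDF().toPandas()

# ... do your Pandas transformations here ...

# Pandas DataFrame -> Spark DataFrame -> DynamicFrame
result_dyf = DynamicFrame.fromDF(
    spark.createDataFrame(pandas_df), glue_context, "result_dyf"
)

# Write out via the Data Catalog (database/table names are placeholders)
glue_context.write_dynamic_frame.from_catalog(
    frame=result_dyf,
    database="my_database",
    table_name="my_table",
)

job.commit()
```

Keep in mind that .toPandas() collects the entire dataset onto the Glue driver, so this approach is only practical when the data (or a filtered subset of it) fits in driver memory.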

NishAWS (AWS)
Answered 2 years ago
