1 Answer
A DynamicFrame doesn't use the catalog; it infers the schema from the actual data files.
A DataFrame does use the catalog, and since you are converting to one anyway, you can just do:
usage_df = spark.table("source_db.source_tbl")
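For context, a minimal sketch of the two read paths inside a Glue job (the database and table names come from the question; the GlueContext setup is the standard Glue boilerplate and assumes the script runs in a Glue environment, not locally):

```python
from awsglue.context import GlueContext
from pyspark.context import SparkContext

sc = SparkContext.getOrCreate()
glue_context = GlueContext(sc)
spark = glue_context.spark_session

# DynamicFrame path: schema is inferred from the underlying data files,
# so a stale or mismatched catalog definition is ignored.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="source_db",
    table_name="source_tbl",
)

# DataFrame path: spark.table reads the schema from the Glue Data Catalog
# (which backs the Spark metastore in Glue jobs).
usage_df = spark.table("source_db.source_tbl")
```

Note that spark.table takes a single dotted "database.table" string, not two separate arguments.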
Thanks a lot. It worked.
usage_df = spark.table("source_db.source_tbl")
@Gonzalo Herreros, can you share details on fixing it and then going back to using create_dynamic_frame.from_catalog or create_data_frame.from_catalog? Or is the expectation to use only spark.table once we have updated the schema?