1 Answer
Hi,
If you do not want to use S3 in the RDS export/import process, you could use the EFS integration with RDS to have your files created directly on EFS rather than on the RDS server. See: https://aws.amazon.com/blogs/database/integrate-amazon-rds-for-oracle-with-amazon-efs/ - Hope this helps.
If, on the other hand, you already have the dump and log files on the RDS server, and those are what you want to access, then you have to use S3 as the in-between staging area.
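With the S3 integration option enabled, the staging step is done with the `rdsadmin.rdsadmin_s3_tasks` package. A minimal sketch (the bucket name and prefix below are placeholders, and the instance must already have an IAM role with S3 access attached):

```sql
-- Copy all files from the DATA_PUMP_DIR directory on the RDS server
-- to s3://my-example-bucket/exports/ (bucket/prefix are placeholders)
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
         p_bucket_name    => 'my-example-bucket',
         p_prefix         => '',
         p_s3_prefix      => 'exports/',
         p_directory_name => 'DATA_PUMP_DIR') AS task_id
FROM DUAL;

-- The reverse direction: pull files from S3 down to the RDS server
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'my-example-bucket',
         p_s3_prefix      => 'exports/',
         p_directory_name => 'DATA_PUMP_DIR') AS task_id
FROM DUAL;
```

Each call returns a task ID; the task runs asynchronously, and its log file is written to the BDUMP directory where you can read it to check progress.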
Regards, Govardhanan.
Thanks! Excellent link about the EFS integration option, where we don't even need to copy files to the RDS server as with S3 integration.
I thought there was an API or SQL command that I could run in RDS to read the log file contents in the data pump directory.
To download the log file from the RDS server to an S3 bucket, don't you have to set up a role/permissions between RDS and S3 before you are able to do that?
I guess you are talking about the rdsadmin.rds_file_util.read_text_file procedure. See https://repost.aws/questions/QUowXaNnrrREascAz9C6fKwQ/can-i-read-log-file-in-rds#ANWB5p-W-rTMmT-b05zxT9vQ for a description. As for setting up roles/permissions between RDS and S3, yes, you need that. See the section "Configuring IAM permissions for RDS for Oracle integration with Amazon S3" under https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html#oracle-s3-integration.using. Hope this helps.
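For reference, the IAM policy attached to the instance role needs read/list/write access to the staging bucket. A minimal sketch, assuming a placeholder bucket named `my-example-bucket` (adjust the actions if you only transfer in one direction):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "RdsOracleS3Integration",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-example-bucket",
        "arn:aws:s3:::my-example-bucket/*"
      ]
    }
  ]
}
```

The role carrying this policy is then associated with the DB instance using the S3_INTEGRATION feature, as described in the documentation page linked above.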
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.CommonDBATasks.Misc.html

SELECT *
FROM TABLE(rdsadmin.rds_file_util.read_text_file(
       p_directory => 'PRODUCT_DESCRIPTIONS',
       p_filename  => 'rice.txt'));
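To find the exact name of the dump or log file before reading it, the same package can list a directory's contents. A small sketch against DATA_PUMP_DIR, the default Data Pump directory on RDS for Oracle:

```sql
-- List the files in the Data Pump directory, oldest first,
-- so the Data Pump log file name can be passed to read_text_file
SELECT filename, filesize, mtime
FROM TABLE(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'))
ORDER BY mtime;
```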