How to use DBMS_DATAPUMP to import Oracle schemas

0

Hi

I have an RDS Oracle 19c database in one cloud account that I need to move to a new RDS Oracle instance in a second cloud account.

According to the link below, it seems that DBMS_DATAPUMP allows export/import at the schema level or the database level. The source database has 30-40 application schemas that I need to move, but I have to exclude the internal Oracle SYS and other schemas, so I assume I should not do a database-level import unless there is a way to exclude those.

https://oracle-base.com/articles/misc/data-pump-api

Is there a way to import multiple schemas at once using a database-level export dump, or not?

Thanks,

2 Answers
0
Accepted Answer

First, you will have to export the schemas from the source database. The DBMS_DATAPUMP package has an open procedure that you can use to start an export job; specify 'SCHEMA' as the job_mode in this call. You will also have to add the dump file and the log file using the add_file procedure. Finally, use the metadata_filter procedure to specify the schema to be exported. This process should be repeated for each schema you want to export.

declare
  l_dp_handle       number;
begin
  -- Open a schema export job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'TESTUSER1_EXPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'expdpTESTUSER1.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Specify the schema to be exported.
  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''TESTUSER1''');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/

You would need to run similar code for each schema you want to export.

Next, you can import the schemas into the destination database. The import process is similar to the export process: you use the open procedure to start an import job, and the add_file procedure to specify the dump file and the log file. You can also use the metadata_remap procedure if you want to remap the schema name.

declare
  l_dp_handle       number;
begin
  -- Open a schema import job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'TESTUSER1_IMPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'impdpTESTUSER1_IMPORT.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Specify the remap schema name.
  dbms_datapump.metadata_remap(
    handle     => l_dp_handle,
    name       => 'REMAP_SCHEMA',
    old_value  => 'TESTUSER1',
    value      => 'TESTUSER2');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
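In both blocks above, detach returns control immediately while the job keeps running in the background. If you would rather have the PL/SQL block wait until the job finishes, DBMS_DATAPUMP also provides wait_for_job; replacing the detach call with something like this sketch makes the block synchronous:

  -- Requires an extra variable in the declare section:
  --   l_job_state  varchar2(30);
  dbms_datapump.wait_for_job(
    handle    => l_dp_handle,
    job_state => l_job_state);
  -- l_job_state ends up as the final state, e.g. 'COMPLETED' or 'STOPPED'.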

You can exclude certain schemas (like the SYS and SYSTEM schemas) during the export process by using the exclude parameter.

expdp username/password directory=datapump dumpfile=x.dmp logfile=y.log full=y exclude=schema:"in ('SYS', 'SYSTEM')"

When importing, you can remap multiple schemas at once by using the remap_schema option in the impdp command.

impdp system/**** directory=DATA_PUMP_DIR dumpfile=export_something.dmp logfile=imp.log remap_schema=name:newname,name2:newname2,name3:banana
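The same multi-schema remap can be expressed through the API, which matters where the command-line clients are unavailable: metadata_remap can simply be called once per schema on the same job handle. A sketch (schema names are placeholders):

  -- Call metadata_remap once for each schema to be renamed,
  -- all against the same import job handle.
  dbms_datapump.metadata_remap(
    handle     => l_dp_handle,
    name       => 'REMAP_SCHEMA',
    old_value  => 'NAME',
    value      => 'NEWNAME');

  dbms_datapump.metadata_remap(
    handle     => l_dp_handle,
    name       => 'REMAP_SCHEMA',
    old_value  => 'NAME2',
    value      => 'NEWNAME2');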
EXPERT
answered 10 months ago
  • So if I have 30 schemas in the source RDS, I need to run the export 30 times on the source RDS to create 30 dump files, then move the files to the destination RDS using S3 integration, and then run the import 30 times on the destination RDS?

    If I am doing the export at the schema level, why do I need to exclude the SYS and SYSTEM schemas, since the import is only for the one application schema I select? Did you mean using exclusion when doing the export at the DATABASE level?

  • Not necessarily, you could do a single export/import that specifies 30 schemas - the DBMS_DATAPUMP metadata filter and the expdp "SCHEMAS" argument can take multiple values. Or you could do 30 exports/imports, or some combination of the two.

  • Hi

    Do you have an example or link that shows how to export/import multiple schemas using DBMS_DATAPUMP? Let us say you want to export/import 3 schemas: "testuser1", "testuser2", "testuser3".

    Thanks,
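  • A single export job can cover several schemas, because the SCHEMA_EXPR metadata filter accepts any SQL expression, including an IN list. A sketch for "testuser1", "testuser2", "testuser3" (job, file, and directory names are placeholders):

    declare
      l_dp_handle  number;
    begin
      -- Open one schema-mode export job covering all three schemas.
      l_dp_handle := dbms_datapump.open(
        operation   => 'EXPORT',
        job_mode    => 'SCHEMA',
        remote_link => NULL,
        job_name    => 'MULTI_SCHEMA_EXPORT',
        version     => 'LATEST');

      dbms_datapump.add_file(
        handle    => l_dp_handle,
        filename  => 'multi_schema.dmp',
        directory => 'TEST_DIR');

      dbms_datapump.add_file(
        handle    => l_dp_handle,
        filename  => 'expdp_multi_schema.log',
        directory => 'TEST_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

      -- One filter listing every schema to export.
      dbms_datapump.metadata_filter(
        handle => l_dp_handle,
        name   => 'SCHEMA_EXPR',
        value  => 'IN (''TESTUSER1'', ''TESTUSER2'', ''TESTUSER3'')');

      dbms_datapump.start_job(l_dp_handle);

      dbms_datapump.detach(l_dp_handle);
    end;
    /

    The import side is simpler still: a schema-mode import of the resulting dump file processes every schema contained in it, so no per-schema filter is needed there.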

-1

Please note that in RDS Oracle we don't have access to the underlying operating system, so we cannot run "expdp" and "impdp" as shown in the above answer for exclusions.

We are limited to running the DBMS_DATAPUMP PL/SQL package. It seems the export does support adding a list of schemas, according to this answer. I am not sure if the import can support all the schemas at the same time, though?

https://forums.oracle.com/ords/apexds/post/dbms-datapump-exclude-6730
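Through the API, the exclusion discussed in that thread would take roughly this shape: in a job opened with job_mode => 'FULL', a NAME_EXPR metadata filter on the SCHEMA object path plays the role of the expdp exclude=schema clause. A sketch only (untested; the filter name and object path follow the linked thread, and note that Data Pump does not export SYS-owned objects in a full export anyway):

  -- Against a handle from dbms_datapump.open(..., job_mode => 'FULL', ...):
  dbms_datapump.metadata_filter(
    handle      => l_dp_handle,
    name        => 'NAME_EXPR',
    value       => 'NOT IN (''SYS'', ''SYSTEM'')',
    object_path => 'SCHEMA');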

sam15
answered 10 months ago
  • "expdp" and "impdp" are client utilities. The actual work of processing dump files happens on the server, and there has never been a requirement that the clients be run on the database server itself. (The older utilities "exp" and "imp" did all of the actual work of exporting and importing data themselves, so it was best for performance to run them on the server to reduce round trips; the newer Data Pump clients are essentially just wrappers for PL/SQL that runs on the database server.) You can install them wherever your other Oracle client utilities are installed (e.g. SQL*Plus, SQL*Loader). Alternatively, you can use the PL/SQL APIs directly.
