Thursday, December 7, 2017

AWS: Oracle DB Import

We recently moved to AWS as our Oracle db provider, and hopefully this guide helps all AWS newbies.

A few things to keep in mind:
  • You do not have physical access to the Oracle db server.
  • You do not have SYS or SYSTEM privileges on the Oracle db; instead you get an SA_MASTER account.
  • SA_MASTER is a cooked-up account with certain privileges.
  • You need to call predefined functions for SYS-level tasks such as killing a session, flushing the shared pool, etc. Refer here for the functions/procedures.
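For example, killing a session or flushing the shared pool looks roughly like this (a minimal sketch of the RDS-provided procedures; the sid/serial values are made up, and you should check the AWS docs for the exact names and parameters in your engine version):

    -- Instead of ALTER SYSTEM FLUSH SHARED_POOL
    EXEC rdsadmin.rdsadmin_util.flush_shared_pool;

    -- Instead of ALTER SYSTEM KILL SESSION 'sid,serial#'
    BEGIN
      rdsadmin.rdsadmin_util.kill(sid => 123, serial => 456);
    END;
    /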
Oracle DB Dump Import

This is tricky because you don't have access to the OS server running Oracle, so you cannot physically move the dump file to the Oracle directory (DATA_PUMP_DIR).

By default the AWS admin will provide you a "Jump box" and SA_MASTER credentials for the target AWS DB. Typically you will have admin access to this Jump box, and you handle all OS-level activities for the target AWS Oracle db through it. You need to install an Oracle db on this Jump box, and since you are the admin, you will have SYS/SYSTEM privileges on the Jump box db.

Let's call our Jump Box Oracle db the "Local" db from now on.

Let's say I want to create a schema called RMANNI on the AWS DB.

Connect to AWS DB as SA_MASTER
  1. Create data and index tablespaces. By default they will be created as "BigFile" tablespaces, so you don't have to specify data files or a location. Once you create the user, you can grant whatever privileges you want; in my example, I am granting the bare minimum.
  • Create Tablespace RMANNI_DATA;
  • Create Tablespace RMANNI_INDX;
  • Create User RMANNI Identified by RMANNI;
  • Alter User RMANNI Default Tablespace RMANNI_DATA Temporary Tablespace TEMP Quota Unlimited on RMANNI_DATA Quota Unlimited on RMANNI_INDX;
  • Grant Connect, Create Session, Alter Session to RMANNI;
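A quick sanity check after the statements above, if you want to confirm the tablespaces and user were created (these are standard dictionary views, nothing AWS-specific; SA_MASTER should be able to query them):

    Select Tablespace_Name, Bigfile From DBA_Tablespaces Where Tablespace_Name Like 'RMANNI%';
    Select Username, Default_Tablespace From DBA_Users Where Username = 'RMANNI';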
Connect to "Local" (Jump box db) as SYS/SYSTEM

  1. Use the default DATA_PUMP_DIR or map any directory on the jump box. In my case I mapped the "D:\datadump" folder to DATA_PUMP_LOCAL (see the sketch after this list).
  2. Copy your dump file to the above folder on the jump box.
  3. Here is the fun part. From the local db, create a database link pointing to the target AWS db. Replace HOST, PORT and SID according to your setup.
             Create Database Link TO_AWS_PROD  
             Connect To RMANNI
             Identified By RMANNI
             Using '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=your_host_address.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=ORCLDB)))';
  4. Verify that the database link works:
    Select * from dual@TO_AWS_PROD;
  5. Transfer the dump. The command below transfers the file you placed in your local data pump folder to the AWS server's data pump folder, using the database link you created in the previous step. Depending on the size of your dump, this step may take a while to complete. (A quick way to confirm the file landed on the AWS side is shown after this list.)
    BEGIN
      DBMS_FILE_TRANSFER.PUT_FILE(
        source_directory_object      => 'DATA_PUMP_LOCAL',
        source_file_name             => 'MY_SOURCE.DMP',
        destination_directory_object => 'DATA_PUMP_DIR',
        destination_file_name        => 'MY_SOURCE.DMP',
        destination_database         => 'TO_AWS_PROD'
      );
    END;
    /
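For item 1 above, mapping "D:\datadump" to DATA_PUMP_LOCAL on the local db is just a directory object (a minimal sketch, run as SYS/SYSTEM; adjust the path to your setup):

    Create Or Replace Directory DATA_PUMP_LOCAL As 'D:\datadump';
    -- Only needed if a non-SYS local user will run the transfer:
    -- Grant Read, Write On Directory DATA_PUMP_LOCAL To <local_user>;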
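To confirm the file from step 5 actually landed on the AWS side, you can list the contents of DATA_PUMP_DIR through the RDS-provided rdsadmin.rds_file_util.listdir function (double-check the exact signature against the AWS docs for your engine version):

    -- Run on the AWS db as SA_MASTER
    Select Filename, Filesize, Mtime
      From Table(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'))
     Order By Mtime;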
Once the dump is transferred, you import it as usual with impdp. You can import as either RMANNI or SA_MASTER. If you import as RMANNI, make sure to first grant read, write access on the data pump directory on the AWS db. The import parameters look like this:
        DIRECTORY=DATA_PUMP_DIR
        DUMPFILE=MY_SOURCE.DMP
        LOGFILE=MY_SOURCE.log
        REMAP_SCHEMA=<>:RMANNI
        REMAP_TABLESPACE=<>_DATA:RMANNI_DATA,<>_INDX:RMANNI_INDX
        SCHEMAS=<source>
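For reference, a hedged sketch of the two pieces around the import: the grant runs on the AWS db as SA_MASTER (only needed if you import as RMANNI), and the impdp call runs from the jump box against the AWS db. AWS_PROD and import_rmanni.par are placeholder names for your TNS alias and parameter file:

        -- On the AWS db, as SA_MASTER
        Grant Read, Write On Directory DATA_PUMP_DIR To RMANNI;

        REM From the jump box command line, with the parameters above saved as import_rmanni.par
        REM impdp RMANNI/RMANNI@AWS_PROD PARFILE=import_rmanni.par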

After the import, review the log file. Again, since you don't have access to the db server, you retrieve the log file using a SELECT statement and either copy/paste the output to a file or spool the output from SQL*Plus.

select * from table(RDSADMIN.RDS_FILE_UTIL.READ_TEXT_FILE('DATA_PUMP_DIR','MY_SOURCE.log'));
or

SET HEADING OFF
spool D:\RMANNI\RMANNI.LOG
select * from table(RDSADMIN.RDS_FILE_UTIL.READ_TEXT_FILE('DATA_PUMP_DIR','MY_SOURCE.log'));
spool off

Hope this helps.