
Monday, 31 March 2025

How to import a dump from an S3 bucket to an AWS Oracle RDS instance


Below are the step-by-step details on how to import a dump from an S3 bucket to an AWS Oracle RDS instance.


Step 1: Gather the DB Connection details from AWS Secrets Manager
Step 2: Establish an AWS Session Manager port-forwarding session (for Oracle RDS connectivity)
Step 3: Connect to the Oracle RDS database using SQL Developer and open a worksheet (SQL query execution)
Step 4: Create the empty schema structure along with the necessary permissions, tablespaces, etc.
Step 5: Import your dump file into your target DB instance using the DBMS_DATAPUMP package
Step 6: Once the import job completes, read the logfile to check for errors
Step 7: Verify the imported DB objects and their storage sizes
Step 8: Clean-up of files
Step 9: Import "SAMIK2" into the ORCL database
Step 10: Import "SAMIK3" into the ORCL database
Step 11: Remove the source database dump files (copied from the S3 bucket) from DATA_PUMP_DIR



Step 1: Gather the DB Connection details from AWS Secrets Manager
===================================================
Example:

DB_User_Account: admin
DB_User_Account_Password: Welcome123 (this password may change due to rotation, so check Secrets Manager for the current value)
DB_Name: ORCL
Engine: Oracle
Port: 1521
Host: orcl.eu-central-1.rds.amazonaws.com
End_Point: orcl.eu-central-1.rds.amazonaws.aws.glpoly.net




Step 2: Establish an AWS Session Manager port-forwarding session (for Oracle RDS connectivity)
==============================================================================================


Open a PowerShell prompt on your local machine and execute the commands below.
You need to collect the environment variables from the AWS access key webpage, under the PowerShell tab.

AWS access portal: DevOps-abcdefgh -> 982534357203 | aws-root-17c0820db8@oracle.com -> AWSAdministratorAccess -> Access keys

Example:

Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.

Install the latest PowerShell for new features and improvements! https://aka.ms/PSWindows

PS C:\Users\sam>
PS C:\Users\sam> $Env:AWS_ACCESS_KEY_ID="ASIA6JQ44QDJ2RE65ITL"
PS C:\Users\sam> $Env:AWS_SECRET_ACCESS_KEY="3jk8TKMPz7f3"
PS C:\Users\sam> $Env:AWS_SESSION_TOKEN="sHpSvWCCw1++evj4qK1dzwbzzziFAvjsCs5ncWx+VY/xqA0IHrYR+DuT2fNISVm6yiuNBzgG6P"


After the environment variables are set, use the AWS SSM command below to start the session.

PS C:\Users\sam> aws ssm start-session  --region eu-central-1 --target i-8e0ae0085 --document-name AWS-StartPortForwardingSessionToRemoteHost --parameters host="orcl-oraclerds.982534357203.aws.glpoly.net",portNumber="1521",localPortNumber="1521"

Starting session with SessionId: scott@oracle.com-frs7fgk7yfe
Port 1521 opened for sessionId scott@oracle.com-frs7fgk7yfe
Waiting for connections...

Connection accepted for session [scott@oracle.com-frs7fgk7yfe]




Step 3: Connect to the Oracle RDS database using SQL Developer and open a worksheet (SQL query execution)
=========================================================================================================
Now, you can run the SQL based commands here and perform required DBA actions.


Q) How do you download the export dump from the S3 bucket to the RDS DATA_PUMP_DIR location?

First, we need to download the Oracle DB dump files from the S3 bucket onto the RDS instance. To do that, execute the commands below.
-----------------------------------------------------------------------------------------------------------------------------
Note: The "rdsadmin.rdsadmin_s3_tasks.download_from_s3" procedure below downloads the dump files and keeps them in the DATA_PUMP_DIR location on the RDS instance.


SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
           p_bucket_name    =>  'orcl-oracle-backup-bucket', 
           p_s3_prefix      =>  '20250305_ORCLV52/', 
           p_directory_name =>  'DATA_PUMP_DIR') 
           AS TASK_ID FROM DUAL;

The output is a task ID, e.g. 1742397812960-2284.

Here, replace the bucket name with your own, and use the prefix option to specify any sub-directories.

Our S3 bucket structure is like below,

Amazon S3 -> Buckets -> "orcl-oracle-backup-bucket/20250305_ORCLV52/"

Once the above SELECT statement completes, check the logfile to confirm the dump downloaded successfully from the S3 bucket.

Syntax:-
SELECT text from table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-<TASK_ID>.log'));

Actual Query:-
SELECT text from table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-1742397812960-2284.log'));

O/p:-

2025-03-19 15:25:43.937 UTC [INFO ] This task is about to list the Amazon S3 objects for AWS Region eu-central-1, bucket name orcl-oracle-backup-bucket, and prefix 20250305_ORCLV52/.
2025-03-19 15:25:43.976 UTC [INFO ] The task successfully listed the Amazon S3 objects for AWS Region eu-central-1, bucket name orcl-oracle-backup-bucket, and prefix 20250305_ORCLV52/.
2025-03-19 15:25:43.992 UTC [INFO ] This task is about to download the Amazon S3 object or objects in /rdsdbdata/datapump from bucket name orcl-oracle-backup-bucket and key 20250305_ORCLV52/EMP2.DMP.
2025-03-19 15:25:44.582 UTC [INFO ] The task successfully downloaded the Amazon S3 object or objects from bucket name orcl-oracle-backup-bucket with key 20250305_ORCLV52/EMP2.DMP to the location /rdsdbdata/datapump.
2025-03-19 15:25:44.582 UTC [INFO ] This task is about to download the Amazon S3 object or objects in /rdsdbdata/datapump from bucket name orcl-oracle-backup-bucket and key 20250305_ORCLV52/EMP2.log.
2025-03-19 15:25:44.706 UTC [INFO ] The task successfully downloaded the Amazon S3 object or objects from bucket name orcl-oracle-backup-bucket with key 20250305_ORCLV52/EMP2.log to the location /rdsdbdata/datapump.
2025-03-19 15:25:44.706 UTC [INFO ] This task is about to download the Amazon S3 object or objects in /rdsdbdata/datapump from bucket name orcl-oracle-backup-bucket and key 20250305_ORCLV52/EMP3.DMP.
2025-03-19 15:25:44.880 UTC [INFO ] The task successfully downloaded the Amazon S3 object or objects from bucket name orcl-oracle-backup-bucket with key 20250305_ORCLV52/EMP3.DMP to the location /rdsdbdata/datapump.
2025-03-19 15:25:44.880 UTC [INFO ] The task skipped downloading already existing key orcl-oracle-backup-bucket to the location 20250305_ORCLV52/EMP3.log.
2025-03-19 15:25:44.880 UTC [INFO ] This task is about to download the Amazon S3 object or objects in /rdsdbdata/datapump from bucket name orcl-oracle-backup-bucket and key 20250305_ORCLV52/EMP1.DMP.
2025-03-19 15:28:13.968 UTC [INFO ] The task successfully downloaded the Amazon S3 object or objects from bucket name orcl-oracle-backup-bucket with key 20250305_ORCLV52/EMP1.DMP to the location /rdsdbdata/datapump.
2025-03-19 15:28:13.999 UTC [INFO ] This task is about to download the Amazon S3 object or objects in /rdsdbdata/datapump from bucket name orcl-oracle-backup-bucket and key 20250305_ORCLV52/EMP1.log.
2025-03-19 15:28:14.216 UTC [INFO ] The task successfully downloaded the Amazon S3 object or objects from bucket name orcl-oracle-backup-bucket with key 20250305_ORCLV52/EMP1.log to the location /rdsdbdata/datapump.
2025-03-19 15:28:14.217 UTC [INFO ] The task finished successfully.


Now, verify that the downloaded files are available in the DATA_PUMP_DIR directory on the RDS server:

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by filename, mtime;

O/p:-

Filename   Type      Size         Mtime
---------- --------- ------------ --------
EMP2.DMP   file      28131328     19-03-25
EMP2.log   file      1982         19-03-25
EMP3.DMP   file      479232       19-03-25
EMP3.log   file      3036         19-03-25
EMP1.DMP   file      15426408448  19-03-25
EMP1.log   file      195909       19-03-25
datapump/  directory 4096         19-03-25



Step 4: Create the empty schema structure along with the necessary permissions, tablespaces, etc.
=================================================================================================
SQL> select * from dba_users where oracle_maintained='N' and username in ('SAMIK1','SAMIK2','SAMIK3') order by created;

no rows selected.

Now, start creating all three empty schemas. To do that, execute the SQL commands below in a SQL Developer worksheet to create the respective tablespaces.

Example:

SQL> select tablespace_name from dba_tablespaces;

SYSTEM
SYSAUX
UNDO_T1
TEMP
USERS
RDSADMIN

SQL> select tablespace_name, file_name from dba_data_files order by tablespace_name;

RDSADMIN /rdsdbdata/db/ORCL_A/datafile/o1_mf_rdsadmin_mw07j5xz_.dbf
SYSAUX /rdsdbdata/db/ORCL_A/datafile/o1_mf_sysaux_mw0648yh_.dbf
SYSTEM /rdsdbdata/db/ORCL_A/datafile/o1_mf_system_mw06475h_.dbf
UNDO_T1 /rdsdbdata/db/ORCL_A/datafile/o1_mf_undo_t1_mw0649fs_.dbf
USERS /rdsdbdata/db/ORCL_A/datafile/o1_mf_users_mw0649pl_.dbf


Now, review the above output to identify the naming convention and the existing tablespace details.

In AWS RDS, you don't have to specify the datafile location explicitly.
Hence, use the SQL commands below to create the three required tablespaces, and verify them once created.

SQL> CREATE BIGFILE TABLESPACE SAMIK1_DATA DATAFILE SIZE 1G AUTOEXTEND ON MAXSIZE 33554431M;

Tablespace SAMIK1_DATA created.

SQL> CREATE BIGFILE TABLESPACE SAMIK2_DATA DATAFILE SIZE 1G AUTOEXTEND ON MAXSIZE 33554431M;

Tablespace SAMIK2_DATA created.

SQL> CREATE BIGFILE TABLESPACE SAMIK3_DATA DATAFILE SIZE 1G AUTOEXTEND ON MAXSIZE 33554431M;

Tablespace SAMIK3_DATA created.


Verification:
------------

SQL> select tablespace_name from dba_tablespaces;

SYSTEM
SYSAUX
UNDO_T1
TEMP
USERS
RDSADMIN
SAMIK3_DATA
SAMIK1_DATA
SAMIK2_DATA

SQL> select tablespace_name, file_name from dba_data_files order by tablespace_name;

RDSADMIN /rdsdbdata/db/ORCL_A/datafile/o1_mf_rdsadmin_mw07j5xz_.dbf
SYSAUX /rdsdbdata/db/ORCL_A/datafile/o1_mf_sysaux_mw0648yh_.dbf
SYSTEM /rdsdbdata/db/ORCL_A/datafile/o1_mf_system_mw06475h_.dbf
UNDO_T1 /rdsdbdata/db/ORCL_A/datafile/o1_mf_undo_t1_mw0649fs_.dbf
USERS /rdsdbdata/db/ORCL_A/datafile/o1_mf_users_mw0649pl_.dbf
SAMIK1_DATA /rdsdbdata/db/ORCL_A/datafile/o1_mf_apm_dev__mxp3q4x5_.dbf
SAMIK2_DATA /rdsdbdata/db/ORCL_A/datafile/o1_mf_apm_dev__mxp3r06r_.dbf
SAMIK3_DATA /rdsdbdata/db/ORCL_A/datafile/o1_mf_apm_dev__mxp0s461_.dbf


After all three tablespaces are created, start creating the empty user accounts and grant the necessary privileges (Source DB -> Target DB) as below:

1.) EMP1 -> SAMIK1
2.) EMP2 -> SAMIK2
3.) EMP3 -> SAMIK3


First, create the custom role "MI_CONNECT_ROLE" (as exported from the source database server) and assign the necessary privileges.
------------------------------------------------------------------------------------------------------------------

SQL> CREATE ROLE MI_CONNECT_ROLE;

Role MI_CONNECT_ROLE created.

SQL> grant CREATE PROCEDURE TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant CREATE SEQUENCE TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant CREATE SESSION TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant CREATE TABLE TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant CREATE TRIGGER TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant CREATE VIEW TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant QUERY REWRITE TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant CREATE TYPE TO MI_CONNECT_ROLE;

Grant succeeded.

SQL> grant EXECUTE ON SYS.DBMS_CRYPTO TO MI_CONNECT_ROLE;

Grant succeeded.
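
Optionally, you can sanity-check the role's privileges before moving on. Both queries below use standard dictionary views:

SQL> select privilege from dba_sys_privs where grantee = 'MI_CONNECT_ROLE' order by privilege;

SQL> select owner, table_name, privilege from dba_tab_privs where grantee = 'MI_CONNECT_ROLE';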



Now, start creating the empty user accounts.
--------------------------------------------

User Account: SAMIK1
------------------------

SQL> CREATE USER SAMIK1 IDENTIFIED BY SAMIK1#032025 default tablespace SAMIK1_DATA;

User SAMIK1 created.

SQL> alter user SAMIK1 quota UNLIMITED on SAMIK1_DATA;

User SAMIK1 altered.

SQL> grant DBA to SAMIK1 WITH ADMIN OPTION;

Grant succeeded.

SQL> grant MI_CONNECT_ROLE to SAMIK1;

Grant succeeded.

SQL> grant CREATE VIEW to SAMIK1;

Grant succeeded.

SQL> grant CREATE TABLE to SAMIK1;

Grant succeeded.

SQL> grant CREATE DATABASE LINK to SAMIK1;

Grant succeeded.

SQL> grant EXECUTE on CTXSYS.CTX_DDL to SAMIK1;

Grant succeeded.

SQL> grant CREATE TYPE to SAMIK1;

Grant succeeded.

SQL> grant CREATE MATERIALIZED VIEW to SAMIK1;

Grant succeeded.

SQL> grant CREATE SEQUENCE to SAMIK1;

Grant succeeded.

SQL> grant CREATE TRIGGER to SAMIK1;

Grant succeeded.

SQL> grant EXECUTE on SYS.DBMS_CRYPTO to SAMIK1;

Grant succeeded.



User Account: SAMIK2
-------------------------

SQL> CREATE USER SAMIK2 IDENTIFIED BY SAMIK2#032025 default tablespace SAMIK2_DATA;

User SAMIK2 created.

SQL> alter user SAMIK2 quota UNLIMITED on SAMIK2_DATA;

User SAMIK2 altered.

SQL> grant DBA to SAMIK2 WITH ADMIN OPTION;

Grant succeeded.

SQL> grant MI_CONNECT_ROLE to SAMIK2;

Grant succeeded.

SQL> grant CREATE VIEW to SAMIK2;

Grant succeeded.

SQL> grant CREATE TABLE to SAMIK2;

Grant succeeded.

SQL> grant CREATE DATABASE LINK to SAMIK2;

Grant succeeded.

SQL> grant EXECUTE on CTXSYS.CTX_DDL to SAMIK2;

Grant succeeded.

SQL> grant CREATE TYPE to SAMIK2;

Grant succeeded.

SQL> grant CREATE MATERIALIZED VIEW to SAMIK2;

Grant succeeded.

SQL> grant CREATE SEQUENCE to SAMIK2;

Grant succeeded.

SQL> grant CREATE TRIGGER to SAMIK2;

Grant succeeded.

SQL> grant EXECUTE on SYS.DBMS_CRYPTO to SAMIK2;

Grant succeeded.


User Account: SAMIK3
-------------------------

SQL> CREATE USER SAMIK3 IDENTIFIED BY SAMIK3#032025 default tablespace SAMIK3_DATA;

User SAMIK3 created.

SQL> alter user SAMIK3 quota UNLIMITED on SAMIK3_DATA;

User SAMIK3 altered.

SQL> grant DBA to SAMIK3 WITH ADMIN OPTION;

Grant succeeded.

SQL> grant CONNECT to SAMIK3;

Grant succeeded.

SQL> grant RESOURCE to SAMIK3;

Grant succeeded.

SQL> grant CREATE VIEW to SAMIK3;

Grant succeeded.

SQL> grant CREATE TABLE to SAMIK3;

Grant succeeded.

SQL> grant CREATE DATABASE LINK to SAMIK3;

Grant succeeded.

SQL> grant EXECUTE on CTXSYS.CTX_DDL to SAMIK3;

Grant succeeded.

SQL> grant CREATE TYPE to SAMIK3;

Grant succeeded.

SQL> grant CREATE MATERIALIZED VIEW to SAMIK3;

Grant succeeded.

SQL> grant CREATE SEQUENCE to SAMIK3;

Grant succeeded.

SQL> grant CREATE TRIGGER to SAMIK3;

Grant succeeded.

SQL> grant EXECUTE on SYS.DBMS_CRYPTO to SAMIK3;

Grant succeeded.
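
Before importing, you can cross-check that all three accounts exist with the expected default tablespaces (standard dictionary view):

SQL> select username, default_tablespace, account_status from dba_users where username in ('SAMIK1','SAMIK2','SAMIK3');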



Step 5: Import your dump file into your target DB instance using the DBMS_DATAPUMP package
===========================================================================================

For the schema "SAMIK1", use the PL/SQL block below and run it in SQL Developer (or SQL*Plus) as the master user (admin) on the RDS Oracle DB instance.


DECLARE
  v_hdnl NUMBER;
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN( 
    operation => 'IMPORT', 
    job_mode  => 'SCHEMA', 
    job_name  => null);
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP1.DMP', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP1_to_SAMIK1_Import_24032025.log', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_SCHEMA', 'EMP1', 'SAMIK1');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_TABLESPACE', 'MERMS', 'SAMIK1_DATA');
  DBMS_DATAPUMP.SET_PARALLEL(v_hdnl, degree => 2);
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/


O/p:-
-----

PL/SQL procedure successfully completed.
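
While the import job runs, you can monitor its progress from another worksheet using the standard dictionary views (not RDS-specific):

SQL> select owner_name, job_name, operation, job_mode, state, degree from dba_datapump_jobs;

SQL> select sid, serial#, opname, sofar, totalwork, round(sofar/totalwork*100,2) as pct_done
       from v$session_longops
      where opname like 'SYS_IMPORT%' and totalwork > 0 and sofar <> totalwork;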


Step 6: Once the import job completes, read the logfile to check for errors
============================================================================




SELECT * FROM TABLE 
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'EMP1_to_SAMIK1_Import_24032025.log'));


O/p:-

Master table "ADMIN"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "ADMIN"."SYS_IMPORT_SCHEMA_01":  
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"SAMIK1" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "SAMIK1"."MI_INDX_COLUMNS"                  0 KB       0 rows
. . imported "SAMIK1"."MI_LUBRICANT"                     0 KB       0 rows
. . imported "SAMIK1"."MI_STG_QUERIES_BKUP_4030007"  16.89 KB     154 rows
. . imported "SAMIK1"."MI_ROLE_RES"                  55.06 KB    2190 rows
. . imported "SAMIK1"."MI_TMCAWS00"                      0 KB       0 rows
. . 
. . 
. . 
. . 
. . 
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/VIEW/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
ORA-39082: Object type PROCEDURE:"SAMIK1"."TEMP_SEG_USAGE_INSERT" created with compilation warnings
ORA-39082: Object type VIEW:"SAMIK1"."BYV_TECH_DATA" created with compilation warnings
Job "ADMIN"."SYS_IMPORT_SCHEMA_01" completed with 5 error(s) at Mon Mar 24 07:56:10 2025 elapsed 0 01:48:51


Now, analyze the errors (post-import steps):
============================================

ORA-31684: Object type USER:"SAMIK1" already exists ----> Ignore this error, since we created this schema manually in the target database.

ORA-39083: Object type INDEX failed to create with error:

This index creation failed because the "DBOP" schema does not exist in the target database. So, re-create the indexes manually under the "SAMIK1" schema as below:


SQL> CREATE INDEX "SAMIK1"."N_MI_EVWKHIST_OID_RID_SS" ON "SAMIK1"."MI_EVWKHIST" ("MI_EVWKHIST_ORDR_ID_C", "MI_EVWKHIST_RQST_ID_C", "MI_EVWKHIST_SAP_SYSTEM_C") PCTFREE 10 INITRANS 2 MAXTRANS 255  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "SAMIK1_DATA" PARALLEL 1
/

Index "SAMIK1"."N_MI_EVWKHIST_OID_RID_SS" created.


SQL> CREATE INDEX "SAMIK1"."N_MI_EVWKHIST_RID_OID_SS" ON "SAMIK1"."MI_EVWKHIST" ("MI_EVWKHIST_RQST_ID_C", "MI_EVWKHIST_ORDR_ID_C", "MI_EVWKHIST_SAP_SYSTEM_C") PCTFREE 10 INITRANS 2 MAXTRANS 255  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "SAMIK1_DATA" PARALLEL 1 
/

Index "SAMIK1"."N_MI_EVWKHIST_RID_OID_SS" created.


ORA-39082: Object type PROCEDURE:"SAMIK1"."TEMP_SEG_USAGE_INSERT" created with compilation warnings
ORA-39082: Object type VIEW:"SAMIK1"."BYV_TECH_DATA" created with compilation warnings

Ignore the above two errors, since these objects were already in an INVALID state in the source database.
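
Optionally, instead of just ignoring them, you can try recompiling the schema's invalid objects; DBMS_UTILITY.COMPILE_SCHEMA is a standard package available to the master user. If the underlying dependencies are still missing (as they were in the source database), the objects will simply remain INVALID:

SQL> exec DBMS_UTILITY.COMPILE_SCHEMA(schema => 'SAMIK1', compile_all => FALSE);

SQL> select object_name, object_type from dba_objects where owner = 'SAMIK1' and status = 'INVALID';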


Step 7: Verify the imported DB objects and their storage sizes
==============================================================


SQL> select distinct owner, segment_type, sum(bytes)/1024/1024/1024 from dba_segments where owner='SAMIK1' group by owner, segment_type;

Owner   Segment_Type Size (GB)
------- ------------ -----------------
SAMIK1  LOBINDEX      0.03961181640625
SAMIK1  TABLE        46.92852783203125
SAMIK1  LOBSEGMENT   28.79962158203125
SAMIK1  INDEX        79.62249755859375

SQL> select distinct owner, object_type, status, count(*) from dba_objects where owner='SAMIK1' group by owner, object_type, status order by status, object_type;

Owner   Object_Type  Status  Count(*)
------- ------------ ------- --------
SAMIK1  PROCEDURE    INVALID        1
SAMIK1  VIEW         INVALID        1
SAMIK1  FUNCTION     VALID         15
SAMIK1  INDEX        VALID       6951
SAMIK1  LOB          VALID       1099
SAMIK1  PACKAGE      VALID         21
SAMIK1  PACKAGE BODY VALID         19
SAMIK1  PROCEDURE    VALID         67
SAMIK1  SEQUENCE     VALID         57
SAMIK1  TABLE        VALID       2421
SAMIK1  TRIGGER      VALID         33
SAMIK1  TYPE         VALID          3
SAMIK1  TYPE BODY    VALID          2
SAMIK1  VIEW         VALID       1688
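
As an optional post-import step (not part of the original run), it is usually worth refreshing optimizer statistics on the imported schema:

SQL> exec DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'SAMIK1', cascade => TRUE);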




Step 8: Clean-up of files
==========================



SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by filename, mtime;

Filename                            Type      Size         Mtime
----------------------------------- --------- ------------ --------
EMP2.DMP                            file      28131328     19-03-25
EMP2.log                            file      1982         19-03-25
EMP3.DMP                            file      479232       19-03-25
EMP3.log                            file      3036         19-03-25
EMP3_to_SAMIK3_Import_19032025.log  file      5788         19-03-25
EMP1.DMP                            file      15426408448  19-03-25
EMP1.log                            file      195909       19-03-25
EMP1_to_SAMIK1_Import_24032025.log  file      194293       24-03-25
datapump/                           directory 4096         24-03-25


Now, delete the unwanted files from the DATA_PUMP_DIR using the commands below. Note that EMP2.DMP and EMP3.DMP are still needed for Steps 9 and 10, so remove only the files you no longer need at this point.

Syntax:
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','filename');

Example:
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','EMP3_to_SAMIK3_Import_19032025.log');






Step 9: Import "SAMIK2" into the ORCL database
==============================================
Please create the sequences and empty tables below before executing the Data Pump import.


SQL> CREATE SEQUENCE  "SAMIK2"."CUSTOM_ID_SEQ"  MINVALUE 1 MAXVALUE 9223372036854775807 INCREMENT BY 1 START WITH 1 NOCACHE  NOORDER  NOCYCLE  NOKEEP  NOSCALE  GLOBAL
/

Sequence "SAMIK2"."CUSTOM_ID_SEQ" created.


SQL> CREATE SEQUENCE  "SAMIK2"."BASELINE_ID_SEQ"  MINVALUE 1 MAXVALUE 9223372036854775807 INCREMENT BY 1 START WITH 725903 NOCACHE  NOORDER  NOCYCLE  NOKEEP  NOSCALE  GLOBAL
/

Sequence "SAMIK2"."BASELINE_ID_SEQ" created.


SQL> CREATE SEQUENCE  "SAMIK2"."TEMPORARY_ID_SEQ"  MINVALUE 1 MAXVALUE 9223372036854775807 INCREMENT BY 1 START WITH 1 NOCACHE  NOORDER  NOCYCLE  NOKEEP  NOSCALE  GLOBAL
/

Sequence "SAMIK2"."TEMPORARY_ID_SEQ" created.


SQL> CREATE TABLE "SAMIK2"."CUSTOM" ("ID" NUMBER(19,0) DEFAULT "SAMIK2"."CUSTOM_ID_SEQ"."NEXTVAL" NOT NULL ENABLE, "LANGUAGEID" VARCHAR2(10 BYTE) NOT NULL ENABLE, "NAME" VARCHAR2(250 BYTE) NOT NULL ENABLE, "TEXT" VARCHAR2(1000 BYTE) NOT NULL ENABLE, "TRANSLATION" VARCHAR2(1500 BYTE) NOT NULL ENABLE, "PATH" VARCHAR2(350 BYTE), "TYPE" VARCHAR2(1 BYTE) DEFAULT '0', "SUBTYPE" VARCHAR2(1 BYTE) DEFAULT '0', "SUBMODULE" VARCHAR2(50 BYTE), "MODULEID" VARCHAR2(40 BYTE) NOT NULL ENABLE, "TENANTID" VARCHAR2(50 BYTE) NOT NULL ENABLE, "NEEDTRANSLATION" VARCHAR2(2 BYTE) NOT NULL ENABLE, "LASTUPDATEDATE" TIMESTAMP (2) DEFAULT LOCALTIMESTAMP(2)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255  NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "SAMIK2_DATA"
/

Table "SAMIK2"."CUSTOM" created.


SQL> CREATE TABLE "SAMIK2"."BASELINE" ("ID" NUMBER(19,0) DEFAULT "SAMIK2"."BASELINE_ID_SEQ"."NEXTVAL" NOT NULL ENABLE, "LANGUAGEID" VARCHAR2(10 BYTE) NOT NULL ENABLE, "NAME" VARCHAR2(250 BYTE) NOT NULL ENABLE, "TEXT" VARCHAR2(1000 BYTE) NOT NULL ENABLE, "TRANSLATION" VARCHAR2(1500 BYTE) NOT NULL ENABLE, "PATH" VARCHAR2(350 BYTE), "TYPE" VARCHAR2(1 BYTE) DEFAULT '0', "SUBTYPE" VARCHAR2(1 BYTE) DEFAULT '0', "SUBMODULE" VARCHAR2(50 BYTE), "MODULEID" VARCHAR2(40 BYTE) NOT NULL ENABLE, "NEEDTRANSLATION" VARCHAR2(2 BYTE) NOT NULL ENABLE, "LASTUPDATEDATE" TIMESTAMP (2) DEFAULT LOCALTIMESTAMP(2)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255  NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "SAMIK2_DATA" 
/

Table "SAMIK2"."BASELINE" created.


SQL> CREATE TABLE "SAMIK2"."TEMPORARY" ("ID" NUMBER(19,0) DEFAULT "SAMIK2"."TEMPORARY_ID_SEQ"."NEXTVAL" NOT NULL ENABLE, "LANGUAGEID" VARCHAR2(10 BYTE) NOT NULL ENABLE, "NAME" VARCHAR2(250 BYTE) NOT NULL ENABLE, "TEXT" VARCHAR2(1000 BYTE) NOT NULL ENABLE, "TRANSLATION" VARCHAR2(1500 BYTE) NOT NULL ENABLE, "PATH" VARCHAR2(350 BYTE), "TYPE" VARCHAR2(1 BYTE) DEFAULT '0', "SUBTYPE" VARCHAR2(1 BYTE) DEFAULT '0', "SUBMODULE" VARCHAR2(50 BYTE), "MODULEID" VARCHAR2(40 BYTE) NOT NULL ENABLE, "TENANTID" VARCHAR2(50 BYTE) NOT NULL ENABLE, "NEEDTRANSLATION" VARCHAR2(2 BYTE) NOT NULL ENABLE, "LASTUPDATEDATE" TIMESTAMP (2) DEFAULT LOCALTIMESTAMP(2)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255  NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "SAMIK2_DATA" 
/

Table "SAMIK2"."TEMPORARY" created.


Now, to import the schema "SAMIK2", use the PL/SQL block below and run it in SQL Developer (or SQL*Plus) as the master user (admin) on the RDS Oracle DB instance.


DECLARE
  v_hdnl NUMBER;
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN( 
    operation => 'IMPORT', 
    job_mode  => 'SCHEMA', 
    job_name  => null);
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP2.DMP', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP2_to_SAMIK2_Import_24032025.log', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.SET_PARAMETER (v_hdnl, 'TABLE_EXISTS_ACTION','TRUNCATE');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_SCHEMA', 'EMP2', 'SAMIK2');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_TABLESPACE', 'MERMS', 'SAMIK2_DATA');
  DBMS_DATAPUMP.SET_PARALLEL(v_hdnl, degree => 2);
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/


O/p:-
-----

PL/SQL procedure successfully completed.


Now, verify the import logfile for any errors and take necessary actions.


SELECT * FROM TABLE 
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'EMP2_to_SAMIK2_Import_24032025.log'));


O/p:-
-----

Master table "ADMIN"."SYS_IMPORT_SCHEMA_03" successfully loaded/unloaded
Starting "ADMIN"."SYS_IMPORT_SCHEMA_03":  
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"SAMIK2" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
ORA-31684: Object type SEQUENCE:"SAMIK2"."BASELINE_ID_SEQ" already exists
ORA-31684: Object type SEQUENCE:"SAMIK2"."CUSTOM_ID_SEQ" already exists
ORA-31684: Object type SEQUENCE:"SAMIK2"."TEMPORARY_ID_SEQ" already exists
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Table "SAMIK2"."CUSTOM" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
Table "SAMIK2"."BASELINE" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
Table "SAMIK2"."TEMPORARY" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "SAMIK2"."UPGRADE_HISTORY"             5.468 KB      13 rows
. . imported "SAMIK2"."LANGUAGES"                   5.289 KB      14 rows
. . imported "SAMIK2"."MODULES"                     4.937 KB       1 rows
. . imported "SAMIK2"."CUSTOM"                          0 KB       0 rows
. . imported "SAMIK2"."TEMPORARY"                       0 KB       0 rows
. . imported "SAMIK2"."BASELINE"                    26.66 MB  725912 rows
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Job "ADMIN"."SYS_IMPORT_SCHEMA_03" completed with 4 error(s) at Mon Mar 24 11:58:47 2025 elapsed 0 00:00:19





Step 10: Import "SAMIK3" into the ORCL database
===============================================

Please create the sequences and empty tables below before executing the Data Pump import.


SQL> CREATE SEQUENCE "SAMIK3"."JOB_EXECUTION_ID_SEQ" MINVALUE 1 MAXVALUE 9223372036854775807 INCREMENT BY 1 START WITH 1 NOCACHE  NOORDER  NOCYCLE  NOKEEP  NOSCALE  GLOBAL
/

Sequence "SAMIK3"."JOB_EXECUTION_ID_SEQ" created.


SQL> CREATE SEQUENCE "SAMIK3"."UPGRADE_HISTORY_SEQUENCE" MINVALUE 1 MAXVALUE 9223372036854775807 INCREMENT BY 1 START WITH 1 NOCACHE  NOORDER  NOCYCLE  NOKEEP  NOSCALE  GLOBAL
/

Sequence "SAMIK3"."UPGRADE_HISTORY_SEQUENCE" created.


SQL> CREATE TABLE "SAMIK3"."JOB_EXECUTION" ("JOB_EXECUTION_ID" NUMBER(19,0) DEFAULT "SAMIK3"."JOB_EXECUTION_ID_SEQ"."NEXTVAL" NOT NULL ENABLE, "TENANT_ID" VARCHAR2(120 BYTE) NOT NULL ENABLE, "SCHEDULER_NAME" VARCHAR2(120 BYTE) NOT NULL ENABLE, "SCHEDULER_INSTANCE_ID" VARCHAR2(255 BYTE) NOT NULL ENABLE, "TRIGGER_GROUP" VARCHAR2(150 BYTE) NOT NULL ENABLE, "TRIGGER_NAME" VARCHAR2(150 BYTE) NOT NULL ENABLE, "TRIGGER_DESCRIPTION" VARCHAR2(250 BYTE), "JOB_GROUP" VARCHAR2(200 BYTE) NOT NULL ENABLE, "JOB_NAME" VARCHAR2(200 BYTE) NOT NULL ENABLE, "JOB_DESCRIPTION" VARCHAR2(4000 BYTE), "JOB_TYPE" VARCHAR2(250 BYTE), "JOB_CLASS_NAME" VARCHAR2(250 BYTE), "JOB_PARAMETER" VARCHAR2(4000 BYTE), "EXECUTION_STATUS" NUMBER(*,0), "STATUS_DESCRIPTION" VARCHAR2(4000 BYTE), "IS_CANCEL_REQUESTED" NUMBER(3,0) DEFAULT 0, "SCHEDULED_TIME" TIMESTAMP (6), "SCHEDULED_TIMEZONE" VARCHAR2(250 BYTE), "FIRE_INSTANCE_ID" VARCHAR2(255 BYTE), "FIRED_TIME" TIMESTAMP (6) NOT NULL ENABLE, "NEXT_FIRE_TIME" TIMESTAMP (6), "PREVIOUS_FIRE_TIME" TIMESTAMP (6), "PROGRESS" FLOAT(49), "COMPLETED_TIME" TIMESTAMP (6), "RUN_TIME" NUMBER(19,0), "RECOVERY_TRIGGER_GROUP" VARCHAR2(150 BYTE), "RECOVERY_TRIGGER_NAME" VARCHAR2(150 BYTE), "RECOVERY_TRIGGER_FIRE_TIME" TIMESTAMP (6), "MODIFIED_DT" TIMESTAMP (6)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255  NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "SAMIK3_DATA" 
/

Table "SAMIK3"."JOB_EXECUTION" created.


SQL> CREATE TABLE "SAMIK3"."UPGRADE_HISTORY" ("SCHEMAVERSIONID" NUMBER(10,0) DEFAULT "SAMIK3"."UPGRADE_HISTORY_SEQUENCE"."NEXTVAL" NOT NULL ENABLE, "SCRIPTNAME" VARCHAR2(255 BYTE), "APPLIED" TIMESTAMP (6)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255  NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "SAMIK3_DATA"
/

Table "SAMIK3"."UPGRADE_HISTORY" created.


Now, to import the schema "SAMIK3", use the PL/SQL block below and run it in SQL Developer (or SQL*Plus) as the master user (admin) on the RDS Oracle DB instance.


DECLARE
  v_hdnl NUMBER;
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN( 
    operation => 'IMPORT', 
    job_mode  => 'SCHEMA', 
    job_name  => null);
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP3.DMP', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP3_to_SAMIK3_Import_24032025.log', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_log_file);
  DBMS_DATAPUMP.SET_PARAMETER (v_hdnl, 'TABLE_EXISTS_ACTION','TRUNCATE');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_SCHEMA', 'EMP3', 'SAMIK3');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_TABLESPACE', 'MERMS', 'SAMIK3_DATA');
  DBMS_DATAPUMP.SET_PARALLEL(v_hdnl, degree => 2);
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/


O/p:-
-----

PL/SQL procedure successfully completed.


Now, verify the import logfile for any errors and take necessary actions.


SELECT * FROM TABLE 
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'EMP3_to_SAMIK3_Import_24032025.log'));


O/p:-
-----

Master table "ADMIN"."SYS_IMPORT_SCHEMA_04" successfully loaded/unloaded
Starting "ADMIN"."SYS_IMPORT_SCHEMA_04":  
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"SAMIK3" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
ORA-31684: Object type SEQUENCE:"SAMIK3"."UPGRADE_HISTORY_SEQUENCE" already exists
ORA-31684: Object type SEQUENCE:"SAMIK3"."JOB_EXECUTION_ID_SEQ" already exists
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Table "SAMIK3"."JOB_EXECUTION" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
Table "SAMIK3"."UPGRADE_HISTORY" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "SAMIK3"."JOB_EXECUTION"               145.2 KB    2415 rows
. . imported "SAMIK3"."QRTZ_TRIGGERS"               6.187 KB       4 rows
. . imported "SAMIK3"."JOB_EXECUTION_MSG"           6.085 KB      38 rows
. . imported "SAMIK3"."QRTZ_JOB_DETAILS"            6.039 KB       4 rows
. . imported "SAMIK3"."QRTZ_BLOB_TRIGGERS"          5.992 KB       2 rows
. . imported "SAMIK3"."SCHEDULES"                   5.656 KB       2 rows
. . imported "SAMIK3"."QRTZ_CRON_TRIGGERS"          5.257 KB       2 rows
. . imported "SAMIK3"."QRTZ_SCHEDULER_STATE"        5.203 KB       2 rows
. . imported "SAMIK3"."UPGRADE_HISTORY"             5.164 KB       3 rows
. . imported "SAMIK3"."DBUP_MIGRATION_HISTORY"      5.125 KB       2 rows
. . imported "SAMIK3"."QRTZ_LOCKS"                  4.953 KB       4 rows
. . imported "SAMIK3"."QRTZ_CALENDARS"                  0 KB       0 rows
. . imported "SAMIK3"."QRTZ_FIRED_TRIGGERS"             0 KB       0 rows
. . imported "SAMIK3"."QRTZ_PAUSED_TRIGGER_GRPS"        0 KB       0 rows
. . imported "SAMIK3"."QRTZ_SIMPLE_TRIGGERS"            0 KB       0 rows
. . imported "SAMIK3"."QRTZ_SIMPROP_TRIGGERS"           0 KB       0 rows
. . imported "SAMIK3"."SCHEDULER_SETTINGS"              0 KB       0 rows
Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39083: Object type REF_CONSTRAINT:"SAMIK3"."JOB_EXECUTION_MSG_JOB_EXECUTION" failed to create with error:
ORA-02270: no matching unique or primary key for this column-list

Failing sql is:
ALTER TABLE "SAMIK3"."JOB_EXECUTION_MSG" ADD CONSTRAINT "JOB_EXECUTION_MSG_JOB_EXECUTION" FOREIGN KEY ("JOB_EXECUTION_ID") REFERENCES "SAMIK3"."JOB_EXECUTION" ("JOB_EXECUTION_ID") ON DELETE CASCADE ENABLE
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Job "ADMIN"."SYS_IMPORT_SCHEMA_04" completed with 4 error(s) at Mon Mar 24 12:16:33 2025 elapsed 0 00:00:20





Step 11: Remove the source database dump files (copied from the S3 bucket) from DATA_PUMP_DIR
==============================================================================================


SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by filename, mtime;

Filename                            Type      Size         Mtime
----------------------------------- --------- ------------ --------
EMP2.DMP                            file      28131328     19-03-25
EMP2.log                            file      1982         19-03-25
EMP2_to_SAMIK2_Import_24032025.log  file      2132         24-03-25
EMP3.DMP                            file      479232       19-03-25
EMP3.log                            file      3036         19-03-25
EMP3_to_SAMIK3_Import_24032025.log  file      3370         24-03-25
EMP1.DMP                            file      15426408448  19-03-25
EMP1.log                            file      195909       19-03-25
EMP1_to_SAMIK1_Import_24032025.log  file      194293       24-03-25
datapump/                           directory 4096         24-03-25


Now, delete the unwanted dump files from the DATA_PUMP_DIR using the command below.

Syntax:

EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','filename');

SQL> EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','EMP2.DMP'); 

PL/SQL procedure successfully completed.

SQL> EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','EMP3.DMP'); 

PL/SQL procedure successfully completed.

SQL> EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','EMP1.DMP'); 

PL/SQL procedure successfully completed.
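
If several files need to be removed at once, a small PL/SQL loop over LISTDIR saves typing. This is a sketch that assumes you want to delete every *.DMP file; adjust the filter before running it:

BEGIN
  FOR f IN (SELECT filename
              FROM TABLE(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR'))
             WHERE type = 'file'
               AND filename LIKE '%.DMP') LOOP
    -- Remove each matching dump file from DATA_PUMP_DIR
    UTL_FILE.FREMOVE('DATA_PUMP_DIR', f.filename);
  END LOOP;
END;
/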


Now, check the files currently available in the DATA_PUMP_DIR directory using the command below:


SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by filename, mtime;

Filename                            Type      Size         Mtime
----------------------------------- --------- ------------ --------
EMP2.log                            file      1982         19-03-25
EMP3.log                            file      3036         19-03-25
EMP1.log                            file      195909       19-03-25
EMP2_to_SAMIK2_Import_24032025.log  file      2132         24-03-25
EMP3_to_SAMIK3_Import_24032025.log  file      3370         24-03-25
EMP1_to_SAMIK1_Import_24032025.log  file      194293       24-03-25
datapump/                           directory 4096         28-03-25
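
Optionally, before deleting the remaining import logfiles, you can archive them back to S3 with the companion upload procedure (the same rdsadmin package used in Step 3; the p_prefix filter and the S3 prefix below are assumptions, so adjust them to your bucket layout):

SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
           p_bucket_name    => 'orcl-oracle-backup-bucket',
           p_s3_prefix      => '20250305_ORCLV52/import_logs/',
           p_prefix         => 'EMP',
           p_directory_name => 'DATA_PUMP_DIR')
           AS TASK_ID FROM DUAL;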















