
Monday, 31 March 2025

AWS S3 Import in Oracle RDS

 

SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
           p_bucket_name    =>  'orcl-oracle-backup-bucket', 
           p_s3_prefix      =>  '20250305_ORCLV52/', 
           p_directory_name =>  'DATA_PUMP_DIR') 
           AS TASK_ID FROM DUAL;


Note:
This command downloads files from an Amazon S3 bucket into an Oracle RDS instance using the rdsadmin.rdsadmin_s3_tasks.download_from_s3 function.


Explanation of Parameters:
p_bucket_name → 'orcl-oracle-backup-bucket'

The name of the S3 bucket where the files are stored.

p_s3_prefix → '20250305_ORCLV52/'

The prefix (folder path) inside the S3 bucket; every file whose key begins with this prefix is downloaded.

p_directory_name → 'DATA_PUMP_DIR'

The Oracle directory object where the files will be placed after downloading.

AS TASK_ID FROM DUAL

Not a parameter: the function's return value is aliased as TASK_ID, which is used to monitor the progress of the task.
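
If you only need one file rather than the whole folder, the prefix can be extended to the full object key, since download_from_s3 fetches every key that begins with p_s3_prefix. A minimal sketch, assuming a file named EMP1.DMP under the same prefix:

SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
           p_bucket_name    => 'orcl-oracle-backup-bucket', 
           p_s3_prefix      => '20250305_ORCLV52/EMP1.DMP',  -- full key, so only this file matches
           p_directory_name => 'DATA_PUMP_DIR') 
           AS TASK_ID FROM DUAL;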

Checking Task Status

rdsadmin.rdsadmin_s3_tasks is a package, not a table, so the task cannot be queried directly. Instead, after executing this command, check the status of the task by reading its log file from the BDUMP directory:


SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-<TASK_ID>.log'));
Common Use Case
This is typically used to restore Oracle database backups (such as Data Pump export .dmp files) that were previously uploaded to S3.
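
For context, the upload direction works the same way. A minimal sketch of the upload_to_s3 counterpart, assuming the same bucket and prefix as above:

SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
           p_bucket_name    => 'orcl-oracle-backup-bucket', 
           p_prefix         => '',                    -- empty prefix = upload every file in the directory
           p_s3_prefix      => '20250305_ORCLV52/',   -- target folder in the bucket
           p_directory_name => 'DATA_PUMP_DIR') 
           AS TASK_ID FROM DUAL;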


----------------------------------

SELECT text from table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-<TASK_ID>.log'));


Note:
This SQL command reads the log file for the S3 download task from the BDUMP directory in an Oracle RDS instance.

Breakdown of the Query:
rdsadmin.rds_file_util.read_text_file

This function is used to read the contents of a file stored in an Oracle directory.

'BDUMP'

The Oracle directory where the log file is stored.

BDUMP (Background Dump Destination) is used for logging diagnostic messages.

'dbtask-<TASK_ID>.log'

The specific log file associated with the S3 download task.

<TASK_ID> should be replaced with the actual TASK_ID returned from the download_from_s3 command.
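
If you are unsure of the exact log file name, the same rds_file_util package can list the BDUMP directory first. A small sketch:

SELECT filename, mtime 
FROM table(rdsadmin.rds_file_util.listdir('BDUMP')) 
WHERE filename LIKE 'dbtask%' 
ORDER BY mtime DESC;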

How to Use This?
First, execute:


SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
    p_bucket_name    => 'orcl-oracle-backup-bucket', 
    p_s3_prefix      => '20250305_ORCLV52/', 
    p_directory_name => 'DATA_PUMP_DIR') AS TASK_ID 
FROM DUAL;

This returns a TASK_ID (e.g., 12345).

Then, check the status and progress of the task by reading its log file from BDUMP (errors are reported here as well):


SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-12345.log'));
This shows the detailed logs of the task, including errors, progress, and completion status.

Why Use This?
Helps in troubleshooting if the S3 download fails.



------------------------------------------

DECLARE
  v_hdnl NUMBER;  -- Data Pump job handle
BEGIN
  -- Open a schema-mode import job; Oracle generates the job name
  v_hdnl := DBMS_DATAPUMP.OPEN( 
    operation => 'IMPORT', 
    job_mode  => 'SCHEMA', 
    job_name  => null);
  -- Dump file to import from
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP1.DMP', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_dump_file);
  -- Log file that records progress and errors
  DBMS_DATAPUMP.ADD_FILE( 
    handle    => v_hdnl, 
    filename  => 'EMP1_to_SAMIK1_Import_24032025.log', 
    directory => 'DATA_PUMP_DIR', 
    filetype  => dbms_datapump.ku$_file_type_log_file);
  -- Remap schema EMP1 → SAMIK1 and tablespace MERMS → SAMIK1_DATA
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_SCHEMA', 'EMP1', 'SAMIK1');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_TABLESPACE', 'MERMS', 'SAMIK1_DATA');
  -- Run with two parallel workers, then start the job
  DBMS_DATAPUMP.SET_PARALLEL(v_hdnl, degree => 2);
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/


Note:
This PL/SQL block performs an Oracle Data Pump Import (IMPDP) operation, importing schema-level objects from the EMP1.DMP dump file while remapping the schema and tablespace.

Step-by-Step Breakdown:
DBMS_DATAPUMP.OPEN

Opens a Data Pump Import job (IMPORT mode) at the SCHEMA level.

job_name => null → Oracle assigns a system-generated job name.

DBMS_DATAPUMP.ADD_FILE

Adds the dump file EMP1.DMP stored in DATA_PUMP_DIR.

Adds the log file EMP1_to_SAMIK1_Import_24032025.log for tracking progress.

DBMS_DATAPUMP.METADATA_REMAP

Schema Remap: Changes all objects from schema EMP1 to SAMIK1.

Tablespace Remap: Moves objects from MERMS tablespace to SAMIK1_DATA.

DBMS_DATAPUMP.SET_PARALLEL

Sets parallel execution to 2 for improved performance.

DBMS_DATAPUMP.START_JOB

Starts the import job.

Use Case
This script is useful when you need to:
✅ Restore a schema backup from an exported dump file.
✅ Move schema objects from one user (EMP1) to another (SAMIK1).
✅ Change the storage location (tablespace) of imported objects.
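
The block above returns as soon as the job is started. If you prefer the session to block until the import finishes, DBMS_DATAPUMP.WAIT_FOR_JOB can be used instead; a minimal sketch (dump file and schema remap reused from above, other options omitted for brevity):

DECLARE
  v_hdnl  NUMBER;
  v_state VARCHAR2(30);
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(v_hdnl, 'EMP1.DMP', 'DATA_PUMP_DIR',
                         filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.METADATA_REMAP(v_hdnl, 'REMAP_SCHEMA', 'EMP1', 'SAMIK1');
  DBMS_DATAPUMP.START_JOB(v_hdnl);
  -- Blocks until the job reaches a terminal state (e.g. COMPLETED or STOPPED)
  DBMS_DATAPUMP.WAIT_FOR_JOB(v_hdnl, v_state);
  DBMS_OUTPUT.PUT_LINE('Import finished with state: ' || v_state);  -- requires SET SERVEROUTPUT ON
END;
/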


Monitoring the Import Job
After execution, you can monitor the job using:

SELECT * FROM DBA_DATAPUMP_JOBS WHERE job_name LIKE 'SYS_IMPORT%';
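
A related view, DBA_DATAPUMP_SESSIONS, shows which database sessions are attached to each running Data Pump job:

SELECT owner_name, job_name, saddr FROM DBA_DATAPUMP_SESSIONS;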
To check real-time logs:


SELECT text FROM TABLE(RDSADMIN.RDS_FILE_UTIL.READ_TEXT_FILE('DATA_PUMP_DIR', 'EMP1_to_SAMIK1_Import_24032025.log'));


Possible Enhancements
🔹 Specify job_name to manage the job explicitly. Note that job_name is the fourth positional parameter of DBMS_DATAPUMP.OPEN (after remote_link), so use named notation:


v_hdnl := DBMS_DATAPUMP.OPEN(
    operation => 'IMPORT',
    job_mode  => 'SCHEMA',
    job_name  => 'EMP1_TO_SAMIK1_JOB');

🔹 Add filters to import specific objects. Object selection is done with METADATA_FILTER (DATA_FILTER restricts rows, not objects); for example, to import only the EMPLOYEES table:

DBMS_DATAPUMP.METADATA_FILTER(v_hdnl, 'NAME_EXPR', 'IN (''EMPLOYEES'')', 'TABLE');



-------------------------------------------------------------------------------------------------------------

SELECT * FROM TABLE 
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'EMP1_to_SAMIK1_Import_24032025.log'));


Note:
This SQL command reads the log file generated by the Data Pump Import job to monitor its progress and identify any errors or warnings.

🔍 Breakdown of the Query:
rdsadmin.rds_file_util.read_text_file

This function reads the contents of a text file stored in an Oracle directory.

Parameters:

p_directory => 'DATA_PUMP_DIR'

Specifies the Oracle directory object where the log file is located.

p_filename => 'EMP1_to_SAMIK1_Import_24032025.log'

The name of the import log file generated by DBMS_DATAPUMP.

TABLE() Function

Converts the function output into a readable table format.

📌 When to Use This?
✅ To monitor an ongoing or completed Data Pump Import job.
✅ To debug errors such as missing objects, permissions, or remapping issues.
✅ To verify successful completion of the schema import (EMP1 → SAMIK1).
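
When the log is long, the output can be filtered in SQL like any other row source. A small sketch that keeps only lines mentioning ORA- errors:

SELECT text 
FROM TABLE(rdsadmin.rds_file_util.read_text_file(
    p_directory => 'DATA_PUMP_DIR',
    p_filename  => 'EMP1_to_SAMIK1_Import_24032025.log'))
WHERE text LIKE '%ORA-%';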


🚀 Next Steps:
If the log file shows errors like ORA-xxxx, you can:


🔹 Check Data Pump job status:

SELECT * FROM DBA_DATAPUMP_JOBS WHERE job_name LIKE 'SYS_IMPORT%';

🔹 Restart a stopped job (note the named-parameter syntax skip_current => 1; v_hdnl must be a valid job handle, re-acquired with DBMS_DATAPUMP.ATTACH if the original session is gone, as in the sketch after this list):

DBMS_DATAPUMP.START_JOB(v_hdnl, skip_current => 1);

🔹 Investigate log errors and re-run the import with necessary fixes.
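
Because the v_hdnl handle only exists in the session that created the job, restarting from a new session means re-attaching first. A minimal sketch (the job name and owner below are hypothetical; take the real values from DBA_DATAPUMP_JOBS):

DECLARE
  v_hdnl NUMBER;
BEGIN
  -- Re-acquire a handle on the existing job before restarting it
  v_hdnl := DBMS_DATAPUMP.ATTACH(
    job_name  => 'SYS_IMPORT_SCHEMA_01',  -- hypothetical: use the name from DBA_DATAPUMP_JOBS
    job_owner => 'ADMIN');                -- hypothetical: the job's owner
  DBMS_DATAPUMP.START_JOB(v_hdnl, skip_current => 1);
END;
/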



-----------------------------------------------------------------------------------------------------------

select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by filename, mtime;


Note:
This SQL command lists all files present in the DATA_PUMP_DIR directory within an Oracle RDS instance.

📌 Breakdown of the Query:
RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')

Retrieves file details (filename, size, and modification time) from the DATA_PUMP_DIR directory.

TABLE() Function

Converts the function's output into a readable table format.

ORDER BY filename, mtime

Sorts files alphabetically by name and chronologically by modification time (mtime).

📄 Example Output:

FILENAME                             FILESIZE  MTIME
EMP1.DMP                             2GB       2025-03-24 10:15:30
EMP1_to_SAMIK1_Import_24032025.log   15KB      2025-03-24 10:16:10
EXPDP_METADATA.LOG                   8KB       2025-03-24 10:18:45


📌 When to Use This?
✅ To verify if a dump file (.DMP) or log file (.LOG) exists before performing import/export.
✅ To monitor file sizes and timestamps in Oracle RDS.
✅ To troubleshoot missing or outdated files.
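
The output can also be filtered like any other table. A small sketch that shows only dump files, newest first:

SELECT filename, filesize, mtime 
FROM table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) 
WHERE filename LIKE '%.DMP' 
ORDER BY mtime DESC;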

🚀 Next Steps:
🔹 Read a specific log file:

SELECT * FROM TABLE(RDSADMIN.RDS_FILE_UTIL.READ_TEXT_FILE('DATA_PUMP_DIR', 'EMP1_to_SAMIK1_Import_24032025.log'));


🔹 Check active Data Pump jobs:

SELECT * FROM DBA_DATAPUMP_JOBS WHERE job_name LIKE 'SYS_IMPORT%';

🔹 Delete an unwanted file (if required) with UTL_FILE.FREMOVE, covered in the next section:

EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'OLD_BACKUP.DMP');

--------------


EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','filename');
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR','EMP3_to_SAMIK3_Import_19032025.log'); 


Note:
🗑️ Deleting a File in Oracle RDS Using UTL_FILE.FREMOVE
This command uses UTL_FILE.FREMOVE to delete a file from the DATA_PUMP_DIR directory in Oracle RDS.

📌 Explanation
UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'filename')

Deletes the specified file from the DATA_PUMP_DIR directory.

You need to replace 'filename' with the actual file you want to remove.

UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'EMP3_to_SAMIK3_Import_19032025.log')

Deletes the import log file 'EMP3_to_SAMIK3_Import_19032025.log'.

⚠️ Important Considerations
✅ Ensure the file exists before attempting to delete it. Use:


SELECT * FROM TABLE(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) 
WHERE filename = 'EMP3_to_SAMIK3_Import_19032025.log';
✅ You must have proper privileges to delete files.
✅ This operation is irreversible, so be careful before running it.
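
Since the deletion cannot be undone, a guarded variant can check for the file first with UTL_FILE.FGETATTR; a minimal sketch:

DECLARE
  v_exists BOOLEAN;
  v_length NUMBER;
  v_bsize  BINARY_INTEGER;
BEGIN
  -- FGETATTR reports whether the file exists (plus its size and block size)
  UTL_FILE.FGETATTR('DATA_PUMP_DIR', 'EMP3_to_SAMIK3_Import_19032025.log',
                    v_exists, v_length, v_bsize);
  IF v_exists THEN
    UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'EMP3_to_SAMIK3_Import_19032025.log');
  END IF;
END;
/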

🔍 Note for AWS RDS
There is no RDSADMIN.RDS_FILE_UTIL.DELETE_FILE procedure; the RDS_FILE_UTIL package provides LISTDIR and READ_TEXT_FILE. UTL_FILE.FREMOVE, as shown above, is the documented way to delete files from DATA_PUMP_DIR on Amazon RDS for Oracle, and the master user has the required UTL_FILE privileges by default.

🚀 Next Steps
🔹 Verify file deletion:

SELECT * FROM TABLE(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) 
ORDER BY filename;

🔹 Check Data Pump log files before deletion:

SELECT * FROM TABLE(RDSADMIN.RDS_FILE_UTIL.READ_TEXT_FILE('DATA_PUMP_DIR', 'EMP3_to_SAMIK3_Import_19032025.log'));
