
Migrating Oracle On-Premises to Amazon RDS for Oracle

Author: Ranjit Nagi | 7 min read | May 14, 2025

Migrating your Oracle database to Amazon Relational Database Service (Amazon RDS) can be daunting, but with the right planning and execution it can significantly improve performance, scalability, and cost-efficiency.

This comprehensive guide outlines a step-by-step approach to ensure a smooth and successful migration process. From evaluating your current environment to handling post-migration validation, we’ve got you covered.

Whether you’re a seasoned database administrator or a project manager overseeing the transition, this guide will provide valuable insights and practical tips to make your Oracle migration journey seamless.

Pre-Migration Planning

Evaluate Current Environment

  • Assess the existing on-premises Oracle version, storage, and performance needs.
  • Review top queries, the alert log, and AWR/Statspack reports.
  • Identify key dependencies such as custom applications or integrations.
  • Ensure Oracle license compatibility with AWS RDS.
  • Choose the appropriate RDS instance type (e.g., db.t3.xlarge).
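The source version and overall database size can be captured with standard data dictionary queries (a sketch; run as a DBA user on the source database):

SELECT banner FROM v$version;

SELECT ROUND(SUM(bytes)/1024/1024/1024) AS size_gb FROM dba_segments;

These numbers feed directly into the RDS instance and storage sizing decisions below.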

Preparing for Migration

Backup Strategy

  • Take full backups of your Oracle database before migration.
  • Define the backup strategy (full, incremental, etc.).

Schema Review

  • Analyze existing schemas and database configurations.
  • Identify unsupported Oracle features in RDS.
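One way to spot features in use that may not be supported in RDS is the feature usage view (a sketch; review the output against the RDS for Oracle documentation):

SELECT name, detected_usages
FROM dba_feature_usage_statistics
WHERE currently_used = 'TRUE';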

Choosing the Oracle Cloud Migration Strategy

When choosing a migration strategy, consider data volume, schema complexity, downtime tolerance, and available resources. Options include full migration for simplicity, or phased migration (schema-by-schema) for better control, flexibility, and troubleshooting during the move to Oracle RDS.

Options for migration:

  1. AWS Database Migration Service (DMS) for continuous replication.
  2. Oracle Data Pump (expdp/impdp) for export/import.
  3. Backup/Restore via Amazon S3 or RDS snapshots.

Chosen Strategy: expdp/impdp

Migration Process

  • Set up the Amazon RDS for Oracle instance.
  • Create the necessary tablespaces, roles, and profiles to match the source database.
  • Export the on-premises database; we used a full export (expdp full=y).
  • Create an S3 bucket in AWS and grant the necessary IAM privileges.
  • Set up the AWS CLI on the on-premises server.
  • Copy the dump file to S3.
  • Import the required schemas into RDS.

Choosing the RDS Instance Type

  • Source DB: 16GB memory, 2 CPUs
  • Chosen RDS instance: db.t3.xlarge (4 vCPUs, 16 GB of memory)

License Configuration

  • Option: Bring Your Own License (BYOL) or License Included
  • Customer opted for License Included

Handling RDS Bugs (Database Name Issue)

An RDS for Oracle instance is created with the default database name “orcl”, and changing DB_NAME on an existing instance is not supported. The workaround is to take a snapshot of the instance and restore it with the desired database name.

Solution: Restore an RDS snapshot to a new RDS instance.

Steps:

  1. Take a Snapshot:
    • Go to RDS → Databases → Select DB → Actions → Take Snapshot.
    • Enter snapshot name → Take Snapshot.
  2. Restore the Snapshot:
    • Navigate to RDS → Snapshots → Search for snapshot.
    • Select snapshot → Actions → Restore Snapshot.
    • In “Additional Configuration” → Set “Initial Database Name” → Restore.
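The same snapshot-and-restore sequence can be scripted with the AWS CLI (the instance, snapshot, and database names here are placeholders):

aws rds create-db-snapshot --db-instance-identifier mydb --db-snapshot-identifier mydb-snap

aws rds restore-db-instance-from-db-snapshot --db-instance-identifier mydb-restored --db-snapshot-identifier mydb-snap --db-name NEWNAME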

AWS CLI Configuration

Configure the AWS CLI on the source server by installing the CLI package, running aws configure, and verifying access with aws s3 ls.
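A minimal install sketch on Linux, using the bundle URL from the AWS CLI documentation:

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o awscliv2.zip
unzip awscliv2.zip
sudo ./aws/install
aws --version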

Steps to Configure AWS CLI on the Source Database

[oracle@pridb-dev aws]$ ./aws configure
AWS Access Key ID [****************SBW5]: ####################
AWS Secret Access Key [****************Fr1w]: ####################
Default region name [US East]: us-east-1
Default output format [json]:

Export Dump File & Upload to S3

A full database export using Oracle Data Pump (expdp):

expdp directory=expdp dumpfile=export_test.dmp logfile=export_test.log full=y

The export took around 2 hours to complete, and the dump file was around 250 GB.

Copy the export dump file to the S3 bucket:

./aws s3 cp /US2/export/export_dev1.dmp s3://Bucketname/ --profile=profilename

After the copy completes, download the dump file into DATA_PUMP_DIR on the RDS instance for import:

SQL> SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
  p_bucket_name    =>  'bucketeandev',
  p_directory_name =>  'DATA_PUMP_DIR')
AS TASK_ID FROM DUAL;

To check the progress of the download job:

SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-1735051030083-1282.log'));
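Once the task finishes, you can confirm the dump file landed in DATA_PUMP_DIR using the listdir helper from the same rdsadmin package family:

SELECT filename, filesize FROM TABLE(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'));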

Create Tablespace, Profile, and Role in Target Database

Run the following on the source database to generate the tablespace DDL:

SELECT DBMS_METADATA.get_ddl ('TABLESPACE', tablespace_name)
FROM dba_tablespaces
WHERE tablespace_name = UPPER('DATA');
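Note that RDS for Oracle uses Oracle Managed Files, so file paths from the source DDL should be dropped when creating tablespaces on the target; a simple form (the size is an example value):

CREATE TABLESPACE data DATAFILE SIZE 10G AUTOEXTEND ON MAXSIZE UNLIMITED;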

Import Schemas into RDS

We used a schema-by-schema import for better control and error handling:

DECLARE
  v_hdnl NUMBER;
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN(
    operation => 'IMPORT',
    job_mode  => 'SCHEMA',
    job_name  => null,
    version   => '19.0.0');

  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdnl,
    filename  => 'export_dev.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);

  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdnl,
    filename  => 'BO_EAN_PRI_imp.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_log_file);

  DBMS_DATAPUMP.METADATA_FILTER(v_hdnl,'SCHEMA_EXPR','IN (''PRI_RESTORE'', ''BO_USER'', ''EAN_USER'')');
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/
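The import log written by the job above can be read with the same rds_file_util helper used earlier for the download task:

SELECT text FROM TABLE(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR','BO_EAN_PRI_imp.log'));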

Compiling and Validating Objects

Compile Schema

Instead of utlrp.sql, use this for Oracle RDS:

BEGIN
  DBMS_UTILITY.compile_schema(schema => 'RMS_USER2');
END;
/

Compare Objects

SELECT status, owner, COUNT(*)
FROM dba_objects
WHERE owner = 'RMS_USER2'
GROUP BY status, owner;

Conclusion

Migrating Oracle to RDS requires careful planning, from handling RDS-specific limitations to schema compilation and validation. By following a structured approach—configuring AWS CLI, exporting/importing data, compiling schemas, and verifying objects—we ensure a smooth transition. Continuous monitoring and troubleshooting help maintain database integrity and performance post-migration.

Get more data engineering resources at Datavail.com.
