Real Application Testing

By Dallas Willett | In Blog, Oracle | April 6th, 2011

As promised ahead of our upcoming presentation at this year’s Collaborate 2011 Conference, here are our Real Application Testing scripts. At the conference, Datavail and WTS will show how we used Oracle’s Real Application Testing to tune a production workload from a JD Edwards EnterpriseOne database running on Oracle.

After capturing a production workload, we created a clone of the production database on a test system. We were then able to iteratively test this workload using Database Replay, analyze performance data from the tests, and make tuning changes. We compared performance data between different test runs to see how well we had tuned the database against a real-life production workload. The results were pretty impressive. The two main scripts we created to do this were called test.sql and tune.sql; their contents make up the rest of this blog post.

Database Replay Script

The core of our Database Replay logic is in a script we called test.sql. We kicked this off and let the replay run for a couple of hours before coming back and issuing our tuning commands.

test.sql
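In outline, the flow looks something like this. This is only a sketch: the restore point, replay name, directory object, and snapshot SCNs below are placeholders rather than values from our environment.

-- Sketch of the replay driver; names and SCN values are placeholders.

-- Save the SQL Profiles accepted so far; the flashback below would wipe them out
@save_sql_profiles.sql

-- Reset the test database to its pre-replay state
shutdown immediate
startup mount
flashback database to restore point BEFORE_REPLAY;
alter database open resetlogs;

-- Re-apply the tuning changes on the freshly flashed-back database
@load_sql_profiles.sql
@schema_changes.sql

-- Stage the replay
exec dbms_workload_replay.initialize_replay(replay_name => 'jde_replay', replay_dir => 'REPLAY_DIR')
@delete_old_replay_info.sql
@remap_connections.sql
exec dbms_workload_replay.prepare_replay()

-- Start the wrc clients on the Linux host (run_wrc.bash), then begin the replay
exec dbms_workload_replay.start_replay()

-- Take consistent AWR snapshots and end the replay (see replay_minder.proc)
exec replay_minder(p_snap_scns => sys.odcinumberlist(&begin_scn, &end_scn))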

save_sql_profiles.sql (called from test.sql)

The output from the SQL Tuning Advisor includes both index changes and new SQL Profiles for changes to query plans. When we flash back the database, we lose the SQL Profiles stored in it, so we use Data Pump to export the SQL Profiles and load them back in later.
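A sketch of the save step, assuming a SYSTEM-owned staging table, the default DATA_PUMP_DIR directory object, and a &system_pwd substitution variable for the Data Pump credentials:

-- Drop any staging table left over from a previous round (ignore ORA-00942 the first time)
drop table system.sql_profile_stg purge;

begin
  -- Create a staging table and pack the SQL Profiles into it
  dbms_sqltune.create_stgtab_sqlprof(table_name => 'SQL_PROFILE_STG', schema_name => 'SYSTEM');
  dbms_sqltune.pack_stgtab_sqlprof(staging_table_name   => 'SQL_PROFILE_STG',
                                   staging_schema_owner => 'SYSTEM');
end;
/

-- Export the staging table so the profiles survive the flashback
host expdp system/&system_pwd tables=SYSTEM.SQL_PROFILE_STG directory=DATA_PUMP_DIR dumpfile=sql_profiles.dmp reuse_dumpfiles=y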

load_sql_profiles.sql (called from test.sql)
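The reload is the mirror image: import the staging table that save_sql_profiles.sql exported and unpack it back into the data dictionary (a sketch, using the same placeholder names as above):

-- Bring the exported staging table back in after the flashback
host impdp system/&system_pwd tables=SYSTEM.SQL_PROFILE_STG directory=DATA_PUMP_DIR dumpfile=sql_profiles.dmp table_exists_action=replace

begin
  -- Recreate the SQL Profiles, replacing any that already exist
  dbms_sqltune.unpack_stgtab_sqlprof(replace              => TRUE,
                                     staging_table_name   => 'SQL_PROFILE_STG',
                                     staging_schema_owner => 'SYSTEM');
end;
/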

schema_changes.sql (called from test.sql)

The output of the SQL Tuning Advisor also contains index changes. We put those changes in a script called schema_changes.sql.
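For illustration only, the entries are plain DDL pasted in from the advisor's recommendations; the object names below are hypothetical:

-- Hypothetical examples; the real statements are copied verbatim from the
-- SQL Tuning Advisor output.
create index PRODDTA.F4211_TUNE_IX1 on PRODDTA.F4211 (SDAN8, SDDCTO);
create index PRODDTA.F0911_TUNE_IX1 on PRODDTA.F0911 (GLANI, GLDGJ);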

delete_old_replay_info.sql (called from test.sql)

Before remapping our connections, we needed to purge the old replay data loaded up by previous replay initializations.
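A sketch of one way to do the cleanup, keeping only the most recently initialized replay (an assumption about how the purge is scoped):

-- Remove replay metadata from earlier iterations so that only the replay we
-- just initialized remains in DBA_WORKLOAD_REPLAYS and the connection map.
begin
  for r in (select id
              from dba_workload_replays
             where id <> (select max(id) from dba_workload_replays))
  loop
    dbms_workload_replay.delete_replay_info(r.id);
  end loop;
end;
/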

remap_connections.sql (called from test.sql)

We ran our Workload Replay Clients (wrc) on a host other than the test database host, so we needed to remap connections, replacing the hostnames and IP addresses of the production system with those of the test system. The DBA_WORKLOAD_CONNECTION_MAP view holds all of the mappings.
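A sketch of the remapping loop; the replay connect string for the test listener is an assumption:

-- Point every captured connection at the test system's listener
begin
  for c in (select conn_id
              from dba_workload_connection_map
             where replay_id = (select max(id) from dba_workload_replays))
  loop
    dbms_workload_replay.remap_connection(
      connection_id     => c.conn_id,
      replay_connection => '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=testdb01)(PORT=1521))' ||
                           '(CONNECT_DATA=(SERVICE_NAME=JDETEST)))');
  end loop;
end;
/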

run_wrc.bash (called from test.sql)

While our production database and cloned database were on Windows Server 2003, our replay clients were run from a Linux host. Here’s the shell script we used to kick off the Workload Replay Clients (wrc) during the Database Replay test.
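A sketch of such a launcher; the client count, connect identifier, password variable, and replay directory path are assumptions:

#!/bin/bash
# Launch several Workload Replay Clients in parallel against the test database.
# CLIENTS, the connect identifier, SYSTEM_PWD, and REPLAY_DIR are placeholders.
CLIENTS=4
REPLAY_DIR=/u01/app/oracle/replay

for i in $(seq 1 "$CLIENTS"); do
  nohup wrc system/"${SYSTEM_PWD}"@JDETEST mode=replay replaydir="$REPLAY_DIR" \
    > "wrc_${i}.log" 2>&1 &
done

# Wait for every client to finish before the script exits
wait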

replay_minder.proc (called from test.sql)

We wanted to make sure that the AWR Snapshots that we took were consistent between Database Replay runs. To do that, we created a procedure to take the snapshots at particular SCNs during the replay. This script also terminates the replay after the final AWR Snapshot is taken.
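A rough sketch of the idea (not the original procedure): poll the database's current SCN, take a snapshot as each target SCN is passed, and cancel the replay after the last one. The parameter names, the polling interval, and the use of V$DATABASE are assumptions.

-- Sketch only. Requires EXECUTE on DBMS_LOCK and a direct grant on V_$DATABASE.
create or replace procedure replay_minder(
  p_snap_scns in sys.odcinumberlist,   -- SCNs at which to take AWR snapshots
  p_poll_secs in number default 60)    -- how often to check progress
as
  l_scn number;
begin
  for i in 1 .. p_snap_scns.count loop
    loop
      select current_scn into l_scn from v$database;
      exit when l_scn >= p_snap_scns(i);
      dbms_lock.sleep(p_poll_secs);
    end loop;
    -- Take the snapshot as soon as the target SCN has been passed
    dbms_workload_repository.create_snapshot;
  end loop;
  -- Final snapshot taken; stop the replay
  dbms_workload_replay.cancel_replay;
end;
/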

SQL Tuning Advisor Script

Our tuning script creates a SQL Set from the two AWR Snapshots taken during the Database Replay. It orders the SQL statements by “elapsed_time” and pulls out the top-N statements (we tuned the top-100 SQL statements in our tests). The SQL Set is the input for the SQL Tuning Advisor. We found that Oracle spends a few minutes tuning any given SQL statement, and takes 1-3 hours to tune the top-100 SQL statements.

tune.sql
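A sketch of the tuning setup, assuming the two replay AWR snapshot IDs are supplied as &begin_snap and &end_snap, with placeholder names for the SQL Tuning Set and the tuning task:

-- Build a SQL Tuning Set from the top-100 statements by elapsed time
declare
  l_cursor dbms_sqltune.sqlset_cursor;
begin
  dbms_sqltune.create_sqlset(sqlset_name => 'REPLAY_TOP100',
                             description => 'Top SQL by elapsed time during the replay');

  open l_cursor for
    select value(p)
      from table(dbms_sqltune.select_workload_repository(
                   begin_snap       => &begin_snap,
                   end_snap         => &end_snap,
                   ranking_measure1 => 'elapsed_time',
                   result_limit     => 100)) p;

  dbms_sqltune.load_sqlset(sqlset_name => 'REPLAY_TOP100', populate_cursor => l_cursor);
end;
/

-- Create and run a SQL Tuning Advisor task over that SQL Tuning Set
declare
  l_task varchar2(64);
begin
  l_task := dbms_sqltune.create_tuning_task(sqlset_name => 'REPLAY_TOP100',
                                            task_name   => 'REPLAY_TUNING_TASK',
                                            time_limit  => 10800);  -- up to 3 hours
  dbms_sqltune.execute_tuning_task(task_name => l_task);
end;
/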

We then issued the following commands to pull out the report and script generated by the SQL Tuning Advisor:
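Something along these lines, using the placeholder task name from the tune.sql sketch above (the spool file names are arbitrary):

set long 1000000 longchunksize 1000000 linesize 200 pagesize 0 trimspool on

-- Human-readable report of findings and recommendations
spool sta_report.txt
select dbms_sqltune.report_tuning_task('REPLAY_TUNING_TASK') from dual;
spool off

-- Executable script containing the recommended actions
spool sta_script.sql
select dbms_sqltune.script_tuning_task('REPLAY_TUNING_TASK') from dual;
spool off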

We looked through the tuning report and made judgments about the individual recommendations from the SQL Tuning Advisor. Do we want to add this particular SQL Profile? Do we want to add that new index? For a given SQL statement, if we wanted to accept a SQL Profile, we would run the dbms_sqltune.accept_sql_profile command from the report or script. Likewise, if we wanted to accept a new index recommendation, we would take the CREATE INDEX command and place it in our schema_changes.sql file. After reviewing each SQL statement's recommendations, we were ready to start another round of testing with the new tuning changes. We judged our tuning efforts by the reduction in “DB Time” in the AWR reports compared across different test runs.

Dallas Willett
Blue Gecko – Remote DBA Services
