Background:
We have a database environment where views call views that call yet more views. The logic has become so complex that a change to an underlying view can have a significant impact on the top-level view the client application queries.
While we document all of the logic and figure out how to unwind it, development continues and performance keeps degrading.
Currently I run an explain plan on a client query by hand and dig into tuning it. This is slow and tedious, and changes may go unexamined for ages.
Problem:
I want to generate a report that lists, per SQL ID, the change in actual time, the discrepancy between estimated and actual rows, and the change in buffers and reads, all compared against the average computed over the last month.
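Conceptually, the report query I have in mind would look something like the sketch below. The plan_stats_history table is hypothetical (one row per SQL ID per daily capture; a job that could populate it is sketched near the end of this question), and the 50% threshold is just an example:
-- flag statements whose latest capture is well above their 30-day average
SELECT cur.sql_id,
       cur.elapsed_sec,
       hist.avg_elapsed_sec,
       cur.actual_rows - cur.est_rows          AS row_estimate_gap,
       cur.buffer_gets - hist.avg_buffer_gets  AS buffer_gets_change,
       cur.disk_reads  - hist.avg_disk_reads   AS disk_reads_change
FROM   plan_stats_history cur
       JOIN (SELECT sql_id,
                    AVG(elapsed_sec) AS avg_elapsed_sec,
                    AVG(buffer_gets) AS avg_buffer_gets,
                    AVG(disk_reads)  AS avg_disk_reads
             FROM   plan_stats_history
             WHERE  capture_date >= SYSDATE - 30
             GROUP  BY sql_id) hist
         ON hist.sql_id = cur.sql_id
WHERE  cur.capture_date >= TRUNC(SYSDATE)              -- today's capture only
AND    cur.elapsed_sec  >  hist.avg_elapsed_sec * 1.5; -- arbitrary 50% regression threshold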
Today, I generally run the following script manually and evaluate it based on just that day's output.
ALTER SESSION SET statistics_level=all;
set linesize 256;
set pagesize 0;
set serveroutput off;
-- run the client query here, then pull the actual row-source statistics
-- for the last statement executed in this session:
SELECT *
FROM   TABLE (DBMS_XPLAN.display_cursor (NULL, NULL, 'ALLSTATS LAST'));
What I am trying to do is automate that explain plan query and insert the statistics into a table. From there I can run a regression report that detects changes in performance and alerts the developers.
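Roughly, the capture side I am picturing is something like the sketch below. The table, procedure, and job names are placeholders of mine; the figures come from V$SQL joined to V$SQL_PLAN_STATISTICS_ALL, which only holds actual row counts for cursors executed with STATISTICS_LEVEL=ALL or the GATHER_PLAN_STATISTICS hint:
-- hypothetical history table, roughly one row per SQL ID / child cursor per capture
CREATE TABLE plan_stats_history (
    capture_date    DATE,
    sql_id          VARCHAR2(13),
    plan_hash_value NUMBER,
    elapsed_sec     NUMBER,     -- per-execution average
    buffer_gets     NUMBER,     -- per-execution average
    disk_reads      NUMBER,     -- per-execution average
    est_rows        NUMBER,     -- optimizer estimates summed over plan lines
    actual_rows     NUMBER      -- actual rows summed over plan lines
);

CREATE OR REPLACE PROCEDURE capture_plan_stats IS
BEGIN
    INSERT INTO plan_stats_history
    SELECT SYSDATE, x.sql_id, x.plan_hash_value,
           x.elapsed_sec, x.buffer_gets, x.disk_reads,
           x.est_rows, x.actual_rows
    FROM  (SELECT s.sql_id,
                  s.plan_hash_value,
                  s.elapsed_time / NULLIF(s.executions, 0) / 1e6 AS elapsed_sec,
                  s.buffer_gets  / NULLIF(s.executions, 0)       AS buffer_gets,
                  s.disk_reads   / NULLIF(s.executions, 0)       AS disk_reads,
                  SUM(p.cardinality)      AS est_rows,   -- crude per-statement proxy
                  SUM(p.last_output_rows) AS actual_rows -- NULL unless row-source stats were gathered
           FROM   v$sql s
                  JOIN v$sql_plan_statistics_all p
                    ON  p.sql_id       = s.sql_id
                    AND p.child_number = s.child_number
           WHERE  s.parsing_schema_name = 'APP_SCHEMA'   -- placeholder filter
           GROUP  BY s.sql_id, s.plan_hash_value,
                     s.elapsed_time, s.executions,
                     s.buffer_gets, s.disk_reads) x;
    COMMIT;
END;
/

-- schedule a daily capture
BEGIN
    DBMS_SCHEDULER.create_job(
        job_name        => 'CAPTURE_PLAN_STATS_JOB',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'CAPTURE_PLAN_STATS',
        repeat_interval => 'FREQ=DAILY;BYHOUR=1',
        enabled         => TRUE);
END;
/
The regression query sketched under "Problem" would then run right after each capture and alert the developers whenever it returns rows.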
I was thinking something like this would be common enough that I would not have to resort to OEM, but I can't find anything, so I wonder: is there a more common approach to this?