Tim Hall inspired me with his #ThanksOTN (or “OTN Appreciation Day”) campaign to continue my mini-series on leveraging SQL Developer Reports for DBA tasks. Today: visualizing historical system statistics from Statspack for performance analysis.
#ThanksOTN I have two favourite Oracle features to rave about today:
- Analytic SQL
- SQL Developer (particularly the reports)!
Up to Oracle 11.2 it was possible to display archived SQL plans from Statspack using DBMS_XPLAN. I have made use of this in some of my scripts and SQL Developer Reports ever since I first saw it in Christian Antognini’s book “Troubleshooting Oracle Performance“.
But in 12c (here: a 12.1 release on Linux), there’s a piece missing now:
select * from table(dbms_xplan.display(
  table_name   => 'perfstat.stats$sql_plan',
  statement_id => null,
  format       => 'ALL -predicate -note',
  filter_preds => 'plan_hash_value = ' || &&phv
));
ERROR: an uncaught error in function display has happened; please contact Oracle support
Please provide also a DMP file of the used plan table perfstat.stats$sql_plan
ORA-00904: "TIMESTAMP": invalid identifier
So it looks like STATS$SQL_PLAN wasn’t synchronized with the changes to PLAN_TABLE in 12c. Maybe because the timestamp wouldn’t make much sense there anyway, maybe simply because Oracle forgot.
==> Quick and most certainly unsupported workaround:
ALTER TABLE perfstat.stats$sql_plan ADD timestamp INVISIBLE AS (cast(NULL AS DATE));
Another workaround could be to create a separate view with an additional dummy timestamp column and reference the view instead of the table. I chose to stick with the invisible column solution so I won’t have to create new objects in the DB and change my scripts to use them.
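For reference, the view-based alternative might look like this (a sketch only; the view name is my own invention and this approach is just as unsupported as the invisible column):

```sql
-- Hypothetical workaround: a view over STATS$SQL_PLAN that adds the
-- TIMESTAMP column DBMS_XPLAN.DISPLAY expects in 12c.
CREATE OR REPLACE VIEW perfstat.stats$sql_plan_v AS
SELECT p.*, CAST(NULL AS DATE) AS timestamp
FROM   perfstat.stats$sql_plan p;

-- ...and then reference the view instead of the table:
select * from table(dbms_xplan.display(
  table_name   => 'perfstat.stats$sql_plan_v',
  statement_id => null,
  format       => 'ALL -predicate -note',
  filter_preds => 'plan_hash_value = ' || &&phv
));
```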
Hopefully, this will be solved in 12.2 – at least, I have filed an SR / enhancement request with Oracle Support.
Today here’s a shorter post about my experiments with Oracle SQL Developer’s user-defined reports: A report on all long running operations (“LongOps”) with details on session wait events, explain plans and live SQL monitoring.
“Wait a minute”, you might say, “there’s already the session report in SQL Developer’s standard reports that shows Session_LongOps”! – and you’re right, it’s been integrated in SQLDev for quite some time.
BUT: I wanted to view the LongOps from a different perspective: To find out what’s taking so long in the database generally, it would be better in my humble opinion to start with ALL LongOps and drill down from there.
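The core of such a report could be a simple query over GV$SESSION_LONGOPS (a sketch; the column selection and filters are my own choices, not the exact report definition):

```sql
-- All long operations still in progress, across all instances,
-- longest-running first. Completed rows (sofar = totalwork) are filtered out.
SELECT inst_id, sid, serial#, opname, target,
       ROUND(sofar / NULLIF(totalwork, 0) * 100, 1) AS pct_done,
       time_remaining, elapsed_seconds, sql_id
FROM   gv$session_longops
WHERE  sofar < totalwork
ORDER  BY elapsed_seconds DESC;
```

From there, child reports can drill down into the session’s wait events and plan using the SID and SQL_ID columns as bind parameters.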
Here’s how it looks on a live system:
This is the second post in my mini-series on leveraging SQL Developer Reports for DBA tasks, today visualizing Average Active Sessions (AAS). In this article I’ll cover:
- What AAS is and how to interpret it
- How to build a basic line graph in SQL Developer
- How to extend the graph with detailed child reports (Time Model Statistics and Top Events)
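The basic idea behind an AAS graph from Statspack is to take the cumulative “DB time” statistic and turn it into per-interval deltas divided by wall-clock time. A sketch of such a query, assuming a typical Statspack schema (table and column names should be verified against your installation; a multi-instance setup would also need joins on DBID and INSTANCE_NUMBER):

```sql
-- AAS per Statspack snapshot interval:
-- delta of 'DB time' (microseconds) / elapsed wall-clock seconds.
SELECT snap_time,
       ROUND(db_time_delta / NULLIF(elapsed_sec, 0), 2) AS aas
FROM (
  SELECT s.snap_time,
         (tm.value - LAG(tm.value) OVER (ORDER BY s.snap_id)) / 1e6
             AS db_time_delta,                       -- microseconds -> seconds
         (s.snap_time - LAG(s.snap_time) OVER (ORDER BY s.snap_id)) * 86400
             AS elapsed_sec                          -- days -> seconds
  FROM   perfstat.stats$snapshot s
  JOIN   perfstat.stats$sys_time_model tm ON tm.snap_id = s.snap_id
  JOIN   perfstat.stats$time_model_statname n ON n.stat_id = tm.stat_id
  WHERE  n.stat_name = 'DB time'
)
WHERE elapsed_sec > 0
ORDER BY snap_time;
```

The LAG analytic function is what makes this pleasant to write – one of the reasons Analytic SQL is on my favourites list above.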
If you run Oracle Standard Edition or haven’t licensed Diagnostics Pack for Enterprise Edition, then you don’t have AWR and ASH data available. This is when Statspack, the predecessor of AWR, comes in handy to keep a history of database performance metrics.
Although Oracle still delivers Statspack with its recent DB releases (yes, even in 12c it’s not dead!), there are few tools that support it. But wait – Oracle SQL Developer has a nice reporting feature built in, so why not build custom Statspack reports for this great free tool?
This motivated me to start a series on leveraging SQL Developer Reports for DBA tasks, starting with visualizing logical I/O history.
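As a first taste of what such a report runs underneath, here is a sketch of a logical I/O history query against Statspack (table and column names per a typical PERFSTAT schema – verify against your own; a RAC setup would additionally need DBID/INSTANCE_NUMBER in the joins):

```sql
-- Logical I/O per Statspack interval: statistics in STATS$SYSSTAT are
-- cumulative, so LAG() turns them into per-interval deltas.
SELECT snap_time,
       value - LAG(value) OVER (ORDER BY snap_id) AS logical_reads
FROM (
  SELECT s.snap_id, s.snap_time, st.value
  FROM   perfstat.stats$snapshot s
  JOIN   perfstat.stats$sysstat st ON st.snap_id = s.snap_id
  WHERE  st.name = 'session logical reads'
)
ORDER BY snap_time;
```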
First things first:
Thou shalt not explicitly set AQ_TM_PROCESSES=0!
Unless, of course, you want to disable the Queue Manager Process (QMNC in Oracle 11.x). Which you may want to do during database upgrades to prevent Streams or Advanced Queueing from interfering with the upgrade process.
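The important part is to remove the explicit setting again once the upgrade is done – by resetting the parameter rather than guessing a “good” value. A sketch (standard ALTER SYSTEM syntax; adjust SCOPE to your spfile/pfile setup):

```sql
-- Before the upgrade: disable the queue manager for this instance only.
ALTER SYSTEM SET aq_tm_processes = 0 SCOPE = MEMORY;

-- Afterwards: remove the explicit setting from the spfile so Oracle
-- auto-manages the queue monitor processes again.
ALTER SYSTEM RESET aq_tm_processes SCOPE = SPFILE SID = '*';
-- (an instance restart, or an explicit ALTER SYSTEM SET back to the
--  default, makes the reset effective in memory as well)
```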
However, if you don’t reset this parameter afterwards, you might run into the following scenario the next time you do a Data Pump export or import:
One of my customers uses AppDynamics as their tool of choice for Application Performance Management of a multi-tier, B2B/B2C application environment with several Oracle databases at the back end. This tool gives a graphical overview of the application infrastructure, discovers (JDBC or .NET) database queries automatically, and provides metrics on the response time for those database calls as well as metrics from the servers the application components run on.
To my surprise, I found out that there are so-called “Machine Agents” that can be run on the servers to collect performance and other metrics, but there wasn’t any component for collecting metrics from the Oracle RDBMS itself. Thus the idea was born to create such a component (a “Monitor”, in AppDynamics terminology)!