
subquery factoring

Report Generators And Query Transformations

Usually the Cost-Based Optimizer arrives at a reasonable execution plan if it gets the cardinality and data scattering / clustering estimates right (if you want to learn more about that, why not watch my webinar available at "AllThingsOracle.com"?).

Here is an example I've recently come across where this wasn't the case - the optimizer clearly preferred plans with a significantly higher cost.

The setup to reproduce the issue is simple:

Subquery Factoring

I have a small collection of postings where I’ve described anomalies or limitations in subquery factoring (the “with” subquery, or Common Table Expression (CTE) to give it its official ANSI name). Here’s another example of Oracle’s code not behaving consistently. You may recognise the basic query from yesterday’s example of logical tuning – so I won’t reprint the code to generate the data sets. The examples in this note were created on 11.2.0.2 – we start with a simple query and its execution plan:
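
The query and plan themselves aren’t included in this excerpt; for readers who haven’t met the construct, here’s a minimal sketch of the general shape of a factored subquery (the table and column names are invented for illustration, not taken from the article):

with product_totals as (
        select
                product_id,
                sum(quantity)   total_qty
        from
                sales
        group by
                product_id
)
select
        p.product_name,
        t.total_qty
from
        products        p,
        product_totals  t
where
        t.product_id = p.product_id
;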

Subquery Factoring

Here’s an interesting little conundrum about subquery factoring that hasn’t changed in the recent (11.2.0.3) patch for subquery factoring. It came to me from Jared Still (a fellow member of the Oak Table Network) shortly after I’d made some comments about the patch. It’s an example based on the scott/tiger schema – which I’ve extracted from the script $ORACLE_HOME/rdbms/admin/utlsampl.sql (though the relevant scripts may be demobld.sql or scott.sql, depending on version).

As usual I’ve used an 8KB block size, LMT with uniform 1MB extents, and no ASSM to hold the data. I won’t reproduce the code to generate the schema, just the versions of the query with, then without, subquery factoring:
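
The excerpt doesn’t reproduce the two queries, but as a purely illustrative sketch of the pattern (not the queries from the original article) the factored and inline forms against the scott.emp table might look something like this:

-- factored version
with dept_avg as (
        select
                deptno,
                avg(sal)        avg_sal
        from
                emp
        group by
                deptno
)
select
        e.ename,
        e.sal
from
        emp             e,
        dept_avg        d
where
        d.deptno = e.deptno
and     e.sal    > d.avg_sal
;

-- equivalent version with the subquery written inline
select
        e.ename,
        e.sal
from
        emp     e,
        (
                select
                        deptno,
                        avg(sal)        avg_sal
                from
                        emp
                group by
                        deptno
        )       d
where
        d.deptno = e.deptno
and     e.sal    > d.avg_sal
;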

Subquery Factoring

It’s always worth browsing through the list of Oracle’s bug fixes each time a new release or patch comes out because it can give you clues about where to look for problems in your production release – and how to anticipate problems on the upgrade. This article is an example of a fix that I found while looking at the note for 11.2.0.3 quite recently.

Test Data

The UKOUG conference is over for another year – but it has left me with plenty to do and lots of things to investigate. Here’s just one little point that I picked up during one of the 10 minute “Oak Talks” that members of the Oak Table Network were doing in the lunch breaks.

There is a fairly well-known strategy for generating a list of numbers by using a “select from dual … connect by …” query, but I hadn’t realised that there were two ways of using it. The code I’ve usually used is this:

select
        rownum id 
from
        dual 
connect by 
        rownum <= 4000
;

But it looks as if most people use it like this:

select
        rownum id 
from
        dual 
connect by 
        level <= 4000
;
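
Whichever form you prefer, the row generator is typically used as a row source for building test data. Here’s a small hypothetical example (the table name and columns are my own, not from the original post):

create table t1
as
select
        rownum                  id,
        mod(rownum,100)         n1,
        rpad('x',100)           padding
from
        dual
connect by
        level <= 4000
;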

CONNECT BY oddity

This week I’ve seen an issue with a CONNECT BY query: for some reason Oracle 10.2.0.4 decided to build a weird plan (the query is weird too, but that’s not my point here). An explanation of why that happened looks interesting, so here it is. Set up: drop table t2 cascade constraints purge; drop [...]