While I totally agree with Quassnoi's suggestion that external tables do not appear to be the proper solution here, and with DCookie's analogy that you're being bound, tossed overboard, and asked to swim, there may at least be a way to structure your program so that the external table is only read once. Based on your description, my belief is that all 10 cursors are reading from the external table, which forces Oracle to scan the external table 10 times.
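If so, the procedure presumably looks something like this (a purely hypothetical reconstruction; the cursor queries and column names are invented):

DECLARE
  -- Hypothetical sketch: each cursor reads the 3 GB external table,
  -- so ten such cursors mean ten full scans of ext_temp.
  CURSOR cursor1 IS
    SELECT SUM(t.f) FROM ext_temp t WHERE t.a = 1;   -- scan 1
  CURSOR cursor2 IS
    SELECT SUM(t.f) FROM ext_temp t WHERE t.a = 2;   -- scan 2
  -- ... cursors 3 through 10, each another full scan ...
BEGIN
  NULL;  -- row-by-row processing that opens each cursor in turn
END;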
Assuming this inference is correct, the simplest answer is likely to make the external table the driving cursor, similar to what IronGoofy suggested. Depending on what some_query in the snippet

for each register in some_query

is doing, and assuming it is no coincidence that the query returns the same number of rows as the external table, the simplest option would be to do something like
FOR register IN (SELECT * FROM ext_temp)
LOOP
  -- Figure out if the row should have been part of cursor 1
  IF ( <<set of conditions>> )
  THEN
    <<do something>>
  -- Figure out if the row should have been part of cursor 2
  ELSIF ( ... )
    ...
  END IF;
END LOOP;
or
FOR register IN (SELECT *
                   FROM ext_temp a,
                        (<<some query>>) b
                  WHERE a.column_name = b.column_name)
LOOP
  -- Figure out if the row should have been part of cursor 1
  IF ( <<set of conditions>> )
  THEN
    <<do something>>
  -- Figure out if the row should have been part of cursor 2
  ELSIF ( ... )
    ...
  END IF;
END LOOP;
It should be more efficient to take things a step further and move the logic out of the cursors (and the subsequent IF statements) and into the driving cursor. Using the simpler of the two snippets above (you could, of course, join some_query to this example as well):
FOR register IN (SELECT table_4.*,
                        -- Analytic SUM: each row also carries the total
                        -- that cursor 4 would have computed
                        NVL(SUM(CASE WHEN condition1 AND condition2
                                     THEN table_4.f
                                     ELSE 0
                                 END) OVER (),
                            0) f_cursor_sum
                   FROM ext_temp table_4)
LOOP
  <<do something>>
END LOOP;
If, even after doing this, you still find that you are doing some row-by-row processing, you could go one step further and do a BULK COLLECT from the driving cursor into a locally declared collection and operate on that collection. You almost certainly don't want to fetch 3 GB of data into a local collection (crushing the PGA might lead the DBA to conclude that temporary tables aren't such a bad thing after all, but it's not something I would advise), so fetch a few hundred rows at a time using the LIMIT clause; that should make things a bit more efficient.
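A minimal sketch of that pattern (the driving query and the 500-row batch are placeholders; tune LIMIT to what your PGA can comfortably hold):

DECLARE
  CURSOR driving_cur IS
    SELECT * FROM ext_temp;  -- or the joined/aggregated query above
  TYPE ext_rows_t IS TABLE OF driving_cur%ROWTYPE;
  l_rows ext_rows_t;
BEGIN
  OPEN driving_cur;
  LOOP
    -- Fetch a modest batch rather than all 3 GB at once
    FETCH driving_cur BULK COLLECT INTO l_rows LIMIT 500;
    EXIT WHEN l_rows.COUNT = 0;
    FOR i IN 1 .. l_rows.COUNT
    LOOP
      NULL;  -- operate on l_rows(i) here
    END LOOP;
  END LOOP;
  CLOSE driving_cur;
END;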