Using the analytical function NTILE, I was able to
divide a large volume of data into multiple batches.
Once the data is split into batches, it is easy to
implement multi-threading without missing any record.
The following query divides the Items data (for Org 489) into
5 nearly equal batches, and returns the MIN and MAX item IDs from each batch.
select min(inventory_item_id) min_item_id,
       max(inventory_item_id) max_item_id,
       count(*) batch_count, batch batch_num
  from (select inventory_item_id,
               NTILE(5) over (order by inventory_item_id) batch
          from mtl_system_items_b
         where organization_id = 489)
 group by batch;
Query Output:
MIN_ITEM_ID | MAX_ITEM_ID | BATCH_COUNT | BATCH_NUM
     158281 |      299596 |        7092 |         1
     299598 |      400344 |        7092 |         2
     509671 |      589229 |        7091 |         4
     589230 |      961093 |        7091 |         5
     400345 |      509659 |        7092 |         3
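The slightly uneven counts above (7092 vs 7091) are exactly what NTILE produces: each bucket gets the row total divided by the bucket count, and the remainder rows go to the lowest-numbered buckets. A minimal Python sketch of that sizing rule, using the 35,458-row total implied by the output above:

```python
def ntile_sizes(total_rows, buckets):
    """Mimic NTILE bucket sizing: every bucket gets total_rows // buckets
    rows, and the first (total_rows % buckets) buckets get one extra row."""
    base, extra = divmod(total_rows, buckets)
    return [base + 1 if b < extra else base for b in range(buckets)]

# 35,458 items split into 5 batches, as in the query output
print(ntile_sizes(35458, 5))  # → [7092, 7092, 7092, 7091, 7091]
```

This matches the output: batches 1–3 carry 7092 rows each and batches 4–5 carry 7091.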
I used this batching concept to process my data in
multiple threads. Sample code below.
Sample Code:
---
for rec in (select min(inventory_item_id) min_item_id,
                   max(inventory_item_id) max_item_id,
                   count(*) batch_count, batch batch_num
              from (select inventory_item_id,
                           ntile(5) over (order by inventory_item_id) batch
                      from mtl_system_items_b
                     where organization_id = 489)
             group by batch)
loop
   print.log_message ('batch_num:' || rec.batch_num
                      || ':min_item_id:' || rec.min_item_id
                      || ':max_item_id:' || rec.max_item_id
                      || ':batch_count:' || rec.batch_count);

   -- submit one concurrent program per batch
   ln_request_id := fnd_request.submit_request (
                       application => 'XXCUST',
                       program     => 'concp_short_name',
                       argument1   => xv_item_code,
                       argument2   => rec.min_item_id,
                       argument3   => rec.max_item_id,
                       argument4   => xn_org_id);
   commit;
end loop;
--
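Outside of Oracle EBS, the same batch-then-dispatch pattern can be sketched with plain threads. This is a generic illustration of the idea, not part of the original program; `process_batch` and the item-ID list are hypothetical stand-ins for the concurrent program and the item table:

```python
import threading

def ntile(ids, buckets):
    """Split sorted ids into `buckets` contiguous groups, mirroring
    NTILE: the lowest-numbered groups absorb the remainder rows."""
    ids = sorted(ids)
    base, extra = divmod(len(ids), buckets)
    out, start = [], 0
    for b in range(buckets):
        size = base + (1 if b < extra else 0)
        out.append(ids[start:start + size])
        start += size
    return out

def process_batch(batch_num, min_id, max_id):
    # hypothetical worker; a real one would process the id range
    print(f"batch {batch_num}: {min_id}..{max_id}")

item_ids = range(1, 101)   # stand-in for inventory_item_id values
threads = []
for n, batch in enumerate(ntile(item_ids, 5), start=1):
    t = threading.Thread(target=process_batch,
                         args=(n, batch[0], batch[-1]))
    threads.append(t)
    t.start()
for t in threads:
    t.join()               # every record lands in exactly one batch
```

Because the batches are contiguous and cover the whole ID range, no record is missed and none is processed twice, which is the same guarantee the NTILE query gives the concurrent requests.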