
[Solved] How can I split a queue?

Topic starter

I have a queue with a big backlog of pending messages. The queue normally operates as a strict single-threaded FIFO, but in order to "catch up" I want to do some parallel processing temporarily. I'm not worried about message ordering during the catch-up. Is there a quick way to achieve this?

1 Answer
Topic starter

You can get some parallel processing on a single-threaded queue by severing the task_depend link of one of the grid tasks.

Within a queue you have a series of grid_tasks entries, where the task_depend of each entry points to the next task in line. So it's one big long chain, and no grid_tasks entry will run until its task_depend becomes null. Normally that happens when the task it depends on completes, but you can artificially break that link.
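
Before breaking anything it can help to look at the chain first. Below is a hedged inspection sketch; it only uses the grid_tasks columns that appear later in this post (sys_task_id, task_depend, tag1), and any per-queue filtering would need whatever linking column your actual schema has:

-- Rough orientation query: tasks whose task_depend is null are the ones
-- currently eligible to run; everything else is waiting on another task.
select sys_task_id, task_depend, tag1
from grid_tasks
order by sys_task_id;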

 

For example, to split a queue into two parallel parts, you can find the "middle" grid task for the messages in the queue. (The precise where clause is up to you.)

 

select median(m.sys_message_id)
from message m
join message_queue mq on mq.sys_message_queue_id = m.sys_owning_queue_id and mq.classification = 'Inbox' and mq.name = 'myqueuename'
where m.creation_date > somedate
and m.processing_state is null;
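
Note that not every database ships a median() aggregate. Where it is missing, percentile_cont(0.5) is a common standard-SQL substitute (exact syntax varies by dialect); a hedged rewrite of the query above:

select percentile_cont(0.5) within group (order by m.sys_message_id)
from message m
join message_queue mq on mq.sys_message_queue_id = m.sys_owning_queue_id and mq.classification = 'Inbox' and mq.name = 'myqueuename'
where m.creation_date > somedate
and m.processing_state is null;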

 

You can then find the grid task for that message by querying tag1 with the result from the previous query:

select * from grid_tasks where tag1 = '14012717';

 

And finally, you can nullify the task_depend on that grid task.

update grid_tasks set task_depend = null where sys_task_id = 32871104;
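
If you want the option of backing the split out before the first chunk drains, note the existing task_depend value before you null it. The statements below are a hedged sketch using the same illustrative IDs as above (32871099 simply stands in for whatever the select returns), and whether re-linking is still safe once the upstream task has completed depends on the engine:

-- Record the current dependency so the chain could be re-linked later.
select task_depend from grid_tasks where sys_task_id = 32871104;

-- Hedged restore sketch: put the recorded value back if you need to undo the split.
update grid_tasks set task_depend = 32871099 where sys_task_id = 32871104;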

 

This basically allows the two halves of the queue to be processed in parallel.  (New messages always go into the *second* half.)

 

If you want more than 2 parts, you can sever the connections in multiple places.
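
As a hedged illustration, a three-way split is just the same update run at two break points. You would locate the actual break points the same way as above, e.g. at roughly the 1/3 and 2/3 marks of the backlog:

-- Sever the chain at two points to get three independent chunks.
-- 32871104 is the break point found above; 32905500 is a made-up second break point.
update grid_tasks set task_depend = null where sys_task_id = 32871104;
update grid_tasks set task_depend = null where sys_task_id = 32905500;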

 

None of this changes the permanent behavior of the queue - all new messages end up in the last chunk of the queue and are processed sequentially in that chunk.  So once the earlier chunks are done, you are left with your original serial queue.

 

 

If you want to *permanently* enable parallel processing, that is different - in this case you need to mark the queue as EXCLUSIVE_CONSUMER=0 *and* you need to set up dedicated task performer thread pools via dvce-app-config.xml (normally populated via node.properties).
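
For the queue-flag half of that, the sketch below assumes the flag is exposed as an exclusive_consumer column on message_queue, which this post does not confirm; in practice both the flag and the thread-pool setup in dvce-app-config.xml / node.properties may need to go through the product's own configuration tooling rather than direct SQL:

-- Hedged sketch only: the exclusive_consumer column name is an assumption,
-- not something shown elsewhere in this post.
update message_queue
set exclusive_consumer = 0
where classification = 'Inbox' and name = 'myqueuename';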