Different queues does indeed seem to be the easiest way… see the attached qrscu.py (backwards compatible), with added functionality to send jobs to a background queue (parameter “-b”).
Additional requirement: add “celery_script” to the celery start command (or /etc/default/celeryd) as a queue to be consumed:
celery multi start default -A openremproject -c 4 -Q default,celery_script --pidfile=/path/to/media/celery/%N.pid --logfile=/path/to/media/celery/%N.log
This way, the website will use the “default” queue, and query-retrieves triggered on the command line will use “default” or “celery_script”, depending on the optional “-b” parameter. It seems to work on my end so far.
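The queue-selection logic behind the “-b” switch can be sketched as follows (a minimal sketch, assuming qrscu.py dispatches its Celery task with apply_async; the function names here are illustrative, not the actual ones in qrscu.py):

```python
def choose_queue(background: bool) -> str:
    """Pick the Celery queue for a query-retrieve job.

    Large, low-priority jobs submitted with "-b" go to the
    celery_script queue; everything else stays on the default
    queue so it is picked up promptly.
    """
    return "celery_script" if background else "default"


def submit_qr_job(task, args, background=False):
    """Route a Celery task to the chosen queue.

    `task` is assumed to be a Celery task object; apply_async's
    `queue` argument overrides the task's default routing.
    """
    return task.apply_async(args=args, queue=choose_queue(background))
```

With a shared worker set consuming both queues (as in the celery multi command above), this only separates the backlogs; it does not by itself give the background jobs dedicated workers.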
@tcdewit can you remind me how the two queues work - I experimented with many queues a long time ago, then reverted to one and I can't remember the details!
Does each queue have its own workers, so that a background task and a foreground task run along at the same time?
Tim de Wit
When using "-Q default,celery_script" on the celery command line, the same set of workers will retrieve jobs from both queues (round-robin style). When the celery_script queue is used to retrieve huge amounts of data (e.g. all dose info from PACS for the last year), the default queue can still be used hourly/daily from cron to keep up with the recent studies (otherwise they would be added to the bottom of the single queue, causing large delays).
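If truly concurrent foreground/background processing were wanted instead of shared round-robin workers, celery multi's per-node options could start a dedicated worker per queue. A sketch only, assuming the same paths as above; the node names (“default”, “bg”) and concurrency values are illustrative:

```
celery multi start default bg -A openremproject \
    -c:default 4 -c:bg 1 \
    -Q:default default -Q:bg celery_script \
    --pidfile=/path/to/media/celery/%N.pid \
    --logfile=/path/to/media/celery/%N.log
```

Here the “default” node only consumes the default queue, so a huge celery_script job on the “bg” node cannot hold up the hourly/daily cron retrievals.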