
The Tasker class sets up the initial job when it is instantiated. Basically, what I want is: a job placed on 'main_queue' checks whether the same job is still running or already queued on 'process_queue'; if so, it returns from the current 'main_queue' job. Otherwise it enqueues the job on 'process_queue'. When that 'process_queue' job finishes, it puts the job back on 'main_queue'.

However, 'process_queue' still holds a job with the same id during that window, even though, judging from the output, it should already have finished. So a new job never gets enqueued for processing. Is there a deadlock I'm not seeing?

main_queue worker

$ rq worker main_queue --with-scheduler
22:44:19 Worker rq:worker:7fe23a24ae404135a10e301f7509eb7e: started, version 1.9.0
22:44:19 Subscribing to channel rq:pubsub:7fe23a24ae404135a10e301f7509eb7e
22:44:19 *** Listening on main_queue...
22:44:19 Trying to acquire locks for main_queue
22:44:19 Scheduler for main_queue started with PID 3747
22:44:19 Cleaning registries for queue: main_queue
22:44:33 main_queue: tasks.redis_test_job() (e90e0dff-bbcc-48ab-afed-6d1ba8b020a8)
None
Job is enqueued to process_queue!
22:44:33 main_queue: Job OK (e90e0dff-bbcc-48ab-afed-6d1ba8b020a8)
22:44:33 Result is kept for 500 seconds
22:44:47 main_queue: tasks.redis_test_job() (1a7f91d0-73f4-466e-92f4-9f918a9dd1e9)
<Job test_job: tasks.print_job()>
!!Scheduler added job to main but same job is already queued in process_queue!!
22:44:47 main_queue: Job OK (1a7f91d0-73f4-466e-92f4-9f918a9dd1e9)
22:44:47 Result is kept for 500 seconds

process_queue worker

$ rq worker process_queue
22:44:24 Worker rq:worker:d70daf20ff324c18bc17f0ea9576df52: started, version 1.9.0
22:44:24 Subscribing to channel rq:pubsub:d70daf20ff324c18bc17f0ea9576df52
22:44:24 *** Listening on process_queue...
22:44:24 Cleaning registries for queue: process_queue
22:44:33 process_queue: tasks.print_job() (test_job)
The process job executed.
22:44:42 process_queue: Job OK (test_job)
22:44:42 Result is kept for 500 seconds

tasker.py

from datetime import timedelta

from redis import Redis
from rq import Queue

import tasks

class Tasker():
    def __init__(self):
        self.tasker_conn = RedisClient().conn
        self.process_queue = Queue(name='process_queue', connection=Redis(),
                                   default_timeout=-1)
        self.main_queue = Queue(name='main_queue', connection=Redis(),
                                default_timeout=-1)
        self.__setup_tasks()

    def __setup_tasks(self):
        self.main_queue.enqueue_in(timedelta(seconds=3), tasks.redis_test_job)

tasks.py

from datetime import timedelta
from time import sleep

from redis import Redis
from rq import Queue

import tasks  # self-import so enqueued callables resolve as 'tasks.<fn>'

def redis_test_job():
    q = Queue('process_queue', connection=Redis(), default_timeout=-1)
    queued = q.fetch_job('test_job')
    print(queued)
    if queued:
        print("!!Scheduler added job to main but same job is already queued in process_queue!!")
        return False
    else:
        q.enqueue(tasks.print_job, job_id='test_job')
        print("Job is enqueued to process_queue!")
    return True

def print_job():
    sleep(8)
    print("The process job executed.")
    q = Queue('main_queue', connection=Redis(), default_timeout=-1)
    q.enqueue_in(timedelta(seconds=5), tasks.redis_test_job)

1 Answer


Per the docs, if you don't set result_ttl when enqueueing a job, it defaults to 500 seconds. That means the finished job is kept in Redis for 500 seconds, so fetch_job still finds it long after it has completed.

If you want to change that, e.g. so the job and its result survive for only 1 second, enqueue the job like this:

 q.enqueue(tasks.print_job, job_id='test_job', result_ttl=1)
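Alternatively (my suggestion, not from the docs), instead of shortening result_ttl you can check the fetched job's status before deciding: a job that only lingers in a finished or failed state because of result_ttl should not block a new enqueue. A minimal sketch of that decision logic, where `should_enqueue` is a hypothetical helper name and the status strings follow RQ's JobStatus values:

```python
# Statuses that mean the job is genuinely pending or running,
# and therefore should block enqueueing a duplicate.
BLOCKING_STATUSES = {"queued", "started", "deferred", "scheduled"}

def should_enqueue(existing_status):
    """Decide whether a new job may be enqueued, given the status of the
    job currently stored under the same id (None if no job exists)."""
    if existing_status is None:
        return True  # no job with this id at all
    # A finished/failed job is only kept around by result_ttl; let it be replaced.
    return existing_status not in BLOCKING_STATUSES
```

In redis_test_job you would then call something like `should_enqueue(queued.get_status() if queued else None)` instead of testing only whether fetch_job returned a job.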
answered 2021-07-05T20:08:19.673