A Django extension to run huey with multiple queues. Multiple queues allow tasks to not block each other and to scale independently. Only the Redis storage is supported.
- If you use huey 1.x, then install hueyx 0.1.2. Check out the git tag huey1.x.
- If you use huey 2.x, then install hueyx >= 1.0.
Install it with
pip install hueyx
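If you need to match a specific huey major version (see the compatibility note above), pinning the release should work along these lines:
pip install hueyx==0.1.2    # for huey 1.x
pip install "hueyx>=1.0"    # for huey 2.x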
Add hueyx to your installed apps.
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'hueyx',
]
Compared to djhuey, hueyx allows several queues to be defined in settings.py.
from redis import ConnectionPool

HUEYX = {
    'queue_name1': {
        'connection': {
            'host': 'localhost',
            'port': 6379,
            'db': 0,
        },
        'consumer': {
            'workers': 1,
            'worker_type': 'process',
        }
    },
    'queue_name2': {
        'connection': {
            'connection_pool': ConnectionPool(host='localhost', port=6379, db=1)
        },
        'consumer': {
            'multiple_scheduler_locking': True,
            'prometheus_metrics': True,
            'workers': 2,
            'worker_type': 'thread',
        }
    },
    'priority_queue_name3': {
        'huey_class': 'huey.PriorityRedisHuey',
        'connection': {
            'connection_pool': ConnectionPool(host='localhost', port=6379, db=1)
        },
        'consumer': {
            'multiple_scheduler_locking': True,
            'prometheus_metrics': True,
            'workers': 2,
            'worker_type': 'thread',
        }
    },
}
The settings are almost the same as in djhuey. Have a look at the huey documentation to see the exact parameter usage.
Exceptions:
- You can only configure Redis as the storage engine, by setting huey_class to huey.RedisHuey, huey.PriorityRedisHuey, huey.RedisExpireHuey or huey.PriorityRedisExpireHuey (a sketch with huey.RedisExpireHuey follows this list).
- The name and backend_class parameters are not supported.
- The options multiple_scheduler_locking and prometheus_metrics have been added. See below.
- The parameter heartbeat_timeout for db_task has been added. See below.
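As an illustration, an additional entry in the HUEYX dictionary that uses one of the other supported storage classes might look like this; the queue name and connection values are placeholders:
'expiring_queue_name4': {
    'huey_class': 'huey.RedisExpireHuey',
    'connection': {
        'host': 'localhost',
        'port': 6379,
        'db': 2,
    },
    'consumer': {
        'workers': 1,
        'worker_type': 'process',
    }
},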
from huey import crontab

from hueyx.queues import hueyx

"""
Define which queue you want to use.
They are predefined in settings.py.
"""
HUEY_Q1 = hueyx('queue_name1')
HUEY_Q2 = hueyx('queue_name2')

@HUEY_Q1.task()
def my_task1():
    print('my_task1 called')

@HUEY_Q1.db_task()
def my_db_task1():
    print('my_db_task1 called')

@HUEY_Q2.task()
def my_task2():
    print('my_task2 called')

@HUEY_Q2.periodic_task(crontab(minute='0', hour='3'))
def my_periodic_task2():
    print('my_periodic_task2 called')
    return 1

@HUEY_Q2.db_task(heartbeat_timeout=120)
def my_heartbeat_task(heartbeat: Heartbeat):  # hueyx passes a Heartbeat object to tasks declared with heartbeat_timeout
    with heartbeat.long_running_operation():
        print('This operation can take a while -> don\'t check for heartbeats')
    print('Now we check for heartbeats -> call heartbeat() periodically')
    heartbeat()
from example.tasks import my_task1, my_db_task1, my_task2
my_task1() # Task for queue_name1
my_db_task1() # Task for queue_name1
my_task2() # Task for queue_name2
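Because the decorated functions are ordinary huey tasks, the usual huey call patterns should also apply; a small sketch (the 60-second delay is arbitrary):
# Enqueue my_task1 to run roughly 60 seconds from now (plain huey scheduling).
my_task1.schedule(delay=60)

# A normal call enqueues the task immediately and returns a huey result handle.
result = my_task2()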
Consumers are started with the queue_name.
./manage.py run_hueyx queue_name1
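Each configured queue gets its own consumer process, so for the example settings above a second consumer would be started for queue_name2:
./manage.py run_hueyx queue_name2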
Heartbeat tasks are tasks with the parameter heartbeat_timeout, which defines the timeout in seconds.
They receive a Heartbeat object which needs to be called in order to send a heartbeat to Redis.
If no heartbeat occurs within the set timeout, the task is presumed to be dead and will automatically be restarted.
heartbeat_timeout needs to be at least 120 seconds. It does not work together with the parameter include_task.
multiple_scheduler_locking has been added to support multiple huey schedulers.
If you run huey in a cloud environment, you will end up running multiple huey instances, each of which will schedule the periodic tasks.
multiple_scheduler_locking prevents periodic tasks from being scheduled multiple times. It is False by default.
Optionally, hueyx pushes all huey signals to the Redis pub/sub channel hueyx.huey2.signaling. Enable it in settings.py:
HUEYX_SIGNALS = {
    'enabled': True,
    'environment': 'your environment'
}
The format of the message is
{
    'environment': settings.HUEYX_SIGNALS['environment'],
    'queue': queue,
    'pid': pid,
    'signal': signal_name,
    'task': task_name
}
The environment parameter is an optional variable.
The huey-exporter project consumes the signals and reports them to Prometheus.
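To inspect those messages manually, you can subscribe to the channel with redis-py; a minimal sketch, assuming the signals are published on the same Redis instance as the queues (host, port and db are placeholders):
import redis

# Connect to the Redis instance the queues use (placeholder connection values).
r = redis.Redis(host='localhost', port=6379, db=0)
pubsub = r.pubsub()
pubsub.subscribe('hueyx.huey2.signaling')

for message in pubsub.listen():
    if message['type'] != 'message':
        continue  # skip the subscribe confirmation
    # Each payload has the format shown above.
    print(message['data'])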