python - Share a common utility function between Celery tasks


I've got a bunch of tasks in Celery connected using a canvas chain.

@shared_task(bind=True)
def my_task_a(self):
    try:
        logger.debug('running task a')
    except Exception:
        # run common cleanup function
        ...

@shared_task(bind=True)
def my_task_b(self):
    try:
        logger.debug('running task b')
    except Exception:
        # run common cleanup function
        ...

...
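For reference, the chain wiring mentioned above might look something like this; the module path is made up, and the immutable .si() signatures are an assumption since the tasks take no arguments:

from celery import chain

# hypothetical module path; my_task_a and my_task_b are the tasks above
from myproject.tasks import my_task_a, my_task_b

# .si() builds immutable signatures, so no result is passed from one task
# to the next; chain() runs them one after another
workflow = chain(my_task_a.si(), my_task_b.si())
workflow.apply_async()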

So far so good. The problem is that I'm looking for the best practice when it comes to using a common utility function like this one:

def cleanup_and_notify_user(task_data):
    logger.debug('task failed')
    # send email
    # delete folders
    ...

What's the best way to do that without the tasks blocking? For example, can I simply replace "run common cleanup function" with a call to cleanup_and_notify_user(task_data)? And what would happen if multiple tasks from multiple workers attempted to call that function at the same time?
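To make the first question concrete, the direct replacement could be sketched like this; the import path for cleanup_and_notify_user and the contents of the task_data dict are placeholders, while self.request.id and self.name are standard attributes of a bound Celery task:

import logging

from celery import shared_task

# hypothetical location for the shared utility shown above
from myproject.utils import cleanup_and_notify_user

logger = logging.getLogger(__name__)

@shared_task(bind=True)
def my_task_a(self):
    try:
        logger.debug('running task a')
        # ... actual work ...
    except Exception:
        # a plain, in-process function call made from inside the task
        # (the task_data dict built here is made up)
        cleanup_and_notify_user({'task_id': self.request.id, 'task_name': self.name})
        raise  # re-raise so the chain stops and the failure is recorded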

Does each worker get its own copy of it? I'm apparently a bit confused on a few of the concepts here. Any help is appreciated.

Thanks in advance.

From inside a Celery task you are coding in Python, so each task has its own process and the function is instantiated for each task, following basic OOP logic. Of course, if the cleanup function attempts to delete shared resources such as filesystem folders or database rows, you will end up with concurrent access problems on those external resources, which you need to solve in some other way; in the case of the filesystem, for example, by creating a sandbox for each task. Hope this helps.
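A rough sketch of the per-task filesystem sandbox suggested above, using only the standard library; the import path for cleanup_and_notify_user and the task_data keys are placeholders:

import shutil
import tempfile

from celery import shared_task

# hypothetical location for the shared utility from the question
from myproject.utils import cleanup_and_notify_user

@shared_task(bind=True)
def my_task_b(self):
    # every task run gets its own scratch directory, so concurrent cleanups
    # never touch another task's files
    workdir = tempfile.mkdtemp(prefix='celery-task-')
    try:
        # ... do the actual work inside workdir ...
        pass
    except Exception:
        cleanup_and_notify_user({'task_id': self.request.id, 'workdir': workdir})
        raise
    finally:
        shutil.rmtree(workdir, ignore_errors=True)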

