python - Starting celery worker from multiprocessing


I'm new to Celery. All of the examples I've seen start the Celery worker from the command line, e.g.:

$ celery -A proj worker -l info

I'm starting a project on Elastic Beanstalk and thought it would be nice to have the worker run as a subprocess of the web app. I tried using multiprocessing and it seems to work. I'm wondering whether this is a good idea, or whether there might be disadvantages.

import celery
import multiprocessing


class WorkerProcess(multiprocessing.Process):
    def __init__(self):
        super().__init__(name='celery_worker_process')

    def run(self):
        argv = [
            'worker',
            '--loglevel=warning',
            '--hostname=local',
        ]
        app.worker_main(argv)


def start_celery():
    global worker_process
    worker_process = WorkerProcess()
    worker_process.start()


def stop_celery():
    global worker_process
    if worker_process:
        worker_process.terminate()
        worker_process = None


worker_name = 'celery@local'
worker_process = None

app = celery.Celery()
app.config_from_object('celery_app.celeryconfig')
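For context, here is a minimal sketch of how start_celery() and stop_celery() could be called from the web app. The Flask app and the celery_app module name are assumptions for illustration only, not part of the code above.

import atexit
from flask import Flask

# hypothetical module containing the WorkerProcess code shown above
from celery_app import start_celery, stop_celery

flask_app = Flask(__name__)


def boot():
    start_celery()                 # spawn the Celery worker as a subprocess of the web app
    atexit.register(stop_celery)   # terminate the worker when the web process exits


if __name__ == '__main__':
    boot()
    flask_app.run()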

Seems like a good option, definitely not the only option but a good one :)

One thing you might want to look into (you might already be doing this) is linking the autoscaling to the size of the Celery queue, so you only scale up when the queue is growing. A rough sketch of how that could work is below.
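For example, a sketch of reading the queue depth to feed a scaling decision, assuming a Redis broker where the default queue is stored as the Redis list named 'celery'; the threshold and how you act on the metric are up to you.

import redis


def queue_depth(redis_url='redis://localhost:6379/0', queue_name='celery'):
    client = redis.Redis.from_url(redis_url)
    return client.llen(queue_name)  # number of tasks still waiting in the broker queue


if __name__ == '__main__':
    depth = queue_depth()
    print(f'{depth} tasks waiting')
    # e.g. publish this value as a custom CloudWatch metric and let an
    # Elastic Beanstalk autoscaling trigger scale up when it keeps growing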

Effectively Celery does something similar internally of course, so there's not a lot of difference. The only snag I can think of is the handling of external resources (database connections, for example); that might be a problem, but it is entirely dependent on what you are doing with Celery.
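As one illustration of that resource-handling point, connections can be opened per worker process via Celery's worker_process_init / worker_process_shutdown signals instead of being inherited from the parent web process. This is only a sketch; sqlite3 stands in for whatever database client you actually use.

import sqlite3

from celery.signals import worker_process_init, worker_process_shutdown

db_connection = None


@worker_process_init.connect
def init_db(**kwargs):
    global db_connection
    db_connection = sqlite3.connect('tasks.db')  # fresh connection per pool process


@worker_process_shutdown.connect
def close_db(**kwargs):
    if db_connection is not None:
        db_connection.close()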

