We have a crawler written in Python.
We use Redis as a broker and Celery to multithread the crawlers.
Everything had been working fine - we have made no changes to the latest working version. Now Celery stops for no apparent reason and without an error. We occasionally get the error:
WorkerLostError('Worker exited prematurely: signal 15 (SIGTERM).',)
if we run the worker locally.
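For context on what this error usually means: SIGTERM is delivered from outside the worker process — typically a process supervisor, Docker, systemd, or the kernel's OOM killer — so the worker itself rarely logs a cause. One way to narrow it down (a hedged diagnostic sketch, not a guaranteed fix for this setup) is to install a handler that dumps the worker's stack the moment SIGTERM arrives. The stdlib-only version looks like this; in a real Celery deployment you would register it from the `worker_process_init` signal so each pool process gets the handler, and note it temporarily bypasses Celery's own warm-shutdown handling of SIGTERM:

```python
import signal
import sys
import traceback

def log_sigterm(signum, frame):
    # Record where the worker was when SIGTERM arrived, before the
    # process is torn down. Output goes to stderr so it lands in the
    # worker's log.
    print("SIGTERM received; stack at time of signal:", file=sys.stderr)
    traceback.print_stack(frame, file=sys.stderr)

# Install the handler; Python keeps running after the handler returns,
# so the process survives the signal while this handler is in place.
signal.signal(signal.SIGTERM, log_sigterm)
```

Checking `dmesg` for OOM-killer lines and the supervisor's logs around the time of the crash is usually the other half of this diagnosis.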
Please get in touch if you have experience with Celery & Redis and have an idea on how to fix this problem. Thanks, Dave.
9 freelancers are bidding on average $163 for this job
There must be something wrong if Celery stopped working and you aren't getting any error either. I need to look into the issue. I have worked with Celery a lot, so I can definitely fix it.
This seems to be a debugging job, so I'm proposing my hourly rate for that kind of work: $25/hr. Please contact me if you want to handle this issue as an hourly consultancy; it can likely be fixed in a few hours.