Airflow 2.0 is queuing but not launching tasks in my dev environment.
DAG and Pool settings are valid, but whenever I trigger a DAG, all of its tasks sit in the queued state and never start running.
When I run airflow celery worker, I get the following error:
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/utils/cli.py", line 92, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/celery_command.py", line 188, in worker
    _run_worker(options=options, skip_serve_logs=skip_serve_logs)
  File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/celery_command.py", line 94, in _run_worker
    celery_app.worker_main(options)
  File "/usr/local/lib/python3.8/site-packages/celery/app/base.py", line 365, in worker_main
    return instantiate(
  File "/usr/local/lib/python3.8/site-packages/celery/bin/base.py", line 283, in execute_from_commandline
    self.maybe_patch_concurrency(argv)
  File "/usr/local/lib/python3.8/site-packages/celery/bin/base.py", line 315, in maybe_patch_concurrency
    maybe_patch_concurrency(argv, *pool_option)
  File "/usr/local/lib/python3.8/site-packages/celery/__init__.py", line 143, in maybe_patch_concurrency
    pool = _find_option_with_arg(argv, short_opts, long_opts)
  File "/usr/local/lib/python3.8/site-packages/celery/__init__.py", line 95, in _find_option_with_arg
    if arg.startswith('-'):
AttributeError: 'int' object has no attribute 'startswith'
In my staging and prod environments there are no issues: tasks run fine, and airflow celery worker either starts or warns me that it is already running (as expected).
There is no difference between the environments, but I suspect the problem appeared after the most recent deploy to the server.
As far as I can see, Celery received an argument of the wrong type:
AttributeError: 'int' object has no attribute 'startswith'
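The failing frame suggests that the option scan in Celery 4.x assumes every element of the argument list is a string, so a single non-string entry (for example an integer concurrency value, though I haven't confirmed which entry it is in my case) is enough to crash it. A minimal standalone sketch of that failure mode, not Airflow's actual code:

    # Sketch of the failure mode: the option scan calls .startswith on
    # every element of argv and assumes they are all strings.
    argv = ['worker', '--concurrency', 16]  # hypothetical: 16 passed as int, not '16'

    for arg in argv:
        try:
            if arg.startswith('-'):  # same check as in celery._find_option_with_arg
                print('option:', arg)
        except AttributeError as exc:
            print(exc)  # 'int' object has no attribute 'startswith'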
But how can I trace which parameters Airflow is trying to pass to Celery? I have no idea how to debug this.
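One low-tech way to see the exact argument list (on a dev box only) is to temporarily add a print in the frame the traceback points at, just above the celery_app.worker_main(options) call in airflow/cli/commands/celery_command.py (line 94 in the traceback above):

    # Temporary debug print, added just above the worker_main call in
    # airflow/cli/commands/celery_command.py (path taken from the traceback):
    print('options handed to celery_app.worker_main:', options)

Alternatively, run the same command under pdb and set a breakpoint on that line to inspect options interactively.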
Solved by upgrading Celery from 4.4.2 to its latest version, 5.1.2.
It seems that version 4.4.2 (which is one of Airflow's dependencies) had a bug with argument handling.
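For reference, the bump itself is just a pin change, e.g. pip install 'celery==5.1.2'. If you install Airflow with the official constraints file, double-check that the new Celery version is compatible with the constraints for your Airflow version before rolling it out to other environments.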
If there are any other suggestions how to solve this issue, feel free to mention them here.