## Database initialization in Apache Airflow

I have installed Apache Airflow on Linux with Python 3.8. Running Airflow commands works (they print the version banner), but an error occurs while initializing the database. Can someone explain the error below and how to fix it? Thanks.
/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/relationships.py:3463 SAWarning: relationship 'DagRun.serialized_dag' will copy column serialized_dag.dag_id to column dag_run.dag_id, which conflicts with relationship(s): 'TaskInstance.dag_run' (copies task_instance.dag_id to dag_run.dag_id), 'DagRun.task_instances' (copies task_instance.dag_id to dag_run.dag_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards. The 'overlaps' parameter may be used to remove this warning. (Background on this error at: http://sqlalche.me/e/14/qzyx)
/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/sqlalchemy/orm/relationships.py:3463 SAWarning: relationship 'SerializedDagModel.dag_runs' will copy column serialized_dag.dag_id to column dag_run.dag_id, which conflicts with relationship(s): 'TaskInstance.dag_run' (copies task_instance.dag_id to dag_run.dag_id), 'DagRun.task_instances' (copies task_instance.dag_id to dag_run.dag_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards. The 'overlaps' parameter may be used to remove this warning. (Background on this error at: http://sqlalche.me/e/14/qzyx)
____________ _____________
____ |__( )_________ __/__ /________ __
____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ /
_/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/
[2021-04-29 11:50:40,298] {dagbag.py:448} INFO - Filling up the DagBag from /dev/null
[2021-04-29 11:50:40,330] {manager.py:727} WARNING - No user yet created, use flask fab command to do it.
[2021-04-29 11:50:41,340] {abstract.py:229} ERROR - Failed to add operation for GET /api/v1/connections
Traceback (most recent call last):
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/apis/abstract.py", line 209, in add_paths
self.add_operation(path, method)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/apis/abstract.py", line 173, in add_operation
pass_context_arg_name=self.pass_context_arg_name
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/operations/__init__.py", line 8, in make_operation
return spec.operation_cls.from_spec(spec, *args, **kwargs)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/operations/openapi.py", line 138, in from_spec
**kwargs
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/operations/openapi.py", line 89, in __init__
pass_context_arg_name=pass_context_arg_name
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/operations/abstract.py", line 96, in __init__
self._resolution = resolver.resolve(self)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/resolver.py", line 40, in resolve
return Resolution(self.resolve_function_from_operation_id(operation_id), operation_id)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/resolver.py", line 66, in resolve_function_from_operation_id
raise ResolverError(str(e), sys.exc_info())
connexion.exceptions.ResolverError: <ResolverError: columns>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/hduser/apache_airflow/venv/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/utils/cli.py", line 89, in wrapper
return f(*args, **kwargs)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/cli/commands/webserver_command.py", line 360, in webserver
app = cached_app(None)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/www/app.py", line 135, in cached_app
app = create_app(config=config, testing=testing)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/www/app.py", line 120, in create_app
init_api_connexion(flask_app)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/www/extensions/init_views.py", line 172, in init_api_connexion
specification='v1.yaml', base_path=base_path, validate_responses=True, strict_validation=True
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/apps/flask_app.py", line 57, in add_api
api = super(FlaskApp, self).add_api(specification, **kwargs)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/apps/abstract.py", line 156, in add_api
options=api_options.as_dict())
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/apis/abstract.py", line 111, in __init__
self.add_paths()
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/apis/abstract.py", line 216, in add_paths
self._handle_add_operation_error(path, method, err.exc_info)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/apis/abstract.py", line 231, in _handle_add_operation_error
raise value.with_traceback(traceback)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/resolver.py", line 61, in resolve_function_from_operation_id
return self.function_resolver(operation_id)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/connexion/utils.py", line 111, in get_function_from_name
module = importlib.import_module(module_name)
File "/home/hduser/apache_airflow/venv/lib64/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/api_connexion/endpoints/connection_endpoint.py", line 26, in <module>
from airflow.api_connexion.schemas.connection_schema import (
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/airflow/api_connexion/schemas/connection_schema.py", line 42, in <module>
class ConnectionSchema(ConnectionCollectionItemSchema): # pylint: disable=too-many-ancestors
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow/schema.py", line 125, in __new__
dict_cls=dict_cls,
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 92, in get_declared_fields
fields.update(mcs.get_auto_fields(fields, converter, opts, dict_cls))
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 106, in get_auto_fields
for field_name, field in fields.items()
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 108, in <dictcomp>
and field_name not in opts.exclude
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow_sqlalchemy/schema/sqlalchemy_schema.py", line 28, in create_field
return converter.field_for(model, column_name, **self.field_kwargs)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 179, in field_for
converted_prop = self.property2field(prop, **kwargs)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 146, in property2field
field_class = field_class or self._get_field_class_for_property(prop)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 223, in _get_field_class_for_property
column = prop.columns[0]
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 1220, in __getattr__
return self._fallback_getattr(key)
File "/home/hduser/apache_airflow/venv/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 1194, in _fallback_getattr
raise AttributeError(key)
AttributeError: columns
The above is the error I get while running `airflow initdb`.
This can be fixed by rolling back the version of SQLAlchemy: uninstall SQLAlchemy and install an older release. I used SQLAlchemy 1.3.20 and was able to get it working.
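Whether this downgrade is needed can be checked from the installed version string. A minimal sketch (`needs_downgrade` is a hypothetical helper, not part of Airflow; the 1.4 cutoff reflects the fix described above):

```python
def needs_downgrade(sqlalchemy_version: str) -> bool:
    """Return True if the installed SQLAlchemy is 1.4 or newer --
    the range that triggers the AttributeError above when paired
    with an older marshmallow-sqlalchemy."""
    major, minor = (int(part) for part in sqlalchemy_version.split(".")[:2])
    return (major, minor) >= (1, 4)

# The version string can be read from the running environment:
#   import sqlalchemy; needs_downgrade(sqlalchemy.__version__)
print(needs_downgrade("1.4.11"))  # True  -> pin back, e.g. SQLAlchemy==1.3.20
print(needs_downgrade("1.3.20"))  # False -> the known-good version from this answer
```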
I came across this problem (and many others) while trying to set up a Dockerized Airflow for our data engineering team. I'm not sure what the exact cause was, but somewhere along the line of troubleshooting I think the database ended up with bad data (I had to remove the volume that persisted the Postgres container's data).
Aside from that, one Docker-specific issue: some Python modules installed during the docker build phase were not accessible due to permission issues (the modules were installed as one user, but the container ran as a different user).
Hope this helps. If you find out exactly what the cause of this error is, please let us know!
If you have SQLAlchemy > 1.4, this can be fixed by upgrading marshmallow-sqlalchemy:
pip install -U marshmallow-sqlalchemy
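Combining this with the previous answer, the choice between the two fixes can be sketched as below. The 0.25 threshold is an assumption (the marshmallow-sqlalchemy release line that added SQLAlchemy 1.4 support); verify it against the library's changelog for your setup.

```python
def recommend_fix(sqlalchemy_version: str, ms_version: str) -> str:
    """Suggest which of the two fixes from this thread applies,
    given the installed SQLAlchemy and marshmallow-sqlalchemy versions.
    Assumes (unverified) that marshmallow-sqlalchemy >= 0.25 supports
    SQLAlchemy 1.4."""
    def key(version: str):
        # Compare only major.minor, e.g. "1.4.11" -> (1, 4)
        return tuple(int(part) for part in version.split(".")[:2])

    if key(sqlalchemy_version) < (1, 4):
        return "no fix needed"
    if key(ms_version) < (0, 25):
        return "pip install -U marshmallow-sqlalchemy (or pin SQLAlchemy==1.3.x)"
    return "no fix needed"

print(recommend_fix("1.4.11", "0.23.1"))  # incompatible pair -> upgrade or pin
print(recommend_fix("1.3.20", "0.23.1"))  # compatible pair   -> no fix needed
```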