Update conf column in dag_run table type from bytes to JSON #44533

Open
wants to merge 41 commits into base: main
Changes from 1 commit (41 commits total)
b91446b
remove pickled data from dag run table
vatsrahul1001 Dec 1, 2024
cbde2ed
Merge branch 'main' of github.com:astronomer/airflow into dag_run_mig…
vatsrahul1001 Dec 3, 2024
e698769
fix downgrade + add news fragement
vatsrahul1001 Dec 3, 2024
f7f9155
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Dec 3, 2024
18c03fb
remove archive table if exits after downgrade
vatsrahul1001 Dec 3, 2024
2b6abf7
Merge branch 'dag_run_migrate_conf_column_as_json' of github.com:astr…
vatsrahul1001 Dec 3, 2024
a54c435
removing archiving data
vatsrahul1001 Dec 4, 2024
6b3789d
implementing review comments
vatsrahul1001 Dec 10, 2024
07258f5
Merge branch 'main' of github.com:astronomer/airflow into dag_run_mig…
vatsrahul1001 Dec 10, 2024
8b88d63
fixing static check
vatsrahul1001 Dec 10, 2024
730470b
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Dec 10, 2024
c2e0d09
fixing static checks
vatsrahul1001 Dec 10, 2024
c2b8dbd
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Dec 12, 2024
c8fa27d
simplying upgrade and downgrade as per review
vatsrahul1001 Dec 13, 2024
d27b7ed
simplying upgrade and downgrade as per review
vatsrahul1001 Dec 13, 2024
603ca17
Merge branch 'dag_run_migrate_conf_column_as_json' of github.com:astr…
vatsrahul1001 Dec 13, 2024
9406702
fixing failures
vatsrahul1001 Dec 13, 2024
01c453c
Merge branch 'main' of github.com:astronomer/airflow into dag_run_mig…
vatsrahul1001 Dec 14, 2024
219b694
Merge branch 'main' of github.com:astronomer/airflow into dag_run_mig…
vatsrahul1001 Dec 14, 2024
22a5dba
removing setting conf to null
vatsrahul1001 Dec 14, 2024
70ce5df
refactor approach to migrate values in conf
vatsrahul1001 Dec 16, 2024
0cade3a
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Dec 17, 2024
cce341a
update offline warning
vatsrahul1001 Dec 17, 2024
c3604d9
Merge branch 'dag_run_migrate_conf_column_as_json' of github.com:astr…
vatsrahul1001 Dec 17, 2024
1eb120a
resolving conflicts
vatsrahul1001 Dec 31, 2024
1bdc5cb
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Dec 31, 2024
8261e05
resolving conflicts
vatsrahul1001 Dec 31, 2024
9101a05
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Dec 31, 2024
a6ebc01
Merge branch 'dag_run_migrate_conf_column_as_json' of github.com:astr…
vatsrahul1001 Dec 31, 2024
7fb6509
resolving conflicts
vatsrahul1001 Dec 31, 2024
33e4c6e
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Dec 31, 2024
6b23d8a
resolving conflicts
vatsrahul1001 Dec 31, 2024
249aa7e
Merge branch 'dag_run_migrate_conf_column_as_json' of github.com:astr…
vatsrahul1001 Dec 31, 2024
fa28acd
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Jan 1, 2025
dcd8534
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Jan 1, 2025
71078d7
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Jan 1, 2025
2ee9e7b
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Jan 2, 2025
d2f2b0f
Merge branch 'main' of github.com:astronomer/airflow into dag_run_mig…
vatsrahul1001 Jan 2, 2025
4845c61
Merge branch 'main' of github.com:astronomer/airflow into dag_run_mig…
vatsrahul1001 Jan 2, 2025
141abfd
Merge branch 'main' into dag_run_migrate_conf_column_as_json
jedcunningham Jan 2, 2025
8488cdf
Merge branch 'main' into dag_run_migrate_conf_column_as_json
vatsrahul1001 Jan 6, 2025
@@ -0,0 +1,195 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

"""
remove pickled data from dagrun table.

Revision ID: e39a26ac59f6
Revises: eed27faa34e3
Create Date: 2024-12-01 08:33:15.425141

"""

from __future__ import annotations

import sqlalchemy as sa
from alembic import op
from sqlalchemy import text
from sqlalchemy.dialects.mysql import LONGBLOB
from sqlalchemy_utils import UUIDType

# revision identifiers, used by Alembic.
revision = "e39a26ac59f6"
down_revision = "eed27faa34e3"
branch_labels = None
depends_on = None
airflow_version = "3.0.0"


def upgrade():
    """Apply remove pickled data from dagrun table."""
    # Summary of the change:
    # 1. Create an archive table (`_dag_run_archive`) to store the pickled rows currently in the dag_run table
    # 2. Copy the pickled rows into the archive table, detected via a dialect-specific condition
    # 3. Delete the pickled rows from the dag_run table so that the column type can be changed
    # 4. Change the dag_run.conf column type from bytea to JSON

    conn = op.get_bind()
    dialect = conn.dialect.name

    # Create the archive table to store the current data
    op.create_table(
        "_dag_run_archive",
        sa.Column("id", sa.Integer(), nullable=False, primary_key=True, autoincrement=True),
        sa.Column("dag_id", sa.String(length=250), nullable=False),
        sa.Column("queued_at", sa.TIMESTAMP(timezone=True), nullable=True),
        sa.Column("logical_date", sa.TIMESTAMP(timezone=True), nullable=False),
        sa.Column("start_date", sa.TIMESTAMP(timezone=True), nullable=True),
        sa.Column("end_date", sa.TIMESTAMP(timezone=True), nullable=True),
        sa.Column("state", sa.String(length=50), nullable=True),
        sa.Column("run_id", sa.String(length=250), nullable=False),
        sa.Column("creating_job_id", sa.Integer(), nullable=True),
        sa.Column("external_trigger", sa.Boolean(), nullable=True),
        sa.Column("run_type", sa.String(length=50), nullable=False),
        sa.Column("triggered_by", sa.String(length=50), nullable=True),
        sa.Column("conf", sa.LargeBinary().with_variant(LONGBLOB, "mysql"), nullable=True),
        sa.Column("data_interval_start", sa.TIMESTAMP(timezone=True), nullable=True),
        sa.Column("data_interval_end", sa.TIMESTAMP(timezone=True), nullable=True),
        sa.Column("last_scheduling_decision", sa.TIMESTAMP(timezone=True), nullable=True),
        sa.Column("log_template_id", sa.Integer(), nullable=True),
        sa.Column("updated_at", sa.TIMESTAMP(timezone=True), nullable=True),
        sa.Column("clear_number", sa.Integer(), nullable=False, server_default=sa.text("0")),
        sa.Column("backfill_id", sa.Integer(), nullable=True),
        sa.Column("dag_version_id", UUIDType(binary=False), nullable=True),
        sa.PrimaryKeyConstraint("id"),
        sa.UniqueConstraint("dag_id", "run_id"),
        if_not_exists=True,
    )

    # Condition to detect pickled data per database (pickle protocol 2+ always starts with byte 0x80)
    condition_templates = {
        "postgresql": "get_byte(conf, 0) = 128",
        "mysql": "HEX(SUBSTRING(conf, 1, 1)) = '80'",
        "sqlite": "substr(conf, 1, 1) = char(128)",
    }

    condition = condition_templates.get(dialect)
    if not condition:
        raise RuntimeError(f"Unsupported dialect: {dialect}")

    # Archive pickled data using the condition
    conn.execute(
        text(
            f"""
            INSERT INTO _dag_run_archive (dag_id, run_id, queued_at, logical_date, start_date, end_date, state, creating_job_id, external_trigger, run_type, triggered_by, conf, data_interval_start, data_interval_end, last_scheduling_decision, log_template_id, updated_at, clear_number, backfill_id, dag_version_id)
            SELECT dag_id, run_id, queued_at, logical_date, start_date, end_date, state, creating_job_id, external_trigger, run_type, triggered_by, conf, data_interval_start, data_interval_end, last_scheduling_decision, log_template_id, updated_at, clear_number, backfill_id, dag_version_id
            FROM dag_run
            WHERE conf IS NOT NULL AND {condition};
            """
        )
    )

    # Delete the pickled data from the dag_run table so that the column type can be changed
    conn.execute(text(f"DELETE FROM dag_run WHERE conf IS NOT NULL AND {condition}"))
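The archive-and-delete steps above can be sketched in miniature with the stdlib `sqlite3` module. This is a toy two-column table rather than the full dag_run schema, and it uses a `hex()`-based variant of the detection condition; the key fact it relies on is that pickle protocol 2+ payloads always begin with byte 0x80, which valid JSON text never does:

```python
import json
import pickle
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (id INTEGER PRIMARY KEY, conf BLOB)")
conn.execute("CREATE TABLE _dag_run_archive (id INTEGER PRIMARY KEY, conf BLOB)")

# One legacy pickled conf value and one already-JSON conf value
conn.execute("INSERT INTO dag_run VALUES (1, ?)", (pickle.dumps({"a": 1}),))
conn.execute("INSERT INTO dag_run VALUES (2, ?)", (json.dumps({"b": 2}).encode(),))

# Pickle protocol 2+ starts with byte 0x80; JSON text cannot
cond = "conf IS NOT NULL AND hex(substr(conf, 1, 1)) = '80'"
conn.execute(f"INSERT INTO _dag_run_archive SELECT id, conf FROM dag_run WHERE {cond}")
conn.execute(f"DELETE FROM dag_run WHERE {cond}")

archived = conn.execute("SELECT COUNT(*) FROM _dag_run_archive").fetchone()[0]
survivor = conn.execute("SELECT conf FROM dag_run").fetchone()[0]
print(archived)              # 1 — the pickled row was moved aside
print(json.loads(survivor))  # {'b': 2} — the JSON row stays and still parses
```

Only the rows matching the magic-byte condition are moved, so whatever survives in `dag_run.conf` is safe to reinterpret as JSON in the next step.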

    # Update the conf column type to JSON
    if dialect == "postgresql":
        op.execute(
            """
            ALTER TABLE dag_run
            ALTER COLUMN conf TYPE JSONB
            USING CASE
                WHEN conf IS NOT NULL THEN CAST(CONVERT_FROM(conf, 'UTF8') AS JSONB)
                ELSE NULL
            END
            """
        )
    elif dialect == "mysql":
        op.add_column("dag_run", sa.Column("conf_json", sa.JSON(), nullable=True))
        op.execute("UPDATE dag_run SET conf_json = CAST(conf AS CHAR CHARACTER SET utf8mb4)")
        op.drop_column("dag_run", "conf")
        op.alter_column("dag_run", "conf_json", existing_type=sa.JSON(), new_column_name="conf")
    elif dialect == "sqlite":
        # Rename the existing `conf` column to `conf_old`
        with op.batch_alter_table("dag_run", schema=None) as batch_op:
            batch_op.alter_column("conf", new_column_name="conf_old")

        # Add the new `conf` column with JSON type
        with op.batch_alter_table("dag_run", schema=None) as batch_op:
            batch_op.add_column(sa.Column("conf", sa.JSON(), nullable=True))

        # Migrate data from `conf_old` to `conf`
        conn.execute(
            text(
                """
                UPDATE dag_run
                SET conf = json(CAST(conf_old AS TEXT))
                WHERE conf_old IS NOT NULL
                """
            )
        )

        # Drop the old `conf_old` column
        with op.batch_alter_table("dag_run", schema=None) as batch_op:
            batch_op.drop_column("conf_old")
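In Python terms, the PostgreSQL conversion expression `CAST(CONVERT_FROM(conf, 'UTF8') AS JSONB)` amounts to decoding the surviving (non-pickled) bytea payload as UTF-8 text and then parsing that text as JSON. A minimal sketch, with a hypothetical sample payload:

```python
import json

raw = b'{"retries": 3, "env": "prod"}'  # a surviving, non-pickled conf value (hypothetical)
decoded = raw.decode("utf-8")           # CONVERT_FROM(conf, 'UTF8')
parsed = json.loads(decoded)            # CAST(... AS JSONB) parses and validates the text
print(parsed["retries"])  # 3
```

This is why the pickled rows must be removed first: a payload starting with 0x80 is not valid UTF-8 JSON text, and the cast would fail on it.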


def downgrade():
    """Unapply remove pickled data from dagrun table."""
    conn = op.get_bind()
    dialect = conn.dialect.name

    # Revert the conf column back to LargeBinary
    if dialect == "postgresql":
        op.execute(
            """
            ALTER TABLE dag_run
            ALTER COLUMN conf TYPE BYTEA
            USING CASE
                WHEN conf IS NOT NULL THEN CONVERT_TO(conf::TEXT, 'UTF8')
                ELSE NULL
            END
            """
        )
    elif dialect == "mysql":
        op.add_column("dag_run", sa.Column("conf_blob", LONGBLOB, nullable=True))
        op.execute("UPDATE dag_run SET conf_blob = CAST(conf AS BINARY);")
        op.drop_column("dag_run", "conf")
        op.alter_column("dag_run", "conf_blob", existing_type=LONGBLOB, new_column_name="conf")

    elif dialect == "sqlite":
        with op.batch_alter_table("dag_run", schema=None) as batch_op:
            batch_op.alter_column("conf", new_column_name="conf_old")

        with op.batch_alter_table("dag_run", schema=None) as batch_op:
            batch_op.add_column(sa.Column("conf", sa.LargeBinary, nullable=True))

        conn.execute(
            text(
                """
                UPDATE dag_run
                SET conf = CAST(conf_old AS BLOB)
                WHERE conf_old IS NOT NULL
                """
            )
        )

        with op.batch_alter_table("dag_run", schema=None) as batch_op:
            batch_op.drop_column("conf_old")
5 changes: 3 additions & 2 deletions airflow/models/dagrun.py
@@ -25,14 +25,14 @@

 import re2
 from sqlalchemy import (
+    JSON,
     Boolean,
     Column,
     Enum,
     ForeignKey,
     ForeignKeyConstraint,
     Index,
     Integer,
-    PickleType,
     PrimaryKeyConstraint,
     String,
     Text,
@@ -44,6 +44,7 @@
     text,
     update,
 )
+from sqlalchemy.dialects import postgresql
 from sqlalchemy.exc import IntegrityError
 from sqlalchemy.ext.associationproxy import association_proxy
 from sqlalchemy.orm import declared_attr, joinedload, relationship, synonym, validates
@@ -137,7 +138,7 @@ class DagRun(Base, LoggingMixin):
     triggered_by = Column(
         Enum(DagRunTriggeredByType, native_enum=False, length=50)
     )  # Airflow component that triggered the run.
-    conf = Column(PickleType)
+    conf = Column(JSON().with_variant(postgresql.JSONB, "postgresql"))
Review comments on the `conf` column change:

Member: Do we not need any more handling than these?

Member: Check if the DagRun edit view works in the FAB UI

Member: I think you might need to change this line:

    if item.conf:
        item.conf = json.loads(item.conf)

@kaxil Dec 2, 2024
@kaxil Dec 2, 2024
     # These two must be either both NULL or both datetime.
     data_interval_start = Column(UtcDateTime)
     data_interval_end = Column(UtcDateTime)
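The practical difference behind the `PickleType` → `JSON` column swap can be seen with the stdlib alone: a pickled conf value is an opaque binary blob, while a JSON-typed value stays as plain, queryable text. A sketch (not Airflow code):

```python
import json
import pickle

conf = {"key": "value"}
pickled = pickle.dumps(conf)   # opaque binary blob under the old PickleType column
as_json = json.dumps(conf)     # plain text under the new JSON/JSONB column

print(pickled[0] == 0x80)           # True: pickle protocol 2+ magic byte
print(as_json)                      # {"key": "value"}
print(json.loads(as_json) == conf)  # True: lossless for JSON-serializable conf
```

The trade-off is that only JSON-serializable conf values survive the new column type, which is exactly why the migration archives rather than converts legacy pickled payloads.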
2 changes: 1 addition & 1 deletion airflow/utils/db.py
@@ -94,7 +94,7 @@ class MappedClassProtocol(Protocol):
     "2.9.2": "686269002441",
     "2.10.0": "22ed7efa9da2",
     "2.10.3": "5f2621c13b39",
-    "3.0.0": "eed27faa34e3",
+    "3.0.0": "e39a26ac59f6",
 }


2 changes: 1 addition & 1 deletion docs/apache-airflow/img/airflow_erd.sha256
@@ -1 +1 @@
-aa9e2e5b2a52af1e92bc876727ad5e8958e291315096fc5249a9afa2c21a5d06
+bfbabfa8ed9fb1365f99b715e4c1a361e12f71743a31fef4c1deb4d02714e1f6
2 changes: 1 addition & 1 deletion docs/apache-airflow/img/airflow_erd.svg
4 changes: 3 additions & 1 deletion docs/apache-airflow/migrations-ref.rst
@@ -39,7 +39,9 @@ Here's the list of all the Database Migrations that are executed via when you ru
 +-------------------------+------------------+-------------------+--------------------------------------------------------------+
 | Revision ID             | Revises ID       | Airflow Version   | Description                                                  |
 +=========================+==================+===================+==============================================================+
-| ``eed27faa34e3`` (head) | ``9fc3fc5de720`` | ``3.0.0``         | Remove pickled data from xcom table.                         |
+| ``e39a26ac59f6`` (head) | ``eed27faa34e3`` | ``3.0.0``         | remove pickled data from dagrun table.                       |
++-------------------------+------------------+-------------------+--------------------------------------------------------------+
+| ``eed27faa34e3``        | ``9fc3fc5de720`` | ``3.0.0``         | Remove pickled data from xcom table.                         |
 +-------------------------+------------------+-------------------+--------------------------------------------------------------+
 | ``9fc3fc5de720``        | ``2b47dc6bc8df`` | ``3.0.0``         | Add references between assets and triggers.                  |
 +-------------------------+------------------+-------------------+--------------------------------------------------------------+