 

Using ADF Copy Activity with dynamic schema mapping

I'm trying to drive the columnMapping property from a database configuration table. The first activity in my pipeline (a Lookup) pulls the rows from the config table. My Copy activity's source is a JSON file in Azure Blob Storage and its sink is an Azure SQL Database.

In the Copy activity I'm setting the mapping using the dynamic content editor. The code looks like this:

"translator": {
    "value": "@json(activity('Lookup1').output.value[0].ColumnMapping)",
    "type": "Expression"
}

My question is, what should the value of activity('Lookup1').output.value[0].ColumnMapping look like?

I've tried several different JSON formats, but the Copy activity always seems to ignore them.

For example, I've tried:

{
    "type": "TabularTranslator",
    "columnMappings": {
      "view.url": "url"
    }
}

and:

"columnMappings": {
    "view.url": "url"
}

and:

{
  "view.url": "url"
}

In this example, view.url is the name of the column in the JSON source, and url is the name of the column in my destination table in the Azure SQL Database.
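For context, the expression @json(activity('Lookup1').output.value[0].ColumnMapping) assumes ColumnMapping is a column holding the mapping as a serialized JSON string, so the Lookup output it reads from would look something like this (a sketch only; the exact shape depends on your config table and Lookup settings):

```json
{
  "count": 1,
  "value": [
    {
      "ColumnMapping": "{\"type\":\"TabularTranslator\",\"columnMappings\":{\"view.url\":\"url\"}}"
    }
  ]
}
```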

asked Feb 27 '26 by typheon

2 Answers

The issue is caused by the dot (.) in your column name.

  1. To use column mapping, you should also specify the structure in your source and sink datasets.
  2. In your source dataset, you need to specify the format correctly. And because your column name contains a dot, you also need to specify the JSON path for it.
  3. You could use the ADF UI to set up a copy of a single file first to get the correct format, structure, and column mapping. Then change it over to the Lookup.

As I understand it, your first format should be the right one. If the Lookup value is already a JSON object rather than a string, you may not need the json() function in your expression.
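The screenshot originally attached to step 2 didn't survive. As a hedged sketch of what it likely showed (property names from ADF's legacy JsonFormat dataset; the column alias viewurl is made up here, and bracket notation escapes the dot in the source field name — verify against your ADF version):

```json
{
  "structure": [
    { "name": "viewurl", "type": "String" }
  ],
  "typeProperties": {
    "format": {
      "type": "JsonFormat",
      "filePattern": "setOfObjects",
      "jsonPathDefinition": {
        "viewurl": "$['view.url']"
      }
    }
  }
}
```

With an alias like this in place, the translator's columnMappings would map the alias rather than the dotted name, e.g. "viewurl": "url".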

answered Mar 02 '26 by Fang Liu


There seems to be a disconnect between the question and the answer, so I'll hopefully provide a more straightforward answer.

When setting this up, your source dataset should accept a dynamic structure. The sink doesn't require one, since we're going to specify it in the mapping.

Within the Copy activity, format the dynamic structure JSON like the following:

    {
      "structure": [
        {
          "name": "Address Number"
        },
        {
          "name": "Payment ID"
        },
        {
          "name": "Document Number"
        },
          ...
          ...
      ]
    }

You would then specify your dynamic mapping like this:

    {
      "translator": {
        "type": "TabularTranslator",
        "mappings": [
          {
            "source": {
              "name": "Address Number",
              "type": "Int32"
            },
            "sink": {
              "name": "address_number"
            }
          },
          {
            "source": {
              "name": "Payment ID",
              "type": "Int64"
            },
            "sink": {
              "name": "payment_id"
            }
          },
          {
            "source": {
              "name": "Document Number",
              "type": "Int32"
            },
            "sink": {
              "name": "document_number"
            }
          },
          ...
          ...
        ]
      }
    }

Assuming these were stored in separate variables, you would want to send the source structure as a string, and the mapping as JSON:

source: @string(json(variables('str_dyn_structure')).structure)

mapping: @json(variables('str_dyn_translator')).translator
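In the pipeline JSON, the mapping expression ends up as a dynamic-content property, mirroring the translator block from the question (a sketch; str_dyn_translator is this answer's variable name, not a built-in):

```json
"translator": {
    "value": "@json(variables('str_dyn_translator')).translator",
    "type": "Expression"
}
```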

answered Mar 02 '26 by VladDrak