In an Azure Data Factory pipeline, can I have a copy activity with two SINKs? I have one source and two sinks (one Azure Data Lake Store for downstream processing, and the other Blob Storage for archival).
That’s definitely possible. A single copy activity only supports one source and one sink, so just add a second copy activity to the same pipeline with the same input dataset but a different output dataset.
The JSON will then look something like this:
{
  "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-09-01/Microsoft.DataFactory.Pipeline.json",
  "name": "CopyActivity1",
  "properties": {
    "description": "Copy data from a blob to two Azure SQL tables",
    "activities": [
      {
        "name": "CopyActivityTemplate",
        "type": "Copy",
        "inputs": [
          {
            "name": "AzureBlobLocation1"
          }
        ],
        "outputs": [
          {
            "name": "AzureSqlTableLocation1"
          }
        ],
        "typeProperties": {
          "source": {
            "type": "BlobSource"
          },
          "sink": {
            "type": "SqlSink"
          }
        }
      },
      {
        "name": "CopyActivityTemplate2",
        "type": "Copy",
        "inputs": [
          {
            "name": "AzureBlobLocation1"
          }
        ],
        "outputs": [
          {
            "name": "AzureSqlTableLocation2"
          }
        ],
        "typeProperties": {
          "source": {
            "type": "BlobSource"
          },
          "sink": {
            "type": "SqlSink"
          }
        }
      }
    ],
    "start": "2016-12-05T22:00:00Z",
    "end": "2016-12-06T01:00:00Z"
  }
}
So basically you need to add another copy activity with the same source. Note that each activity's typeProperties needs both a source and a sink; the example above uses SqlSink to match the Azure SQL table output datasets.
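For the actual scenario in the question (one sink in Azure Data Lake Store for downstream processing and one in Blob Storage for archival), the sink types would be AzureDataLakeStoreSink and BlobSink instead of SqlSink, and the two output datasets might look something like the sketch below. The dataset names, linked service names, and folder paths here are placeholders, not values from the question; each object is deployed as its own dataset JSON file.

{
  "name": "DataLakeStoreOutput",
  "properties": {
    "type": "AzureDataLakeStore",
    "linkedServiceName": "AzureDataLakeStoreLinkedService",
    "typeProperties": {
      "folderPath": "processing/"
    },
    "availability": {
      "frequency": "Hour",
      "interval": 1
    }
  }
}

{
  "name": "BlobArchiveOutput",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "AzureStorageLinkedService",
    "typeProperties": {
      "folderPath": "archive/",
      "format": {
        "type": "TextFormat"
      }
    },
    "availability": {
      "frequency": "Hour",
      "interval": 1
    }
  }
}

The two copy activities would then reference DataLakeStoreOutput and BlobArchiveOutput in their respective outputs sections.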