In Azure Data Factory, I had an HTTP linked service set up in a pipeline for copying data from an API; it used Basic authentication (username and password). The API now uses a bearer token to authorize calls. I've been able to code a solution in Python, but I don't know how to get Azure to handle this authentication process in the Copy step.
Is there a way to fetch the bearer token first and then pass it as the HTTP linked service password?
Python Script:
import http.client
import json

# Step 1: request a bearer token from the OAuth2 endpoint using Basic auth
conn = http.client.HTTPSConnection("www.url.com")
headers = {
    'authorization': "Basic [removed]",
    'cache-control': "no-cache",
}
conn.request("GET", "/v1/oauth2/accesstoken?grant_type=client_credentials", headers=headers)
res = conn.getresponse()
datajson = json.loads(res.read().decode("utf-8"))

# Step 2: call the data endpoint, authorizing with the bearer token
headers = {
    'authorization': "Bearer " + datajson["access_token"],
    'cache-control': "no-cache",
}
conn.request("GET", "/data?data-date=2018-12-09", headers=headers)
res = conn.getresponse()
print(res.read().decode("utf-8"))
Unfortunately, according to Copy data from an HTTP endpoint by using Azure Data Factory, the only supported authentication methods are Anonymous, Basic, Digest, Windows, and ClientCertificate.
However, you may be able to work around this by using the additionalHeaders property of the Dataset to pass the bearer token to the HTTP endpoint.
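As a rough sketch of that workaround, an HTTP dataset could take the token as a dataset parameter and inject it via additionalHeaders. The dataset, linked service, and parameter names below are placeholders, not anything from your factory:

{
    "name": "HttpSourceDataset",
    "properties": {
        "type": "HttpFile",
        "linkedServiceName": {
            "referenceName": "HttpLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "bearerToken": { "type": "String" }
        },
        "typeProperties": {
            "relativeUrl": "/data?data-date=2018-12-09",
            "requestMethod": "GET",
            "additionalHeaders": "@{concat('Authorization: Bearer ', dataset().bearerToken)}"
        }
    }
}

The Copy activity would then supply the bearerToken parameter value when it references this dataset.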
To get the token (and possibly even the data itself), you can use a Web activity in Azure Data Factory to perform the HTTP requests.
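A Web activity mirroring the first request of your Python script might look roughly like this (activity name and URL are illustrative; the Basic credential stays whatever you use today):

{
    "name": "GetBearerToken",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://www.url.com/v1/oauth2/accesstoken?grant_type=client_credentials",
        "method": "GET",
        "headers": {
            "authorization": "Basic [removed]"
        }
    }
}

Downstream activities in the same pipeline can then reference the parsed response, e.g. @activity('GetBearerToken').output.access_token, to pass the token along to the Copy step.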
Hope it helps!