
Reasons to use Azure Data Lake Analytics vs Traditional ETL approach

I'm considering using Data Lake technologies, which I have been studying for the last few weeks, as an alternative to the traditional SSIS ETL scenarios I have been working with for many years.

I think of Data Lake as something very linked to big data, but where is the line between using Data Lake technologies vs SSIS?

Is there any advantage to using Data Lake technologies with files in the 25MB ~ 100MB ~ 300MB range? Parallelism? Flexibility? Extensibility in the future? Is there any performance gain when the files to be loaded are not as big as U-SQL's best-case scenario...

What are your thoughts? Would it be like using a hammer to crack a nut? Please, don't hesitate to ask me any questions to clarify the situation. Thanks in advance!!

EDIT 21/03: more clarifications:

  1. It has to be in the cloud.
  2. The reason I considered ADL is that there is no substitute for SSIS in the cloud. There is ADF, but it's not the same: it orchestrates data movement, but it's not as flexible as SSIS.
  3. I thought I could use U-SQL for some (basic) transformations, but I see some problems:
    • There are many basic things I cannot do: loops, updates, writing logs to a SQL database...
    • The output can only be a U-SQL table or a file. The architecture doesn't look good this way (even though U-SQL is very good with big files, I need an extra step to export the output to another DB or DWH). Or maybe this is just how it's done in big data warehouses... I don't know.
    • In my tests, it takes 40s for a 1MB file and 1:15 for a 500MB file. I cannot justify a 40s process for 1MB (plus uploading to the database/data warehouse with ADF).
    • The code looks disorganised to a user, as scripts with many basic validations end up as very long U-SQL scripts.
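
To make the output limitation above concrete, here is a minimal U-SQL sketch (the paths and columns are hypothetical, invented for illustration): a script can only EXTRACT from files and OUTPUT to files or U-SQL catalog tables, so landing results in a SQL database or DWH needs a separate copy step such as ADF.

```usql
// Hypothetical minimal U-SQL script: read a CSV, aggregate, write a CSV.
// Paths and schema are invented for illustration.
@orders =
    EXTRACT CustomerId int,
            Amount decimal,
            OrderDate DateTime
    FROM "/input/orders.csv"
    USING Extractors.Csv(skipFirstNRows: 1);

@totals =
    SELECT CustomerId,
           SUM(Amount) AS TotalAmount
    FROM @orders
    GROUP BY CustomerId;

// The only supported sinks are files and U-SQL catalog tables;
// loading a SQL DB/DWH afterwards requires a separate step (e.g. ADF).
OUTPUT @totals
TO "/output/customer_totals.csv"
USING Outputters.Csv(outputHeader: true);
```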

Don't get me wrong, I really like ADL technologies, but I think that, for now, they are for something very specific, and there is still no substitute for SSIS in the cloud. What do you think? Am I wrong?

Carlos Moreno asked Mar 17 '17 08:03




3 Answers

For me, if the data is highly structured and relational, the right place for it is a relational database. In Azure you have several choices:

  1. SQL Server on a VM (IaaS): ordinary SQL Server running on a VM. You have to install, configure and manage it yourself, but you get the full flexibility of the product.
  2. Azure SQL Database: PaaS database option targeted at smaller volumes, but now up to 4TB. All of the features of normal SQL Server, with potentially lower TCO and the option to scale up or down using tiers.
  3. Azure SQL Data Warehouse (ADW): MPP product suitable for large warehouses. For me, the entry criterion is warehouses at least 1TB in size, and probably more like 10TB. It's really not worth having an MPP system for small volumes.

For all database options you can use clustered columnstore indexes (the default in ADW), which can give massive compression, between 5x and 10x.

400MB per day for a year totals ~143GB, which honestly is not that much in modern data warehouse terms, where sizes are normally measured in terabytes (TB).

Where Azure Data Lake Analytics (ADLA) comes in, is doing things you cannot do in ordinary SQL, like:

  • combining the power of C# with SQL for powerful queries - example here
  • dealing with unstructured files like images, XML or JSON - example here
  • using RegEx
  • scaling out R processing - example here
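
As a rough sketch of the first point, a U-SQL SELECT can call arbitrary .NET code inline (here System.Text.RegularExpressions and a string method); the file path and columns are hypothetical, invented for illustration:

```usql
// Hypothetical example of mixing C# with SQL in U-SQL: the expressions
// in the SELECT are ordinary .NET calls.
@logs =
    EXTRACT UserAgent string,
            Url string
    FROM "/input/weblogs.tsv"
    USING Extractors.Tsv();

@classified =
    SELECT Url,
           UserAgent.ToUpperInvariant() AS NormalizedAgent,
           System.Text.RegularExpressions.Regex.IsMatch(UserAgent, "(?i)bot|crawler|spider") AS IsBot
    FROM @logs;

OUTPUT @classified
TO "/output/classified_requests.tsv"
USING Outputters.Tsv();
```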

ADLA also offers federated queries: the ability to "query data where it lives", i.e. bring together structured data from your database and unstructured data from your lake.

Your decision seems more to do with whether or not you should be using the cloud. If you need the elastic and scalable features of cloud then Azure Data Factory is the tool for moving data from place to place in the cloud.

HTH

wBob answered Nov 01 '22 18:11


Be careful. This question is likely to get closed for being too broad.

There are many arguments for and against. We can't discuss them all here.

ADL isn't a replacement for SSIS. The consultant's answer, as always, will be: it depends what you're doing / trying to do.

A simplistic answer might be: ADL is unlimited and highly scalable; SSIS is not. But, yes, because of that scalability, ADL has a high entry cost for small files.

Generally I don't think the two technologies are comparable.

If you want SSIS in Azure. Wait for MS to release it as a PaaS. Or use a virtual machine.

Paul Andrew answered Nov 01 '22 18:11


I think for simpler transformations it may be a good solution; however, if you have complexities, notifications, etc., it may fall short. A typical scenario would be transforming a JSON document to CSV, then taking the CSV and running it through SSIS for further transforms. There is certainly a future state that will enable U-SQL to be much more powerful; for now, I think there are separate and distinct uses for U-SQL/ADLA/ADLS and SSIS.
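
The JSON-to-CSV step in that scenario might be sketched in U-SQL like this. This is only an illustration: it assumes the sample JSON extractor from the Azure U-SQL samples (Microsoft.Analytics.Samples.Formats, which depends on Newtonsoft.Json) has been registered in the ADLA catalog, and the paths and columns are hypothetical.

```usql
// Hypothetical JSON-to-CSV step. Assumes the sample JSON extractor
// assemblies are registered in the ADLA catalog.
REFERENCE ASSEMBLY [Newtonsoft.Json];
REFERENCE ASSEMBLY [Microsoft.Analytics.Samples.Formats];

@docs =
    EXTRACT Id string,
            Name string,
            Amount string
    FROM "/input/documents.json"
    USING new Microsoft.Analytics.Samples.Formats.Json.JsonExtractor();

// Flatten to CSV so SSIS (or another tool) can take over downstream.
OUTPUT @docs
TO "/output/documents.csv"
USING Outputters.Csv(outputHeader: true);
```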

Carolus Holman answered Nov 01 '22 18:11