Azure Data Architecture Guide – Blog #9: Extract, transform, load (ETL)

This post has been republished via RSS; it originally appeared at: Azure Global articles.

This is our ninth and final blog entry exploring the Azure Data Architecture Guide. The previous entries for this blog series are:

Like all the previous posts in this series, we'll work from a technology implementation seen directly in our customer engagements. The example can help lead you to the ADAG content to make the right technology choices for your business.

  

Extract, transform, load (ETL)

 

In this example, web application logs and custom telemetry are captured with Application Insights and sent to Azure Storage blobs. The ETL pipeline is then created, scheduled, and managed with Azure Data Factory. SSIS packages are deployed to Azure, using the Azure-SSIS integration runtime (IR) in Azure Data Factory, to apply data transformations as a step in the ETL pipeline before the transformed data is loaded into Azure SQL Database.
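As a rough illustration, the SSIS transformation step described above can be expressed as an Azure Data Factory pipeline containing an Execute SSIS Package activity. This is a minimal sketch: the pipeline, package, and runtime names (`AdagEtlPipeline`, `TransformTelemetry.dtsx`, `AzureSsisIR`) are hypothetical, and the exact property layout may vary across Data Factory API versions.

```json
{
  "name": "AdagEtlPipeline",
  "properties": {
    "activities": [
      {
        "name": "TransformWithSsis",
        "type": "ExecuteSSISPackage",
        "typeProperties": {
          "packageLocation": {
            "packagePath": "EtlFolder/EtlProject/TransformTelemetry.dtsx",
            "type": "SSISDB"
          },
          "connectVia": {
            "referenceName": "AzureSsisIR",
            "type": "IntegrationRuntimeReference"
          }
        }
      }
    ]
  }
}
```

A definition like this can be deployed and then scheduled with a Data Factory trigger, so the SSIS transformation runs as one managed step of the wider ETL pipeline.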

   

[Figure: ETL architecture diagram (ADAG_ETL.png)]

 

Highlighted services

  

Related ADAG articles

    

Please peruse ADAG to find a clear path to architecting your data solution on Azure:

     

 


Azure CAT Guidance

"Hands-on solutions, with our heads in the Cloud!"
