Approach: Hybrid Data Integration with Data Factory - NeosAlpha

Hybrid Integration

Azure Data Factory is a cloud-based data integration service built for complex hybrid extract-transform-load (ETL), extract-load-transform (ELT), and data integration projects. It lets you create data-driven workflows in the cloud that orchestrate and automate data movement and data transformation.
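The idea of a data-driven workflow can be sketched in a few lines: activities run in order, each consuming the previous activity's output. This is a minimal illustration of the orchestration concept in plain Python, not the Data Factory API; the activity names and sample data are invented for the example.

```python
# Minimal sketch of a data-driven workflow: activities run in sequence,
# each receiving the previous activity's output. Names and logic are
# illustrative placeholders, not Data Factory constructs.

def extract():
    # Pretend to pull raw rows from a source system.
    return [{"id": 1, "amount": "12.50"}, {"id": 2, "amount": "7.25"}]

def transform(rows):
    # Refine raw string values into a consumable, typed form.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows):
    # Stand-in for loading results into an analytical store.
    return sum(r["amount"] for r in rows)

def run_pipeline(activities, data=None):
    # Orchestrate: run each activity in order, passing results along.
    for activity in activities:
        data = activity(data) if data is not None else activity()
    return data

total = run_pipeline([extract, transform, load])
print(total)  # 19.75
```

In Data Factory itself the same movement and transformation steps are expressed declaratively as pipeline activities rather than Python functions, but the dependency-ordered flow is the same.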

Azure Data Factory has helped our customers aggregate their multiple sources of data in Azure for advanced analytics and data visualisation. Some of our customers run Big Data analytics on top of the aggregated data using Azure Data Lake Analytics and Azure HDInsight.

Connect and collect

Enterprises hold structured, semi-structured, and unstructured data in disparate sources, both on-premises and in the cloud, all arriving at different intervals and speeds.

The first step in building an information production system is to connect to all the required sources of data for processing, such as software-as-a-service (SaaS) services, databases, file shares, and FTP web services. The next step is to move the data as needed to a centralised location for subsequent processing.
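The "connect and collect" step above is typically expressed in Data Factory as a copy-activity pipeline definition. Below is a hedged sketch of such a definition, built as a Python dict and serialised to JSON; the pipeline, activity, and dataset names are hypothetical placeholders, and a real definition would also reference linked services configured in your factory.

```python
import json

# Sketch of a Data Factory copy-activity pipeline that moves data from
# an on-premises file share into a centralised blob store. All names
# (pipeline, activities, datasets) are hypothetical examples.
pipeline = {
    "name": "CopySalesToCentralStore",
    "properties": {
        "activities": [
            {
                "name": "CopyFromFileShareToBlob",
                "type": "Copy",
                "inputs": [{"referenceName": "OnPremSalesFiles",
                            "type": "DatasetReference"}],
                "outputs": [{"referenceName": "CentralBlobStore",
                             "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "FileSystemSource"},
                    "sink": {"type": "BlobSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

Each dataset reference points at a source or sink defined separately, which is what lets one pipeline connect SaaS services, databases, file shares, and FTP endpoints through the same copy mechanism.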

Data Processing and Movement

Once collected in centralised cloud storage, the data is processed and transformed by compute services. Raw data refined into a consumable form is then loaded into Azure SQL Data Warehouse, Azure SQL Database, on-premises SQL Server, Azure Cosmos DB, or whichever analytical engine the business requires.
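What "refined into a consumable form" means in practice is parsing, typing, and validating raw records before they reach the analytical store. The sketch below illustrates that step in plain Python with invented sample data; in Data Factory this logic would live in a compute service such as a data flow or an HDInsight job.

```python
from datetime import date

# Illustrative transform step: refine raw, loosely typed records into a
# clean, consumable shape before loading into an analytical store.
# The sample records are invented for the example.
raw = [
    {"order_id": "A-1", "placed": "2024-03-01", "total": "120.00"},
    {"order_id": "A-2", "placed": "2024-03-02", "total": "bad-value"},
    {"order_id": "A-3", "placed": "2024-03-02", "total": "80.50"},
]

def refine(records):
    clean = []
    for r in records:
        try:
            clean.append({
                "order_id": r["order_id"],
                "placed": date.fromisoformat(r["placed"]),
                "total": float(r["total"]),  # rejects non-numeric totals
            })
        except ValueError:
            continue  # rows failing validation are dropped, not loaded
    return clean

rows = refine(raw)
print(len(rows), sum(r["total"] for r in rows))  # 2 200.5
```

The refined rows, now consistently typed, are what gets loaded into the chosen analytical engine.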

Consumption

Processed data is now ready for consumption by various business functions, such as analytics and data visualisation, according to business requirements.