Azure Data Factory (ADF) is a fully managed, cloud-based data orchestration service that enables data movement and transformation. With ADF you can create, schedule, and manage your data transformation and integration pipelines at scale. Wherever your data lives, in the cloud or on-premises, Azure Data Factory provides a hybrid data integration service. The Azure Portal includes integrated Azure Data Factory authoring tools for setting up your data pipelines. You can also set up a code repository for ADF to get an end-to-end integrated development and release experience, and you can schedule triggers for pipeline execution. In this post, let's quickly explore some useful tips for Azure Data Factory.
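Pipeline runs can also be started programmatically through the Azure management REST API. As a quick, minimal sketch (not an official client), the snippet below builds the management-plane URL for ADF's `createRun` operation; the subscription ID, resource group, factory, and pipeline names are placeholder assumptions.

```python
# Sketch: build the Azure REST endpoint for starting an ADF pipeline run.
# All names below are hypothetical placeholders.
API_VERSION = "2018-06-01"  # GA api-version for Microsoft.DataFactory

def create_run_url(subscription_id: str, resource_group: str,
                   factory_name: str, pipeline_name: str) -> str:
    """Return the management-plane URL for the pipelines/createRun operation."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}"
        f"/pipelines/{pipeline_name}/createRun"
        f"?api-version={API_VERSION}"
    )

url = create_run_url("00000000-0000-0000-0000-000000000000",
                     "my-rg", "my-factory", "CopySalesData")
print(url)
```

An authenticated `POST` to this URL starts the run and returns a `runId`, which you can then look up in ADF's monitoring views or APIs.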
8 Azure Data Factory Tips & Tricks You Should Know
- Invoke Azure Data Factory Pipeline from another pipeline
- Trigger Azure Data Factory Pipeline when a Blob is created or deleted
- How to schedule a trigger for an Azure Data Factory (ADF) Pipeline?
- Automatic Azure Data Factory Pipeline Creation using Copy Data Wizard
- Exploring Azure Data Factory Activity Execution Details
- How to Monitor an Azure Data Factory Pipeline?
- Select Azure Data Factory in Authoring Tool
- Setting up Code Repository for Azure Data Factory
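Taking the first tip as an example, invoking one pipeline from another is done with ADF's Execute Pipeline activity. Below is a minimal sketch of that activity definition, built as a Python dict so it can be dumped as the JSON ADF expects; the activity and pipeline names ("RunChildPipeline", "ChildPipeline") are hypothetical.

```python
import json

# Sketch of an Execute Pipeline activity, which lets a parent pipeline
# invoke another pipeline in the same data factory.
# "RunChildPipeline" and "ChildPipeline" are hypothetical names.
execute_pipeline_activity = {
    "name": "RunChildPipeline",
    "type": "ExecutePipeline",
    "typeProperties": {
        "pipeline": {
            "referenceName": "ChildPipeline",  # the pipeline being invoked
            "type": "PipelineReference",
        },
        "waitOnCompletion": True,  # parent waits until the child run finishes
    },
}

print(json.dumps(execute_pipeline_activity, indent=2))
```

With `waitOnCompletion` set to true, the parent pipeline blocks on the child run and reflects its outcome, which is handy for chaining dependent stages.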