---
title: include file
description: include file
services: data-factory
author: chez-charlie
ms.service: data-factory
ms.topic: include
ms.date: 11/16/2020
ms.author: chez
ms.custom: include file
---

Azure Data Factory is a multitenant service that has the following default limits in place to make sure customer subscriptions are protected from each other's workloads. To raise the limits up to the maximum for your subscription, contact support.

## Version 2

| Resource | Default limit | Maximum limit |
| --- | --- | --- |
| Data factories in an Azure subscription | 800 | 800 |
| Total number of entities, such as pipelines, data sets, triggers, linked services, Private Endpoints, and integration runtimes, within a data factory | 5,000 | Contact support. |
| Total CPU cores for Azure-SSIS Integration Runtimes under one subscription | 256 | Contact support. |
| Concurrent pipeline runs per data factory (shared among all pipelines in the factory) | 10,000 | 10,000 |
| Concurrent External activity runs per subscription per Azure Integration Runtime region<br/>External activities are managed on integration runtime but execute on linked services, including Databricks, stored procedure, HDInsight, Web, and others. This limit does not apply to Self-hosted IR. | 3,000 | 3,000 |
| Concurrent Pipeline activity runs per subscription per Azure Integration Runtime region<br/>Pipeline activities execute on integration runtime, including Lookup, GetMetadata, and Delete. This limit does not apply to Self-hosted IR. | 1,000 | 1,000 |
| Concurrent authoring operations per subscription per Azure Integration Runtime region<br/>Including test connection, browse folder list and table list, preview data. This limit does not apply to Self-hosted IR. | 200 | 200 |
| Concurrent Data Integration Units<sup>1</sup> consumption per subscription per Azure Integration Runtime region | Region group 1<sup>2</sup>: 6,000<br/>Region group 2<sup>2</sup>: 3,000<br/>Region group 3<sup>2</sup>: 1,500 | Region group 1<sup>2</sup>: 6,000<br/>Region group 2<sup>2</sup>: 3,000<br/>Region group 3<sup>2</sup>: 1,500 |
| Maximum activities per pipeline, which includes inner activities for containers | 40 | 40 |
| Maximum number of linked integration runtimes that can be created against a single self-hosted integration runtime | 100 | Contact support. |
| Maximum parameters per pipeline | 50 | 50 |
| ForEach items | 100,000 | 100,000 |
| ForEach parallelism | 20 | 50 |
| Maximum queued runs per pipeline | 100 | 100 |
| Characters per expression | 8,192 | 8,192 |
| Minimum tumbling window trigger interval | 15 min | 15 min |
| Maximum timeout for pipeline activity runs | 7 days | 7 days |
| Bytes per object for pipeline objects<sup>3</sup> | 200 KB | 200 KB |
| Bytes per object for dataset and linked service objects<sup>3</sup> | 100 KB | 2,000 KB |
| Bytes per payload for each activity run<sup>4</sup> | 896 KB | 896 KB |
| Data Integration Units<sup>1</sup> per copy activity run | 256 | 256 |
| Write API calls<br/>This limit is imposed by Azure Resource Manager, not Azure Data Factory. | 1,200/h | 1,200/h |
| Read API calls<br/>This limit is imposed by Azure Resource Manager, not Azure Data Factory. | 12,500/h | 12,500/h |
| Monitoring queries per minute | 1,000 | 1,000 |
| Maximum time of data flow debug session | 8 hrs | 8 hrs |
| Concurrent number of data flows per integration runtime | 50 | Contact support. |
| Concurrent number of data flow debug sessions per user per factory | 3 | 3 |
| Data Flow Azure IR TTL limit | 4 hrs | 4 hrs |

<sup>1</sup> The data integration unit (DIU) is used in a cloud-to-cloud copy operation. Learn more from Data integration units (version 2). For information on billing, see Azure Data Factory pricing.
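
For example, when you author a copy activity you can cap its DIU usage explicitly; the value must stay at or below the 256 DIUs-per-copy-activity-run limit in the table above. The following is a minimal sketch of where that setting lives in a copy activity definition, expressed here as a Python dict; the activity and dataset names are hypothetical.

```python
# Minimal sketch of a copy activity definition (hypothetical names and datasets).
# dataIntegrationUnits must not exceed the 256 DIUs-per-copy-activity-run limit.
copy_activity = {
    "name": "CopySampleBlobData",  # hypothetical activity name
    "type": "Copy",
    "inputs": [{"referenceName": "SourceBlobDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "SinkBlobDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "BlobSource"},
        "sink": {"type": "BlobSink"},
        "dataIntegrationUnits": 128,  # <= 256 per copy activity run
    },
}

assert copy_activity["typeProperties"]["dataIntegrationUnits"] <= 256
```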

<sup>2</sup> Azure Integration Runtime is globally available to ensure data compliance, efficiency, and reduced network egress costs.

| Region group | Regions |
| --- | --- |
| Region group 1 | Central US, East US, East US 2, North Europe, West Europe, West US, West US 2 |
| Region group 2 | Australia East, Australia Southeast, Brazil South, Central India, Japan East, North Central US, South Central US, Southeast Asia, West Central US |
| Region group 3 | Canada Central, East Asia, France Central, Korea Central, UK South |

<sup>3</sup> Pipeline, data set, and linked service objects represent a logical grouping of your workload. Limits for these objects don't relate to the amount of data you can move and process with Azure Data Factory. Data Factory is designed to scale to handle petabytes of data.

<sup>4</sup> The payload for each activity run includes the activity configuration, the associated dataset and linked service configurations if any, and a small portion of system properties generated per activity type. The limit for this payload size doesn't relate to the amount of data you can move and process with Azure Data Factory. Learn about the symptoms and recommendations if you hit this limit.
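
Several of the per-pipeline limits above (activities per pipeline, parameters per pipeline, ForEach parallelism) can be checked before you deploy. The following is a minimal sketch of such a pre-deployment check, assuming your pipeline definitions are available as exported ADF pipeline JSON with the usual `properties.activities` layout; the limit values are the defaults from the version 2 table.

```python
# Minimal sketch: validate an exported pipeline definition against a few of the
# version 2 limits listed above (default values from the table).
import json

MAX_ACTIVITIES_PER_PIPELINE = 40   # includes inner activities for containers
MAX_PARAMETERS_PER_PIPELINE = 50
MAX_FOREACH_PARALLELISM = 20       # default; 50 is the maximum limit

def count_activities(activities):
    """Count activities, recursing into container activities such as ForEach."""
    total = 0
    for activity in activities:
        total += 1
        inner = activity.get("typeProperties", {}).get("activities", [])
        total += count_activities(inner)
    return total

def check_pipeline(path):
    """Return a list of limit violations found in one exported pipeline JSON file."""
    with open(path) as f:
        pipeline = json.load(f)
    props = pipeline.get("properties", {})
    activities = props.get("activities", [])

    problems = []
    if count_activities(activities) > MAX_ACTIVITIES_PER_PIPELINE:
        problems.append("too many activities (limit: 40, including inner activities)")
    if len(props.get("parameters", {})) > MAX_PARAMETERS_PER_PIPELINE:
        problems.append("too many pipeline parameters (limit: 50)")
    for activity in activities:
        if activity.get("type") == "ForEach":
            batch = activity.get("typeProperties", {}).get("batchCount", 0)
            if batch > MAX_FOREACH_PARALLELISM:
                problems.append(
                    f"ForEach '{activity.get('name')}' batchCount {batch} exceeds the default of 20"
                )
    return problems
```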

## Version 1

| Resource | Default limit | Maximum limit |
| --- | --- | --- |
| Pipelines within a data factory | 2,500 | Contact support. |
| Data sets within a data factory | 5,000 | Contact support. |
| Concurrent slices per data set | 10 | 10 |
| Bytes per object for pipeline objects<sup>1</sup> | 200 KB | 200 KB |
| Bytes per object for data set and linked service objects<sup>1</sup> | 100 KB | 2,000 KB |
| Azure HDInsight on-demand cluster cores within a subscription<sup>2</sup> | 60 | Contact support. |
| Cloud data movement units per copy activity run<sup>3</sup> | 32 | 32 |
| Retry count for pipeline activity runs | 1,000 | MaxInt (32 bit) |

<sup>1</sup> Pipeline, data set, and linked service objects represent a logical grouping of your workload. Limits for these objects don't relate to the amount of data you can move and process with Azure Data Factory. Data Factory is designed to scale to handle petabytes of data.

<sup>2</sup> On-demand HDInsight cores are allocated out of the subscription that contains the data factory. As a result, the previous limit is the Data Factory-enforced core limit for on-demand HDInsight cores. It's different from the core limit that's associated with your Azure subscription.

<sup>3</sup> The cloud data movement unit (DMU) for version 1 is used in a cloud-to-cloud copy operation. Learn more from Cloud data movement units (version 1). For information on billing, see Azure Data Factory pricing.

| Resource | Default lower limit | Minimum limit |
| --- | --- | --- |
| Scheduling interval | 15 minutes | 15 minutes |
| Interval between retry attempts | 1 second | 1 second |
| Retry timeout value | 1 second | 1 second |

## Web service call limits

Azure Resource Manager imposes its own limits on API calls. Keep your call rate within the Azure Resource Manager API limits.
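
When a caller exceeds these Azure Resource Manager limits, subsequent requests are throttled (HTTP 429, typically with a `Retry-After` header). The following is a minimal sketch of a client-side backoff for that case, assuming you call the REST API directly; the helper name and retry policy are illustrative, not part of any SDK.

```python
# Minimal sketch: back off when Azure Resource Manager throttles a request (HTTP 429).
# The URL, headers (including the bearer token), and retry policy are placeholders.
import time
import requests

def get_with_backoff(url, headers, max_attempts=5):
    """Issue a GET, retrying with backoff whenever the response is throttled (429)."""
    for attempt in range(max_attempts):
        response = requests.get(url, headers=headers)
        if response.status_code != 429:
            return response
        # Respect Retry-After if present; otherwise back off exponentially.
        wait = int(response.headers.get("Retry-After", 2 ** attempt))
        time.sleep(wait)
    return response
```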