I'm working in Azure Data Factory (ADF) to extract data from an Oracle database and store it temporarily in Azure Data Lake. I then use a Data Flow to load this data into a table in Dataverse.
However, when dealing with a large volume of data (e.g., 1,000,000 records), only about 100,000 records are successfully loaded before encountering an error due to API call limitations in Dataverse.
What would be the most recommended approach to handle this scenario?
asked Jan 8 at 0:10 by tavo92
- Can you share what approach you have followed so far? – Dileep Raj Narayan Thumula Commented Jan 8 at 2:18
- Sure, I have created a Copy Data activity to extract data from Oracle and create a parquet file in the Data Lake. After this activity, I have set up a Data Flow to perform some transformations, using the Data Lake file as the source and sending the transformed data to a table using the Dataverse connector. – tavo92 Commented Jan 8 at 2:46
- what is the exact error? – JayashankarGS Commented Jan 8 at 10:03
1 Answer
As you mentioned, you are running into Dataverse API call limits when moving a large volume of data from ADLS to Dataverse using a Data Flow and Copy activity in Azure Data Factory.
Below are two approaches:
Using Copy Activity:
Select the Dataverse table as the sink dataset, configure the sink settings with Upsert as the write behavior, and specify an alternate key (see the sketch below).
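For illustration, here is a minimal sketch of what the Copy activity could look like in pipeline JSON, assuming a staged Parquet file in ADLS as the source and the Dataverse (Common Data Service for Apps) connector as the sink. The activity name, dataset names, alternate key name, and batch size are placeholders to adjust for your environment:

```json
{
  "name": "CopyStagedParquetToDataverse",
  "type": "Copy",
  "inputs": [
    { "referenceName": "StagedParquetDataset", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "DataverseTableDataset", "type": "DatasetReference" }
  ],
  "typeProperties": {
    "source": {
      "type": "ParquetSource",
      "storeSettings": { "type": "AzureBlobFSReadSettings", "recursive": true }
    },
    "sink": {
      "type": "CommonDataServiceForAppsSink",
      "writeBehavior": "upsert",
      "alternateKeyName": "my_alternate_key",
      "writeBatchSize": 1000,
      "ignoreNullValues": false
    }
  }
}
```

Using upsert with an alternate key also lets you re-run the copy after a failure without creating duplicate rows, since existing records are matched on the key.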
Using Mapping Data Flows:
The Upsert operation requires an Alter Row transformation, so add one and set its Upsert if condition. Then, in the sink, set the update method for the Dataverse table to allow upsert (see the script sketch below).
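For reference, a rough sketch of the data flow script this setup produces, assuming the staged Parquet file as the source and a Dataverse sink with only upsert enabled; the stream names are placeholders, and upsertIf(true()) simply marks every incoming row for upsert:

```
source(allowSchemaDrift: true,
    validateSchema: false) ~> StagedParquet
StagedParquet alterRow(upsertIf(true())) ~> MarkForUpsert
MarkForUpsert sink(allowSchemaDrift: true,
    validateSchema: false,
    deletable: false,
    insertable: false,
    updateable: false,
    upsertable: true,
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> DataverseSink
```

Depending on the target table, you may also need to select an alternate key in the sink settings so Dataverse can match existing rows instead of creating duplicates.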
Reference: Bulk insert into Dataverse using ADF