
I'm working in Azure Data Factory (ADF) to extract data from an Oracle database and store it temporarily in Azure Data Lake. I then use a Data Flow to load this data into a table in Dataverse.

However, when dealing with a large volume of data (e.g., 1,000,000 records), only about 100,000 records are successfully loaded before encountering an error due to API call limitations in Dataverse.

What would be the most recommended approach to handle this scenario?

asked Jan 8 at 0:10 by tavo92
  • Can you share what approach you have followed so far? – Dileep Raj Narayan Thumula Commented Jan 8 at 2:18
  • Sure, I have created a Copy Data activity to extract data from Oracle and create a parquet file in the Data Lake. After this activity, I have set up a Data Flow to perform some transformations, using the Data Lake file as the source and sending the transformed data to a table using the Dataverse connector. – tavo92 Commented Jan 8 at 2:46
  • What is the exact error? – JayashankarGS Commented Jan 8 at 10:03

1 Answer


As you mentioned, you are running into Dataverse API call limits when moving a large volume of data from ADLS to Dataverse using a Copy activity and a Data Flow in Azure Data Factory.

Below are two approaches:

Using a Copy activity: select the Dataverse table as the sink dataset, set the write behavior in the sink settings to Upsert, and specify an alternate key.
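For illustration, here is a minimal sketch of what the sink section of the Copy activity JSON could look like, assuming the target Dataverse table already has an alternate key defined (the key name and batch size below are placeholders):

```json
{
    "sink": {
        "type": "CommonDataServiceForAppsSink",
        "writeBehavior": "upsert",
        "alternateKeyName": "my_alternate_key",
        "writeBatchSize": 10,
        "ignoreNullValues": false
    }
}
```

With upsert, rows that match an existing record on the alternate key are updated rather than inserted, so a partially failed load can be re-run without creating duplicates.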

Using a Mapping Data Flow:

Add an Alter Row transformation with an Upsert if condition; the upsert operation in the sink requires an Alter Row transformation upstream.

Next, in the sink, set the update method for the Dataverse table to allow upsert and provide the alternate key name.
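As a rough sketch of the data flow script for this pattern (the stream names and alternate key are placeholders, and the exact sink property names should be checked against the script your data flow generates):

```
ParquetSource alterRow(upsertIf(true())) ~> MarkUpsert
MarkUpsert sink(allowSchemaDrift: true,
    validateSchema: false,
    deletable: false,
    insertable: false,
    updateable: false,
    upsertable: true,
    alternateKeyName: 'my_alternate_key',
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> DataverseSink
```

upsertIf(true()) marks every incoming row for upsert; a narrower condition can be used if only some rows should be upserted.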

Reference: Bulk insert into Dataverse using ADF
