
Working with Large Datasets in Power Automate

Working with large datasets in Power Automate can be a challenge, but there are several strategies you can use to optimize your workflows and keep your data processing tasks efficient and effective. By default, pagination is not enabled in Power Automate, so an action retrieves only a limited number of records from the connector. For example, the “List rows” action in the Dataverse connector returns at most 5,000 records per request by default.

How to retrieve more records?

Option 1: Paging option

Follow the steps below to add the List rows action to your flow and return up to 5,000 sales order lines from the Sales order lines table in Dataverse.

1. Create a new flow in Power Automate and select New step to add an action to your flow.

2. Enter list rows into the Search connectors and actions search box on the Choose an operation card.

3. Select Microsoft Dataverse to filter the search results to display only actions and triggers for Microsoft Dataverse.

4. Select List rows.

5. Select Sales order lines from the Table name list.


Turn on the pagination option on List rows

1. In the upper-right corner of the List rows card, select the menu (…).

2. Select Settings.

 

3. Move the Pagination slider to the on position if it’s not already turned on.

4. In Threshold, enter the maximum number of rows to request. The maximum configurable threshold is 100,000. Internally, this number is rounded up in increments of the default page size. For example, if the page size is 5,000 and you enter 7,000, the number of rows returned is 10,000.

Option 2: Skip token

In Power Automate, a “skip token” is a value that certain data connectors return to let you retrieve additional pages of data when using pagination. When you retrieve a large dataset from a connector page by page, the connector may return a skip token along with the data for each page.

 

The skip token is essentially a bookmark that marks the last record retrieved on the current page. When you pass the skip token in the next iteration of your loop, the connector skips the records that were already returned and responds with the next set of records in the dataset.

 

Because Power Automate applies content throughput and message size limits to ensure general service guarantees, it’s often useful to use pagination to return a smaller batch of rows at a time, rather than relying on the default limit on the number of table rows returned.

The default page limit of 5,000 rows applies if you don’t use pagination.

To use it, implement a loop that parses the @odata.nextLink value in the JSON response, extracts the skip token, and sends another request until you’ve listed the number of rows you need.
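As an illustration, here is a minimal Python sketch of that loop against the Dataverse Web API. The environment URL, access token, and column names are hypothetical placeholders; the Prefer: odata.maxpagesize header controls how many rows come back per page.

```python
import requests

# Hypothetical environment URL and token; replace with your own values.
BASE = "https://yourorg.crm.dynamics.com/api/data/v9.2"
HEADERS = {
    "Authorization": "Bearer <access-token>",  # acquired via MSAL or similar
    "Prefer": "odata.maxpagesize=5000",        # rows returned per page
}

url = f"{BASE}/salesorderdetails?$select=productid,quantity"
rows = []
while url:
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    body = resp.json()
    rows.extend(body["value"])
    # @odata.nextLink carries the skip token; it's absent on the last page.
    url = body.get("@odata.nextLink")

print(f"Retrieved {len(rows)} rows")
```

Following the ready-made @odata.nextLink URL, rather than assembling the skip token by hand, is the simplest way to keep the loop correct.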

Option 3: FetchXML query

FetchXML is a query language used in Dynamics 365 and Power Apps to retrieve data from the Dataverse. When using FetchXML queries with the Dataverse connector in Power Automate, you can enable pagination to retrieve large datasets in smaller chunks.

To page through results with a FetchXML query in Power Automate, add the page, count, and “paging-cookie” attributes to the <fetch> element of your query. The paging cookie is a token, returned with each page of results, that marks the last record retrieved; passing it back in the next request retrieves the next page of data in the dataset.
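As a rough sketch of the same idea in Python against the Dataverse Web API (the environment URL, token, and columns are hypothetical placeholders): the trickiest part in practice is that the paging cookie commonly comes back double-encoded inside an annotation, and has to be unwrapped before it can be placed back into the <fetch> element.

```python
import requests
import xml.etree.ElementTree as ET
from urllib.parse import unquote
from xml.sax.saxutils import escape

BASE = "https://yourorg.crm.dynamics.com/api/data/v9.2"  # hypothetical
HEADERS = {
    "Authorization": "Bearer <access-token>",
    "Prefer": 'odata.include-annotations="*"',  # ask for the paging-cookie annotation
}

FETCH = (
    '<fetch page="{page}" count="5000"{cookie}>'
    '<entity name="salesorderdetail">'
    '<attribute name="productid" /><attribute name="quantity" />'
    '</entity></fetch>'
)

page, cookie_attr, rows = 1, "", []
while True:
    body = requests.get(
        f"{BASE}/salesorderdetails",
        params={"fetchXml": FETCH.format(page=page, cookie=cookie_attr)},
        headers=HEADERS,
    ).json()
    rows.extend(body["value"])

    annotation = body.get("@Microsoft.Dynamics.CRM.fetchxmlpagingcookie")
    if not annotation:
        break  # no annotation means no further pages

    # The annotation wraps a double URL-encoded cookie; unwrap it, then
    # XML-escape it so it can sit inside the paging-cookie attribute.
    raw = ET.fromstring(annotation).attrib["pagingcookie"]
    cookie = escape(unquote(unquote(raw)), {'"': "&quot;"})
    cookie_attr = f' paging-cookie="{cookie}"'
    page += 1
```

Treat the decoding step as an assumption to verify against your own responses; the exact encoding of the returned cookie is the part that most often trips people up.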

Tips and Tricks

Here are some tips for working with large datasets in Power Automate:

Use filtering to reduce the size of the dataset: When working with large datasets, it’s often a good idea to filter the data to reduce the amount of information you’re working with. Use the “Filter array” action in Power Automate to apply conditions to your data and remove any unnecessary records.
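To make the benefit concrete, here is a hedged Python sketch against the Dataverse Web API (hypothetical URL, token, and filter condition). Filtering in the query itself means only matching rows ever cross the wire; a Filter array-style approach transfers every row first and discards the rest locally.

```python
import requests

BASE = "https://yourorg.crm.dynamics.com/api/data/v9.2"  # hypothetical
HEADERS = {"Authorization": "Bearer <access-token>"}

# Filter at the source: only matching rows are transferred.
filtered = requests.get(
    f"{BASE}/salesorderdetails",
    params={"$select": "productid,quantity", "$filter": "quantity gt 100"},
    headers=HEADERS,
).json()["value"]

# Filter after retrieval (what "Filter array" does): every row is
# transferred first, then discarded locally. Simplified; a real call
# would also need the pagination handling from the options above.
all_rows = requests.get(f"{BASE}/salesorderdetails", headers=HEADERS).json()["value"]
filtered_late = [r for r in all_rows if r.get("quantity", 0) > 100]
```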

Use pagination: If you’re working with a very large dataset, you may need to break it up into smaller chunks to process it efficiently. Use the “Top count” and “Skip count” options to paginate your data and work on smaller subsets at a time.
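As a sketch of that pattern, assuming a generic OData endpoint that supports both $top and $skip (not all services do; Dataverse, for instance, relies on skip tokens instead, as in Option 2). The URL and the per-chunk handler below are hypothetical.

```python
import requests

URL = "https://example.com/odata/orders"  # hypothetical endpoint
PAGE_SIZE = 500

def process(batch):
    # Hypothetical per-chunk handler; replace with your own logic.
    print(f"processing {len(batch)} records")

skip = 0
while True:
    batch = requests.get(
        URL, params={"$top": PAGE_SIZE, "$skip": skip}
    ).json()["value"]
    if not batch:
        break  # an empty page means we've consumed the dataset
    process(batch)
    skip += PAGE_SIZE
```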

 

Optimize your data connectors: The performance of your data connectors can have a big impact on the speed of your workflows. Make sure you’re using the most efficient data connectors available and consider using premium connectors if necessary.

 

Use parallel processing: If you’re working with a large dataset and need to perform multiple actions on it, consider running them in parallel, for example with parallel branches or by raising the concurrency setting on an Apply to each loop. This can speed up your workflows and reduce overall processing time.

 

Monitor resource consumption: When working with large datasets, it’s important to monitor resource consumption to ensure that your workflows don’t exceed the capacity of your environment. Keep an eye on resource usage and consider optimizing your workflows if you notice any issues.

 

By following these tips, you can optimize your Power Automate workflows and effectively work with large datasets.

I hope you found the article on Working with Large Datasets in Power Automate helpful. Please feel free to share it with your colleagues if you think it could be useful to them as well.
