Power Automate - Using Flows
Answered

How to prevent Excel Scripts from timing out (504 BadGateway) in Flow for large CSV files?

Posted by J_Taylor

I’ve been building a Power Automate flow that ingests large CSV files (5,000+ rows) and transforms them into structured Excel tables using Office Scripts. My original script parsed the CSV and wrote rows one by one, then applied removeDuplicates, AutoFilter, and table creation directly in the workbook.
This worked for small files but consistently failed for larger datasets with the error:

504 BadGateway
Your Office Script has timed out. Please try again.
The Run script action in Power Automate has a ~120-second synchronous timeout, and my script was hitting that limit due to excessive workbook I/O.
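
For reference, here is a simplified version of my original per-row approach (csvData comes in as a script parameter from the flow; the parsing is naive and ignores quoted fields):

```typescript
function main(workbook: ExcelScript.Workbook, csvData: string) {
  const sheet = workbook.getWorksheet("Sheet1") ?? workbook.addWorksheet("Sheet1");
  const rows = csvData.split("\n");
  // One workbook call per cell: tens of thousands of round trips for a
  // 5,000-row file, which is what blows past the ~120-second limit.
  for (let r = 0; r < rows.length; r++) {
    const cells = rows[r].split(",");
    for (let c = 0; c < cells.length; c++) {
      sheet.getCell(r, c).setValue(cells[c]);
    }
  }
}
```
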
I needed a solution that:
  • Handles large datasets without timing out.
  • Performs deduplication, filtering, and cleaning.
  • Creates a final Excel table for downstream processes.

Has anyone solved this problem or can share best practices for optimizing Office Scripts in Power Automate?

  • Verified answer by J_Taylor
    Yes—this is a common issue caused by the connector’s synchronous timeout and scripts that make too many workbook calls. The key is to minimize workbook I/O and batch operations. Here’s what worked for me:

    Core Principles

    • Parse once → transform in memory → write once (or in batches).
    • Use Range.setValues() for bulk writes instead of per-cell updates.
    • Batch large writes by cells, not rows (e.g., cellsPerBatch = 10,000).
    • Perform deduplication, filtering, and text cleaning in memory before writing.
    • Use explicit sheet references (getWorksheet("Sheet1")) instead of active-sheet APIs.
    • Refresh tables safely using getTables() and addTable() (no unsupported async calls); a combined sketch of these principles follows this list.
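
    Putting these together, here is a minimal sketch of the parse-and-write stage. It is not my exact script: the sheet name, the cellsPerBatch value, and the Import table name are the ones I used, but the CSV parsing is deliberately naive (it assumes rectangular data with no quoted fields).

    ```typescript
    function main(workbook: ExcelScript.Workbook, csvData: string) {
      // Explicit sheet reference; create the sheet if it doesn't exist yet.
      const sheet = workbook.getWorksheet("Sheet1") ?? workbook.addWorksheet("Sheet1");

      // Parse once: build the whole 2D array in memory before touching the workbook.
      const data: string[][] = csvData
        .split(/\r?\n/)
        .filter(line => line.length > 0)
        .map(line => line.split(","));

      // Batch by cells, not rows, so wide files get proportionally fewer rows per write.
      const columnCount = data[0].length;
      const cellsPerBatch = 10000;
      const rowsPerBatch = Math.max(1, Math.floor(cellsPerBatch / columnCount));

      // Bulk writes with setValues() instead of one workbook call per cell.
      for (let start = 0; start < data.length; start += rowsPerBatch) {
        const batch = data.slice(start, start + rowsPerBatch);
        sheet.getRangeByIndexes(start, 0, batch.length, columnCount).setValues(batch);
      }

      // Refresh the table safely: delete any existing "Import" table, then re-add it.
      for (const table of workbook.getTables()) {
        if (table.getName() === "Import") {
          table.delete();
        }
      }
      const tableRange = sheet.getRangeByIndexes(0, 0, data.length, columnCount);
      workbook.addTable(tableRange, true).setName("Import");
    }
    ```

    Batching by cells rather than rows keeps each setValues() payload roughly the same size no matter how wide the file is.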

    Final Pipeline (T7)

    • Parse CSV → deduplicate → exclude primary/preparatory → drop blank HOMECLASS → clean text (titles, YEAR normalization, HOMECLASS trimming) → batched write → create table Import (the transform stage is sketched below).
    • Processes 5,000+ rows in under 10 seconds of script time, well within the connector’s limit.
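
    The transform stage runs entirely on string[][] in memory, so it adds almost no script time. It looks something like the sketch below: YEAR and HOMECLASS are my column names, but the exact exclusion and cleaning rules shown here are illustrative placeholders (the full versions are in the reports below).

    ```typescript
    // In-memory transform: no workbook calls here. The dedupe key, the
    // primary/prep test, and the YEAR zero-stripping are illustrative only.
    function transform(data: string[][]): string[][] {
      const header = data[0];
      const yearCol = header.indexOf("YEAR");
      const homeClassCol = header.indexOf("HOMECLASS");

      const seen = new Set<string>();
      const out: string[][] = [header];

      for (const row of data.slice(1)) {
        const key = row.join("\u0001"); // deduplicate on the whole row
        if (seen.has(key)) continue;
        seen.add(key);

        const homeClass = row[homeClassCol].trim();
        if (homeClass === "") continue; // drop rows with a blank HOMECLASS

        const year = row[yearCol].trim().toUpperCase();
        if (year === "P" || year.startsWith("PREP")) continue; // exclude primary/prep

        const cleaned = row.map(cell => cell.trim()); // basic text cleaning
        cleaned[yearCol] = year.replace(/^0+/, "");   // normalize YEAR, e.g. "07" -> "7"
        cleaned[homeClassCol] = homeClass;
        out.push(cleaned);
      }
      return out;
    }
    ```

    In the full script, main calls this between parsing and the batched write.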

    Microsoft Learn References

    I’ve documented everything in detail, including test results, best-practice checklists, and full scripts (T4–T7), in these reports:
    • Report 1: Performance testing and batching strategy (S0–S6).
    • Report 2: Full pipeline implementation with deduplication, filtering, cleaning, and table creation.
