
Run Your Recipe Against 500 Rows, Not One at a Time

JieGou's batch execution engine lets you upload a data table, run any recipe across every row with configurable concurrency, track progress in real time, and export results.

JieGou Team · 3 min read

Running a recipe once is useful. Running it against 500 leads, 200 support tickets, or 1,000 product descriptions is where AI automation pays for itself.

JieGou’s batch execution engine takes any recipe and runs it across an entire data table — with progress tracking, error handling, and export built in.

How it works

1. Prepare your data

Upload a CSV or paste data directly into the batch execution interface. Each row becomes one recipe execution. Column headers map to the recipe’s input fields.

If your recipe expects company_name, industry, and website, your CSV needs those three columns. JieGou validates the mapping before execution starts, so you catch schema mismatches upfront rather than 200 rows in.
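The mapping check boils down to comparing the CSV's headers against the recipe's input fields. Here is a minimal sketch in plain Python with illustrative column names (JieGou runs this validation for you; this just shows the idea):

```python
import csv
import io

def validate_mapping(headers, expected_inputs):
    """Return (missing, extra): inputs with no matching column,
    and columns the recipe would ignore."""
    headers, expected = set(headers), set(expected_inputs)
    return expected - headers, headers - expected

# Illustrative upload: 'site' was used where the recipe expects 'website'.
sample = "company_name,industry,site\nAcme,Manufacturing,acme.example\n"
header_row = next(csv.reader(io.StringIO(sample)))
missing, extra = validate_mapping(
    header_row, ["company_name", "industry", "website"]
)
print(missing, extra)  # {'website'} {'site'}
```

Catching that mismatch before row 1 runs is exactly what the upfront validation buys you.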

2. Configure and run

Select the recipe, review the column-to-input mapping, and set your concurrency level. Higher concurrency means faster completion but more parallel LLM calls (subject to your account’s concurrency limit).

Before you hit run, the cost estimator shows the projected token cost based on the recipe’s historical per-run usage multiplied by your row count. No surprises on the bill.
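The estimate itself is simple arithmetic: historical average tokens per run, multiplied by row count, priced at the model's rate. A minimal sketch with made-up numbers (the per-1K pricing and token figures below are purely illustrative):

```python
def estimate_cost(avg_tokens_per_run, row_count, price_per_1k_tokens):
    """Projected spend for a batch: per-run token usage scaled by row count."""
    total_tokens = avg_tokens_per_run * row_count
    return total_tokens / 1000 * price_per_1k_tokens

# e.g. a recipe averaging 2,400 tokens/run across 500 rows at $0.003/1K tokens
print(round(estimate_cost(2400, 500, 0.003), 2))  # 3.6
```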

3. Track progress

A real-time progress view shows:

  • How many rows have completed, are running, or are pending
  • Success and error counts
  • Per-row status with expandable output previews

You don’t need to watch the whole run. Navigate away and come back — progress persists.
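Conceptually, a run like this is bounded parallelism plus per-row status tracking. A rough sketch of the pattern (not JieGou's actual implementation) using a Python thread pool:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_batch(rows, run_recipe, concurrency=5):
    """Run `run_recipe` over every row with at most `concurrency`
    executions in flight, tallying successes and errors as they finish."""
    successes, errors, results = 0, 0, []
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        futures = {pool.submit(run_recipe, row): row for row in rows}
        for fut in as_completed(futures):
            row = futures[fut]
            try:
                results.append((row, fut.result(), "success"))
                successes += 1
            except Exception as exc:  # a failed row doesn't stop the batch
                results.append((row, str(exc), "error"))
                errors += 1
    return results, successes, errors
```

One design point worth noting: each row's failure is caught and recorded rather than raised, which is what lets the batch keep going and report error counts instead of dying on the first bad row.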

4. Filter and export

Once the batch completes (or while it’s still running), filter results by status: show only successes, only errors, or everything. This makes it easy to identify and retry failed rows without sifting through hundreds of successful ones.

Export the full results as CSV or JSON. The export includes every input field, every output field, the execution status, and any error messages — ready for import into your next system.
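Because the export carries a status column and error messages, isolating failures downstream is a one-line filter. A sketch against a hypothetical export (column names illustrative, not JieGou's exact schema):

```python
import csv
import io

# A toy stand-in for an exported batch-results CSV.
export = """company_name,status,error,summary
Acme,success,,Enriched
Globex,error,rate limit,
Initech,success,,Enriched
"""

rows = list(csv.DictReader(io.StringIO(export)))
failed = [r for r in rows if r["status"] == "error"]

# Fix the inputs for these rows, then re-run only them.
print([r["company_name"] for r in failed])  # ['Globex']
```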

When to use batch execution

Lead enrichment. Upload a list of company names and websites. Run a research recipe that pulls company size, industry, recent news, and competitive positioning. Export the enriched data back to your CRM.

Content generation. Feed in a product catalog with names, descriptions, and target audiences. Run a recipe that generates SEO-optimized product descriptions, social media posts, or email copy for each product.

Document processing. Upload a batch of contract summaries, support tickets, or meeting notes. Run an extraction recipe that pulls key fields — dates, action items, risk flags — into structured data.

Quality assurance. Take a sample of AI-generated outputs from your production workflows and run them through an evaluation recipe that scores quality, checks for hallucinations, or verifies adherence to brand guidelines.

Tips for effective batch execution

Start small. Run 10 rows first. Check the outputs. If the recipe needs prompt adjustments, it’s cheaper to discover that on 10 rows than 500.

Use the right model. Batch runs multiply costs. If a cheaper model (Haiku instead of Opus) produces acceptable output for your use case, the savings are significant at scale. Use a bakeoff to verify before committing.

Handle errors gracefully. Some rows will fail — bad input data, rate limits, edge cases the recipe doesn’t handle. Filter for errors after the run, fix the input data, and re-run just the failed rows.

Batch execution is available on Pro and Enterprise plans. Start your free trial.

Tags: batch-execution · data-tables · bulk-operations · productivity