---
title: "Truncated Logs"
description: "Understanding and resolving truncated flow run logs"
icon: "file-lines"
---

## Overview
If you see `(truncated)` in the flow run logs, it means the logs have exceeded the maximum allowed file size.
## How It Works
The log file of a run is currently limited to a maximum size. When this limit is reached, the engine automatically removes the largest keys from the JSON output until it fits within the allowed size.

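To make the trimming behavior concrete, here is an illustrative TypeScript sketch, not the engine's actual implementation: the function name `truncateLargestKeys`, the `maxBytes` parameter, and the `(truncated)` placeholder string are assumptions modeled on the behavior described above.

```typescript
// Illustrative sketch: repeatedly replace the largest top-level values
// in a JSON object with "(truncated)" until the serialized output fits
// within maxBytes. This is NOT the actual Activepieces engine code.
function truncateLargestKeys(
  output: Record<string, unknown>,
  maxBytes: number,
): Record<string, unknown> {
  const result = { ...output };
  while (Buffer.byteLength(JSON.stringify(result)) > maxBytes) {
    // Find the key whose serialized value is currently the largest.
    let largestKey: string | undefined;
    let largestSize = 0;
    for (const [key, value] of Object.entries(result)) {
      if (value === '(truncated)') continue; // already trimmed
      const size = Buffer.byteLength(JSON.stringify(value) ?? '');
      if (size > largestSize) {
        largestSize = size;
        largestKey = key;
      }
    }
    if (largestKey === undefined) break; // nothing left to trim
    result[largestKey] = '(truncated)';
  }
  return result;
}
```

This is why a truncated run is still marked successful: only the stored log payload is trimmed, after the steps have already executed.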
<Note>
**This does not affect flow execution.** Your flow will continue to run normally even when logs are truncated.
</Note>
## Known Limitation
There is one known issue with truncated logs:
If you **pause** a flow, then **resume** it, and the resumed step references data from a truncated step, the flow will fail because the referenced data is no longer available in the logs.
## Solution
Set the `AP_MAX_FILE_SIZE_MB` environment variable to a higher value to allow larger log files:

```bash
AP_MAX_FILE_SIZE_MB=50
```
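If you deploy Activepieces with Docker Compose, the variable goes in the service's `environment` section. A minimal sketch, assuming your service is named `activepieces`:

```yaml
# Hypothetical docker-compose.yml excerpt; adjust the service name
# and value to match your deployment.
services:
  activepieces:
    environment:
      - AP_MAX_FILE_SIZE_MB=50
```

After changing the value, restart the container so the engine picks up the new limit.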
<Info>
**Future Improvement:** There is a planned enhancement to change this limit from per-log-file to per-step, which will provide more granular control over log sizes. This feature is currently in the planning phase.
</Info>