Available starting with FlowX.AI 5.6.0

Conversational workflows require the Chat component for user interaction.

Overview

Conversational workflows are a specialized workflow type designed for multi-turn chat interactions. Unlike output-focused workflows, which process structured input/output, conversational workflows manage ongoing dialogue between users and AI agents: handling message exchange, session memory, and response routing. When creating a workflow in the Integration Designer, you choose the workflow type: Chat Driven or Output Focused. This choice is permanent and cannot be changed after creation.
Conversational workflow canvas with Start, Custom Agent, and End Flow nodes

Session memory

Automatically persist and retrieve conversation history across messages within a session

Dedicated Start node

The Start node provides Chat Session ID and User Message fields for receiving chat input

Chat replies

AI agent nodes send responses directly to the Chat component in real time

Intent routing

Classify user messages and route to appropriate workflow branches using the Intent Classification node

Chat Driven vs Output Focused workflows

| Aspect | Chat Driven | Output Focused |
| --- | --- | --- |
| Purpose | Multi-turn dialogue with users | Structured input/output processing |
| Start node | Chat Session ID + User Message fields | Standard Start node (JSON input) |
| Memory | Built-in session memory | No memory |
| Response delivery | Direct chat reply from Custom Agent nodes | Output on End node |
| Data model | Input/Output tabs hidden | Full data model access |
| Integration | Chat component only | Process actions, subworkflows, API |
The workflow type cannot be changed after creation. Choose the appropriate type when creating the workflow.

How it works

1. User sends a message: The Chat component sends the user's message and session ID to the workflow.
2. Memory retrieval: The Start node retrieves session memory (the latest 3 message turns plus a summary of earlier conversation history).
3. AI processing: The workflow processes the message through AI nodes. Nodes with Use Memory enabled receive the conversation history as context for their LLM calls.
4. Response delivery: The Custom Agent node with Send as Chat Reply enabled sends its response directly to the Chat component in Markdown format.
5. Memory update: The system stores the user message and AI response in session memory, updating the conversation summary if needed.
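The five steps above can be sketched as a single turn-handling function. This is an illustrative sketch only: `store`, `call_llm`, and `send_chat_reply` are stand-ins for platform internals, not FlowX APIs.

```python
def handle_turn(store, session_id, user_message, call_llm, send_chat_reply):
    """Process one chat turn, mirroring steps 1-5 above (illustrative)."""
    # Steps 1-2: receive the message and retrieve session memory.
    history = store.setdefault(session_id, [])
    # Step 3: process the message with the conversation history as LLM context.
    reply = call_llm(user_message, history)
    # Step 4: deliver the response to the Chat component.
    send_chat_reply(reply)
    # Step 5: store the user message and AI reply back into session memory.
    history.append({"user": user_message, "agent": reply})
    return reply
```

In a real deployment the store, the LLM call, and the reply channel are all managed by the platform; the sketch only shows the ordering of the steps.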

Start node

In Chat Driven workflows, the Start node provides two dedicated input fields instead of the standard JSON editor:
| Field | Description | Notes |
| --- | --- | --- |
| Chat Session ID | Unique session identifier. Must be a valid UUID. At runtime, the Chat component generates this automatically. For testing, enter any valid UUID (e.g., 550e8400-e29b-41d4-a716-446655440000). | Required. Used for memory retrieval and storage |
| User Message | The user's message text | Required. Referenced in nodes using ${userMessage} |
At runtime, both fields are populated automatically by the Chat component. When testing manually via Run Workflow, you enter values directly in these fields on the Start node.
The Chat Session ID must be a valid UUID. Using a plain string (e.g., test-session-1) causes a runtime error: Invalid UUID string.
The Start node in Chat Driven workflows is not a separate node type — it is the same Start node with a different layout tailored for chat input.
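The UUID requirement can be checked before triggering a run. A minimal Python sketch of such a check (illustrative only, not FlowX code):

```python
import uuid

def validate_chat_session_id(session_id: str) -> str:
    """Return the session ID if it parses as a valid UUID.

    Mirrors the runtime behavior described above: a plain string such as
    'test-session-1' is rejected with an "Invalid UUID string" error.
    """
    try:
        return str(uuid.UUID(session_id))
    except ValueError:
        raise ValueError(f"Invalid UUID string: {session_id!r}")
```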

Custom Agent node

In Chat Driven workflows, the Custom Agent node has additional configuration options. The full layout from top to bottom:
Conversational workflow with Custom Agent configuration panel

Operation Prompt

The system prompt for the LLM. Use ${userMessage} to reference the user’s message.

Use Memory

When enabled:
  • The node includes the conversation history (retrieved via session ID) in the LLM prompt
  • Memory consists of the latest 3 message turns plus a summary of earlier messages
  • The session ID is sent to the AI platform, which attaches the conversation context to the prompt

Settings

  • MCP Servers — Select MCP tools available to the agent
  • Knowledge Base — Connect a knowledge base for RAG-powered responses

Response

Send as Chat Reply

When enabled:
  • The node’s output is sent directly to the Chat component as a Markdown-formatted response
  • The Response Schema field is hidden (the LLM is instructed to return plain text)
  • A Chat Response tag appears on the node header
  • The response triggers a memory update (stores the user message + AI reply and iterates on the conversation summary)
At least one Custom Agent node in the workflow must have Send as Chat Reply enabled. If no node sends a chat reply, the console log displays an error.
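This requirement amounts to a simple structural check over the workflow's nodes. A hypothetical sketch of such a check (the node dictionaries and field names here are illustrative, not the FlowX validator):

```python
def find_missing_chat_reply(nodes):
    """Return an error message if no Custom Agent node has Send as Chat
    Reply enabled, else None. Hypothetical check, not FlowX code."""
    agents = [n for n in nodes if n.get("type") == "custom_agent"]
    if not any(n.get("send_as_chat_reply") for n in agents):
        return "No node sends a chat reply"
    return None
```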

Response Key

Always visible. Defines the key where the node output is stored in the workflow data.

Response Schema

Only visible when Send as Chat Reply is OFF. Defines the expected JSON structure of the LLM response.

Session memory

Chat Driven workflows use built-in session memory stored and managed by FlowX. On each message, the system retrieves the latest 3 user/agent message pairs in full, plus a summary of earlier exchanges. This memory is injected into the LLM’s system prompt for nodes with Use Memory enabled.

Memory Capabilities

Full guide to memory structure, summarization, debugging, and storage
The summarization runs automatically when the conversation exceeds 3 turns. Only user messages and agent responses are included — internal workflow data is not part of the memory.
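The memory shape described above (latest 3 turns verbatim, older turns summarized) can be sketched as follows. The function and the `summarize` callback are illustrative stand-ins for the platform's internal summarization step:

```python
def build_memory_context(turns, summarize):
    """Assemble the memory context described above: the latest 3
    user/agent turns verbatim plus a summary of anything older.
    Illustrative sketch, not FlowX code."""
    recent, older = turns[-3:], turns[:-3]
    return {
        "summary": summarize(older) if older else None,
        "recent_turns": recent,
    }
```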

Console log

The workflow console log includes additional information for Chat Driven workflows:
  • Input tab — Displays User Message and Chat Session ID in JSON format (read-only)
  • Output tab — Displays the chat response as text (not in JSON editor for readability)
  • Memory tab — Shows the conversation history and summary sent to the LLM for that workflow instance

Constraints

Chat Driven workflows cannot be referenced as subworkflows. The subworkflow node filters out Chat Driven workflows from the selection list.
The Start Integration Workflow action in processes filters out Chat Driven workflows. They can only be started through the Chat component.
Running a Chat Driven workflow without a user message triggers an error: “The user message is mandatory in conversational workflows.”
A Chat Driven workflow requires an End Flow node to complete the execution path. The End Flow node is simplified (header only, no body configuration) since responses are sent from Custom Agent nodes. The End Flow node is not auto-created — you must add it manually from the node palette.
Only Chat Driven workflows can be integrated into the Chat component. Output Focused workflows are filtered out from the Chat component workflow selection.

Navigate in UI Flow node

The Navigate in UI Flow node is an action node exclusive to Chat Driven workflows. It navigates the user to a specific screen in a UI Flow, passing dynamic parameters, which allows the AI conversation to open forms, dashboards, or any UI Flow destination with contextual data.
Navigate in UI Flow node configuration with UI Flow, Destination, and Parameters
This node only appears in the Actions category when editing a Chat Driven workflow. It is not available in Output Focused workflows.

Configuration

UI Flow (select, required)
Select the UI Flow resource to navigate to. Only UI Flows in the current project and its dependencies are available.

Destination (select, required)
The specific screen (root view) within the selected UI Flow. The destination list is fetched dynamically and displays the navigation tree of the UI Flow. Changing the destination resets all configured parameters.

Parameters (key-value[])
Dynamic parameters passed to the destination screen as query parameters. The available parameters are defined by the destination's query parameter configuration.
  • Use ${expression} syntax to map workflow data to parameters
  • Required parameters are marked with an asterisk (*)
  • Default values are shown when configured on the destination
Example: ${customerData.accountId} maps the workflow's customerData.accountId to the destination's query parameter.
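The ${expression} mapping is a lookup into nested workflow data. A minimal sketch of how such resolution might work (illustrative only, not the FlowX resolver):

```python
import re

def resolve_placeholders(template, data):
    """Replace ${path.to.key} placeholders with values from nested
    workflow data. Illustrative sketch of the mapping described above."""
    def lookup(match):
        value = data
        for part in match.group(1).split("."):
            value = value[part]  # walk one key per dotted segment
        return str(value)
    return re.sub(r"\$\{([^}]+)\}", lookup, template)
```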

How it works

When the node executes:
  1. Validates that the workflow is Chat Driven (fails with error otherwise)
  2. Resolves ${expression} placeholders in parameter values using the current workflow context
  3. Sends a navigation command to the Chat component via the events gateway
  4. The Chat component opens the target UI Flow screen with the resolved parameters
If any required parameter resolves to an empty value after placeholder resolution, the node fails with an error. Verify that the referenced workflow data keys contain values at runtime.
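The required-parameter check described above can be sketched as a post-resolution validation step. A hypothetical helper, not FlowX code:

```python
def check_required_parameters(resolved, required):
    """Fail if any required destination parameter resolved to an empty
    value, mirroring the node behavior described above (hypothetical)."""
    missing = [name for name in required if not resolved.get(name)]
    if missing:
        raise ValueError(f"Required parameters resolved empty: {missing}")
    return resolved
```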

Testing conversational workflows

When testing via Run Workflow, the test modal provides two input fields:
  • Chat Session ID — A valid UUID that identifies the test conversation session (e.g., 550e8400-e29b-41d4-a716-446655440000). Required. Used for memory storage and retrieval.
  • User Message — The test message to send (e.g., “What is my account balance?”)
These replace the standard JSON editor used in Output Focused workflows.
When testing, use the same Chat Session ID across multiple Run Workflow executions to test multi-turn memory — the system retrieves previous messages from that session.
Run Workflow test modal with Chat Session ID and User Message fields
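The multi-turn testing pattern can be simulated in a few lines: two executions share one session ID, so the second turn sees the first turn's history. Everything here (the in-memory store, the echo "agent") is an illustrative stand-in for the platform:

```python
store = {}
SESSION = "550e8400-e29b-41d4-a716-446655440000"

def run_workflow(session_id, user_message):
    """Simulate one Run Workflow execution against a shared session."""
    history = store.setdefault(session_id, [])
    reply = f"(turn {len(history) + 1}) you said: {user_message}"
    history.append({"user": user_message, "agent": reply})
    return reply

first = run_workflow(SESSION, "What is my account balance?")
second = run_workflow(SESSION, "And my last transaction?")
# `second` is produced with one prior turn already stored for the session
```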

Setting up a conversational workflow

1. Create a new workflow: In the Integration Designer, click + to create a new workflow. Enter a name and select Chat Driven as the workflow type.
2. Review the Start node: The Start node is created automatically with Chat Session ID and User Message fields pre-configured.
3. Add AI processing nodes: Add Custom Agent nodes or Intent Classification nodes to process user messages.
4. Enable Chat Reply: On the Custom Agent node that generates the final response, toggle Send as Chat Reply to ON.
5. Enable Memory (optional): On nodes that need conversation context, toggle Use Memory to ON.
6. Add an End Flow node: Add an End Flow node from the node palette and connect it to the final node in your workflow. The End Flow node has no body configuration in Chat Driven workflows.
7. Integrate with Chat component: In your UI Flow, add a Chat component and select the Chat Driven workflow.

Chat component

Technical reference for runtime behavior, session management, and SDK integration

AI node types

Overview of all available AI workflow node types

Chat interface

Conceptual overview of adding conversational AI to your apps

Knowledge Base

Connect knowledge bases for context-aware AI responses
Last modified on March 25, 2026