# Workflow

## Project Overview
This is the workflow graph engine module of Dify, implementing a queue-based distributed workflow execution system. The engine handles agentic AI workflows with support for parallel execution, node iteration, conditional logic, and external command control.
## Architecture

### Core Components
The graph engine follows a layered architecture with strict dependency rules:
1. **Graph Engine** (`graph_engine/`) - Orchestrates workflow execution
   - **Manager** - External control interface for stop/pause/resume commands
   - **Worker** - Node execution runtime
   - **Command Processing** - Handles control commands (abort, pause, resume)
   - **Event Management** - Event propagation and layer notifications
   - **Graph Traversal** - Edge processing and skip propagation
   - **Response Coordinator** - Path tracking and session management
   - **Layers** - Pluggable middleware (debug logging, execution limits)
   - **Command Channels** - Communication channels (InMemory, Redis)

2. **Graph** (`graph/`) - Graph structure and runtime state
   - **Graph Template** - Workflow definition
   - **Edge** - Node connections with conditions
   - **Runtime State Protocol** - State management interface

3. **Nodes** (`nodes/`) - Node implementations
   - **Base** - Abstract node classes and variable parsing
   - **Specific Nodes** - LLM, Agent, Code, HTTP Request, Iteration, Loop, etc.

4. **Events** (`node_events/`) - Event system
   - **Base** - Event protocols
   - **Node Events** - Node lifecycle events

5. **Entities** (`entities/`) - Domain models
   - **Variable Pool** - Variable storage
   - **Graph Init Params** - Initialization configuration
### Key Design Patterns

#### Command Channel Pattern

External workflow control via Redis or in-memory channels:
```python
# Send stop command to running workflow
channel = RedisChannel(redis_client, f"workflow:{task_id}:commands")
channel.send_command(AbortCommand(reason="User requested"))
```
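The idea behind the pattern can be sketched with the in-memory variant. This is a minimal illustration only: the class and method names here (`InMemoryChannel`, `fetch_commands`) are assumptions standing in for the engine's actual API.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class AbortCommand:
    reason: str


class InMemoryChannel:
    """Queue-backed channel: producers enqueue commands, the engine polls them."""

    def __init__(self):
        self._queue = deque()

    def send_command(self, command):
        self._queue.append(command)

    def fetch_commands(self):
        # Drain all pending commands, oldest first
        commands = list(self._queue)
        self._queue.clear()
        return commands


channel = InMemoryChannel()
channel.send_command(AbortCommand(reason="User requested"))
pending = channel.fetch_commands()
```

The Redis-backed channel follows the same contract, so the engine's polling loop is unaware of which transport is in use.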
#### Layer System

Extensible middleware for cross-cutting concerns:
```python
engine = GraphEngine(graph)
engine.layer(DebugLoggingLayer(level="INFO"))
engine.layer(ExecutionLimitsLayer(max_nodes=100))
```
#### Event-Driven Architecture

All node executions emit events for monitoring and integration:

- `NodeRunStartedEvent` - Node execution begins
- `NodeRunSucceededEvent` - Node completes successfully
- `NodeRunFailedEvent` - Node encounters an error
- `GraphRunStartedEvent` / `GraphRunCompletedEvent` - Workflow lifecycle
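A consumer typically dispatches on the event type. The sketch below uses local stand-in event classes mirroring the names above, not imports from the engine:

```python
from dataclasses import dataclass


@dataclass
class NodeRunStartedEvent:
    node_id: str


@dataclass
class NodeRunSucceededEvent:
    node_id: str


def handle(event, log):
    # Dispatch on the event type, as a monitoring layer might
    if isinstance(event, NodeRunStartedEvent):
        log.append(f"start:{event.node_id}")
    elif isinstance(event, NodeRunSucceededEvent):
        log.append(f"ok:{event.node_id}")


log = []
for ev in [NodeRunStartedEvent("llm_1"), NodeRunSucceededEvent("llm_1")]:
    handle(ev, log)
```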
#### Variable Pool

Centralized variable storage with namespace isolation:
```python
# Variables scoped by node_id
pool.add(["node1", "output"], value)
result = pool.get(["node1", "output"])
```
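The namespace-isolation idea can be shown with a toy pool keyed by `node_id`. This is a sketch only; the real `VariablePool` has a richer API than this:

```python
class VariablePool:
    """Toy variable pool: values are scoped by the first selector segment (node_id)."""

    def __init__(self):
        self._store = {}

    def add(self, selector, value):
        # selector is [node_id, variable_name]
        node_id, name = selector
        self._store.setdefault(node_id, {})[name] = value

    def get(self, selector):
        node_id, name = selector
        return self._store.get(node_id, {}).get(name)


pool = VariablePool()
pool.add(["node1", "output"], "hello")
result = pool.get(["node1", "output"])
missing = pool.get(["node2", "output"])  # different namespace, unaffected
```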
## Import Architecture Rules

The codebase enforces strict layering via import-linter:
1. **Workflow Layers** (top to bottom):
   - graph_engine → graph_events → graph → nodes → node_events → entities

2. **Graph Engine Internal Layers**:
   - orchestration → command_processing → event_management → graph_traversal → domain

3. **Domain Isolation**:
   - Domain models cannot import from infrastructure layers

4. **Command Channel Independence**:
   - InMemory and Redis channels must remain independent
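For reference, a workflow-layers rule like the first one is typically expressed as an import-linter `layers` contract. The fragment below is a hypothetical sketch (module paths and root package assumed), not the project's actual configuration:

```toml
# Hypothetical import-linter layers contract in pyproject.toml
[tool.importlinter]
root_package = "core"

[[tool.importlinter.contracts]]
name = "Workflow layers"
type = "layers"
layers = [
    "core.workflow.graph_engine",
    "core.workflow.graph_events",
    "core.workflow.graph",
    "core.workflow.nodes",
    "core.workflow.node_events",
    "core.workflow.entities",
]
```

With a `layers` contract, any import from a lower layer into a higher one fails the lint.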
## Common Tasks

### Adding a New Node Type
1. Create the node class in `nodes/<node_type>/`
2. Inherit from `BaseNode` or an appropriate base class
3. Implement the `_run()` method
4. Register it in `nodes/node_mapping.py`
5. Add tests in `tests/unit_tests/core/workflow/nodes/`
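The steps above can be sketched end to end. Here `BaseNode`, the node's config shape, and the mapping table are local stand-ins for illustration; the real base class and registration differ in detail:

```python
class BaseNode:
    """Stand-in base class: subclasses implement _run()."""

    node_type = "base"

    def __init__(self, node_id, config):
        self.node_id = node_id
        self.config = config

    def run(self):
        return self._run()

    def _run(self):
        raise NotImplementedError


class UppercaseNode(BaseNode):
    """Toy node that uppercases its configured input text."""

    node_type = "uppercase"

    def _run(self):
        return {"output": self.config["text"].upper()}


# Registration table analogous to nodes/node_mapping.py
NODE_MAPPING = {UppercaseNode.node_type: UppercaseNode}

node = NODE_MAPPING["uppercase"]("n1", {"text": "hello"})
result = node.run()
```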
### Implementing a Custom Layer
1. Create a class inheriting from the `Layer` base class
2. Override the lifecycle methods: `on_graph_start()`, `on_event()`, `on_graph_end()`
3. Add it to the engine via `engine.layer()`
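A minimal layer might look like the sketch below. The `Layer` base and its hook signatures are assumed from the lifecycle methods named above:

```python
class Layer:
    """Stand-in base with the lifecycle hooks named above."""

    def on_graph_start(self): ...
    def on_event(self, event): ...
    def on_graph_end(self): ...


class EventCounterLayer(Layer):
    """Counts events observed during a single graph run."""

    def __init__(self):
        self.count = 0

    def on_graph_start(self):
        self.count = 0  # reset per run

    def on_event(self, event):
        self.count += 1


layer = EventCounterLayer()
layer.on_graph_start()
for ev in ("e1", "e2", "e3"):
    layer.on_event(ev)
layer.on_graph_end()
```

The engine would call these hooks itself once the layer is registered via `engine.layer(layer)`.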
### Debugging Workflow Execution

Enable the debug logging layer:
```python
debug_layer = DebugLoggingLayer(
    level="DEBUG",
    include_inputs=True,
    include_outputs=True,
)
```