VirtualWorkflowSynthesizer._build_features() now extracts ALL legacy
app features from AppModelConfig into the synthesized workflow.features:
- opening_statement + suggested_questions
- sensitive_word_avoidance (keywords/API moderation)
- more_like_this
- speech_to_text / text_to_speech
- retriever_resource
Previously, workflow.features was hardcoded to "{}", so all of these
features were silently dropped during a transparent upgrade. With this
change, AdvancedChatAppRunner's moderation, opening-statement, and other
feature layers work correctly for transparently upgraded legacy apps.
Made-with: Cursor
VirtualWorkflowSynthesizer.ensure_workflow() creates a real draft
workflow on first call for a legacy app, persisting it to the database.
On subsequent calls, returns the existing draft.
This is needed because AdvancedChatAppGenerator's worker thread looks
up workflows from the database by ID. Instead of hacking the generator
to skip DB lookups, we treat this as a lazy one-time upgrade: the old
app gets a real workflow that can also be edited in the workflow editor.
Verified: old chat app created on main branch ("What is 2+2?" -> "Four")
and old agent-chat app ("Say hello" -> "Hello!") both successfully
execute through the Agent V2 engine with AGENT_V2_TRANSPARENT_UPGRADE=true.
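The lazy one-time upgrade is a get-or-create pattern; a sketch, with the
session and helper names invented for illustration rather than taken from
Dify's real ORM layer:

```python
def ensure_workflow(session, app_id: str, synthesize):
    """Return the app's draft workflow, creating and persisting it once."""
    existing = session.get_draft_workflow(app_id)
    if existing is not None:
        return existing  # subsequent calls reuse the persisted draft
    # first call for a legacy app: synthesize a start -> agent-v2 -> answer
    # graph and persist it so the generator's worker thread can look it up
    workflow = synthesize(app_id)
    session.save_draft_workflow(app_id, workflow)
    return workflow
```

Because the draft is persisted rather than kept in memory, the worker
thread's DB lookup succeeds and the workflow editor can open it later.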
Made-with: Cursor
Add two feature-flag-controlled upgrade paths that allow existing apps
and LLM nodes to transparently run through the Agent V2 engine without
any database migration:
1. AGENT_V2_TRANSPARENT_UPGRADE (default: off):
When enabled, old apps (chat/completion/agent-chat) bypass legacy
Easy-UI runners. VirtualWorkflowSynthesizer converts AppModelConfig
to an in-memory Workflow (start -> agent-v2 -> answer) at runtime,
then executes via AdvancedChatAppGenerator. Falls back to legacy
path on any synthesis error.
VirtualWorkflowSynthesizer maps:
- model JSON -> ModelConfig
- pre_prompt/chat_prompt_config -> prompt_template
- agent_mode.tools -> ToolMetadata[]
- agent_mode.strategy -> agent_strategy
- dataset_configs -> context
- file_upload -> vision
2. AGENT_V2_REPLACES_LLM (default: off):
When enabled, DifyNodeFactory.create_node() transparently remaps
nodes with type="llm" to type="agent-v2" before class resolution.
Since AgentV2NodeData is a strict superset of LLMNodeData, the
mapping is lossless. With tools=[], Agent V2 behaves identically
to LLM Node.
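The remap itself is a one-line type substitution before class resolution;
a sketch in which the flag is modeled as a module constant and the function
name is illustrative:

```python
AGENT_V2_REPLACES_LLM = True  # feature flag; defaults to False in production


def resolve_node_type(node_type: str) -> str:
    """Remap "llm" to "agent-v2" before node-class resolution."""
    if AGENT_V2_REPLACES_LLM and node_type == "llm":
        # AgentV2NodeData is a strict superset of LLMNodeData, so the
        # remap is lossless; with tools=[], behavior matches LLM Node.
        return "agent-v2"
    return node_type
```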
Both flags default to False for safety; turning a flag off is an
instant rollback to the legacy path.
46 existing tests pass. Flask starts successfully.
Made-with: Cursor