Compare commits

...

731 Commits

Author SHA1 Message Date
Yeuoly 229b0e190f bump version 2025-10-30 23:21:56 +08:00
autofix-ci[bot] 09d412cf2a [autofix.ci] apply automated fixes 2025-10-30 15:18:20 +00:00
Harry 2842cbf1e1 refactor: update error handling to use BadRequest for plugin invocation errors 2025-10-30 23:16:14 +08:00
Harry e2543bcf30 refactor: remove unused error imports in TriggerManager 2025-10-30 23:05:08 +08:00
Harry 3f75aa6848 refactor: simplify error handling in TriggerManager 2025-10-30 23:04:54 +08:00
Yeuoly 57719f3ce9 fix: docker env 2025-10-30 22:21:14 +08:00
Harry 45677ac57c bump plugin daemon image version to 0.4.0-local 2025-10-30 22:19:24 +08:00
Yeuoly 4eacbf37ff bump docker compose daemon version 2025-10-30 21:22:09 +08:00
Yeuoly ef256ac276 bump version 2025-10-30 21:20:58 +08:00
hjlarry 2733e04039 fix CI 2025-10-30 20:28:47 +08:00
hjlarry e49ec82258 fix CI 2025-10-30 20:25:35 +08:00
hjlarry cf301eb1d9 fix CI 2025-10-30 20:23:22 +08:00
hjlarry 98b9ba2b2e fix CI 2025-10-30 20:22:01 +08:00
hjlarry 2126c64468 fix CI 2025-10-30 20:17:11 +08:00
hjlarry 271a1b4f98 fix CI 2025-10-30 20:10:49 +08:00
hjlarry 9be3c62c04 fix CI 2025-10-30 20:08:16 +08:00
lyzno1 04bfa235a9 fix: test 2025-10-30 19:49:08 +08:00
lyzno1 3b37ae1b4e fix: dotenv lint 2025-10-30 19:37:45 +08:00
Yeuoly c1cb93cd26 fix 2025-10-30 18:55:08 +08:00
Yeuoly 75fa161c46 apply fix 2025-10-30 18:49:06 +08:00
Yeuoly d6d82cff33 apply linter 2025-10-30 18:48:16 +08:00
lyzno1 5c266fecf9 fix: types 2025-10-30 18:36:08 +08:00
Yeuoly 7244978b24 fix 2025-10-30 18:33:46 +08:00
Yeuoly 623021dcff cleanup 2025-10-30 18:32:05 +08:00
zhsama 5af165fce9 fix: change timestamp type to integer 2025-10-30 18:24:30 +08:00
lyzno1 9503fafc53 fix 2025-10-30 18:15:03 +08:00
lyzno1 99fac21bdb fix: type 2025-10-30 18:11:52 +08:00
yessenia bc95678c5e fix(trigger): appmode type 2025-10-30 18:03:12 +08:00
lyzno1 3f34f38635 fix: types 2025-10-30 18:02:58 +08:00
lyzno1 30f771369b fix: types 2025-10-30 18:01:12 +08:00
hjlarry 20bd059a6c fix CI 2025-10-30 17:58:59 +08:00
lyzno1 f5eb406394 fix: types 2025-10-30 17:58:31 +08:00
hjlarry cbebac1d45 fix: webhook container tests 2025-10-30 17:46:59 +08:00
lyzno1 030da43ae3 fix: type 2025-10-30 17:44:43 +08:00
yessenia b7f1394403 fix(trigger): add default icon 2025-10-30 17:42:10 +08:00
lyzno1 ceb6a09387 exclude test file in tsc 2025-10-30 17:39:02 +08:00
Yeuoly 14ad800967 Revert "rm type check"
This reverts commit 34d1f86f76.
2025-10-30 17:34:45 +08:00
lyzno1 34d1f86f76 rm type check 2025-10-30 17:28:38 +08:00
lyzno1 b9b9f8eae3 fix: type 2025-10-30 17:24:36 +08:00
lyzno1 0de8596afe Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 17:14:11 +08:00
autofix-ci[bot] 4dbd26ff66 [autofix.ci] apply automated fixes 2025-10-30 09:14:08 +00:00
Yeuoly d018ef9033 apply autofix to autofix CI 2025-10-30 17:11:49 +08:00
lyzno1 979c985804 Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 17:11:08 +08:00
Yeuoly 291e9a3aee fix: ruff 2025-10-30 17:10:21 +08:00
Yeuoly 5861ca773e fix: mapping to dict 2025-10-30 17:09:40 +08:00
autofix-ci[bot] eb3b5f751a [autofix.ci] apply automated fixes 2025-10-30 09:08:11 +00:00
lyzno1 9bbfbf1c5f Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 17:06:27 +08:00
zhsama 8cbd124b80 delete: remove cron-parser unit tests 2025-10-30 17:05:41 +08:00
lyzno1 d137d0eed0 rm test 2025-10-30 17:05:27 +08:00
zhsama 58c5db3b00 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-30 17:05:14 +08:00
lyzno1 8750796f9f Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-30 16:59:56 +08:00
lyzno1 7d4bb45f94 fix(workflow): align plugin lock overlay with install availability 2025-10-30 16:55:07 +08:00
zhsama db744444f2 fix(CI): fix CI errors 2025-10-30 16:54:17 +08:00
Yeuoly b25d379ef4 cleanup 2025-10-30 16:33:39 +08:00
zhsama e1e95f7ccd fix(CI): fix CI errors 2025-10-30 16:33:04 +08:00
Yeuoly edd50420ec apply test fix 2025-10-30 16:24:43 +08:00
Yeuoly 9af8fe085b fix: apply docker template 2025-10-30 16:20:26 +08:00
zhsama ed6bb121bb fix: update types 2025-10-30 16:16:55 +08:00
zhsama 4635b99153 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-30 16:16:21 +08:00
zhsama 282fde9a04 feat(next.config): add console log removal configuration for production 2025-10-30 16:16:11 +08:00
Yeuoly e9078eedbd fix: Variable e is not accessed (reportUnusedVariable) 2025-10-30 16:14:13 +08:00
Yeuoly 501698d844 Potential fix for code scanning alert no. 243: Information exposure through an exception
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:11:32 +08:00
Yeuoly dd089b1b21 fix: coding style 2025-10-30 16:10:54 +08:00
Yeuoly 6260a1a28c fix: cycle imports 2025-10-30 16:09:39 +08:00
Yeuoly 8bc5035624 Potential fix for code scanning alert no. 211: Information exposure through an exception
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:00:57 +08:00
Yeuoly 2dbfd9ea5a Potential fix for code scanning alert no. 241: Use of a broken or weak cryptographic hashing algorithm on sensitive data
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:00:44 +08:00
Yeuoly 08e61d76d6 Potential fix for code scanning alert no. 244: Information exposure through an exception
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:00:24 +08:00
Yeuoly 447127cee4 Update api/controllers/console/app/workflow.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-30 15:59:37 +08:00
Yeuoly 49ebbd05b5 Update api/controllers/console/app/workflow.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-30 15:58:58 +08:00
Yeuoly defea962f6 Update api/controllers/console/workspace/trigger_providers.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-30 15:58:41 +08:00
Yeuoly b3866288e0 fix: docker compose 2025-10-30 15:54:25 +08:00
lyzno1 bed2ce69bb Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 15:35:39 +08:00
lyzno1 4d37d61851 feat: dim node 2025-10-30 15:34:50 +08:00
lyzno1 8a48db6d0d Improve workflow tool install flow 2025-10-30 15:34:49 +08:00
lyzno1 ff0f645e54 Fix plugin install detection for tool nodes 2025-10-30 15:34:49 +08:00
lyzno1 6e0765fbaf feat: add install check for tools, triggers and datasources 2025-10-30 15:34:49 +08:00
yessenia 1d03e0e9fc fix(trigger): hide input params when no subscription 2025-10-30 15:28:34 +08:00
Yeuoly cac60a25bb cleanup: migrations 2025-10-30 15:27:02 +08:00
Yeuoly 57c65ec625 fix: typing 2025-10-30 14:58:30 +08:00
Yeuoly ffc3c61d00 merge workflow pausing 2025-10-30 14:54:14 +08:00
Yeuoly aa3b16a136 fix: migrations 2025-10-30 14:45:26 +08:00
Yeuoly 6e0b408dd5 Merge branch 'main' into feat/trigger 2025-10-30 14:43:27 +08:00
lyzno1 be9eeff6c2 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-30 12:14:47 +08:00
hjlarry ca9d92b1e5 fix(variable): global var panel not closed when opening history mode 2025-10-30 09:53:00 +08:00
hjlarry 0607db41e5 fix(variable): draft run of workflow causes global var panel to misalign 2025-10-30 09:02:38 +08:00
hjlarry 48b1829b14 chore: improve toggle env/conversation/global var panel 2025-10-29 22:08:04 +08:00
hjlarry 6767a8f72c chore: i18n for system var 2025-10-29 21:10:26 +08:00
Harry 1e477af05f feat(trigger): add system variables to webhook node outputs
Enhanced the TriggerWebhookNode to include system variables as outputs. This change allows for better accessibility of system variables during node execution, improving the overall functionality of the webhook trigger process. A TODO comment has been added to address future improvements for direct access to system variables.
2025-10-29 18:15:36 +08:00
Harry 9b5e5f0f50 refactor(api): replace dict type hints with Mapping for improved type safety
Updated type hints in several services to use Mapping instead of dict for better compatibility with various dictionary-like objects. Adjusted credential handling to ensure consistent encryption and decryption processes across ToolManager, DatasourceProviderService, ApiToolManageService, BuiltinToolManageService, and MCPToolManageService. This change enhances code clarity and adheres to strong typing practices.
2025-10-29 18:10:38 +08:00
Harry fb12f31df2 feat(trigger): system variables for trigger nodes
Added a timestamp field to the SystemVariable model and updated the WorkflowAppRunner to include the current timestamp during execution. Enhanced node type checks to recognize trigger nodes in various services, ensuring proper handling of system variables and node outputs in TriggerEventNode and TriggerScheduleNode. This improves the overall workflow execution context and maintains consistency across node types.
2025-10-29 18:10:38 +08:00
yessenia db2c6678e4 fix(trigger): show subscription url & add readme in trigger plugin node 2025-10-29 16:16:29 +08:00
zhsama bc3421add8 refactor(variable): update global variable names and types for consistency 2025-10-29 15:53:37 +08:00
zhsama 61d8809a0f fix(workflow): enhance validation before running workflows by integrating warning notifications 2025-10-29 15:53:13 +08:00
lyzno1 d37cc9f9c8 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-29 15:16:28 +08:00
lyzno1 0db082f6d0 feat(workflow): persist RAG recommendation panel collapse state 2025-10-29 15:10:45 +08:00
lyzno1 c94dc52310 fix: remove duplicate RAG tool heading and fix select callback type 2025-10-29 14:57:38 +08:00
hjlarry bebcbfd80e chore: improve delete app related tables 2025-10-29 14:29:59 +08:00
Harry dfc5e3609d refactor(trigger): streamline OAuth client existence check
Replaced the method for checking the existence of a system OAuth client with a new dedicated method `is_oauth_system_client_exists` in the TriggerProviderService. This improves code clarity and encapsulates the logic for verifying the presence of a system-level OAuth client. Updated the TriggerOAuthClientManageApi to utilize the new method for better readability.
2025-10-29 14:22:56 +08:00
lyzno1 fa5765ae82 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-29 12:56:31 +08:00
lyzno1 852d851996 fix(workflow): add empty array validation for required checklist fields in trigger plugin
The checkValid function was not properly validating required checklist fields when they had empty array values. This caused required fields to pass validation even when no options were selected.

Added array length check to the constant type validation to ensure required checklist fields must have at least one selected option.
2025-10-29 12:36:43 +08:00
hjlarry 0b599b44b0 chore: when delete app also delete related trigger tables 2025-10-29 12:15:34 +08:00
lyzno1 f06dc3ef90 fix: localize workflow block search filters 2025-10-29 11:55:30 +08:00
lyzno1 f9df61e648 feat: add inline code copy styling for variable inspect webhook url 2025-10-29 10:14:50 +08:00
lyzno1 6e76e02dba fix: trigger plugin help link 2025-10-29 09:35:45 +08:00
zhsama dc24450e29 feat(workflow): add webhook debug URL display in variable inspection 2025-10-29 04:38:27 +08:00
zhsama 8bcecce627 feat(workflow): add toast notifications for warning nodes during execution 2025-10-29 01:40:27 +08:00
zhsama 66cb963df3 feat(workflow): enhance validation by integrating warning nodes into last run checks. 2025-10-29 01:28:31 +08:00
Harry 5c95c77604 refactor(trigger): streamline workflow argument handling in DraftWorkflowTriggerNodeApi
- Simplified retrieval of workflow arguments by directly accessing event.workflow_args.
- Removed unnecessary conditional checks for user inputs, ensuring cleaner code.
- Enhanced TriggerEventNode to use deepcopy for user inputs to prevent unintended mutations.
2025-10-29 01:04:37 +08:00
zhsama a264a609db feat(workflow): integrate workflow run validation before execution 2025-10-29 00:36:48 +08:00
zhsama 3a876fd437 Merge branch 'main' into feat/trigger 2025-10-29 00:08:50 +08:00
zhsama 13bc68a646 feat(trigger): enhance runScheduleSingleRun to handle API response 2025-10-29 00:07:08 +08:00
Harry b41538d8c7 feat(trigger): reinforce schedule trigger debugging with cron calculation
- Implemented a caching mechanism for schedule trigger debug events using Redis to optimize performance.
- Added methods to create and manage schedule debug runtime configurations, including cron expression handling.
- Updated the ScheduleTriggerDebugEventPoller to utilize the new caching and event creation logic.
- Removed the deprecated build_schedule_pool_key function from event handling.
2025-10-28 23:34:08 +08:00
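
Aside: b41538d8c7 above describes caching schedule debug events and computing cron-based run times. A minimal sketch of the "next executions" calculation, assuming the frontend's cron-parser dependency (see 8cbd124b80) and its classic parseExpression API; the function name and sample expression are illustrative, not the repo's actual poller:

```typescript
// Sketch: next N fire times for a cron expression in a fixed timezone.
// Assumes cron-parser v4 (parseExpression with a `tz` option).
import parser from 'cron-parser'

function nextExecutions(cronExpr: string, tz: string, count = 5): Date[] {
  const interval = parser.parseExpression(cronExpr, { tz })
  return Array.from({ length: count }, () => interval.next().toDate())
}

// e.g. weekdays at 09:00, as a "next 5 executions" panel might show:
// nextExecutions('0 9 * * 1-5', 'Asia/Shanghai')
```
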
zhsama 720480d05e chore(tests): remove deprecated test files for schedule and webhook triggers 2025-10-28 22:42:29 +08:00
yessenia 71b1af69c5 feat(trigger): request condition param 2025-10-28 22:35:03 +08:00
Harry 18fd79fbe6 feat(trigger): add event_name to PluginTriggerMetadata for enhanced trigger handling
- Introduced event_name attribute in PluginTriggerMetadata to improve metadata clarity.
- Updated dispatch_triggered_workflow function to include event_name when dispatching triggered workflows.
2025-10-28 18:32:06 +08:00
Harry c16421df27 refactor: improve trigger metadata handling and streamline workflow service
- Updated ScheduleTriggerDebugEventPoller to include an empty files list in workflow_args.
- Enhanced WorkflowAppService to handle trigger metadata more effectively, including a new method for processing metadata and removing the deprecated _safe_json_loads function.
- Adjusted PluginTriggerMetadata to use icon_filename and icon_dark_filename for better clarity.
- Simplified async workflow task parameters by changing triggered_from to trigger_from for consistency.
2025-10-28 17:50:06 +08:00
Harry 0d686fc6ae refactor: streamline trigger event node metadata handling and update async workflow service for JSON serialization
- Removed unnecessary input data from the TriggerEventNode's metadata.
- Updated AsyncWorkflowService to use model_dump_json() for trigger metadata serialization.
- Added a comment in WorkflowAppService to address the large size of the workflow_app_log table and the use of an additional details field.
2025-10-28 17:50:06 +08:00
yessenia db352c0a18 Merge branch 'main' into feat/trigger 2025-10-28 17:11:15 +08:00
yessenia bf7b18d442 feat(trigger): dynamic options opt 2025-10-28 16:20:01 +08:00
lyzno1 7d56ca5294 fix: add time-picker placement prop 2025-10-28 15:49:10 +08:00
zhsama 0b1015e221 feat(workflow): enhance variable inspector to support schedule trigger events with next execution time display 2025-10-28 14:12:20 +08:00
lyzno1 96f0d648fa feat: invalidate trigger plugin queries after marketplace installs 2025-10-28 11:31:02 +08:00
lyzno1 c4996f9563 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-28 11:28:06 +08:00
yessenia 850c5fec32 fix(trigger): invalid subscription 2025-10-27 21:27:08 +08:00
zhsama b1f79c34d1 fix(workflow): add support for schedule triggers in workflow run hook 2025-10-27 20:52:44 +08:00
yessenia 96a461646e fix(trigger): readme portal zindex 2025-10-27 20:05:02 +08:00
zhsama 5df94fd866 fix(workflow): enhance node-run to include schedule triggers 2025-10-27 19:44:26 +08:00
lyzno1 e074ba84d1 fix(workflow): avoid nested buttons in subscription selector to stop hydration warning 2025-10-27 17:23:58 +08:00
lyzno1 1335be8d60 Revert "feat: propagate trigger metadata for plugin icons across UI"
This reverts commit 3bd62f3fdf.
2025-10-27 17:06:40 +08:00
lyzno1 c79d75b32d Revert "fix: display plugin trigger labels in logs using i18n metadata"
This reverts commit 651cc81cfe.
2025-10-27 17:06:35 +08:00
lyzno1 f18054847e Revert "fix: workflow_trigger"
This reverts commit cc219cc81c.
2025-10-27 17:06:30 +08:00
lyzno1 b2b81f3822 Revert "fix: trigger by display translations"
This reverts commit 33daedd7aa.
2025-10-27 17:06:25 +08:00
yessenia 90753b2782 fix(trigger): readme style 2025-10-27 16:33:40 +08:00
hjlarry c05fa9963a fix plugin name incorrectly encoded 2025-10-27 16:08:47 +08:00
Harry 9de7a7d48f fix(trigger): update outputs in TriggerEventNode to use inputs directly 2025-10-27 15:35:09 +08:00
yessenia 29cddc449f fix(trigger): add clickOutsideNotClose prop 2025-10-27 14:30:08 +08:00
zhsama dfed14ba67 refactor: simplify syncWorkflowDraft parameters by removing payload sanitization 2025-10-27 13:46:27 +08:00
lyzno1 440262a51b fix: serialize workflow draft sync operations (#27487) 2025-10-27 13:29:40 +08:00
Harry d705fece9d fix(plugin): update trigger field type to allow None and add field validator for parameters in EventEntity 2025-10-27 12:02:22 +08:00
lyzno1 d08cc48368 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-27 11:37:07 +08:00
lyzno1 b94ad084c3 feat: surface featured trigger recommendations in start tab (#27319) 2025-10-27 11:33:02 +08:00
Harry 9453148233 chore(migrations): remove obsolete migration files for workflow webhook and schedule plan tables 2025-10-27 00:36:27 +08:00
Harry 1857d0e53f chore(migrations): remove obsolete migration files for workflow trigger logs, app triggers, and plugin triggers 2025-10-27 00:28:07 +08:00
Harry ae422c2628 fix(trigger): simplify return logic in TriggerProviderService by removing unnecessary None return 2025-10-26 23:48:50 +08:00
Harry d933116e46 fix(workflow): improve error handling in DraftWorkflowTriggerNodeApi by returning JSON response 2025-10-26 23:48:50 +08:00
lyzno1 5cf4afd7b2 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-26 11:54:17 +08:00
lyzno1 f7853f3b27 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-25 14:58:32 +08:00
lyzno1 913d85302c fix: hide replay button for non app-run workflow logs 2025-10-25 14:42:26 +08:00
lyzno1 33daedd7aa fix: trigger by display translations 2025-10-25 14:36:29 +08:00
lyzno1 cc219cc81c fix: workflow_trigger 2025-10-25 14:21:52 +08:00
lyzno1 945295adc3 chore: add some ja-JP translations 2025-10-25 14:00:21 +08:00
lyzno1 651cc81cfe fix: display plugin trigger labels in logs using i18n metadata 2025-10-25 12:41:37 +08:00
lyzno1 3bd62f3fdf feat: propagate trigger metadata for plugin icons across UI 2025-10-25 12:15:21 +08:00
lyzno1 e3484c8dc3 fix: ruff format 2025-10-25 12:08:22 +08:00
lyzno1 eecbe533a1 fix: ruff check 2025-10-25 12:07:46 +08:00
Harry 4221e99362 update(docker): add triggered workflow dispatcher and refresh executor to default Celery queues 2025-10-24 21:31:38 +08:00
Stream 5c69521973 feat: align with params 2025-10-24 21:20:12 +08:00
Stream ffcaa67a56 feat: align with path 2025-10-24 20:47:54 +08:00
Stream 64a070f6b0 feat: align with path 2025-10-24 19:56:06 +08:00
Stream c61656c759 fix: request param 2025-10-24 19:40:50 +08:00
Stream 6d34e4e99b fix: request path 2025-10-24 19:33:19 +08:00
Stream d3a767364b fix: request path 2025-10-24 19:15:18 +08:00
Stream f3c6d1ca1d fix: param passing 2025-10-24 18:27:23 +08:00
zhsama 6098dc0242 feat(workflow): enhance listening descriptions for plugin and webhook triggers 2025-10-24 16:18:07 +08:00
zhsama 29ec3c7d5c feat(workflow): update listening descriptions when trigger nodes start test-run 2025-10-24 15:32:23 +08:00
zhsama 4597ab4efb Merge branch 'refs/heads/main' into feat/trigger 2025-10-24 14:44:15 +08:00
lyzno1 1dddcf1194 fix(workflow): resolve occasional issues with syncing historical states or clearing DSL in draft mode (#27391) 2025-10-24 12:33:14 +08:00
Stream c4c38a51d9 fix: merge 2025-10-24 11:58:13 +08:00
yessenia 8a8c0703b1 feat: add datasource node readme 2025-10-24 11:46:58 +08:00
yessenia 1b74869b04 fix: plugin readme params 2025-10-24 10:48:59 +08:00
lyzno1 f065504ed6 fix(app-overview): soften tooltip styling 2025-10-24 10:34:16 +08:00
lyzno1 3f5485605f chore: update docs link 2025-10-24 10:31:09 +08:00
lyzno1 399bb522e0 chore: add ts-node for test 2025-10-24 10:20:58 +08:00
lyzno1 9fffa9a996 refactor(workflow): clean up entry node status and colocate store types 2025-10-24 10:20:38 +08:00
lyzno1 aee9a8366f refactor: move marketplace footer outside scroll containers 2025-10-24 10:07:02 +08:00
lyzno1 c3eec7ea8a Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-24 09:47:28 +08:00
yessenia 4b4ec3438f feat: add plugin readme 2025-10-24 01:18:58 +08:00
zhsama 9aa43c9165 feat(workflow): enhance trigger node handling with event listening and state management 2025-10-23 18:21:22 +08:00
zhsama 4ae23ed0f9 feat(workflow): remove unused trigger status logic and simplify entry node status handling 2025-10-23 18:19:06 +08:00
Yeuoly efe68d5aa6 Merge branch 'main' into feat/trigger 2025-10-23 18:05:59 +08:00
zhsama 1604db02b5 fix(workflow): change description field to be required in TriggerPluginNodePayload 2025-10-23 16:56:00 +08:00
zhsama e7192de9c0 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-23 16:46:35 +08:00
zhsama f822b38a00 feat(workflow): integrate payload sanitization for workflow draft synchronization 2025-10-23 16:45:28 +08:00
yessenia 5a5c7f38d1 fix(plugin): stop loading when uninstall fails 2025-10-23 16:18:52 +08:00
zhsama 42a9a88ae2 refactor(trigger-plugin): enhance variable type resolution and encapsulate output variable logic in a dedicated function 2025-10-23 16:01:18 +08:00
zhsama aea3fc6281 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-23 15:33:31 +08:00
zhsama bcdc11396a refactor(trigger-plugin): streamline variable type resolution and output variable construction 2025-10-23 15:33:19 +08:00
hjlarry ecd1d44d23 chore: update trigger dsl version to 0.5.0 2025-10-23 15:05:39 +08:00
hjlarry 6df786248c fix: _raw var not displayed on the panel when draft-running webhook node 2025-10-23 13:35:45 +08:00
lyzno1 37e75f7791 Ensure workflow tools tab always shows marketplace footer 2025-10-23 13:08:59 +08:00
lyzno1 7ada2385b3 feat: add toggle behavior for featured tools 2025-10-23 12:27:42 +08:00
lyzno1 a77aab96f5 fix: align all workflow trigger docs link 2025-10-23 12:17:27 +08:00
Stream 13af48800b fix: merge 2025-10-23 12:13:33 +08:00
Stream 6e7fb59638 Merge branch 'feat/plugin-readme' into feat/trigger
# Conflicts:
#	api/controllers/console/workspace/plugin.py
#	api/core/plugin/entities/plugin_daemon.py
2025-10-23 12:10:44 +08:00
lyzno1 863b4f8fe9 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-23 11:54:35 +08:00
yessenia 949ac9d930 feat(trigger): add formitem desc 2025-10-23 10:57:42 +08:00
yessenia 06d1a2e2fd feat(trigger): remove redundant comp & disable triggers api cache 2025-10-23 10:57:41 +08:00
lyzno1 d478f62b49 Optimize workflow tool sync after plugin install (#27280) 2025-10-23 09:58:54 +08:00
zhsama 128bc2241d feat(checkbox): adjust styles for checkbox component layout 2025-10-22 17:35:04 +08:00
Harry b2730d680c fix(trigger): add missing fields in TriggerEventNode configuration 2025-10-22 16:55:55 +08:00
lyzno1 52b180104a Limit workflow tool/trigger search to provider and item names 2025-10-22 16:53:23 +08:00
Harry 9cdb62da93 fix(trigger): reset inputs in TriggerEventNode to an empty dictionary 2025-10-22 15:58:43 +08:00
Joel 5af08edfda chore: add missing icon 2025-10-22 15:38:54 +08:00
Harry 24fa5f33d7 fix(trigger): update input handling in TriggerEventNode to correctly retrieve and set outputs 2025-10-22 15:37:00 +08:00
yessenia caf0bf34dd fix(trigger): add event node status & min-height in modal 2025-10-22 15:31:03 +08:00
lyzno1 181a1ae7f3 fix: hover tooltip 2025-10-22 15:09:01 +08:00
lyzno1 e18ecead2c fix: hide All tools when searching 2025-10-22 15:07:46 +08:00
lyzno1 36a26adab2 fix: install from marketplace 2025-10-22 14:59:49 +08:00
zhsama bc2edf5107 refactor(workflow): replace fetch function with base/post in use-one-step-run and use-workflow-run hooks in polling 2025-10-22 14:55:57 +08:00
lyzno1 50bbac5973 Add double-arrow icon swap on featured tools “Show more” hover 2025-10-22 14:47:19 +08:00
lyzno1 45b221659b Tighten featured tools header arrow and add “All tools” section divider 2025-10-22 14:41:18 +08:00
lyzno1 16957f14f1 fix: featured icons 2025-10-22 14:33:00 +08:00
lyzno1 0d7dde0639 Align featured tool hover layout and widen action dropdown 2025-10-22 14:19:22 +08:00
zhsama 37805184d9 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-22 12:58:15 +08:00
Yeuoly df9932088f avoid time slice strategy in community edition 2025-10-22 12:50:11 +08:00
zhsama d101a83be8 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-22 12:49:41 +08:00
lyzno1 94ea289c75 fix: suggestions tools list 2025-10-22 12:46:22 +08:00
lyzno1 e2539e91eb fix: view docs 2025-10-22 12:46:22 +08:00
lyzno1 77e9bae3ff feat(workflow): polish featured tools recommendations 2025-10-22 12:46:21 +08:00
lyzno1 d99644237b chore: align help link translations 2025-10-22 12:46:21 +08:00
lyzno1 5cb268e99b feat: suggestions ui 2025-10-22 12:46:21 +08:00
lyzno1 f179b03d6e fix: constrain rag pipeline datasource selector width 2025-10-22 12:46:21 +08:00
lyzno1 28fe58f3dd feat: try to add tools suggestions 2025-10-22 12:46:21 +08:00
Yeuoly 14acd05846 fix 2025-10-22 12:41:19 +08:00
zhsama ccce135bf5 fix(workflow): add setShowVariableInspectPanel for specific block types in useLastRun hook 2025-10-22 12:38:03 +08:00
Yeuoly cb5607fc8c refactor: TimeSliceLayer 2025-10-22 12:13:12 +08:00
Yeuoly 7f70d1de1c ASYNC_WORKFLOW_SCHEDULER_GRANULARITY 2025-10-22 12:10:12 +08:00
Yeuoly c36173f5a9 fix: typing 2025-10-22 11:55:26 +08:00
Yeuoly 7acbe981e2 fix: incorrect elapsed_time 2025-10-22 11:49:02 +08:00
Yeuoly dd6ab7c68c feat: support pausing workflow trigger log 2025-10-22 11:45:16 +08:00
Joel a1ea256e79 fix: global icon in inspect 2025-10-22 11:36:01 +08:00
Joel 14942c9ee9 fix: page crash 2025-10-22 11:28:28 +08:00
Joel b0b316ed48 fix: rag pipeline not showing sys vars 2025-10-22 11:07:08 +08:00
hjlarry 871cfbd40c fix: CredentialsSchema missing help field display 2025-10-22 09:23:59 +08:00
yessenia 9a3ca0ce3b fix(trigger): check subscription removed 2025-10-21 20:01:16 +08:00
zhsama c90df5c12c refactor(entry-node): remove showIndicator prop and related logic for cleaner component structure 2025-10-21 19:53:29 +08:00
yessenia f4acc78f66 fix(trigger): deal with empty manualPropertiesSchema 2025-10-21 19:49:42 +08:00
Yeuoly 3d5e2c5ca1 feat(trigger): add suspend/timeslice layers and workflow CFS scheduler
- add suspend, timeslice, and trigger post engine layers
- introduce CFS workflow scheduler tasks and supporting entities
- update async workflow, trigger, and webhook services to wire in the new scheduling flow
2025-10-21 19:20:54 +08:00
zhsama 55bf9196dc feat(trigger): add TriggerSchedule to node type checks for workflow execution 2025-10-21 18:57:57 +08:00
yessenia 18a52b4937 fix(trigger): subscription removed in workflow 2025-10-21 18:43:15 +08:00
Joel 439727746c fix: trigger timestamp show place 2025-10-21 18:21:37 +08:00
Joel 04b55177b5 feat: support show global vars 2025-10-21 17:59:37 +08:00
zhsama 2793ede875 feat: update checkbox component in the panel and refactor form types for checkbox and boolean 2025-10-21 17:28:25 +08:00
yessenia dc4801c014 refactor(trigger): refactor app mode type to enum 2025-10-21 16:50:18 +08:00
yessenia d5e2649608 fix(trigger): disable some options when no start node 2025-10-21 15:25:36 +08:00
Joel 4102f0bc9d feat: vars to new place 2025-10-21 15:03:16 +08:00
Joel 25e4203cb1 main 2025-10-21 14:44:24 +08:00
Joel e1a3ead941 main 2025-10-21 14:42:27 +08:00
zhsama 6251090893 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-21 14:39:10 +08:00
zhsama aa5e04b70e Merge branch 'main' into feat/trigger
# Conflicts:
#	web/app/components/workflow/hooks-store/store.ts
#	web/package.json
#	web/pnpm-lock.yaml
2025-10-21 14:36:07 +08:00
Harry 8ac25c29ee feat(trigger): implement subscription refresh logic with enhanced error handling and logging 2025-10-21 14:11:27 +08:00
Harry f4517d667b feat(trigger): enhance trigger provider refresh task with locking mechanism and due filter logic 2025-10-21 14:11:27 +08:00
Yeuoly dc2481c805 feat: docs 2025-10-21 13:56:31 +08:00
Yeuoly 8d7435a51b docs: introduce agent skill for trigger 2025-10-21 12:23:19 +08:00
lyzno1 bb28c718df fix: correct webhook trigger node id parsing 2025-10-21 11:48:50 +08:00
lyzno1 1b7a5b6209 fix: immer breaking change 2025-10-21 11:42:31 +08:00
lyzno1 448622b4fd fix: pnpm lock file 2025-10-21 11:39:39 +08:00
crazywoola e9dda03e8d fix: immer version and ref in code base (#27130) 2025-10-21 11:38:44 +08:00
lyzno1 8d3d177932 fix: pnpm lock file 2025-10-21 11:35:41 +08:00
lyzno1 f0af4d692a fix: breaking change 2025-10-21 11:32:20 +08:00
lyzno1 075173e67d fix(workflow): reset onboarding auto-open flag across flows 2025-10-21 11:19:36 +08:00
Yeuoly f02d575379 Merge branch 'main' into feat/trigger 2025-10-21 11:09:26 +08:00
yessenia 735ebf6c59 fix(trigger): oauth client params 2025-10-21 09:27:10 +08:00
Harry 96f0b7abe3 fix(trigger): handle missing 'inputs' key in trigger data retrieval 2025-10-20 21:47:49 +08:00
zhsama eb1686f04b Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-20 20:27:02 +08:00
zhsama d4b5d9a02a feat(trigger): add trigger validation logic and utility functions for improved checklist integration 2025-10-20 20:26:40 +08:00
Harry f87f77ce7b feat(trigger): add configuration for trigger provider refresh task 2025-10-20 20:02:12 +08:00
Harry 24619e74f6 fix(trigger): update error handling and credential expiration field 2025-10-20 20:02:12 +08:00
zhsama f5c1646f79 fix(dynamic-options): fix the dynamic options in plugin trigger 2025-10-20 19:34:41 +08:00
zhsama e26d77e78c fix(checklist): enhance type safety by refining BlockEnum usage in checklist components 2025-10-20 19:34:41 +08:00
yessenia 8e1e81732a fix(trigger): formitem boolean layout 2025-10-20 19:27:21 +08:00
yessenia 801f8c1592 fix(trigger): oauth client default values 2025-10-20 18:21:38 +08:00
Harry fd4b234171 feat: improve oauth client info api 2025-10-20 16:50:03 +08:00
Harry dff536ab6d feat(trigger): trigger plugin protocol improvements 2025-10-20 16:50:03 +08:00
hjlarry a152ce45d3 fix: start/stop button on the node control does not work 2025-10-20 16:43:16 +08:00
Yeuoly 6a164f8811 refactor: use EnumText 2025-10-20 15:48:11 +08:00
Yeuoly a03ff39f3e chore: add to .env.example 2025-10-20 15:42:29 +08:00
Yeuoly a6373e357a fix: typing 2025-10-20 15:38:54 +08:00
Yeuoly 538b639bef fix: unify trigger url generation 2025-10-20 15:34:51 +08:00
yessenia fe0457b257 fix(trigger): show text 2025-10-20 14:17:52 +08:00
zhsama d5b228f234 fix(end-node): adjust required status and update end node terminology to output in i18n 2025-10-20 14:00:14 +08:00
lyzno1 1c2f95eeb6 fix(migrations): chain messages.app_mode upgrade after plugin trigger 2025-10-20 13:40:37 +08:00
lyzno1 81b3436ec4 fix(trigger): resolve circular import in models 2025-10-20 09:23:11 +08:00
Yeuoly 3e4f2bcf14 optimize: TriggerDispatchResponse 2025-10-18 20:40:59 +08:00
Yeuoly c7696964b9 fix: refine 2025-10-18 20:27:22 +08:00
Yeuoly fb8ecf7b5a refactor: move out enums to specific file 2025-10-18 20:22:21 +08:00
Yeuoly e3c2345b21 fix: typing 2025-10-18 20:17:23 +08:00
Yeuoly bfe0d14409 fix: typing 2025-10-18 20:16:10 +08:00
Yeuoly c7498c3a11 fix: typing 2025-10-18 20:14:00 +08:00
Yeuoly 5fba41688a refactor: cleaning up terrible data 2025-10-18 20:12:20 +08:00
Yeuoly b63b9c32f7 refactor: models 2025-10-18 20:06:46 +08:00
Yeuoly 65c6203ad7 fix: correct building reference 2025-10-18 19:54:06 +08:00
Yeuoly 3a18337129 refactor: confused abstract class 2025-10-18 19:47:23 +08:00
Yeuoly b6b433626e fix: typing 2025-10-18 19:43:00 +08:00
Yeuoly 5d6b9b0cb1 refactor 2025-10-18 19:41:53 +08:00
Yeuoly 6d09330f98 chore: rename PluginTriggerManager to PluginTriggerClient 2025-10-18 19:33:08 +08:00
Yeuoly 5df9afa91a fix: typing 2025-10-18 19:32:08 +08:00
Yeuoly 30a341331f chore: unify request handling 2025-10-18 19:29:00 +08:00
Yeuoly 31cf4b6619 fix: query parameter does not exist in workflow 2025-10-18 19:19:36 +08:00
Yeuoly dd0da3218c feat: introduce payload field to plugin trigger processing 2025-10-18 19:15:46 +08:00
Yeuoly 11c9219848 chore: better exception handling 2025-10-18 19:15:09 +08:00
Yeuoly b1ffd2ef2b refine: use enum reference to avoid plain text declarations 2025-10-18 19:14:24 +08:00
Yeuoly 86cf7952fb refactor: add typing annotation 2025-10-18 19:13:07 +08:00
Yeuoly d790d2b6bc feat: introduce payload field to TriggerDispatchResponse and a better typing 2025-10-18 19:12:43 +08:00
Yeuoly a711a8e759 refactor: better typing 2025-10-18 19:11:50 +08:00
Yeuoly 8a18b6e13b refactor webhook service enduser operations 2025-10-18 19:11:15 +08:00
Yeuoly 95aeb61d7c fix: missing backwards invocation 2025-10-18 19:10:22 +08:00
Yeuoly e8b0144cf7 refactor: remove common end user operations out of wraps.py and move it into EndUserService 2025-10-18 19:09:55 +08:00
yessenia 2c8c1860ca fix(trigger): show event output 2025-10-18 16:28:26 +08:00
Yeuoly 5edfbd5305 fix: meaningless error messages 2025-10-18 16:27:12 +08:00
lyzno1 4ceae655bd fix: prevent selecting time text in picker 2025-10-18 15:50:15 +08:00
lyzno1 6ae76d108b feat: add cursor pointer to marketplace actions 2025-10-17 21:31:40 +08:00
lyzno1 9cc3cfb63e fix: hide footer from all start block when search not found 2025-10-17 21:28:57 +08:00
lyzno1 58e4c0793a feat: align tool selector empty state with start blocks 2025-10-17 21:25:28 +08:00
Harry 80f2c1be67 fix(trigger): enhance error handling and refactor end user creation in trigger workflows
- Improved error handling in `TriggerSubscriptionListApi` to return a 404 response for ValueErrors.
- Refactored end user creation logic in `service_api/wraps.py` to use `get_or_create_end_user` for better clarity and consistency.
- Introduced a new method `create_end_user_batch` for batch creation of end users, optimizing database interactions.
- Updated various trigger-related services to utilize the new end user handling, ensuring proper user context during trigger dispatching.
2025-10-17 21:00:57 +08:00
lyzno1 8a5174d078 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-17 19:21:15 +08:00
zhsama d0f357a690 feat(workflow): enhance listening functionality with multiple trigger node support 2025-10-17 19:09:55 +08:00
zhsama fbe3df5658 fix(plugin-detail-panel): update provider reference to use trigger identity name 2025-10-17 18:23:35 +08:00
yessenia 21e3ef91eb fix(trigger): show event detail 2025-10-17 18:23:04 +08:00
zhsama 3f116dc74b feat(variable-inspect): improve listening description resolution in Listening component 2025-10-17 18:11:26 +08:00
hjlarry 32731c4622 render other input types for autoCommonParametersSchema 2025-10-17 18:09:14 +08:00
zhsama 3c1f0e1aec fix(trigger): fix authentication status check 2025-10-17 17:13:07 +08:00
Joel 685e48636d fix: if tag show global vars problem 2025-10-17 16:57:42 +08:00
Joel 7c4edaa636 fix: variableValid in prompt editor 2025-10-17 16:48:27 +08:00
Joel 35867707d0 fix: global var type render in node 2025-10-17 15:24:49 +08:00
zhsama 5b884d750f feat(trigger): add run all triggers test-run and implement TriggerType enum 2025-10-17 14:56:05 +08:00
Harry bc0d5f4e41 fix(trigger): enhance subscription retrieval error handling in TriggerService
- Added exception handling for `get_subscription_by_endpoint` to return a 404 response when the plugin is not found and a 500 response for other errors.
- Improved overall robustness of the subscription retrieval process.
2025-10-17 14:43:43 +08:00
Harry f20452622a fix(trigger): improve event retrieval handling in PluginTriggerProviderController
- Updated the `get_event` method to return `None` instead of raising a ValueError when an event is not found, enhancing error handling.
- Adjusted the `get_event_parameters` method to handle cases where the event may be `None`, returning an empty dictionary instead of causing an error.
- Improved type hinting for better clarity and type safety.
2025-10-17 14:43:43 +08:00
Joel 6ba26cf7b5 fix: global var show in node 2025-10-17 14:39:30 +08:00
Joel 7510e0654b fix: show global vars in picker 2025-10-17 14:24:20 +08:00
Joel 564bb22d8b feat: system var icon 2025-10-17 13:57:26 +08:00
lyzno1 5e2d5f0d83 feat: allow trigger schedule TimePicker to stretch with panel 2025-10-17 13:52:26 +08:00
hjlarry d90ffbcf14 rm unused ensureWebhookRawVariable 2025-10-17 13:49:33 +08:00
hjlarry 771cc72dcf fix auto generate webhook url 2025-10-17 13:41:03 +08:00
-LAN- 04c91111e9 fix(trigger): trigger node is marked as 'branch' type 2025-10-17 13:37:46 +08:00
yessenia 5a13daefdb fix(trigger): close portal after select a subscription 2025-10-17 13:31:00 +08:00
lyzno1 c033c05ec1 fix: resolve trigger plugin icons in workflow checklist 2025-10-17 12:55:41 +08:00
hjlarry 5b2f323a87 improve webhook request headers 2025-10-17 11:27:48 +08:00
Joel b855d95430 feat: can choose global vars 2025-10-17 11:02:27 +08:00
yessenia fe4b63210e fix(trigger): oauth client config 2025-10-17 10:52:42 +08:00
Joel 84c09ec59d chore: user input output vars show 2025-10-17 10:21:11 +08:00
hjlarry 40e17ef801 fix: merging main caused current_user not defined 2025-10-17 09:49:09 +08:00
yessenia f1fcb92691 feat(trigger): add category trigger 2025-10-16 18:30:54 +08:00
lyzno1 3865555113 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-16 18:30:33 +08:00
lyzno1 95e46806a4 fix: marketplace item install hover 2025-10-16 18:01:00 +08:00
lyzno1 c9c3d03878 fix: keep start tab search results restorable 2025-10-16 17:56:32 +08:00
lyzno1 b28ec4be6e fix: start block ui 2025-10-16 17:48:24 +08:00
lyzno1 29d7023fae - Update all-tools.tsx so provider search results keep only relevant items: full list retained when the provider matches; otherwise the provider is cloned with just matching tools.
- Mirror the same filtering strategy for Start-tab trigger plugins in trigger-plugin/list.tsx, ensuring only matching events render when searching.
2025-10-16 17:46:44 +08:00
lyzno1 22f6c23780 refactor: remove empty search placeholder from tool selector 2025-10-16 17:39:35 +08:00
hjlarry 548db29a47 add var name check for webhook node 2025-10-16 16:59:46 +08:00
hjlarry 1089c5bf04 add _webhook_raw to downstream node 2025-10-16 16:35:05 +08:00
hjlarry 559cf6583f fix: adding candidate webhook node raises error 2025-10-16 15:33:18 +08:00
yessenia b04f92715c feat(trigger): plugin category type 2025-10-16 15:30:04 +08:00
Harry 671aba6ab7 fix(trigger): handle missing subscription constructor gracefully in PluginTriggerProviderController
- Updated the logic in `PluginTriggerProviderController` to return an empty list instead of raising a ValueError when the subscription constructor is not found, improving error handling and flow.
2025-10-16 15:09:13 +08:00
Harry beaeb30dcc fix(trigger): enhance credential encryption handling in TriggerProviderService
- Introduced conditional initialization of credential_encrypter based on credential_type to prevent errors when unauthorized.
- Updated the encryption logic to handle cases where credential_encrypter may be None, ensuring robustness in credential processing.
2025-10-16 15:07:05 +08:00
hjlarry 56abca1f41 webhook i18n 2025-10-16 14:52:15 +08:00
zhsama 52d5f219e1 fix(workflow): include trigger node type in available blocks check 2025-10-16 14:24:44 +08:00
Harry d4516e942c fix(trigger): improve error handling in DraftWorkflowTriggerNodeApi and update input class naming
- Removed specific exception handling for ValueError and PluginInvokeError in `DraftWorkflowTriggerNodeApi`, allowing a more general exception to be raised.
- Renamed `PluginTriggerInput` to `TriggerEventInput` in `TriggerEventNodeData` for better clarity and consistency.
- Updated validation logic in `TriggerEventInput` to ensure correct type checks for input values.
2025-10-16 14:04:44 +08:00
zhsama 1c17a16830 feat(trigger): format event_parameters and improve 2025-10-16 14:00:21 +08:00
lyzno1 1f6ab13fc5 fix(workflow): auto run single start node without dropdown 2025-10-16 09:37:18 +08:00
lyzno1 7344df87e5 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-15 20:47:20 +08:00
lyzno1 29353bd7c2 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-15 20:47:02 +08:00
yessenia 7b6f5d6860 fix(trigger): show tool credentials in workflow 2025-10-15 20:42:14 +08:00
lyzno1 2ccb20bf3a fix(workflow): gate “publish as tool” on published user input node validity 2025-10-15 20:26:12 +08:00
lyzno1 34b7e5cbca fix: enable scrolling in start selector tab 2025-10-15 19:09:23 +08:00
yessenia a595e2df06 fix(trigger): skip validation when updating properties 2025-10-15 18:44:05 +08:00
zhsama 729e0e9b1e feat(workflow): add disableVariableInsertion prop to form input and trigger components 2025-10-15 18:20:13 +08:00
zhsama c03b790888 feat(trigger): add event_parameters to PluginTriggerNode configuration 2025-10-15 18:14:43 +08:00
zhsama 112b5f63dd feat(workflow): enhance single run handling 2025-10-15 18:14:33 +08:00
Harry 334e5f19bf fix(trigger): handle missing subscription constructor in trigger subscription builder
- Updated the `TriggerSubscriptionBuilderService` to return an empty dictionary when the subscription constructor is not available, improving robustness in subscription handling.
2025-10-15 17:44:51 +08:00
Harry 35bbf67175 refactor(trigger): Rename and replace PluginTriggerNode with TriggerEventNode
- Updated references from `PluginTriggerNode` to `TriggerEventNode` across multiple files to reflect the new naming convention.
- Modified `PluginTriggerNodeData` to `TriggerEventNodeData`, including changes to event parameters for better clarity and consistency in data handling.
- Removed the deprecated `trigger_plugin_node.py` file as part of the refactor.
2025-10-15 17:30:42 +08:00
yessenia 9aec255ee9 feat(trigger): update subscription list after saving draft 2025-10-15 17:22:14 +08:00
Harry b07e80e6ae fix(trigger): update error type for event handling in trigger manager
- Changed the error type check from "TriggerIgnoreEventError" to "EventIgnoreError" in the `TriggerManager` class to improve clarity in error handling during trigger invocations.
2025-10-15 17:14:44 +08:00
Harry ad2b910d73 refactor(trigger): Enhance error handling and parameter resolution in trigger workflows
- Improved error handling in `DraftWorkflowTriggerRunApi`, `DraftWorkflowTriggerNodeApi`, and `DraftWorkflowTriggerRunAllApi` to raise exceptions directly, providing clearer error messages.
- Introduced `get_event_parameters` method in `PluginTriggerProviderController` to retrieve event parameters for triggers.
- Updated `PluginTriggerNodeData` to include a new method for resolving parameters based on event schemas, ensuring better validation and handling of trigger inputs.
- Refactored `TriggerService` to utilize the new parameter resolution method, enhancing the clarity and reliability of trigger invocations.
2025-10-15 17:05:51 +08:00
yessenia f28a7218cd fix(trigger): optimize subscription entry in workflow 2025-10-15 16:13:00 +08:00
lyzno1 4164e1191e fix: hide checklist navigation for missing nodes 2025-10-15 16:10:34 +08:00
Harry bd31c6f90b refactor(trigger): Reinstate DraftWorkflowTriggerNodeApi with improved structure
- Restored the `DraftWorkflowTriggerNodeApi` class to handle polling for trigger events in draft workflows.
- Enhanced the implementation to utilize `TriggerDebugEvent` and `TriggerDebugEventPoller` for better event management.
- Improved error handling and response structure for node execution, ensuring clarity in API responses.
- Updated API documentation to reflect the restored functionality and parameters.
2025-10-15 14:45:00 +08:00
Harry 8f7bef9509 fix(trigger): Update API routes for draft workflow trigger
- Changed the endpoint for triggering draft workflows from `/trigger/plugin/run` to `/trigger/run` in both backend and frontend to ensure consistency and clarity in the API structure.
- Adjusted the URL construction in the `useWorkflowRun` hook to reflect the updated route.
2025-10-15 14:44:00 +08:00
Harry 06c91fbcbd refactor(trigger): Unify the Trigger Debug interface and event handling and enhance error management
- Updated `DraftWorkflowTriggerNodeApi` to utilize the new `TriggerDebugEvent` and `TriggerDebugEventPoller` for improved event polling.
- Removed deprecated `poll_debug_event` methods from `TriggerService`, `ScheduleService`, and `WebhookService`, consolidating functionality into the new event structure.
- Enhanced error handling in `invoke_trigger_event` to utilize `TriggerPluginInvokeError` for better clarity on invocation issues.
- Updated frontend API routes to reflect changes in trigger event handling, ensuring consistency across the application.
2025-10-15 14:41:53 +08:00
Harry dab4e521af feat(trigger): enhance trigger event handling and introduce new debug event polling
- Refactored the `DraftWorkflowTriggerNodeApi` and related services to utilize the new `TriggerService` for polling debug events, improving modularity and clarity.
- Added `poll_debug_event` methods in `TriggerService`, `ScheduleService`, and `WebhookService` to streamline event handling for different trigger types.
- Introduced `ScheduleDebugEvent` and updated `PluginTriggerDebugEvent` to include a more structured approach for event data.
- Enhanced the `invoke_trigger_event` method to improve error handling and data validation during trigger invocations.
- Updated frontend API calls to align with the new event structure, removing deprecated parameters for cleaner integration.
2025-10-15 11:04:09 +08:00
lyzno1 b20f61356c Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-15 09:53:03 +08:00
yessenia 4ec23eea00 fix: add i18n key 2025-10-14 21:23:24 +08:00
Yeuoly 270fd9cb07 fix: incorrect entity reference 2025-10-14 20:14:13 +08:00
yessenia c7c5e07d43 fix(trigger): add tooltip when only one creation type 2025-10-14 18:39:22 +08:00
zhsama c1ba83f0d4 feat(trigger): add validation for subscription in PluginTrigger node 2025-10-14 18:13:02 +08:00
zhsama d71200ee32 feat: enhance block selector and change block components with flow type handling 2025-10-14 16:42:21 +08:00
yessenia 16ac05ebd5 feat: support search in checkbox list 2025-10-14 16:24:44 +08:00
zhsama ac77b9b735 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-14 15:28:35 +08:00
zhsama 0fa4b77ff8 feat(style): adjust minimum and maximum width for block-selector and data source components 2025-10-14 15:23:28 +08:00
Harry 6773dda657 feat(trigger): enhance trigger handling with new data validation and logging improvements
- Added validation for `PluginTriggerData` and `ScheduleTriggerData` in the `WorkflowService` to support new trigger types.
- Updated debug event return strings in `PluginTriggerDebugEvent` and `WebhookDebugEvent` for clarity and consistency.
- Enhanced logging in `dispatch_triggered_workflows_async` to include subscription and provider IDs, improving traceability during trigger dispatching.
2025-10-14 14:36:52 +08:00
zhsama bf42386c5b feat(trigger): add PluginTrigger node support and enhance output variable handling 2025-10-14 11:55:12 +08:00
Harry 90fc06a494 refactor(trigger): update TriggerApiEntity description type to TypeWithI18N
- Changed the description field type in `TriggerApiEntity` from `TriggerDescription` to `TypeWithI18N` for improved internationalization support.
- Adjusted the usage of the description field in the `convertToTriggerWithProvider` function to align with the new type definition.
2025-10-13 22:24:12 +08:00
Harry 8dfe693529 refactor(trigger): rename TriggerApiEntity to EventApiEntity and update related references
- Changed `TriggerApiEntity` to `EventApiEntity` in the trigger provider and subscription models to better reflect its purpose.
- Updated the description field type from `EventDescription` to `I18nObject` for improved consistency in event descriptions.
- Adjusted imports and references across multiple files to accommodate the renaming and type changes, ensuring proper functionality in trigger processing.
2025-10-13 21:10:31 +08:00
yessenia d65d27a6bb fix: creating button style 2025-10-13 20:53:06 +08:00
lyzno1 e6a6bde8e2 feat(i18n): add draft reminder to app overview tooltips 2025-10-13 20:18:54 +08:00
lyzno1 c7d0a7be04 feat(trigger): enable triggers by default after workflow publish 2025-10-13 19:59:39 +08:00
Harry e0f1b03cf0 fix(trigger): clear subscription_id in trigger plugin processing
- Updated the `AppDslService` to clear the `subscription_id` when processing nodes of type `TRIGGER_PLUGIN`. This change ensures that sensitive subscription data is not retained unnecessarily, enhancing data security during workflow execution.
2025-10-13 18:42:54 +08:00
Harry 902737b262 feat(trigger): enhance subscription decryption in trigger processing
- Added functionality to decrypt subscription credentials and properties within the `dispatch_triggered_workflows_async` method. This ensures that sensitive data is securely handled before processing, improving the overall security of trigger invocations.
2025-10-13 18:10:53 +08:00
Harry 429cd05a0f fix(trigger): serialize subscription model in trigger invocation
- Updated the `PluginTriggerManager` to serialize the `subscription` parameter using `model_dump()` before passing it during trigger invocation. This change ensures that the subscription data is correctly formatted for processing.
2025-10-13 18:07:51 +08:00
Harry 46e7e99c5a feat(trigger): add subscription parameter to trigger invocation methods
- Enhanced `PluginTriggerManager`, `PluginTriggerProviderController`, and `TriggerManager` to accept a `subscription` parameter in their trigger invocation methods.
- Updated `TriggerService` to pass the subscription entity when invoking trigger events, improving the handling of subscription-related data during trigger execution.
2025-10-13 17:47:40 +08:00
lyzno1 d19ce15f3d Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-13 17:28:47 +08:00
lyzno1 49af7eb370 fix(trigger-schedule): make timezone field optional to match actual usage 2025-10-13 17:28:40 +08:00
lyzno1 8e235dc92c feat(workflow): hide timezone in node next execution, keep in panel next 5 executions 2025-10-13 17:28:40 +08:00
lyzno1 3b3963b055 refactor(workflow): remove timezone required validation as it is auto-filled by use-config 2025-10-13 17:28:40 +08:00
lyzno1 378c2afcd3 fix(workflow): remove hardcoded UTC timezone from new schedule node to use user timezone 2025-10-13 17:28:40 +08:00
lyzno1 d709f20e1f fix(workflow): update community feedback link to plugin request template 2025-10-13 17:28:40 +08:00
lyzno1 99d9657af8 feat(workflow): integrate timezone display into execution time format for better readability 2025-10-13 17:28:40 +08:00
lyzno1 62efdd7f7a fix(workflow): preserve saved timezone in trigger-schedule to match backend fixed-timezone design 2025-10-13 17:28:39 +08:00
lyzno1 ebcf98c137 revert(workflow): remove timezone label from trigger-schedule node display 2025-10-13 17:28:39 +08:00
lyzno1 7560e2427d fix(timezone): support half-hour and 45-minute timezone offsets
Critical regression fix for convertTimezoneToOffsetStr:

Issues Fixed:
- Previous regex /^([+-]?\d{1,2}):00/ only matched :00 offsets
- This caused half-hour offsets (e.g., India +05:30) to return UTC+0
- Even if matched, parseInt only parsed hours, losing minute info

Changes:
- Update regex to /^([+-]?\d{1,2}):(\d{2})/ to match all offset formats
- Parse both hours and minutes separately
- Output format: "UTC+5:30" for non-zero minutes, "UTC+8" for whole hours
- Preserve leading zeros in minute part (e.g., "UTC+5:30" not "UTC+5:3")

Test Coverage:
- Added 8 comprehensive tests covering:
  * Default/invalid timezone handling
  * Whole hour offsets (positive/negative)
  * Zero offset (UTC)
  * Half-hour offsets (India +5:30, Australia +9:30)
  * 45-minute offset (Chatham +12:45)
  * Leading zero preservation in minutes

All 14 tests passing. Verified with timezone.json entries at lines 967, 1135, 1251.
2025-10-13 17:28:39 +08:00
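
For reference, a minimal sketch of the fixed formatting logic 7560e2427d describes, applying its stated regex and output rules; the timezone-to-offset-string lookup (via timezone.json) is assumed to happen before this step, and the function name here is illustrative:

```typescript
// Sketch: format an already-looked-up UTC offset string per the rules above.
// Matches all offset formats, not just ":00" (old regex: /^([+-]?\d{1,2}):00/).
function formatUtcOffset(offsetStr: string): string {
  const match = offsetStr.match(/^([+-]?\d{1,2}):(\d{2})/)
  if (!match)
    return 'UTC+0' // default for invalid/missing input
  const hours = Number.parseInt(match[1], 10) // "+05" -> 5, "-11" -> -11
  const minutes = match[2] // kept as a string to preserve "30"/"45" exactly
  const sign = hours < 0 ? '-' : '+'
  const label = `UTC${sign}${Math.abs(hours)}`
  return minutes === '00' ? label : `${label}:${minutes}` // "UTC+8" / "UTC+5:30"
}

// Per the commit's test cases:
// formatUtcOffset('+05:30') // 'UTC+5:30' (India)
// formatUtcOffset('+12:45') // 'UTC+12:45' (Chatham)
// formatUtcOffset('-11:00') // 'UTC-11'
```
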
lyzno1 920a608e5d fix(trigger-schedule): prevent timezone label truncation in node
- Change layout to ensure timezone label always visible with shrink-0
- Time text can truncate but timezone label stays intact
- Improves readability in constrained node space
2025-10-13 17:28:39 +08:00
lyzno1 4dfb8b988c feat(time-picker): add showTimezone prop with comprehensive tests
- Add showTimezone prop to TimePickerProps for optional inline timezone display
- Integrate TimezoneLabel component into TimePicker when showTimezone=true
- Add 6 comprehensive test cases covering all showTimezone scenarios:
  * Default behavior (no timezone label)
  * Explicit disable with showTimezone=false
  * Enable with showTimezone=true
  * Inline prop correctly passed
  * No display when timezone is missing
  * Correct styling classes applied
- Update trigger-schedule panel to use showTimezone prop
- All 15 tests passing with good coverage
2025-10-13 17:28:39 +08:00
lyzno1 af6dae3498 fix(timezone): fix UTC offset display bug and add timezone labels
- Fixed convertTimezoneToOffsetStr() that only extracted first digit
  * UTC-11 was incorrectly displayed as UTC-1, UTC+10 as UTC+0
  * Now correctly extracts full offset using regex and removes leading zeros
- Created reusable TimezoneLabel component with inline mode support
- Added comprehensive unit tests with 100% coverage
- Integrated timezone labels into 3 locations:
  * Panel time picker (next to time input)
  * Node next execution display
  * Panel next 5 executions list
2025-10-13 17:28:39 +08:00
yessenia ee21b4d435 feat: support copy to clipboard in input component 2025-10-13 17:21:26 +08:00
zhsama 654adccfbf fix(trigger): implement plugin single run functionality and update node status handling 2025-10-13 17:02:44 +08:00
Harry b283a2b3d9 feat(trigger): add API endpoint to retrieve trigger plugin icons and enhance workflow response handling
- Introduced `TriggerProviderIconApi` to fetch icons for trigger plugins based on tenant and provider ID.
- Updated `WorkflowResponseConverter` to include trigger plugin icons in the response.
- Implemented `get_trigger_plugin_icon` method in `TriggerManager` for icon retrieval logic.
- Adjusted `Node` class to correctly set provider information for trigger plugins.
- Modified TypeScript types to accommodate new provider ID field in workflow nodes.
2025-10-13 16:50:32 +08:00
lyzno1 cce729916a fix(trigger-schedule): pass time string directly to TimePicker to avoid double timezone conversion 2025-10-13 16:00:13 +08:00
yessenia 4f8bf97935 fix: creating modal style 2025-10-13 14:54:24 +08:00
zhsama ba88c7b25b fix(workflow): handle plugin run mode correctly by setting status 2025-10-13 14:50:12 +08:00
yessenia 0ec5d53e5b fix(trigger): log style 2025-10-13 14:46:08 +08:00
lyzno1 f3b415c095 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-13 13:21:51 +08:00
Harry 6fb657a89e refactor(subscription): enhance subscription count handling in selector view
- Introduced a subscriptionCount variable to improve readability and performance when checking the number of subscriptions.
- Updated the rendering logic to use subscriptionCount, ensuring consistent and clear display of subscription information in the component.
2025-10-13 11:22:25 +08:00
Harry 90240cb6db refactor(subscription): optimize subscription count handling in list view
- Replaced direct length checks on subscriptions with a computed subscriptionCount variable for improved readability and performance.
- Updated the CreateSubscriptionButton to conditionally render based on the new subscriptionCount variable, enhancing clarity in the component logic.
- Adjusted className logic for the button to account for multiple supported methods, ensuring better user experience.
2025-10-12 23:56:27 +08:00
Harry cca48f07aa feat(trigger): implement atomic update and verification for subscription builders
- Introduced atomic operations for updating and verifying subscription builders to prevent race conditions.
- Added distributed locking mechanism to ensure data consistency during concurrent updates and builds.
- Refactored existing methods to utilize the new atomic update and verification logic, enhancing the reliability of trigger subscription handling.
2025-10-12 21:27:38 +08:00
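
For context on cca48f07aa: the actual locking lives in the Python trigger services, but the pattern it describes (a short-lived distributed lock per subscription builder, so concurrent updates and verifications cannot interleave) can be sketched as follows — a hypothetical node-redis illustration, with key name and TTL invented for the example:

```typescript
// Hypothetical sketch of a per-builder distributed lock (SET NX + token check).
import { randomUUID } from 'node:crypto'
import { createClient } from 'redis'

async function withBuilderLock<T>(builderId: string, fn: () => Promise<T>): Promise<T | null> {
  const redis = createClient()
  await redis.connect()
  const key = `trigger:subscription-builder:lock:${builderId}` // invented name
  const token = randomUUID()
  try {
    // NX: acquire only if absent; PX: auto-expire so a crashed holder cannot deadlock.
    const acquired = await redis.set(key, token, { NX: true, PX: 30_000 })
    if (acquired !== 'OK')
      return null // another worker holds the lock; caller retries or skips
    try {
      return await fn() // atomic update + verification runs under the lock
    }
    finally {
      // Release only if we still own it (a production version would use a
      // Lua script to make this check-and-delete atomic).
      if (await redis.get(key) === token)
        await redis.del(key)
    }
  }
  finally {
    await redis.quit()
  }
}
```
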
Harry beff639c3d fix(trigger): improve trigger subscription query with AppTrigger join
- Updated the trigger subscription query to join with the AppTrigger model, ensuring only enabled app triggers are considered.
- Enhanced the filtering criteria for retrieving subscribers based on the AppTrigger status, improving the accuracy of the trigger subscription handling.
2025-10-12 19:24:54 +08:00
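A hedged SQLAlchemy sketch of the join described above; `WorkflowPluginTrigger` and `AppTrigger` are the models named in the commits, but the join keys and the enabled-status column are assumptions:

```python
from sqlalchemy import select

def get_enabled_subscribers(session, subscription_id: str):
    # Join each subscription to its AppTrigger so that disabled app
    # triggers never receive dispatched events.
    stmt = (
        select(WorkflowPluginTrigger)
        .join(AppTrigger, AppTrigger.app_id == WorkflowPluginTrigger.app_id)  # join keys assumed
        .where(
            WorkflowPluginTrigger.subscription_id == subscription_id,
            AppTrigger.enabled.is_(True),  # status column shape assumed
        )
    )
    return session.scalars(stmt).all()
```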
Harry 00359830c2 refactor(trigger): streamline response handling in trigger subscription dispatch
- Removed the redundant response extraction from the dispatch call and directly assigned the response to a variable for clarity.
- Enhanced logging by appending the request log after dispatching, ensuring better traceability of requests and responses in the trigger subscription workflow.
2025-10-11 22:16:18 +08:00
Harry f23e098b9a fix(trigger): handle exceptions in trigger subscription dispatch
- Wrapped the dispatch call in a try-except block to catch exceptions and return a 500 error response if an error occurs.
- Enhanced logging of the request and error response for better traceability in the trigger subscription workflow.
2025-10-11 22:13:36 +08:00
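A minimal sketch combining the two dispatch commits above; the `dispatch` and `append_request_log` helpers are hypothetical:

```python
import logging

logger = logging.getLogger(__name__)

def trigger_endpoint(request_data: dict):
    try:
        response = dispatch(request_data)  # hypothetical dispatch into subscribed workflows
    except Exception:
        # Log the failing request and degrade to a 500 instead of crashing the endpoint.
        logger.exception("trigger dispatch failed for request: %s", request_data)
        return {"error": "internal server error"}, 500
    append_request_log(request_data, response)  # appended after dispatching, per the commit
    return response, 200
```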
Harry 42f75b6602 feat(trigger): enhance trigger subscription handling with credential support
- Added `credentials` and `credential_type` parameters to various methods in `PluginTriggerManager`, `PluginTriggerProviderController`, and `TriggerManager` to support improved credential management for trigger subscriptions.
- Updated the `Subscription` model to include `parameters` for better subscription data handling.
- Refactored related services to accommodate the new credential handling, ensuring consistency across the trigger workflow.
2025-10-11 21:12:27 +08:00
yessenia 4f65cc312d feat: delete confirm opt 2025-10-11 20:19:27 +08:00
yessenia 854a091f82 feat: add validation status for formitem 2025-10-11 19:50:05 +08:00
zhsama 63dbc7c63d fix(trigger): update provider_id reference to plugin_id in useToolIcon hook 2025-10-11 19:05:57 +08:00
zhsama a4e80640fe chore(trigger): remove debug console logs 2025-10-11 18:54:47 +08:00
zhsama fe0a139c89 fix(trigger): update provider_id references to plugin_id in BasePanel component 2025-10-11 18:52:15 +08:00
zhsama ac2616545b fix(trigger): update provider_id field in TriggerPluginActionItem component 2025-10-11 17:10:29 +08:00
zhsama c9e7922a14 refactor(trigger): update trigger-related types and field names / values 2025-10-11 17:06:43 +08:00
yessenia 12a7402291 fix: create button not working in manual creation mode 2025-10-11 15:36:37 +08:00
yessenia 33d7b48e49 fix: error when fetching info while switching plugins 2025-10-11 15:00:35 +08:00
Harry ee89e9eb2f refactor(trigger): update type parameter naming in PluginTriggerManager
- Changed the parameter name from 'type' to 'type_' in multiple method calls within the PluginTriggerManager class to avoid conflicts with the built-in type function and improve code clarity.
2025-10-11 13:09:25 +08:00
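A small illustration of why the rename matters; the function names here are made up:

```python
def invoke_trigger_event(provider_id: str, type: str):
    # Inside this body `type` shadows the builtin, so a call like
    # type(provider_id) raises TypeError: 'str' object is not callable.
    return provider_id, type

def invoke_trigger_event_fixed(provider_id: str, type_: str):
    # Trailing underscore (the PEP 8 convention) keeps the builtin callable.
    assert type(provider_id) is str
    return provider_id, type_
```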
Harry e793f9e871 refactor(trigger): remove unnecessary whitespace in trigger-related files
- Cleaned up the code by removing extraneous whitespace in `trigger.py` and `workflow_plugin_trigger_service.py`, improving readability and maintaining code style consistency.
2025-10-11 12:44:54 +08:00
Harry 18b02370a2 chore(workflows): update deployment configurations
- Modified the build-push workflow to trigger on all branches under "deploy/**" for broader deployment coverage.
- Changed the SSH host secret in the deploy-dev workflow from RAG_SSH_HOST to DEV_SSH_HOST for improved clarity.
- Removed the obsolete deploy-rag-dev workflow to streamline the CI/CD process.
2025-10-11 12:26:31 +08:00
Harry d53399e546 refactor(trigger): rename trigger-related fields and methods for consistency
- Updated the naming convention from 'trigger_name' to 'event_name' across various models and services to align with the new event-driven architecture.
- Refactored methods in PluginTriggerManager and PluginTriggerProviderController to use 'invoke_trigger_event' instead of 'invoke_trigger'.
- Adjusted database migration scripts to reflect changes in the schema, including the addition of 'event_name' and 'subscription_id' fields in the workflow_plugin_triggers table.
- Removed deprecated trigger-related methods in WorkflowPluginTriggerService to streamline the codebase.
2025-10-11 12:26:08 +08:00
yessenia 622d12137a feat: change subscription field in workflow 2025-10-10 20:58:56 +08:00
lyzno1 bae8e44b32
Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-10 19:43:23 +08:00
zhsama 24b5387fd1 fix(workflow): streamline stopping workflow run process 2025-10-10 18:45:43 +08:00
Harry 0c65824cad fix(workflow): update API route for DraftWorkflowTriggerRunApi
- Changed the route from "/apps/<uuid:app_id>/workflows/draft/trigger/run" to "/apps/<uuid:app_id>/workflows/draft/trigger/plugin/run" to reflect the new plugin-based trigger structure.
- Updated corresponding URL in the useWorkflowRun hook to maintain consistency across the application.
2025-10-10 18:13:28 +08:00
Harry 31c9d9da3f fix(workflow): enhance response structure in DraftWorkflowTriggerRunApi
- Added a "retry_in" field to the response when no event is found, improving the API's feedback during workflow execution.
2025-10-10 18:05:50 +08:00
Harry 8f854e6a45 fix(workflow): add root_node_id to DraftWorkflowTriggerRunApi for improved response handling
- Included root_node_id in the API call to enhance the response structure during workflow execution.
2025-10-10 18:05:50 +08:00
yessenia 75b3f5ac5a feat: change subscription field 2025-10-10 17:37:20 +08:00
zhsama 323e183775 refactor(trigger): improve config value formatting in PluginTriggerNode 2025-10-10 17:28:41 +08:00
Harry 380ef52331 refactor(trigger): update API and service to use 'event' terminology
- Renamed 'trigger_name' to 'event_name' in the DraftWorkflowTriggerNodeApi for consistency with the new naming convention.
- Added 'provider_id' to the API request model to enhance functionality.
- Updated the PluginTriggerDebugEvent and TriggerDebugService to reflect changes in naming and improve address formatting.
- Adjusted frontend utility to align with the updated variable names.
2025-10-10 15:48:42 +08:00
lyzno1 b8862293b6
fix: resolve semantic conflict in TimePicker notClearable logic 2025-10-10 15:17:19 +08:00
lyzno1 85f1cf1d90
Merge branch 'main' into feat/trigger 2025-10-10 15:16:00 +08:00
lyzno1 1d4e36d58f
fix: display correct icon for trigger nodes in listening panel 2025-10-10 15:04:58 +08:00
Harry 90ae5e5865 refactor(trigger): enhance update method to use explicit None checks
- Updated the `update` method in `SubscriptionBuilderUpdater` to use 'is not None' checks instead of truthy evaluations for better handling of empty values.
- This change improves clarity and ensures that empty dictionaries or strings are correctly processed during updates.
2025-10-10 14:52:03 +08:00
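A minimal sketch of the behavioral difference, with a made-up stand-in for the real updater:

```python
class SubscriptionBuilderUpdaterSketch:
    """Illustrative stand-in for the real updater."""

    def __init__(self) -> None:
        self.name = "old-name"
        self.parameters: dict = {"a": 1}

    def update(self, name: str | None = None, parameters: dict | None = None) -> None:
        # A truthy check (`if parameters:`) silently skips an empty dict,
        # so callers could never clear parameters; explicit None checks
        # treat {} and "" as deliberate values.
        if name is not None:
            self.name = name
        if parameters is not None:
            self.parameters = parameters

b = SubscriptionBuilderUpdaterSketch()
b.update(parameters={})
assert b.parameters == {}   # with a truthy check this update would be lost
```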
zhsama 755fb96a33 feat(trigger): add plugin trigger test-run handling to workflow 2025-10-10 10:43:13 +08:00
Harry b8ca480b07 refactor(trigger): update variable names for clarity and consistency
- Renamed variables related to triggers to use 'trigger' terminology consistently across the codebase.
- Adjusted filtering logic in `TriggerPluginList` to reference 'events' instead of 'triggers' for improved clarity.
- Updated the `getTriggerIcon` function to reflect the new naming conventions and ensure proper icon rendering.
2025-10-09 12:23:48 +08:00
lyzno1 8a5fbf183b Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-09 09:07:01 +08:00
Harry 91318d3d04 refactor(trigger): rename trigger references to event for consistency
- Updated variable names and types from 'trigger' to 'event' across multiple files to enhance clarity and maintain consistency in the codebase.
- Adjusted related data structures and API responses to reflect the new naming convention.
- Improved type annotations and error handling in the workflow trigger run API and associated services.
2025-10-09 03:12:35 +08:00
Harry a33d04d1ac refactor(trigger): unify debug event handling and improve polling mechanism
- Introduced a base class for debug events to streamline event handling.
- Refactored `TriggerDebugService` to support multiple event types through a generic dispatch/poll interface.
- Updated webhook and plugin trigger debug services to utilize the new event structure.
- Enhanced the dispatch logic in `dispatch_triggered_workflows_async` to accommodate the new event model.
2025-10-08 17:31:16 +08:00
lyzno1 02222752f0 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-07 18:25:43 +08:00
lyzno1 04d94e3337 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-06 19:12:16 +08:00
hjlarry b98c36db48 fix trigger-related api 404 2025-10-06 14:36:07 +08:00
hjlarry d05d11e67f add webhook node draft single run 2025-10-06 14:35:12 +08:00
lyzno1 3370736e09 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-04 11:30:26 +08:00
hjlarry cc5a315039 fix pyright exception 2025-10-02 20:26:29 +08:00
hjlarry 6ea10cdaaf debug webhook doesn't require publishing the app 2025-10-02 20:07:57 +08:00
lyzno1 9643fa1c9a fix: use StopCircle icon in variable inspect listening panel 2025-10-02 10:02:19 +08:00
lyzno1 937a58d0dd Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-02 09:18:21 +08:00
hjlarry d9faa1329a move workflow_plugin_trigger_service to trigger sub dir 2025-10-02 00:31:33 +08:00
hjlarry fec09e7ed3 move trigger_service to trigger sub dir 2025-10-02 00:29:53 +08:00
hjlarry 31b15b492e move trigger_debug_service to trigger sub dir 2025-10-02 00:27:48 +08:00
hjlarry f96bd4eb18 move schedule service to trigger sub dir 2025-10-02 00:24:32 +08:00
hjlarry a4109088c9 move webhook service to trigger sub dir 2025-10-02 00:18:37 +08:00
hjlarry f827e8e1b7 add more code comment 2025-10-02 00:14:35 +08:00
hjlarry 82f2f76dc4 ruff format code 2025-10-01 23:39:46 +08:00
hjlarry e6a44a0860 can debug when webhook is disabled 2025-10-01 23:39:37 +08:00
hjlarry 604651873e refactor webhook service 2025-10-01 12:46:42 +08:00
lyzno1 9114881623 fix: update frontend trigger field mapping from triggers to events
- Update TriggerProviderApiEntity type to use events field (aligned with backend commit 32f4d1af8)
- Update conversion function in use-triggers.ts to map provider.events to TriggerWithProvider.triggers
- Fix trigger-events-list.tsx to use providerInfo.events (TriggerProviderApiEntity type)
- Fix parameters-form.tsx to use provider.triggers (TriggerWithProvider type)
2025-10-01 09:53:45 +08:00
hjlarry 080cdda4fa backend support for webhook query params 2025-09-30 21:21:39 +08:00
Harry 32f4d1af8b Refactor: Rename triggers to events in trigger-related entities and services
- Updated class and variable names from 'triggers' to 'events' across multiple files to improve clarity and consistency.
- Adjusted related data structures and methods to reflect the new naming convention, including changes in API entities, service methods, and trigger management logic.
- Ensured all references to triggers are replaced with events to align with the updated terminology.
2025-09-30 20:18:33 +08:00
lyzno1 1bfa8e6662 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-30 18:56:21 +08:00
lyzno1 7c97ea4a9e fix: correct entry node alignment for wrapper offset
- Add ENTRY_NODE_WRAPPER_OFFSET constant (x: 0, y: 21) for Start/Trigger nodes
- Implement getNodeAlignPosition() to calculate actual inner node positions
- Fix horizontal/vertical helpline rendering to account for wrapper offset
- Fix snap-to-align logic to properly align inner nodes instead of wrapper
- Correct helpline width/height calculation by subtracting offset for entry nodes
- Ensure backward compatibility: only affects Start/Trigger nodes with EntryNodeContainer wrapper

This fix ensures that Start and Trigger nodes (which have an EntryNodeContainer wrapper
with status indicator) align based on their inner node boundaries rather than the wrapper
boundaries, matching the alignment behavior of regular nodes.
2025-09-30 18:36:49 +08:00
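The actual fix is in the TypeScript canvas code; a Python sketch of the offset arithmetic it describes (the signature is illustrative):

```python
ENTRY_NODE_WRAPPER_OFFSET = {"x": 0, "y": 21}   # constant from the commit above

def get_node_align_position(wrapper_x: float, wrapper_y: float, is_entry_node: bool):
    # Entry (Start/Trigger) nodes render inside an EntryNodeContainer wrapper;
    # helplines and snapping should use the inner node, which sits 21px below
    # the wrapper's top edge. Regular nodes align on themselves.
    if is_entry_node:
        return (wrapper_x + ENTRY_NODE_WRAPPER_OFFSET["x"],
                wrapper_y + ENTRY_NODE_WRAPPER_OFFSET["y"])
    return wrapper_x, wrapper_y
```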
lyzno1 bea11b08d7 refactor: hide workflow features button in workflow mode, keep it visible in chatflow mode 2025-09-30 17:51:01 +08:00
lyzno1 8547032a87 Revert "refactor: app publisher"
This reverts commit 8feef2c1a9.
2025-09-30 17:46:27 +08:00
hjlarry 43574c852d add variable type to webhook request parameters panel 2025-09-30 16:31:21 +08:00
hjlarry 5ecc006805 add listening status for variable panel 2025-09-30 15:18:07 +08:00
lyzno1 15413108f0 chore: remove unused empty enums.py file 2025-09-30 13:52:33 +08:00
lyzno1 831c888b84 feat: sort output variables by table display order in webhook trigger 2025-09-30 12:34:09 +08:00
lyzno1 f0ed09a8d4
feat: add output variables display to webhook trigger node (#26478)
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-30 12:26:42 +08:00
hjlarry a80f30f9ef add nginx /triggers endpoint 2025-09-30 11:08:14 +08:00
hjlarry fd2f0df097 use useStore for isListening status 2025-09-30 10:48:38 +08:00
lyzno1 d72a3e1879 fix: translations 2025-09-30 10:01:33 +08:00
lyzno1 4a6903fdb4 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-30 08:00:16 +08:00
zxhlyh 8106df1d7d fix: types 2025-09-29 20:53:50 +08:00
Harry 5e3e6b0bd8 refactor(api): update subscription handling in trigger provider
- Replaced SubscriptionSchema with SubscriptionConstructor in various parts of the trigger provider implementation to streamline subscription management.
- Enhanced the PluginTriggerProviderController to utilize the new subscription constructor for retrieving default properties and credential schemas.
- Removed the deprecated get_provider_subscription_schema method from TriggerManager.
- Updated TriggerSubscriptionBuilderService to reflect changes in subscription handling, ensuring compatibility with the new structure.

These changes improve the clarity and maintainability of the subscription handling within the trigger provider architecture.
2025-09-29 18:28:10 +08:00
Harry a06d2892f8 fix(plugin): handle optional property in llm_description assignment
- Updated the llm_description assignment in the ToolParameter to safely access the en_US property of paramDescription, ensuring it defaults to an empty string if not present. This change improves the robustness of the parameter handling in the plugin detail panel.
2025-09-29 18:28:10 +08:00
Harry e377e90666 feat(api): add CHECKBOX parameter type to plugin and tool entities
- Introduced CHECKBOX as a new parameter type in CommonParameterType and PluginParameterType.
- Updated as_normal_type and cast_parameter_value functions to handle CHECKBOX type.
- Enhanced ToolParameter class to include CHECKBOX for consistency across parameter types.

These changes expand the parameter capabilities within the API, allowing for more versatile input options.
2025-09-29 18:28:10 +08:00
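A sketch of how such a cast might look; the enum is an illustrative subset and the exact coercion rules are assumptions:

```python
from enum import Enum

class ParameterType(str, Enum):   # illustrative subset of the real enum
    STRING = "string"
    CHECKBOX = "checkbox"

def cast_parameter_value(type_: ParameterType, value):
    # Checkbox inputs can arrive as real booleans or as string flags;
    # normalize both to bool so downstream consumers see one type.
    if type_ is ParameterType.CHECKBOX:
        if isinstance(value, bool):
            return value
        return str(value).strip().lower() in {"true", "1", "on", "yes"}
    return value

assert cast_parameter_value(ParameterType.CHECKBOX, "True") is True
assert cast_parameter_value(ParameterType.CHECKBOX, 0) is False
```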
Harry 19cc67561b refactor(api): improve error handling in trigger providers
- Removed unnecessary ValueError handling in TriggerSubscriptionBuilderCreateApi and TriggerSubscriptionBuilderBuildApi, allowing for more streamlined exception management.
- Updated TriggerSubscriptionBuilderVerifyApi and TriggerSubscriptionBuilderBuildApi to raise ValueError with the original exception context for better debugging.
- Enhanced trigger_endpoint in trigger.py to log errors and return a JSON response for not found endpoints, improving user feedback and error reporting.

These changes enhance the robustness and clarity of error handling across the API.
2025-09-29 18:28:10 +08:00
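A minimal sketch of the `raise ... from` pattern the commit adopts; the error type and daemon call are stand-ins:

```python
class PluginInvokeError(Exception):
    """Stand-in for the daemon's real error type."""

def call_daemon(builder_id: str):
    raise PluginInvokeError("daemon rejected the request")   # simulated failure

def verify_subscription_builder(builder_id: str):
    try:
        return call_daemon(builder_id)
    except PluginInvokeError as e:
        # `raise ... from e` keeps the original exception chained in the
        # traceback ("The above exception was the direct cause of ..."),
        # which is the debugging context the commit preserves.
        raise ValueError(f"failed to verify subscription builder {builder_id}") from e
```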
hjlarry 92f2ca1866 add listening status in the run panel result 2025-09-29 17:55:53 +08:00
hjlarry 1949074e2f add shortcut for open test run panel 2025-09-29 14:39:44 +08:00
hjlarry 1c0068e95b fix can't stop webhook debug 2025-09-29 13:34:05 +08:00
lyzno1 4b43196295 feat: add specialized trigger icons to workflow logs
- Create TriggerByDisplay component with appropriate colored icons
- Add dedicated Code icon for debugging triggers (blue background)
- Add KnowledgeRetrieval icon for RAG pipeline triggers (green background)
- Use existing webhook, schedule, and plugin icons with proper colors
- Add comprehensive i18n translations for Chinese, Japanese, and English
- Integrate icon display into workflow logs table
- Follow project color standards from block-icon component
2025-09-29 12:53:35 +08:00
lyzno1 2c3cf9a25e Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-29 12:13:39 +08:00
lyzno1 67fbfc0b8f fix: adjust MoreActions menu position based on sidebar state 2025-09-29 12:13:18 +08:00
hjlarry 6e6198c64e debug webhook node 2025-09-29 09:28:19 +08:00
lyzno1 6b677c16ce refactor: use Tailwind className for MiniMap node colors instead of CSS variables 2025-09-29 08:09:38 +08:00
yessenia 973b937ba5 feat: add subscription in node 2025-09-28 22:40:31 +08:00
zhsama 48597ef193 feat: enhance minimap node color handling 2025-09-28 21:11:46 +08:00
zhsama ffbc007f82 feat(i18n): add tooltip and placeholder for callback URL in plugin-trigger translations 2025-09-28 20:13:10 +08:00
lyzno1 8fc88f8cbf Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-28 19:32:33 +08:00
zhsama a4b932c78b feat: integrate chat mode detection in ChangeBlock component 2025-09-28 17:10:09 +08:00
hjlarry 2ff4af8ce3 add debug run schedule node 2025-09-28 16:37:37 +08:00
zhsama 6795015d00 refactor: enhance type definitions and update import paths in form input and trigger components 2025-09-28 15:42:38 +08:00
zhsama b100ce15cd refactor: update import paths and remove unused props in block selector components 2025-09-28 15:21:44 +08:00
yessenia 3edf1e2f59 feat: add checkbox list 2025-09-28 15:12:17 +08:00
lyzno1 4d49db0ff9 Unify SearchBox styles with Input component and add autoFocus 2025-09-28 14:33:27 +08:00
lyzno1 7da22e4915 Add toast notifications to TriggerCard toggle operations 2025-09-28 14:21:51 +08:00
lyzno1 8d4a9df6b1 fix: more button dropdown menu visibility and auto-close behavior 2025-09-28 14:15:33 +08:00
lyzno1 f620e78b20 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-28 13:40:46 +08:00
hjlarry 8df80781d9 _node_type has been renamed to node_type in the new version 2025-09-28 09:36:45 +08:00
hjlarry edec065fee fix can't subtract offset-naive and offset-aware datetimes 2025-09-28 09:10:21 +08:00
lyzno1 0fe529c3aa Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-27 21:32:38 +08:00
zhsama bcfdd07f85 feat(plugin): enhance trigger events list with dynamic tool integration
- Refactor TriggerEventsList component to utilize provider information for dynamic tool rendering.
- Implement locale-aware text handling for trigger descriptions and labels.
- Introduce utility functions for better management of tool parameters and trigger descriptions.
- Improve user experience by ensuring consistent display of trigger events based on available provider data.

This update enhances the functionality and maintainability of the trigger events list, aligning with the project's metadata-driven approach.
2025-09-26 23:27:27 +08:00
lyzno1 a9a118aaf9 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-26 22:48:15 +08:00
lyzno1 60c86dd8d1 fix(workflow): replace hardcoded trigger node logic with metadata-driven approach
- Add isStart: true to all trigger nodes (TriggerWebhook, TriggerSchedule, TriggerPlugin)
- Replace hardcoded BlockEnum checks in use-checklist.ts with metadata-driven logic
- Update trigger node tests to validate metadata instead of obsolete methods
- Add webhook URL validation to TriggerWebhook node
- Ensure backward compatibility with existing workflow configurations

This change migrates from hardcoded trigger node identification to a
centralized metadata-driven approach, improving maintainability and
consistency across the workflow system.
2025-09-26 22:35:21 +08:00
lyzno1 8feef2c1a9 refactor: app publisher 2025-09-26 22:06:05 +08:00
lyzno1 4ba99db88c feat: Restore complete test run functionality and fix workflow block selector system
This comprehensive restore includes:

## Test Run System Restoration
- Restore test-run-menu.tsx component with multi-trigger support and keyboard shortcuts
- Restore use-dynamic-test-run-options.tsx hook for dynamic trigger option generation
- Restore workflow-entry.ts utilities for entry node detection and validation
- Integrate complete test run functionality back into run-mode.tsx

## Block Selector System Fixes
- Fix workflow block selector constants by uncommenting BLOCKS and START_BLOCKS arrays
- Restore proper i18n translations for trigger node descriptions using workflow.blocksAbout keys
- Filter trigger types from Blocks tab to prevent duplication with Start tab
- Fix trigger node handle display to match start node behavior (hide left input handles)

## Workflow Validation System Improvements
- Restore unified workflow validation using correct getValidTreeNodes(nodes, edges) signature
- Remove duplicate Start node validation from isRequired mechanism
- Eliminate "user input must be added" validation error by setting Start node isRequired: false
- Fix end node connectivity validation to properly detect valid workflow chains

## Component Integration
- Verify all dependencies exist (TriggerAll icon, useAllTriggerPlugins hook)
- Maintain keyboard shortcut integration (Alt+R, ~, 0-9 keys)
- Preserve portal-based dropdown positioning and tooltip structure
- Support multiple trigger types: user_input, schedule, webhook, plugin, all

This restores the complete test run functionality that was missing from feat/trigger branch
by systematically analyzing and restoring components from feat/trigger-backup-before-merge.
2025-09-26 21:34:08 +08:00
lyzno1 b4801adfbd refactor(workflow): Remove Start node from isRequired mechanism
- Set Start node isRequired: false since entry node validation is handled by unified logic
- Remove conditional skip logic in checklist since Start is no longer in isRequiredNodesType
- Cleaner separation of concerns: unified entry node check vs individual required nodes
- Eliminates architectural inconsistency where Start was both individually required and part of group validation
2025-09-26 21:09:48 +08:00
lyzno1 08e8f8676e fix(workflow): Remove duplicate Start node validation
- Skip Start node requirement in isRequiredNodesType loop since it's already covered by unified entry node validation
- Eliminates duplicate 'User Input must be added' error when trigger nodes are present
- Both useChecklist and useChecklistBeforePublish now consistently handle entry node validation
- Resolves UI showing redundant validation errors for Start vs Trigger nodes
2025-09-26 21:08:21 +08:00
lyzno1 2dca0c20db fix: restore unified workflow validation system
Major fixes to workflow checklist validation:

## Fixed getValidTreeNodes function (workflow.ts)
- Restore original function signature: (nodes, edges) instead of (startNode, nodes, edges)
- Re-implement automatic start node discovery for all entry types
- Unified traversal from Start, TriggerWebhook, TriggerSchedule, TriggerPlugin nodes
- Single call now discovers all valid connected nodes correctly

## Simplified useChecklist validation (use-checklist.ts)
- Remove complex manual start node iteration and result aggregation
- Unified entry node validation concept for all start node types
- Remove dependency on getStartNodes() utility
- Simplified validation logic matching backup branch approach

## Resolved Issues
- End node connectivity: Now correctly detects connections from any entry node
- Unified entry validation: All start types (Start/Triggers) validated consistently
- Simplified architecture: Restored proven validation approach from backup branch

This restores the reliable workflow validation system while maintaining trigger node support.
2025-09-26 20:54:28 +08:00
lyzno1 6f57aa3f53 fix: hide left input handles for all trigger node types
- Extend handle hiding logic to include TriggerWebhook, TriggerSchedule, TriggerPlugin
- Make trigger nodes behave like Start nodes without left-side input handles
- Apply fix to both main workflow and preview node handle components
- Ensures consistent UX where all start-type nodes have no input handles
2025-09-26 20:39:29 +08:00
lyzno1 1aafe915e4 fix: trigger tooltip descriptions and filter trigger types from Nodes tab
- Fix trigger tooltip descriptions to use workflow.blocksAbout translations
- Filter TriggerWebhook/TriggerSchedule/TriggerPlugin from Blocks component
- Ensure trigger types only appear in Start tab, not Nodes tab
2025-09-26 20:28:59 +08:00
lyzno1 6d4d25ee6f feat(workflow): Restore block selector functionality
- Restore BLOCKS constant array and useBlocks hook
- Add intelligent fallback mechanism for blocks prop
- Fix metadata access in StartBlocks tooltip
- Restore defaultActiveTab support in NodeSelector
- Improve component robustness with graceful degradation
- Fix TypeScript errors and component interfaces

Phase 1-3 of atomic refactoring complete:
- Critical fixes: Constants, hooks, components
- Interface fixes: Props, tabs, modal integration
- Architecture improvements: Metadata, wrappers
2025-09-26 20:05:59 +08:00
yessenia 6b94d30a5f fix: oauth subscription 2025-09-26 17:44:57 +08:00
lyzno1 1a9798c559
fix(workflow): Fix onboarding node creation after knowledge pipeline refactor (#26289) 2025-09-26 16:43:36 +08:00
lyzno1 764436ed8e feat(workflow): Enable keyboard delete for all node types including Start
Removes explicit Start node exclusion from handleNodesDelete function:
- Remove BlockEnum.Start filter from bundled nodes selection
- Remove BlockEnum.Start filter from selected node detection
- Allows DEL/Backspace keys to delete Start nodes same as other nodes
- Button delete already worked, now keyboard delete works too

Fixes: Start nodes can now be deleted via both button and keyboard shortcuts

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-26 14:10:28 +08:00
lyzno1 2a1c5ff57b feat(workflow): Enable entry node deletion and fix draft sync
Complete workflow liberalization following PR #24627:

1. Remove Start node deletion restriction by removing isUndeletable property
2. Fix draft sync blocking when no Start node exists
3. Restore isWorkflowDataLoaded protection to prevent race conditions
4. Ensure all entry nodes (Start + 3 trigger types) have equal deletion rights

This allows workflows with only trigger nodes and fixes the issue where
added nodes would disappear after page refresh due to sync API blocking.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-26 13:54:52 +08:00
lyzno1 cc4ba1a3a9 chore: add settings.local.json to .gitignore 2025-09-26 13:26:51 +08:00
lyzno1 d68a9f1850 Merge remote-tracking branch 'origin/main' into feat/trigger
Resolve merge conflict in use-workflow.ts:
- Keep trigger branch workflow-entry utilities imports
- Preserve SUPPORT_OUTPUT_VARS_NODE from main branch
- Remove unused PARALLEL_DEPTH_LIMIT import

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-26 13:17:14 +08:00
Harry 4f460160d2 refactor(api): reorganize migration files 2025-09-26 12:31:44 +08:00
Harry d5ff89f6d3 refactor(api): enhance request handling and time management
- Initialized `response` variable in `trigger.py` to ensure proper handling in the trigger endpoint.
- Updated `http_parser.py` to conditionally set `CONTENT_TYPE` and `CONTENT_LENGTH` headers for improved robustness.
- Changed `datetime.utcnow()` to `datetime.now(UTC)` in `sqlalchemy_workflow_trigger_log_repository.py` and `rate_limiter.py` for consistent time zone handling.
- Refactored `async_workflow_service.py` to use the public method `get_tenant_owner_timezone` for better encapsulation.
- Simplified subscription retrieval logic in `plugin_parameter_service.py` for clarity.

These changes improve code reliability and maintainability while ensuring accurate time management and request processing.
2025-09-25 19:46:52 +08:00
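A self-contained illustration of the `datetime.now(UTC)` change; the naive/aware mismatch it avoids is the same one behind the earlier "can't subtract offset-naive and offset-aware datetimes" fix:

```python
from datetime import UTC, datetime

naive = datetime.utcnow()     # no tzinfo; deprecated since Python 3.12
aware = datetime.now(UTC)     # timezone-aware

assert naive.tzinfo is None and aware.tzinfo is not None
# aware - naive  ->  TypeError: can't subtract offset-naive and offset-aware datetimes
```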
Harry 452588dded refactor(api): fix pyright check
- Replaced `is_editor` checks with `has_edit_permission` in `workflow_trigger.py` and `workflow.py` to enhance clarity and consistency in permission handling.
- Updated the rate limiter to use `datetime.now(UTC)` instead of `datetime.utcnow()` for accurate time handling.
- Added `__all__` declaration in `trigger/__init__.py` for better module export management.
- Initialized `debug_dispatched` variable in `trigger_processing_tasks.py` to ensure proper tracking during workflow dispatching.

These changes improve code readability and maintainability while ensuring correct permission checks and time management.
2025-09-25 18:32:22 +08:00
Harry aef862d9ce refactor(api): remove unused PluginTriggerApi route
- Removed the `PluginTriggerApi` resource route from `workflow_trigger.py` to streamline the API and improve maintainability. This change contributes to a cleaner and more organized codebase.
2025-09-25 18:23:17 +08:00
Harry 896f3252b8 refactor(api): refactor all
- Replaced direct imports of `TriggerProviderID` and `ToolProviderID` from `core.plugin.entities.plugin` with imports from `models.provider_ids` for better organization.
- Refactored workflow node classes to inherit from a unified `Node` class, improving consistency and maintainability.
- Removed unused code and comments to clean up the implementation, particularly in the `workflow_trigger.py` and `builtin_tools_manage_service.py` files.

These changes enhance the clarity and structure of the codebase, facilitating easier future modifications.
2025-09-25 18:22:30 +08:00
yessenia 6853a699e1 Merge branch 'main' into feat/trigger 2025-09-25 17:43:39 +08:00
yessenia cd07eef639 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-25 17:14:24 +08:00
Harry ef9a741781 feat(trigger): enhance trigger management with new error handling and response structure
- Added `TriggerInvokeError` and `TriggerIgnoreEventError` for better error categorization during trigger invocation.
- Updated `TriggerInvokeResponse` to include a `cancelled` field, indicating if a trigger was ignored.
- Enhanced `TriggerManager` to handle specific errors and return appropriate responses.
- Refactored `dispatch_triggered_workflows` to improve workflow execution logic and error handling.

These changes improve the robustness and clarity of the trigger management system.
2025-09-23 16:01:59 +08:00
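A minimal sketch of how the new error types and the `cancelled` field could interact; the plugin call is a stand-in:

```python
class TriggerInvokeError(Exception):
    """Invocation failed; surfaced to the caller."""

class TriggerIgnoreEventError(Exception):
    """Raised by a trigger that decides to skip this event."""

def invoke_trigger(event: dict) -> dict:
    try:
        payload = run_plugin_trigger(event)   # hypothetical plugin call
    except TriggerIgnoreEventError:
        # An ignored event is not a failure: report it via the new
        # `cancelled` field so callers can skip workflow dispatch.
        return {"cancelled": True, "payload": None}
    return {"cancelled": False, "payload": payload}
```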
Harry c5de91ba94 refactor(trigger): update cache expiration constants and log key format
- Renamed validation-related constants to builder-related ones for clarity.
- Updated cache expiration from milliseconds to seconds for consistency.
- Adjusted log key format to reflect the builder context instead of validation.

These changes enhance the readability and maintainability of the TriggerSubscriptionBuilderService.
2025-09-22 13:37:46 +08:00
Harry bc1e6e011b fix(trigger): update cache key format in TriggerSubscriptionBuilderService
- Changed the cache key format in the `encode_cache_key` method from `trigger:subscription:validation:{subscription_id}` to `trigger:subscription:builder:{subscription_id}` to better reflect its purpose.

This update improves clarity in cache key usage for trigger subscriptions.
2025-09-22 13:37:46 +08:00
lyzno1 906028b1fb fix: start node validation 2025-09-22 12:58:20 +08:00
lyzno1 034602969f
feat(schedule-trigger): enhance cron parser with mature library and comprehensive testing (#26002) 2025-09-22 10:01:48 +08:00
非法操作 4ca14bfdad
chore: improve webhook (#25998) 2025-09-21 12:16:31 +08:00
lyzno1 59f56d8c94
feat: schedule trigger default daily midnight (#25937) 2025-09-19 08:05:00 +08:00
yessenia 63d26f0478 fix: api key params 2025-09-18 17:35:34 +08:00
yessenia eae65e55ce feat: oauth config opt & add dynamic options 2025-09-18 17:12:48 +08:00
lyzno1 0edf06329f fix: apply suggestions 2025-09-18 17:04:02 +08:00
lyzno1 6943a379c9 feat: show placeholder '--' for invalid cron expressions in node display
- Return '--' placeholder when cron mode has empty or invalid expressions
- Prevents displaying fallback dates that confuse users
- Maintains consistent UX for invalid schedule configurations
2025-09-18 17:04:02 +08:00
lyzno1 e49534b70c fix: make frequency optional 2025-09-18 17:04:02 +08:00
lyzno1 344616ca2f fix: clear opposite mode data only when editing, preserve data during mode switching 2025-09-18 17:04:02 +08:00
lyzno1 0e287a9c93 chore: add missing translations 2025-09-18 13:25:57 +08:00
lyzno1 8141f53af5 fix: add preventDefaultSubmit prop to BaseForm to prevent unwanted page refresh on Enter key 2025-09-18 12:48:26 +08:00
lyzno1 5a6cb0d887 feat: enhance API key modal step indicator with active dots and improved styling 2025-09-18 12:44:11 +08:00
lyzno1 26e7677595 fix: align width and use rounded xl 2025-09-18 12:08:21 +08:00
yessenia 814b0e1fe8 feat: oauth config init 2025-09-18 00:00:50 +08:00
Harry a173dc5c9d feat(provider): add multiple option support in ProviderConfig
- Introduced a new field `multiple` in the `ProviderConfig` class to allow for multiple selections, enhancing the configuration capabilities for providers.
- This addition improves flexibility in provider settings and aligns with the evolving requirements for provider configurations.
2025-09-17 22:12:01 +08:00
Harry a567facf2b refactor(trigger): streamline encrypter creation in TriggerProviderService
- Replaced calls to `create_trigger_provider_encrypter` and `create_trigger_provider_oauth_encrypter` with a unified `create_provider_encrypter` method, simplifying the encrypter creation process.
- Updated the parameters passed to the new method to enhance configuration management and cache handling.

These changes improve code clarity and maintainability in the trigger provider service.
2025-09-17 21:47:11 +08:00
Harry e76d80defe fix(trigger): update client parameter handling in TriggerProviderService
- Modified the `create_provider_encrypter` call to include a cache assignment, ensuring proper management of encryption resources.
- Added a cache deletion step after updating client parameters, enhancing the integrity of the parameter handling process.

These changes improve the reliability of client parameter updates within the trigger provider service.
2025-09-17 20:57:52 +08:00
Harry 4a17025467 fix(trigger): update session management in TriggerProviderService
- Changed session management in `TriggerProviderService` from `autoflush=True` to `expire_on_commit=False` for improved control over session state.
- This change enhances the reliability of database interactions by preventing automatic expiration of objects after commit, ensuring data consistency during trigger operations.

These updates contribute to better session handling and stability in trigger-related functionalities.
2025-09-16 18:01:44 +08:00
Harry bd1fcd3525 feat(trigger): add TriggerProviderInfoApi and enhance trigger provider service
- Introduced `TriggerProviderInfoApi` to retrieve information for a specific trigger provider, improving API capabilities.
- Added `get_trigger_provider` method in `TriggerProviderService` to fetch trigger provider details, enhancing data retrieval.
- Updated route configurations to include the new API endpoint for trigger provider information.

These changes enhance the functionality and usability of trigger provider interactions within the application.
2025-09-16 17:03:52 +08:00
Harry 0cb0cea167 feat(trigger): enhance trigger plugin data structure and error handling
- Added `plugin_unique_identifier` to `PluginTriggerData` and `TriggerProviderApiEntity` to improve identification of trigger plugins.
- Introduced `PluginTriggerDispatchData` for structured dispatch data in Celery tasks, enhancing the clarity of trigger dispatching.
- Updated `dispatch_triggered_workflows_async` to utilize the new dispatch data structure, improving error handling and logging for trigger invocations.
- Enhanced metadata handling in `TriggerPluginNode` to include trigger information, aiding in debugging and tracking.

These changes improve the robustness and maintainability of trigger plugin interactions within the workflow system.
2025-09-16 15:39:40 +08:00
Harry ee68a685a7 fix(workflow): enforce non-nullable arguments in DraftWorkflowTriggerRunApi
- Updated the argument definitions in the DraftWorkflowTriggerRunApi to include `nullable=False` for `node_id`, `trigger_name`, and `subscription_id`. This change ensures that these fields are always provided in the request, improving the robustness of the API.

This fix enhances input validation and prevents potential errors related to missing arguments.
2025-09-16 11:25:16 +08:00
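Assuming flask-restful's reqparse, which the argument definitions suggest, the change looks roughly like this (abridged to the three fields named above):

```python
from flask_restful import reqparse

parser = reqparse.RequestParser()
# required=True rejects a missing key; nullable=False additionally rejects
# an explicit JSON null, so every field must carry a real value.
parser.add_argument("node_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("trigger_name", type=str, required=True, nullable=False, location="json")
parser.add_argument("subscription_id", type=str, required=True, nullable=False, location="json")
```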
Harry c78bd492af feat(trigger): add supported creation methods to TriggerProviderApiEntity
- Introduced a new field `supported_creation_methods` in `TriggerProviderApiEntity` to specify the available methods for creating triggers, including OAUTH, APIKEY, and MANUAL.
- Updated the `PluginTriggerProviderController` to populate this field based on the entity's schemas, enhancing the API's clarity and usability.

These changes improve the flexibility and configurability of trigger providers within the application.
2025-09-15 17:01:29 +08:00
Harry 6857bb4406 feat(trigger): implement plugin trigger synchronization and subscription management in workflow
- Added a new event handler for syncing plugin trigger relationships when a draft workflow is synced, ensuring that the database reflects the current state of plugin triggers.
- Introduced subscription management features in the frontend, allowing users to select, add, and remove subscriptions for trigger plugins.
- Updated various components to support subscription handling, including the addition of new UI elements for subscription selection and removal.
- Enhanced internationalization support by adding new translation keys related to subscription management.

These changes improve the overall functionality and user experience of trigger plugins within workflows.
2025-09-15 15:49:07 +08:00
Harry dcf3ee6982 fix(trigger): update trigger label assignment for improved clarity
- Changed the label assignment in the convertToTriggerWithProvider function from trigger.description.human to trigger.identity.label, ensuring the label reflects the correct identity format.

This update enhances the accuracy of trigger data representation in the application.
2025-09-15 14:50:56 +08:00
Harry 76850749e4 feat(trigger): enhance trigger debugging with polling API and new subscription retrieval
- Refactored DraftWorkflowTriggerNodeApi and DraftWorkflowTriggerRunApi to implement polling for trigger events instead of listening, improving responsiveness and reliability.
- Introduced TriggerSubscriptionBuilderGetApi to retrieve subscription instances for trigger providers, enhancing the API's capabilities.
- Removed deprecated trigger event classes and streamlined event handling in TriggerDebugService, ensuring a cleaner architecture.
- Updated Queue and Stream entities to reflect the changes in trigger event handling, improving overall clarity and maintainability.

These enhancements significantly improve the trigger debugging experience and API usability.
2025-09-14 19:12:31 +08:00
yessenia 91e5e33440 feat: add modal style opt 2025-09-12 20:22:33 +08:00
lyzno1 11e55088c9
fix: restore id prop passing to node children in BaseNode (#25520) 2025-09-11 17:54:31 +08:00
Harry 57c0bc9fb6 feat(trigger): refactor trigger debug event handling and improve response structures
- Renamed and refactored trigger debug event classes to enhance clarity and consistency, including changes from `TriggerDebugEventData` to `TriggerEventData` and related response classes.
- Updated `DraftWorkflowTriggerNodeApi` and `DraftWorkflowTriggerRunApi` to utilize the new event structures, improving the handling of trigger events.
- Removed the `TriggerDebugEventGenerator` class, consolidating event generation directly within the API logic for streamlined processing.
- Enhanced error handling and response formatting for trigger events, ensuring structured outputs for better integration and debugging.

This refactor improves the overall architecture of trigger debugging, making it more intuitive and maintainable.
2025-09-11 16:55:58 +08:00
Harry c3ebb22a4b feat(trigger): add workflows_in_use field to TriggerProviderSubscriptionApiEntity
- Introduced a new field `workflows_in_use` to the TriggerProviderSubscriptionApiEntity to track the number of workflows utilizing each subscription.
- Enhanced the TriggerProviderService to populate this field by querying the WorkflowPluginTrigger model for usage counts associated with each subscription.

This addition improves the visibility of subscription usage within the trigger provider context.
2025-09-11 16:55:58 +08:00
Harry 1562d00037 feat(trigger): implement trigger debugging functionality
- Added DraftWorkflowTriggerNodeApi and DraftWorkflowTriggerRunApi for debugging trigger nodes and workflows.
- Enhanced TriggerDebugService to manage trigger debugging sessions and event listening.
- Introduced structured event responses for trigger debugging, including listening started, received, node finished, and workflow started events.
- Updated Queue and Stream entities to support new trigger debug events.
- Refactored trigger input handling to streamline the process of creating inputs from trigger data.

This implementation improves the debugging capabilities for trigger nodes and workflows, providing clearer event handling and structured responses.
2025-09-11 16:55:58 +08:00
Harry e9e843b27d fix(tool): standardize tool naming across components
- Updated references from `trigger_name` to `tool_name` in multiple components for consistency.
- Adjusted type definitions to reflect the change in naming convention, enhancing clarity in the codebase.
2025-09-11 16:55:57 +08:00
Harry ec33b9908e fix(trigger): improve formatting of OAuth client response in TriggerOAuthClientManageApi
- Refactored the return statement in the TriggerOAuthClientManageApi to enhance readability and maintainability.
- Ensured consistent formatting of the response structure for better clarity in API responses.
2025-09-11 16:55:57 +08:00
yessenia 67004368d9 feat: sub card style 2025-09-11 16:22:59 +08:00
Stream 94ecbd44e4
feat: add API endpoint to extract plugin assets 2025-09-11 14:48:42 +08:00
Stream ba76312248
feat: adapt to plugin_daemon endpoint 2025-09-11 14:46:12 +08:00
yessenia 50bff270b6 feat: add subscription 2025-09-10 23:21:33 +08:00
Harry bd5cf1c272 fix(trigger): enhance OAuth client response in TriggerOAuthClientManageApi
- Integrated TriggerManager to retrieve the trigger provider's OAuth client schema.
- Updated the return structure to include the redirect URI and OAuth client schema for improved API response clarity.
2025-09-10 17:35:30 +08:00
Yeuoly d22404994a chore: add comments on generate_webhook_id 2025-09-10 17:23:29 +08:00
Yeuoly 9898730cc5 feat: add webhook node limit validation (max 5 per workflow)
- Add MAX_WEBHOOK_NODES_PER_WORKFLOW constant set to 5
- Validate webhook node count in sync_webhook_relationships method
- Raise ValueError when workflow exceeds webhook node limit
- Block workflow save when limit is exceeded to ensure data integrity
- Provide clear error message indicating current count and maximum allowed

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-10 17:22:09 +08:00
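A minimal sketch of the limit check described above; the node-shape lookup is an assumption:

```python
MAX_WEBHOOK_NODES_PER_WORKFLOW = 5   # constant from the commit above

def validate_webhook_node_count(graph_nodes: list[dict]) -> None:
    # Called during sync_webhook_relationships; raising blocks the save.
    count = sum(
        1 for n in graph_nodes
        if n.get("data", {}).get("type") == "trigger-webhook"  # node shape assumed
    )
    if count > MAX_WEBHOOK_NODES_PER_WORKFLOW:
        raise ValueError(
            f"workflow contains {count} webhook nodes; "
            f"at most {MAX_WEBHOOK_NODES_PER_WORKFLOW} are allowed"
        )
```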
Yeuoly b0f1e55a87 refactor: remove triggered_by field from webhook triggers and use automatic sync
- Remove triggered_by field from WorkflowWebhookTrigger model
- Replace manual webhook creation/deletion APIs with automatic sync via WebhookService
- Keep only GET API for retrieving webhook information
- Use same webhook ID for both debug and production environments (differentiated by endpoint)
- Add sync_webhook_relationships to automatically manage webhook lifecycle
- Update tests to remove triggered_by references
- Clean up unused imports and fix type checking issues

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-10 17:17:19 +08:00
Harry 6566824807 fix(trigger): update return type in TriggerSubscriptionBuilderService
- Changed the return type of the method in `TriggerSubscriptionBuilderService` from `SubscriptionBuilder` to `SubscriptionBuilderApiEntity` for improved clarity and alignment with API entity structures.
- Updated the return statement to utilize the new method for converting the builder to the API entity.
2025-09-10 15:48:32 +08:00
Harry 9249a2af0d fix(trigger): update event data publishing in TriggerDebugService
- Changed the event data publishing method in `TriggerDebugService` to use `model_dump()` for improved data structure handling when publishing to Redis Pub/Sub.
2025-09-10 15:48:32 +08:00
Yeuoly 112fc3b1d1 fix: clear schedule config when exporting data 2025-09-10 13:50:37 +08:00
Yeuoly 37299b3bd7 fix: rename migration 2025-09-10 13:41:50 +08:00
Yeuoly 8f65ce995a fix: migrations 2025-09-10 13:38:34 +08:00
诗浓 4a743e6dc1
feat: add workflow schedule trigger support (#24428)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-10 13:24:23 +08:00
lyzno1 07dda61929
fix/tooltip and onboarding ui (#25451) 2025-09-10 10:40:14 +08:00
Harry 0d8438ef40 fix(trigger): add 'trigger' category key to plugin constants to avoid errors 2025-09-10 10:34:33 +08:00
Yeuoly 96bb638969 fix: limits 2025-09-09 23:32:51 +08:00
lyzno1 e74962272e
fix: only workflow use trigger api (#25443) 2025-09-09 23:14:10 +08:00
Harry 5a15419baf feat(trigger): implement debug session capabilities for trigger nodes
- Added `DraftWorkflowTriggerNodeApi` to handle debugging of trigger nodes, allowing for real-time event listening and session management.
- Introduced `TriggerDebugService` for managing debug sessions and event dispatching using Redis Pub/Sub.
- Updated `TriggerService` to support dispatching events to debug sessions and refactored related methods for improved clarity and functionality.
- Enhanced data structures in `request.py` and `entities.py` to accommodate new debug event data requirements.

These changes significantly improve the debugging capabilities for trigger nodes in draft workflows, facilitating better development and troubleshooting processes.
2025-09-09 21:27:31 +08:00
Harry e8403977b9 feat(plugin): add triggers field to PluginDeclaration for enhanced functionality
- Introduced a new `triggers` field in the `PluginDeclaration` class to support trigger functionalities within plugins.
- This addition improves the integration of triggers in the plugin architecture, aligning with recent updates to the trigger entity structures.

These changes enhance the overall capabilities of the plugin system.
2025-09-09 17:22:11 +08:00
Harry add2ca85f2 refactor(trigger): update plugin and trigger entity structures
- Removed unnecessary newline in `TriggerPluginNode` class for consistency.
- Made `provider` in `TriggerIdentity` optional to enhance flexibility.
- Added `trigger` field to `PluginDeclaration` and updated `PluginCategory` to include `Trigger`, improving the integration of trigger functionalities within the plugin architecture.

These changes streamline the entity definitions and enhance the overall structure of the trigger and plugin components.
2025-09-09 17:16:44 +08:00
lyzno1 fbb7b02e90
fix(webhook): prevent SimpleSelect from resetting user selections (#25423)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 17:11:11 +08:00
lyzno1 249b62c9de
fix: workflow header (#25411) 2025-09-09 15:34:15 +08:00
lyzno1 b433322e8d
feat/trigger plugin apikey (#25388) 2025-09-09 15:01:06 +08:00
lyzno1 1c8850fc95
feat: adjust scroll to selected node position to top-left area (#25403) 2025-09-09 14:58:42 +08:00
Harry dc16f1b65a refactor(trigger): simplify provider path handling in workflow components
- Updated various components to directly use `provider.name` instead of constructing a path with `provider.plugin_id` and `provider.name`.
- Adjusted related calls to `invalidateSubscriptions` and other functions to reflect this change.

These modifications enhance code clarity and streamline the handling of provider information in the trigger plugin components.
2025-09-09 00:17:20 +08:00
Harry ff30395dc1 fix(OAuthClientConfigModal): simplify provider path handling in OAuth configuration
- Updated the provider path handling in `OAuthClientConfigModal` to directly use `provider.name` instead of constructing a path with `provider.plugin_id` and `provider.name`.
- Adjusted the corresponding calls to `invalidateOAuthConfig` and `configureTriggerOAuth` to reflect this change.

These modifications enhance code clarity and streamline the OAuth configuration process in the trigger plugin component.
2025-09-09 00:10:04 +08:00
Harry 8e600f3302 feat(trigger): optimize trigger parameter schema handling in useConfig
- Refactored the trigger parameter schema construction in `useConfig` to utilize a Map for improved efficiency and clarity.
- Updated the return value to ensure unique schema entries, enhancing the integrity of the trigger configuration.

These changes streamline the management of trigger parameters, improving performance and maintainability in the workflow component.
2025-09-08 23:39:44 +08:00
Harry 5a1e0a8379 feat(FormInputItem): enhance UI components for improved user experience
- Added loading indicator using `RiLoader4Line` to `FormInputItem` for better feedback during option fetching.
- Refactored button and option styles for improved accessibility and visual consistency.
- Updated text color classes to enhance readability based on loading state and selection.

These changes improve the overall user experience and visual clarity of the form input components.
2025-09-08 23:19:33 +08:00
Harry 2a3ce6baa9 feat(trigger): enhance plugin and trigger integration with updated naming conventions
- Refactored `PluginFetchDynamicSelectOptionsApi` to replace the `extra` argument with `credential_id`, improving clarity in dynamic option fetching.
- Updated `ProviderConfigEncrypter` to rename `mask_tool_credentials` to `mask_credentials` for consistency, and added a new method to maintain backward compatibility.
- Enhanced `PluginParameterService` to utilize `credential_id` for fetching subscriptions, improving the handling of trigger credentials.
- Adjusted various components and types in the frontend to replace `tool_name` with `trigger_name`, ensuring consistency across the application.
- Introduced `multiple` property in `TriggerParameter` to support multi-select functionality.

These changes improve the integration of triggers and plugins, enhance code clarity, and align naming conventions across the codebase.
2025-09-08 23:14:50 +08:00
Harry 01b2f9cff6 feat: add providerType prop to form components for dynamic behavior
- Introduced `providerType` prop in `FormInputItem`, `ToolForm`, and `ToolFormItem` components to support both 'tool' and 'trigger' types, enhancing flexibility in handling different provider scenarios.
- Updated the `useFetchDynamicOptions` function to accept `provider_type` as 'tool' | 'trigger', allowing for more dynamic option fetching based on the provider type.

These changes improve the adaptability of the form components and streamline the integration of different provider types in the workflow.
2025-09-08 18:29:48 +08:00
Harry ac38614171 refactor(trigger): streamline trigger provider verification and update imports
- Updated `TriggerSubscriptionBuilderVerifyApi` to directly return the result of `verify_trigger_subscription_builder`, improving clarity.
- Refactored import statement in `trigger_plugin/__init__.py` to point to the correct module, enhancing code organization.
- Removed the obsolete `node.py` file, cleaning up the codebase by eliminating unused components.

These changes enhance the maintainability and clarity of the trigger provider functionality.
2025-09-08 18:25:04 +08:00
Harry eb95c5cd07 feat(trigger): enhance subscription builder management and update API
- Introduced `SubscriptionBuilderUpdater` class to streamline updates to subscription builders, encapsulating properties like name, parameters, and credentials.
- Refactored API endpoints to utilize the new updater class, improving code clarity and maintainability.
- Adjusted OAuth handling to create and update subscription builders more effectively, ensuring proper credential management.

This change enhances the overall functionality and organization of the trigger subscription builder API.
2025-09-08 15:09:47 +08:00
lyzno1 a799b54b9e
feat: initialize trigger status at application level to prevent canvas refresh state issues (#25329) 2025-09-08 09:34:28 +08:00
lyzno1 98ba0236e6
feat: implement trigger plugin authentication UI (#25310) 2025-09-07 21:53:22 +08:00
lyzno1 b6c552df07
fix: add stable sorting for trigger list to prevent position changes (#25328) 2025-09-07 21:52:41 +08:00
lyzno1 e2827e475d
feat: implement trigger-plugin support with real-time status sync (#25326) 2025-09-07 21:29:53 +08:00
lyzno1 58cbd337b5
fix: improve test run menu and checklist ui (#25300) 2025-09-06 22:54:36 +08:00
lyzno1 a91e59d544
feat: implement trigger plugin frontend integration (#25283) 2025-09-06 16:18:46 +08:00
Harry 814787677a feat(trigger): update plugin trigger API and model to use trigger_name
- Modified `PluginTriggerApi` to accept `trigger_name` as a JSON argument and return encoded plugin triggers.
- Updated `WorkflowPluginTrigger` model to replace `trigger_id` with `trigger_name` for better clarity.
- Adjusted `WorkflowPluginTriggerService` to handle the new `trigger_name` field and ensure proper error handling for subscriptions.
- Enhanced `workflow_trigger_fields` to include `trigger_name` in the plugin trigger schema.

This change improves the API's clarity and aligns the model with the updated naming conventions.
2025-09-05 15:56:13 +08:00
Harry 85caa5bd0c fix(trigger): clean up whitespace in encryption utility and trigger provider service
- Removed unnecessary blank lines in `encryption.py` and `trigger_provider_service.py` for improved code readability.
- This minor adjustment enhances the overall code quality without altering functionality.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-05 15:56:13 +08:00
lyzno1 e04083fc0e
feat: add icon support for trigger plugin workflow nodes (#25241) 2025-09-05 15:50:54 +08:00
Harry cf532e5e0d feat(trigger): add context caching for trigger providers
- Add plugin_trigger_providers and plugin_trigger_providers_lock to contexts module
- Implement caching mechanism in TriggerManager.get_trigger_provider() method
- Cache fetched trigger providers to reduce repeated daemon calls
- Use double-check locking pattern for thread-safe cache access

This follows the same pattern as ToolManager.get_plugin_provider() to improve performance
by avoiding redundant requests to the daemon when accessing trigger providers.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-05 14:30:10 +08:00
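A self-contained sketch of the double-check locking pattern the commit names, with an in-process dict standing in for the real cache and daemon call:

```python
import threading

_provider_cache: dict[str, dict] = {}
_provider_lock = threading.Lock()

def fetch_provider_from_daemon(provider_id: str) -> dict:
    """Stand-in for the real plugin-daemon request."""
    return {"id": provider_id}

def get_trigger_provider(provider_id: str) -> dict:
    # First check without the lock keeps the hot, already-cached path cheap.
    if provider_id in _provider_cache:
        return _provider_cache[provider_id]
    with _provider_lock:
        # Second check under the lock: another thread may have populated
        # the cache while we waited, so the daemon is never hit twice.
        if provider_id not in _provider_cache:
            _provider_cache[provider_id] = fetch_provider_from_daemon(provider_id)
        return _provider_cache[provider_id]
```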
Harry c097fc2c48 refactor(trigger): add uuid import to trigger provider service
- Imported `uuid` in `trigger_provider_service.py` to support unique identifier generation.
- This change prepares the service for future enhancements that may require UUID functionality.
2025-09-05 14:30:10 +08:00
Harry 0371d71409 feat(trigger): enhance trigger subscription management and cache handling
- Added `name` parameter to `TriggerSubscriptionBuilderCreateApi` for better subscription identification.
- Implemented `delete_cache_for_subscription` function to clear cache associated with trigger subscriptions.
- Updated `WorkflowPluginTriggerService` to check for existing subscriptions before creating new plugin triggers, improving error handling.
- Refactored `TriggerProviderService` to utilize the new cache deletion method during provider deletion.

This improves the overall management of trigger subscriptions and enhances cache efficiency.
2025-09-05 14:30:10 +08:00
非法操作 81ef7343d4
chore: (trigger) refactor webhook service (#25229) 2025-09-05 14:00:20 +08:00
zhangxuhe1 8e4b59c90c
feat: improve trigger plugin UI layout and responsiveness (#25232) 2025-09-05 14:00:14 +08:00
非法操作 68f73410fc
chore: (trigger) add WEBHOOK_REQUEST_BODY_MAX_SIZE (#25217) 2025-09-05 12:23:11 +08:00
lyzno1 88af8ed374
fix: block selector ui (#25228) 2025-09-05 12:22:13 +08:00
Harry 015f82878e feat(trigger): integrate plugin icon retrieval into trigger provider
- Added `get_plugin_icon_url` method in `PluginService` to fetch plugin icons.
- Updated `PluginTriggerProviderController` to use the new method for icon handling.
- Refactored `ToolTransformService` to utilize `PluginService` for consistent icon URL generation.

This enhances the trigger provider's ability to manage plugin icons effectively.
2025-09-05 12:01:41 +08:00
Harry 3874e58dc2 refactor(trigger): enhance trigger provider deletion process and session management 2025-09-05 11:31:57 +08:00
lyzno1 9f8c159583
feat(trigger): implement trigger plugin block selector following tools pattern (#25204) 2025-09-05 10:20:47 +08:00
非法操作 d8f6f9ce19
chore: (trigger) change content type from form to application/octet-stream (#25167) 2025-09-05 09:54:07 +08:00
Harry eab03e63d4 refactor(trigger): rename request logs API and enhance logging functionality
- Renamed `TriggerSubscriptionBuilderRequestLogsApi` to `TriggerSubscriptionBuilderLogsApi` for clarity.
- Updated the API endpoint to retrieve logs for subscription builders.
- Enhanced logging functionality in `TriggerSubscriptionBuilderService` to append and list logs more effectively.
- Refactored trigger processing tasks to improve naming consistency and clarity in logging.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 21:11:25 +08:00
非法操作 461829274a
feat: (trigger) support file upload in webhook (#25159) 2025-09-04 18:33:42 +08:00
Harry e751c0c535 refactor(trigger): update trigger provider API and clean up unused classes
- Renamed the API endpoint for trigger providers from `/workspaces/current/trigger-providers` to `/workspaces/current/triggers` for consistency.
- Removed unused `TriggerProviderCredentialsCache` and `TriggerProviderOAuthClientParamsCache` classes to streamline the codebase.
- Enhanced the `TriggerProviderApiEntity` to include additional properties and improved the conversion logic in `PluginTriggerProviderController`.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 17:45:15 +08:00
lyzno1 1fffc79c32
fix: prevent empty workflow draft sync during page navigation (#25140) 2025-09-04 17:13:49 +08:00
非法操作 83fab4bc19
chore: (webhook) when content type changed clear the body variables (#25136) 2025-09-04 15:09:54 +08:00
Harry f60e28d2f5 feat(trigger): enhance user role validation and add request logs API for trigger providers
- Updated user role validation in PluginTriggerApi and WebhookTriggerApi to assert current_user as an Account and check tenant ID.
- Introduced TriggerSubscriptionBuilderRequestLogsApi to retrieve request logs for subscription instances, ensuring proper user authentication and error handling.
- Added new API endpoint for accessing request logs related to trigger providers.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 14:44:02 +08:00
Harry a62d7aa3ee feat(trigger): add plugin trigger workflow support and refactor trigger system
- Add new workflow plugin trigger service for managing plugin-based triggers
- Implement trigger provider encryption utilities for secure credential storage
- Add custom trigger errors module for better error handling
- Refactor trigger provider and manager classes for improved plugin integration
- Update API endpoints to support plugin trigger workflows
- Add database migration for plugin trigger workflow support

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-04 13:20:43 +08:00
非法操作 cc84a45244
chore: (webhook) use variable instead of InputVar (#25119) 2025-09-04 11:10:42 +08:00
cathy 5cf3d24018
fix(webhook): selected type ui style (#25106) 2025-09-04 10:59:08 +08:00
lyzno1 4bdbe617fe
fix: uuidv7 (#25097) 2025-09-04 08:44:14 +08:00
lyzno1 33c867fd8c
feat(workflow): enhance webhook status code input with increment/decrement controls (#25099) 2025-09-03 22:26:00 +08:00
非法操作 2013ceb9d2
chore: validate param type of application/json when call a webhook (#25074) 2025-09-03 15:49:07 +08:00
非法操作 7120c6414c
fix: content type of webhook (#25032) 2025-09-03 15:13:01 +08:00
Harry 5ce7b2d98d refactor(migrations): remove obsolete plugin_trigger migration file
- Deleted the plugin_trigger migration file as it is no longer needed in the codebase.
- Updated model imports in `__init__.py` to include new trigger-related classes for better organization.
2025-09-03 15:02:17 +08:00
Harry cb82198271 refactor(trigger): update trigger provider classes and API endpoints
- Renamed classes for trigger subscription management to improve clarity, including TriggerProviderSubscriptionListApi to TriggerSubscriptionListApi and TriggerSubscriptionsDeleteApi to TriggerSubscriptionDeleteApi.
- Updated API endpoint paths to reflect the new naming conventions for trigger subscriptions.
- Removed deprecated TriggerOAuthRefreshTokenApi class to streamline the codebase.
- Added trigger_providers import to the console controller for better organization.
2025-09-03 14:53:27 +08:00
Harry 5e5ffaa416 feat(tool-form): add extraParams prop to ToolForm and ToolFormItem components
- Introduced extraParams prop to both ToolForm and ToolFormItem components for enhanced flexibility in passing additional parameters.
- Updated component usage to accommodate the new prop, improving the overall functionality of the tool forms.
2025-09-03 14:53:27 +08:00
Harry 4b253e1f73 feat(trigger): plugin trigger workflow 2025-09-03 14:53:27 +08:00
Harry dd929dbf0e fix(dynamic_select): implement function 2025-09-03 14:53:27 +08:00
Harry 97a9d34e96 feat(trigger): introduce plugin trigger management and enhance trigger processing
- Remove the debug endpoint for cleaner API structure
- Add support for TRIGGER_PLUGIN in NodeType enumeration
- Implement WorkflowPluginTrigger model to map plugin triggers to workflow nodes
- Enhance TriggerService to process plugin triggers and store trigger data in Redis
- Update node mapping to include TriggerPluginNode for workflow execution

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry 602070ec9c refactor(trigger): improve method signature formatting in TriggerService
- Adjust the formatting of the `process_triggered_workflows` method signature for better readability
- Ensure consistent style across method definitions in the TriggerService class

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry afd8989150 feat(trigger): introduce subscription builder and enhance trigger management
- Refactor trigger provider classes to improve naming consistency, including renaming classes for subscription management
- Implement new TriggerSubscriptionBuilderService for creating and verifying subscription builders
- Update API endpoints to support subscription builder creation and verification
- Enhance data models to include new attributes for subscription builders
- Remove the deprecated TriggerSubscriptionValidationService to streamline the codebase

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry 694197a701 refactor(trigger): clean up imports and optimize trigger-related code
- Remove unused imports in trigger-related files for better clarity and maintainability
- Streamline import statements across various modules to enhance code quality

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry 2f08306695 feat(trigger): enhance trigger subscription management and processing
- Refactor trigger provider classes to improve naming consistency and clarity
- Introduce new methods for managing trigger subscriptions, including validation and dispatching
- Update API endpoints to reflect changes in subscription handling
- Implement logging and request management for endpoint interactions
- Enhance data models to support subscription attributes and lifecycle management

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry 6acc77d86d feat(trigger): refactor trigger provider to subscription model
- Rename classes and methods to reflect the transition from credentials to subscriptions
- Update API endpoints for managing trigger subscriptions
- Modify data models and entities to support subscription attributes
- Enhance service methods for listing, adding, updating, and deleting subscriptions
- Adjust encryption utilities to handle subscription data

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry 5ddd5e49ee feat(trigger): enhance subscription schema and provider configuration
- Update ProviderConfig to allow a list as a default value
- Introduce SubscriptionSchema for better organization of subscription-related configurations
- Modify TriggerProviderApiEntity to use Optional for subscription_schema
- Add custom_model_schema to TriggerProviderEntity for additional configuration options

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry 72f9e77368 refactor(trigger): clean up and optimize trigger-related code
- Remove unused classes and imports in encryption utilities
- Simplify method signatures for better readability
- Enhance code quality by adding newlines for clarity
- Update tests to reflect changes in import paths

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:26 +08:00
Harry a46c9238fa feat(trigger): implement complete OAuth authorization flow for trigger providers
- Add OAuth authorization URL generation API endpoint
- Implement OAuth callback handler for credential storage
- Support both system-level and tenant-level OAuth clients
- Add trigger provider credential encryption utilities
- Refactor trigger entities into separate modules
- Update trigger provider service with OAuth client management
- Add credential cache for trigger providers

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-03 14:53:26 +08:00
Harry 87120ad4ac feat(trigger): add trigger provider management and webhook handling functionality 2025-09-03 14:53:26 +08:00
非法操作 7544b5ec9a
fix: delete var of webhook (#25038) 2025-09-03 14:49:56 +08:00
非法操作 ff4a62d1e7
chore: limit webhook status code 200~399 (#25045) 2025-09-03 14:48:18 +08:00
lyzno1 41daa51988
fix: missing key for translation path (#25059) 2025-09-03 14:43:40 +08:00
cathy d522350c99
fix(webhook-trigger): request array type adjustment (#25005) 2025-09-02 23:20:12 +08:00
lyzno1 1d1bb9451e
fix: prevent workflow canvas clearing due to race condition and viewport errors (#25003) 2025-09-02 20:53:44 +08:00
lyzno1 1fce1a61d4
feat(workflow-log): enhance workflow logs UI with sorting and status filters (#24978) 2025-09-02 16:43:11 +08:00
非法操作 883a6caf96
feat: add triggered-by to app log (#24973) 2025-09-02 16:04:08 +08:00
非法操作 a239c39f09
fix: webhook http method should be case insensitive (#24957) 2025-09-02 14:47:24 +08:00
lyzno1 e925a8ab99
fix(app-cards): restrict toggle enable to Start nodes only (#24918) 2025-09-01 22:52:23 +08:00
Yeuoly bccaf939e6 fix: migrations 2025-09-01 18:07:21 +08:00
Yeuoly 676648e0b3 Merge branch 'main' into feat/trigger 2025-09-01 18:05:31 +08:00
cathy 4ae19e6dde
fix(webhook-trigger): remove error handling (#24902) 2025-09-01 17:11:49 +08:00
非法操作 4d0ff5c281
feat: implement variable synchronization for webhook node (#24874)
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-01 16:58:06 +08:00
lyzno1 327b354cc2
refactor: unify trigger node architecture and clean up technical debt (#24886)
Co-authored-by: hjlarry <hjlarry@163.com>
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-01 15:47:44 +08:00
lyzno1 6d307cc9fc
Fix test run shortcut consistency and improve dropdown styling (#24849) 2025-09-01 14:47:21 +08:00
lyzno1 adc7134af5
fix: improve TimePicker footer layout and button styling (#24831) 2025-09-01 13:34:53 +08:00
cathy 10f19cd0c2
fix(webhook): add content-type aware parameter type handling (#24865) 2025-09-01 10:06:26 +08:00
lyzno1 9ed45594c6
fix: improve schedule trigger and quick settings app-operation btns ui (#24843) 2025-08-31 16:59:49 +08:00
非法操作 c138f4c3a6
fix: check AppTrigger status before webhook execution (#24829)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-30 16:40:21 +08:00
lyzno1 a35be05790
Fix workflow card toggle logic and implement minimal state UI (#24822) 2025-08-30 16:35:34 +08:00
lyzno1 60b5ed8e5d
fix: enhance webhook trigger panel UI consistency and user experience (#24780) 2025-08-29 17:41:42 +08:00
lyzno1 d8ddbc4d87
feat: enhance webhook trigger panel UI consistency and interactivity (#24759)
Co-authored-by: hjlarry <hjlarry@163.com>
2025-08-29 14:24:23 +08:00
非法操作 19c0fc85e2
feat: when add/delete webhook trigger call the API (#24755) 2025-08-29 14:23:50 +08:00
lyzno1 a58df35ead
fix: improve trigger card layout spacing and remove dividers (#24756) 2025-08-29 13:37:44 +08:00
lyzno1 9789bd02d8
feat: implement trigger card component with auto-refresh (#24743) 2025-08-29 11:57:08 +08:00
lyzno1 d94e54923f
Improve tooltip design for trigger blocks (#24724) 2025-08-28 23:18:00 +08:00
lyzno1 64c7be59b7
Improve workflow block selector search functionality (#24707) 2025-08-28 17:21:34 +08:00
非法操作 89ad6ad902
feat: add app trigger list api (#24693) 2025-08-28 15:23:08 +08:00
lyzno1 4f73bc9693
fix(schedule): add time logic to weekly frequency mode for consistent behavior with daily mode (#24673) 2025-08-28 14:40:11 +08:00
lyzno1 add6b79231
UI enhancements for workflow checklist component (#24647) 2025-08-28 10:10:10 +08:00
lyzno1 c90dad566f
feat: enhance workflow error handling and internationalization (#24648) 2025-08-28 09:41:22 +08:00
lyzno1 5cbe6bf8f8
fix(schedule): correct weekly frequency weekday calculation algorithm (#24641) 2025-08-27 18:20:09 +08:00
Yeuoly 4ef6ff217e
fix: improve code quality in webhook services and controllers (#24634)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-27 17:50:51 +08:00
lyzno1 87abfbf515
Allow empty workflows and improve workflow validation (#24627) 2025-08-27 17:49:09 +08:00
lyzno1 73e65fd838
feat: align trigger webhook style with schedule node and fix selection border truncation (#24635) 2025-08-27 17:47:14 +08:00
Yeuoly e53edb0fc2
refactor: optimize TenantDailyRateLimiter to use UTC internally with timezone-aware error messages (#24632)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-27 17:35:04 +08:00
非法操作 17908fbf6b
fix: only workflow should display start modal (#24623) 2025-08-27 16:20:31 +08:00
zhangxuhe1 3dae108f84
refactor(sidebar): Restructure app operations with toggle functionality (#24625) 2025-08-27 16:20:17 +08:00
lyzno1 5bbf685035
feat: fix i18n missing keys and merge upstream/main (#24615)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: GuanMu <ballmanjq@gmail.com>
Co-authored-by: Davide Delbianco <davide.delbianco@outlook.com>
Co-authored-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: Yongtao Huang <99629139+hyongtao-db@users.noreply.github.com>
Co-authored-by: Qiang Lee <18018968632@163.com>
Co-authored-by: 李强04 <liqiang04@gaotu.cn>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
Co-authored-by: Matri Qi <matrixdom@126.com>
Co-authored-by: huayaoyue6 <huayaoyue@163.com>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
Co-authored-by: znn <jubinkumarsoni@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: yihong <zouzou0208@gmail.com>
Co-authored-by: Muke Wang <shaodwaaron@gmail.com>
Co-authored-by: wangmuke <wangmuke@kingsware.cn>
Co-authored-by: Wu Tianwei <30284043+WTW0313@users.noreply.github.com>
Co-authored-by: quicksand <quicksandzn@gmail.com>
Co-authored-by: 非法操作 <hjlarry@163.com>
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
Co-authored-by: Eric Guo <eric.guocz@gmail.com>
Co-authored-by: Zhedong Cen <cenzhedong2@126.com>
Co-authored-by: jiangbo721 <jiangbo721@163.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: hjlarry <25834719+hjlarry@users.noreply.github.com>
Co-authored-by: lxsummer <35754229+lxjustdoit@users.noreply.github.com>
Co-authored-by: 湛露先生 <zhanluxianshen@163.com>
Co-authored-by: Guangdong Liu <liugddx@gmail.com>
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Yessenia-d <yessenia.contact@gmail.com>
Co-authored-by: huangzhuo1949 <167434202+huangzhuo1949@users.noreply.github.com>
Co-authored-by: huangzhuo <huangzhuo1@xiaomi.com>
Co-authored-by: 17hz <0x149527@gmail.com>
Co-authored-by: Amy <1530140574@qq.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: Nite Knite <nkCoding@gmail.com>
Co-authored-by: Yeuoly <45712896+Yeuoly@users.noreply.github.com>
Co-authored-by: Petrus Han <petrus.hanks@gmail.com>
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
Co-authored-by: Kalo Chin <frog.beepers.0n@icloud.com>
Co-authored-by: Ujjwal Maurya <ujjwalsbx@gmail.com>
Co-authored-by: Maries <xh001x@hotmail.com>
2025-08-27 15:07:28 +08:00
非法操作 a63d1e87b1
feat: webhook trigger backend api (#24387)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-27 14:42:45 +08:00
lyzno1 7129de98cd
feat: implement workflow onboarding modal system (#24551) 2025-08-27 13:31:22 +08:00
非法操作 2984dbc0df
fix: when workflow has no start node, service api can't be opened (#24564) 2025-08-26 18:06:11 +08:00
非法操作 392db7f611
fix: workflow with only a trigger node can't be saved (#24546) 2025-08-26 16:41:47 +08:00
lyzno1 5a427b8daa
refactor: rename RunAllTriggers icon to TriggerAll for semantic clarity (#24478)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-25 17:51:04 +08:00
Yeuoly 18f2e6f166
refactor: Use specific error types for workflow execution (#24475)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-25 16:19:12 +08:00
lyzno1 e78903302f
feat(trigger-schedule): simplify timezone handling with user-centric approach (#24401) 2025-08-24 21:03:59 +08:00
cathy 4084ade86c
refactor(trigger-webhook): remove redundant WebhookParam type and sim… (#24390) 2025-08-24 00:21:47 +08:00
cathy 6b0d919dbd
feat: webhook trigger frontend (#24311) 2025-08-23 23:54:41 +08:00
Yeuoly a7b558b38b
feat/trigger: support specifying root node (#24388)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-23 20:44:03 +08:00
Yeuoly 6aed7e3ff4
feat/trigger universal entry (#24358)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-23 20:18:08 +08:00
lyzno1 8e93a8a2e2
refactor: comprehensive schedule trigger component redesign (#24359)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-23 11:03:18 +08:00
Yeuoly e38a86e37b Merge branch 'main' into feat/trigger 2025-08-22 20:11:49 +08:00
lyzno1 392e3530bf
feat: replace mock data with dynamic workflow options in test run dropdown (#24320) 2025-08-22 16:36:09 +08:00
lyzno1 833c902b2b
feat(workflow): Plugin Trigger Node with Unified Entry Node System (#24205) 2025-08-20 23:49:10 +08:00
lyzno1 6eaea64b3f
feat: implement multi-select monthly trigger schedule (#24247) 2025-08-20 06:23:30 -07:00
lyzno1 5303b50737
fix: initialize recur fields when switching to hourly frequency (#24181) 2025-08-20 09:32:05 +08:00
lyzno1 6acbcfe679
UI improvements: fix translation and custom icons for schedule trigger (#24167) 2025-08-19 18:27:07 +08:00
lyzno1 16ef5ebb97
fix: remove duplicate weekdays keys in i18n workflow files (#24157) 2025-08-19 14:55:16 +08:00
lyzno1 acfb95f9c2
Refactor Start node UI to User Input and optimize EntryNodeContainer (#24156) 2025-08-19 14:40:24 +08:00
lyzno1 aacea166d7
fix: resolve merge conflict between Features removal and validation enhancement (#24150) 2025-08-19 13:47:38 +08:00
lyzno1 f7bb3b852a
feat: implement Schedule Trigger validation with multi-start node topology support (#24134) 2025-08-19 11:55:15 +08:00
lyzno1 d4ff1e031a
Remove workflow features button (#24085) 2025-08-19 09:32:07 +08:00
lyzno1 6a3d135d49
fix: simplify trigger-schedule hourly mode calculation and improve UI consistency (#24082)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-18 23:37:57 +08:00
lyzno1 5c4bf7aabd
feat: Test Run dropdown with dynamic trigger selection (#24113) 2025-08-18 17:46:36 +08:00
lyzno1 e9c7dc7464
feat: update workflow run button to Test Run with keyboard shortcut (#24071) 2025-08-18 10:44:17 +08:00
lyzno1 74ad21b145
feat: comprehensive trigger node system with Schedule Trigger implementation (#24039)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-18 09:23:16 +08:00
lyzno1 f214eeb7b1
feat: add scroll to selected node button in workflow header (#24030)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-16 19:26:44 +08:00
lyzno1 ae25f90f34
Replace export button with more actions button in workflow control panel (#24033) 2025-08-16 19:25:18 +08:00
729 changed files with 42335 additions and 6471 deletions

.devcontainer/post_create_command.sh

@@ -1,6 +1,7 @@
#!/bin/bash
WORKSPACE_ROOT=$(pwd)
npm add -g pnpm@10.15.0
corepack enable
cd web && pnpm install
pipx install uv

.github/workflows/autofix.yml

@@ -2,6 +2,8 @@ name: autofix.ci
on:
pull_request:
branches: ["main"]
push:
branches: ["main"]
permissions:
contents: read

.gitignore (8 changes)

@@ -6,6 +6,9 @@ __pycache__/
# C extensions
*.so
# *db files
*.db
# Distribution / packaging
.Python
build/
@@ -235,4 +238,7 @@ scripts/stress-test/reports/
# mcp
.playwright-mcp/
.serena/
# settings
*.local.json


@@ -27,6 +27,9 @@ FILES_URL=http://localhost:5001
# Example: INTERNAL_FILES_URL=http://api:5001
INTERNAL_FILES_URL=http://127.0.0.1:5001
# TRIGGER URL
TRIGGER_URL=http://localhost:5001
# The time in seconds after which the signature is rejected
FILES_ACCESS_TIMEOUT=300
@@ -460,6 +463,9 @@ HTTP_REQUEST_NODE_MAX_BINARY_SIZE=10485760
HTTP_REQUEST_NODE_MAX_TEXT_SIZE=1048576
HTTP_REQUEST_NODE_SSL_VERIFY=True
# Webhook request configuration
WEBHOOK_REQUEST_BODY_MAX_SIZE=10485760
# Respect X-* headers to redirect clients
RESPECT_XFORWARD_HEADERS_ENABLED=false
@@ -537,6 +543,12 @@ ENABLE_CLEAN_MESSAGES=false
ENABLE_MAIL_CLEAN_DOCUMENT_NOTIFY_TASK=false
ENABLE_DATASETS_QUEUE_MONITOR=false
ENABLE_CHECK_UPGRADABLE_PLUGIN_TASK=true
ENABLE_WORKFLOW_SCHEDULE_POLLER_TASK=true
# Interval time in minutes for polling scheduled workflows (default: 1 min)
WORKFLOW_SCHEDULE_POLLER_INTERVAL=1
WORKFLOW_SCHEDULE_POLLER_BATCH_SIZE=100
# Maximum number of scheduled workflows to dispatch per tick (0 for unlimited)
WORKFLOW_SCHEDULE_MAX_DISPATCH_PER_TICK=0
# Position configuration
POSITION_TOOL_PINS=


@@ -54,7 +54,7 @@
"--loglevel",
"DEBUG",
"-Q",
"dataset,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,priority_pipeline,pipeline"
"dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor"
]
}
]

api/AGENTS.md (new file, 62 lines)

@@ -0,0 +1,62 @@
# Agent Skill Index
Start with the section that best matches your need. Each entry lists the problems it solves plus key files/concepts so you know what to expect before opening it.
______________________________________________________________________
## Platform Foundations
- **[Infrastructure Overview](agent_skills/infra.md)**\
When to read this:
- You need to understand where a feature belongs in the architecture.
- You're wiring storage, Redis, vector stores, or OTEL.
- You're about to add CLI commands or async jobs.\
What it covers: configuration stack (`configs/app_config.py`, remote settings), storage entry points (`extensions/ext_storage.py`, `core/file/file_manager.py`), Redis conventions (`extensions/ext_redis.py`), plugin runtime topology, vector-store factory (`core/rag/datasource/vdb/*`), observability hooks, SSRF proxy usage, and core CLI commands.
- **[Coding Style](agent_skills/coding_style.md)**\
When to read this:
- You're writing or reviewing backend code and need the authoritative checklist.
- You're unsure about Pydantic validators, SQLAlchemy session usage, or logging patterns.
- You want the exact lint/type/test commands used in PRs.\
Includes: Ruff & BasedPyright commands, no-annotation policy, session examples (`with Session(db.engine, ...)`), `@field_validator` usage, logging expectations, and the rule set for file size, helpers, and package management.
______________________________________________________________________
## Plugin & Extension Development
- **[Plugin Systems](agent_skills/plugin.md)**\
When to read this:
- You're building or debugging a marketplace plugin.
- You need to know how manifests, providers, daemons, and migrations fit together.\
What it covers: plugin manifests (`core/plugin/entities/plugin.py`), installation/upgrade flows (`services/plugin/plugin_service.py`, CLI commands), runtime adapters (`core/plugin/impl/*` for tool/model/datasource/trigger/endpoint/agent), daemon coordination (`core/plugin/entities/plugin_daemon.py`), and how provider registries surface capabilities to the rest of the platform.
- **[Plugin OAuth](agent_skills/plugin_oauth.md)**\
When to read this:
- You must integrate OAuth for a plugin or datasource.
- Youre handling credential encryption or refresh flows.\
Topics: credential storage, encryption helpers (`core/helper/provider_encryption.py`), OAuth client bootstrap (`services/plugin/oauth_service.py`, `services/plugin/plugin_parameter_service.py`), and how console/API layers expose the flows.
______________________________________________________________________
## Workflow Entry & Execution
- **[Trigger Concepts](agent_skills/trigger.md)**\
When to read this:
- You're debugging why a workflow didn't start.
- You're adding a new trigger type or hook.
- You need to trace async execution, draft debugging, or webhook/schedule pipelines.\
Details: Start-node taxonomy, webhook & schedule internals (`core/workflow/nodes/trigger_*`, `services/trigger/*`), async orchestration (`services/async_workflow_service.py`, Celery queues), debug event bus, and storage/logging interactions.
______________________________________________________________________
## Additional Notes for Agents
- All skill docs assume you follow the coding style guide—run Ruff/BasedPyright/tests listed there before submitting changes.
- When you cannot find an answer in these briefs, search the codebase using the paths referenced (e.g., `core/plugin/impl/tool.py`, `services/dataset_service.py`).
- If you run into cross-cutting concerns (tenancy, configuration, storage), check the infrastructure guide first; it links to most supporting modules.
- Keep multi-tenancy and configuration central: everything flows through `configs.dify_config` and `tenant_id`.
- When touching plugins or triggers, consult both the system overview and the specialised doc to ensure you adjust lifecycle, storage, and observability consistently.

api/agent_skills/coding_style.md (new file, 115 lines)

@@ -0,0 +1,115 @@
## Linter
- Always follow `.ruff.toml`.
- Run `uv run ruff check --fix --unsafe-fixes`.
- Keep each line under 100 characters (including spaces).
## Code Style
- `snake_case` for variables and functions.
- `PascalCase` for classes.
- `UPPER_CASE` for constants.
## Rules
- Use Pydantic v2 standard.
- Use `uv` for package management.
- Do not override dunder methods like `__init__`, `__iadd__`, etc.
- Never launch services (`uv run app.py`, `flask run`, etc.); running tests under `tests/` is allowed.
- Prefer simple functions over classes for lightweight helpers.
- Keep files below 800 lines; split when necessary.
- Keep code readable—no clever hacks.
- Never use `print`; log with `logger = logging.getLogger(__name__)`.
## Guiding Principles
- Mirror the project's layered architecture: controller → service → core/domain.
- Reuse existing helpers in `core/`, `services/`, and `libs/` before creating new abstractions.
- Optimise for observability: deterministic control flow, clear logging, actionable errors.
## SQLAlchemy Patterns
- Models inherit from `models.base.Base`; never create ad-hoc metadata or engines.
- Open sessions with context managers:
```python
from sqlalchemy import select
from sqlalchemy.orm import Session

with Session(db.engine, expire_on_commit=False) as session:
    stmt = select(Workflow).where(
        Workflow.id == workflow_id,
        Workflow.tenant_id == tenant_id,
    )
    workflow = session.execute(stmt).scalar_one_or_none()
```
- Use SQLAlchemy expressions; avoid raw SQL unless necessary.
- Introduce repository abstractions only for very large tables (e.g., workflow executions) to support alternative storage strategies.
- Always scope queries by `tenant_id` and protect write paths with safeguards (`FOR UPDATE`, row counts, etc.).
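A minimal sketch combining the last two rules (tenant scoping plus a `FOR UPDATE` safeguard), reusing the `db.engine` and `Workflow` names from the session example above:

```python
from sqlalchemy import select
from sqlalchemy.orm import Session


def rename_workflow(workflow_id: str, tenant_id: str, new_name: str) -> bool:
    with Session(db.engine, expire_on_commit=False) as session:
        stmt = (
            select(Workflow)
            .where(Workflow.id == workflow_id, Workflow.tenant_id == tenant_id)
            .with_for_update()  # row lock protects the read-modify-write
        )
        workflow = session.execute(stmt).scalar_one_or_none()
        if workflow is None:
            return False  # not visible in this tenant; never touch other tenants' rows
        workflow.name = new_name
        session.commit()
        return True
```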
## Storage & External IO
- Access storage via `extensions.ext_storage.storage`.
- Use `core.helper.ssrf_proxy` for outbound HTTP fetches.
- Background tasks that touch storage must be idempotent and log the relevant object identifiers.
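A sketch of an idempotent background task following these rules; the exact `storage` and `ssrf_proxy` call signatures are assumptions based on the descriptions above:

```python
import logging

from core.helper import ssrf_proxy          # assumed: exposes an httpx-style get()
from extensions.ext_storage import storage  # assumed: exists(key) / save(key, data)

logger = logging.getLogger(__name__)


def fetch_remote_file(tenant_id: str, url: str, storage_key: str) -> None:
    # Idempotent: retrying after a crash is a no-op once the object exists.
    if storage.exists(storage_key):
        logger.info("object already stored, skipping: tenant=%s key=%s", tenant_id, storage_key)
        return
    response = ssrf_proxy.get(url)  # outbound HTTP goes through the SSRF-safe client
    response.raise_for_status()
    storage.save(storage_key, response.content)
    logger.info("stored remote file: tenant=%s key=%s", tenant_id, storage_key)
```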
## Pydantic Usage
- Define DTOs with Pydantic v2 models and forbid extras by default.
- Use `@field_validator` / `@model_validator` for domain rules.
- Example:
```python
from pydantic import BaseModel, ConfigDict, HttpUrl, field_validator


class TriggerConfig(BaseModel):
    endpoint: HttpUrl
    secret: str

    model_config = ConfigDict(extra="forbid")

    @field_validator("secret")
    @classmethod
    def ensure_secret_prefix(cls, value: str) -> str:
        if not value.startswith("dify_"):
            raise ValueError("secret must start with dify_")
        return value
```
## Generics & Protocols
- Use `typing.Protocol` to define behavioural contracts (e.g., cache interfaces).
- Apply generics (`TypeVar`, `Generic`) for reusable utilities like caches or providers.
- Validate dynamic inputs at runtime when generics cannot enforce safety alone.
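A compact illustration of all three rules, with a `Protocol` contract, a generic implementation, and a runtime check where the type system cannot help:

```python
from typing import Generic, Protocol, TypeVar

T = TypeVar("T")


class Cache(Protocol[T]):
    """Behavioural contract: anything with get/put can serve as a cache."""

    def get(self, key: str) -> T | None: ...
    def put(self, key: str, value: T) -> None: ...


class InMemoryCache(Generic[T]):
    def __init__(self) -> None:
        self._data: dict[str, T] = {}

    def get(self, key: str) -> T | None:
        return self._data.get(key)

    def put(self, key: str, value: T) -> None:
        self._data[key] = value


def warm(cache: Cache[str], key: str, value: object) -> None:
    if not isinstance(value, str):  # runtime validation for dynamic input
        raise TypeError("expected a str value")
    cache.put(key, value)
```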
## Error Handling & Logging
- Raise domain-specific exceptions (`services/errors`, `core/errors`) and translate to HTTP responses in controllers.
- Declare `logger = logging.getLogger(__name__)` at module top.
- Include tenant/app/workflow identifiers in log context.
- Log retryable events at `warning`, terminal failures at `error`.
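Put together, these rules look something like the following sketch; the exception and identifiers are illustrative, not an existing module:

```python
import logging

logger = logging.getLogger(__name__)  # declared at module top


class TriggerNotFoundError(Exception):
    """Domain-specific error; a controller translates this into an HTTP 404."""


def enable_trigger(tenant_id: str, app_id: str, trigger_id: str, repo: dict[str, dict]) -> None:
    trigger = repo.get(trigger_id)
    if trigger is None:
        # terminal failure: log at error with full identifier context
        logger.error("trigger missing: tenant=%s app=%s trigger=%s", tenant_id, app_id, trigger_id)
        raise TriggerNotFoundError(trigger_id)
    if trigger.get("stale"):
        # retryable condition: log at warning and let the caller retry
        logger.warning("stale trigger, retrying later: tenant=%s app=%s trigger=%s", tenant_id, app_id, trigger_id)
```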
## Tooling & Checks
- Format/lint: `uv run --project api --dev ruff format ./api` and `uv run --project api --dev ruff check --fix --unsafe-fixes ./api`.
- Type checks: `uv run --directory api --dev basedpyright`.
- Tests: `uv run --project api --dev dev/pytest/pytest_unit_tests.sh`.
- Run all of the above before submitting your work.
## Controllers & Services
- Controllers: parse input via Pydantic, invoke services, return serialised responses; no business logic.
- Services: coordinate repositories, providers, background tasks; keep side effects explicit.
- Avoid repositories unless necessary; direct SQLAlchemy usage is preferred for typical tables.
- Document non-obvious behaviour with concise comments.
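An illustrative controller/service split under these rules (the names are hypothetical, and the HTTP layer is elided):

```python
from pydantic import BaseModel, ConfigDict


class CreateTriggerRequest(BaseModel):
    model_config = ConfigDict(extra="forbid")

    name: str
    provider: str


class TriggerService:
    def create_trigger(self, tenant_id: str, payload: CreateTriggerRequest) -> dict:
        # business rules live here: tenancy, quotas, persistence, side effects
        return {"tenant_id": tenant_id, "name": payload.name, "provider": payload.provider}


def create_trigger_controller(tenant_id: str, raw_body: dict) -> dict:
    # controller: parse input via Pydantic, invoke the service, serialise the result
    payload = CreateTriggerRequest.model_validate(raw_body)
    return TriggerService().create_trigger(tenant_id, payload)
```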
## Miscellaneous
- Use `configs.dify_config` for configuration—never read environment variables directly.
- Maintain tenant awareness end-to-end; `tenant_id` must flow through every layer touching shared resources.
- Queue async work through `services/async_workflow_service`; implement tasks under `tasks/` with explicit queue selection.
- Keep experimental scripts under `dev/`; do not ship them in production builds.

api/agent_skills/infra.md (new file, 96 lines)

@@ -0,0 +1,96 @@
## Configuration
- Import `configs.dify_config` for every runtime toggle. Do not read environment variables directly.
- Add new settings to the proper mixin inside `configs/` (deployment, feature, middleware, etc.) so they load through `DifyConfig`.
- Remote overrides come from the optional providers in `configs/remote_settings_sources`; keep defaults in code safe when the value is missing.
- Example: logging pulls targets from `extensions/ext_logging.py`, and model provider URLs are assembled in `services/entities/model_provider_entities.py`.
## Dependencies
- Runtime dependencies live in `[project].dependencies` inside `pyproject.toml`. Optional clients go into the `storage`, `tools`, or `vdb` groups under `[dependency-groups]`.
- Always pin versions and keep the list alphabetised. Shared tooling (lint, typing, pytest) belongs in the `dev` group.
- When code needs a new package, explain why in the PR and run `uv lock` so the lockfile stays current.
## Storage & Files
- Use `extensions.ext_storage.storage` for all blob IO; it already respects the configured backend.
- Convert files for workflows with helpers in `core/file/file_manager.py`; they handle signed URLs and multimodal payloads.
- When writing controller logic, delegate upload quotas and metadata to `services/file_service.py` instead of touching storage directly.
- All outbound HTTP fetches (webhooks, remote files) must go through the SSRF-safe client in `core/helper/ssrf_proxy.py`; it wraps `httpx` with the allow/deny rules configured for the platform.
## Redis & Shared State
- Access Redis through `extensions.ext_redis.redis_client`. For locking, reuse `redis_client.lock`.
- Prefer higher-level helpers when available: rate limits use `libs.helper.RateLimiter`, provider metadata uses caches in `core/helper/provider_cache.py`.
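A sketch of the locking rule; `redis_client.lock` is redis-py's standard lock, so the `with` block acquires and releases it (the key name and refresh body are illustrative):

```python
import logging

from extensions.ext_redis import redis_client

logger = logging.getLogger(__name__)


def refresh_provider_credentials(tenant_id: str, provider: str) -> None:
    lock = redis_client.lock(
        f"trigger_provider_refresh:{tenant_id}:{provider}",
        timeout=30,          # auto-expire so a crashed worker cannot hold it forever
        blocking_timeout=5,  # give up if another worker holds it too long
    )
    with lock:
        logger.info("refreshing credentials: tenant=%s provider=%s", tenant_id, provider)
        # ... perform the refresh; only one worker runs this section at a time
```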
## Models
- SQLAlchemy models sit in `models/` and inherit from the shared declarative `Base` defined in `models/base.py` (metadata configured via `models/engine.py`).
- `models/__init__.py` exposes grouped aggregates: account/tenant models, app and conversation tables, datasets, providers, workflow runs, triggers, etc. Import from there to avoid deep path churn.
- Follow the DDD boundary: persistence objects live in `models/`, repositories under `repositories/` translate them into domain entities, and services consume those repositories.
- When adding a table, create the model class, register it in `models/__init__.py`, wire a repository if needed, and generate an Alembic migration as described below.
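A minimal sketch of that checklist; the table is hypothetical and exists only to show the shape:

```python
from sqlalchemy import String
from sqlalchemy.orm import Mapped, mapped_column

from models.base import Base  # the shared declarative base described above


class WorkflowTriggerLog(Base):  # hypothetical table, for illustration only
    __tablename__ = "workflow_trigger_logs"

    id: Mapped[str] = mapped_column(String(36), primary_key=True)
    tenant_id: Mapped[str] = mapped_column(String(36), index=True)  # keep every table tenant-scoped
    event: Mapped[str] = mapped_column(String(255))
```

After registering the class in `models/__init__.py`, generate the migration with `uv run --project api flask db revision --autogenerate -m "add workflow_trigger_logs"` as described below.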
## Vector Stores
- Vector client implementations live in `core/rag/datasource/vdb/<provider>`, with a common factory in `core/rag/datasource/vdb/vector_factory.py` and enums in `core/rag/datasource/vdb/vector_type.py`.
- Retrieval pipelines call these providers through `core/rag/datasource/retrieval_service.py` and dataset ingestion flows in `services/dataset_service.py`.
- The CLI helper `flask vdb-migrate` orchestrates bulk migrations using routines in `commands.py`; reuse that pattern when adding new backend transitions.
- To add another store, mirror the provider layout, register it with the factory, and include any schema changes in Alembic migrations.
## Observability & OTEL
- OpenTelemetry settings live under the observability mixin in `configs/observability`. Toggle exporters and sampling via `dify_config`, not ad-hoc env reads.
- HTTP, Celery, Redis, SQLAlchemy, and httpx instrumentation is initialised in `extensions/ext_app_metrics.py` and `extensions/ext_request_logging.py`; reuse these hooks when adding new workers or entrypoints.
- When creating background tasks or external calls, propagate tracing context with helpers in the existing instrumented clients (e.g. use the shared `httpx` session from `core/helper/http_client_pooling.py`).
- If you add a new external integration, ensure spans and metrics are emitted by wiring the appropriate OTEL instrumentation package in `pyproject.toml` and configuring it in `extensions/`.
## Ops Integrations
- Langfuse support and other tracing bridges live under `core/ops/opik_trace`. Config toggles sit in `configs/observability`, while exporters are initialised in the OTEL extensions mentioned above.
- External monitoring services should follow this pattern: keep client code in `core/ops`, expose switches via `dify_config`, and hook initialisation in `extensions/ext_app_metrics.py` or sibling modules.
- Before instrumenting new code paths, check whether existing context helpers (e.g. `extensions/ext_request_logging.py`) already capture the necessary metadata.
## Controllers, Services, Core
- Controllers only parse HTTP input and call a service method. Keep business rules in `services/`.
- Services enforce tenant rules, quotas, and orchestration, then call into `core/` engines (workflow execution, tools, LLMs).
- When adding a new endpoint, search for an existing service to extend before introducing a new layer. Example: workflow APIs pipe through `services/workflow_service.py` into `core/workflow`.
## Plugins, Tools, Providers
- In Dify a plugin is a tenant-installable bundle that declares one or more providers (tool, model, datasource, trigger, endpoint, agent strategy) plus its resource needs and version metadata. The manifest (`core/plugin/entities/plugin.py`) mirrors what you see in the marketplace documentation.
- Installation, upgrades, and migrations are orchestrated by `services/plugin/plugin_service.py` together with helpers such as `services/plugin/plugin_migration.py`.
- Runtime loading happens through the implementations under `core/plugin/impl/*` (tool/model/datasource/trigger/endpoint/agent). These modules normalise plugin providers so that downstream systems (`core/tools/tool_manager.py`, `services/model_provider_service.py`, `services/trigger/*`) can treat builtin and plugin capabilities the same way.
- For remote execution, plugin daemons (`core/plugin/entities/plugin_daemon.py`, `core/plugin/impl/plugin.py`) manage lifecycle hooks, credential forwarding, and background workers that keep plugin processes in sync with the main application.
- Acquire tool implementations through `core/tools/tool_manager.py`; it resolves builtin, plugin, and workflow-as-tool providers uniformly, injecting the right context (tenant, credentials, runtime config).
- To add a new plugin capability, extend the relevant `core/plugin/entities` schema and register the implementation in the matching `core/plugin/impl` module rather than importing the provider directly.
## Async Workloads
See `agent_skills/trigger.md` for more detailed documentation.
- Enqueue background work through `services/async_workflow_service.py`. It routes jobs to the tiered Celery queues defined in `tasks/`.
- Workers boot from `celery_entrypoint.py` and execute functions in `tasks/workflow_execution_tasks.py`, `tasks/trigger_processing_tasks.py`, etc.
- Scheduled workflows poll from `schedule/workflow_schedule_tasks.py`. Follow the same pattern if you need new periodic jobs.
## Database & Migrations
- SQLAlchemy models live under `models/` and map directly to migration files in `migrations/versions`.
- Generate migrations with `uv run --project api flask db revision --autogenerate -m "<summary>"`, then review the diff; never hand-edit the database outside Alembic.
- Apply migrations locally using `uv run --project api flask db upgrade`; production deploys expect the same history.
- If you add tenant-scoped data, confirm the upgrade includes tenant filters or defaults consistent with the service logic touching those tables.
## CLI Commands
- Maintenance commands from `commands.py` are registered on the Flask CLI. Run them via `uv run --project api flask <command>`.
- Use the built-in `db` commands from Flask-Migrate for schema operations (`flask db upgrade`, `flask db stamp`, etc.). Only fall back to custom helpers if you need their extra behaviour.
- Custom entries such as `flask reset-password`, `flask reset-email`, and `flask vdb-migrate` handle self-hosted account recovery and vector database migrations.
- Before adding a new command, check whether an existing service can be reused and ensure the command guards edition-specific behaviour (many enforce `SELF_HOSTED`). Document any additions in the PR.
- Ruff helpers are run directly with `uv`: `uv run --project api --dev ruff format ./api` for formatting and `uv run --project api --dev ruff check ./api` (add `--fix` if you want automatic fixes).
## When You Add Features
- Check for an existing helper or service before writing a new util.
- Uphold tenancy: every service method should receive the tenant ID from controller wrappers such as `controllers/console/wraps.py`.
- Update or create tests alongside behaviour changes (`tests/unit_tests` for fast coverage, `tests/integration_tests` when touching orchestrations).
- Run `uv run --project api --dev ruff check ./api`, `uv run --directory api --dev basedpyright`, and `uv run --project api --dev dev/pytest/pytest_unit_tests.sh` before submitting changes.

api/agent_skills/plugin.md (new file)

@@ -0,0 +1 @@
// TBD

api/agent_skills/plugin_oauth.md (new file)

@@ -0,0 +1 @@
// TBD

api/agent_skills/trigger.md (new file, 53 lines)

@@ -0,0 +1,53 @@
## Overview
Trigger is the collective name for the nodes we call `Start` nodes. The `Start` concept is the same as `RootNode` in the workflow engine (`core/workflow/graph_engine`): a `Start` node is the entry point of a workflow, and every workflow run begins at one.
## Trigger nodes
- `UserInput`
- `Trigger Webhook`
- `Trigger Schedule`
- `Trigger Plugin`
### UserInput
Before the `Trigger` concept was introduced, this was simply the `Start` node; it has been renamed to `UserInput` to avoid confusion, and it has a strong relation with the `ServiceAPI` in `controllers/service_api/app`.
1. The `UserInput` node declares a list of arguments the user must provide; these are ultimately converted into variables in the workflow variable pool.
1. The `ServiceAPI` accepts those arguments and passes them through to the `UserInput` node.
1. For the detailed implementation, refer to `core/workflow/nodes/start`.
### Trigger Webhook
Inside the Webhook node, Dify provides a UI panel where the user defines an HTTP manifest (`WebhookData` in `core/workflow/nodes/trigger_webhook/entities.py`). Dify also generates a random webhook id for each `Trigger Webhook` node; the implementation lives in `core/trigger/utils/endpoint.py`. `webhook-debug` is the debug mode for webhooks; you can find it in `controllers/trigger/webhook.py`.
Finally, requests to the `webhook` endpoint are converted into variables in the workflow variable pool during workflow execution.
### Trigger Schedule
The `Trigger Schedule` node lets the user define a schedule that triggers the workflow; the detailed manifest is in `core/workflow/nodes/trigger_schedule/entities.py`. A poller and an executor handle millions of schedules; see `docker/entrypoint.sh` and `schedule/workflow_schedule_task.py` for reference.
To achieve this, a `WorkflowSchedulePlan` model was introduced in `models/trigger.py`, and `events/event_handlers/sync_workflow_schedule_when_app_published.py` syncs workflow schedule plans when an app is published. A simplified poller loop is sketched below.
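The poller/executor split can be pictured with a sketch like this; the function names are illustrative, not the real `schedule/workflow_schedule_task.py` API:

```python
from collections.abc import Callable, Sequence
from datetime import datetime, timezone


def poll_due_schedules(
    fetch_due_batch: Callable[[datetime, int], Sequence[dict]],
    dispatch: Callable[[dict], None],
    batch_size: int = 100,
) -> int:
    """One poller tick: claim due plans in batches and hand each to an executor."""
    now = datetime.now(timezone.utc)
    dispatched = 0
    while True:
        # fetch_due_batch must atomically claim the plans it returns,
        # otherwise this loop would re-fetch the same rows forever.
        batch = fetch_due_batch(now, batch_size)
        if not batch:
            break
        for plan in batch:
            dispatch(plan)  # the executor runs the workflow asynchronously
            dispatched += 1
    return dispatched
```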
### Trigger Plugin
The `Trigger Plugin` node lets users define their own distributed trigger plugins: whenever a request is received, Dify forwards it to the plugin and waits for the parsed variables.
1. Requests are saved in storage by `services/trigger/trigger_request_service.py`, referenced from `TriggerService.process_endpoint` in `services/trigger/trigger_service.py`.
1. Plugins accept those requests and parse variables from them; see `core/plugin/impl/trigger.py` for details.
Dify also introduces a `subscription` concept: an endpoint address issued by Dify is bound to a third-party webhook service such as GitHub, Slack, Linear, Google Drive, or Gmail. Once a subscription is created, Dify continually receives requests from that platform and handles them one by one.
## Worker Pool / Async Task
Every event that triggers a new workflow run is processed asynchronously; the unified entry point is `AsyncWorkflowService.trigger_workflow_async` in `services/async_workflow_service.py`.
The underlying infrastructure is Celery, configured in `docker/entrypoint.sh`; the consumers live in `tasks/async_workflow_tasks.py`. Three queues serve different user tiers: `PROFESSIONAL_QUEUE`, `TEAM_QUEUE`, and `SANDBOX_QUEUE`, as sketched below.
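A rough sketch of tier-based routing onto those queues; the broker URL and tier mapping are placeholders, and the real `AsyncWorkflowService.trigger_workflow_async` may differ:

```python
import logging

from celery import Celery

logger = logging.getLogger(__name__)
celery_app = Celery("dify", broker="redis://localhost:6379/0")  # placeholder broker

TIER_QUEUES = {
    "professional": "PROFESSIONAL_QUEUE",
    "team": "TEAM_QUEUE",
    "sandbox": "SANDBOX_QUEUE",
}


@celery_app.task
def run_triggered_workflow(app_id: str, trigger_payload: dict) -> None:
    logger.info("executing triggered workflow: app=%s", app_id)


def trigger_workflow_async(app_id: str, tier: str, trigger_payload: dict) -> None:
    queue = TIER_QUEUES.get(tier, "SANDBOX_QUEUE")  # unknown tiers fall back to sandbox
    run_triggered_workflow.apply_async(args=(app_id, trigger_payload), queue=queue)
```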
## Debug Strategy
Dify divides users into two groups: builders and end users.
Builders are the users who create workflows, and for them debugging is a critical part of workflow development. As the start nodes of a workflow, trigger nodes can `listen` for events from `WebhookDebug`, `Schedule`, and `Plugin`; the debugging flow is implemented in `DraftWorkflowTriggerNodeApi` in `controllers/console/app/workflow.py`.
A polling process is a sequence of single `poll` operations: each `poll` fetches events cached in Redis and returns `None` if no event is found. In more detail, `core/trigger/debug/event_bus.py` handles the polling process and `core/trigger/debug/event_selectors.py` selects the event poller based on the trigger type, roughly as in the sketch below.
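A single `poll` operation can be pictured like this; the Redis key layout is an assumption, and the real event bus lives in `core/trigger/debug/event_bus.py`:

```python
import json

from extensions.ext_redis import redis_client


def poll_debug_event(tenant_id: str, app_id: str, node_id: str) -> dict | None:
    """One poll step: pop the oldest cached event for this node, or return None."""
    key = f"trigger_debug:{tenant_id}:{app_id}:{node_id}"  # assumed key layout
    raw = redis_client.lpop(key)
    return json.loads(raw) if raw is not None else None
```

When `None` comes back, the console retries after a short interval; compare `LISTENING_RETRY_IN` in the `controllers/console/app/workflow.py` diff below.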

api/commands.py

@@ -15,12 +15,12 @@ from sqlalchemy.orm import sessionmaker
from configs import dify_config
from constants.languages import languages
from core.helper import encrypter
from core.plugin.entities.plugin_daemon import CredentialType
from core.plugin.impl.plugin import PluginInstaller
from core.rag.datasource.vdb.vector_factory import Vector
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.index_processor.constant.built_in_field import BuiltInField
from core.rag.models.document import Document
from core.tools.entities.tool_entities import CredentialType
from core.tools.utils.system_oauth_encryption import encrypt_system_oauth_params
from events.app_event import app_was_created
from extensions.ext_database import db

@@ -1229,6 +1229,55 @@ def setup_system_tool_oauth_client(provider, client_params):
    click.echo(click.style(f"OAuth client params setup successfully. id: {oauth_client.id}", fg="green"))


@click.command("setup-system-trigger-oauth-client", help="Setup system trigger oauth client.")
@click.option("--provider", prompt=True, help="Provider name")
@click.option("--client-params", prompt=True, help="Client Params")
def setup_system_trigger_oauth_client(provider, client_params):
    """
    Setup system trigger oauth client
    """
    from models.provider_ids import TriggerProviderID
    from models.trigger import TriggerOAuthSystemClient

    provider_id = TriggerProviderID(provider)
    provider_name = provider_id.provider_name
    plugin_id = provider_id.plugin_id
    try:
        # json validate
        click.echo(click.style(f"Validating client params: {client_params}", fg="yellow"))
        client_params_dict = TypeAdapter(dict[str, Any]).validate_json(client_params)
        click.echo(click.style("Client params validated successfully.", fg="green"))
        click.echo(click.style(f"Encrypting client params: {client_params}", fg="yellow"))
        click.echo(click.style(f"Using SECRET_KEY: `{dify_config.SECRET_KEY}`", fg="yellow"))
        oauth_client_params = encrypt_system_oauth_params(client_params_dict)
        click.echo(click.style("Client params encrypted successfully.", fg="green"))
    except Exception as e:
        click.echo(click.style(f"Error parsing client params: {str(e)}", fg="red"))
        return

    deleted_count = (
        db.session.query(TriggerOAuthSystemClient)
        .filter_by(
            provider=provider_name,
            plugin_id=plugin_id,
        )
        .delete()
    )
    if deleted_count > 0:
        click.echo(click.style(f"Deleted {deleted_count} existing oauth client params.", fg="yellow"))

    oauth_client = TriggerOAuthSystemClient(
        provider=provider_name,
        plugin_id=plugin_id,
        encrypted_oauth_params=oauth_client_params,
    )
    db.session.add(oauth_client)
    db.session.commit()
    click.echo(click.style(f"OAuth client params setup successfully. id: {oauth_client.id}", fg="green"))


def _find_orphaned_draft_variables(batch_size: int = 1000) -> list[str]:
    """
    Find draft variables that reference non-existent apps.


@@ -174,6 +174,33 @@ class CodeExecutionSandboxConfig(BaseSettings):
    )


class TriggerConfig(BaseSettings):
    """
    Configuration for trigger
    """

    WEBHOOK_REQUEST_BODY_MAX_SIZE: PositiveInt = Field(
        description="Maximum allowed size for webhook request bodies in bytes",
        default=10485760,
    )


class AsyncWorkflowConfig(BaseSettings):
    """
    Configuration for async workflow
    """

    ASYNC_WORKFLOW_SCHEDULER_GRANULARITY: int = Field(
        description="Granularity for async workflow scheduler, "
        "sometimes a few users could block the queue due to some time-consuming tasks, "
        "to avoid this, a workflow can be suspended if needed, to achieve "
        "this, a time-based checker is required, every granularity seconds, "
        "the checker will check the workflow queue and suspend the workflow",
        default=120,
        ge=1,
    )


class PluginConfig(BaseSettings):
    """
    Plugin configs

@@ -263,6 +290,8 @@ class EndpointConfig(BaseSettings):
        description="Template url for endpoint plugin", default="http://localhost:5002/e/{hook_id}"
    )

    TRIGGER_URL: str = Field(description="Template url for triggers", default="http://localhost:5001")


class FileAccessConfig(BaseSettings):
    """

@@ -995,6 +1024,44 @@ class CeleryScheduleTasksConfig(BaseSettings):
        description="Enable check upgradable plugin task",
        default=True,
    )
    ENABLE_WORKFLOW_SCHEDULE_POLLER_TASK: bool = Field(
        description="Enable workflow schedule poller task",
        default=True,
    )
    WORKFLOW_SCHEDULE_POLLER_INTERVAL: int = Field(
        description="Workflow schedule poller interval in minutes",
        default=1,
    )
    WORKFLOW_SCHEDULE_POLLER_BATCH_SIZE: int = Field(
        description="Maximum number of schedules to process in each poll batch",
        default=100,
    )
    WORKFLOW_SCHEDULE_MAX_DISPATCH_PER_TICK: int = Field(
        description="Maximum schedules to dispatch per tick (0=unlimited, circuit breaker)",
        default=0,
    )
    # Trigger provider refresh (simple version)
    ENABLE_TRIGGER_PROVIDER_REFRESH_TASK: bool = Field(
        description="Enable trigger provider refresh poller",
        default=True,
    )
    TRIGGER_PROVIDER_REFRESH_INTERVAL: int = Field(
        description="Trigger provider refresh poller interval in minutes",
        default=1,
    )
    TRIGGER_PROVIDER_REFRESH_BATCH_SIZE: int = Field(
        description="Max trigger subscriptions to process per tick",
        default=200,
    )
    TRIGGER_PROVIDER_CREDENTIAL_THRESHOLD_SECONDS: int = Field(
        description="Proactive credential refresh threshold in seconds",
        default=180,
    )
    TRIGGER_PROVIDER_SUBSCRIPTION_THRESHOLD_SECONDS: int = Field(
        description="Proactive subscription refresh threshold in seconds",
        default=60 * 60,
    )


class PositionConfig(BaseSettings):

@@ -1118,6 +1185,8 @@ class FeatureConfig(
    AuthConfig,  # Changed from OAuthConfig to AuthConfig
    BillingConfig,
    CodeExecutionSandboxConfig,
    TriggerConfig,
    AsyncWorkflowConfig,
    PluginConfig,
    MarketplaceConfig,
    DataSetConfig,

api/contexts/__init__.py

@@ -9,6 +9,7 @@ if TYPE_CHECKING:
    from core.model_runtime.entities.model_entities import AIModelEntity
    from core.plugin.entities.plugin_daemon import PluginModelProviderEntity
    from core.tools.plugin_tool.provider import PluginToolProviderController
    from core.trigger.provider import PluginTriggerProviderController

"""

@@ -41,3 +42,11 @@ datasource_plugin_providers: RecyclableContextVar[dict[str, "DatasourcePluginPro
datasource_plugin_providers_lock: RecyclableContextVar[Lock] = RecyclableContextVar(
    ContextVar("datasource_plugin_providers_lock")
)

plugin_trigger_providers: RecyclableContextVar[dict[str, "PluginTriggerProviderController"]] = RecyclableContextVar(
    ContextVar("plugin_trigger_providers")
)

plugin_trigger_providers_lock: RecyclableContextVar[Lock] = RecyclableContextVar(
    ContextVar("plugin_trigger_providers_lock")
)

api/controllers/console/__init__.py

@@ -66,6 +66,7 @@ from .app import (
    workflow_draft_variable,
    workflow_run,
    workflow_statistic,
    workflow_trigger,
)

# Import auth controllers
@@ -126,6 +127,7 @@ from .workspace import (
    models,
    plugin,
    tool_providers,
    trigger_providers,
    workspace,
)

@@ -196,6 +198,7 @@ __all__ = [
    "statistic",
    "tags",
    "tool_providers",
    "trigger_providers",
    "version",
    "website",
    "workflow",
@@ -203,5 +206,6 @@ __all__ = [
    "workflow_draft_variable",
    "workflow_run",
    "workflow_statistic",
    "workflow_trigger",
    "workspace",
]


@@ -11,6 +11,7 @@ from controllers.console.app.error import (
)
from controllers.console.wraps import account_initialization_required, setup_required
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.helper.code_executor.code_node_provider import CodeNodeProvider
from core.helper.code_executor.javascript.javascript_code_provider import JavascriptCodeProvider
from core.helper.code_executor.python3.python3_code_provider import Python3CodeProvider
from core.llm_generator.llm_generator import LLMGenerator

@@ -206,13 +207,11 @@ class InstructionGenerateApi(Resource):
        )
        args = parser.parse_args()
        _, current_tenant_id = current_account_with_tenant()
        code_template = (
            Python3CodeProvider.get_default_code()
            if args["language"] == "python"
            else (JavascriptCodeProvider.get_default_code())
            if args["language"] == "javascript"
            else ""
        providers: list[type[CodeNodeProvider]] = [Python3CodeProvider, JavascriptCodeProvider]
        code_provider: type[CodeNodeProvider] | None = next(
            (p for p in providers if p.is_accept_language(args["language"])), None
        )
        code_template = code_provider.get_default_code() if code_provider else ""
        try:
            # Generate from nothing for a workflow node
            if (args["current"] == code_template or args["current"] == "") and args["node_id"] != "":

api/controllers/console/app/workflow.py

@@ -6,7 +6,7 @@ from typing import cast
from flask import abort, request
from flask_restx import Resource, fields, inputs, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden, InternalServerError, NotFound

import services
from controllers.console import api, console_ns

@@ -19,6 +19,15 @@ from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.file.models import File
from core.helper.trace_id_helper import get_external_trace_id
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.impl.exc import PluginInvokeError
from core.trigger.debug.event_selectors import (
    TriggerDebugEvent,
    TriggerDebugEventPoller,
    create_event_poller,
    select_trigger_debug_events,
)
from core.workflow.enums import NodeType
from core.workflow.graph_engine.manager import GraphEngineManager
from extensions.ext_database import db
from factories import file_factory, variable_factory

@@ -37,6 +46,7 @@ from services.errors.llm import InvokeRateLimitError
from services.workflow_service import DraftWorkflowDeletionError, WorkflowInUseError, WorkflowService

logger = logging.getLogger(__name__)

LISTENING_RETRY_IN = 2000

# TODO(QuantumGhost): Refactor existing node run API to handle file parameter parsing

@@ -915,3 +925,230 @@ class DraftWorkflowNodeLastRunApi(Resource):
        if node_exec is None:
            raise NotFound("last run not found")
        return node_exec
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/trigger/run")
class DraftWorkflowTriggerRunApi(Resource):
"""
Full workflow debug - Polling API for trigger events
Path: /apps/<uuid:app_id>/workflows/draft/trigger/run
"""
@api.doc("poll_draft_workflow_trigger_run")
@api.doc(description="Poll for trigger events and execute full workflow when event arrives")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"DraftWorkflowTriggerRunRequest",
{
"node_id": fields.String(required=True, description="Node ID"),
},
)
)
@api.response(200, "Trigger event received and workflow executed successfully")
@api.response(403, "Permission denied")
@api.response(500, "Internal server error")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.WORKFLOW])
@edit_permission_required
def post(self, app_model: App):
"""
Poll for trigger events and execute full workflow when event arrives
"""
current_user, _ = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, location="json", nullable=False)
args = parser.parse_args()
node_id = args["node_id"]
workflow_service = WorkflowService()
draft_workflow = workflow_service.get_draft_workflow(app_model)
if not draft_workflow:
raise ValueError("Workflow not found")
poller: TriggerDebugEventPoller = create_event_poller(
draft_workflow=draft_workflow,
tenant_id=app_model.tenant_id,
user_id=current_user.id,
app_id=app_model.id,
node_id=node_id,
)
event: TriggerDebugEvent | None = None
try:
event = poller.poll()
if not event:
return jsonable_encoder({"status": "waiting", "retry_in": LISTENING_RETRY_IN})
return helper.compact_generate_response(
AppGenerateService.generate(
app_model=app_model,
user=current_user,
args=event.workflow_args,
invoke_from=InvokeFrom.DEBUGGER,
streaming=True,
root_node_id=node_id,
)
)
except InvokeRateLimitError as ex:
raise InvokeRateLimitHttpError(ex.description)
except PluginInvokeError as e:
raise BadRequest(e.to_user_friendly_error())
except Exception as e:
logger.exception("Error polling trigger debug event")
raise e
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/trigger/run")
class DraftWorkflowTriggerNodeApi(Resource):
"""
Single node debug - Polling API for trigger events
Path: /apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/trigger/run
"""
@api.doc("poll_draft_workflow_trigger_node")
@api.doc(description="Poll for trigger events and execute single node when event arrives")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.response(200, "Trigger event received and node executed successfully")
@api.response(403, "Permission denied")
@api.response(500, "Internal server error")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.WORKFLOW])
@edit_permission_required
def post(self, app_model: App, node_id: str):
"""
Poll for trigger events and execute single node when event arrives
"""
current_user, _ = current_account_with_tenant()
workflow_service = WorkflowService()
draft_workflow = workflow_service.get_draft_workflow(app_model)
if not draft_workflow:
raise ValueError("Workflow not found")
node_config = draft_workflow.get_node_config_by_id(node_id=node_id)
if not node_config:
raise ValueError("Node data not found for node %s", node_id)
node_type: NodeType = draft_workflow.get_node_type_from_node_config(node_config)
event: TriggerDebugEvent | None = None
# for schedule triggers, a single-node run executes directly without polling
if node_type == NodeType.TRIGGER_SCHEDULE:
event = TriggerDebugEvent(
workflow_args={},
node_id=node_id,
)
# for other trigger types, poll for the event
else:
try:
poller: TriggerDebugEventPoller = create_event_poller(
draft_workflow=draft_workflow,
tenant_id=app_model.tenant_id,
user_id=current_user.id,
app_id=app_model.id,
node_id=node_id,
)
event = poller.poll()
except PluginInvokeError as e:
return jsonable_encoder({"status": "error", "error": e.to_user_friendly_error()}), 400
except Exception as e:
logger.exception("Error polling trigger debug event")
raise e
if not event:
return jsonable_encoder({"status": "waiting", "retry_in": LISTENING_RETRY_IN})
raw_files = event.workflow_args.get("files")
files = _parse_file(draft_workflow, raw_files if isinstance(raw_files, list) else None)
try:
node_execution = workflow_service.run_draft_workflow_node(
app_model=app_model,
draft_workflow=draft_workflow,
node_id=node_id,
user_inputs=event.workflow_args.get("inputs") or {},
account=current_user,
query="",
files=files,
)
return jsonable_encoder(node_execution)
except Exception as e:
logger.exception("Error running draft workflow trigger node")
return jsonable_encoder(
{"status": "error", "error": "An unexpected error occurred while running the node."}
), 400
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/trigger/run-all")
class DraftWorkflowTriggerRunAllApi(Resource):
"""
Full workflow debug - polls trigger events across multiple trigger nodes
Path: /apps/<uuid:app_id>/workflows/draft/trigger/run-all
"""
@api.doc("draft_workflow_trigger_run_all")
@api.doc(description="Full workflow debug when the start node is a trigger")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"DraftWorkflowTriggerRunAllRequest",
{
"node_ids": fields.List(fields.String, required=True, description="Node IDs"),
},
)
)
@api.response(200, "Workflow executed successfully")
@api.response(403, "Permission denied")
@api.response(500, "Internal server error")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.WORKFLOW])
@edit_permission_required
def post(self, app_model: App):
"""
Full workflow debug when the start node is a trigger
"""
current_user, _ = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("node_ids", type=list, required=True, location="json", nullable=False)
args = parser.parse_args()
node_ids = args["node_ids"]
workflow_service = WorkflowService()
draft_workflow = workflow_service.get_draft_workflow(app_model)
if not draft_workflow:
raise ValueError("Workflow not found")
try:
trigger_debug_event: TriggerDebugEvent | None = select_trigger_debug_events(
draft_workflow=draft_workflow,
app_model=app_model,
user_id=current_user.id,
node_ids=node_ids,
)
except PluginInvokeError as e:
raise ValueError(e.to_user_friendly_error())
except Exception as e:
logger.exception("Error polling trigger debug event")
raise e
if trigger_debug_event is None:
return jsonable_encoder({"status": "waiting", "retry_in": LISTENING_RETRY_IN})
try:
response = AppGenerateService.generate(
app_model=app_model,
user=current_user,
args=trigger_debug_event.workflow_args,
invoke_from=InvokeFrom.DEBUGGER,
streaming=True,
root_node_id=trigger_debug_event.node_id,
)
return helper.compact_generate_response(response)
except InvokeRateLimitError as ex:
raise InvokeRateLimitHttpError(ex.description)
except Exception:
logger.exception("Error running draft workflow trigger run-all")
return jsonable_encoder(
{
"status": "error",
}
), 400

View File

@ -28,6 +28,7 @@ class WorkflowAppLogApi(Resource):
"created_at__after": "Filter logs created after this timestamp",
"created_by_end_user_session_id": "Filter by end user session ID",
"created_by_account": "Filter by account",
"detail": "Whether to return detailed logs",
"page": "Page number (1-99999)",
"limit": "Number of items per page (1-100)",
}
@ -68,6 +69,7 @@ class WorkflowAppLogApi(Resource):
required=False,
default=None,
)
.add_argument("detail", type=bool, location="args", required=False, default=False)
.add_argument("page", type=int_range(1, 99999), default=1, location="args")
.add_argument("limit", type=int_range(1, 100), default=20, location="args")
)
@ -92,6 +94,7 @@ class WorkflowAppLogApi(Resource):
created_at_after=args.created_at__after,
page=args.page,
limit=args.limit,
detail=args.detail,
created_by_end_user_session_id=args.created_by_end_user_session_id,
created_by_account=args.created_by_account,
)
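A hypothetical request exercising the new detail flag (the query parameters come from the hunk above; the route path, Python client, and auth cookie are illustrative assumptions):
import requests  # illustrative client; route path assumed from the resource above
console_api_url = "https://cloud.dify.ai"  # placeholder base URL
app_id = "00000000-0000-0000-0000-000000000000"  # placeholder app ID
resp = requests.get(
    f"{console_api_url}/console/api/apps/{app_id}/workflow-app-logs",
    params={"detail": "true", "page": 1, "limit": 20},
    cookies={"session": "..."},  # placeholder; assumed console-session auth
)
logs = resp.json()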

View File

@ -0,0 +1,145 @@
import logging
from flask_restx import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden, NotFound
from configs import dify_config
from controllers.console import api
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from fields.workflow_trigger_fields import trigger_fields, triggers_list_fields, webhook_trigger_fields
from libs.login import current_user, login_required
from models.enums import AppTriggerStatus
from models.model import Account, App, AppMode
from models.trigger import AppTrigger, WorkflowWebhookTrigger
logger = logging.getLogger(__name__)
class WebhookTriggerApi(Resource):
"""Webhook Trigger API"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(webhook_trigger_fields)
def get(self, app_model: App):
"""Get webhook trigger for a node"""
parser = reqparse.RequestParser()
parser.add_argument("node_id", type=str, required=True, help="Node ID is required")
args = parser.parse_args()
node_id = str(args["node_id"])
with Session(db.engine) as session:
# Get webhook trigger for this app and node
webhook_trigger = (
session.query(WorkflowWebhookTrigger)
.where(
WorkflowWebhookTrigger.app_id == app_model.id,
WorkflowWebhookTrigger.node_id == node_id,
)
.first()
)
if not webhook_trigger:
raise NotFound("Webhook trigger not found for this node")
return webhook_trigger
class AppTriggersApi(Resource):
"""App Triggers list API"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(triggers_list_fields)
def get(self, app_model: App):
"""Get app triggers list"""
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
with Session(db.engine) as session:
# Get all triggers for this app using select API
triggers = (
session.execute(
select(AppTrigger)
.where(
AppTrigger.tenant_id == current_user.current_tenant_id,
AppTrigger.app_id == app_model.id,
)
.order_by(AppTrigger.created_at.desc(), AppTrigger.id.desc())
)
.scalars()
.all()
)
# Add computed icon field for each trigger
url_prefix = dify_config.CONSOLE_API_URL + "/console/api/workspaces/current/tool-provider/builtin/"
for trigger in triggers:
if trigger.trigger_type == "trigger-plugin":
trigger.icon = url_prefix + trigger.provider_name + "/icon" # type: ignore
else:
trigger.icon = "" # type: ignore
return {"data": triggers}
class AppTriggerEnableApi(Resource):
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(trigger_fields)
def post(self, app_model: App):
"""Update app trigger (enable/disable)"""
parser = reqparse.RequestParser()
parser.add_argument("trigger_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("enable_trigger", type=bool, required=True, nullable=False, location="json")
args = parser.parse_args()
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
if not current_user.has_edit_permission:
raise Forbidden()
trigger_id = args["trigger_id"]
with Session(db.engine) as session:
# Find the trigger using select
trigger = session.execute(
select(AppTrigger).where(
AppTrigger.id == trigger_id,
AppTrigger.tenant_id == current_user.current_tenant_id,
AppTrigger.app_id == app_model.id,
)
).scalar_one_or_none()
if not trigger:
raise NotFound("Trigger not found")
# Update status based on enable_trigger boolean
trigger.status = AppTriggerStatus.ENABLED if args["enable_trigger"] else AppTriggerStatus.DISABLED
session.commit()
session.refresh(trigger)
# Add computed icon field
url_prefix = dify_config.CONSOLE_API_URL + "/console/api/workspaces/current/tool-provider/builtin/"
if trigger.trigger_type == "trigger-plugin":
trigger.icon = url_prefix + trigger.provider_name + "/icon" # type: ignore
else:
trigger.icon = "" # type: ignore
return trigger
api.add_resource(WebhookTriggerApi, "/apps/<uuid:app_id>/workflows/triggers/webhook")
api.add_resource(AppTriggersApi, "/apps/<uuid:app_id>/triggers")
api.add_resource(AppTriggerEnableApi, "/apps/<uuid:app_id>/trigger-enable")
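A sketch of toggling a trigger through the enable endpoint registered above; the JSON body fields match the RequestParser in AppTriggerEnableApi, while the client and auth are illustrative assumptions:
import requests  # illustrative client
console_api_url = "https://cloud.dify.ai"  # placeholder base URL
app_id = "00000000-0000-0000-0000-000000000000"  # placeholder app ID
requests.post(
    f"{console_api_url}/console/api/apps/{app_id}/trigger-enable",
    json={"trigger_id": "...", "enable_trigger": True},  # False disables the trigger
    cookies={"session": "..."},  # placeholder; assumed console-session auth
)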

View File

@ -114,6 +114,25 @@ class PluginIconApi(Resource):
return send_file(io.BytesIO(icon_bytes), mimetype=mimetype, max_age=icon_cache_max_age)
@console_ns.route("/workspaces/current/plugin/asset")
class PluginAssetApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
req = reqparse.RequestParser()
req.add_argument("plugin_unique_identifier", type=str, required=True, location="args")
req.add_argument("file_name", type=str, required=True, location="args")
args = req.parse_args()
_, tenant_id = current_account_with_tenant()
try:
binary = PluginService.extract_asset(tenant_id, args["plugin_unique_identifier"], args["file_name"])
return send_file(io.BytesIO(binary), mimetype="application/octet-stream")
except PluginDaemonClientSideError as e:
raise ValueError(e)
@console_ns.route("/workspaces/current/plugin/upload/pkg")
class PluginUploadFromPkgApi(Resource):
@setup_required
@ -558,19 +577,21 @@ class PluginFetchDynamicSelectOptionsApi(Resource):
.add_argument("provider", type=str, required=True, location="args")
.add_argument("action", type=str, required=True, location="args")
.add_argument("parameter", type=str, required=True, location="args")
.add_argument("credential_id", type=str, required=False, location="args")
.add_argument("provider_type", type=str, required=True, location="args")
)
args = parser.parse_args()
try:
options = PluginParameterService.get_dynamic_select_options(
tenant_id,
user_id,
args["plugin_id"],
args["provider"],
args["action"],
args["parameter"],
args["provider_type"],
tenant_id=tenant_id,
user_id=user_id,
plugin_id=args["plugin_id"],
provider=args["provider"],
action=args["action"],
parameter=args["parameter"],
credential_id=args["credential_id"],
provider_type=args["provider_type"],
)
except PluginDaemonClientSideError as e:
raise ValueError(e)
@ -686,3 +707,23 @@ class PluginAutoUpgradeExcludePluginApi(Resource):
args = req.parse_args()
return jsonable_encoder({"success": PluginAutoUpgradeService.exclude_plugin(tenant_id, args["plugin_id"])})
@console_ns.route("/workspaces/current/plugin/readme")
class PluginReadmeApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
_, tenant_id = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("plugin_unique_identifier", type=str, required=True, location="args")
parser.add_argument("language", type=str, required=False, location="args")
args = parser.parse_args()
return jsonable_encoder(
{
"readme": PluginService.fetch_plugin_readme(
tenant_id, args["plugin_unique_identifier"], args.get("language", "en-US")
)
}
)

View File

@ -21,12 +21,14 @@ from core.mcp.auth.auth_flow import auth, handle_callback
from core.mcp.error import MCPAuthError, MCPError, MCPRefreshTokenError
from core.mcp.mcp_client import MCPClient
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.entities.plugin_daemon import CredentialType
from core.plugin.impl.oauth import OAuthHandler
from core.tools.entities.tool_entities import CredentialType
from extensions.ext_database import db
from libs.helper import StrLen, alphanumeric, uuid_value
from libs.login import current_account_with_tenant, login_required
from models.provider_ids import ToolProviderID
# from models.provider_ids import ToolProviderID
from services.plugin.oauth_service import OAuthProxyService
from services.tools.api_tools_manage_service import ApiToolManageService
from services.tools.builtin_tools_manage_service import BuiltinToolManageService

View File

@ -0,0 +1,592 @@
import logging
from flask import make_response, redirect, request
from flask_restx import Resource, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden
from configs import dify_config
from controllers.console import api
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.web.error import NotFoundError
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.entities.plugin_daemon import CredentialType
from core.plugin.impl.oauth import OAuthHandler
from core.trigger.entities.entities import SubscriptionBuilderUpdater
from core.trigger.trigger_manager import TriggerManager
from extensions.ext_database import db
from libs.login import current_user, login_required
from models.account import Account
from models.provider_ids import TriggerProviderID
from services.plugin.oauth_service import OAuthProxyService
from services.trigger.trigger_provider_service import TriggerProviderService
from services.trigger.trigger_subscription_builder_service import TriggerSubscriptionBuilderService
from services.trigger.trigger_subscription_operator_service import TriggerSubscriptionOperatorService
logger = logging.getLogger(__name__)
class TriggerProviderIconApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
return TriggerManager.get_trigger_plugin_icon(tenant_id=user.current_tenant_id, provider_id=provider)
class TriggerProviderListApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self):
"""List all trigger providers for the current tenant"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
return jsonable_encoder(TriggerProviderService.list_trigger_providers(user.current_tenant_id))
class TriggerProviderInfoApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""Get info for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
return jsonable_encoder(
TriggerProviderService.get_trigger_provider(user.current_tenant_id, TriggerProviderID(provider))
)
class TriggerSubscriptionListApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""List all trigger subscriptions for the current tenant's provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
return jsonable_encoder(
TriggerProviderService.list_trigger_provider_subscriptions(
tenant_id=user.current_tenant_id, provider_id=TriggerProviderID(provider)
)
)
except ValueError as e:
return jsonable_encoder({"error": str(e)}), 404
except Exception as e:
logger.exception("Error listing trigger providers", exc_info=e)
raise
class TriggerSubscriptionBuilderCreateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider):
"""Add a new subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("credential_type", type=str, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
credential_type = CredentialType.of(args.get("credential_type") or CredentialType.UNAUTHORIZED.value)
subscription_builder = TriggerSubscriptionBuilderService.create_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
user_id=user.id,
provider_id=TriggerProviderID(provider),
credential_type=credential_type,
)
return jsonable_encoder({"subscription_builder": subscription_builder})
except Exception as e:
logger.exception("Error adding provider credential", exc_info=e)
raise
class TriggerSubscriptionBuilderGetApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider, subscription_builder_id):
"""Get a subscription instance for a trigger provider"""
return jsonable_encoder(
TriggerSubscriptionBuilderService.get_subscription_builder_by_id(subscription_builder_id)
)
class TriggerSubscriptionBuilderVerifyApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider, subscription_builder_id):
"""Verify a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
# The credentials of the subscription builder
parser.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
# Use atomic update_and_verify to prevent race conditions
return TriggerSubscriptionBuilderService.update_and_verify_builder(
tenant_id=user.current_tenant_id,
user_id=user.id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
credentials=args.get("credentials", None),
),
)
except Exception as e:
logger.exception("Error verifying provider credential", exc_info=e)
raise ValueError(str(e)) from e
class TriggerSubscriptionBuilderUpdateApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider, subscription_builder_id):
"""Update a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
parser = reqparse.RequestParser()
# The name of the subscription builder
parser.add_argument("name", type=str, required=False, nullable=True, location="json")
# The parameters of the subscription builder
parser.add_argument("parameters", type=dict, required=False, nullable=True, location="json")
# The properties of the subscription builder
parser.add_argument("properties", type=dict, required=False, nullable=True, location="json")
# The credentials of the subscription builder
parser.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
return jsonable_encoder(
TriggerSubscriptionBuilderService.update_trigger_subscription_builder(
tenant_id=user.current_tenant_id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
name=args.get("name", None),
parameters=args.get("parameters", None),
properties=args.get("properties", None),
credentials=args.get("credentials", None),
),
)
)
except Exception as e:
logger.exception("Error updating provider credential", exc_info=e)
raise
class TriggerSubscriptionBuilderLogsApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider, subscription_builder_id):
"""Get the request logs for a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
try:
logs = TriggerSubscriptionBuilderService.list_logs(subscription_builder_id)
return jsonable_encoder({"logs": [log.model_dump(mode="json") for log in logs]})
except Exception as e:
logger.exception("Error getting request logs for subscription builder", exc_info=e)
raise
class TriggerSubscriptionBuilderBuildApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, provider, subscription_builder_id):
"""Build a subscription instance for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
# The name of the subscription builder
parser.add_argument("name", type=str, required=False, nullable=True, location="json")
# The parameters of the subscription builder
parser.add_argument("parameters", type=dict, required=False, nullable=True, location="json")
# The properties of the subscription builder
parser.add_argument("properties", type=dict, required=False, nullable=True, location="json")
# The credentials of the subscription builder
parser.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
# Use atomic update_and_build to prevent race conditions
TriggerSubscriptionBuilderService.update_and_build_builder(
tenant_id=user.current_tenant_id,
user_id=user.id,
provider_id=TriggerProviderID(provider),
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
name=args.get("name", None),
parameters=args.get("parameters", None),
properties=args.get("properties", None),
),
)
return {"result": "success"}
except Exception as e:
logger.exception("Error building provider credential", exc_info=e)
raise ValueError(str(e)) from e
class TriggerSubscriptionDeleteApi(Resource):
@setup_required
@login_required
@account_initialization_required
def post(self, subscription_id: str):
"""Delete a subscription instance"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
with Session(db.engine) as session:
# Delete trigger provider subscription
TriggerProviderService.delete_trigger_provider(
session=session,
tenant_id=user.current_tenant_id,
subscription_id=subscription_id,
)
# Delete plugin triggers
TriggerSubscriptionOperatorService.delete_plugin_trigger_by_subscription(
session=session,
tenant_id=user.current_tenant_id,
subscription_id=subscription_id,
)
session.commit()
return {"result": "success"}
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error deleting provider credential", exc_info=e)
raise
class TriggerOAuthAuthorizeApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""Initiate OAuth authorization flow for a trigger provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
try:
provider_id = TriggerProviderID(provider)
plugin_id = provider_id.plugin_id
provider_name = provider_id.provider_name
tenant_id = user.current_tenant_id
# Get OAuth client configuration
oauth_client_params = TriggerProviderService.get_oauth_client(
tenant_id=tenant_id,
provider_id=provider_id,
)
if oauth_client_params is None:
raise NotFoundError("No OAuth client configuration found for this trigger provider")
# Create subscription builder
subscription_builder = TriggerSubscriptionBuilderService.create_trigger_subscription_builder(
tenant_id=tenant_id,
user_id=user.id,
provider_id=provider_id,
credential_type=CredentialType.OAUTH2,
)
# Create OAuth handler and proxy context
oauth_handler = OAuthHandler()
context_id = OAuthProxyService.create_proxy_context(
user_id=user.id,
tenant_id=tenant_id,
plugin_id=plugin_id,
provider=provider_name,
extra_data={
"subscription_builder_id": subscription_builder.id,
},
)
# Build redirect URI for callback
redirect_uri = f"{dify_config.CONSOLE_API_URL}/console/api/oauth/plugin/{provider}/trigger/callback"
# Get authorization URL
authorization_url_response = oauth_handler.get_authorization_url(
tenant_id=tenant_id,
user_id=user.id,
plugin_id=plugin_id,
provider=provider_name,
redirect_uri=redirect_uri,
system_credentials=oauth_client_params,
)
# Create response with cookie
response = make_response(
jsonable_encoder(
{
"authorization_url": authorization_url_response.authorization_url,
"subscription_builder_id": subscription_builder.id,
"subscription_builder": subscription_builder,
}
)
)
response.set_cookie(
"context_id",
context_id,
httponly=True,
samesite="Lax",
max_age=OAuthProxyService.__MAX_AGE__,
)
return response
except Exception as e:
logger.exception("Error initiating OAuth flow", exc_info=e)
raise
class TriggerOAuthCallbackApi(Resource):
@setup_required
def get(self, provider):
"""Handle OAuth callback for trigger provider"""
context_id = request.cookies.get("context_id")
if not context_id:
raise Forbidden("context_id not found")
# Use and validate proxy context
context = OAuthProxyService.use_proxy_context(context_id)
if context is None:
raise Forbidden("Invalid context_id")
# Parse provider ID
provider_id = TriggerProviderID(provider)
plugin_id = provider_id.plugin_id
provider_name = provider_id.provider_name
user_id = context.get("user_id")
tenant_id = context.get("tenant_id")
subscription_builder_id = context.get("subscription_builder_id")
# Get OAuth client configuration
oauth_client_params = TriggerProviderService.get_oauth_client(
tenant_id=tenant_id,
provider_id=provider_id,
)
if oauth_client_params is None:
raise Forbidden("No OAuth client configuration found for this trigger provider")
# Get OAuth credentials from callback
oauth_handler = OAuthHandler()
redirect_uri = f"{dify_config.CONSOLE_API_URL}/console/api/oauth/plugin/{provider}/trigger/callback"
credentials_response = oauth_handler.get_credentials(
tenant_id=tenant_id,
user_id=user_id,
plugin_id=plugin_id,
provider=provider_name,
redirect_uri=redirect_uri,
system_credentials=oauth_client_params,
request=request,
)
credentials = credentials_response.credentials
expires_at = credentials_response.expires_at
if not credentials:
raise ValueError("Failed to get OAuth credentials from the provider.")
# Update subscription builder
TriggerSubscriptionBuilderService.update_trigger_subscription_builder(
tenant_id=tenant_id,
provider_id=provider_id,
subscription_builder_id=subscription_builder_id,
subscription_builder_updater=SubscriptionBuilderUpdater(
credentials=credentials,
credential_expires_at=expires_at,
),
)
# Redirect to OAuth callback page
return redirect(f"{dify_config.CONSOLE_WEB_URL}/oauth-callback")
class TriggerOAuthClientManageApi(Resource):
@setup_required
@login_required
@account_initialization_required
def get(self, provider):
"""Get OAuth client configuration for a provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
provider_id = TriggerProviderID(provider)
# Get custom OAuth client params if exists
custom_params = TriggerProviderService.get_custom_oauth_client_params(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
# Check if custom client is enabled
is_custom_enabled = TriggerProviderService.is_oauth_custom_client_enabled(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
system_client_exists = TriggerProviderService.is_oauth_system_client_exists(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
provider_controller = TriggerManager.get_trigger_provider(user.current_tenant_id, provider_id)
redirect_uri = f"{dify_config.CONSOLE_API_URL}/console/api/oauth/plugin/{provider}/trigger/callback"
return jsonable_encoder(
{
"configured": bool(custom_params or system_client_exists),
"system_configured": system_client_exists,
"custom_configured": bool(custom_params),
"oauth_client_schema": provider_controller.get_oauth_client_schema(),
"custom_enabled": is_custom_enabled,
"redirect_uri": redirect_uri,
"params": custom_params or {},
}
)
except Exception as e:
logger.exception("Error getting OAuth client", exc_info=e)
raise
@setup_required
@login_required
@account_initialization_required
def post(self, provider):
"""Configure custom OAuth client for a provider"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("client_params", type=dict, required=False, nullable=True, location="json")
parser.add_argument("enabled", type=bool, required=False, nullable=True, location="json")
args = parser.parse_args()
try:
provider_id = TriggerProviderID(provider)
return TriggerProviderService.save_custom_oauth_client_params(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
client_params=args.get("client_params"),
enabled=args.get("enabled"),
)
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error configuring OAuth client", exc_info=e)
raise
@setup_required
@login_required
@account_initialization_required
def delete(self, provider):
"""Remove custom OAuth client configuration"""
user = current_user
assert isinstance(user, Account)
assert user.current_tenant_id is not None
if not user.is_admin_or_owner:
raise Forbidden()
try:
provider_id = TriggerProviderID(provider)
return TriggerProviderService.delete_custom_oauth_client_params(
tenant_id=user.current_tenant_id,
provider_id=provider_id,
)
except ValueError as e:
raise BadRequest(str(e))
except Exception as e:
logger.exception("Error removing OAuth client", exc_info=e)
raise
# Trigger Subscription
api.add_resource(TriggerProviderIconApi, "/workspaces/current/trigger-provider/<path:provider>/icon")
api.add_resource(TriggerProviderListApi, "/workspaces/current/triggers")
api.add_resource(TriggerProviderInfoApi, "/workspaces/current/trigger-provider/<path:provider>/info")
api.add_resource(TriggerSubscriptionListApi, "/workspaces/current/trigger-provider/<path:provider>/subscriptions/list")
api.add_resource(
TriggerSubscriptionDeleteApi,
"/workspaces/current/trigger-provider/<path:subscription_id>/subscriptions/delete",
)
# Trigger Subscription Builder
api.add_resource(
TriggerSubscriptionBuilderCreateApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/create",
)
api.add_resource(
TriggerSubscriptionBuilderGetApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderUpdateApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/update/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderVerifyApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/verify/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderBuildApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/build/<path:subscription_builder_id>",
)
api.add_resource(
TriggerSubscriptionBuilderLogsApi,
"/workspaces/current/trigger-provider/<path:provider>/subscriptions/builder/logs/<path:subscription_builder_id>",
)
# OAuth
api.add_resource(
TriggerOAuthAuthorizeApi, "/workspaces/current/trigger-provider/<path:provider>/subscriptions/oauth/authorize"
)
api.add_resource(TriggerOAuthCallbackApi, "/oauth/plugin/<path:provider>/trigger/callback")
api.add_resource(TriggerOAuthClientManageApi, "/workspaces/current/trigger-provider/<path:provider>/oauth/client")

View File

@ -19,7 +19,8 @@ from libs.datetime_utils import naive_utc_now
from libs.login import current_user
from models import Account, Tenant, TenantAccountJoin, TenantStatus
from models.dataset import Dataset, RateLimitLog
from models.model import ApiToken, App, DefaultEndUserSessionID, EndUser
from models.model import ApiToken, App
from services.end_user_service import EndUserService
from services.feature_service import FeatureService
P = ParamSpec("P")
@ -83,7 +84,7 @@ def validate_app_token(view: Callable[P, R] | None = None, *, fetch_user_arg: Fe
if user_id:
user_id = str(user_id)
end_user = create_or_update_end_user_for_user_id(app_model, user_id)
end_user = EndUserService.get_or_create_end_user(app_model, user_id)
kwargs["end_user"] = end_user
# Set EndUser as current logged-in user for flask_login.current_user
@ -308,39 +309,6 @@ def validate_and_get_api_token(scope: str | None = None):
return api_token
def create_or_update_end_user_for_user_id(app_model: App, user_id: str | None = None) -> EndUser:
"""
Create or update session terminal based on user ID.
"""
if not user_id:
user_id = DefaultEndUserSessionID.DEFAULT_SESSION_ID
with Session(db.engine, expire_on_commit=False) as session:
end_user = (
session.query(EndUser)
.where(
EndUser.tenant_id == app_model.tenant_id,
EndUser.app_id == app_model.id,
EndUser.session_id == user_id,
EndUser.type == "service_api",
)
.first()
)
if end_user is None:
end_user = EndUser(
tenant_id=app_model.tenant_id,
app_id=app_model.id,
type="service_api",
is_anonymous=user_id == DefaultEndUserSessionID.DEFAULT_SESSION_ID,
session_id=user_id,
)
session.add(end_user)
session.commit()
return end_user
class DatasetApiResource(Resource):
method_decorators = [validate_dataset_token]

View File

@ -0,0 +1,12 @@
from flask import Blueprint
# Create trigger blueprint
bp = Blueprint("trigger", __name__, url_prefix="/triggers")
# Import routes after blueprint creation to avoid circular imports
from . import trigger, webhook
__all__ = [
"trigger",
"webhook",
]
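A minimal sketch of wiring this blueprint into the Flask app; only the /triggers url_prefix comes from the file above, the registration site is an assumption:
from flask import Flask
from controllers.trigger import bp as trigger_bp
app = Flask(__name__)
app.register_blueprint(trigger_bp)  # serves /triggers/plugin/<id> and /triggers/webhook/<id>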

View File

@ -0,0 +1,44 @@
import logging
import re
from flask import jsonify, request
from werkzeug.exceptions import NotFound
from controllers.trigger import bp
from services.trigger.trigger_service import TriggerService
from services.trigger.trigger_subscription_builder_service import TriggerSubscriptionBuilderService
logger = logging.getLogger(__name__)
UUID_PATTERN = r"^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$"
UUID_MATCHER = re.compile(UUID_PATTERN)
@bp.route("/plugin/<string:endpoint_id>", methods=["GET", "POST", "PUT", "PATCH", "DELETE", "HEAD", "OPTIONS"])
def trigger_endpoint(endpoint_id: str):
"""
Handle endpoint trigger calls.
"""
# endpoint_id must be a version-4 UUID (see UUID_PATTERN above)
if not UUID_MATCHER.match(endpoint_id):
raise NotFound("Invalid endpoint ID")
handling_chain = [
TriggerService.process_endpoint,
TriggerSubscriptionBuilderService.process_builder_validation_endpoint,
]
response = None
try:
for handler in handling_chain:
response = handler(endpoint_id, request)
if response:
break
if not response:
logger.error("Endpoint not found for {endpoint_id}")
return jsonify({"error": "Endpoint not found"}), 404
return response
except ValueError as e:
logger.exception("Endpoint processing failed for {endpoint_id}: {e}")
return jsonify({"error": "Endpoint processing failed", "message": str(e)}), 500
except Exception:
logger.exception("Unexpected error processing endpoint %s", endpoint_id)
return jsonify({"error": "Internal server error"}), 500

View File

@ -0,0 +1,105 @@
import logging
import time
from flask import jsonify
from werkzeug.exceptions import NotFound, RequestEntityTooLarge
from controllers.trigger import bp
from core.trigger.debug.event_bus import TriggerDebugEventBus
from core.trigger.debug.events import WebhookDebugEvent, build_webhook_pool_key
from services.trigger.webhook_service import WebhookService
logger = logging.getLogger(__name__)
def _prepare_webhook_execution(webhook_id: str, is_debug: bool = False):
"""Fetch trigger context, extract request data, and validate payload using unified processing.
Args:
webhook_id: The webhook ID to process
is_debug: If True, skip status validation for debug mode
"""
webhook_trigger, workflow, node_config = WebhookService.get_webhook_trigger_and_workflow(
webhook_id, is_debug=is_debug
)
try:
# Use new unified extraction and validation
webhook_data = WebhookService.extract_and_validate_webhook_data(webhook_trigger, node_config)
return webhook_trigger, workflow, node_config, webhook_data, None
except ValueError as e:
# Fall back to raw extraction for error reporting
webhook_data = WebhookService.extract_webhook_data(webhook_trigger)
return webhook_trigger, workflow, node_config, webhook_data, str(e)
@bp.route("/webhook/<string:webhook_id>", methods=["GET", "POST", "PUT", "PATCH", "DELETE", "HEAD", "OPTIONS"])
def handle_webhook(webhook_id: str):
"""
Handle webhook trigger calls.
This endpoint receives webhook calls and processes them according to the
configured webhook trigger settings.
"""
try:
webhook_trigger, workflow, node_config, webhook_data, error = _prepare_webhook_execution(webhook_id)
if error:
return jsonify({"error": "Bad Request", "message": error}), 400
# Process webhook call (send to Celery)
WebhookService.trigger_workflow_execution(webhook_trigger, webhook_data, workflow)
# Return configured response
response_data, status_code = WebhookService.generate_webhook_response(node_config)
return jsonify(response_data), status_code
except ValueError as e:
raise NotFound(str(e))
except RequestEntityTooLarge:
raise
except Exception as e:
logger.exception("Webhook processing failed for %s", webhook_id)
return jsonify({"error": "Internal server error", "message": str(e)}), 500
@bp.route("/webhook-debug/<string:webhook_id>", methods=["GET", "POST", "PUT", "PATCH", "DELETE", "HEAD", "OPTIONS"])
def handle_webhook_debug(webhook_id: str):
"""Handle webhook debug calls without triggering production workflow execution."""
try:
webhook_trigger, _, node_config, webhook_data, error = _prepare_webhook_execution(webhook_id, is_debug=True)
if error:
return jsonify({"error": "Bad Request", "message": error}), 400
workflow_inputs = WebhookService.build_workflow_inputs(webhook_data)
# Generate pool key and dispatch debug event
pool_key: str = build_webhook_pool_key(
tenant_id=webhook_trigger.tenant_id,
app_id=webhook_trigger.app_id,
node_id=webhook_trigger.node_id,
)
event = WebhookDebugEvent(
request_id=f"webhook_debug_{webhook_trigger.webhook_id}_{int(time.time() * 1000)}",
timestamp=int(time.time()),
node_id=webhook_trigger.node_id,
payload={
"inputs": workflow_inputs,
"webhook_data": webhook_data,
"method": webhook_data.get("method"),
},
)
TriggerDebugEventBus.dispatch(
tenant_id=webhook_trigger.tenant_id,
event=event,
pool_key=pool_key,
)
response_data, status_code = WebhookService.generate_webhook_response(node_config)
return jsonify(response_data), status_code
except ValueError as e:
raise NotFound(str(e))
except RequestEntityTooLarge:
raise
except Exception as e:
logger.exception("Webhook debug processing failed for %s", webhook_id)
return jsonify({"error": "Internal server error", "message": "An internal error has occurred."}), 500

View File

@ -37,6 +37,7 @@ from core.file import FILE_MODEL_IDENTITY, File
from core.plugin.impl.datasource import PluginDatasourceManager
from core.tools.entities.tool_entities import ToolProviderType
from core.tools.tool_manager import ToolManager
from core.trigger.trigger_manager import TriggerManager
from core.variables.segments import ArrayFileSegment, FileSegment, Segment
from core.workflow.enums import (
NodeType,
@ -303,6 +304,11 @@ class WorkflowResponseConverter:
response.data.extras["icon"] = provider_entity.declaration.identity.generate_datasource_icon_url(
self._application_generate_entity.app_config.tenant_id
)
elif event.node_type == NodeType.TRIGGER_PLUGIN:
response.data.extras["icon"] = TriggerManager.get_trigger_plugin_icon(
self._application_generate_entity.app_config.tenant_id,
event.provider_id,
)
return response

View File

@ -27,6 +27,7 @@ from core.helper.trace_id_helper import extract_external_trace_id_from_args
from core.model_runtime.errors.invoke import InvokeAuthorizationError
from core.ops.ops_trace_manager import TraceQueueManager
from core.repositories import DifyCoreRepositoryFactory
from core.workflow.graph_engine.layers.base import GraphEngineLayer
from core.workflow.repositories.draft_variable_repository import DraftVariableSaverFactory
from core.workflow.repositories.workflow_execution_repository import WorkflowExecutionRepository
from core.workflow.repositories.workflow_node_execution_repository import WorkflowNodeExecutionRepository
@ -53,7 +54,10 @@ class WorkflowAppGenerator(BaseAppGenerator):
invoke_from: InvokeFrom,
streaming: Literal[True],
call_depth: int,
) -> Generator[Mapping | str, None, None]: ...
triggered_from: WorkflowRunTriggeredFrom | None = None,
root_node_id: str | None = None,
graph_engine_layers: Sequence[GraphEngineLayer] = (),
) -> Generator[Mapping[str, Any] | str, None, None]: ...
@overload
def generate(
@ -66,6 +70,9 @@ class WorkflowAppGenerator(BaseAppGenerator):
invoke_from: InvokeFrom,
streaming: Literal[False],
call_depth: int,
triggered_from: WorkflowRunTriggeredFrom | None = None,
root_node_id: str | None = None,
graph_engine_layers: Sequence[GraphEngineLayer] = (),
) -> Mapping[str, Any]: ...
@overload
@ -79,7 +86,10 @@ class WorkflowAppGenerator(BaseAppGenerator):
invoke_from: InvokeFrom,
streaming: bool,
call_depth: int,
) -> Union[Mapping[str, Any], Generator[Mapping | str, None, None]]: ...
triggered_from: WorkflowRunTriggeredFrom | None = None,
root_node_id: str | None = None,
graph_engine_layers: Sequence[GraphEngineLayer] = (),
) -> Union[Mapping[str, Any], Generator[Mapping[str, Any] | str, None, None]]: ...
def generate(
self,
@ -91,7 +101,10 @@ class WorkflowAppGenerator(BaseAppGenerator):
invoke_from: InvokeFrom,
streaming: bool = True,
call_depth: int = 0,
) -> Union[Mapping[str, Any], Generator[Mapping | str, None, None]]:
triggered_from: WorkflowRunTriggeredFrom | None = None,
root_node_id: str | None = None,
graph_engine_layers: Sequence[GraphEngineLayer] = (),
) -> Union[Mapping[str, Any], Generator[Mapping[str, Any] | str, None, None]]:
files: Sequence[Mapping[str, Any]] = args.get("files") or []
# parse files
@ -126,17 +139,20 @@ class WorkflowAppGenerator(BaseAppGenerator):
**extract_external_trace_id_from_args(args),
}
workflow_run_id = str(uuid.uuid4())
if triggered_from in (WorkflowRunTriggeredFrom.DEBUGGING, WorkflowRunTriggeredFrom.APP_RUN):
# start node get inputs
inputs = self._prepare_user_inputs(
user_inputs=inputs,
variables=app_config.variables,
tenant_id=app_model.tenant_id,
strict_type_validation=True if invoke_from == InvokeFrom.SERVICE_API else False,
)
# init application generate entity
application_generate_entity = WorkflowAppGenerateEntity(
task_id=str(uuid.uuid4()),
app_config=app_config,
file_upload_config=file_extra_config,
inputs=self._prepare_user_inputs(
user_inputs=inputs,
variables=app_config.variables,
tenant_id=app_model.tenant_id,
strict_type_validation=True if invoke_from == InvokeFrom.SERVICE_API else False,
),
inputs=inputs,
files=list(system_files),
user_id=user.id,
stream=streaming,
@ -155,7 +171,10 @@ class WorkflowAppGenerator(BaseAppGenerator):
# Create session factory
session_factory = sessionmaker(bind=db.engine, expire_on_commit=False)
# Create workflow execution(aka workflow run) repository
if invoke_from == InvokeFrom.DEBUGGER:
if triggered_from is not None:
# Use explicitly provided triggered_from (for async triggers)
workflow_triggered_from = triggered_from
elif invoke_from == InvokeFrom.DEBUGGER:
workflow_triggered_from = WorkflowRunTriggeredFrom.DEBUGGING
else:
workflow_triggered_from = WorkflowRunTriggeredFrom.APP_RUN
@ -182,8 +201,16 @@ class WorkflowAppGenerator(BaseAppGenerator):
workflow_execution_repository=workflow_execution_repository,
workflow_node_execution_repository=workflow_node_execution_repository,
streaming=streaming,
root_node_id=root_node_id,
graph_engine_layers=graph_engine_layers,
)
def resume(self, *, workflow_run_id: str) -> None:
"""
@TBD
"""
pass
def _generate(
self,
*,
@ -196,6 +223,8 @@ class WorkflowAppGenerator(BaseAppGenerator):
workflow_node_execution_repository: WorkflowNodeExecutionRepository,
streaming: bool = True,
variable_loader: VariableLoader = DUMMY_VARIABLE_LOADER,
root_node_id: str | None = None,
graph_engine_layers: Sequence[GraphEngineLayer] = (),
) -> Union[Mapping[str, Any], Generator[str | Mapping[str, Any], None, None]]:
"""
Generate App response.
@ -231,8 +260,10 @@ class WorkflowAppGenerator(BaseAppGenerator):
"queue_manager": queue_manager,
"context": context,
"variable_loader": variable_loader,
"root_node_id": root_node_id,
"workflow_execution_repository": workflow_execution_repository,
"workflow_node_execution_repository": workflow_node_execution_repository,
"graph_engine_layers": graph_engine_layers,
},
)
@ -426,6 +457,8 @@ class WorkflowAppGenerator(BaseAppGenerator):
variable_loader: VariableLoader,
workflow_execution_repository: WorkflowExecutionRepository,
workflow_node_execution_repository: WorkflowNodeExecutionRepository,
root_node_id: str | None = None,
graph_engine_layers: Sequence[GraphEngineLayer] = (),
) -> None:
"""
Generate worker in a new thread.
@ -469,6 +502,8 @@ class WorkflowAppGenerator(BaseAppGenerator):
system_user_id=system_user_id,
workflow_execution_repository=workflow_execution_repository,
workflow_node_execution_repository=workflow_node_execution_repository,
root_node_id=root_node_id,
graph_engine_layers=graph_engine_layers,
)
try:

View File

@ -1,5 +1,6 @@
import logging
import time
from collections.abc import Sequence
from typing import cast
from core.app.apps.base_app_queue_manager import AppQueueManager
@ -8,6 +9,7 @@ from core.app.apps.workflow_app_runner import WorkflowBasedAppRunner
from core.app.entities.app_invoke_entities import InvokeFrom, WorkflowAppGenerateEntity
from core.workflow.enums import WorkflowType
from core.workflow.graph_engine.command_channels.redis_channel import RedisChannel
from core.workflow.graph_engine.layers.base import GraphEngineLayer
from core.workflow.graph_engine.layers.persistence import PersistenceWorkflowInfo, WorkflowPersistenceLayer
from core.workflow.repositories.workflow_execution_repository import WorkflowExecutionRepository
from core.workflow.repositories.workflow_node_execution_repository import WorkflowNodeExecutionRepository
@ -16,6 +18,7 @@ from core.workflow.system_variable import SystemVariable
from core.workflow.variable_loader import VariableLoader
from core.workflow.workflow_entry import WorkflowEntry
from extensions.ext_redis import redis_client
from libs.datetime_utils import naive_utc_now
from models.enums import UserFrom
from models.workflow import Workflow
@ -35,17 +38,21 @@ class WorkflowAppRunner(WorkflowBasedAppRunner):
variable_loader: VariableLoader,
workflow: Workflow,
system_user_id: str,
root_node_id: str | None = None,
workflow_execution_repository: WorkflowExecutionRepository,
workflow_node_execution_repository: WorkflowNodeExecutionRepository,
graph_engine_layers: Sequence[GraphEngineLayer] = (),
):
super().__init__(
queue_manager=queue_manager,
variable_loader=variable_loader,
app_id=application_generate_entity.app_config.app_id,
graph_engine_layers=graph_engine_layers,
)
self.application_generate_entity = application_generate_entity
self._workflow = workflow
self._sys_user_id = system_user_id
self._root_node_id = root_node_id
self._workflow_execution_repository = workflow_execution_repository
self._workflow_node_execution_repository = workflow_node_execution_repository
@ -60,6 +67,7 @@ class WorkflowAppRunner(WorkflowBasedAppRunner):
files=self.application_generate_entity.files,
user_id=self._sys_user_id,
app_id=app_config.app_id,
timestamp=int(naive_utc_now().timestamp()),
workflow_id=app_config.workflow_id,
workflow_execution_id=self.application_generate_entity.workflow_execution_id,
)
@ -92,6 +100,7 @@ class WorkflowAppRunner(WorkflowBasedAppRunner):
workflow_id=self._workflow.id,
tenant_id=self._workflow.tenant_id,
user_id=self.application_generate_entity.user_id,
root_node_id=self._root_node_id,
)
# RUN WORKFLOW

View File

@ -84,6 +84,7 @@ class WorkflowBasedAppRunner:
workflow_id: str = "",
tenant_id: str = "",
user_id: str = "",
root_node_id: str | None = None,
) -> Graph:
"""
Init graph
@ -117,7 +118,7 @@ class WorkflowBasedAppRunner:
)
# init graph
graph = Graph.init(graph_config=graph_config, node_factory=node_factory)
graph = Graph.init(graph_config=graph_config, node_factory=node_factory, root_node_id=root_node_id)
if not graph:
raise ValueError("graph not found in workflow")

View File

@ -32,6 +32,10 @@ class InvokeFrom(StrEnum):
# https://docs.dify.ai/en/guides/application-publishing/launch-your-webapp-quickly/README
WEB_APP = "web-app"
# TRIGGER indicates that this invocation is from a trigger.
# This is used for plugin triggers and webhook triggers.
TRIGGER = "trigger"
# EXPLORE indicates that this invocation is from
# the workflow (or chatflow) explore page.
EXPLORE = "explore"
@ -65,6 +69,8 @@ class InvokeFrom(StrEnum):
return "dev"
elif self == InvokeFrom.EXPLORE:
return "explore_app"
elif self == InvokeFrom.TRIGGER:
return "trigger"
elif self == InvokeFrom.SERVICE_API:
return "api"

View File

@ -1,5 +1,5 @@
from sqlalchemy import Engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm import Session, sessionmaker
from core.workflow.graph_engine.layers.base import GraphEngineLayer
from core.workflow.graph_events.base import GraphEngineEvent
@ -9,7 +9,7 @@ from repositories.factory import DifyAPIRepositoryFactory
class PauseStatePersistenceLayer(GraphEngineLayer):
def __init__(self, session_factory: Engine | sessionmaker, state_owner_user_id: str):
def __init__(self, session_factory: Engine | sessionmaker[Session], state_owner_user_id: str):
"""Create a PauseStatePersistenceLayer.
The `state_owner_user_id` is used when creating state file for pause.

View File

@ -0,0 +1,21 @@
from core.workflow.graph_engine.layers.base import GraphEngineLayer
from core.workflow.graph_events.base import GraphEngineEvent
from core.workflow.graph_events.graph import GraphRunPausedEvent
class SuspendLayer(GraphEngineLayer):
""" """
def on_graph_start(self):
pass
def on_event(self, event: GraphEngineEvent):
"""
Handle the paused event, stash runtime state into storage and wait for resume.
"""
if isinstance(event, GraphRunPausedEvent):
pass
def on_graph_end(self, error: Exception | None):
""" """
pass

View File

@ -0,0 +1,88 @@
import logging
import uuid
from typing import ClassVar
from apscheduler.schedulers.background import BackgroundScheduler # type: ignore
from core.workflow.graph_engine.entities.commands import CommandType, GraphEngineCommand
from core.workflow.graph_engine.layers.base import GraphEngineLayer
from core.workflow.graph_events.base import GraphEngineEvent
from services.workflow.entities import WorkflowScheduleCFSPlanEntity
from services.workflow.scheduler import CFSPlanScheduler, SchedulerCommand
logger = logging.getLogger(__name__)
class TimeSliceLayer(GraphEngineLayer):
"""
CFS plan scheduler to control the timeslice of the workflow.
"""
scheduler: ClassVar[BackgroundScheduler] = BackgroundScheduler()
def __init__(self, cfs_plan_scheduler: CFSPlanScheduler) -> None:
"""
CFS plan scheduler that controls the time slice of the workflow.
"""
if not TimeSliceLayer.scheduler.running:
TimeSliceLayer.scheduler.start()
super().__init__()
self.cfs_plan_scheduler = cfs_plan_scheduler
self.stopped = False
self.schedule_id = ""
def _checker_job(self, schedule_id: str):
"""
Check whether the workflow needs to be suspended.
"""
try:
if self.stopped:
self.scheduler.remove_job(schedule_id)
return
if self.cfs_plan_scheduler.can_schedule() == SchedulerCommand.RESOURCE_LIMIT_REACHED:
# remove the job
self.scheduler.remove_job(schedule_id)
if not self.command_channel:
logger.exception("No command channel to stop the workflow")
return
# send command to pause the workflow
self.command_channel.send_command(
GraphEngineCommand(
command_type=CommandType.PAUSE,
payload={
"reason": SchedulerCommand.RESOURCE_LIMIT_REACHED,
},
)
)
except Exception:
logger.exception("scheduler error during check if the workflow need to be suspended")
def on_graph_start(self):
"""
Start a timer that checks whether the workflow needs to be suspended.
"""
if self.cfs_plan_scheduler.plan.schedule_strategy == WorkflowScheduleCFSPlanEntity.Strategy.TimeSlice:
self.schedule_id = uuid.uuid4().hex
self.scheduler.add_job(
lambda: self._checker_job(self.schedule_id),
"interval",
seconds=self.cfs_plan_scheduler.plan.granularity,
id=self.schedule_id,
)
def on_event(self, event: GraphEngineEvent):
pass
def on_graph_end(self, error: Exception | None) -> None:
self.stopped = True
# remove the scheduler
if self.schedule_id:
self.scheduler.remove_job(self.schedule_id)

View File

@ -0,0 +1,88 @@
import logging
from datetime import UTC, datetime
from typing import Any, ClassVar
from pydantic import TypeAdapter
from sqlalchemy.orm import Session, sessionmaker
from core.workflow.graph_engine.layers.base import GraphEngineLayer
from core.workflow.graph_events.base import GraphEngineEvent
from core.workflow.graph_events.graph import GraphRunFailedEvent, GraphRunPausedEvent, GraphRunSucceededEvent
from models.enums import WorkflowTriggerStatus
from repositories.sqlalchemy_workflow_trigger_log_repository import SQLAlchemyWorkflowTriggerLogRepository
from tasks.workflow_cfs_scheduler.cfs_scheduler import AsyncWorkflowCFSPlanEntity
logger = logging.getLogger(__name__)
class TriggerPostLayer(GraphEngineLayer):
"""
Trigger post layer.
"""
_STATUS_MAP: ClassVar[dict[type[GraphEngineEvent], WorkflowTriggerStatus]] = {
GraphRunSucceededEvent: WorkflowTriggerStatus.SUCCEEDED,
GraphRunFailedEvent: WorkflowTriggerStatus.FAILED,
GraphRunPausedEvent: WorkflowTriggerStatus.PAUSED,
}
def __init__(
self,
cfs_plan_scheduler_entity: AsyncWorkflowCFSPlanEntity,
start_time: datetime,
trigger_log_id: str,
session_maker: sessionmaker[Session],
):
self.trigger_log_id = trigger_log_id
self.start_time = start_time
self.cfs_plan_scheduler_entity = cfs_plan_scheduler_entity
self.session_maker = session_maker
def on_graph_start(self):
pass
def on_event(self, event: GraphEngineEvent):
"""
Update trigger log with success or failure.
"""
if isinstance(event, tuple(self._STATUS_MAP.keys())):
with self.session_maker() as session:
repo = SQLAlchemyWorkflowTriggerLogRepository(session)
trigger_log = repo.get_by_id(self.trigger_log_id)
if not trigger_log:
logger.exception("Trigger log not found: %s", self.trigger_log_id)
return
# Calculate elapsed time
elapsed_time = (datetime.now(UTC) - self.start_time).total_seconds()
# Extract relevant data from result
if not self.graph_runtime_state:
logger.exception("Graph runtime state is not set")
return
outputs = self.graph_runtime_state.outputs
# Basically, workflow_execution_id is the same as workflow_run_id
workflow_run_id = self.graph_runtime_state.system_variable.workflow_execution_id
assert workflow_run_id, "Workflow run id is not set"
total_tokens = self.graph_runtime_state.total_tokens
# Update trigger log with success
trigger_log.status = self._STATUS_MAP[type(event)]
trigger_log.workflow_run_id = workflow_run_id
trigger_log.outputs = TypeAdapter(dict[str, Any]).dump_json(outputs).decode()
if trigger_log.elapsed_time is None:
trigger_log.elapsed_time = elapsed_time
else:
trigger_log.elapsed_time += elapsed_time
trigger_log.total_tokens = total_tokens
trigger_log.finished_at = datetime.now(UTC)
repo.update(trigger_log)
session.commit()
def on_graph_end(self, error: Exception | None) -> None:
pass

View File

@ -14,6 +14,7 @@ class CommonParameterType(StrEnum):
APP_SELECTOR = "app-selector"
MODEL_SELECTOR = "model-selector"
TOOLS_SELECTOR = "array[tools]"
CHECKBOX = "checkbox"
ANY = auto()
# Dynamic select parameter

View File

@ -107,7 +107,7 @@ class CustomModelConfiguration(BaseModel):
model: str
model_type: ModelType
credentials: dict | None = None
credentials: dict | None
current_credential_id: str | None = None
current_credential_name: str | None = None
available_model_credentials: list[CredentialConfiguration] = []
@ -207,6 +207,7 @@ class ProviderConfig(BasicProviderConfig):
required: bool = False
default: Union[int, str, float, bool] | None = None
options: list[Option] | None = None
multiple: bool | None = False
label: I18nObject | None = None
help: I18nObject | None = None
url: str | None = None

View File

@ -3,7 +3,7 @@ import re
from collections.abc import Sequence
from typing import Any
from core.tools.entities.tool_entities import CredentialType
from core.plugin.entities.plugin_daemon import CredentialType
logger = logging.getLogger(__name__)

View File

@ -0,0 +1,129 @@
import contextlib
from collections.abc import Mapping
from copy import deepcopy
from typing import Any, Protocol
from core.entities.provider_entities import BasicProviderConfig
from core.helper import encrypter
class ProviderConfigCache(Protocol):
"""
Interface for provider configuration cache operations
"""
def get(self) -> dict[str, Any] | None:
"""Get cached provider configuration"""
...
def set(self, config: dict[str, Any]) -> None:
"""Cache provider configuration"""
...
def delete(self) -> None:
"""Delete cached provider configuration"""
...
class ProviderConfigEncrypter:
tenant_id: str
config: list[BasicProviderConfig]
provider_config_cache: ProviderConfigCache
def __init__(
self,
tenant_id: str,
config: list[BasicProviderConfig],
provider_config_cache: ProviderConfigCache,
):
self.tenant_id = tenant_id
self.config = config
self.provider_config_cache = provider_config_cache
def _deep_copy(self, data: Mapping[str, Any]) -> Mapping[str, Any]:
"""
deep copy data
"""
return deepcopy(data)
def encrypt(self, data: Mapping[str, Any]) -> Mapping[str, Any]:
"""
encrypt tool credentials with tenant id
return a deep copy of credentials with encrypted values
"""
data = dict(self._deep_copy(data))
# get fields that need to be encrypted
fields = dict[str, BasicProviderConfig]()
for credential in self.config:
fields[credential.name] = credential
for field_name, field in fields.items():
if field.type == BasicProviderConfig.Type.SECRET_INPUT:
if field_name in data:
encrypted = encrypter.encrypt_token(self.tenant_id, data[field_name] or "")
data[field_name] = encrypted
return data
def mask_credentials(self, data: Mapping[str, Any]) -> Mapping[str, Any]:
"""
mask credentials
return a deep copy of credentials with masked values
"""
data = dict(self._deep_copy(data))
# get fields that need to be masked
fields = dict[str, BasicProviderConfig]()
for credential in self.config:
fields[credential.name] = credential
for field_name, field in fields.items():
if field.type == BasicProviderConfig.Type.SECRET_INPUT:
if field_name in data:
if len(data[field_name]) > 6:
data[field_name] = (
data[field_name][:2] + "*" * (len(data[field_name]) - 4) + data[field_name][-2:]
)
else:
data[field_name] = "*" * len(data[field_name])
return data
def mask_plugin_credentials(self, data: Mapping[str, Any]) -> Mapping[str, Any]:
return self.mask_credentials(data)
def decrypt(self, data: Mapping[str, Any]) -> Mapping[str, Any]:
"""
decrypt tool credentials with tenant id
return a deep copy of credentials with decrypted values
"""
cached_credentials = self.provider_config_cache.get()
if cached_credentials:
return cached_credentials
data = dict(self._deep_copy(data))
# get fields that need to be decrypted
fields = dict[str, BasicProviderConfig]()
for credential in self.config:
fields[credential.name] = credential
for field_name, field in fields.items():
if field.type == BasicProviderConfig.Type.SECRET_INPUT:
if field_name in data:
with contextlib.suppress(Exception):
# if the value is None or empty string, skip decrypt
if not data[field_name]:
continue
data[field_name] = encrypter.decrypt_token(self.tenant_id, data[field_name])
self.provider_config_cache.set(dict(data))
return data
def create_provider_encrypter(tenant_id: str, config: list[BasicProviderConfig], cache: ProviderConfigCache):
return ProviderConfigEncrypter(tenant_id=tenant_id, config=config, provider_config_cache=cache), cache
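A minimal usage sketch for this module, assuming an in-memory cache that satisfies the ProviderConfigCache protocol and a single SECRET_INPUT field named api_key (both hypothetical; encrypt_token/decrypt_token need the tenant's key material, so this only runs inside the application):

from typing import Any

class InMemoryCache:
    # Hypothetical cache satisfying the ProviderConfigCache protocol.
    def __init__(self) -> None:
        self._store: dict[str, Any] | None = None

    def get(self) -> dict[str, Any] | None:
        return self._store

    def set(self, config: dict[str, Any]) -> None:
        self._store = config

    def delete(self) -> None:
        self._store = None

# Field layout assumed from how BasicProviderConfig is used above.
config = [BasicProviderConfig(type=BasicProviderConfig.Type.SECRET_INPUT, name="api_key")]
encrypter_, cache = create_provider_encrypter("tenant-1", config, InMemoryCache())

masked = encrypter_.mask_credentials({"api_key": "sk-abcdef123456"})
# Secrets longer than 6 chars keep the first and last two characters and
# star the middle; shorter secrets are fully starred.

encrypted = encrypter_.encrypt({"api_key": "sk-abcdef123456"})
decrypted = encrypter_.decrypt(encrypted)  # plaintext mapping, now cached in InMemoryCache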

View File

@ -4,7 +4,6 @@ from typing import Union
from sqlalchemy import select
from sqlalchemy.orm import Session
from controllers.service_api.wraps import create_or_update_end_user_for_user_id
from core.app.app_config.common.parameters_mapping import get_parameters_from_feature_dict
from core.app.apps.advanced_chat.app_generator import AdvancedChatAppGenerator
from core.app.apps.agent_chat.app_generator import AgentChatAppGenerator
@ -16,6 +15,7 @@ from core.plugin.backwards_invocation.base import BaseBackwardsInvocation
from extensions.ext_database import db
from models import Account
from models.model import App, AppMode, EndUser
from services.end_user_service import EndUserService
class PluginAppBackwardsInvocation(BaseBackwardsInvocation):
@ -64,7 +64,7 @@ class PluginAppBackwardsInvocation(BaseBackwardsInvocation):
"""
app = cls._get_app(app_id, tenant_id)
if not user_id:
user = create_or_update_end_user_for_user_id(app)
user = EndUserService.get_or_create_end_user(app)
else:
user = cls._get_user(user_id)

View File

@ -39,7 +39,7 @@ class PluginParameterType(StrEnum):
TOOLS_SELECTOR = CommonParameterType.TOOLS_SELECTOR
ANY = CommonParameterType.ANY
DYNAMIC_SELECT = CommonParameterType.DYNAMIC_SELECT
CHECKBOX = CommonParameterType.CHECKBOX
# deprecated, should not use.
SYSTEM_FILES = CommonParameterType.SYSTEM_FILES
@ -94,6 +94,7 @@ def as_normal_type(typ: StrEnum):
if typ.value in {
PluginParameterType.SECRET_INPUT,
PluginParameterType.SELECT,
PluginParameterType.CHECKBOX,
}:
return "string"
return typ.value
@ -102,7 +103,13 @@ def as_normal_type(typ: StrEnum):
def cast_parameter_value(typ: StrEnum, value: Any, /):
try:
match typ.value:
case PluginParameterType.STRING | PluginParameterType.SECRET_INPUT | PluginParameterType.SELECT:
case (
PluginParameterType.STRING
| PluginParameterType.SECRET_INPUT
| PluginParameterType.SELECT
| PluginParameterType.CHECKBOX
| PluginParameterType.DYNAMIC_SELECT
):
if value is None:
return ""
else:

View File

@ -13,6 +13,7 @@ from core.plugin.entities.base import BasePluginEntity
from core.plugin.entities.endpoint import EndpointProviderDeclaration
from core.tools.entities.common_entities import I18nObject
from core.tools.entities.tool_entities import ToolProviderEntity
from core.trigger.entities.entities import TriggerProviderEntity
class PluginInstallationSource(StrEnum):
@ -63,6 +64,7 @@ class PluginCategory(StrEnum):
Extension = auto()
AgentStrategy = "agent-strategy"
Datasource = "datasource"
Trigger = "trigger"
class PluginDeclaration(BaseModel):
@ -71,6 +73,7 @@ class PluginDeclaration(BaseModel):
models: list[str] | None = Field(default_factory=list[str])
endpoints: list[str] | None = Field(default_factory=list[str])
datasources: list[str] | None = Field(default_factory=list[str])
triggers: list[str] | None = Field(default_factory=list[str])
class Meta(BaseModel):
minimum_dify_version: str | None = Field(default=None)
@ -106,6 +109,7 @@ class PluginDeclaration(BaseModel):
endpoint: EndpointProviderDeclaration | None = None
agent_strategy: AgentStrategyProviderEntity | None = None
datasource: DatasourceProviderEntity | None = None
trigger: TriggerProviderEntity | None = None
meta: Meta
@field_validator("version")
@ -129,6 +133,8 @@ class PluginDeclaration(BaseModel):
values["category"] = PluginCategory.Datasource
elif values.get("agent_strategy"):
values["category"] = PluginCategory.AgentStrategy
elif values.get("trigger"):
values["category"] = PluginCategory.Trigger
else:
values["category"] = PluginCategory.Extension
return values

View File

@ -1,3 +1,4 @@
import enum
from collections.abc import Mapping, Sequence
from datetime import datetime
from enum import StrEnum
@ -14,6 +15,7 @@ from core.plugin.entities.parameters import PluginParameterOption
from core.plugin.entities.plugin import PluginDeclaration, PluginEntity
from core.tools.entities.common_entities import I18nObject
from core.tools.entities.tool_entities import ToolProviderEntityWithPlugin
from core.trigger.entities.entities import TriggerProviderEntity
T = TypeVar("T", bound=(BaseModel | dict | list | bool | str))
@ -205,3 +207,53 @@ class PluginListResponse(BaseModel):
class PluginDynamicSelectOptionsResponse(BaseModel):
options: Sequence[PluginParameterOption] = Field(description="The options of the dynamic select.")
class PluginTriggerProviderEntity(BaseModel):
provider: str
plugin_unique_identifier: str
plugin_id: str
declaration: TriggerProviderEntity
class CredentialType(enum.StrEnum):
API_KEY = "api-key"
OAUTH2 = "oauth2"
UNAUTHORIZED = "unauthorized"
def get_name(self):
if self == CredentialType.API_KEY:
return "API KEY"
elif self == CredentialType.OAUTH2:
return "AUTH"
elif self == CredentialType.UNAUTHORIZED:
return "UNAUTHORIZED"
else:
return self.value.replace("-", " ").upper()
def is_editable(self):
return self == CredentialType.API_KEY
def is_validate_allowed(self):
return self == CredentialType.API_KEY
@classmethod
def values(cls):
return [item.value for item in cls]
@classmethod
def of(cls, credential_type: str) -> "CredentialType":
type_name = credential_type.lower()
if type_name in {"api-key", "api_key"}:
return cls.API_KEY
elif type_name in {"oauth2", "oauth"}:
return cls.OAUTH2
elif type_name == "unauthorized":
return cls.UNAUTHORIZED
else:
raise ValueError(f"Invalid credential type: {credential_type}")
class PluginReadmeResponse(BaseModel):
content: str = Field(description="The readme of the plugin.")
language: str = Field(description="The language of the readme.")

View File

@ -1,5 +1,9 @@
import binascii
import json
from collections.abc import Mapping
from typing import Any, Literal
from flask import Response
from pydantic import BaseModel, ConfigDict, Field, field_validator
from core.entities.provider_entities import BasicProviderConfig
@ -13,6 +17,7 @@ from core.model_runtime.entities.message_entities import (
UserPromptMessage,
)
from core.model_runtime.entities.model_entities import ModelType
from core.plugin.utils.http_parser import deserialize_response
from core.workflow.nodes.parameter_extractor.entities import (
ModelConfig as ParameterExtractorModelConfig,
)
@ -237,3 +242,43 @@ class RequestFetchAppInfo(BaseModel):
"""
app_id: str
class TriggerInvokeEventResponse(BaseModel):
variables: Mapping[str, Any] = Field(default_factory=dict)
cancelled: bool | None = False
model_config = ConfigDict(protected_namespaces=(), arbitrary_types_allowed=True)
@field_validator("variables", mode="before")
@classmethod
def convert_variables(cls, v):
if isinstance(v, str):
return json.loads(v)
else:
return v
class TriggerSubscriptionResponse(BaseModel):
subscription: dict[str, Any]
class TriggerValidateProviderCredentialsResponse(BaseModel):
result: bool
class TriggerDispatchResponse(BaseModel):
user_id: str
events: list[str]
response: Response
payload: Mapping[str, Any] = Field(default_factory=dict)
model_config = ConfigDict(protected_namespaces=(), arbitrary_types_allowed=True)
@field_validator("response", mode="before")
@classmethod
def convert_response(cls, v: str):
try:
return deserialize_response(binascii.unhexlify(v.encode()))
except Exception as e:
raise ValueError("Failed to deserialize response from hex string") from e

View File

@ -10,3 +10,13 @@ class PluginAssetManager(BasePluginClient):
if response.status_code != 200:
raise ValueError(f"can not found asset {id}")
return response.content
def extract_asset(self, tenant_id: str, plugin_unique_identifier: str, filename: str) -> bytes:
response = self._request(
method="GET",
path=f"plugin/{tenant_id}/extract-asset/",
params={"plugin_unique_identifier": plugin_unique_identifier, "file_path": filename},
)
if response.status_code != 200:
raise ValueError(f"can not found asset {plugin_unique_identifier}, {str(response.status_code)}")
return response.content

View File

@ -29,6 +29,12 @@ from core.plugin.impl.exc import (
PluginPermissionDeniedError,
PluginUniqueIdentifierError,
)
from core.trigger.errors import (
EventIgnoreError,
TriggerInvokeError,
TriggerPluginInvokeError,
TriggerProviderCredentialValidationError,
)
plugin_daemon_inner_api_baseurl = URL(str(dify_config.PLUGIN_DAEMON_URL))
_plugin_daemon_timeout_config = cast(
@ -43,7 +49,7 @@ elif isinstance(_plugin_daemon_timeout_config, httpx.Timeout):
else:
plugin_daemon_request_timeout = httpx.Timeout(_plugin_daemon_timeout_config)
T = TypeVar("T", bound=(BaseModel | dict | list | bool | str))
T = TypeVar("T", bound=(BaseModel | dict[str, Any] | list[Any] | bool | str))
logger = logging.getLogger(__name__)
@ -53,10 +59,10 @@ class BasePluginClient:
self,
method: str,
path: str,
headers: dict | None = None,
data: bytes | dict | str | None = None,
params: dict | None = None,
files: dict | None = None,
headers: dict[str, str] | None = None,
data: bytes | dict[str, Any] | str | None = None,
params: dict[str, Any] | None = None,
files: dict[str, Any] | None = None,
) -> httpx.Response:
"""
Make a request to the plugin daemon inner API.
@ -87,17 +93,17 @@ class BasePluginClient:
def _prepare_request(
self,
path: str,
headers: dict | None,
data: bytes | dict | str | None,
params: dict | None,
files: dict | None,
) -> tuple[str, dict, bytes | dict | str | None, dict | None, dict | None]:
headers: dict[str, str] | None,
data: bytes | dict[str, Any] | str | None,
params: dict[str, Any] | None,
files: dict[str, Any] | None,
) -> tuple[str, dict[str, str], bytes | dict[str, Any] | str | None, dict[str, Any] | None, dict[str, Any] | None]:
url = plugin_daemon_inner_api_baseurl / path
prepared_headers = dict(headers or {})
prepared_headers["X-Api-Key"] = dify_config.PLUGIN_DAEMON_KEY
prepared_headers.setdefault("Accept-Encoding", "gzip, deflate, br")
prepared_data: bytes | dict | str | None = (
prepared_data: bytes | dict[str, Any] | str | None = (
data if isinstance(data, (bytes, str, dict)) or data is None else None
)
if isinstance(data, dict):
@ -112,10 +118,10 @@ class BasePluginClient:
self,
method: str,
path: str,
params: dict | None = None,
headers: dict | None = None,
data: bytes | dict | None = None,
files: dict | None = None,
params: dict[str, Any] | None = None,
headers: dict[str, str] | None = None,
data: bytes | dict[str, Any] | None = None,
files: dict[str, Any] | None = None,
) -> Generator[str, None, None]:
"""
Make a stream request to the plugin daemon inner API
@ -138,7 +144,7 @@ class BasePluginClient:
try:
with httpx.stream(**stream_kwargs) as response:
for raw_line in response.iter_lines():
if raw_line is None:
if not raw_line:
continue
line = raw_line.decode("utf-8") if isinstance(raw_line, bytes) else raw_line
line = line.strip()
@ -155,10 +161,10 @@ class BasePluginClient:
method: str,
path: str,
type_: type[T],
headers: dict | None = None,
data: bytes | dict | None = None,
params: dict | None = None,
files: dict | None = None,
headers: dict[str, str] | None = None,
data: bytes | dict[str, Any] | None = None,
params: dict[str, Any] | None = None,
files: dict[str, Any] | None = None,
) -> Generator[T, None, None]:
"""
Make a stream request to the plugin daemon inner API and yield the response as a model.
@ -171,10 +177,10 @@ class BasePluginClient:
method: str,
path: str,
type_: type[T],
headers: dict | None = None,
headers: dict[str, str] | None = None,
data: bytes | None = None,
params: dict | None = None,
files: dict | None = None,
params: dict[str, Any] | None = None,
files: dict[str, Any] | None = None,
) -> T:
"""
Make a request to the plugin daemon inner API and return the response as a model.
@ -187,11 +193,11 @@ class BasePluginClient:
method: str,
path: str,
type_: type[T],
headers: dict | None = None,
data: bytes | dict | None = None,
params: dict | None = None,
files: dict | None = None,
transformer: Callable[[dict], dict] | None = None,
headers: dict[str, str] | None = None,
data: bytes | dict[str, Any] | None = None,
params: dict[str, Any] | None = None,
files: dict[str, Any] | None = None,
transformer: Callable[[dict[str, Any]], dict[str, Any]] | None = None,
) -> T:
"""
Make a request to the plugin daemon inner API and return the response as a model.
@ -239,10 +245,10 @@ class BasePluginClient:
method: str,
path: str,
type_: type[T],
headers: dict | None = None,
data: bytes | dict | None = None,
params: dict | None = None,
files: dict | None = None,
headers: dict[str, str] | None = None,
data: bytes | dict[str, Any] | None = None,
params: dict[str, Any] | None = None,
files: dict[str, Any] | None = None,
) -> Generator[T, None, None]:
"""
Make a stream request to the plugin daemon inner API and yield the response as a model.
@ -302,6 +308,14 @@ class BasePluginClient:
raise CredentialsValidateFailedError(error_object.get("message"))
case EndpointSetupFailedError.__name__:
raise EndpointSetupFailedError(error_object.get("message"))
case TriggerProviderCredentialValidationError.__name__:
raise TriggerProviderCredentialValidationError(error_object.get("message"))
case TriggerPluginInvokeError.__name__:
raise TriggerPluginInvokeError(description=error_object.get("description"))
case TriggerInvokeError.__name__:
raise TriggerInvokeError(error_object.get("message"))
case EventIgnoreError.__name__:
raise EventIgnoreError(description=error_object.get("description"))
case _:
raise PluginInvokeError(description=message)
case PluginDaemonInternalServerError.__name__:

View File

@ -15,6 +15,7 @@ class DynamicSelectClient(BasePluginClient):
provider: str,
action: str,
credentials: Mapping[str, Any],
credential_type: str,
parameter: str,
) -> PluginDynamicSelectOptionsResponse:
"""
@ -29,6 +30,7 @@ class DynamicSelectClient(BasePluginClient):
"data": {
"provider": GenericProviderID(provider).provider_name,
"credentials": credentials,
"credential_type": credential_type,
"provider_action": action,
"parameter": parameter,
},

View File

@ -58,6 +58,20 @@ class PluginInvokeError(PluginDaemonClientSideError, ValueError):
except Exception:
return self.description
def to_user_friendly_error(self, plugin_name: str = "currently running plugin") -> str:
"""
Convert the error to a user-friendly error message.
:param plugin_name: The name of the plugin that caused the error.
:return: A user-friendly error message.
"""
return (
f"An error occurred in the {plugin_name}, "
f"please contact the author of {plugin_name} for help, "
f"error type: {self.get_error_type()}, "
f"error details: {self.get_error_message()}"
)
class PluginUniqueIdentifierError(PluginDaemonClientSideError):
description: str = "Unique Identifier Error"

View File

@ -1,5 +1,7 @@
from collections.abc import Sequence
from requests import HTTPError
from core.plugin.entities.bundle import PluginBundleDependency
from core.plugin.entities.plugin import (
MissingPluginDependency,
@ -13,12 +15,35 @@ from core.plugin.entities.plugin_daemon import (
PluginInstallTask,
PluginInstallTaskStartResponse,
PluginListResponse,
PluginReadmeResponse,
)
from core.plugin.impl.base import BasePluginClient
from models.provider_ids import GenericProviderID
class PluginInstaller(BasePluginClient):
def fetch_plugin_readme(self, tenant_id: str, plugin_unique_identifier: str, language: str) -> str:
"""
Fetch plugin readme
"""
try:
response = self._request_with_plugin_daemon_response(
"GET",
f"plugin/{tenant_id}/management/fetch/readme",
PluginReadmeResponse,
params={
"tenant_id": tenant_id,
"plugin_unique_identifier": plugin_unique_identifier,
"language": language,
},
)
return response.content
except HTTPError as e:
message = e.args[0]
if "404" in message:
return ""
raise e
def fetch_plugin_by_identifier(
self,
tenant_id: str,

View File

@ -3,14 +3,12 @@ from typing import Any
from pydantic import BaseModel
from core.plugin.entities.plugin_daemon import (
PluginBasicBooleanResponse,
PluginToolProviderEntity,
)
# from core.plugin.entities.plugin import GenericProviderID, ToolProviderID
from core.plugin.entities.plugin_daemon import CredentialType, PluginBasicBooleanResponse, PluginToolProviderEntity
from core.plugin.impl.base import BasePluginClient
from core.plugin.utils.chunk_merger import merge_blob_chunks
from core.schemas.resolver import resolve_dify_schema_refs
from core.tools.entities.tool_entities import CredentialType, ToolInvokeMessage, ToolParameter
from core.tools.entities.tool_entities import ToolInvokeMessage, ToolParameter
from models.provider_ids import GenericProviderID, ToolProviderID

View File

@ -0,0 +1,305 @@
import binascii
from collections.abc import Generator, Mapping
from typing import Any
from flask import Request
from core.plugin.entities.plugin_daemon import CredentialType, PluginTriggerProviderEntity
from core.plugin.entities.request import (
TriggerDispatchResponse,
TriggerInvokeEventResponse,
TriggerSubscriptionResponse,
TriggerValidateProviderCredentialsResponse,
)
from core.plugin.impl.base import BasePluginClient
from core.plugin.utils.http_parser import serialize_request
from core.trigger.entities.entities import Subscription
from models.provider_ids import TriggerProviderID
class PluginTriggerClient(BasePluginClient):
def fetch_trigger_providers(self, tenant_id: str) -> list[PluginTriggerProviderEntity]:
"""
Fetch trigger providers for the given tenant.
"""
def transformer(json_response: dict[str, Any]) -> dict[str, Any]:
for provider in json_response.get("data", []):
declaration = provider.get("declaration", {}) or {}
provider_id = provider.get("plugin_id") + "/" + provider.get("provider")
for event in declaration.get("events", []):
event["identity"]["provider"] = provider_id
return json_response
response: list[PluginTriggerProviderEntity] = self._request_with_plugin_daemon_response(
method="GET",
path=f"plugin/{tenant_id}/management/triggers",
type_=list[PluginTriggerProviderEntity],
params={"page": 1, "page_size": 256},
transformer=transformer,
)
for provider in response:
provider.declaration.identity.name = f"{provider.plugin_id}/{provider.declaration.identity.name}"
# override the provider name for each trigger to plugin_id/provider_name
for event in provider.declaration.events:
event.identity.provider = provider.declaration.identity.name
return response
def fetch_trigger_provider(self, tenant_id: str, provider_id: TriggerProviderID) -> PluginTriggerProviderEntity:
"""
Fetch trigger provider for the given tenant and plugin.
"""
def transformer(json_response: dict[str, Any]) -> dict[str, Any]:
data = json_response.get("data")
if data:
for event in data.get("declaration", {}).get("events", []):
event["identity"]["provider"] = str(provider_id)
return json_response
response: PluginTriggerProviderEntity = self._request_with_plugin_daemon_response(
method="GET",
path=f"plugin/{tenant_id}/management/trigger",
type_=PluginTriggerProviderEntity,
params={"provider": provider_id.provider_name, "plugin_id": provider_id.plugin_id},
transformer=transformer,
)
response.declaration.identity.name = str(provider_id)
# override the provider name for each trigger to plugin_id/provider_name
for event in response.declaration.events:
event.identity.provider = str(provider_id)
return response
def invoke_trigger_event(
self,
tenant_id: str,
user_id: str,
provider: str,
event_name: str,
credentials: Mapping[str, str],
credential_type: CredentialType,
request: Request,
parameters: Mapping[str, Any],
subscription: Subscription,
payload: Mapping[str, Any],
) -> TriggerInvokeEventResponse:
"""
Invoke a trigger with the given parameters.
"""
provider_id = TriggerProviderID(provider)
response: Generator[TriggerInvokeEventResponse, None, None] = self._request_with_plugin_daemon_response_stream(
method="POST",
path=f"plugin/{tenant_id}/dispatch/trigger/invoke_event",
type_=TriggerInvokeEventResponse,
data={
"user_id": user_id,
"data": {
"provider": provider_id.provider_name,
"event": event_name,
"credentials": credentials,
"credential_type": credential_type,
"subscription": subscription.model_dump(),
"raw_http_request": binascii.hexlify(serialize_request(request)).decode(),
"parameters": parameters,
"payload": payload,
},
},
headers={
"X-Plugin-ID": provider_id.plugin_id,
"Content-Type": "application/json",
},
)
for resp in response:
return resp
raise ValueError("No response received from plugin daemon for invoke trigger")
def validate_provider_credentials(
self, tenant_id: str, user_id: str, provider: str, credentials: Mapping[str, str]
) -> bool:
"""
Validate the credentials of the trigger provider.
"""
provider_id = TriggerProviderID(provider)
response: Generator[TriggerValidateProviderCredentialsResponse, None, None] = (
self._request_with_plugin_daemon_response_stream(
method="POST",
path=f"plugin/{tenant_id}/dispatch/trigger/validate_credentials",
type_=TriggerValidateProviderCredentialsResponse,
data={
"user_id": user_id,
"data": {
"provider": provider_id.provider_name,
"credentials": credentials,
},
},
headers={
"X-Plugin-ID": provider_id.plugin_id,
"Content-Type": "application/json",
},
)
)
for resp in response:
return resp.result
raise ValueError("No response received from plugin daemon for validate provider credentials")
def dispatch_event(
self,
tenant_id: str,
provider: str,
subscription: Mapping[str, Any],
request: Request,
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> TriggerDispatchResponse:
"""
Dispatch an event to triggers.
"""
provider_id = TriggerProviderID(provider)
response = self._request_with_plugin_daemon_response_stream(
method="POST",
path=f"plugin/{tenant_id}/dispatch/trigger/dispatch_event",
type_=TriggerDispatchResponse,
data={
"data": {
"provider": provider_id.provider_name,
"subscription": subscription,
"credentials": credentials,
"credential_type": credential_type,
"raw_http_request": binascii.hexlify(serialize_request(request)).decode(),
},
},
headers={
"X-Plugin-ID": provider_id.plugin_id,
"Content-Type": "application/json",
},
)
for resp in response:
return resp
raise ValueError("No response received from plugin daemon for dispatch event")
def subscribe(
self,
tenant_id: str,
user_id: str,
provider: str,
credentials: Mapping[str, str],
credential_type: CredentialType,
endpoint: str,
parameters: Mapping[str, Any],
) -> TriggerSubscriptionResponse:
"""
Subscribe to a trigger.
"""
provider_id = TriggerProviderID(provider)
response: Generator[TriggerSubscriptionResponse, None, None] = self._request_with_plugin_daemon_response_stream(
method="POST",
path=f"plugin/{tenant_id}/dispatch/trigger/subscribe",
type_=TriggerSubscriptionResponse,
data={
"user_id": user_id,
"data": {
"provider": provider_id.provider_name,
"credentials": credentials,
"credential_type": credential_type,
"endpoint": endpoint,
"parameters": parameters,
},
},
headers={
"X-Plugin-ID": provider_id.plugin_id,
"Content-Type": "application/json",
},
)
for resp in response:
return resp
raise ValueError("No response received from plugin daemon for subscribe")
def unsubscribe(
self,
tenant_id: str,
user_id: str,
provider: str,
subscription: Subscription,
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> TriggerSubscriptionResponse:
"""
Unsubscribe from a trigger.
"""
provider_id = TriggerProviderID(provider)
response: Generator[TriggerSubscriptionResponse, None, None] = self._request_with_plugin_daemon_response_stream(
method="POST",
path=f"plugin/{tenant_id}/dispatch/trigger/unsubscribe",
type_=TriggerSubscriptionResponse,
data={
"user_id": user_id,
"data": {
"provider": provider_id.provider_name,
"subscription": subscription.model_dump(),
"credentials": credentials,
"credential_type": credential_type,
},
},
headers={
"X-Plugin-ID": provider_id.plugin_id,
"Content-Type": "application/json",
},
)
for resp in response:
return resp
raise ValueError("No response received from plugin daemon for unsubscribe")
def refresh(
self,
tenant_id: str,
user_id: str,
provider: str,
subscription: Subscription,
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> TriggerSubscriptionResponse:
"""
Refresh a trigger subscription.
"""
provider_id = TriggerProviderID(provider)
response: Generator[TriggerSubscriptionResponse, None, None] = self._request_with_plugin_daemon_response_stream(
method="POST",
path=f"plugin/{tenant_id}/dispatch/trigger/refresh",
type_=TriggerSubscriptionResponse,
data={
"user_id": user_id,
"data": {
"provider": provider_id.provider_name,
"subscription": subscription.model_dump(),
"credentials": credentials,
"credential_type": credential_type,
},
},
headers={
"X-Plugin-ID": provider_id.plugin_id,
"Content-Type": "application/json",
},
)
for resp in response:
return resp
raise ValueError("No response received from plugin daemon for refresh")

View File

@ -0,0 +1,163 @@
from io import BytesIO
from flask import Request, Response
from werkzeug.datastructures import Headers
def serialize_request(request: Request) -> bytes:
method = request.method
path = request.full_path.rstrip("?")
raw = f"{method} {path} HTTP/1.1\r\n".encode()
for name, value in request.headers.items():
raw += f"{name}: {value}\r\n".encode()
raw += b"\r\n"
body = request.get_data(as_text=False)
if body:
raw += body
return raw
def deserialize_request(raw_data: bytes) -> Request:
header_end = raw_data.find(b"\r\n\r\n")
if header_end == -1:
header_end = raw_data.find(b"\n\n")
if header_end == -1:
header_data = raw_data
body = b""
else:
header_data = raw_data[:header_end]
body = raw_data[header_end + 2 :]
else:
header_data = raw_data[:header_end]
body = raw_data[header_end + 4 :]
lines = header_data.split(b"\r\n")
if len(lines) == 1 and b"\n" in lines[0]:
lines = header_data.split(b"\n")
if not lines or not lines[0]:
raise ValueError("Empty HTTP request")
request_line = lines[0].decode("utf-8", errors="ignore")
parts = request_line.split(" ", 2)
if len(parts) < 2:
raise ValueError(f"Invalid request line: {request_line}")
method = parts[0]
full_path = parts[1]
protocol = parts[2] if len(parts) > 2 else "HTTP/1.1"
if "?" in full_path:
path, query_string = full_path.split("?", 1)
else:
path = full_path
query_string = ""
headers = Headers()
for line in lines[1:]:
if not line:
continue
line_str = line.decode("utf-8", errors="ignore")
if ":" not in line_str:
continue
name, value = line_str.split(":", 1)
headers.add(name, value.strip())
host = headers.get("Host", "localhost")
if ":" in host:
server_name, server_port = host.rsplit(":", 1)
else:
server_name = host
server_port = "80"
environ = {
"REQUEST_METHOD": method,
"PATH_INFO": path,
"QUERY_STRING": query_string,
"SERVER_NAME": server_name,
"SERVER_PORT": server_port,
"SERVER_PROTOCOL": protocol,
"wsgi.input": BytesIO(body),
"wsgi.url_scheme": "http",
}
if "Content-Type" in headers:
content_type = headers.get("Content-Type")
if content_type is not None:
environ["CONTENT_TYPE"] = content_type
if "Content-Length" in headers:
content_length = headers.get("Content-Length")
if content_length is not None:
environ["CONTENT_LENGTH"] = content_length
elif body:
environ["CONTENT_LENGTH"] = str(len(body))
for name, value in headers.items():
if name.upper() in ("CONTENT-TYPE", "CONTENT-LENGTH"):
continue
env_name = f"HTTP_{name.upper().replace('-', '_')}"
environ[env_name] = value
return Request(environ)
def serialize_response(response: Response) -> bytes:
raw = f"HTTP/1.1 {response.status}\r\n".encode()
for name, value in response.headers.items():
raw += f"{name}: {value}\r\n".encode()
raw += b"\r\n"
body = response.get_data(as_text=False)
if body:
raw += body
return raw
def deserialize_response(raw_data: bytes) -> Response:
header_end = raw_data.find(b"\r\n\r\n")
if header_end == -1:
header_end = raw_data.find(b"\n\n")
if header_end == -1:
header_data = raw_data
body = b""
else:
header_data = raw_data[:header_end]
body = raw_data[header_end + 2 :]
else:
header_data = raw_data[:header_end]
body = raw_data[header_end + 4 :]
lines = header_data.split(b"\r\n")
if len(lines) == 1 and b"\n" in lines[0]:
lines = header_data.split(b"\n")
if not lines or not lines[0]:
raise ValueError("Empty HTTP response")
status_line = lines[0].decode("utf-8", errors="ignore")
parts = status_line.split(" ", 2)
if len(parts) < 2:
raise ValueError(f"Invalid status line: {status_line}")
status_code = int(parts[1])
response = Response(response=body, status=status_code)
for line in lines[1:]:
if not line:
continue
line_str = line.decode("utf-8", errors="ignore")
if ":" not in line_str:
continue
name, value = line_str.split(":", 1)
response.headers[name] = value.strip()
return response
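A quick round-trip check for the request serializers above, built with Werkzeug's test utilities (runnable anywhere Flask is installed):

from werkzeug.test import EnvironBuilder

builder = EnvironBuilder(
    method="POST",
    path="/hooks/demo?source=test",
    data=b'{"ok": true}',
    content_type="application/json",
)
original = Request(builder.get_environ())
restored = deserialize_request(serialize_request(original))
assert restored.method == "POST"
assert restored.path == "/hooks/demo"
assert restored.query_string == b"source=test"
assert restored.get_data() == b'{"ok": true}'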

View File

@ -3,7 +3,8 @@ from typing import Any
from pydantic import BaseModel, Field
from core.app.entities.app_invoke_entities import InvokeFrom
from core.tools.entities.tool_entities import CredentialType, ToolInvokeFrom
from core.plugin.entities.plugin_daemon import CredentialType
from core.tools.entities.tool_entities import ToolInvokeFrom
class ToolRuntime(BaseModel):

View File

@ -4,11 +4,11 @@ from typing import Any
from core.entities.provider_entities import ProviderConfig
from core.helper.module_import_helper import load_single_subclass_from_source
from core.plugin.entities.plugin_daemon import CredentialType
from core.tools.__base.tool_provider import ToolProviderController
from core.tools.__base.tool_runtime import ToolRuntime
from core.tools.builtin_tool.tool import BuiltinTool
from core.tools.entities.tool_entities import (
CredentialType,
OAuthSchema,
ToolEntity,
ToolProviderEntity,

View File

@ -6,9 +6,10 @@ from pydantic import BaseModel, Field, field_validator
from core.entities.mcp_provider import MCPAuthentication, MCPConfiguration
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.entities.plugin_daemon import CredentialType
from core.tools.__base.tool import ToolParameter
from core.tools.entities.common_entities import I18nObject
from core.tools.entities.tool_entities import CredentialType, ToolProviderType
from core.tools.entities.tool_entities import ToolProviderType
class ToolApiEntity(BaseModel):

View File

@ -267,6 +267,7 @@ class ToolParameter(PluginParameter):
SECRET_INPUT = PluginParameterType.SECRET_INPUT
FILE = PluginParameterType.FILE
FILES = PluginParameterType.FILES
CHECKBOX = PluginParameterType.CHECKBOX
APP_SELECTOR = PluginParameterType.APP_SELECTOR
MODEL_SELECTOR = PluginParameterType.MODEL_SELECTOR
ANY = PluginParameterType.ANY
@ -488,36 +489,3 @@ class ToolSelector(BaseModel):
def to_plugin_parameter(self) -> dict[str, Any]:
return self.model_dump()
class CredentialType(StrEnum):
API_KEY = "api-key"
OAUTH2 = auto()
def get_name(self):
if self == CredentialType.API_KEY:
return "API KEY"
elif self == CredentialType.OAUTH2:
return "AUTH"
else:
return self.value.replace("-", " ").upper()
def is_editable(self):
return self == CredentialType.API_KEY
def is_validate_allowed(self):
return self == CredentialType.API_KEY
@classmethod
def values(cls):
return [item.value for item in cls]
@classmethod
def of(cls, credential_type: str) -> "CredentialType":
type_name = credential_type.lower()
if type_name in {"api-key", "api_key"}:
return cls.API_KEY
elif type_name in {"oauth2", "oauth"}:
return cls.OAUTH2
else:
raise ValueError(f"Invalid credential type: {credential_type}")

View File

@ -8,7 +8,6 @@ from threading import Lock
from typing import TYPE_CHECKING, Any, Literal, Optional, Union, cast
import sqlalchemy as sa
from pydantic import TypeAdapter
from sqlalchemy import select
from sqlalchemy.orm import Session
from yarl import URL
@ -39,6 +38,7 @@ from core.app.entities.app_invoke_entities import InvokeFrom
from core.helper.module_import_helper import load_single_subclass_from_source
from core.helper.position_helper import is_filtered
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.entities.plugin_daemon import CredentialType
from core.tools.__base.tool import Tool
from core.tools.builtin_tool.provider import BuiltinToolProviderController
from core.tools.builtin_tool.providers._positions import BuiltinToolProviderSort
@ -49,7 +49,6 @@ from core.tools.entities.api_entities import ToolProviderApiEntity, ToolProvider
from core.tools.entities.common_entities import I18nObject
from core.tools.entities.tool_entities import (
ApiProviderAuthType,
CredentialType,
ToolInvokeFrom,
ToolParameter,
ToolProviderType,
@ -289,10 +288,8 @@ class ToolManager:
credentials=decrypted_credentials,
)
# update the credentials
builtin_provider.encrypted_credentials = (
TypeAdapter(dict[str, Any])
.dump_json(encrypter.encrypt(dict(refreshed_credentials.credentials)))
.decode("utf-8")
builtin_provider.encrypted_credentials = json.dumps(
encrypter.encrypt(refreshed_credentials.credentials)
)
builtin_provider.expires_at = refreshed_credentials.expires_at
db.session.commit()
@ -322,7 +319,7 @@ class ToolManager:
return api_provider.get_tool(tool_name).fork_tool_runtime(
runtime=ToolRuntime(
tenant_id=tenant_id,
credentials=encrypter.decrypt(credentials),
credentials=dict(encrypter.decrypt(credentials)),
invoke_from=invoke_from,
tool_invoke_from=tool_invoke_from,
)
@ -833,7 +830,7 @@ class ToolManager:
controller=controller,
)
masked_credentials = encrypter.mask_tool_credentials(encrypter.decrypt(credentials))
masked_credentials = encrypter.mask_plugin_credentials(encrypter.decrypt(credentials))
try:
icon = json.loads(provider_obj.icon)

View File

@ -1,137 +1,24 @@
import contextlib
from copy import deepcopy
from typing import Any, Protocol
# Import generic components from provider_encryption module
from core.helper.provider_encryption import (
ProviderConfigCache,
ProviderConfigEncrypter,
create_provider_encrypter,
)
from core.entities.provider_entities import BasicProviderConfig
from core.helper import encrypter
# Re-export for backward compatibility
__all__ = [
"ProviderConfigCache",
"ProviderConfigEncrypter",
"create_provider_encrypter",
"create_tool_provider_encrypter",
]
# Tool-specific imports
from core.helper.provider_cache import SingletonProviderCredentialsCache
from core.tools.__base.tool_provider import ToolProviderController
class ProviderConfigCache(Protocol):
"""
Interface for provider configuration cache operations
"""
def get(self) -> dict | None:
"""Get cached provider configuration"""
...
def set(self, config: dict[str, Any]):
"""Cache provider configuration"""
...
def delete(self):
"""Delete cached provider configuration"""
...
class ProviderConfigEncrypter:
tenant_id: str
config: list[BasicProviderConfig]
provider_config_cache: ProviderConfigCache
def __init__(
self,
tenant_id: str,
config: list[BasicProviderConfig],
provider_config_cache: ProviderConfigCache,
):
self.tenant_id = tenant_id
self.config = config
self.provider_config_cache = provider_config_cache
def _deep_copy(self, data: dict[str, str]) -> dict[str, str]:
"""
deep copy data
"""
return deepcopy(data)
def encrypt(self, data: dict[str, str]) -> dict[str, str]:
"""
encrypt tool credentials with tenant id
return a deep copy of credentials with encrypted values
"""
data = self._deep_copy(data)
# get fields need to be decrypted
fields = dict[str, BasicProviderConfig]()
for credential in self.config:
fields[credential.name] = credential
for field_name, field in fields.items():
if field.type == BasicProviderConfig.Type.SECRET_INPUT:
if field_name in data:
encrypted = encrypter.encrypt_token(self.tenant_id, data[field_name] or "")
data[field_name] = encrypted
return data
def mask_tool_credentials(self, data: dict[str, Any]) -> dict[str, Any]:
"""
mask tool credentials
return a deep copy of credentials with masked values
"""
data = self._deep_copy(data)
# get fields need to be decrypted
fields = dict[str, BasicProviderConfig]()
for credential in self.config:
fields[credential.name] = credential
for field_name, field in fields.items():
if field.type == BasicProviderConfig.Type.SECRET_INPUT:
if field_name in data:
if len(data[field_name]) > 6:
data[field_name] = (
data[field_name][:2] + "*" * (len(data[field_name]) - 4) + data[field_name][-2:]
)
else:
data[field_name] = "*" * len(data[field_name])
return data
def decrypt(self, data: dict[str, str]) -> dict[str, Any]:
"""
decrypt tool credentials with tenant id
return a deep copy of credentials with decrypted values
"""
cached_credentials = self.provider_config_cache.get()
if cached_credentials:
return cached_credentials
data = self._deep_copy(data)
# get fields need to be decrypted
fields = dict[str, BasicProviderConfig]()
for credential in self.config:
fields[credential.name] = credential
for field_name, field in fields.items():
if field.type == BasicProviderConfig.Type.SECRET_INPUT:
if field_name in data:
with contextlib.suppress(Exception):
# if the value is None or empty string, skip decrypt
if not data[field_name]:
continue
data[field_name] = encrypter.decrypt_token(self.tenant_id, data[field_name])
self.provider_config_cache.set(data)
return data
def create_provider_encrypter(
tenant_id: str, config: list[BasicProviderConfig], cache: ProviderConfigCache
) -> tuple[ProviderConfigEncrypter, ProviderConfigCache]:
return ProviderConfigEncrypter(tenant_id=tenant_id, config=config, provider_config_cache=cache), cache
def create_tool_provider_encrypter(
tenant_id: str, controller: ToolProviderController
) -> tuple[ProviderConfigEncrypter, ProviderConfigCache]:
def create_tool_provider_encrypter(tenant_id: str, controller: ToolProviderController):
cache = SingletonProviderCredentialsCache(
tenant_id=tenant_id,
provider_type=controller.provider_type.value,

View File

@ -0,0 +1 @@
# Core trigger module initialization

View File

@ -0,0 +1,124 @@
import hashlib
import logging
from typing import TypeVar
from redis import RedisError
from core.trigger.debug.events import BaseDebugEvent
from extensions.ext_redis import redis_client
logger = logging.getLogger(__name__)
TRIGGER_DEBUG_EVENT_TTL = 300
TTriggerDebugEvent = TypeVar("TTriggerDebugEvent", bound="BaseDebugEvent")
class TriggerDebugEventBus:
"""
Unified Redis-based trigger debug service with polling support.
Uses {tenant_id} hash tags for Redis Cluster compatibility.
Supports multiple event types through a generic dispatch/poll interface.
"""
# LUA_SELECT: Atomic poll or register for event
# KEYS[1] = trigger_debug_inbox:{tenant_id}:{address_id}
# KEYS[2] = trigger_debug_waiting_pool:{tenant_id}:...
# ARGV[1] = address_id
LUA_SELECT = (
"local v=redis.call('GET',KEYS[1]);"
"if v then redis.call('DEL',KEYS[1]);return v end;"
"redis.call('SADD',KEYS[2],ARGV[1]);"
f"redis.call('EXPIRE',KEYS[2],{TRIGGER_DEBUG_EVENT_TTL});"
"return false"
)
# LUA_DISPATCH: Dispatch event to all waiting addresses
# KEYS[1] = trigger_debug_waiting_pool:{tenant_id}:...
# ARGV[1] = tenant_id
# ARGV[2] = event_json
LUA_DISPATCH = (
"local a=redis.call('SMEMBERS',KEYS[1]);"
"if #a==0 then return 0 end;"
"redis.call('DEL',KEYS[1]);"
"for i=1,#a do "
f"redis.call('SET','trigger_debug_inbox:'..ARGV[1]..':'..a[i],ARGV[2],'EX',{TRIGGER_DEBUG_EVENT_TTL});"
"end;"
"return #a"
)
@classmethod
def dispatch(
cls,
tenant_id: str,
event: BaseDebugEvent,
pool_key: str,
) -> int:
"""
Dispatch event to all waiting addresses in the pool.
Args:
tenant_id: Tenant ID for hash tag
event: Event object to dispatch
pool_key: Pool key (generate using the matching build_*_pool_key(...) helper)
Returns:
Number of addresses the event was dispatched to
"""
event_data = event.model_dump_json()
try:
result = redis_client.eval(
cls.LUA_DISPATCH,
1,
pool_key,
tenant_id,
event_data,
)
return int(result)
except RedisError:
logger.exception("Failed to dispatch event to pool: %s", pool_key)
return 0
@classmethod
def poll(
cls,
event_type: type[TTriggerDebugEvent],
pool_key: str,
tenant_id: str,
user_id: str,
app_id: str,
node_id: str,
) -> TTriggerDebugEvent | None:
"""
Poll for an event or register to the waiting pool.
If an event is available in the inbox, return it immediately.
Otherwise, register the address to the waiting pool for future dispatch.
Args:
event_type: Event class used for deserialization and type safety
pool_key: Pool key (generate using the matching build_*_pool_key(...) helper)
tenant_id: Tenant ID
user_id: User ID for address calculation
app_id: App ID for address calculation
node_id: Node ID for address calculation
Returns:
Event object if available, None otherwise
"""
address_id: str = hashlib.sha256(f"{user_id}|{app_id}|{node_id}".encode()).hexdigest()
address: str = f"trigger_debug_inbox:{tenant_id}:{address_id}"
try:
event_data = redis_client.eval(
cls.LUA_SELECT,
2,
address,
pool_key,
address_id,
)
return event_type.model_validate_json(json_data=event_data) if event_data else None
except RedisError:
logger.exception("Failed to poll event from pool: %s", pool_key)
return None
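A usage sketch of the poll-then-dispatch handshake, assuming a reachable Redis and the WebhookDebugEvent model from the events module added later in this diff (identifiers are hypothetical):

from core.trigger.debug.events import WebhookDebugEvent, build_webhook_pool_key

pool_key = build_webhook_pool_key(tenant_id="t1", app_id="app1", node_id="n1")

# First poll: inbox is empty, so the caller's address is registered in the
# waiting pool and None is returned.
assert TriggerDebugEventBus.poll(
    event_type=WebhookDebugEvent, pool_key=pool_key,
    tenant_id="t1", user_id="u1", app_id="app1", node_id="n1",
) is None

# A webhook arrival dispatches to every waiting address (returns the count).
event = WebhookDebugEvent(timestamp=0, request_id="req-1", node_id="n1")
assert TriggerDebugEventBus.dispatch("t1", event, pool_key) == 1

# The next poll drains the inbox and returns the deserialized event.
polled = TriggerDebugEventBus.poll(
    event_type=WebhookDebugEvent, pool_key=pool_key,
    tenant_id="t1", user_id="u1", app_id="app1", node_id="n1",
)
assert polled is not None and polled.request_id == "req-1"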

View File

@ -0,0 +1,243 @@
"""Trigger debug service supporting plugin and webhook debugging in draft workflows."""
import hashlib
import logging
import time
from abc import ABC, abstractmethod
from collections.abc import Mapping
from datetime import datetime
from typing import Any
from pydantic import BaseModel
from core.plugin.entities.request import TriggerInvokeEventResponse
from core.trigger.debug.event_bus import TriggerDebugEventBus
from core.trigger.debug.events import (
PluginTriggerDebugEvent,
ScheduleDebugEvent,
WebhookDebugEvent,
build_plugin_pool_key,
build_webhook_pool_key,
)
from core.workflow.enums import NodeType
from core.workflow.nodes.trigger_plugin.entities import TriggerEventNodeData
from core.workflow.nodes.trigger_schedule.entities import ScheduleConfig
from extensions.ext_redis import redis_client
from libs.datetime_utils import ensure_naive_utc, naive_utc_now
from libs.schedule_utils import calculate_next_run_at
from models.model import App
from models.provider_ids import TriggerProviderID
from models.workflow import Workflow
logger = logging.getLogger(__name__)
class TriggerDebugEvent(BaseModel):
workflow_args: Mapping[str, Any]
node_id: str
class TriggerDebugEventPoller(ABC):
app_id: str
user_id: str
tenant_id: str
node_config: Mapping[str, Any]
node_id: str
def __init__(self, tenant_id: str, user_id: str, app_id: str, node_config: Mapping[str, Any], node_id: str):
self.tenant_id = tenant_id
self.user_id = user_id
self.app_id = app_id
self.node_config = node_config
self.node_id = node_id
@abstractmethod
def poll(self) -> TriggerDebugEvent | None:
raise NotImplementedError
class PluginTriggerDebugEventPoller(TriggerDebugEventPoller):
def poll(self) -> TriggerDebugEvent | None:
from services.trigger.trigger_service import TriggerService
plugin_trigger_data = TriggerEventNodeData.model_validate(self.node_config.get("data", {}))
provider_id = TriggerProviderID(plugin_trigger_data.provider_id)
pool_key: str = build_plugin_pool_key(
name=plugin_trigger_data.event_name,
provider_id=str(provider_id),
tenant_id=self.tenant_id,
subscription_id=plugin_trigger_data.subscription_id,
)
plugin_trigger_event: PluginTriggerDebugEvent | None = TriggerDebugEventBus.poll(
event_type=PluginTriggerDebugEvent,
pool_key=pool_key,
tenant_id=self.tenant_id,
user_id=self.user_id,
app_id=self.app_id,
node_id=self.node_id,
)
if not plugin_trigger_event:
return None
trigger_event_response: TriggerInvokeEventResponse = TriggerService.invoke_trigger_event(
event=plugin_trigger_event,
user_id=plugin_trigger_event.user_id,
tenant_id=self.tenant_id,
node_config=self.node_config,
)
if trigger_event_response.cancelled:
return None
return TriggerDebugEvent(
workflow_args={
"inputs": trigger_event_response.variables,
"files": [],
},
node_id=self.node_id,
)
class WebhookTriggerDebugEventPoller(TriggerDebugEventPoller):
def poll(self) -> TriggerDebugEvent | None:
pool_key = build_webhook_pool_key(
tenant_id=self.tenant_id,
app_id=self.app_id,
node_id=self.node_id,
)
webhook_event: WebhookDebugEvent | None = TriggerDebugEventBus.poll(
event_type=WebhookDebugEvent,
pool_key=pool_key,
tenant_id=self.tenant_id,
user_id=self.user_id,
app_id=self.app_id,
node_id=self.node_id,
)
if not webhook_event:
return None
from services.trigger.webhook_service import WebhookService
payload = webhook_event.payload or {}
workflow_inputs = payload.get("inputs")
if workflow_inputs is None:
webhook_data = payload.get("webhook_data", {})
workflow_inputs = WebhookService.build_workflow_inputs(webhook_data)
workflow_args: Mapping[str, Any] = {
"inputs": workflow_inputs or {},
"files": [],
}
return TriggerDebugEvent(workflow_args=workflow_args, node_id=self.node_id)
class ScheduleTriggerDebugEventPoller(TriggerDebugEventPoller):
"""
Poller for schedule trigger debug events.
This poller simulates the schedule trigger event by creating a schedule debug runtime cache
and calculating the next run time.
"""
RUNTIME_CACHE_TTL = 60 * 5
class ScheduleDebugRuntime(BaseModel):
cache_key: str
timezone: str
cron_expression: str
next_run_at: datetime
def schedule_debug_runtime_key(self, cron_hash: str) -> str:
return f"schedule_debug_runtime:{self.tenant_id}:{self.user_id}:{self.app_id}:{self.node_id}:{cron_hash}"
def get_or_create_schedule_debug_runtime(self):
from services.trigger.schedule_service import ScheduleService
schedule_config: ScheduleConfig = ScheduleService.to_schedule_config(self.node_config)
cron_hash = hashlib.sha256(schedule_config.cron_expression.encode()).hexdigest()
cache_key = self.schedule_debug_runtime_key(cron_hash)
runtime_cache = redis_client.get(cache_key)
if runtime_cache is None:
schedule_debug_runtime = self.ScheduleDebugRuntime(
cron_expression=schedule_config.cron_expression,
timezone=schedule_config.timezone,
cache_key=cache_key,
next_run_at=ensure_naive_utc(
calculate_next_run_at(schedule_config.cron_expression, schedule_config.timezone)
),
)
redis_client.setex(
name=self.schedule_debug_runtime_key(cron_hash),
time=self.RUNTIME_CACHE_TTL,
value=schedule_debug_runtime.model_dump_json(),
)
return schedule_debug_runtime
else:
redis_client.expire(cache_key, self.RUNTIME_CACHE_TTL)
runtime = self.ScheduleDebugRuntime.model_validate_json(runtime_cache)
runtime.next_run_at = ensure_naive_utc(runtime.next_run_at)
return runtime
def create_schedule_event(self, schedule_debug_runtime: ScheduleDebugRuntime) -> ScheduleDebugEvent:
redis_client.delete(schedule_debug_runtime.cache_key)
return ScheduleDebugEvent(
timestamp=int(time.time()),
node_id=self.node_id,
inputs={},
)
def poll(self) -> TriggerDebugEvent | None:
schedule_debug_runtime = self.get_or_create_schedule_debug_runtime()
if schedule_debug_runtime.next_run_at > naive_utc_now():
return None
schedule_event: ScheduleDebugEvent = self.create_schedule_event(schedule_debug_runtime)
workflow_args: Mapping[str, Any] = {
"inputs": schedule_event.inputs or {},
"files": [],
}
return TriggerDebugEvent(workflow_args=workflow_args, node_id=self.node_id)
def create_event_poller(
draft_workflow: Workflow, tenant_id: str, user_id: str, app_id: str, node_id: str
) -> TriggerDebugEventPoller:
node_config = draft_workflow.get_node_config_by_id(node_id=node_id)
if not node_config:
raise ValueError("Node data not found for node %s", node_id)
node_type = draft_workflow.get_node_type_from_node_config(node_config)
match node_type:
case NodeType.TRIGGER_PLUGIN:
return PluginTriggerDebugEventPoller(
tenant_id=tenant_id, user_id=user_id, app_id=app_id, node_config=node_config, node_id=node_id
)
case NodeType.TRIGGER_WEBHOOK:
return WebhookTriggerDebugEventPoller(
tenant_id=tenant_id, user_id=user_id, app_id=app_id, node_config=node_config, node_id=node_id
)
case NodeType.TRIGGER_SCHEDULE:
return ScheduleTriggerDebugEventPoller(
tenant_id=tenant_id, user_id=user_id, app_id=app_id, node_config=node_config, node_id=node_id
)
case _:
raise ValueError("unable to create event poller for node type %s", node_type)
def select_trigger_debug_events(
draft_workflow: Workflow, app_model: App, user_id: str, node_ids: list[str]
) -> TriggerDebugEvent | None:
event: TriggerDebugEvent | None = None
for node_id in node_ids:
node_config = draft_workflow.get_node_config_by_id(node_id=node_id)
if not node_config:
raise ValueError("Node data not found for node %s", node_id)
poller: TriggerDebugEventPoller = create_event_poller(
draft_workflow=draft_workflow,
tenant_id=app_model.tenant_id,
user_id=user_id,
app_id=app_model.id,
node_id=node_id,
)
event = poller.poll()
if event is not None:
return event
return None
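In a debug session these pollers are driven repeatedly until one of the trigger nodes fires; a hypothetical driver loop (not part of this diff):

import time

def wait_for_trigger(draft_workflow, app_model, user_id, node_ids,
                     timeout: float = 60.0, interval: float = 1.0):
    # Poll the candidate trigger nodes until one yields an event or we time out.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        event = select_trigger_debug_events(draft_workflow, app_model, user_id, node_ids)
        if event is not None:
            return event  # event.workflow_args feeds the draft workflow run
        time.sleep(interval)
    return None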

View File

@ -0,0 +1,67 @@
from collections.abc import Mapping
from enum import StrEnum
from typing import Any
from pydantic import BaseModel, Field
class TriggerDebugPoolKey(StrEnum):
"""Trigger debug pool key."""
SCHEDULE = "schedule_trigger_debug_waiting_pool"
WEBHOOK = "webhook_trigger_debug_waiting_pool"
PLUGIN = "plugin_trigger_debug_waiting_pool"
class BaseDebugEvent(BaseModel):
"""Base class for all debug events."""
timestamp: int
class ScheduleDebugEvent(BaseDebugEvent):
"""Debug event for schedule triggers."""
node_id: str
inputs: Mapping[str, Any]
class WebhookDebugEvent(BaseDebugEvent):
"""Debug event for webhook triggers."""
request_id: str
node_id: str
payload: dict[str, Any] = Field(default_factory=dict)
def build_webhook_pool_key(tenant_id: str, app_id: str, node_id: str) -> str:
"""Generate pool key for webhook events.
Args:
tenant_id: Tenant ID
app_id: App ID
node_id: Node ID
"""
return f"{TriggerDebugPoolKey.WEBHOOK}:{tenant_id}:{app_id}:{node_id}"
class PluginTriggerDebugEvent(BaseDebugEvent):
"""Debug event for plugin triggers."""
name: str
user_id: str = Field(description="This is end user id, only for trigger the event. no related with account user id")
request_id: str
subscription_id: str
provider_id: str
def build_plugin_pool_key(tenant_id: str, provider_id: str, subscription_id: str, name: str) -> str:
"""Generate pool key for plugin trigger events.
Args:
name: Event name
tenant_id: Tenant ID
provider_id: Provider ID
subscription_id: Subscription ID
"""
return f"{TriggerDebugPoolKey.PLUGIN}:{tenant_id}:{str(provider_id)}:{subscription_id}:{name}"

View File

@ -0,0 +1,76 @@
from collections.abc import Mapping
from typing import Any
from pydantic import BaseModel, Field
from core.entities.provider_entities import ProviderConfig
from core.plugin.entities.plugin_daemon import CredentialType
from core.tools.entities.common_entities import I18nObject
from core.trigger.entities.entities import (
EventIdentity,
EventParameter,
SubscriptionConstructor,
TriggerCreationMethod,
)
class TriggerProviderSubscriptionApiEntity(BaseModel):
id: str = Field(description="The unique id of the subscription")
name: str = Field(description="The name of the subscription")
provider: str = Field(description="The provider id of the subscription")
credential_type: CredentialType = Field(description="The type of the credential")
credentials: dict[str, Any] = Field(description="The credentials of the subscription")
endpoint: str = Field(description="The endpoint of the subscription")
parameters: dict[str, Any] = Field(description="The parameters of the subscription")
properties: dict[str, Any] = Field(description="The properties of the subscription")
workflows_in_use: int = Field(description="The number of workflows using this subscription")
class EventApiEntity(BaseModel):
name: str = Field(description="The name of the trigger")
identity: EventIdentity = Field(description="The identity of the trigger")
description: I18nObject = Field(description="The description of the trigger")
parameters: list[EventParameter] = Field(description="The parameters of the trigger")
output_schema: Mapping[str, Any] | None = Field(description="The output schema of the trigger")
class TriggerProviderApiEntity(BaseModel):
author: str = Field(..., description="The author of the trigger provider")
name: str = Field(..., description="The name of the trigger provider")
label: I18nObject = Field(..., description="The label of the trigger provider")
description: I18nObject = Field(..., description="The description of the trigger provider")
icon: str | None = Field(default=None, description="The icon of the trigger provider")
icon_dark: str | None = Field(default=None, description="The dark icon of the trigger provider")
tags: list[str] = Field(default_factory=list, description="The tags of the trigger provider")
plugin_id: str | None = Field(default="", description="The plugin id of the tool")
plugin_unique_identifier: str | None = Field(default="", description="The unique identifier of the tool")
supported_creation_methods: list[TriggerCreationMethod] = Field(
default_factory=list,
description="Supported creation methods for the trigger provider. like 'OAUTH', 'APIKEY', 'MANUAL'.",
)
subscription_constructor: SubscriptionConstructor | None = Field(
default=None, description="The subscription constructor of the trigger provider"
)
subscription_schema: list[ProviderConfig] = Field(
default_factory=list,
description="The subscription schema of the trigger provider",
)
events: list[EventApiEntity] = Field(description="The events of the trigger provider")
class SubscriptionBuilderApiEntity(BaseModel):
id: str = Field(description="The id of the subscription builder")
name: str = Field(description="The name of the subscription builder")
provider: str = Field(description="The provider id of the subscription builder")
endpoint: str = Field(description="The endpoint id of the subscription builder")
parameters: Mapping[str, Any] = Field(description="The parameters of the subscription builder")
properties: Mapping[str, Any] = Field(description="The properties of the subscription builder")
credentials: Mapping[str, str] = Field(description="The credentials of the subscription builder")
credential_type: CredentialType = Field(description="The credential type of the subscription builder")
__all__ = ["EventApiEntity", "TriggerProviderApiEntity", "TriggerProviderSubscriptionApiEntity"]

View File

@ -0,0 +1,288 @@
from collections.abc import Mapping
from datetime import datetime
from enum import StrEnum
from typing import Any, Union
from pydantic import BaseModel, ConfigDict, Field, ValidationInfo, field_validator
from core.entities.provider_entities import ProviderConfig
from core.plugin.entities.parameters import (
PluginParameterAutoGenerate,
PluginParameterOption,
PluginParameterTemplate,
PluginParameterType,
)
from core.tools.entities.common_entities import I18nObject
class EventParameterType(StrEnum):
"""The type of the parameter"""
STRING = PluginParameterType.STRING
NUMBER = PluginParameterType.NUMBER
BOOLEAN = PluginParameterType.BOOLEAN
SELECT = PluginParameterType.SELECT
FILE = PluginParameterType.FILE
FILES = PluginParameterType.FILES
MODEL_SELECTOR = PluginParameterType.MODEL_SELECTOR
APP_SELECTOR = PluginParameterType.APP_SELECTOR
OBJECT = PluginParameterType.OBJECT
ARRAY = PluginParameterType.ARRAY
DYNAMIC_SELECT = PluginParameterType.DYNAMIC_SELECT
CHECKBOX = PluginParameterType.CHECKBOX
class EventParameter(BaseModel):
"""
The parameter of the event
"""
name: str = Field(..., description="The name of the parameter")
label: I18nObject = Field(..., description="The label presented to the user")
type: EventParameterType = Field(..., description="The type of the parameter")
auto_generate: PluginParameterAutoGenerate | None = Field(
default=None, description="The auto generate of the parameter"
)
template: PluginParameterTemplate | None = Field(default=None, description="The template of the parameter")
scope: str | None = None
required: bool | None = False
multiple: bool | None = Field(
default=False,
description="Whether the parameter is multiple select, only valid for select or dynamic-select type",
)
default: Union[int, float, str, list[Any], None] = None
min: Union[float, int, None] = None
max: Union[float, int, None] = None
precision: int | None = None
options: list[PluginParameterOption] | None = None
description: I18nObject | None = None
class TriggerProviderIdentity(BaseModel):
"""
The identity of the trigger provider
"""
author: str = Field(..., description="The author of the trigger provider")
name: str = Field(..., description="The name of the trigger provider")
label: I18nObject = Field(..., description="The label of the trigger provider")
description: I18nObject = Field(..., description="The description of the trigger provider")
icon: str | None = Field(default=None, description="The icon of the trigger provider")
icon_dark: str | None = Field(default=None, description="The dark icon of the trigger provider")
tags: list[str] = Field(default_factory=list, description="The tags of the trigger provider")
class EventIdentity(BaseModel):
"""
The identity of the event
"""
author: str = Field(..., description="The author of the event")
name: str = Field(..., description="The name of the event")
label: I18nObject = Field(..., description="The label of the event")
provider: str | None = Field(default=None, description="The provider of the event")
class EventEntity(BaseModel):
"""
The configuration of an event
"""
identity: EventIdentity = Field(..., description="The identity of the event")
parameters: list[EventParameter] = Field(
default_factory=list, description="The parameters of the event"
)
description: I18nObject = Field(..., description="The description of the event")
output_schema: Mapping[str, Any] | None = Field(
default=None, description="The output schema that this event produces"
)
@field_validator("parameters", mode="before")
@classmethod
def set_parameters(cls, v, validation_info: ValidationInfo) -> list[EventParameter]:
return v or []
class OAuthSchema(BaseModel):
client_schema: list[ProviderConfig] = Field(default_factory=list, description="The schema of the OAuth client")
credentials_schema: list[ProviderConfig] = Field(
default_factory=list, description="The schema of the OAuth credentials"
)
class SubscriptionConstructor(BaseModel):
"""
The subscription constructor of the trigger provider
"""
parameters: list[EventParameter] = Field(
default_factory=list, description="The parameters schema of the subscription constructor"
)
credentials_schema: list[ProviderConfig] = Field(
default_factory=list,
description="The credentials schema of the subscription constructor",
)
oauth_schema: OAuthSchema | None = Field(
default=None,
description="The OAuth schema of the subscription constructor if OAuth is supported",
)
def get_default_parameters(self) -> Mapping[str, Any]:
"""Get the default parameters from the parameters schema"""
if not self.parameters:
return {}
# Keep falsy defaults such as 0 or False; only skip parameters without a default.
return {param.name: param.default for param in self.parameters if param.default is not None}
class TriggerProviderEntity(BaseModel):
"""
The configuration of a trigger provider
"""
identity: TriggerProviderIdentity = Field(..., description="The identity of the trigger provider")
subscription_schema: list[ProviderConfig] = Field(
default_factory=list,
description="The configuration schema stored in the subscription entity",
)
subscription_constructor: SubscriptionConstructor | None = Field(
default=None,
description="The subscription constructor of the trigger provider",
)
events: list[EventEntity] = Field(default_factory=list, description="The events of the trigger provider")
class Subscription(BaseModel):
"""
Result of a successful trigger subscription operation.
Contains all information needed to manage the subscription lifecycle.
"""
expires_at: int = Field(
..., description="The timestamp when the subscription will expire, this for refresh the subscription"
)
endpoint: str = Field(..., description="The webhook endpoint URL allocated by Dify for receiving events")
parameters: Mapping[str, Any] = Field(
default_factory=dict, description="The parameters of the subscription constructor"
)
properties: Mapping[str, Any] = Field(
..., description="Subscription data containing all properties and provider-specific information"
)
class UnsubscribeResult(BaseModel):
"""
Result of a trigger unsubscription operation.
Provides detailed information about the unsubscription attempt,
including success status and error details if failed.
"""
success: bool = Field(..., description="Whether the unsubscription was successful")
message: str | None = Field(
None,
description="Human-readable message about the operation result. "
"Success message for successful operations, "
"detailed error information for failures.",
)
class RequestLog(BaseModel):
id: str = Field(..., description="The id of the request log")
endpoint: str = Field(..., description="The endpoint of the request log")
request: dict[str, Any] = Field(..., description="The request payload of the request log")
response: dict[str, Any] = Field(..., description="The response payload of the request log")
created_at: datetime = Field(..., description="The creation time of the request log")
class SubscriptionBuilder(BaseModel):
id: str = Field(..., description="The id of the subscription builder")
name: str | None = Field(default=None, description="The name of the subscription builder")
tenant_id: str = Field(..., description="The tenant id of the subscription builder")
user_id: str = Field(..., description="The user id of the subscription builder")
provider_id: str = Field(..., description="The provider id of the subscription builder")
endpoint_id: str = Field(..., description="The endpoint id of the subscription builder")
parameters: Mapping[str, Any] = Field(..., description="The parameters of the subscription builder")
properties: Mapping[str, Any] = Field(..., description="The properties of the subscription builder")
credentials: Mapping[str, str] = Field(..., description="The credentials of the subscription builder")
credential_type: str | None = Field(default=None, description="The credential type of the subscription builder")
credential_expires_at: int | None = Field(
default=None, description="The credential expiration timestamp of the subscription builder"
)
expires_at: int = Field(..., description="The expiration timestamp of the subscription builder")
def to_subscription(self) -> Subscription:
return Subscription(
expires_at=self.expires_at,
endpoint=self.endpoint_id,
properties=self.properties,
)
class SubscriptionBuilderUpdater(BaseModel):
name: str | None = Field(default=None, description="The name of the subscription builder")
parameters: Mapping[str, Any] | None = Field(default=None, description="The parameters of the subscription builder")
properties: Mapping[str, Any] | None = Field(default=None, description="The properties of the subscription builder")
credentials: Mapping[str, str] | None = Field(
default=None, description="The credentials of the subscription builder"
)
credential_type: str | None = Field(default=None, description="The credential type of the subscription builder")
credential_expires_at: int | None = Field(
default=None, description="The credential expiration timestamp of the subscription builder"
)
expires_at: int | None = Field(default=None, description="The expiration timestamp of the subscription builder")
def update(self, subscription_builder: SubscriptionBuilder) -> None:
if self.name is not None:
subscription_builder.name = self.name
if self.parameters is not None:
subscription_builder.parameters = self.parameters
if self.properties is not None:
subscription_builder.properties = self.properties
if self.credentials is not None:
subscription_builder.credentials = self.credentials
if self.credential_type is not None:
subscription_builder.credential_type = self.credential_type
if self.credential_expires_at is not None:
subscription_builder.credential_expires_at = self.credential_expires_at
if self.expires_at is not None:
subscription_builder.expires_at = self.expires_at
class TriggerEventData(BaseModel):
"""Event data dispatched to trigger sessions."""
subscription_id: str
events: list[str]
request_id: str
timestamp: float
model_config = ConfigDict(arbitrary_types_allowed=True)
class TriggerCreationMethod(StrEnum):
OAUTH = "OAUTH"
APIKEY = "APIKEY"
MANUAL = "MANUAL"
# Export all entities
__all__: list[str] = [
"EventEntity",
"EventIdentity",
"EventParameter",
"EventParameterType",
"OAuthSchema",
"RequestLog",
"Subscription",
"SubscriptionBuilder",
"TriggerCreationMethod",
"TriggerEventData",
"TriggerProviderEntity",
"TriggerProviderIdentity",
"UnsubscribeResult",
]
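A minimal sketch of how SubscriptionConstructor.get_default_parameters behaves. It assumes "string" is a valid EventParameterType value and that I18nObject accepts an en_US field; both are inferred from the surrounding code, not confirmed here:

from core.tools.entities.common_entities import I18nObject
from core.trigger.entities.entities import EventParameter, SubscriptionConstructor

constructor = SubscriptionConstructor(
    parameters=[
        EventParameter(name="repo", label=I18nObject(en_US="Repository"), type="string", default="dify"),
        EventParameter(name="branch", label=I18nObject(en_US="Branch"), type="string"),  # no default
    ]
)
# Only parameters that define a default are returned.
assert constructor.get_default_parameters() == {"repo": "dify"}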

View File

@ -0,0 +1,19 @@
from core.plugin.impl.exc import PluginInvokeError
class TriggerProviderCredentialValidationError(ValueError):
pass
class TriggerPluginInvokeError(PluginInvokeError):
pass
class TriggerInvokeError(PluginInvokeError):
pass
class EventIgnoreError(TriggerInvokeError):
"""
Trigger event ignore error
"""

View File

@ -0,0 +1,421 @@
"""
Trigger Provider Controller for managing trigger providers
"""
import logging
from collections.abc import Mapping
from typing import Any
from flask import Request
from core.entities.provider_entities import BasicProviderConfig
from core.plugin.entities.plugin_daemon import CredentialType
from core.plugin.entities.request import (
TriggerDispatchResponse,
TriggerInvokeEventResponse,
TriggerSubscriptionResponse,
)
from core.plugin.impl.trigger import PluginTriggerClient
from core.trigger.entities.api_entities import EventApiEntity, TriggerProviderApiEntity
from core.trigger.entities.entities import (
EventEntity,
EventParameter,
ProviderConfig,
Subscription,
SubscriptionConstructor,
TriggerCreationMethod,
TriggerProviderEntity,
TriggerProviderIdentity,
UnsubscribeResult,
)
from core.trigger.errors import TriggerProviderCredentialValidationError
from models.provider_ids import TriggerProviderID
from services.plugin.plugin_service import PluginService
logger = logging.getLogger(__name__)
class PluginTriggerProviderController:
"""
Controller for plugin trigger providers
"""
def __init__(
self,
entity: TriggerProviderEntity,
plugin_id: str,
plugin_unique_identifier: str,
provider_id: TriggerProviderID,
tenant_id: str,
):
"""
Initialize plugin trigger provider controller
:param entity: Trigger provider entity
:param plugin_id: Plugin ID
:param plugin_unique_identifier: Plugin unique identifier
:param provider_id: Provider ID
:param tenant_id: Tenant ID
"""
self.entity = entity
self.tenant_id = tenant_id
self.plugin_id = plugin_id
self.provider_id = provider_id
self.plugin_unique_identifier = plugin_unique_identifier
def get_provider_id(self) -> TriggerProviderID:
"""
Get provider ID
"""
return self.provider_id
def to_api_entity(self) -> TriggerProviderApiEntity:
"""
Convert to API entity
"""
icon = (
PluginService.get_plugin_icon_url(self.tenant_id, self.entity.identity.icon)
if self.entity.identity.icon
else None
)
icon_dark = (
PluginService.get_plugin_icon_url(self.tenant_id, self.entity.identity.icon_dark)
if self.entity.identity.icon_dark
else None
)
subscription_constructor = self.entity.subscription_constructor
supported_creation_methods = [TriggerCreationMethod.MANUAL]
if subscription_constructor and subscription_constructor.oauth_schema:
supported_creation_methods.append(TriggerCreationMethod.OAUTH)
if subscription_constructor and subscription_constructor.credentials_schema:
supported_creation_methods.append(TriggerCreationMethod.APIKEY)
return TriggerProviderApiEntity(
author=self.entity.identity.author,
name=self.entity.identity.name,
label=self.entity.identity.label,
description=self.entity.identity.description,
icon=icon,
icon_dark=icon_dark,
tags=self.entity.identity.tags,
plugin_id=self.plugin_id,
plugin_unique_identifier=self.plugin_unique_identifier,
subscription_constructor=subscription_constructor,
subscription_schema=self.entity.subscription_schema,
supported_creation_methods=supported_creation_methods,
events=[
EventApiEntity(
name=event.identity.name,
identity=event.identity,
description=event.description,
parameters=event.parameters,
output_schema=event.output_schema,
)
for event in self.entity.events
],
)
@property
def identity(self) -> TriggerProviderIdentity:
"""Get provider identity"""
return self.entity.identity
def get_events(self) -> list[EventEntity]:
"""
Get all events for this provider
:return: List of event entities
"""
return self.entity.events
def get_event(self, event_name: str) -> EventEntity | None:
"""
Get a specific event by name
:param event_name: Event name
:return: Event entity or None
"""
for event in self.entity.events:
if event.identity.name == event_name:
return event
return None
def get_subscription_default_properties(self) -> Mapping[str, Any]:
"""
Get default properties for this provider
:return: Default properties
"""
return {prop.name: prop.default for prop in self.entity.subscription_schema if prop.default is not None}
def get_subscription_constructor(self) -> SubscriptionConstructor | None:
"""
Get subscription constructor for this provider
:return: Subscription constructor
"""
return self.entity.subscription_constructor
def validate_credentials(self, user_id: str, credentials: Mapping[str, str]) -> None:
"""
Validate credentials against schema
:param credentials: Credentials to validate
:return: Validation response
"""
# First validate against schema
subscription_constructor: SubscriptionConstructor | None = self.entity.subscription_constructor
if not subscription_constructor:
raise ValueError("Subscription constructor not found")
for config in subscription_constructor.credentials_schema or []:
if config.required and config.name not in credentials:
raise TriggerProviderCredentialValidationError(f"Missing required credential field: {config.name}")
# Then validate with the plugin daemon
manager = PluginTriggerClient()
provider_id = self.get_provider_id()
response = manager.validate_provider_credentials(
tenant_id=self.tenant_id,
user_id=user_id,
provider=str(provider_id),
credentials=credentials,
)
if not response:
raise TriggerProviderCredentialValidationError(
"Invalid credentials",
)
def get_supported_credential_types(self) -> list[CredentialType]:
"""
Get supported credential types for this provider.
:return: List of supported credential types
"""
types: list[CredentialType] = []
subscription_constructor = self.entity.subscription_constructor
if subscription_constructor and subscription_constructor.oauth_schema:
types.append(CredentialType.OAUTH2)
if subscription_constructor and subscription_constructor.credentials_schema:
types.append(CredentialType.API_KEY)
return types
def get_credentials_schema(self, credential_type: CredentialType | str) -> list[ProviderConfig]:
"""
Get credentials schema by credential type
:param credential_type: The type of credential (oauth or api_key)
:return: List of provider config schemas
"""
subscription_constructor = self.entity.subscription_constructor
if not subscription_constructor:
return []
credential_type = CredentialType.of(credential_type)
if credential_type == CredentialType.OAUTH2:
return (
subscription_constructor.oauth_schema.credentials_schema.copy()
if subscription_constructor.oauth_schema
else []
)
if credential_type == CredentialType.API_KEY:
return (
subscription_constructor.credentials_schema.copy()
if subscription_constructor.credentials_schema
else []
)
if credential_type == CredentialType.UNAUTHORIZED:
return []
raise ValueError(f"Invalid credential type: {credential_type}")
def get_credential_schema_config(self, credential_type: CredentialType | str) -> list[BasicProviderConfig]:
"""
Get credential schema config by credential type
"""
return [x.to_basic_provider_config() for x in self.get_credentials_schema(credential_type)]
def get_oauth_client_schema(self) -> list[ProviderConfig]:
"""
Get OAuth client schema for this provider
:return: List of OAuth client config schemas
"""
subscription_constructor = self.entity.subscription_constructor
return (
subscription_constructor.oauth_schema.client_schema.copy()
if subscription_constructor and subscription_constructor.oauth_schema
else []
)
def get_properties_schema(self) -> list[BasicProviderConfig]:
"""
Get properties schema for this provider
:return: List of properties config schemas
"""
return (
[x.to_basic_provider_config() for x in self.entity.subscription_schema.copy()]
if self.entity.subscription_schema
else []
)
def get_event_parameters(self, event_name: str) -> Mapping[str, EventParameter]:
"""
Get event parameters for this provider
"""
event = self.get_event(event_name)
if not event:
return {}
return {parameter.name: parameter for parameter in event.parameters}
def dispatch(
self,
request: Request,
subscription: Subscription,
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> TriggerDispatchResponse:
"""
Dispatch a trigger through plugin runtime
:param request: Flask request object
:param subscription: Subscription
:param credentials: Provider credentials
:param credential_type: Credential type
:return: Dispatch response with triggers and raw HTTP response
"""
manager = PluginTriggerClient()
provider_id: TriggerProviderID = self.get_provider_id()
response: TriggerDispatchResponse = manager.dispatch_event(
tenant_id=self.tenant_id,
provider=str(provider_id),
subscription=subscription.model_dump(),
request=request,
credentials=credentials,
credential_type=credential_type,
)
return response
def invoke_trigger_event(
self,
user_id: str,
event_name: str,
parameters: Mapping[str, Any],
credentials: Mapping[str, str],
credential_type: CredentialType,
subscription: Subscription,
request: Request,
payload: Mapping[str, Any],
) -> TriggerInvokeEventResponse:
"""
Execute a trigger through plugin runtime
:param user_id: User ID
:param event_name: Event name
:param parameters: Trigger parameters
:param credentials: Provider credentials
:param credential_type: Credential type
:param subscription: Subscription
:param request: Request
:param payload: Payload
:return: Trigger execution result
"""
manager = PluginTriggerClient()
provider_id: TriggerProviderID = self.get_provider_id()
return manager.invoke_trigger_event(
tenant_id=self.tenant_id,
user_id=user_id,
provider=str(provider_id),
event_name=event_name,
credentials=credentials,
credential_type=credential_type,
request=request,
parameters=parameters,
subscription=subscription,
payload=payload,
)
def subscribe_trigger(
self,
user_id: str,
endpoint: str,
parameters: Mapping[str, Any],
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> Subscription:
"""
Subscribe to a trigger through plugin runtime
:param user_id: User ID
:param endpoint: Subscription endpoint
:param parameters: Subscription parameters
:param credentials: Provider credentials
:param credential_type: Credential type
:return: Subscription result
"""
manager = PluginTriggerClient()
provider_id: TriggerProviderID = self.get_provider_id()
response: TriggerSubscriptionResponse = manager.subscribe(
tenant_id=self.tenant_id,
user_id=user_id,
provider=str(provider_id),
endpoint=endpoint,
parameters=parameters,
credentials=credentials,
credential_type=credential_type,
)
return Subscription.model_validate(response.subscription)
def unsubscribe_trigger(
self, user_id: str, subscription: Subscription, credentials: Mapping[str, str], credential_type: CredentialType
) -> UnsubscribeResult:
"""
Unsubscribe from a trigger through plugin runtime
:param user_id: User ID
:param subscription: Subscription metadata
:param credentials: Provider credentials
:param credential_type: Credential type
:return: Unsubscribe result
"""
manager = PluginTriggerClient()
provider_id: TriggerProviderID = self.get_provider_id()
response: TriggerSubscriptionResponse = manager.unsubscribe(
tenant_id=self.tenant_id,
user_id=user_id,
provider=str(provider_id),
subscription=subscription,
credentials=credentials,
credential_type=credential_type,
)
return UnsubscribeResult.model_validate(response.subscription)
def refresh_trigger(
self, subscription: Subscription, credentials: Mapping[str, str], credential_type: CredentialType
) -> Subscription:
"""
Refresh a trigger subscription through plugin runtime
:param subscription: Subscription metadata
:param credentials: Provider credentials
:param credential_type: Credential type
:return: Refreshed subscription result
"""
manager = PluginTriggerClient()
provider_id: TriggerProviderID = self.get_provider_id()
response: TriggerSubscriptionResponse = manager.refresh(
tenant_id=self.tenant_id,
user_id="system", # System refresh
provider=str(provider_id),
subscription=subscription,
credentials=credentials,
credential_type=credential_type,
)
return Subscription.model_validate(response.subscription)
__all__ = ["PluginTriggerProviderController"]

View File

@ -0,0 +1,283 @@
"""
Trigger Manager for loading and managing trigger providers and triggers
"""
import logging
from collections.abc import Mapping
from threading import Lock
from typing import Any
from flask import Request
import contexts
from configs import dify_config
from core.plugin.entities.plugin_daemon import CredentialType, PluginTriggerProviderEntity
from core.plugin.entities.request import TriggerInvokeEventResponse
from core.plugin.impl.exc import PluginDaemonError, PluginNotFoundError
from core.plugin.impl.trigger import PluginTriggerClient
from core.trigger.entities.entities import (
EventEntity,
Subscription,
UnsubscribeResult,
)
from core.trigger.errors import EventIgnoreError
from core.trigger.provider import PluginTriggerProviderController
from models.provider_ids import TriggerProviderID
logger = logging.getLogger(__name__)
class TriggerManager:
"""
Manager for trigger providers and triggers
"""
@classmethod
def get_trigger_plugin_icon(cls, tenant_id: str, provider_id: str) -> str:
"""
Get the icon of a trigger plugin
"""
manager = PluginTriggerClient()
provider: PluginTriggerProviderEntity = manager.fetch_trigger_provider(
tenant_id=tenant_id, provider_id=TriggerProviderID(provider_id)
)
filename = provider.declaration.identity.icon
base_url = f"{dify_config.CONSOLE_API_URL}/console/api/workspaces/current/plugin/icon"
return f"{base_url}?tenant_id={tenant_id}&filename={filename}"
@classmethod
def list_plugin_trigger_providers(cls, tenant_id: str) -> list[PluginTriggerProviderController]:
"""
List all plugin trigger providers for a tenant
:param tenant_id: Tenant ID
:return: List of trigger provider controllers
"""
manager = PluginTriggerClient()
provider_entities = manager.fetch_trigger_providers(tenant_id)
controllers: list[PluginTriggerProviderController] = []
for provider in provider_entities:
try:
controller = PluginTriggerProviderController(
entity=provider.declaration,
plugin_id=provider.plugin_id,
plugin_unique_identifier=provider.plugin_unique_identifier,
provider_id=TriggerProviderID(provider.provider),
tenant_id=tenant_id,
)
controllers.append(controller)
except Exception:
logger.exception("Failed to load trigger provider %s", provider.plugin_id)
continue
return controllers
@classmethod
def get_trigger_provider(cls, tenant_id: str, provider_id: TriggerProviderID) -> PluginTriggerProviderController:
"""
Get a specific plugin trigger provider
:param tenant_id: Tenant ID
:param provider_id: Provider ID
:return: Trigger provider controller
:raises ValueError: If the provider is not found
"""
# check if context is set
try:
contexts.plugin_trigger_providers.get()
except LookupError:
contexts.plugin_trigger_providers.set({})
contexts.plugin_trigger_providers_lock.set(Lock())
plugin_trigger_providers = contexts.plugin_trigger_providers.get()
provider_id_str = str(provider_id)
if provider_id_str in plugin_trigger_providers:
return plugin_trigger_providers[provider_id_str]
with contexts.plugin_trigger_providers_lock.get():
# double check
plugin_trigger_providers = contexts.plugin_trigger_providers.get()
if provider_id_str in plugin_trigger_providers:
return plugin_trigger_providers[provider_id_str]
try:
manager = PluginTriggerClient()
provider = manager.fetch_trigger_provider(tenant_id, provider_id)
if not provider:
raise ValueError(f"Trigger provider {provider_id} not found")
controller = PluginTriggerProviderController(
entity=provider.declaration,
plugin_id=provider.plugin_id,
plugin_unique_identifier=provider.plugin_unique_identifier,
provider_id=provider_id,
tenant_id=tenant_id,
)
plugin_trigger_providers[provider_id_str] = controller
return controller
except PluginNotFoundError as e:
raise ValueError(f"Trigger provider {provider_id} not found") from e
except PluginDaemonError:
raise
except Exception:
logger.exception("Failed to load trigger provider")
raise
@classmethod
def list_all_trigger_providers(cls, tenant_id: str) -> list[PluginTriggerProviderController]:
"""
List all trigger providers (plugin)
:param tenant_id: Tenant ID
:return: List of all trigger provider controllers
"""
return cls.list_plugin_trigger_providers(tenant_id)
@classmethod
def list_triggers_by_provider(cls, tenant_id: str, provider_id: TriggerProviderID) -> list[EventEntity]:
"""
List all triggers for a specific provider
:param tenant_id: Tenant ID
:param provider_id: Provider ID
:return: List of trigger entities
"""
provider = cls.get_trigger_provider(tenant_id, provider_id)
return provider.get_events()
@classmethod
def invoke_trigger_event(
cls,
tenant_id: str,
user_id: str,
provider_id: TriggerProviderID,
event_name: str,
parameters: Mapping[str, Any],
credentials: Mapping[str, str],
credential_type: CredentialType,
subscription: Subscription,
request: Request,
payload: Mapping[str, Any],
) -> TriggerInvokeEventResponse:
"""
Execute a trigger
:param tenant_id: Tenant ID
:param user_id: User ID
:param provider_id: Provider ID
:param event_name: Event name
:param parameters: Trigger parameters
:param credentials: Provider credentials
:param credential_type: Credential type
:param subscription: Subscription
:param request: Request
:param payload: Payload
:return: Trigger execution result
"""
provider: PluginTriggerProviderController = cls.get_trigger_provider(
tenant_id=tenant_id, provider_id=provider_id
)
try:
return provider.invoke_trigger_event(
user_id=user_id,
event_name=event_name,
parameters=parameters,
credentials=credentials,
credential_type=credential_type,
subscription=subscription,
request=request,
payload=payload,
)
except EventIgnoreError:
return TriggerInvokeEventResponse(variables={}, cancelled=True)
@classmethod
def subscribe_trigger(
cls,
tenant_id: str,
user_id: str,
provider_id: TriggerProviderID,
endpoint: str,
parameters: Mapping[str, Any],
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> Subscription:
"""
Subscribe to a trigger (e.g., register webhook)
:param tenant_id: Tenant ID
:param user_id: User ID
:param provider_id: Provider ID
:param endpoint: Subscription endpoint
:param parameters: Subscription parameters
:param credentials: Provider credentials
:param credential_type: Credential type
:return: Subscription result
"""
provider: PluginTriggerProviderController = cls.get_trigger_provider(
tenant_id=tenant_id, provider_id=provider_id
)
return provider.subscribe_trigger(
user_id=user_id,
endpoint=endpoint,
parameters=parameters,
credentials=credentials,
credential_type=credential_type,
)
@classmethod
def unsubscribe_trigger(
cls,
tenant_id: str,
user_id: str,
provider_id: TriggerProviderID,
subscription: Subscription,
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> UnsubscribeResult:
"""
Unsubscribe from a trigger
:param tenant_id: Tenant ID
:param user_id: User ID
:param provider_id: Provider ID
:param subscription: Subscription metadata from subscribe operation
:param credentials: Provider credentials
:param credential_type: Credential type
:return: Unsubscription result
"""
provider: PluginTriggerProviderController = cls.get_trigger_provider(
tenant_id=tenant_id, provider_id=provider_id
)
return provider.unsubscribe_trigger(
user_id=user_id,
subscription=subscription,
credentials=credentials,
credential_type=credential_type,
)
@classmethod
def refresh_trigger(
cls,
tenant_id: str,
provider_id: TriggerProviderID,
subscription: Subscription,
credentials: Mapping[str, str],
credential_type: CredentialType,
) -> Subscription:
"""
Refresh a trigger subscription
:param tenant_id: Tenant ID
:param provider_id: Provider ID
:param subscription: Subscription metadata from subscribe operation
:param credentials: Provider credentials
:param credential_type: Credential type
:return: Refreshed subscription result
"""
# TODO: persist the refreshed subscription using the return value of refresh_trigger
return cls.get_trigger_provider(tenant_id=tenant_id, provider_id=provider_id).refresh_trigger(
subscription=subscription, credentials=credentials, credential_type=credential_type
)
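get_trigger_provider guards its per-context provider cache with double-checked locking; a self-contained sketch of the same pattern, with hypothetical names:

from collections.abc import Callable
from contextvars import ContextVar
from threading import Lock

_cache: ContextVar[dict[str, str]] = ContextVar("provider_cache")
_lock: ContextVar[Lock] = ContextVar("provider_cache_lock")

def get_cached(key: str, load: Callable[[], str]) -> str:
    try:
        cache = _cache.get()
    except LookupError:  # first access in this context: initialize cache and lock
        _cache.set({})
        _lock.set(Lock())
        cache = _cache.get()
    if key in cache:  # fast path, no lock taken
        return cache[key]
    with _lock.get():
        cache = _cache.get()
        if key in cache:  # double-check after acquiring the lock
            return cache[key]
        cache[key] = load()
        return cache[key]

print(get_cached("langgenius/github/github", lambda: "controller"))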

View File

@ -0,0 +1,145 @@
from collections.abc import Mapping
from typing import Union
from core.entities.provider_entities import BasicProviderConfig, ProviderConfig
from core.helper.provider_cache import ProviderCredentialsCache
from core.helper.provider_encryption import ProviderConfigCache, ProviderConfigEncrypter, create_provider_encrypter
from core.plugin.entities.plugin_daemon import CredentialType
from core.trigger.entities.api_entities import TriggerProviderSubscriptionApiEntity
from core.trigger.provider import PluginTriggerProviderController
from models.trigger import TriggerSubscription
class TriggerProviderCredentialsCache(ProviderCredentialsCache):
"""Cache for trigger provider credentials"""
def __init__(self, tenant_id: str, provider_id: str, credential_id: str):
super().__init__(tenant_id=tenant_id, provider_id=provider_id, credential_id=credential_id)
def _generate_cache_key(self, **kwargs) -> str:
tenant_id = kwargs["tenant_id"]
provider_id = kwargs["provider_id"]
credential_id = kwargs["credential_id"]
return f"trigger_credentials:tenant_id:{tenant_id}:provider_id:{provider_id}:credential_id:{credential_id}"
class TriggerProviderOAuthClientParamsCache(ProviderCredentialsCache):
"""Cache for trigger provider OAuth client"""
def __init__(self, tenant_id: str, provider_id: str):
super().__init__(tenant_id=tenant_id, provider_id=provider_id)
def _generate_cache_key(self, **kwargs) -> str:
tenant_id = kwargs["tenant_id"]
provider_id = kwargs["provider_id"]
return f"trigger_oauth_client:tenant_id:{tenant_id}:provider_id:{provider_id}"
class TriggerProviderPropertiesCache(ProviderCredentialsCache):
"""Cache for trigger provider properties"""
def __init__(self, tenant_id: str, provider_id: str, subscription_id: str):
super().__init__(tenant_id=tenant_id, provider_id=provider_id, subscription_id=subscription_id)
def _generate_cache_key(self, **kwargs) -> str:
tenant_id = kwargs["tenant_id"]
provider_id = kwargs["provider_id"]
subscription_id = kwargs["subscription_id"]
return f"trigger_properties:tenant_id:{tenant_id}:provider_id:{provider_id}:subscription_id:{subscription_id}"
def create_trigger_provider_encrypter_for_subscription(
tenant_id: str,
controller: PluginTriggerProviderController,
subscription: Union[TriggerSubscription, TriggerProviderSubscriptionApiEntity],
) -> tuple[ProviderConfigEncrypter, ProviderConfigCache]:
cache = TriggerProviderCredentialsCache(
tenant_id=tenant_id,
provider_id=str(controller.get_provider_id()),
credential_id=subscription.id,
)
encrypter, _ = create_provider_encrypter(
tenant_id=tenant_id,
config=controller.get_credential_schema_config(subscription.credential_type),
cache=cache,
)
return encrypter, cache
def delete_cache_for_subscription(tenant_id: str, provider_id: str, subscription_id: str):
cache = TriggerProviderCredentialsCache(
tenant_id=tenant_id,
provider_id=provider_id,
credential_id=subscription_id,
)
cache.delete()
def create_trigger_provider_encrypter_for_properties(
tenant_id: str,
controller: PluginTriggerProviderController,
subscription: Union[TriggerSubscription, TriggerProviderSubscriptionApiEntity],
) -> tuple[ProviderConfigEncrypter, ProviderConfigCache]:
cache = TriggerProviderPropertiesCache(
tenant_id=tenant_id,
provider_id=str(controller.get_provider_id()),
subscription_id=subscription.id,
)
encrypter, _ = create_provider_encrypter(
tenant_id=tenant_id,
config=controller.get_properties_schema(),
cache=cache,
)
return encrypter, cache
def create_trigger_provider_encrypter(
tenant_id: str, controller: PluginTriggerProviderController, credential_id: str, credential_type: CredentialType
) -> tuple[ProviderConfigEncrypter, ProviderConfigCache]:
cache = TriggerProviderCredentialsCache(
tenant_id=tenant_id,
provider_id=str(controller.get_provider_id()),
credential_id=credential_id,
)
encrypter, _ = create_provider_encrypter(
tenant_id=tenant_id,
config=controller.get_credential_schema_config(credential_type),
cache=cache,
)
return encrypter, cache
def create_trigger_provider_oauth_encrypter(
tenant_id: str, controller: PluginTriggerProviderController
) -> tuple[ProviderConfigEncrypter, ProviderConfigCache]:
cache = TriggerProviderOAuthClientParamsCache(
tenant_id=tenant_id,
provider_id=str(controller.get_provider_id()),
)
encrypter, _ = create_provider_encrypter(
tenant_id=tenant_id,
config=[x.to_basic_provider_config() for x in controller.get_oauth_client_schema()],
cache=cache,
)
return encrypter, cache
def masked_credentials(
schemas: list[ProviderConfig],
credentials: Mapping[str, str],
) -> Mapping[str, str]:
masked: dict[str, str] = {}
configs = {x.name: x.to_basic_provider_config() for x in schemas}
for key, value in credentials.items():
config = configs.get(key)
if not config:
masked[key] = value
continue
if config.type == BasicProviderConfig.Type.SECRET_INPUT:
if len(value) <= 4:
masked[key] = "*" * len(value)
else:
masked[key] = value[:2] + "*" * (len(value) - 4) + value[-2:]
else:
masked[key] = value
return masked
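The secret-masking rule above keeps the first and last two characters once a value is longer than four; a standalone sketch of just that arithmetic:

def mask_secret(value: str) -> str:
    # Values of length <= 4 are fully masked; longer values keep
    # the first two and last two characters.
    if len(value) <= 4:
        return "*" * len(value)
    return value[:2] + "*" * (len(value) - 4) + value[-2:]

assert mask_secret("abcd") == "****"
assert mask_secret("sk-1234567890") == "sk*********90"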

View File

@ -0,0 +1,24 @@
from yarl import URL
from configs import dify_config
"""
Basic URL for thirdparty trigger services
"""
base_url = URL(dify_config.TRIGGER_URL)
def generate_plugin_trigger_endpoint_url(endpoint_id: str) -> str:
"""
Generate the URL for a plugin trigger endpoint
"""
return str(base_url / "triggers" / "plugin" / endpoint_id)
def generate_webhook_trigger_endpoint(webhook_id: str, debug: bool = False) -> str:
"""
Generate the URL for a webhook trigger endpoint
"""
return str(base_url / "triggers" / ("webhook-debug" if debug else "webhook") / webhook_id)

View File

@ -0,0 +1,12 @@
from collections.abc import Sequence
from itertools import starmap
def build_trigger_refresh_lock_key(tenant_id: str, subscription_id: str) -> str:
"""Build the Redis lock key for trigger subscription refresh in-flight protection."""
return f"trigger_provider_refresh_lock:{tenant_id}_{subscription_id}"
def build_trigger_refresh_lock_keys(pairs: Sequence[tuple[str, str]]) -> list[str]:
"""Build Redis lock keys for a sequence of (tenant_id, subscription_id) pairs."""
return list(starmap(build_trigger_refresh_lock_key, pairs))
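Example usage of the lock-key helpers above:

keys = build_trigger_refresh_lock_keys([("tenant-a", "sub-1"), ("tenant-b", "sub-2")])
assert keys == [
    "trigger_provider_refresh_lock:tenant-a_sub-1",
    "trigger_provider_refresh_lock:tenant-b_sub-2",
]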

View File

@ -22,6 +22,7 @@ class SystemVariableKey(StrEnum):
APP_ID = "app_id"
WORKFLOW_ID = "workflow_id"
WORKFLOW_EXECUTION_ID = "workflow_run_id"
TIMESTAMP = "timestamp"
# RAG Pipeline
DOCUMENT_ID = "document_id"
ORIGINAL_DOCUMENT_ID = "original_document_id"
@ -58,8 +59,31 @@ class NodeType(StrEnum):
DOCUMENT_EXTRACTOR = "document-extractor"
LIST_OPERATOR = "list-operator"
AGENT = "agent"
TRIGGER_WEBHOOK = "trigger-webhook"
TRIGGER_SCHEDULE = "trigger-schedule"
TRIGGER_PLUGIN = "trigger-plugin"
HUMAN_INPUT = "human-input"
@property
def is_trigger_node(self) -> bool:
"""Check if this node type is a trigger node."""
return self in [
NodeType.TRIGGER_WEBHOOK,
NodeType.TRIGGER_SCHEDULE,
NodeType.TRIGGER_PLUGIN,
]
@property
def is_start_node(self) -> bool:
"""Check if this node type can serve as a workflow entry point."""
return self in [
NodeType.START,
NodeType.DATASOURCE,
NodeType.TRIGGER_WEBHOOK,
NodeType.TRIGGER_SCHEDULE,
NodeType.TRIGGER_PLUGIN,
]
class NodeExecutionType(StrEnum):
"""Node execution type classification."""
@ -208,6 +232,7 @@ class WorkflowNodeExecutionMetadataKey(StrEnum):
CURRENCY = "currency"
TOOL_INFO = "tool_info"
AGENT_LOG = "agent_log"
TRIGGER_INFO = "trigger_info"
ITERATION_ID = "iteration_id"
ITERATION_INDEX = "iteration_index"
LOOP_ID = "loop_id"
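The new NodeType properties can be exercised directly; for example:

from core.workflow.enums import NodeType

assert NodeType.TRIGGER_WEBHOOK.is_trigger_node
assert NodeType.TRIGGER_SCHEDULE.is_start_node
# START remains an entry point but is not a trigger node.
assert NodeType.START.is_start_node and not NodeType.START.is_trigger_node
assert not NodeType.AGENT.is_start_node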

View File

@ -117,7 +117,7 @@ class Graph:
node_type = node_data.get("type")
if not isinstance(node_type, str):
continue
if node_type in [NodeType.START, NodeType.DATASOURCE]:
if NodeType(node_type).is_start_node:
start_node_id = nid
break

View File

@ -126,6 +126,12 @@ class Node:
start_event.provider_id = f"{plugin_id}/{provider_name}"
start_event.provider_type = getattr(self.get_base_node_data(), "provider_type", "")
from core.workflow.nodes.trigger_plugin.trigger_event_node import TriggerEventNode
if isinstance(self, TriggerEventNode):
start_event.provider_id = getattr(self.get_base_node_data(), "provider_id", "")
start_event.provider_type = getattr(self.get_base_node_data(), "provider_type", "")
from typing import cast
from core.workflow.nodes.agent.agent_node import AgentNode

View File

@ -22,6 +22,9 @@ from core.workflow.nodes.question_classifier import QuestionClassifierNode
from core.workflow.nodes.start import StartNode
from core.workflow.nodes.template_transform import TemplateTransformNode
from core.workflow.nodes.tool import ToolNode
from core.workflow.nodes.trigger_plugin import TriggerEventNode
from core.workflow.nodes.trigger_schedule import TriggerScheduleNode
from core.workflow.nodes.trigger_webhook import TriggerWebhookNode
from core.workflow.nodes.variable_aggregator import VariableAggregatorNode
from core.workflow.nodes.variable_assigner.v1 import VariableAssignerNode as VariableAssignerNodeV1
from core.workflow.nodes.variable_assigner.v2 import VariableAssignerNode as VariableAssignerNodeV2
@ -147,4 +150,16 @@ NODE_TYPE_CLASSES_MAPPING: Mapping[NodeType, Mapping[str, type[Node]]] = {
LATEST_VERSION: KnowledgeIndexNode,
"1": KnowledgeIndexNode,
},
NodeType.TRIGGER_WEBHOOK: {
LATEST_VERSION: TriggerWebhookNode,
"1": TriggerWebhookNode,
},
NodeType.TRIGGER_PLUGIN: {
LATEST_VERSION: TriggerEventNode,
"1": TriggerEventNode,
},
NodeType.TRIGGER_SCHEDULE: {
LATEST_VERSION: TriggerScheduleNode,
"1": TriggerScheduleNode,
},
}

View File

@ -164,10 +164,7 @@ class ToolNode(Node):
status=WorkflowNodeExecutionStatus.FAILED,
inputs=parameters_for_log,
metadata={WorkflowNodeExecutionMetadataKey.TOOL_INFO: tool_info},
error="An error occurred in the plugin, "
f"please contact the author of {node_data.provider_name} for help, "
f"error type: {e.get_error_type()}, "
f"error details: {e.get_error_message()}",
error=e.to_user_friendly_error(plugin_name=node_data.provider_name),
error_type=type(e).__name__,
)
)

View File

@ -0,0 +1,3 @@
from .trigger_event_node import TriggerEventNode
__all__ = ["TriggerEventNode"]

View File

@ -0,0 +1,77 @@
from collections.abc import Mapping
from typing import Any, Literal, Union
from pydantic import BaseModel, Field, ValidationInfo, field_validator
from core.trigger.entities.entities import EventParameter
from core.workflow.nodes.base.entities import BaseNodeData
from core.workflow.nodes.trigger_plugin.exc import TriggerEventParameterError
class TriggerEventNodeData(BaseNodeData):
"""Plugin trigger node data"""
class TriggerEventInput(BaseModel):
value: Union[Any, list[str]]  # effectively Any; list[str] documents the variable-selector shape
type: Literal["mixed", "variable", "constant"]
@field_validator("type", mode="before")
@classmethod
def check_type(cls, value, validation_info: ValidationInfo):
type = value
value = validation_info.data.get("value")
if value is None:
return type
if type == "mixed" and not isinstance(value, str):
raise ValueError("value must be a string")
if type == "variable":
if not isinstance(value, list):
raise ValueError("value must be a list")
for val in value:
if not isinstance(val, str):
raise ValueError("value must be a list of strings")
if type == "constant" and not isinstance(value, str | int | float | bool | dict | list):
raise ValueError("value must be a string, int, float, bool or dict")
return type
title: str
desc: str | None = None
plugin_id: str = Field(..., description="Plugin ID")
provider_id: str = Field(..., description="Provider ID")
event_name: str = Field(..., description="Event name")
subscription_id: str = Field(..., description="Subscription ID")
plugin_unique_identifier: str = Field(..., description="Plugin unique identifier")
event_parameters: Mapping[str, TriggerEventInput] = Field(default_factory=dict, description="Trigger parameters")
def resolve_parameters(
self,
*,
parameter_schemas: Mapping[str, EventParameter],
) -> Mapping[str, Any]:
"""
Generate parameters based on the given plugin trigger parameters.
Args:
parameter_schemas (Mapping[str, EventParameter]): The mapping of parameter schemas.
Returns:
Mapping[str, Any]: A dictionary containing the generated parameters.
"""
result: dict[str, Any] = {}
for parameter_name in self.event_parameters:
parameter: EventParameter | None = parameter_schemas.get(parameter_name)
if not parameter:
result[parameter_name] = None
continue
event_input = self.event_parameters[parameter_name]
# trigger node only supports constant input
if event_input.type != "constant":
raise TriggerEventParameterError(f"Unknown plugin trigger input type '{event_input.type}'")
result[parameter_name] = event_input.value
return result
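Since only constant inputs are accepted, the resolution rule can be sketched standalone, with plain dicts in place of the pydantic models:

def resolve_constant_parameters(event_parameters: dict, schemas: set) -> dict:
    result = {}
    for name, event_input in event_parameters.items():
        if name not in schemas:
            # unknown parameters resolve to None, mirroring resolve_parameters
            result[name] = None
            continue
        if event_input["type"] != "constant":
            raise ValueError(f"Unknown plugin trigger input type '{event_input['type']}'")
        result[name] = event_input["value"]
    return result

assert resolve_constant_parameters(
    {"repo": {"type": "constant", "value": "dify"}, "extra": {"type": "constant", "value": 1}},
    schemas={"repo", "extra"},
) == {"repo": "dify", "extra": 1}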

View File

@ -0,0 +1,10 @@
class TriggerEventNodeError(ValueError):
"""Base exception for plugin trigger node errors."""
pass
class TriggerEventParameterError(TriggerEventNodeError):
"""Exception raised for errors in plugin trigger parameters."""
pass

View File

@ -0,0 +1,89 @@
from collections.abc import Mapping
from typing import Any
from core.workflow.constants import SYSTEM_VARIABLE_NODE_ID
from core.workflow.entities.workflow_node_execution import WorkflowNodeExecutionMetadataKey, WorkflowNodeExecutionStatus
from core.workflow.enums import ErrorStrategy, NodeExecutionType, NodeType
from core.workflow.node_events import NodeRunResult
from core.workflow.nodes.base.entities import BaseNodeData, RetryConfig
from core.workflow.nodes.base.node import Node
from .entities import TriggerEventNodeData
class TriggerEventNode(Node):
node_type = NodeType.TRIGGER_PLUGIN
execution_type = NodeExecutionType.ROOT
_node_data: TriggerEventNodeData
def init_node_data(self, data: Mapping[str, Any]) -> None:
self._node_data = TriggerEventNodeData.model_validate(data)
def _get_error_strategy(self) -> ErrorStrategy | None:
return self._node_data.error_strategy
def _get_retry_config(self) -> RetryConfig:
return self._node_data.retry_config
def _get_title(self) -> str:
return self._node_data.title
def _get_description(self) -> str | None:
return self._node_data.desc
def _get_default_value_dict(self) -> dict[str, Any]:
return self._node_data.default_value_dict
def get_base_node_data(self) -> BaseNodeData:
return self._node_data
@classmethod
def get_default_config(cls, filters: Mapping[str, object] | None = None) -> Mapping[str, object]:
return {
"type": "plugin",
"config": {
"title": "",
"plugin_id": "",
"provider_id": "",
"event_name": "",
"subscription_id": "",
"plugin_unique_identifier": "",
"event_parameters": {},
},
}
@classmethod
def version(cls) -> str:
return "1"
def _run(self) -> NodeRunResult:
"""
Run the plugin trigger node.
This node invokes the trigger to convert request data into events
and makes them available to downstream nodes.
"""
# Get trigger data passed when workflow was triggered
metadata = {
WorkflowNodeExecutionMetadataKey.TRIGGER_INFO: {
"provider_id": self._node_data.provider_id,
"event_name": self._node_data.event_name,
"plugin_unique_identifier": self._node_data.plugin_unique_identifier,
},
}
node_inputs = dict(self.graph_runtime_state.variable_pool.user_inputs)
system_inputs = self.graph_runtime_state.variable_pool.system_variables.to_dict()
# TODO: System variables should be directly accessible, no need for special handling
# Set system variables as node outputs.
for var in system_inputs:
node_inputs[SYSTEM_VARIABLE_NODE_ID + "." + var] = system_inputs[var]
outputs = dict(node_inputs)
return NodeRunResult(
status=WorkflowNodeExecutionStatus.SUCCEEDED,
inputs=node_inputs,
outputs=outputs,
metadata=metadata,
)

View File

@ -0,0 +1,3 @@
from core.workflow.nodes.trigger_schedule.trigger_schedule_node import TriggerScheduleNode
__all__ = ["TriggerScheduleNode"]

View File

@ -0,0 +1,49 @@
from typing import Literal, Union
from pydantic import BaseModel, Field
from core.workflow.nodes.base import BaseNodeData
class TriggerScheduleNodeData(BaseNodeData):
"""
Trigger Schedule Node Data
"""
mode: str = Field(default="visual", description="Schedule mode: visual or cron")
frequency: str | None = Field(default=None, description="Frequency for visual mode: hourly, daily, weekly, monthly")
cron_expression: str | None = Field(default=None, description="Cron expression for cron mode")
visual_config: dict | None = Field(default=None, description="Visual configuration details")
timezone: str = Field(default="UTC", description="Timezone for schedule execution")
class ScheduleConfig(BaseModel):
node_id: str
cron_expression: str
timezone: str = "UTC"
class SchedulePlanUpdate(BaseModel):
node_id: str | None = None
cron_expression: str | None = None
timezone: str | None = None
class VisualConfig(BaseModel):
"""Visual configuration for schedule trigger"""
# For hourly frequency
on_minute: int | None = Field(default=0, ge=0, le=59, description="Minute of the hour (0-59)")
# For daily, weekly, monthly frequencies
time: str | None = Field(default="12:00 AM", description="Time in 12-hour format (e.g., '2:30 PM')")
# For weekly frequency
weekdays: list[Literal["sun", "mon", "tue", "wed", "thu", "fri", "sat"]] | None = Field(
default=None, description="List of weekdays to run on"
)
# For monthly frequency
monthly_days: list[Union[int, Literal["last"]]] | None = Field(
default=None, description="Days of month to run on (1-31 or 'last')"
)
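Example visual configurations, using the VisualConfig model above:

from core.workflow.nodes.trigger_schedule.entities import VisualConfig

# Weekly: run at 9:30 AM on Mondays and Fridays.
weekly = VisualConfig(time="9:30 AM", weekdays=["mon", "fri"])
# Monthly: run at midnight on the 1st and the last day of each month.
monthly = VisualConfig(time="12:00 AM", monthly_days=[1, "last"])
# Hourly: run at minute 15 of every hour.
hourly = VisualConfig(on_minute=15)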

View File

@ -0,0 +1,31 @@
from core.workflow.nodes.base.exc import BaseNodeError
class ScheduleNodeError(BaseNodeError):
"""Base schedule node error."""
pass
class ScheduleNotFoundError(ScheduleNodeError):
"""Schedule not found error."""
pass
class ScheduleConfigError(ScheduleNodeError):
"""Schedule configuration error."""
pass
class ScheduleExecutionError(ScheduleNodeError):
"""Schedule execution error."""
pass
class TenantOwnerNotFoundError(ScheduleExecutionError):
"""Tenant owner not found error for schedule execution."""
pass

View File

@ -0,0 +1,69 @@
from collections.abc import Mapping
from typing import Any
from core.workflow.constants import SYSTEM_VARIABLE_NODE_ID
from core.workflow.entities.workflow_node_execution import WorkflowNodeExecutionStatus
from core.workflow.enums import ErrorStrategy, NodeExecutionType, NodeType
from core.workflow.node_events import NodeRunResult
from core.workflow.nodes.base.entities import BaseNodeData, RetryConfig
from core.workflow.nodes.base.node import Node
from core.workflow.nodes.trigger_schedule.entities import TriggerScheduleNodeData
class TriggerScheduleNode(Node):
node_type = NodeType.TRIGGER_SCHEDULE
execution_type = NodeExecutionType.ROOT
_node_data: TriggerScheduleNodeData
def init_node_data(self, data: Mapping[str, Any]) -> None:
self._node_data = TriggerScheduleNodeData.model_validate(data)
def _get_error_strategy(self) -> ErrorStrategy | None:
return self._node_data.error_strategy
def _get_retry_config(self) -> RetryConfig:
return self._node_data.retry_config
def _get_title(self) -> str:
return self._node_data.title
def _get_description(self) -> str | None:
return self._node_data.desc
def _get_default_value_dict(self) -> dict[str, Any]:
return self._node_data.default_value_dict
def get_base_node_data(self) -> BaseNodeData:
return self._node_data
@classmethod
def version(cls) -> str:
return "1"
@classmethod
def get_default_config(cls, filters: Mapping[str, object] | None = None) -> Mapping[str, object]:
return {
"type": "trigger-schedule",
"config": {
"mode": "visual",
"frequency": "daily",
"visual_config": {"time": "12:00 AM", "on_minute": 0, "weekdays": ["sun"], "monthly_days": [1]},
"timezone": "UTC",
},
}
def _run(self) -> NodeRunResult:
node_inputs = dict(self.graph_runtime_state.variable_pool.user_inputs)
system_inputs = self.graph_runtime_state.variable_pool.system_variables.to_dict()
# TODO: System variables should be directly accessible, no need for special handling
# Set system variables as node outputs.
for var in system_inputs:
node_inputs[SYSTEM_VARIABLE_NODE_ID + "." + var] = system_inputs[var]
outputs = dict(node_inputs)
return NodeRunResult(
status=WorkflowNodeExecutionStatus.SUCCEEDED,
inputs=node_inputs,
outputs=outputs,
)

View File

@ -0,0 +1,3 @@
from .node import TriggerWebhookNode
__all__ = ["TriggerWebhookNode"]

View File

@ -0,0 +1,79 @@
from collections.abc import Sequence
from enum import StrEnum
from typing import Literal
from pydantic import BaseModel, Field, field_validator
from core.workflow.nodes.base import BaseNodeData
class Method(StrEnum):
GET = "get"
POST = "post"
HEAD = "head"
PATCH = "patch"
PUT = "put"
DELETE = "delete"
class ContentType(StrEnum):
JSON = "application/json"
FORM_DATA = "multipart/form-data"
FORM_URLENCODED = "application/x-www-form-urlencoded"
TEXT = "text/plain"
BINARY = "application/octet-stream"
class WebhookParameter(BaseModel):
"""Parameter definition for headers, query params, or body."""
name: str
required: bool = False
class WebhookBodyParameter(BaseModel):
"""Body parameter with type information."""
name: str
type: Literal[
"string",
"number",
"boolean",
"object",
"array[string]",
"array[number]",
"array[boolean]",
"array[object]",
"file",
] = "string"
required: bool = False
class WebhookData(BaseNodeData):
"""
Webhook Node Data.
"""
class SyncMode(StrEnum):
SYNC = "async" # only support
method: Method = Method.GET
content_type: ContentType = Field(default=ContentType.JSON)
headers: Sequence[WebhookParameter] = Field(default_factory=list)
params: Sequence[WebhookParameter] = Field(default_factory=list) # query parameters
body: Sequence[WebhookBodyParameter] = Field(default_factory=list)
@field_validator("method", mode="before")
@classmethod
def normalize_method(cls, v) -> str:
"""Normalize HTTP method to lowercase to support both uppercase and lowercase input."""
if isinstance(v, str):
return v.lower()
return v
status_code: int = 200 # Expected status code for response
response_body: str = "" # Template for response body
# Webhook specific fields (not from client data, set internally)
webhook_id: str | None = None # Set when webhook trigger is created
timeout: int = 30 # Timeout in seconds to wait for webhook response
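A self-contained sketch of the same before-mode normalization pattern, so the enum coercion only ever sees lowercase input:

from enum import StrEnum

from pydantic import BaseModel, field_validator

class HookMethod(StrEnum):
    GET = "get"
    POST = "post"

class HookConfig(BaseModel):
    method: HookMethod = HookMethod.GET

    @field_validator("method", mode="before")
    @classmethod
    def normalize_method(cls, v):
        # lowercase before enum coercion, so "POST" and "post" both validate
        return v.lower() if isinstance(v, str) else v

assert HookConfig(method="POST").method is HookMethod.POST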

View File

@ -0,0 +1,25 @@
from core.workflow.nodes.base.exc import BaseNodeError
class WebhookNodeError(BaseNodeError):
"""Base webhook node error."""
pass
class WebhookTimeoutError(WebhookNodeError):
"""Webhook timeout error."""
pass
class WebhookNotFoundError(WebhookNodeError):
"""Webhook not found error."""
pass
class WebhookConfigError(WebhookNodeError):
"""Webhook configuration error."""
pass

View File

@ -0,0 +1,148 @@
from collections.abc import Mapping
from typing import Any
from core.workflow.constants import SYSTEM_VARIABLE_NODE_ID
from core.workflow.entities.workflow_node_execution import WorkflowNodeExecutionStatus
from core.workflow.enums import ErrorStrategy, NodeExecutionType, NodeType
from core.workflow.node_events import NodeRunResult
from core.workflow.nodes.base.entities import BaseNodeData, RetryConfig
from core.workflow.nodes.base.node import Node
from .entities import ContentType, WebhookData
class TriggerWebhookNode(Node):
node_type = NodeType.TRIGGER_WEBHOOK
execution_type = NodeExecutionType.ROOT
_node_data: WebhookData
    def init_node_data(self, data: Mapping[str, Any]) -> None:
        self._node_data = WebhookData.model_validate(data)

    def _get_error_strategy(self) -> ErrorStrategy | None:
        return self._node_data.error_strategy

    def _get_retry_config(self) -> RetryConfig:
        return self._node_data.retry_config

    def _get_title(self) -> str:
        return self._node_data.title

    def _get_description(self) -> str | None:
        return self._node_data.desc

    def _get_default_value_dict(self) -> dict[str, Any]:
        return self._node_data.default_value_dict

    def get_base_node_data(self) -> BaseNodeData:
        return self._node_data

    @classmethod
    def get_default_config(cls, filters: Mapping[str, object] | None = None) -> Mapping[str, object]:
        return {
            "type": "webhook",
            "config": {
                "method": "get",
                "content_type": "application/json",
                "headers": [],
                "params": [],
                "body": [],
                "async_mode": True,
                "status_code": 200,
                "response_body": "",
                "timeout": 30,
            },
        }

    @classmethod
    def version(cls) -> str:
        return "1"

    def _run(self) -> NodeRunResult:
        """
        Run the webhook node.

        Like the start node, this simply takes the webhook data from the variable pool
        and makes it available to downstream nodes. The actual webhook handling
        happens in the trigger controller.
        """
        # Get webhook data from the variable pool (injected by the Celery task)
        webhook_inputs = dict(self.graph_runtime_state.variable_pool.user_inputs)

        # Extract webhook-specific outputs based on the node configuration
        outputs = self._extract_configured_outputs(webhook_inputs)

        system_inputs = self.graph_runtime_state.variable_pool.system_variables.to_dict()
        # TODO: System variables should be directly accessible, no need for special handling
        # Expose system variables as node outputs.
        for var in system_inputs:
            outputs[SYSTEM_VARIABLE_NODE_ID + "." + var] = system_inputs[var]

        return NodeRunResult(
            status=WorkflowNodeExecutionStatus.SUCCEEDED,
            inputs=webhook_inputs,
            outputs=outputs,
        )

    def _extract_configured_outputs(self, webhook_inputs: dict[str, Any]) -> dict[str, Any]:
        """Extract outputs from the webhook inputs based on the node configuration."""
        outputs = {}
        # Get the raw webhook data (injected by the Celery task)
        webhook_data = webhook_inputs.get("webhook_data", {})

        def _to_sanitized(name: str) -> str:
            return name.replace("-", "_")

        def _get_normalized(mapping: dict[str, Any], key: str) -> Any:
            if not isinstance(mapping, dict):
                return None
            if key in mapping:
                return mapping[key]
            alternate = key.replace("-", "_") if "-" in key else key.replace("_", "-")
            if alternate in mapping:
                return mapping[alternate]
            return None

        # Extract configured headers (case-insensitive)
        webhook_headers = webhook_data.get("headers", {})
        webhook_headers_lower = {k.lower(): v for k, v in webhook_headers.items()}
        for header in self._node_data.headers:
            header_name = header.name
            value = _get_normalized(webhook_headers, header_name)
            if value is None:
                value = _get_normalized(webhook_headers_lower, header_name.lower())
            sanitized_name = _to_sanitized(header_name)
            outputs[sanitized_name] = value

        # Extract configured query parameters
        for param in self._node_data.params:
            param_name = param.name
            outputs[param_name] = webhook_data.get("query_params", {}).get(param_name)

        # Extract configured body parameters
        for body_param in self._node_data.body:
            param_name = body_param.name
            param_type = body_param.type
            if self._node_data.content_type == ContentType.TEXT:
                # For text/plain, the entire body is a single string parameter
                outputs[param_name] = str(webhook_data.get("body", {}).get("raw", ""))
                continue
            elif self._node_data.content_type == ContentType.BINARY:
                outputs[param_name] = webhook_data.get("body", {}).get("raw", b"")
                continue
            if param_type == "file":
                # Get the File object (already processed by the webhook controller)
                file_obj = webhook_data.get("files", {}).get(param_name)
                outputs[param_name] = file_obj
            else:
                # Get a regular body parameter
                outputs[param_name] = webhook_data.get("body", {}).get(param_name)

        # Include the raw webhook data for debugging/advanced use
        outputs["_webhook_raw"] = webhook_data
        return outputs
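A minimal, runnable sketch of the header normalization above — the payload and header names are hypothetical, and lookup_header simply inlines the node's _get_normalized logic plus its lowercase fallback:

from typing import Any

def lookup_header(headers: dict[str, Any], name: str) -> Any:
    # Mirrors _get_normalized plus the lowercase fallback used by the node
    candidates = ((headers, name), ({k.lower(): v for k, v in headers.items()}, name.lower()))
    for mapping, key in candidates:
        if key in mapping:
            return mapping[key]
        alternate = key.replace("-", "_") if "-" in key else key.replace("_", "-")
        if alternate in mapping:
            return mapping[alternate]
    return None

headers = {"x_request_id": "abc123"}           # hypothetical incoming headers
print(lookup_header(headers, "X-Request-Id"))  # -> "abc123" (matched via -/_ swap + lowercasing)
print("X-Request-Id".replace("-", "_"))        # the sanitized output key: "X_Request_Id"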

View File

@ -29,6 +29,8 @@ class SystemVariable(BaseModel):
    app_id: str | None = None
    workflow_id: str | None = None
    timestamp: int | None = None
    files: Sequence[File] = Field(default_factory=list)

    # NOTE: The `workflow_execution_id` field was previously named `workflow_run_id`.
@ -108,6 +110,8 @@ class SystemVariable(BaseModel):
        d[SystemVariableKey.DATASOURCE_INFO] = self.datasource_info
        if self.invoke_from is not None:
            d[SystemVariableKey.INVOKE_FROM] = self.invoke_from
        if self.timestamp is not None:
            d[SystemVariableKey.TIMESTAMP] = self.timestamp
        return d

    def as_view(self) -> "SystemVariableReadOnlyView":
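A small sketch of what the new field means for serialization, assuming the remaining SystemVariable fields are optional and SystemVariableKey.TIMESTAMP is the key used above:

# Hypothetical construction; only the fields shown here are set.
sys_vars = SystemVariable(app_id="app-1", workflow_id="wf-1", timestamp=1730286116)
assert sys_vars.to_dict()[SystemVariableKey.TIMESTAMP] == 1730286116
# With timestamp left as None (the default), the key is omitted from to_dict().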

View File

@ -30,10 +30,42 @@ if [[ "${MODE}" == "worker" ]]; then
    CONCURRENCY_OPTION="-c ${CELERY_WORKER_AMOUNT:-1}"
  fi

  exec celery -A celery_entrypoint.celery worker -P ${CELERY_WORKER_CLASS:-gevent} $CONCURRENCY_OPTION \
  # Configure queues based on edition if not explicitly set
  if [[ -z "${CELERY_QUEUES}" ]]; then
    if [[ "${EDITION}" == "CLOUD" ]]; then
      # Cloud edition: separate queues for dataset and trigger tasks
      DEFAULT_QUEUES="dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow_professional,workflow_team,workflow_sandbox,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor"
    else
      # Community edition (SELF_HOSTED): dataset, pipeline and workflow have separate queues
      DEFAULT_QUEUES="dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor"
    fi
  else
    DEFAULT_QUEUES="${CELERY_QUEUES}"
  fi

  # Support for Kubernetes deployments with dedicated queue workers.
  # Environment variables that can be set:
  # - CELERY_WORKER_QUEUES: Comma-separated list of queues (overrides CELERY_QUEUES)
  # - CELERY_WORKER_CONCURRENCY: Number of worker processes (overrides CELERY_WORKER_AMOUNT)
  # - CELERY_WORKER_POOL: Pool implementation (overrides CELERY_WORKER_CLASS)
  if [[ -n "${CELERY_WORKER_QUEUES}" ]]; then
    DEFAULT_QUEUES="${CELERY_WORKER_QUEUES}"
    echo "Using CELERY_WORKER_QUEUES: ${DEFAULT_QUEUES}"
  fi

  if [[ -n "${CELERY_WORKER_CONCURRENCY}" ]]; then
    CONCURRENCY_OPTION="-c ${CELERY_WORKER_CONCURRENCY}"
    echo "Using CELERY_WORKER_CONCURRENCY: ${CELERY_WORKER_CONCURRENCY}"
  fi

  WORKER_POOL="${CELERY_WORKER_POOL:-${CELERY_WORKER_CLASS:-gevent}}"

  echo "Starting Celery worker with queues: ${DEFAULT_QUEUES}"
  exec celery -A celery_entrypoint.celery worker -P ${WORKER_POOL} $CONCURRENCY_OPTION \
    --max-tasks-per-child ${MAX_TASKS_PER_CHILD:-50} --loglevel ${LOG_LEVEL:-INFO} \
    -Q ${CELERY_QUEUES:-dataset,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,priority_pipeline,pipeline} \
    --prefetch-multiplier=1
    -Q ${DEFAULT_QUEUES} \
    --prefetch-multiplier=${CELERY_PREFETCH_MULTIPLIER:-1}

elif [[ "${MODE}" == "beat" ]]; then
  exec celery -A app.celery beat --loglevel ${LOG_LEVEL:-INFO}
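These queue names only matter if producers route tasks onto them; a hedged Python sketch of the producing side (the task name and signature are illustrative, not taken from this diff):

from celery import shared_task

@shared_task(name="tasks.trigger_processing_tasks.process_trigger")  # illustrative name
def process_trigger(trigger_id: str) -> None:
    ...  # placeholder body

# Pin the task to one of the queues the worker consumes via -Q:
process_trigger.apply_async(args=("trigger-123",), queue="triggered_workflow_dispatcher")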

View File

@ -6,12 +6,18 @@ from .create_site_record_when_app_created import handle as handle_create_site_re
from .delete_tool_parameters_cache_when_sync_draft_workflow import (
    handle as handle_delete_tool_parameters_cache_when_sync_draft_workflow,
)
from .sync_plugin_trigger_when_app_created import handle as handle_sync_plugin_trigger_when_app_created
from .sync_webhook_when_app_created import handle as handle_sync_webhook_when_app_created
from .sync_workflow_schedule_when_app_published import handle as handle_sync_workflow_schedule_when_app_published
from .update_app_dataset_join_when_app_model_config_updated import (
    handle as handle_update_app_dataset_join_when_app_model_config_updated,
)
from .update_app_dataset_join_when_app_published_workflow_updated import (
    handle as handle_update_app_dataset_join_when_app_published_workflow_updated,
)
from .update_app_triggers_when_app_published_workflow_updated import (
    handle as handle_update_app_triggers_when_app_published_workflow_updated,
)

# Consolidated handler replaces both deduct_quota_when_message_created and
# update_provider_last_used_at_when_message_created
@ -24,7 +30,11 @@ __all__ = [
    "handle_create_installed_app_when_app_created",
    "handle_create_site_record_when_app_created",
    "handle_delete_tool_parameters_cache_when_sync_draft_workflow",
    "handle_sync_plugin_trigger_when_app_created",
    "handle_sync_webhook_when_app_created",
    "handle_sync_workflow_schedule_when_app_published",
    "handle_update_app_dataset_join_when_app_model_config_updated",
    "handle_update_app_dataset_join_when_app_published_workflow_updated",
    "handle_update_app_triggers_when_app_published_workflow_updated",
    "handle_update_provider_when_message_created",
]

View File

@ -0,0 +1,22 @@
import logging

from events.app_event import app_draft_workflow_was_synced
from models.model import App, AppMode
from models.workflow import Workflow
from services.trigger.trigger_service import TriggerService

logger = logging.getLogger(__name__)


@app_draft_workflow_was_synced.connect
def handle(sender, synced_draft_workflow: Workflow, **kwargs):
    """
    When a draft workflow is created or updated, sync its plugin trigger
    relationships in the DB.
    """
    app: App = sender
    if app.mode != AppMode.WORKFLOW.value:
        # only handle workflow apps; chatflow is not supported yet
        return

    TriggerService.sync_plugin_trigger_relationships(app, synced_draft_workflow)
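For context, a one-line sketch of the emitting side of this blinker signal — the call site is assumed, not part of this diff:

# Somewhere in the draft-workflow save path (assumed), the app is the sender and
# the synced workflow is passed as the keyword argument the handler unpacks:
app_draft_workflow_was_synced.send(app, synced_draft_workflow=workflow)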

View File

@ -0,0 +1,22 @@
import logging

from events.app_event import app_draft_workflow_was_synced
from models.model import App, AppMode
from models.workflow import Workflow
from services.trigger.webhook_service import WebhookService

logger = logging.getLogger(__name__)


@app_draft_workflow_was_synced.connect
def handle(sender, synced_draft_workflow: Workflow, **kwargs):
    """
    When a draft workflow is created or updated, sync its webhook
    relationships in the DB.
    """
    app: App = sender
    if app.mode != AppMode.WORKFLOW.value:
        # only handle workflow apps; chatflow is not supported yet
        return

    WebhookService.sync_webhook_relationships(app, synced_draft_workflow)

View File

@ -0,0 +1,86 @@
import logging
from typing import cast

from sqlalchemy import select
from sqlalchemy.orm import Session

from core.workflow.nodes.trigger_schedule.entities import SchedulePlanUpdate
from events.app_event import app_published_workflow_was_updated
from extensions.ext_database import db
from models import AppMode, Workflow, WorkflowSchedulePlan
from services.trigger.schedule_service import ScheduleService

logger = logging.getLogger(__name__)


@app_published_workflow_was_updated.connect
def handle(sender, **kwargs):
    """
    Handle the app published workflow update event to sync the workflow_schedule_plans table.

    When a workflow is published, this handler will:
    1. Extract schedule trigger nodes from the workflow graph
    2. Compare them with existing workflow_schedule_plans records
    3. Create/update/delete schedule plans as needed
    """
    app = sender
    if app.mode != AppMode.WORKFLOW.value:
        return

    published_workflow = kwargs.get("published_workflow")
    published_workflow = cast(Workflow, published_workflow)

    sync_schedule_from_workflow(tenant_id=app.tenant_id, app_id=app.id, workflow=published_workflow)


def sync_schedule_from_workflow(tenant_id: str, app_id: str, workflow: Workflow) -> WorkflowSchedulePlan | None:
    """
    Sync the schedule plan from the workflow graph configuration.

    Args:
        tenant_id: Tenant ID
        app_id: App ID
        workflow: Published workflow instance

    Returns:
        The updated or created WorkflowSchedulePlan, or None if the workflow has no schedule node
    """
    with Session(db.engine) as session:
        schedule_config = ScheduleService.extract_schedule_config(workflow)
        existing_plan = session.scalar(
            select(WorkflowSchedulePlan).where(
                WorkflowSchedulePlan.tenant_id == tenant_id,
                WorkflowSchedulePlan.app_id == app_id,
            )
        )

        if not schedule_config:
            if existing_plan:
                logger.info("No schedule node in workflow for app %s, removing schedule plan", app_id)
                ScheduleService.delete_schedule(session=session, schedule_id=existing_plan.id)
                session.commit()
            return None

        if existing_plan:
            updates = SchedulePlanUpdate(
                node_id=schedule_config.node_id,
                cron_expression=schedule_config.cron_expression,
                timezone=schedule_config.timezone,
            )
            updated_plan = ScheduleService.update_schedule(
                session=session,
                schedule_id=existing_plan.id,
                updates=updates,
            )
            session.commit()
            return updated_plan
        else:
            new_plan = ScheduleService.create_schedule(
                session=session,
                tenant_id=tenant_id,
                app_id=app_id,
                config=schedule_config,
            )
            session.commit()
            return new_plan
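A hedged usage sketch: because the sync routine is a plain function, it can also be invoked directly, e.g. from a backfill script (the identifiers are illustrative):

plan = sync_schedule_from_workflow(tenant_id="tenant-1", app_id="app-1", workflow=published_workflow)
if plan is None:
    # The workflow has no schedule node; any stale plan was removed above.
    pass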

View File

@ -0,0 +1,114 @@
from typing import cast

from sqlalchemy import select
from sqlalchemy.orm import Session

from core.workflow.nodes import NodeType
from events.app_event import app_published_workflow_was_updated
from extensions.ext_database import db
from models import AppMode
from models.enums import AppTriggerStatus
from models.trigger import AppTrigger
from models.workflow import Workflow


@app_published_workflow_was_updated.connect
def handle(sender, **kwargs):
    """
    Handle the app published workflow update event to sync the app_triggers table.

    When a workflow is published, this handler will:
    1. Extract trigger nodes from the workflow graph
    2. Compare them with existing app_triggers records
    3. Add new triggers and remove obsolete ones
    """
    app = sender
    if app.mode != AppMode.WORKFLOW.value:
        return

    published_workflow = kwargs.get("published_workflow")
    published_workflow = cast(Workflow, published_workflow)

    # Extract trigger info from the workflow
    trigger_infos = get_trigger_infos_from_workflow(published_workflow)

    with Session(db.engine) as session:
        # Get existing app triggers
        existing_triggers = (
            session.execute(
                select(AppTrigger).where(AppTrigger.tenant_id == app.tenant_id, AppTrigger.app_id == app.id)
            )
            .scalars()
            .all()
        )

        # Index existing triggers by node ID for easy lookup
        existing_triggers_map = {trigger.node_id: trigger for trigger in existing_triggers}

        # Get current and new node IDs
        existing_node_ids = set(existing_triggers_map.keys())
        new_node_ids = {info["node_id"] for info in trigger_infos}

        # Calculate changes
        added_node_ids = new_node_ids - existing_node_ids
        removed_node_ids = existing_node_ids - new_node_ids

        # Remove obsolete triggers
        for node_id in removed_node_ids:
            session.delete(existing_triggers_map[node_id])

        for trigger_info in trigger_infos:
            node_id = trigger_info["node_id"]
            if node_id in added_node_ids:
                # Create a new trigger
                app_trigger = AppTrigger(
                    tenant_id=app.tenant_id,
                    app_id=app.id,
                    trigger_type=trigger_info["node_type"],
                    title=trigger_info["node_title"],
                    node_id=node_id,
                    provider_name=trigger_info.get("node_provider_name", ""),
                    status=AppTriggerStatus.ENABLED,
                )
                session.add(app_trigger)
            elif node_id in existing_node_ids:
                # Update the existing trigger if needed
                existing_trigger = existing_triggers_map[node_id]
                new_title = trigger_info["node_title"]
                if new_title and existing_trigger.title != new_title:
                    existing_trigger.title = new_title
                    session.add(existing_trigger)

        session.commit()


def get_trigger_infos_from_workflow(published_workflow: Workflow) -> list[dict]:
    """
    Extract trigger node information from the workflow graph.

    Returns:
        A list of trigger info dictionaries containing:
        - node_type: The type of the trigger node ('trigger-webhook', 'trigger-schedule', 'trigger-plugin')
        - node_id: The node ID in the workflow
        - node_title: The title of the node
        - node_provider_name: The name of the node's provider (plugin triggers only)
    """
    graph = published_workflow.graph_dict
    if not graph:
        return []

    nodes = graph.get("nodes", [])
    trigger_types = {NodeType.TRIGGER_WEBHOOK.value, NodeType.TRIGGER_SCHEDULE.value, NodeType.TRIGGER_PLUGIN.value}

    trigger_infos = [
        {
            "node_type": node.get("data", {}).get("type"),
            "node_id": node.get("id"),
            "node_title": node.get("data", {}).get("title"),
            "node_provider_name": node.get("data", {}).get("provider_name"),
        }
        for node in nodes
        if node.get("data", {}).get("type") in trigger_types
    ]
    return trigger_infos
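A self-contained sketch of the extraction behavior, using a hypothetical graph literal in place of a real Workflow:

graph = {
    "nodes": [
        {"id": "n1", "data": {"type": "trigger-webhook", "title": "Webhook In"}},
        {"id": "n2", "data": {"type": "llm", "title": "Answer"}},  # skipped: not a trigger type
    ]
}
# For a workflow whose graph_dict equals the mapping above, the helper returns:
# [{"node_type": "trigger-webhook", "node_id": "n1",
#   "node_title": "Webhook In", "node_provider_name": None}]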

View File

@ -18,6 +18,7 @@ def init_app(app: DifyApp):
    from controllers.inner_api import bp as inner_api_bp
    from controllers.mcp import bp as mcp_bp
    from controllers.service_api import bp as service_api_bp
    from controllers.trigger import bp as trigger_bp
    from controllers.web import bp as web_bp

    CORS(
@ -56,3 +57,11 @@ def init_app(app: DifyApp):
    app.register_blueprint(inner_api_bp)
    app.register_blueprint(mcp_bp)

    # Register the trigger blueprint with CORS enabled for webhook calls
    CORS(
        trigger_bp,
        allow_headers=["Content-Type", "Authorization", "X-App-Code"],
        methods=["GET", "PUT", "POST", "DELETE", "OPTIONS", "PATCH", "HEAD"],
    )
    app.register_blueprint(trigger_bp)
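A hedged sketch of why X-App-Code is allow-listed: a cross-origin caller can now send it alongside a webhook request (the endpoint path is illustrative, not taken from this diff):

import requests

resp = requests.post(
    "https://dify.example.com/triggers/webhook/<webhook-id>",  # illustrative URL
    headers={"Content-Type": "application/json", "X-App-Code": "app-code"},
    json={"user_name": "alice"},
)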

View File

@ -96,7 +96,10 @@ def init_app(app: DifyApp) -> Celery:
    celery_app.set_default()
    app.extensions["celery"] = celery_app

    imports = []
    imports = [
        "tasks.async_workflow_tasks",  # trigger workers
        "tasks.trigger_processing_tasks",  # async trigger processing
    ]

    day = dify_config.CELERY_BEAT_SCHEDULER_TIME
    # if you add a new task, please add the switch to CeleryScheduleTasksConfig
@ -157,6 +160,18 @@ def init_app(app: DifyApp) -> Celery:
        "task": "schedule.clean_workflow_runlogs_precise.clean_workflow_runlogs_precise",
        "schedule": crontab(minute="0", hour="2"),
    }
    if dify_config.ENABLE_WORKFLOW_SCHEDULE_POLLER_TASK:
        imports.append("schedule.workflow_schedule_task")
        beat_schedule["workflow_schedule_task"] = {
            "task": "schedule.workflow_schedule_task.poll_workflow_schedules",
            "schedule": timedelta(minutes=dify_config.WORKFLOW_SCHEDULE_POLLER_INTERVAL),
        }
    if dify_config.ENABLE_TRIGGER_PROVIDER_REFRESH_TASK:
        imports.append("schedule.trigger_provider_refresh_task")
        beat_schedule["trigger_provider_refresh"] = {
            "task": "schedule.trigger_provider_refresh_task.trigger_provider_refresh",
            "schedule": timedelta(minutes=dify_config.TRIGGER_PROVIDER_REFRESH_INTERVAL),
        }
    celery_app.conf.update(beat_schedule=beat_schedule, imports=imports)
    return celery_app
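Beat entries resolve by task name at runtime; a hedged sketch of the consuming side, with the name taken from the schedule above and the body a placeholder:

from celery import shared_task

@shared_task(name="schedule.workflow_schedule_task.poll_workflow_schedules")
def poll_workflow_schedules() -> None:
    # Invoked every WORKFLOW_SCHEDULE_POLLER_INTERVAL minutes once beat picks up
    # the entry registered above; the actual polling logic lives in the task module.
    ...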

View File

@ -23,6 +23,7 @@ def init_app(app: DifyApp):
        reset_password,
        setup_datasource_oauth_client,
        setup_system_tool_oauth_client,
        setup_system_trigger_oauth_client,
        transform_datasource_credentials,
        upgrade_db,
        vdb_migrate,
@ -47,6 +48,7 @@ def init_app(app: DifyApp):
        clear_orphaned_file_records,
        remove_orphaned_files_on_storage,
        setup_system_tool_oauth_client,
        setup_system_trigger_oauth_client,
        cleanup_orphaned_draft_variables,
        migrate_oss,
        setup_datasource_oauth_client,

View File

@ -8,6 +8,7 @@ from libs.helper import TimestampField

workflow_app_log_partial_fields = {
    "id": fields.String,
    "workflow_run": fields.Nested(workflow_run_for_log_fields, attribute="workflow_run", allow_null=True),
    "details": fields.Raw(attribute="details"),
    "created_from": fields.String,
    "created_by_role": fields.String,
    "created_by_account": fields.Nested(simple_account_fields, attribute="created_by_account", allow_null=True),

View File

@ -8,6 +8,7 @@ workflow_run_for_log_fields = {
    "id": fields.String,
    "version": fields.String,
    "status": fields.String,
    "triggered_from": fields.String,
    "error": fields.String,
    "elapsed_time": fields.Float,
    "total_tokens": fields.Integer,

View File

@ -0,0 +1,25 @@
from flask_restx import fields

trigger_fields = {
    "id": fields.String,
    "trigger_type": fields.String,
    "title": fields.String,
    "node_id": fields.String,
    "provider_name": fields.String,
    "icon": fields.String,
    "status": fields.String,
    "created_at": fields.DateTime(dt_format="iso8601"),
    "updated_at": fields.DateTime(dt_format="iso8601"),
}

triggers_list_fields = {"data": fields.List(fields.Nested(trigger_fields))}

webhook_trigger_fields = {
    "id": fields.String,
    "webhook_id": fields.String,
    "webhook_url": fields.String,
    "webhook_debug_url": fields.String,
    "node_id": fields.String,
    "created_at": fields.DateTime(dt_format="iso8601"),
}
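These field maps are plain flask-restx marshalling schemas; a minimal sketch with a hypothetical record:

from flask_restx import marshal

record = {  # hypothetical object shaped like an AppTrigger row
    "id": "t-1", "trigger_type": "trigger-webhook", "title": "My Hook",
    "node_id": "n1", "provider_name": "", "icon": "", "status": "enabled",
}
print(marshal(record, trigger_fields))  # unknown keys are dropped; missing keys become None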

Some files were not shown because too many files have changed in this diff.