From 688b2cfc488493ae7396e8458ac1be45236eeb6a Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Yanli=20=E7=9B=90=E7=B2=92?=
Date: Tue, 27 Jan 2026 01:09:33 +0800
Subject: [PATCH] remove the agent notes

---
 agent-notes/.gitkeep                          |  0
 .../entities/message_entities.py.md           | 25 -------------------
 .../__base/large_language_model.py.md         | 20 ---------------
 .../__base/test_llm_invoke_opaque_body.py.md  | 12 ---------
 4 files changed, 57 deletions(-)
 delete mode 100644 agent-notes/.gitkeep
 delete mode 100644 api/agent-notes/core/model_runtime/entities/message_entities.py.md
 delete mode 100644 api/agent-notes/core/model_runtime/model_providers/__base/large_language_model.py.md
 delete mode 100644 api/agent-notes/tests/unit_tests/core/model_runtime/__base/test_llm_invoke_opaque_body.py.md

diff --git a/agent-notes/.gitkeep b/agent-notes/.gitkeep
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/api/agent-notes/core/model_runtime/entities/message_entities.py.md b/api/agent-notes/core/model_runtime/entities/message_entities.py.md
deleted file mode 100644
index 6c4bbed5c0..0000000000
--- a/api/agent-notes/core/model_runtime/entities/message_entities.py.md
+++ /dev/null
@@ -1,25 +0,0 @@
-## Purpose
-
-`core/model_runtime/entities/message_entities.py` defines the canonical in-memory Pydantic entities for model runtime
-prompt messages and multi-modal message content. These entities are used across providers (built-in and plugin-backed)
-and are serialized/deserialized when exchanging prompt/response payloads between layers.
-
-## Key invariants
-
-- `PromptMessage.content` is either a `str`, a list of typed content items (discriminated by `type`), or `None`.
-- `PromptMessage.validate_content` normalizes dict/content-model inputs into the correct concrete content classes using
-  `CONTENT_TYPE_MAPPING`.
-- `PromptMessage.serialize_content` ensures a list of content items is emitted as a list of plain dicts.
-- `AssistantPromptMessage.tool_calls` may coexist with text/multi-modal content and is considered part of "non-empty".
-
-## Opaque pass-through fields
-
-- `opaque_body` is an optional JSON value on `PromptMessageContent` and `AssistantPromptMessage`.
-- It is treated as an uninterpreted provider-specific payload and must be passed through unchanged between Dify and
-  plugin LLM providers (no validation/transformation beyond JSON serialization).
-
-## Safety / compatibility notes
-
-- Do not make `opaque_body` required; existing providers/plugins may not send it.
-- Keep `type` discrimination stable; content subclasses must continue to be selectable via `Field(discriminator="type")`.
-
diff --git a/api/agent-notes/core/model_runtime/model_providers/__base/large_language_model.py.md b/api/agent-notes/core/model_runtime/model_providers/__base/large_language_model.py.md
deleted file mode 100644
index 1a05d4ae69..0000000000
--- a/api/agent-notes/core/model_runtime/model_providers/__base/large_language_model.py.md
+++ /dev/null
@@ -1,20 +0,0 @@
-## Purpose
-
-`core/model_runtime/model_providers/__base/large_language_model.py` defines the base `LargeLanguageModel` interface used
-by model providers, including plugin-backed providers via `PluginModelClient`.
-
-## Plugin invocation flow
-
-- For plugin-based providers, `invoke()` delegates to `PluginModelClient.invoke_llm(...)`, which streams
-  `LLMResultChunk` objects from the plugin daemon.
-- Dify yields chunks to callers and also aggregates chunks to fire `after_invoke` callbacks (and to construct a
-  blocking `LLMResult` when `stream=False`).
-
-## Key invariants / edge cases
-
-- When aggregating chunks into an `LLMResult`, preserve provider-specific fields on the assistant message:
-  - `AssistantPromptMessage.opaque_body` (pass-through, uninterpreted JSON).
-  - Incremental `tool_calls` (merge deltas via `_increase_tool_call`).
-- Chunk `.prompt_messages` may be empty for plugin responses (compat layer for the plugin daemon); Dify re-attaches the
-  original request `prompt_messages` for downstream consumers.
-
diff --git a/api/agent-notes/tests/unit_tests/core/model_runtime/__base/test_llm_invoke_opaque_body.py.md b/api/agent-notes/tests/unit_tests/core/model_runtime/__base/test_llm_invoke_opaque_body.py.md
deleted file mode 100644
index b5ba0ac7a0..0000000000
--- a/api/agent-notes/tests/unit_tests/core/model_runtime/__base/test_llm_invoke_opaque_body.py.md
+++ /dev/null
@@ -1,12 +0,0 @@
-## Purpose
-
-Unit tests for plugin-backed `LargeLanguageModel.invoke()` behavior around preserving provider pass-through data.
-
-## What it covers
-
-- `AssistantPromptMessage.opaque_body` from plugin `LLMResultChunk` deltas is preserved:
-  - On the returned `LLMResult` in blocking (`stream=False`) mode.
-  - On the aggregated `LLMResult` passed to `on_after_invoke` callbacks in streaming mode.
-- Streaming mode also verifies that `chunk.prompt_messages` is re-attached to the original request prompt messages.
-- Streaming aggregation merges incremental `tool_calls` across chunks.
-