Merge branch 'main' into feat/rag-2

twwu 2025-08-18 11:16:18 +08:00
commit 386614951f
99 changed files with 3421 additions and 1810 deletions

File diff suppressed because it is too large

View File

@@ -82,7 +82,7 @@ jobs:
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
version: 10
package_json_file: web/package.json
run_install: false
- name: Setup NodeJS
@@ -95,10 +95,12 @@ jobs:
- name: Web dependencies
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web
run: pnpm install --frozen-lockfile
- name: Web style check
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web
run: pnpm run lint
docker-compose-template:

View File

@@ -46,7 +46,7 @@ jobs:
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
version: 10
package_json_file: web/package.json
run_install: false
- name: Set up Node.js
@@ -59,10 +59,12 @@ jobs:
- name: Install dependencies
if: env.FILES_CHANGED == 'true'
working-directory: ./web
run: pnpm install --frozen-lockfile
- name: Generate i18n translations
if: env.FILES_CHANGED == 'true'
working-directory: ./web
run: pnpm run auto-gen-i18n ${{ env.FILE_ARGS }}
- name: Create Pull Request

View File

@@ -35,7 +35,7 @@ jobs:
if: steps.changed-files.outputs.any_changed == 'true'
uses: pnpm/action-setup@v4
with:
version: 10
package_json_file: web/package.json
run_install: false
- name: Setup Node.js
@@ -48,8 +48,10 @@ jobs:
- name: Install dependencies
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web
run: pnpm install --frozen-lockfile
- name: Run tests
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web
run: pnpm test

CLAUDE.md (new file, 83 lines)
View File

@@ -0,0 +1,83 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
Dify is an open-source platform for developing LLM applications with an intuitive interface combining agentic AI workflows, RAG pipelines, agent capabilities, and model management.
The codebase consists of:
- **Backend API** (`/api`): Python Flask application with Domain-Driven Design architecture
- **Frontend Web** (`/web`): Next.js 15 application with TypeScript and React 19
- **Docker deployment** (`/docker`): Containerized deployment configurations
## Development Commands
### Backend (API)
All Python commands must be prefixed with `uv run --project api`:
```bash
# Start development servers
./dev/start-api # Start API server
./dev/start-worker # Start Celery worker
# Run tests
uv run --project api pytest # Run all tests
uv run --project api pytest tests/unit_tests/ # Unit tests only
uv run --project api pytest tests/integration_tests/ # Integration tests
# Code quality
./dev/reformat # Run all formatters and linters
uv run --project api ruff check --fix ./ # Fix linting issues
uv run --project api ruff format ./ # Format code
uv run --project api mypy . # Type checking
```
### Frontend (Web)
```bash
cd web
pnpm lint # Run ESLint
pnpm eslint-fix # Fix ESLint issues
pnpm test # Run Jest tests
```
## Testing Guidelines
### Backend Testing
- Use `pytest` for all backend tests
- Write tests first (TDD approach)
- Test structure: Arrange-Act-Assert
## Code Style Requirements
### Python
- Use type hints for all functions and class attributes
- No `Any` types unless absolutely necessary
- Implement special methods (`__repr__`, `__str__`) appropriately
### TypeScript/JavaScript
- Strict TypeScript configuration
- ESLint with Prettier integration
- Avoid `any` type
## Important Notes
- **Environment Variables**: Always use UV for Python commands: `uv run --project api <command>`
- **Comments**: Only write meaningful comments that explain "why", not "what"
- **File Creation**: Always prefer editing existing files over creating new ones
- **Documentation**: Don't create documentation files unless explicitly requested
- **Code Quality**: Always run `./dev/reformat` before committing backend changes
## Common Development Tasks
### Adding a New API Endpoint
1. Create controller in `/api/controllers/`
2. Add service logic in `/api/services/`
3. Update routes in controller's `__init__.py`
4. Write tests in `/api/tests/`
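As a rough, framework-free sketch of that controller/service split (all names here are hypothetical, not actual Dify code):

```python
# Hypothetical controller/service layering; real Dify endpoints are
# flask_restful Resource classes, but the split is the same.
class WidgetService:
    """Business logic, lives in /api/services/ (step 2)."""

    def create(self, name: str) -> dict:
        return {"id": 1, "name": name}


class WidgetController:
    """Request validation plus delegation, lives in /api/controllers/ (step 1)."""

    def __init__(self, service: WidgetService) -> None:
        self.service = service

    def post(self, payload: dict) -> tuple[dict, int]:
        if "name" not in payload:
            return {"error": "name is required"}, 400
        return self.service.create(payload["name"]), 201
```

The point of the split is that the controller stays a thin HTTP adapter, so the service can be unit-tested (step 4) without a request context.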
## Project-Specific Conventions
- All async tasks use Celery with Redis as broker

View File

@@ -225,7 +225,8 @@ Deploy Dify to AWS with [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Using Alibaba Cloud Computing Nest

View File

@@ -208,7 +208,8 @@ docker compose up -d
##### AWS
- [AWS CDK بواسطة @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK بواسطة @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK بواسطة @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### استخدام Alibaba Cloud للنشر
[بسرعة نشر Dify إلى سحابة علي بابا مع عش الحوسبة السحابية علي بابا](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

View File

@@ -225,7 +225,8 @@ GitHub-এ ডিফাইকে স্টার দিয়ে রাখুন
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud ব্যবহার করে ডিপ্লয়

View File

@@ -223,7 +223,8 @@ docker compose up -d
使用 [CDK](https://aws.amazon.com/cdk/) 将 Dify 部署到 AWS
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### 使用 阿里云计算巢 部署

View File

@@ -220,7 +220,8 @@ Stellen Sie Dify mit nur einem Klick mithilfe von [terraform](https://www.terraf
Bereitstellung von Dify auf AWS mit [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -220,7 +220,8 @@ Despliega Dify en una plataforma en la nube con un solo clic utilizando [terrafo
Despliegue Dify en AWS usando [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK por @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK por @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK por @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -218,7 +218,8 @@ Déployez Dify sur une plateforme cloud en un clic en utilisant [terraform](http
Déployez Dify sur AWS en utilisant [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK par @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK par @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK par @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -219,7 +219,8 @@ docker compose up -d
[CDK](https://aws.amazon.com/cdk/) を使用して、DifyをAWSにデプロイします
##### AWS
- [@KevinZhaoによるAWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [@KevinZhaoによるAWS CDK (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [@tmokmssによるAWS CDK (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud
[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

View File

@@ -218,7 +218,8 @@ wa'logh nIqHom neH ghun deployment toy'wI' [terraform](https://www.terraform.io/
wa'logh nIqHom neH ghun deployment toy'wI' [CDK](https://aws.amazon.com/cdk/) lo'laH.
##### AWS
- [AWS CDK qachlot @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK qachlot @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK qachlot @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -212,7 +212,8 @@ Dify를 Kubernetes에 배포하고 프리미엄 스케일링 설정을 구성했
[CDK](https://aws.amazon.com/cdk/)를 사용하여 AWS에 Dify 배포
##### AWS
- [KevinZhao의 AWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [KevinZhao의 AWS CDK (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [tmokmss의 AWS CDK (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -217,7 +217,8 @@ Implante o Dify na Plataforma Cloud com um único clique usando [terraform](http
Implante o Dify na AWS usando [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK por @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK por @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK por @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -218,7 +218,8 @@ namestite Dify v Cloud Platform z enim klikom z uporabo [terraform](https://www.
Uvedite Dify v AWS z uporabo [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK by @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -211,7 +211,8 @@ Dify'ı bulut platformuna tek tıklamayla dağıtın [terraform](https://www.ter
[CDK](https://aws.amazon.com/cdk/) kullanarak Dify'ı AWS'ye dağıtın
##### AWS
- [AWS CDK tarafından @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK tarafından @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK tarafından @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -223,7 +223,8 @@ Dify 的所有功能都提供相應的 API,因此您可以輕鬆地將 Dify
### AWS
- [由 @KevinZhao 提供的 AWS CDK](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [由 @KevinZhao 提供的 AWS CDK (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [由 @tmokmss 提供的 AWS CDK (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### 使用 阿里云计算巢進行部署

View File

@@ -213,7 +213,8 @@ Triển khai Dify lên nền tảng đám mây với một cú nhấp chuột b
Triển khai Dify trên AWS bằng [CDK](https://aws.amazon.com/cdk/)
##### AWS
- [AWS CDK bởi @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK bởi @KevinZhao (EKS based)](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)
- [AWS CDK bởi @tmokmss (ECS based)](https://github.com/aws-samples/dify-self-hosted-on-aws)
#### Alibaba Cloud

View File

@@ -42,6 +42,15 @@ REDIS_PORT=6379
REDIS_USERNAME=
REDIS_PASSWORD=difyai123456
REDIS_USE_SSL=false
# SSL configuration for Redis (when REDIS_USE_SSL=true)
REDIS_SSL_CERT_REQS=CERT_NONE
# Options: CERT_NONE, CERT_OPTIONAL, CERT_REQUIRED
REDIS_SSL_CA_CERTS=
# Path to CA certificate file for SSL verification
REDIS_SSL_CERTFILE=
# Path to client certificate file for SSL authentication
REDIS_SSL_KEYFILE=
# Path to client private key file for SSL authentication
REDIS_DB=0
# redis Sentinel configuration.
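For orientation, the `REDIS_SSL_CERT_REQS` values correspond to constants in Python's `ssl` module, and the file paths feed redis-py style `ssl_*` keyword arguments. A sketch of the assumed mapping (the real wiring lives in the Redis config and extension code, not here):

```python
import ssl

# Assumed mapping from the env-var strings above to ssl module constants.
CERT_REQS = {
    "CERT_NONE": ssl.CERT_NONE,
    "CERT_OPTIONAL": ssl.CERT_OPTIONAL,
    "CERT_REQUIRED": ssl.CERT_REQUIRED,
}

def redis_ssl_kwargs(cert_reqs="CERT_NONE", ca_certs=None, certfile=None, keyfile=None):
    """Build the redis-py style SSL kwargs these settings describe."""
    return {
        "ssl": True,
        "ssl_cert_reqs": CERT_REQS[cert_reqs],
        "ssl_ca_certs": ca_certs,
        "ssl_certfile": certfile,
        "ssl_keyfile": keyfile,
    }
```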

View File

@@ -51,6 +51,7 @@ def initialize_extensions(app: DifyApp):
ext_login,
ext_mail,
ext_migrate,
ext_orjson,
ext_otel,
ext_proxy_fix,
ext_redis,
@@ -67,6 +68,7 @@ def initialize_extensions(app: DifyApp):
ext_logging,
ext_warnings,
ext_import_modules,
ext_orjson,
ext_set_secretkey,
ext_compress,
ext_code_based_extension,

View File

@@ -39,6 +39,26 @@ class RedisConfig(BaseSettings):
default=False,
)
REDIS_SSL_CERT_REQS: str = Field(
description="SSL certificate requirements (CERT_NONE, CERT_OPTIONAL, CERT_REQUIRED)",
default="CERT_NONE",
)
REDIS_SSL_CA_CERTS: Optional[str] = Field(
description="Path to the CA certificate file for SSL verification",
default=None,
)
REDIS_SSL_CERTFILE: Optional[str] = Field(
description="Path to the client certificate file for SSL authentication",
default=None,
)
REDIS_SSL_KEYFILE: Optional[str] = Field(
description="Path to the client private key file for SSL authentication",
default=None,
)
REDIS_USE_SENTINEL: Optional[bool] = Field(
description="Enable Redis Sentinel mode for high availability",
default=False,

View File

@@ -1,6 +1,7 @@
import logging
import flask_login
from flask import request
from flask_restful import Resource, reqparse
from werkzeug.exceptions import InternalServerError, NotFound
@@ -24,6 +25,7 @@ from core.errors.error import (
ProviderTokenNotInitError,
QuotaExceededError,
)
from core.helper.trace_id_helper import get_external_trace_id
from core.model_runtime.errors.invoke import InvokeError
from libs import helper
from libs.helper import uuid_value
@@ -115,6 +117,10 @@ class ChatMessageApi(Resource):
streaming = args["response_mode"] != "blocking"
args["auto_generate_name"] = False
external_trace_id = get_external_trace_id(request)
if external_trace_id:
args["external_trace_id"] = external_trace_id
account = flask_login.current_user
try:

View File

@@ -23,6 +23,7 @@ from core.app.app_config.features.file_upload.manager import FileUploadConfigMan
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.file.models import File
from core.helper.trace_id_helper import get_external_trace_id
from extensions.ext_database import db
from factories import file_factory, variable_factory
from fields.workflow_fields import workflow_fields, workflow_pagination_fields
@@ -185,6 +186,10 @@ class AdvancedChatDraftWorkflowRunApi(Resource):
args = parser.parse_args()
external_trace_id = get_external_trace_id(request)
if external_trace_id:
args["external_trace_id"] = external_trace_id
try:
response = AppGenerateService.generate(
app_model=app_model, user=current_user, args=args, invoke_from=InvokeFrom.DEBUGGER, streaming=True
@@ -373,6 +378,10 @@ class DraftWorkflowRunApi(Resource):
parser.add_argument("files", type=list, required=False, location="json")
args = parser.parse_args()
external_trace_id = get_external_trace_id(request)
if external_trace_id:
args["external_trace_id"] = external_trace_id
try:
response = AppGenerateService.generate(
app_model=app_model,

View File

@@ -163,11 +163,11 @@ class WorkflowVariableCollectionApi(Resource):
draft_var_srv = WorkflowDraftVariableService(
session=session,
)
workflow_vars = draft_var_srv.list_variables_without_values(
app_id=app_model.id,
page=args.page,
limit=args.limit,
)
workflow_vars = draft_var_srv.list_variables_without_values(
app_id=app_model.id,
page=args.page,
limit=args.limit,
)
return workflow_vars

View File

@@ -32,7 +32,7 @@ class VersionApi(Resource):
return result
try:
response = requests.get(check_update_url, {"current_version": args.get("current_version")})
response = requests.get(check_update_url, {"current_version": args.get("current_version")}, timeout=(3, 10))
except Exception as error:
logging.warning("Check update version error: %s.", str(error))
result["version"] = args.get("current_version")

View File

@@ -1,5 +1,3 @@
import json
from flask_restful import Resource, marshal_with, reqparse
from flask_restful.inputs import int_range
from sqlalchemy.orm import Session
@@ -136,12 +134,15 @@ class ConversationVariableDetailApi(Resource):
variable_id = str(variable_id)
parser = reqparse.RequestParser()
parser.add_argument("value", required=True, location="json")
# using lambda is for passing the already-typed value without modification
# if no lambda, it will be converted to string
# the string cannot be converted using json.loads
parser.add_argument("value", required=True, location="json", type=lambda x: x)
args = parser.parse_args()
try:
return ConversationService.update_conversation_variable(
app_model, conversation_id, variable_id, end_user, json.loads(args["value"])
app_model, conversation_id, variable_id, end_user, args["value"]
)
except services.errors.conversation.ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")
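The identity `type=lambda x: x` in the change above matters because reqparse applies `type` to each value; coercing an already-parsed dict or list to `str` produces Python repr syntax, which `json.loads` then rejects. A plain-Python illustration of that failure mode (no flask_restful involved):

```python
import json

def can_json_roundtrip(s: str) -> bool:
    """Return True if the string parses back as JSON."""
    try:
        json.loads(s)
        return True
    except json.JSONDecodeError:
        return False

value = {"k": [1, 2]}     # an already-typed value from a JSON body
stringified = str(value)  # what a str coercion yields: "{'k': [1, 2]}"
```

The single-quoted repr is not valid JSON, so the round-trip fails; the identity lambda sidesteps the coercion entirely and hands the typed value straight to the service layer.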

View File

@@ -140,7 +140,9 @@ class ChatAppGenerator(MessageBasedAppGenerator):
)
# get tracing instance
trace_manager = TraceQueueManager(app_id=app_model.id)
trace_manager = TraceQueueManager(
app_id=app_model.id, user_id=user.id if isinstance(user, Account) else user.session_id
)
# init application generate entity
application_generate_entity = ChatAppGenerateEntity(

View File

@@ -124,7 +124,9 @@ class CompletionAppGenerator(MessageBasedAppGenerator):
)
# get tracing instance
trace_manager = TraceQueueManager(app_model.id)
trace_manager = TraceQueueManager(
app_id=app_model.id, user_id=user.id if isinstance(user, Account) else user.session_id
)
# init application generate entity
application_generate_entity = CompletionAppGenerateEntity(

View File

@@ -6,7 +6,6 @@ from core.app.entities.queue_entities import (
MessageQueueMessage,
QueueAdvancedChatMessageEndEvent,
QueueErrorEvent,
QueueMessage,
QueueMessageEndEvent,
QueueStopEvent,
)
@@ -22,15 +21,6 @@ class MessageBasedAppQueueManager(AppQueueManager):
self._app_mode = app_mode
self._message_id = str(message_id)
def construct_queue_message(self, event: AppQueueEvent) -> QueueMessage:
return MessageQueueMessage(
task_id=self._task_id,
message_id=self._message_id,
conversation_id=self._conversation_id,
app_mode=self._app_mode,
event=event,
)
def _publish(self, event: AppQueueEvent, pub_from: PublishFrom) -> None:
"""
Publish event to queue

View File

@@ -5,7 +5,7 @@ from base64 import b64encode
from collections.abc import Mapping
from typing import Any
from core.variables.utils import SegmentJSONEncoder
from core.variables.utils import dumps_with_segments
class TemplateTransformer(ABC):
@@ -93,7 +93,7 @@ class TemplateTransformer(ABC):
@classmethod
def serialize_inputs(cls, inputs: Mapping[str, Any]) -> str:
inputs_json_str = json.dumps(inputs, ensure_ascii=False, cls=SegmentJSONEncoder).encode()
inputs_json_str = dumps_with_segments(inputs, ensure_ascii=False).encode()
input_base64_encoded = b64encode(inputs_json_str).decode("utf-8")
return input_base64_encoded

View File

@@ -16,15 +16,33 @@ def get_external_trace_id(request: Any) -> Optional[str]:
"""
Retrieve the trace_id from the request.
Priority: header ('X-Trace-Id'), then parameters, then JSON body. Returns None if not provided or invalid.
Priority:
1. header ('X-Trace-Id')
2. parameters
3. JSON body
4. Current OpenTelemetry context (if enabled)
5. OpenTelemetry traceparent header (if present and valid)
Returns None if no valid trace_id is provided.
"""
trace_id = request.headers.get("X-Trace-Id")
if not trace_id:
trace_id = request.args.get("trace_id")
if not trace_id and getattr(request, "is_json", False):
json_data = getattr(request, "json", None)
if json_data:
trace_id = json_data.get("trace_id")
if not trace_id:
trace_id = get_trace_id_from_otel_context()
if not trace_id:
traceparent = request.headers.get("traceparent")
if traceparent:
trace_id = parse_traceparent_header(traceparent)
if isinstance(trace_id, str) and is_valid_trace_id(trace_id):
return trace_id
return None
@@ -40,3 +58,49 @@ def extract_external_trace_id_from_args(args: Mapping[str, Any]) -> dict:
if trace_id:
return {"external_trace_id": trace_id}
return {}
def get_trace_id_from_otel_context() -> Optional[str]:
"""
Retrieve the current trace ID from the active OpenTelemetry trace context.
Returns None if:
1. OpenTelemetry SDK is not installed or enabled.
2. There is no active span or trace context.
"""
try:
from opentelemetry.trace import SpanContext, get_current_span
from opentelemetry.trace.span import INVALID_TRACE_ID
span = get_current_span()
if not span:
return None
span_context: SpanContext = span.get_span_context()
if not span_context or span_context.trace_id == INVALID_TRACE_ID:
return None
trace_id_hex = f"{span_context.trace_id:032x}"
return trace_id_hex
except Exception:
return None
def parse_traceparent_header(traceparent: str) -> Optional[str]:
"""
Parse the `traceparent` header to extract the trace_id.
Expected format:
'version-trace_id-span_id-flags'
Reference:
W3C Trace Context Specification: https://www.w3.org/TR/trace-context/
"""
try:
parts = traceparent.split("-")
if len(parts) == 4 and len(parts[1]) == 32:
return parts[1]
except Exception:
pass
return None
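For reference, a standalone version of the extraction performed by `parse_traceparent_header` (the example header below is the one used in the W3C Trace Context specification):

```python
def extract_trace_id(traceparent: str):
    """Return the 32-hex-digit trace_id field of 'version-trace_id-span_id-flags'."""
    parts = traceparent.split("-")
    if len(parts) == 4 and len(parts[1]) == 32:
        return parts[1]
    return None

# Example from the W3C spec: version 00, sampled flag 01
header = "00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01"
```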

View File

@@ -10,8 +10,6 @@ from core.mcp.types import (
from models.tools import MCPToolProvider
from services.tools.mcp_tools_manage_service import MCPToolManageService
LATEST_PROTOCOL_VERSION = "1.0"
class OAuthClientProvider:
mcp_provider: MCPToolProvider

View File

@@ -7,6 +7,7 @@ from typing import Any, TypeAlias, final
from urllib.parse import urljoin, urlparse
import httpx
from httpx_sse import EventSource, ServerSentEvent
from sseclient import SSEClient
from core.mcp import types
@@ -37,11 +38,6 @@ WriteQueue: TypeAlias = queue.Queue[SessionMessage | Exception | None]
StatusQueue: TypeAlias = queue.Queue[_StatusReady | _StatusError]
def remove_request_params(url: str) -> str:
"""Remove request parameters from URL, keeping only the path."""
return urljoin(url, urlparse(url).path)
class SSETransport:
"""SSE client transport implementation."""
@@ -114,7 +110,7 @@ class SSETransport:
logger.exception("Error parsing server message")
read_queue.put(exc)
def _handle_sse_event(self, sse, read_queue: ReadQueue, status_queue: StatusQueue) -> None:
def _handle_sse_event(self, sse: ServerSentEvent, read_queue: ReadQueue, status_queue: StatusQueue) -> None:
"""Handle a single SSE event.
Args:
@@ -130,7 +126,7 @@ class SSETransport:
case _:
logger.warning("Unknown SSE event: %s", sse.event)
def sse_reader(self, event_source, read_queue: ReadQueue, status_queue: StatusQueue) -> None:
def sse_reader(self, event_source: EventSource, read_queue: ReadQueue, status_queue: StatusQueue) -> None:
"""Read and process SSE events.
Args:
@@ -225,7 +221,7 @@ class SSETransport:
self,
executor: ThreadPoolExecutor,
client: httpx.Client,
event_source,
event_source: EventSource,
) -> tuple[ReadQueue, WriteQueue]:
"""Establish connection and start worker threads.

View File

@@ -16,13 +16,14 @@ from extensions.ext_database import db
from models.model import App, AppMCPServer, AppMode, EndUser
from services.app_generate_service import AppGenerateService
"""
Apply to MCP HTTP streamable server with stateless http
"""
logger = logging.getLogger(__name__)
class MCPServerStreamableHTTPRequestHandler:
"""
Apply to MCP HTTP streamable server with stateless http
"""
def __init__(
self, app: App, request: types.ClientRequest | types.ClientNotification, user_input_form: list[VariableEntity]
):

View File

@@ -1,6 +1,10 @@
import json
from collections.abc import Generator
from contextlib import AbstractContextManager
import httpx
import httpx_sse
from httpx_sse import connect_sse
from configs import dify_config
from core.mcp.types import ErrorData, JSONRPCError
@@ -55,20 +59,42 @@ def create_ssrf_proxy_mcp_http_client(
)
def ssrf_proxy_sse_connect(url, **kwargs):
def ssrf_proxy_sse_connect(url: str, **kwargs) -> AbstractContextManager[httpx_sse.EventSource]:
"""Connect to SSE endpoint with SSRF proxy protection.
This function creates an SSE connection using the configured proxy settings
to prevent SSRF attacks when connecting to external endpoints.
to prevent SSRF attacks when connecting to external endpoints. It returns
a context manager that yields an EventSource object for SSE streaming.
The function handles HTTP client creation and cleanup automatically, but
also accepts a pre-configured client via kwargs.
Args:
url: The SSE endpoint URL
**kwargs: Additional arguments passed to the SSE connection
url (str): The SSE endpoint URL to connect to
**kwargs: Additional arguments passed to the SSE connection, including:
- client (httpx.Client, optional): Pre-configured HTTP client.
If not provided, one will be created with SSRF protection.
- method (str, optional): HTTP method to use, defaults to "GET"
- headers (dict, optional): HTTP headers to include in the request
- timeout (httpx.Timeout, optional): Timeout configuration for the connection
Returns:
EventSource object for SSE streaming
AbstractContextManager[httpx_sse.EventSource]: A context manager that yields an EventSource
object for SSE streaming. The EventSource provides access to server-sent events.
Example:
```python
with ssrf_proxy_sse_connect(url, headers=headers) as event_source:
for sse in event_source.iter_sse():
print(sse.event, sse.data)
```
Note:
If a client is not provided in kwargs, one will be automatically created
with SSRF protection based on the application's configuration. If an
exception occurs during connection, any automatically created client
will be cleaned up automatically.
"""
from httpx_sse import connect_sse
# Extract client if provided, otherwise create one
client = kwargs.pop("client", None)
@@ -101,7 +127,9 @@ def ssrf_proxy_sse_connect(url, **kwargs):
raise
def create_mcp_error_response(request_id: int | str | None, code: int, message: str, data=None):
def create_mcp_error_response(
request_id: int | str | None, code: int, message: str, data=None
) -> Generator[bytes, None, None]:
"""Create MCP error response"""
error_data = ErrorData(code=code, message=message, data=data)
json_response = JSONRPCError(

View File

@@ -151,12 +151,9 @@ def jsonable_encoder(
return format(obj, "f")
if isinstance(obj, dict):
encoded_dict = {}
allowed_keys = set(obj.keys())
for key, value in obj.items():
if (
(not sqlalchemy_safe or (not isinstance(key, str)) or (not key.startswith("_sa")))
and (value is not None or not exclude_none)
and key in allowed_keys
if (not sqlalchemy_safe or (not isinstance(key, str)) or (not key.startswith("_sa"))) and (
value is not None or not exclude_none
):
encoded_key = jsonable_encoder(
key,
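The simplification above is safe because `allowed_keys = set(obj.keys())` combined with `key in allowed_keys` inside a loop over `obj.items()` was always true. The remaining condition can be sketched standalone (a hypothetical helper, not the real `jsonable_encoder`):

```python
def filter_items(obj: dict, sqlalchemy_safe: bool = True, exclude_none: bool = False) -> dict:
    # Drop SQLAlchemy-internal "_sa*" keys (when sqlalchemy_safe) and
    # None values (when exclude_none); keep everything else.
    return {
        key: value
        for key, value in obj.items()
        if (not sqlalchemy_safe or not isinstance(key, str) or not key.startswith("_sa"))
        and (value is not None or not exclude_none)
    }
```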

View File

@@ -4,15 +4,15 @@ from collections.abc import Sequence
from typing import Optional
from urllib.parse import urljoin
from opentelemetry.trace import Status, StatusCode
from opentelemetry.trace import Link, Status, StatusCode
from sqlalchemy.orm import Session, sessionmaker
from core.ops.aliyun_trace.data_exporter.traceclient import (
TraceClient,
convert_datetime_to_nanoseconds,
convert_string_to_id,
convert_to_span_id,
convert_to_trace_id,
create_link,
generate_span_id,
)
from core.ops.aliyun_trace.entities.aliyun_trace_entity import SpanData
@@ -103,10 +103,11 @@ class AliyunDataTrace(BaseTraceInstance):
def workflow_trace(self, trace_info: WorkflowTraceInfo):
trace_id = convert_to_trace_id(trace_info.workflow_run_id)
links = []
if trace_info.trace_id:
trace_id = convert_string_to_id(trace_info.trace_id)
links.append(create_link(trace_id_str=trace_info.trace_id))
workflow_span_id = convert_to_span_id(trace_info.workflow_run_id, "workflow")
self.add_workflow_span(trace_id, workflow_span_id, trace_info)
self.add_workflow_span(trace_id, workflow_span_id, trace_info, links)
workflow_node_executions = self.get_workflow_node_executions(trace_info)
for node_execution in workflow_node_executions:
@@ -132,8 +133,9 @@ class AliyunDataTrace(BaseTraceInstance):
status = Status(StatusCode.ERROR, trace_info.error)
trace_id = convert_to_trace_id(message_id)
links = []
if trace_info.trace_id:
trace_id = convert_string_to_id(trace_info.trace_id)
links.append(create_link(trace_id_str=trace_info.trace_id))
message_span_id = convert_to_span_id(message_id, "message")
message_span = SpanData(
@@ -152,6 +154,7 @@ class AliyunDataTrace(BaseTraceInstance):
OUTPUT_VALUE: str(trace_info.outputs),
},
status=status,
links=links,
)
self.trace_client.add_span(message_span)
@@ -192,8 +195,9 @@ class AliyunDataTrace(BaseTraceInstance):
message_id = trace_info.message_id
trace_id = convert_to_trace_id(message_id)
links = []
if trace_info.trace_id:
trace_id = convert_string_to_id(trace_info.trace_id)
links.append(create_link(trace_id_str=trace_info.trace_id))
documents_data = extract_retrieval_documents(trace_info.documents)
dataset_retrieval_span = SpanData(
@@ -211,6 +215,7 @@ class AliyunDataTrace(BaseTraceInstance):
INPUT_VALUE: str(trace_info.inputs),
OUTPUT_VALUE: json.dumps(documents_data, ensure_ascii=False),
},
links=links,
)
self.trace_client.add_span(dataset_retrieval_span)
@@ -224,8 +229,9 @@ class AliyunDataTrace(BaseTraceInstance):
status = Status(StatusCode.ERROR, trace_info.error)
trace_id = convert_to_trace_id(message_id)
links = []
if trace_info.trace_id:
trace_id = convert_string_to_id(trace_info.trace_id)
links.append(create_link(trace_id_str=trace_info.trace_id))
tool_span = SpanData(
trace_id=trace_id,
@@ -244,6 +250,7 @@ class AliyunDataTrace(BaseTraceInstance):
OUTPUT_VALUE: str(trace_info.tool_outputs),
},
status=status,
links=links,
)
self.trace_client.add_span(tool_span)
@@ -413,7 +420,9 @@ class AliyunDataTrace(BaseTraceInstance):
status=self.get_workflow_node_status(node_execution),
)
def add_workflow_span(self, trace_id: int, workflow_span_id: int, trace_info: WorkflowTraceInfo):
def add_workflow_span(
self, trace_id: int, workflow_span_id: int, trace_info: WorkflowTraceInfo, links: Sequence[Link]
):
message_span_id = None
if trace_info.message_id:
message_span_id = convert_to_span_id(trace_info.message_id, "message")
@@ -438,6 +447,7 @@ class AliyunDataTrace(BaseTraceInstance):
OUTPUT_VALUE: json.dumps(trace_info.workflow_run_outputs, ensure_ascii=False),
},
status=status,
links=links,
)
self.trace_client.add_span(message_span)
@@ -456,6 +466,7 @@ class AliyunDataTrace(BaseTraceInstance):
OUTPUT_VALUE: json.dumps(trace_info.workflow_run_outputs, ensure_ascii=False),
},
status=status,
links=links,
)
self.trace_client.add_span(workflow_span)
@@ -466,8 +477,9 @@ class AliyunDataTrace(BaseTraceInstance):
status = Status(StatusCode.ERROR, trace_info.error)
trace_id = convert_to_trace_id(message_id)
links = []
if trace_info.trace_id:
trace_id = convert_string_to_id(trace_info.trace_id)
links.append(create_link(trace_id_str=trace_info.trace_id))
suggested_question_span = SpanData(
trace_id=trace_id,
@@ -487,6 +499,7 @@ class AliyunDataTrace(BaseTraceInstance):
OUTPUT_VALUE: json.dumps(trace_info.suggested_question, ensure_ascii=False),
},
status=status,
links=links,
)
self.trace_client.add_span(suggested_question_span)

View File

@@ -16,6 +16,7 @@ from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import ReadableSpan
from opentelemetry.sdk.util.instrumentation import InstrumentationScope
from opentelemetry.semconv.resource import ResourceAttributes
from opentelemetry.trace import Link, SpanContext, TraceFlags
from configs import dify_config
from core.ops.aliyun_trace.entities.aliyun_trace_entity import SpanData
@@ -166,6 +167,16 @@ class SpanBuilder:
return span
def create_link(trace_id_str: str) -> Link:
placeholder_span_id = 0x0000000000000000
trace_id = int(trace_id_str, 16)
span_context = SpanContext(
trace_id=trace_id, span_id=placeholder_span_id, is_remote=False, trace_flags=TraceFlags(TraceFlags.SAMPLED)
)
return Link(span_context)
def generate_span_id() -> int:
span_id = random.getrandbits(64)
while span_id == INVALID_SPAN_ID:

View File

@@ -523,7 +523,7 @@ class ProviderManager:
# Init trial provider records if not exists
if ProviderQuotaType.TRIAL not in provider_quota_to_provider_record_dict:
try:
# FIXME ignore the type errork, onyl TrialHostingQuota has limit need to change the logic
# FIXME ignore the type error, only TrialHostingQuota has limit need to change the logic
new_provider_record = Provider(
tenant_id=tenant_id,
# TODO: Use provider name with prefix after the data migration.

View File

@ -1,7 +1,7 @@
import json
from collections import defaultdict
from typing import Any, Optional
import orjson
from pydantic import BaseModel
from configs import dify_config
@ -135,13 +135,13 @@ class Jieba(BaseKeyword):
dataset_keyword_table = self.dataset.dataset_keyword_table
keyword_data_source_type = dataset_keyword_table.data_source_type
if keyword_data_source_type == "database":
dataset_keyword_table.keyword_table = json.dumps(keyword_table_dict, cls=SetEncoder)
dataset_keyword_table.keyword_table = dumps_with_sets(keyword_table_dict)
db.session.commit()
else:
file_key = "keyword_files/" + self.dataset.tenant_id + "/" + self.dataset.id + ".txt"
if storage.exists(file_key):
storage.delete(file_key)
storage.save(file_key, json.dumps(keyword_table_dict, cls=SetEncoder).encode("utf-8"))
storage.save(file_key, dumps_with_sets(keyword_table_dict).encode("utf-8"))
def _get_dataset_keyword_table(self) -> Optional[dict]:
dataset_keyword_table = self.dataset.dataset_keyword_table
@ -157,12 +157,11 @@ class Jieba(BaseKeyword):
data_source_type=keyword_data_source_type,
)
if keyword_data_source_type == "database":
dataset_keyword_table.keyword_table = json.dumps(
dataset_keyword_table.keyword_table = dumps_with_sets(
{
"__type__": "keyword_table",
"__data__": {"index_id": self.dataset.id, "summary": None, "table": {}},
},
cls=SetEncoder,
}
)
db.session.add(dataset_keyword_table)
db.session.commit()
@ -257,8 +256,13 @@ class Jieba(BaseKeyword):
self._save_dataset_keyword_table(keyword_table)
class SetEncoder(json.JSONEncoder):
def default(self, obj):
if isinstance(obj, set):
return list(obj)
return super().default(obj)
def set_orjson_default(obj: Any) -> Any:
"""Default function for orjson serialization of set types"""
if isinstance(obj, set):
return list(obj)
raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")
def dumps_with_sets(obj: Any) -> str:
"""JSON dumps with set support using orjson"""
return orjson.dumps(obj, default=set_orjson_default).decode("utf-8")
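The `set_orjson_default` hook replaces the old `SetEncoder` class; the same default-hook pattern also works with the stdlib `json` module, sketched here (sorting is added only to make the output deterministic in this example):

```python
import json

def set_default(obj):
    # Stdlib counterpart of set_orjson_default above: sets become lists.
    if isinstance(obj, set):
        return sorted(obj)  # sorted only so this sketch prints deterministically
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

print(json.dumps({"keywords": {"b", "a"}}, default=set_default))
# → {"keywords": ["a", "b"]}
```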

View File

@ -1 +0,0 @@

View File

@ -108,10 +108,18 @@ class ApiProviderAuthType(Enum):
:param value: mode value
:return: mode
"""
# 'api_key' deprecated in PR #21656
# normalize and keep the legacy 'api_key' alias for backward compatibility
v = (value or "").strip().lower()
if v == "api_key":
v = cls.API_KEY_HEADER.value
for mode in cls:
if mode.value == value:
if mode.value == v:
return mode
raise ValueError(f"invalid mode value {value}")
valid = ", ".join(m.value for m in cls)
raise ValueError(f"invalid mode value '{value}', expected one of: {valid}")
class ToolInvokeMessage(BaseModel):
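The normalization above can be exercised with a minimal stand-in enum (member names here are illustrative, not the real `ApiProviderAuthType` values):

```python
from enum import Enum

class AuthType(Enum):
    # Illustrative members only; the real enum is ApiProviderAuthType.
    NONE = "none"
    API_KEY_HEADER = "api_key_header"

    @classmethod
    def value_of(cls, value):
        v = (value or "").strip().lower()
        if v == "api_key":  # legacy alias kept for backward compatibility
            v = cls.API_KEY_HEADER.value
        for mode in cls:
            if mode.value == v:
                return mode
        valid = ", ".join(m.value for m in cls)
        raise ValueError(f"invalid mode value '{value}', expected one of: {valid}")

# The deprecated 'api_key' spelling still resolves, as do padded/upper-case inputs.
assert AuthType.value_of(" API_KEY ") is AuthType.API_KEY_HEADER
```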

View File

@ -1,5 +1,7 @@
import json
from collections.abc import Iterable, Sequence
from typing import Any
import orjson
from .segment_group import SegmentGroup
from .segments import ArrayFileSegment, FileSegment, Segment
@ -12,15 +14,20 @@ def to_selector(node_id: str, name: str, paths: Iterable[str] = ()) -> Sequence[
return selectors
class SegmentJSONEncoder(json.JSONEncoder):
def default(self, o):
if isinstance(o, ArrayFileSegment):
return [v.model_dump() for v in o.value]
elif isinstance(o, FileSegment):
return o.value.model_dump()
elif isinstance(o, SegmentGroup):
return [self.default(seg) for seg in o.value]
elif isinstance(o, Segment):
return o.value
else:
super().default(o)
def segment_orjson_default(o: Any) -> Any:
"""Default function for orjson serialization of Segment types"""
if isinstance(o, ArrayFileSegment):
return [v.model_dump() for v in o.value]
elif isinstance(o, FileSegment):
return o.value.model_dump()
elif isinstance(o, SegmentGroup):
return [segment_orjson_default(seg) for seg in o.value]
elif isinstance(o, Segment):
return o.value
raise TypeError(f"Object of type {type(o).__name__} is not JSON serializable")
def dumps_with_segments(obj: Any, ensure_ascii: bool = False) -> str:
"""JSON dumps with segment support using orjson"""
option = orjson.OPT_NON_STR_KEYS
return orjson.dumps(obj, default=segment_orjson_default, option=option).decode("utf-8")
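The `OPT_NON_STR_KEYS` option matters because `orjson` rejects non-string dict keys by default, whereas the stdlib `json` module silently stringifies them:

```python
import json

# stdlib json coerces the int key to a string; orjson would raise a TypeError
# here unless OPT_NON_STR_KEYS is set, which is why dumps_with_segments sets it.
print(json.dumps({1: "a"}))  # → {"1": "a"}
```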

View File

@ -5,7 +5,7 @@ import logging
from collections.abc import Generator, Mapping, Sequence
from typing import TYPE_CHECKING, Any, Optional
from core.app.entities.app_invoke_entities import InvokeFrom, ModelConfigWithCredentialsEntity
from core.app.entities.app_invoke_entities import ModelConfigWithCredentialsEntity
from core.file import FileType, file_manager
from core.helper.code_executor import CodeExecutor, CodeLanguage
from core.llm_generator.output_parser.errors import OutputParserError
@ -194,17 +194,6 @@ class LLMNode(BaseNode):
else []
)
# single step run fetch file from sys files
if not files and self.invoke_from == InvokeFrom.DEBUGGER and not self.previous_node_id:
files = (
llm_utils.fetch_files(
variable_pool=variable_pool,
selector=["sys", "files"],
)
if self._node_data.vision.enabled
else []
)
if files:
node_inputs["#files#"] = [file.to_dict() for file in files]

View File

@ -318,33 +318,6 @@ class ToolNode(BaseNode):
json.append(message.message.json_object)
elif message.type == ToolInvokeMessage.MessageType.LINK:
assert isinstance(message.message, ToolInvokeMessage.TextMessage)
if message.meta:
transfer_method = message.meta.get("transfer_method", FileTransferMethod.TOOL_FILE)
else:
transfer_method = FileTransferMethod.TOOL_FILE
tool_file_id = message.message.text.split("/")[-1].split(".")[0]
with Session(db.engine) as session:
stmt = select(ToolFile).where(ToolFile.id == tool_file_id)
tool_file = session.scalar(stmt)
if tool_file is None:
raise ToolFileError(f"Tool file {tool_file_id} does not exist")
mapping = {
"tool_file_id": tool_file_id,
"type": file_factory.get_file_type_by_mime_type(tool_file.mimetype),
"transfer_method": transfer_method,
"url": message.message.text,
}
file = file_factory.build_from_mapping(
mapping=mapping,
tenant_id=self.tenant_id,
)
files.append(file)
stream_text = f"Link: {message.message.text}\n"
text += stream_text
yield RunStreamChunkEvent(chunk_content=stream_text, from_variable_selector=[node_id, "text"])

View File

@ -5,7 +5,7 @@ import click
from werkzeug.exceptions import NotFound
from core.indexing_runner import DocumentIsPausedError, IndexingRunner
from events.event_handlers.document_index_event import document_index_created
from events.document_index_event import document_index_created
from extensions.ext_database import db
from libs.datetime_utils import naive_utc_now
from models.dataset import Document

View File

@ -1,4 +1,6 @@
import ssl
from datetime import timedelta
from typing import Any, Optional
import pytz
from celery import Celery, Task # type: ignore
@ -8,6 +10,40 @@ from configs import dify_config
from dify_app import DifyApp
def _get_celery_ssl_options() -> Optional[dict[str, Any]]:
"""Get SSL configuration for Celery broker/backend connections."""
# Use REDIS_USE_SSL for consistency with the main Redis client
# Only apply SSL if we're using Redis as broker/backend
if not dify_config.REDIS_USE_SSL:
return None
# Check if Celery is actually using Redis
broker_is_redis = dify_config.CELERY_BROKER_URL and (
dify_config.CELERY_BROKER_URL.startswith("redis://") or dify_config.CELERY_BROKER_URL.startswith("rediss://")
)
if not broker_is_redis:
return None
# Map certificate requirement strings to SSL constants
cert_reqs_map = {
"CERT_NONE": ssl.CERT_NONE,
"CERT_OPTIONAL": ssl.CERT_OPTIONAL,
"CERT_REQUIRED": ssl.CERT_REQUIRED,
}
ssl_cert_reqs = cert_reqs_map.get(dify_config.REDIS_SSL_CERT_REQS, ssl.CERT_NONE)
ssl_options = {
"ssl_cert_reqs": ssl_cert_reqs,
"ssl_ca_certs": dify_config.REDIS_SSL_CA_CERTS,
"ssl_certfile": dify_config.REDIS_SSL_CERTFILE,
"ssl_keyfile": dify_config.REDIS_SSL_KEYFILE,
}
return ssl_options
def init_app(app: DifyApp) -> Celery:
class FlaskTask(Task):
def __call__(self, *args: object, **kwargs: object) -> object:
@ -33,14 +69,6 @@ def init_app(app: DifyApp) -> Celery:
task_ignore_result=True,
)
# Add SSL options to the Celery configuration
ssl_options = {
"ssl_cert_reqs": None,
"ssl_ca_certs": None,
"ssl_certfile": None,
"ssl_keyfile": None,
}
celery_app.conf.update(
result_backend=dify_config.CELERY_RESULT_BACKEND,
broker_transport_options=broker_transport_options,
@ -51,9 +79,13 @@ def init_app(app: DifyApp) -> Celery:
timezone=pytz.timezone(dify_config.LOG_TZ or "UTC"),
)
if dify_config.BROKER_USE_SSL:
# Apply SSL configuration if enabled
ssl_options = _get_celery_ssl_options()
if ssl_options:
celery_app.conf.update(
broker_use_ssl=ssl_options, # Add the SSL options to the broker configuration
broker_use_ssl=ssl_options,
# Also apply SSL to the backend if it's Redis
redis_backend_use_ssl=ssl_options if dify_config.CELERY_BACKEND == "redis" else None,
)
if dify_config.LOG_FILE:
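The broker-scheme check and cert-requirement mapping in `_get_celery_ssl_options` can be sketched in isolation (the URLs below are examples):

```python
import ssl

# Same mapping as _get_celery_ssl_options: config strings to ssl constants.
cert_reqs_map = {
    "CERT_NONE": ssl.CERT_NONE,
    "CERT_OPTIONAL": ssl.CERT_OPTIONAL,
    "CERT_REQUIRED": ssl.CERT_REQUIRED,
}

def broker_is_redis(url):
    # Same startswith test as above: plain and TLS Redis schemes both count.
    return bool(url) and (url.startswith("redis://") or url.startswith("rediss://"))

assert broker_is_redis("rediss://broker:6379/1")
assert not broker_is_redis("amqp://rabbit:5672//")
# Unknown strings fall back to CERT_NONE, matching the .get() default above.
assert cert_reqs_map.get("bogus", ssl.CERT_NONE) == ssl.CERT_NONE
```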

View File

@ -0,0 +1,8 @@
from flask_orjson import OrjsonProvider
from dify_app import DifyApp
def init_app(app: DifyApp) -> None:
"""Initialize Flask-Orjson extension for faster JSON serialization"""
app.json = OrjsonProvider(app)

View File

@ -1,5 +1,6 @@
import functools
import logging
import ssl
from collections.abc import Callable
from datetime import timedelta
from typing import TYPE_CHECKING, Any, Union
@ -116,76 +117,132 @@ class RedisClientWrapper:
redis_client: RedisClientWrapper = RedisClientWrapper()
def init_app(app: DifyApp):
global redis_client
connection_class: type[Union[Connection, SSLConnection]] = Connection
if dify_config.REDIS_USE_SSL:
connection_class = SSLConnection
resp_protocol = dify_config.REDIS_SERIALIZATION_PROTOCOL
if dify_config.REDIS_ENABLE_CLIENT_SIDE_CACHE:
if resp_protocol >= 3:
clientside_cache_config = CacheConfig()
else:
raise ValueError("Client side cache is only supported in RESP3")
else:
clientside_cache_config = None
def _get_ssl_configuration() -> tuple[type[Union[Connection, SSLConnection]], dict[str, Any]]:
"""Get SSL configuration for Redis connection."""
if not dify_config.REDIS_USE_SSL:
return Connection, {}
redis_params: dict[str, Any] = {
cert_reqs_map = {
"CERT_NONE": ssl.CERT_NONE,
"CERT_OPTIONAL": ssl.CERT_OPTIONAL,
"CERT_REQUIRED": ssl.CERT_REQUIRED,
}
ssl_cert_reqs = cert_reqs_map.get(dify_config.REDIS_SSL_CERT_REQS, ssl.CERT_NONE)
ssl_kwargs = {
"ssl_cert_reqs": ssl_cert_reqs,
"ssl_ca_certs": dify_config.REDIS_SSL_CA_CERTS,
"ssl_certfile": dify_config.REDIS_SSL_CERTFILE,
"ssl_keyfile": dify_config.REDIS_SSL_KEYFILE,
}
return SSLConnection, ssl_kwargs
def _get_cache_configuration() -> CacheConfig | None:
"""Get client-side cache configuration if enabled."""
if not dify_config.REDIS_ENABLE_CLIENT_SIDE_CACHE:
return None
resp_protocol = dify_config.REDIS_SERIALIZATION_PROTOCOL
if resp_protocol < 3:
raise ValueError("Client side cache is only supported in RESP3")
return CacheConfig()
def _get_base_redis_params() -> dict[str, Any]:
"""Get base Redis connection parameters."""
return {
"username": dify_config.REDIS_USERNAME,
"password": dify_config.REDIS_PASSWORD or None, # Temporary fix for empty password
"password": dify_config.REDIS_PASSWORD or None,
"db": dify_config.REDIS_DB,
"encoding": "utf-8",
"encoding_errors": "strict",
"decode_responses": False,
"protocol": resp_protocol,
"cache_config": clientside_cache_config,
"protocol": dify_config.REDIS_SERIALIZATION_PROTOCOL,
"cache_config": _get_cache_configuration(),
}
if dify_config.REDIS_USE_SENTINEL:
assert dify_config.REDIS_SENTINELS is not None, "REDIS_SENTINELS must be set when REDIS_USE_SENTINEL is True"
assert dify_config.REDIS_SENTINEL_SERVICE_NAME is not None, (
"REDIS_SENTINEL_SERVICE_NAME must be set when REDIS_USE_SENTINEL is True"
)
sentinel_hosts = [
(node.split(":")[0], int(node.split(":")[1])) for node in dify_config.REDIS_SENTINELS.split(",")
]
sentinel = Sentinel(
sentinel_hosts,
sentinel_kwargs={
"socket_timeout": dify_config.REDIS_SENTINEL_SOCKET_TIMEOUT,
"username": dify_config.REDIS_SENTINEL_USERNAME,
"password": dify_config.REDIS_SENTINEL_PASSWORD,
},
)
master = sentinel.master_for(dify_config.REDIS_SENTINEL_SERVICE_NAME, **redis_params)
redis_client.initialize(master)
elif dify_config.REDIS_USE_CLUSTERS:
assert dify_config.REDIS_CLUSTERS is not None, "REDIS_CLUSTERS must be set when REDIS_USE_CLUSTERS is True"
nodes = [
ClusterNode(host=node.split(":")[0], port=int(node.split(":")[1]))
for node in dify_config.REDIS_CLUSTERS.split(",")
]
redis_client.initialize(
RedisCluster(
startup_nodes=nodes,
password=dify_config.REDIS_CLUSTERS_PASSWORD,
protocol=resp_protocol,
cache_config=clientside_cache_config,
)
)
else:
redis_params.update(
{
"host": dify_config.REDIS_HOST,
"port": dify_config.REDIS_PORT,
"connection_class": connection_class,
"protocol": resp_protocol,
"cache_config": clientside_cache_config,
}
)
pool = redis.ConnectionPool(**redis_params)
redis_client.initialize(redis.Redis(connection_pool=pool))
def _create_sentinel_client(redis_params: dict[str, Any]) -> Union[redis.Redis, RedisCluster]:
"""Create Redis client using Sentinel configuration."""
if not dify_config.REDIS_SENTINELS:
raise ValueError("REDIS_SENTINELS must be set when REDIS_USE_SENTINEL is True")
if not dify_config.REDIS_SENTINEL_SERVICE_NAME:
raise ValueError("REDIS_SENTINEL_SERVICE_NAME must be set when REDIS_USE_SENTINEL is True")
sentinel_hosts = [(node.split(":")[0], int(node.split(":")[1])) for node in dify_config.REDIS_SENTINELS.split(",")]
sentinel = Sentinel(
sentinel_hosts,
sentinel_kwargs={
"socket_timeout": dify_config.REDIS_SENTINEL_SOCKET_TIMEOUT,
"username": dify_config.REDIS_SENTINEL_USERNAME,
"password": dify_config.REDIS_SENTINEL_PASSWORD,
},
)
master: redis.Redis = sentinel.master_for(dify_config.REDIS_SENTINEL_SERVICE_NAME, **redis_params)
return master
def _create_cluster_client() -> Union[redis.Redis, RedisCluster]:
"""Create Redis cluster client."""
if not dify_config.REDIS_CLUSTERS:
raise ValueError("REDIS_CLUSTERS must be set when REDIS_USE_CLUSTERS is True")
nodes = [
ClusterNode(host=node.split(":")[0], port=int(node.split(":")[1]))
for node in dify_config.REDIS_CLUSTERS.split(",")
]
cluster: RedisCluster = RedisCluster(
startup_nodes=nodes,
password=dify_config.REDIS_CLUSTERS_PASSWORD,
protocol=dify_config.REDIS_SERIALIZATION_PROTOCOL,
cache_config=_get_cache_configuration(),
)
return cluster
def _create_standalone_client(redis_params: dict[str, Any]) -> Union[redis.Redis, RedisCluster]:
"""Create standalone Redis client."""
connection_class, ssl_kwargs = _get_ssl_configuration()
redis_params.update(
{
"host": dify_config.REDIS_HOST,
"port": dify_config.REDIS_PORT,
"connection_class": connection_class,
}
)
if ssl_kwargs:
redis_params.update(ssl_kwargs)
pool = redis.ConnectionPool(**redis_params)
client: redis.Redis = redis.Redis(connection_pool=pool)
return client
def init_app(app: DifyApp):
"""Initialize Redis client and attach it to the app."""
global redis_client
# Determine Redis mode and create appropriate client
if dify_config.REDIS_USE_SENTINEL:
redis_params = _get_base_redis_params()
client = _create_sentinel_client(redis_params)
elif dify_config.REDIS_USE_CLUSTERS:
client = _create_cluster_client()
else:
redis_params = _get_base_redis_params()
client = _create_standalone_client(redis_params)
# Initialize the wrapper and attach to app
redis_client.initialize(client)
app.extensions["redis"] = redis_client
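`_create_sentinel_client` and `_create_cluster_client` both split a comma-separated `host:port` list; that parsing can be sketched on its own:

```python
def parse_nodes(spec: str) -> list[tuple[str, int]]:
    # Mirrors the REDIS_SENTINELS / REDIS_CLUSTERS parsing above.
    return [(node.split(":")[0], int(node.split(":")[1])) for node in spec.split(",")]

assert parse_nodes("10.0.0.1:26379,10.0.0.2:26379") == [("10.0.0.1", 26379), ("10.0.0.2", 26379)]
```

Note that a bare `split(":")` would mis-handle IPv6 literals; the config format here assumes IPv4 addresses or hostnames.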

View File

@ -1184,7 +1184,7 @@ class WorkflowDraftVariable(Base):
value: The Segment object to store as the variable's value.
"""
self.__value = value
self.value = json.dumps(value, cls=variable_utils.SegmentJSONEncoder)
self.value = variable_utils.dumps_with_segments(value)
self.value_type = value.value_type
def get_node_id(self) -> str | None:

View File

@ -5,8 +5,7 @@ check_untyped_defs = True
cache_fine_grained = True
sqlite_cache = True
exclude = (?x)(
core/model_runtime/model_providers/
| tests/
tests/
| migrations/
)

View File

@ -18,6 +18,7 @@ dependencies = [
"flask-cors~=6.0.0",
"flask-login~=0.6.3",
"flask-migrate~=4.0.7",
"flask-orjson~=2.0.0",
"flask-restful~=0.3.10",
"flask-sqlalchemy~=3.1.1",
"gevent~=24.11.1",

View File

@ -103,10 +103,10 @@ class ConversationService:
@classmethod
def _build_filter_condition(cls, sort_field: str, sort_direction: Callable, reference_conversation: Conversation):
field_value = getattr(reference_conversation, sort_field)
if sort_direction == desc:
if sort_direction is desc:
return getattr(Conversation, sort_field) < field_value
else:
return getattr(Conversation, sort_field) > field_value
return getattr(Conversation, sort_field) > field_value
@classmethod
def rename(
@ -147,7 +147,7 @@ class ConversationService:
app_model.tenant_id, message.query, conversation.id, app_model.id
)
conversation.name = name
except:
except Exception:
pass
db.session.commit()
@ -277,6 +277,11 @@ class ConversationService:
# Validate that the new value type matches the expected variable type
expected_type = SegmentType(current_variable.value_type)
# The web UI shows 'number' while the DB stores 'integer'
if expected_type == SegmentType.INTEGER:
expected_type = SegmentType.NUMBER
if not expected_type.is_valid(new_value):
inferred_type = SegmentType.infer_segment_type(new_value)
raise ConversationVariableTypeMismatchError(

View File

@ -55,7 +55,9 @@ class OAuthProxyService(BasePluginClient):
if not context_id:
raise ValueError("context_id is required")
# get data from redis
data = redis_client.getdel(f"{OAuthProxyService.__KEY_PREFIX__}{context_id}")
key = f"{OAuthProxyService.__KEY_PREFIX__}{context_id}"
data = redis_client.get(key)
if not data:
raise ValueError("context_id is invalid")
redis_client.delete(key)
return json.loads(data)
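The change above replaces Redis `GETDEL` (available from Redis 6.2) with separate `GET` and `DELETE` calls, which also run on older servers at the cost of a small race window between the two commands. A dict-backed sketch of the read-then-delete shape (key name is illustrative):

```python
store = {"oauth_proxy:ctx-1": '{"state": "abc"}'}  # stand-in for Redis

def pop_context(key: str) -> str:
    data = store.get(key)
    if not data:
        raise ValueError("context_id is invalid")
    store.pop(key, None)  # delete only after a successful read
    return data

assert pop_context("oauth_proxy:ctx-1") == '{"state": "abc"}'
assert "oauth_proxy:ctx-1" not in store
```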

File diff suppressed because it is too large

View File

@ -0,0 +1,474 @@
from unittest.mock import MagicMock, patch
import pytest
from faker import Faker
from models.account import TenantAccountJoin, TenantAccountRole
from models.model import Account, Tenant
from models.provider import LoadBalancingModelConfig, Provider, ProviderModelSetting
from services.model_load_balancing_service import ModelLoadBalancingService
class TestModelLoadBalancingService:
"""Integration tests for ModelLoadBalancingService using testcontainers."""
@pytest.fixture
def mock_external_service_dependencies(self):
"""Mock setup for external service dependencies."""
with (
patch("services.model_load_balancing_service.ProviderManager") as mock_provider_manager,
patch("services.model_load_balancing_service.LBModelManager") as mock_lb_model_manager,
patch("services.model_load_balancing_service.ModelProviderFactory") as mock_model_provider_factory,
patch("services.model_load_balancing_service.encrypter") as mock_encrypter,
):
# Setup default mock returns
mock_provider_manager_instance = mock_provider_manager.return_value
# Mock provider configuration
mock_provider_config = MagicMock()
mock_provider_config.provider.provider = "openai"
mock_provider_config.custom_configuration.provider = None
# Mock provider model setting
mock_provider_model_setting = MagicMock()
mock_provider_model_setting.load_balancing_enabled = False
mock_provider_config.get_provider_model_setting.return_value = mock_provider_model_setting
# Mock provider configurations dict
mock_provider_configs = {"openai": mock_provider_config}
mock_provider_manager_instance.get_configurations.return_value = mock_provider_configs
# Mock LBModelManager
mock_lb_model_manager.get_config_in_cooldown_and_ttl.return_value = (False, 0)
# Mock ModelProviderFactory
mock_model_provider_factory_instance = mock_model_provider_factory.return_value
# Mock credential schemas
mock_credential_schema = MagicMock()
mock_credential_schema.credential_form_schemas = []
# Mock provider configuration methods
mock_provider_config.extract_secret_variables.return_value = []
mock_provider_config.obfuscated_credentials.return_value = {}
mock_provider_config._get_credential_schema.return_value = mock_credential_schema
yield {
"provider_manager": mock_provider_manager,
"lb_model_manager": mock_lb_model_manager,
"model_provider_factory": mock_model_provider_factory,
"encrypter": mock_encrypter,
"provider_config": mock_provider_config,
"provider_model_setting": mock_provider_model_setting,
"credential_schema": mock_credential_schema,
}
def _create_test_account_and_tenant(self, db_session_with_containers, mock_external_service_dependencies):
"""
Helper method to create a test account and tenant for testing.
Args:
db_session_with_containers: Database session from testcontainers infrastructure
mock_external_service_dependencies: Mock dependencies
Returns:
tuple: (account, tenant) - Created account and tenant instances
"""
fake = Faker()
# Create account
account = Account(
email=fake.email(),
name=fake.name(),
interface_language="en-US",
status="active",
)
from extensions.ext_database import db
db.session.add(account)
db.session.commit()
# Create tenant for the account
tenant = Tenant(
name=fake.company(),
status="normal",
)
db.session.add(tenant)
db.session.commit()
# Create tenant-account join
join = TenantAccountJoin(
tenant_id=tenant.id,
account_id=account.id,
role=TenantAccountRole.OWNER.value,
current=True,
)
db.session.add(join)
db.session.commit()
# Set current tenant for account
account.current_tenant = tenant
return account, tenant
def _create_test_provider_and_setting(
self, db_session_with_containers, tenant_id, mock_external_service_dependencies
):
"""
Helper method to create a test provider and provider model setting.
Args:
db_session_with_containers: Database session from testcontainers infrastructure
tenant_id: Tenant ID for the provider
mock_external_service_dependencies: Mock dependencies
Returns:
tuple: (provider, provider_model_setting) - Created provider and setting instances
"""
fake = Faker()
from extensions.ext_database import db
# Create provider
provider = Provider(
tenant_id=tenant_id,
provider_name="openai",
provider_type="custom",
is_valid=True,
)
db.session.add(provider)
db.session.commit()
# Create provider model setting
provider_model_setting = ProviderModelSetting(
tenant_id=tenant_id,
provider_name="openai",
model_name="gpt-3.5-turbo",
model_type="text-generation",  # Use the original model type that matches the query
enabled=True,
load_balancing_enabled=False,
)
db.session.add(provider_model_setting)
db.session.commit()
return provider, provider_model_setting
def test_enable_model_load_balancing_success(self, db_session_with_containers, mock_external_service_dependencies):
"""
Test successful model load balancing enablement.
This test verifies:
- Proper provider configuration retrieval
- Successful enablement of model load balancing
- Correct method calls to provider configuration
"""
# Arrange: Create test data
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
provider, provider_model_setting = self._create_test_provider_and_setting(
db_session_with_containers, tenant.id, mock_external_service_dependencies
)
# Setup mocks for enable method
mock_provider_config = mock_external_service_dependencies["provider_config"]
mock_provider_config.enable_model_load_balancing = MagicMock()
# Act: Execute the method under test
service = ModelLoadBalancingService()
service.enable_model_load_balancing(
tenant_id=tenant.id, provider="openai", model="gpt-3.5-turbo", model_type="llm"
)
# Assert: Verify the expected outcomes
mock_provider_config.enable_model_load_balancing.assert_called_once()
call_args = mock_provider_config.enable_model_load_balancing.call_args
assert call_args.kwargs["model"] == "gpt-3.5-turbo"
assert call_args.kwargs["model_type"].value == "llm" # ModelType enum value
# Verify database state
from extensions.ext_database import db
db.session.refresh(provider)
db.session.refresh(provider_model_setting)
assert provider.id is not None
assert provider_model_setting.id is not None
def test_disable_model_load_balancing_success(self, db_session_with_containers, mock_external_service_dependencies):
"""
Test successful model load balancing disablement.
This test verifies:
- Proper provider configuration retrieval
- Successful disablement of model load balancing
- Correct method calls to provider configuration
"""
# Arrange: Create test data
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
provider, provider_model_setting = self._create_test_provider_and_setting(
db_session_with_containers, tenant.id, mock_external_service_dependencies
)
# Setup mocks for disable method
mock_provider_config = mock_external_service_dependencies["provider_config"]
mock_provider_config.disable_model_load_balancing = MagicMock()
# Act: Execute the method under test
service = ModelLoadBalancingService()
service.disable_model_load_balancing(
tenant_id=tenant.id, provider="openai", model="gpt-3.5-turbo", model_type="llm"
)
# Assert: Verify the expected outcomes
mock_provider_config.disable_model_load_balancing.assert_called_once()
call_args = mock_provider_config.disable_model_load_balancing.call_args
assert call_args.kwargs["model"] == "gpt-3.5-turbo"
assert call_args.kwargs["model_type"].value == "llm" # ModelType enum value
# Verify database state
from extensions.ext_database import db
db.session.refresh(provider)
db.session.refresh(provider_model_setting)
assert provider.id is not None
assert provider_model_setting.id is not None
def test_enable_model_load_balancing_provider_not_found(
self, db_session_with_containers, mock_external_service_dependencies
):
"""
Test error handling when provider does not exist.
This test verifies:
- Proper error handling for non-existent provider
- Correct exception type and message
- No database state changes
"""
# Arrange: Create test data
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
# Setup mocks to return empty provider configurations
mock_provider_manager = mock_external_service_dependencies["provider_manager"]
mock_provider_manager_instance = mock_provider_manager.return_value
mock_provider_manager_instance.get_configurations.return_value = {}
# Act & Assert: Verify proper error handling
service = ModelLoadBalancingService()
with pytest.raises(ValueError) as exc_info:
service.enable_model_load_balancing(
tenant_id=tenant.id, provider="nonexistent_provider", model="gpt-3.5-turbo", model_type="llm"
)
# Verify correct error message
assert "Provider nonexistent_provider does not exist." in str(exc_info.value)
# Verify no database state changes occurred
from extensions.ext_database import db
db.session.rollback()
def test_get_load_balancing_configs_success(self, db_session_with_containers, mock_external_service_dependencies):
"""
Test successful retrieval of load balancing configurations.
This test verifies:
- Proper provider configuration retrieval
- Successful database query for load balancing configs
- Correct return format and data structure
"""
# Arrange: Create test data
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
provider, provider_model_setting = self._create_test_provider_and_setting(
db_session_with_containers, tenant.id, mock_external_service_dependencies
)
# Create load balancing config
from extensions.ext_database import db
load_balancing_config = LoadBalancingModelConfig(
tenant_id=tenant.id,
provider_name="openai",
model_name="gpt-3.5-turbo",
model_type="text-generation",  # Use the original model type that matches the query
name="config1",
encrypted_config='{"api_key": "test_key"}',
enabled=True,
)
db.session.add(load_balancing_config)
db.session.commit()
# Verify the config was created
db.session.refresh(load_balancing_config)
assert load_balancing_config.id is not None
# Setup mocks for get_load_balancing_configs method
mock_provider_config = mock_external_service_dependencies["provider_config"]
mock_provider_model_setting = mock_external_service_dependencies["provider_model_setting"]
mock_provider_model_setting.load_balancing_enabled = True
# Mock credential schema methods
mock_credential_schema = mock_external_service_dependencies["credential_schema"]
mock_credential_schema.credential_form_schemas = []
# Mock encrypter
mock_encrypter = mock_external_service_dependencies["encrypter"]
mock_encrypter.get_decrypt_decoding.return_value = ("key", "cipher")
# Mock _get_credential_schema method
mock_provider_config._get_credential_schema.return_value = mock_credential_schema
# Mock extract_secret_variables method
mock_provider_config.extract_secret_variables.return_value = []
# Mock obfuscated_credentials method
mock_provider_config.obfuscated_credentials.return_value = {}
# Mock LBModelManager.get_config_in_cooldown_and_ttl
mock_lb_model_manager = mock_external_service_dependencies["lb_model_manager"]
mock_lb_model_manager.get_config_in_cooldown_and_ttl.return_value = (False, 0)
# Act: Execute the method under test
service = ModelLoadBalancingService()
is_enabled, configs = service.get_load_balancing_configs(
tenant_id=tenant.id, provider="openai", model="gpt-3.5-turbo", model_type="llm"
)
# Assert: Verify the expected outcomes
assert is_enabled is True
assert len(configs) == 1
assert configs[0]["id"] == load_balancing_config.id
assert configs[0]["name"] == "config1"
assert configs[0]["enabled"] is True
assert configs[0]["in_cooldown"] is False
assert configs[0]["ttl"] == 0
# Verify database state
db.session.refresh(load_balancing_config)
assert load_balancing_config.id is not None
def test_get_load_balancing_configs_provider_not_found(
self, db_session_with_containers, mock_external_service_dependencies
):
"""
Test error handling when provider does not exist in get_load_balancing_configs.
This test verifies:
- Proper error handling for non-existent provider
- Correct exception type and message
- No database state changes
"""
# Arrange: Create test data
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
# Setup mocks to return empty provider configurations
mock_provider_manager = mock_external_service_dependencies["provider_manager"]
mock_provider_manager_instance = mock_provider_manager.return_value
mock_provider_manager_instance.get_configurations.return_value = {}
# Act & Assert: Verify proper error handling
service = ModelLoadBalancingService()
with pytest.raises(ValueError) as exc_info:
service.get_load_balancing_configs(
tenant_id=tenant.id, provider="nonexistent_provider", model="gpt-3.5-turbo", model_type="llm"
)
# Verify correct error message
assert "Provider nonexistent_provider does not exist." in str(exc_info.value)
# Verify no database state changes occurred
from extensions.ext_database import db
db.session.rollback()
def test_get_load_balancing_configs_with_inherit_config(
self, db_session_with_containers, mock_external_service_dependencies
):
"""
Test load balancing configs retrieval with inherit configuration.
This test verifies:
- Proper handling of inherit configuration
- Correct ordering of configurations
- Inherit config initialization when needed
"""
# Arrange: Create test data
fake = Faker()
account, tenant = self._create_test_account_and_tenant(
db_session_with_containers, mock_external_service_dependencies
)
provider, provider_model_setting = self._create_test_provider_and_setting(
db_session_with_containers, tenant.id, mock_external_service_dependencies
)
# Create load balancing config
from extensions.ext_database import db
load_balancing_config = LoadBalancingModelConfig(
tenant_id=tenant.id,
provider_name="openai",
model_name="gpt-3.5-turbo",
model_type="text-generation",  # Use the original model type that matches the query
name="config1",
encrypted_config='{"api_key": "test_key"}',
enabled=True,
)
db.session.add(load_balancing_config)
db.session.commit()
# Setup mocks for inherit config scenario
mock_provider_config = mock_external_service_dependencies["provider_config"]
mock_provider_config.custom_configuration.provider = MagicMock() # Enable custom config
mock_provider_model_setting = mock_external_service_dependencies["provider_model_setting"]
mock_provider_model_setting.load_balancing_enabled = True
# Mock credential schema methods
mock_credential_schema = mock_external_service_dependencies["credential_schema"]
mock_credential_schema.credential_form_schemas = []
# Mock encrypter
mock_encrypter = mock_external_service_dependencies["encrypter"]
mock_encrypter.get_decrypt_decoding.return_value = ("key", "cipher")
# Act: Execute the method under test
service = ModelLoadBalancingService()
is_enabled, configs = service.get_load_balancing_configs(
tenant_id=tenant.id, provider="openai", model="gpt-3.5-turbo", model_type="llm"
)
# Assert: Verify the expected outcomes
assert is_enabled is True
assert len(configs) == 2 # inherit config + existing config
# First config should be inherit config
assert configs[0]["name"] == "__inherit__"
assert configs[0]["enabled"] is True
# Second config should be the existing config
assert configs[1]["id"] == load_balancing_config.id
assert configs[1]["name"] == "config1"
# Verify database state
db.session.refresh(load_balancing_config)
assert load_balancing_config.id is not None
# Verify inherit config was created in database
inherit_configs = (
db.session.query(LoadBalancingModelConfig).filter(LoadBalancingModelConfig.name == "__inherit__").all()
)
assert len(inherit_configs) == 1

View File

@ -262,26 +262,6 @@ def test_sse_client_queue_cleanup():
# Note: In the real implementation, cleanup should put None to signal shutdown
def test_sse_client_url_processing():
"""Test SSE client URL processing functions."""
from core.mcp.client.sse_client import remove_request_params
# Test URL with parameters
url_with_params = "http://example.com/sse?param1=value1&param2=value2"
cleaned_url = remove_request_params(url_with_params)
assert cleaned_url == "http://example.com/sse"
# Test URL without parameters
url_without_params = "http://example.com/sse"
cleaned_url = remove_request_params(url_without_params)
assert cleaned_url == "http://example.com/sse"
# Test URL with path and parameters
complex_url = "http://example.com/path/to/sse?session=123&token=abc"
cleaned_url = remove_request_params(complex_url)
assert cleaned_url == "http://example.com/path/to/sse"
def test_sse_client_headers_propagation():
"""Test that custom headers are properly propagated in SSE client."""
test_url = "http://test.example/sse"

View File

@ -0,0 +1,481 @@
import json
from datetime import date, datetime
from decimal import Decimal
from uuid import uuid4
import numpy as np
import pytest
import pytz
from core.tools.entities.tool_entities import ToolInvokeMessage
from core.tools.utils.message_transformer import ToolFileMessageTransformer, safe_json_dict, safe_json_value
class TestSafeJsonValue:
"""Test suite for safe_json_value function to ensure proper serialization of complex types"""
def test_datetime_conversion(self):
"""Test datetime conversion with timezone handling"""
# Test datetime with UTC timezone
dt = datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC)
result = safe_json_value(dt)
assert isinstance(result, str)
assert "2024-01-01T12:00:00+00:00" in result
# Test datetime without timezone (should default to UTC)
dt_no_tz = datetime(2024, 1, 1, 12, 0, 0)
result = safe_json_value(dt_no_tz)
assert isinstance(result, str)
# The exact time will depend on the system's timezone, so we check the format
assert "T" in result # ISO format separator
# Check that it's a valid ISO format datetime string
assert len(result) >= 19 # At least YYYY-MM-DDTHH:MM:SS
def test_date_conversion(self):
"""Test date conversion to ISO format"""
test_date = date(2024, 1, 1)
result = safe_json_value(test_date)
assert result == "2024-01-01"
def test_uuid_conversion(self):
"""Test UUID conversion to string"""
test_uuid = uuid4()
result = safe_json_value(test_uuid)
assert isinstance(result, str)
assert result == str(test_uuid)
def test_decimal_conversion(self):
"""Test Decimal conversion to float"""
test_decimal = Decimal("123.456")
result = safe_json_value(test_decimal)
assert result == 123.456
assert isinstance(result, float)
def test_bytes_conversion(self):
"""Test bytes conversion with UTF-8 decoding"""
# Test valid UTF-8 bytes
test_bytes = b"Hello, World!"
result = safe_json_value(test_bytes)
assert result == "Hello, World!"
# Test invalid UTF-8 bytes (should fall back to hex)
invalid_bytes = b"\xff\xfe\xfd"
result = safe_json_value(invalid_bytes)
assert result == "fffefd"
def test_memoryview_conversion(self):
"""Test memoryview conversion to hex string"""
test_bytes = b"test data"
test_memoryview = memoryview(test_bytes)
result = safe_json_value(test_memoryview)
assert result == "746573742064617461" # hex of "test data"
def test_numpy_ndarray_conversion(self):
"""Test numpy ndarray conversion to list"""
# Test 1D array
test_array = np.array([1, 2, 3, 4])
result = safe_json_value(test_array)
assert result == [1, 2, 3, 4]
# Test 2D array
test_2d_array = np.array([[1, 2], [3, 4]])
result = safe_json_value(test_2d_array)
assert result == [[1, 2], [3, 4]]
# Test array with float values
test_float_array = np.array([1.5, 2.7, 3.14])
result = safe_json_value(test_float_array)
assert result == [1.5, 2.7, 3.14]
def test_dict_conversion(self):
"""Test dictionary conversion using safe_json_dict"""
test_dict = {
"string": "value",
"number": 42,
"float": 3.14,
"boolean": True,
"list": [1, 2, 3],
"nested": {"key": "value"},
}
result = safe_json_value(test_dict)
assert isinstance(result, dict)
assert result == test_dict
def test_list_conversion(self):
"""Test list conversion with mixed types"""
test_list = [
"string",
42,
3.14,
True,
[1, 2, 3],
{"key": "value"},
datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC),
Decimal("123.456"),
uuid4(),
]
result = safe_json_value(test_list)
assert isinstance(result, list)
assert len(result) == len(test_list)
assert isinstance(result[6], str) # datetime should be converted to string
assert isinstance(result[7], float) # Decimal should be converted to float
assert isinstance(result[8], str) # UUID should be converted to string
def test_tuple_conversion(self):
"""Test tuple conversion to list"""
test_tuple = (1, "string", 3.14)
result = safe_json_value(test_tuple)
assert isinstance(result, list)
assert result == [1, "string", 3.14]
def test_set_conversion(self):
"""Test set conversion to list"""
test_set = {1, "string", 3.14}
result = safe_json_value(test_set)
assert isinstance(result, list)
# Note: set order is not guaranteed, so we check length and content
assert len(result) == 3
assert 1 in result
assert "string" in result
assert 3.14 in result
def test_basic_types_passthrough(self):
"""Test that basic types are passed through unchanged"""
assert safe_json_value("string") == "string"
assert safe_json_value(42) == 42
assert safe_json_value(3.14) == 3.14
assert safe_json_value(True) is True
assert safe_json_value(False) is False
assert safe_json_value(None) is None
def test_nested_complex_structure(self):
"""Test complex nested structure with all types"""
complex_data = {
"dates": [date(2024, 1, 1), date(2024, 1, 2)],
"timestamps": [
datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC),
datetime(2024, 1, 2, 12, 0, 0, tzinfo=pytz.UTC),
],
"numbers": [Decimal("123.456"), Decimal("789.012")],
"identifiers": [uuid4(), uuid4()],
"binary_data": [b"hello", b"world"],
"arrays": [np.array([1, 2, 3]), np.array([4, 5, 6])],
}
result = safe_json_value(complex_data)
# Verify structure is maintained
assert isinstance(result, dict)
assert "dates" in result
assert "timestamps" in result
assert "numbers" in result
assert "identifiers" in result
assert "binary_data" in result
assert "arrays" in result
# Verify conversions
assert all(isinstance(d, str) for d in result["dates"])
assert all(isinstance(t, str) for t in result["timestamps"])
assert all(isinstance(n, float) for n in result["numbers"])
assert all(isinstance(i, str) for i in result["identifiers"])
assert all(isinstance(b, str) for b in result["binary_data"])
assert all(isinstance(a, list) for a in result["arrays"])
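Taken together, these assertions imply a recursive converter roughly like the following. This is a hedged sketch, not the actual implementation: the real `safe_json_value` in `core.tools.utils.message_transformer` also converts `numpy.ndarray` via `.tolist()`, which is elided here to keep the sketch stdlib-only.

```python
from datetime import date, datetime, timezone
from decimal import Decimal
from uuid import UUID

def safe_json_value(value):
    # Order matters: datetime is a subclass of date, so check it first.
    if isinstance(value, datetime):
        if value.tzinfo is None:
            # Assumed default; the tests only check the ISO-format shape here.
            value = value.replace(tzinfo=timezone.utc)
        return value.isoformat()
    if isinstance(value, date):
        return value.isoformat()
    if isinstance(value, UUID):
        return str(value)
    if isinstance(value, Decimal):
        return float(value)
    if isinstance(value, bytes):
        try:
            return value.decode("utf-8")
        except UnicodeDecodeError:
            return value.hex()  # fall back to hex for non-UTF-8 payloads
    if isinstance(value, memoryview):
        return value.tobytes().hex()
    if isinstance(value, dict):
        return {k: safe_json_value(v) for k, v in value.items()}
    if isinstance(value, (list, tuple, set)):
        return [safe_json_value(v) for v in value]
    return value  # str, int, float, bool, None pass through unchanged
```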
class TestSafeJsonDict:
"""Test suite for safe_json_dict function"""
def test_valid_dict_conversion(self):
"""Test valid dictionary conversion"""
test_dict = {
"string": "value",
"number": 42,
"datetime": datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC),
"decimal": Decimal("123.456"),
}
result = safe_json_dict(test_dict)
assert isinstance(result, dict)
assert result["string"] == "value"
assert result["number"] == 42
assert isinstance(result["datetime"], str)
assert isinstance(result["decimal"], float)
def test_invalid_input_type(self):
"""Test that invalid input types raise TypeError"""
with pytest.raises(TypeError, match="safe_json_dict\\(\\) expects a dictionary \\(dict\\) as input"):
safe_json_dict("not a dict")
with pytest.raises(TypeError, match="safe_json_dict\\(\\) expects a dictionary \\(dict\\) as input"):
safe_json_dict([1, 2, 3])
with pytest.raises(TypeError, match="safe_json_dict\\(\\) expects a dictionary \\(dict\\) as input"):
safe_json_dict(42)
def test_empty_dict(self):
"""Test empty dictionary handling"""
result = safe_json_dict({})
assert result == {}
def test_nested_dict_conversion(self):
"""Test nested dictionary conversion"""
test_dict = {
"level1": {
"level2": {"datetime": datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC), "decimal": Decimal("123.456")}
}
}
result = safe_json_dict(test_dict)
assert isinstance(result["level1"]["level2"]["datetime"], str)
assert isinstance(result["level1"]["level2"]["decimal"], float)
class TestToolInvokeMessageJsonSerialization:
"""Test suite for ToolInvokeMessage JSON serialization through safe_json_value"""
def test_json_message_serialization(self):
"""Test JSON message serialization with complex data"""
complex_data = {
"timestamp": datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC),
"amount": Decimal("123.45"),
"id": uuid4(),
"binary": b"test data",
"array": np.array([1, 2, 3]),
}
# Create JSON message
json_message = ToolInvokeMessage.JsonMessage(json_object=complex_data)
message = ToolInvokeMessage(type=ToolInvokeMessage.MessageType.JSON, message=json_message)
# Apply safe_json_value transformation
transformed_data = safe_json_value(message.message.json_object)
# Verify transformations
assert isinstance(transformed_data["timestamp"], str)
assert isinstance(transformed_data["amount"], float)
assert isinstance(transformed_data["id"], str)
assert isinstance(transformed_data["binary"], str)
assert isinstance(transformed_data["array"], list)
# Verify JSON serialization works
json_string = json.dumps(transformed_data, ensure_ascii=False)
assert isinstance(json_string, str)
# Verify we can deserialize back
deserialized = json.loads(json_string)
assert deserialized["amount"] == 123.45
assert deserialized["array"] == [1, 2, 3]
def test_json_message_with_nested_structures(self):
"""Test JSON message with deeply nested complex structures"""
nested_data = {
"level1": {
"level2": {
"level3": {
"dates": [date(2024, 1, 1), date(2024, 1, 2)],
"timestamps": [datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC)],
"numbers": [Decimal("1.1"), Decimal("2.2")],
"arrays": [np.array([1, 2]), np.array([3, 4])],
}
}
}
}
json_message = ToolInvokeMessage.JsonMessage(json_object=nested_data)
message = ToolInvokeMessage(type=ToolInvokeMessage.MessageType.JSON, message=json_message)
# Transform the data
transformed_data = safe_json_value(message.message.json_object)
# Verify nested transformations
level3 = transformed_data["level1"]["level2"]["level3"]
assert all(isinstance(d, str) for d in level3["dates"])
assert all(isinstance(t, str) for t in level3["timestamps"])
assert all(isinstance(n, float) for n in level3["numbers"])
assert all(isinstance(a, list) for a in level3["arrays"])
# Test JSON serialization
json_string = json.dumps(transformed_data, ensure_ascii=False)
assert isinstance(json_string, str)
# Verify deserialization
deserialized = json.loads(json_string)
assert deserialized["level1"]["level2"]["level3"]["numbers"] == [1.1, 2.2]
def test_json_message_transformer_integration(self):
"""Test integration with ToolFileMessageTransformer for JSON messages"""
complex_data = {
"metadata": {
"created_at": datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC),
"version": Decimal("1.0"),
"tags": ["tag1", "tag2"],
},
"data": {"values": np.array([1.1, 2.2, 3.3]), "binary": b"binary content"},
}
# Create message generator
def message_generator():
json_message = ToolInvokeMessage.JsonMessage(json_object=complex_data)
message = ToolInvokeMessage(type=ToolInvokeMessage.MessageType.JSON, message=json_message)
yield message
# Transform messages
transformed_messages = list(
ToolFileMessageTransformer.transform_tool_invoke_messages(
message_generator(), user_id="test_user", tenant_id="test_tenant"
)
)
assert len(transformed_messages) == 1
transformed_message = transformed_messages[0]
assert transformed_message.type == ToolInvokeMessage.MessageType.JSON
# Verify the JSON object was transformed
json_obj = transformed_message.message.json_object
assert isinstance(json_obj["metadata"]["created_at"], str)
assert isinstance(json_obj["metadata"]["version"], float)
assert isinstance(json_obj["data"]["values"], list)
assert isinstance(json_obj["data"]["binary"], str)
# Test final JSON serialization
final_json = json.dumps(json_obj, ensure_ascii=False)
assert isinstance(final_json, str)
# Verify we can deserialize
deserialized = json.loads(final_json)
assert deserialized["metadata"]["version"] == 1.0
assert deserialized["data"]["values"] == [1.1, 2.2, 3.3]
def test_edge_cases_and_error_handling(self):
"""Test edge cases and error handling in JSON serialization"""
# Test with None values
data_with_none = {"null_value": None, "empty_string": "", "zero": 0, "false_value": False}
json_message = ToolInvokeMessage.JsonMessage(json_object=data_with_none)
message = ToolInvokeMessage(type=ToolInvokeMessage.MessageType.JSON, message=json_message)
transformed_data = safe_json_value(message.message.json_object)
json_string = json.dumps(transformed_data, ensure_ascii=False)
# Verify serialization works with edge cases
assert json_string is not None
deserialized = json.loads(json_string)
assert deserialized["null_value"] is None
assert deserialized["empty_string"] == ""
assert deserialized["zero"] == 0
assert deserialized["false_value"] is False
# Test with very large numbers
large_data = {
"large_int": 2**63 - 1,
"large_float": 1.7976931348623157e308,
"small_float": 2.2250738585072014e-308,
}
json_message = ToolInvokeMessage.JsonMessage(json_object=large_data)
message = ToolInvokeMessage(type=ToolInvokeMessage.MessageType.JSON, message=json_message)
transformed_data = safe_json_value(message.message.json_object)
json_string = json.dumps(transformed_data, ensure_ascii=False)
# Verify large numbers are handled correctly
deserialized = json.loads(json_string)
assert deserialized["large_int"] == 2**63 - 1
assert deserialized["large_float"] == 1.7976931348623157e308
assert deserialized["small_float"] == 2.2250738585072014e-308
class TestEndToEndSerialization:
"""Test suite for end-to-end serialization workflow"""
def test_complete_workflow_with_real_data(self):
"""Test complete workflow from complex data to JSON string and back"""
# Simulate real-world complex data structure
real_world_data = {
"user_profile": {
"id": uuid4(),
"name": "John Doe",
"email": "john@example.com",
"created_at": datetime(2024, 1, 1, 12, 0, 0, tzinfo=pytz.UTC),
"last_login": datetime(2024, 1, 15, 14, 30, 0, tzinfo=pytz.UTC),
"preferences": {"theme": "dark", "language": "en", "timezone": "UTC"},
},
"analytics": {
"session_count": 42,
"total_time": Decimal("123.45"),
"metrics": np.array([1.1, 2.2, 3.3, 4.4, 5.5]),
"events": [
{
"timestamp": datetime(2024, 1, 1, 10, 0, 0, tzinfo=pytz.UTC),
"action": "login",
"duration": Decimal("5.67"),
},
{
"timestamp": datetime(2024, 1, 1, 11, 0, 0, tzinfo=pytz.UTC),
"action": "logout",
"duration": Decimal("3600.0"),
},
],
},
"files": [
{
"id": uuid4(),
"name": "document.pdf",
"size": 1024,
"uploaded_at": datetime(2024, 1, 1, 9, 0, 0, tzinfo=pytz.UTC),
"checksum": b"abc123def456",
}
],
}
# Step 1: Create ToolInvokeMessage
json_message = ToolInvokeMessage.JsonMessage(json_object=real_world_data)
message = ToolInvokeMessage(type=ToolInvokeMessage.MessageType.JSON, message=json_message)
# Step 2: Apply safe_json_value transformation
transformed_data = safe_json_value(message.message.json_object)
# Step 3: Serialize to JSON string
json_string = json.dumps(transformed_data, ensure_ascii=False)
# Step 4: Verify the string is valid JSON
assert isinstance(json_string, str)
assert json_string.startswith("{")
assert json_string.endswith("}")
# Step 5: Deserialize back to Python object
deserialized_data = json.loads(json_string)
# Step 6: Verify data integrity
assert deserialized_data["user_profile"]["name"] == "John Doe"
assert deserialized_data["user_profile"]["email"] == "john@example.com"
assert isinstance(deserialized_data["user_profile"]["created_at"], str)
assert isinstance(deserialized_data["analytics"]["total_time"], float)
assert deserialized_data["analytics"]["total_time"] == 123.45
assert isinstance(deserialized_data["analytics"]["metrics"], list)
assert deserialized_data["analytics"]["metrics"] == [1.1, 2.2, 3.3, 4.4, 5.5]
assert isinstance(deserialized_data["files"][0]["checksum"], str)
# Step 7: Verify all complex types were properly converted
self._verify_all_complex_types_converted(deserialized_data)
def _verify_all_complex_types_converted(self, data):
"""Helper method to verify all complex types were properly converted"""
if isinstance(data, dict):
for key, value in data.items():
if key in ["id", "checksum"]:
# These should be strings (UUID/bytes converted)
assert isinstance(value, str)
elif key in ["created_at", "last_login", "timestamp", "uploaded_at"]:
# These should be strings (datetime converted)
assert isinstance(value, str)
elif key in ["total_time", "duration"]:
# These should be floats (Decimal converted)
assert isinstance(value, float)
elif key == "metrics":
# This should be a list (ndarray converted)
assert isinstance(value, list)
else:
# Recursively check nested structures
self._verify_all_complex_types_converted(value)
elif isinstance(data, list):
for item in data:
self._verify_all_complex_types_converted(item)

View File

@ -0,0 +1,149 @@
"""Tests for Celery SSL configuration."""
import ssl
from unittest.mock import MagicMock, patch
class TestCelerySSLConfiguration:
"""Test suite for Celery SSL configuration."""
def test_get_celery_ssl_options_when_ssl_disabled(self):
"""Test SSL options when REDIS_USE_SSL is False."""
mock_config = MagicMock()
mock_config.REDIS_USE_SSL = False
with patch("extensions.ext_celery.dify_config", mock_config):
from extensions.ext_celery import _get_celery_ssl_options
result = _get_celery_ssl_options()
assert result is None
def test_get_celery_ssl_options_when_broker_not_redis(self):
"""Test SSL options when broker is not Redis."""
mock_config = MagicMock()
mock_config.REDIS_USE_SSL = True
mock_config.CELERY_BROKER_URL = "amqp://localhost:5672"
with patch("extensions.ext_celery.dify_config", mock_config):
from extensions.ext_celery import _get_celery_ssl_options
result = _get_celery_ssl_options()
assert result is None
def test_get_celery_ssl_options_with_cert_none(self):
"""Test SSL options with CERT_NONE requirement."""
mock_config = MagicMock()
mock_config.REDIS_USE_SSL = True
mock_config.CELERY_BROKER_URL = "redis://localhost:6379/0"
mock_config.REDIS_SSL_CERT_REQS = "CERT_NONE"
mock_config.REDIS_SSL_CA_CERTS = None
mock_config.REDIS_SSL_CERTFILE = None
mock_config.REDIS_SSL_KEYFILE = None
with patch("extensions.ext_celery.dify_config", mock_config):
from extensions.ext_celery import _get_celery_ssl_options
result = _get_celery_ssl_options()
assert result is not None
assert result["ssl_cert_reqs"] == ssl.CERT_NONE
assert result["ssl_ca_certs"] is None
assert result["ssl_certfile"] is None
assert result["ssl_keyfile"] is None
def test_get_celery_ssl_options_with_cert_required(self):
"""Test SSL options with CERT_REQUIRED and certificates."""
mock_config = MagicMock()
mock_config.REDIS_USE_SSL = True
mock_config.CELERY_BROKER_URL = "rediss://localhost:6380/0"
mock_config.REDIS_SSL_CERT_REQS = "CERT_REQUIRED"
mock_config.REDIS_SSL_CA_CERTS = "/path/to/ca.crt"
mock_config.REDIS_SSL_CERTFILE = "/path/to/client.crt"
mock_config.REDIS_SSL_KEYFILE = "/path/to/client.key"
with patch("extensions.ext_celery.dify_config", mock_config):
from extensions.ext_celery import _get_celery_ssl_options
result = _get_celery_ssl_options()
assert result is not None
assert result["ssl_cert_reqs"] == ssl.CERT_REQUIRED
assert result["ssl_ca_certs"] == "/path/to/ca.crt"
assert result["ssl_certfile"] == "/path/to/client.crt"
assert result["ssl_keyfile"] == "/path/to/client.key"
def test_get_celery_ssl_options_with_cert_optional(self):
"""Test SSL options with CERT_OPTIONAL requirement."""
mock_config = MagicMock()
mock_config.REDIS_USE_SSL = True
mock_config.CELERY_BROKER_URL = "redis://localhost:6379/0"
mock_config.REDIS_SSL_CERT_REQS = "CERT_OPTIONAL"
mock_config.REDIS_SSL_CA_CERTS = "/path/to/ca.crt"
mock_config.REDIS_SSL_CERTFILE = None
mock_config.REDIS_SSL_KEYFILE = None
with patch("extensions.ext_celery.dify_config", mock_config):
from extensions.ext_celery import _get_celery_ssl_options
result = _get_celery_ssl_options()
assert result is not None
assert result["ssl_cert_reqs"] == ssl.CERT_OPTIONAL
assert result["ssl_ca_certs"] == "/path/to/ca.crt"
def test_get_celery_ssl_options_with_invalid_cert_reqs(self):
"""Test SSL options with invalid cert requirement defaults to CERT_NONE."""
mock_config = MagicMock()
mock_config.REDIS_USE_SSL = True
mock_config.CELERY_BROKER_URL = "redis://localhost:6379/0"
mock_config.REDIS_SSL_CERT_REQS = "INVALID_VALUE"
mock_config.REDIS_SSL_CA_CERTS = None
mock_config.REDIS_SSL_CERTFILE = None
mock_config.REDIS_SSL_KEYFILE = None
with patch("extensions.ext_celery.dify_config", mock_config):
from extensions.ext_celery import _get_celery_ssl_options
result = _get_celery_ssl_options()
assert result is not None
assert result["ssl_cert_reqs"] == ssl.CERT_NONE # Should default to CERT_NONE
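The cert-requirement mapping these tests exercise can be sketched as follows. `resolve_cert_reqs` is a hypothetical helper name for illustration; the real logic lives inside `_get_celery_ssl_options` in `extensions/ext_celery.py`.

```python
import ssl

# Maps the REDIS_SSL_CERT_REQS setting to an ssl module constant.
_CERT_REQS_MAP = {
    "CERT_NONE": ssl.CERT_NONE,
    "CERT_OPTIONAL": ssl.CERT_OPTIONAL,
    "CERT_REQUIRED": ssl.CERT_REQUIRED,
}

def resolve_cert_reqs(value: str) -> ssl.VerifyMode:
    # Unrecognized values fall back to CERT_NONE, matching the test above.
    return _CERT_REQS_MAP.get(value, ssl.CERT_NONE)
```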
def test_celery_init_applies_ssl_to_broker_and_backend(self):
"""Test that SSL options are applied to both broker and backend when using Redis."""
mock_config = MagicMock()
mock_config.REDIS_USE_SSL = True
mock_config.CELERY_BROKER_URL = "redis://localhost:6379/0"
mock_config.CELERY_BACKEND = "redis"
mock_config.CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
mock_config.REDIS_SSL_CERT_REQS = "CERT_NONE"
mock_config.REDIS_SSL_CA_CERTS = None
mock_config.REDIS_SSL_CERTFILE = None
mock_config.REDIS_SSL_KEYFILE = None
mock_config.CELERY_USE_SENTINEL = False
mock_config.LOG_FORMAT = "%(message)s"
mock_config.LOG_TZ = "UTC"
mock_config.LOG_FILE = None
# Mock all the scheduler configs
mock_config.CELERY_BEAT_SCHEDULER_TIME = 1
mock_config.ENABLE_CLEAN_EMBEDDING_CACHE_TASK = False
mock_config.ENABLE_CLEAN_UNUSED_DATASETS_TASK = False
mock_config.ENABLE_CREATE_TIDB_SERVERLESS_TASK = False
mock_config.ENABLE_UPDATE_TIDB_SERVERLESS_STATUS_TASK = False
mock_config.ENABLE_CLEAN_MESSAGES = False
mock_config.ENABLE_MAIL_CLEAN_DOCUMENT_NOTIFY_TASK = False
mock_config.ENABLE_DATASETS_QUEUE_MONITOR = False
mock_config.ENABLE_CHECK_UPGRADABLE_PLUGIN_TASK = False
with patch("extensions.ext_celery.dify_config", mock_config):
from dify_app import DifyApp
from extensions.ext_celery import init_app
app = DifyApp(__name__)
celery_app = init_app(app)
# Check that SSL options were applied
assert "broker_use_ssl" in celery_app.conf
assert celery_app.conf["broker_use_ssl"] is not None
assert celery_app.conf["broker_use_ssl"]["ssl_cert_reqs"] == ssl.CERT_NONE
# Check that SSL is also applied to Redis backend
assert "redis_backend_use_ssl" in celery_app.conf
assert celery_app.conf["redis_backend_use_ssl"] is not None

View File

@ -1253,6 +1253,7 @@ dependencies = [
{ name = "flask-cors" },
{ name = "flask-login" },
{ name = "flask-migrate" },
{ name = "flask-orjson" },
{ name = "flask-restful" },
{ name = "flask-sqlalchemy" },
{ name = "gevent" },
@ -1440,6 +1441,7 @@ requires-dist = [
{ name = "flask-cors", specifier = "~=6.0.0" },
{ name = "flask-login", specifier = "~=0.6.3" },
{ name = "flask-migrate", specifier = "~=4.0.7" },
{ name = "flask-orjson", specifier = "~=2.0.0" },
{ name = "flask-restful", specifier = "~=0.3.10" },
{ name = "flask-sqlalchemy", specifier = "~=3.1.1" },
{ name = "gevent", specifier = "~=24.11.1" },
@ -1859,6 +1861,19 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/93/01/587023575286236f95d2ab8a826c320375ed5ea2102bb103ed89704ffa6b/Flask_Migrate-4.0.7-py3-none-any.whl", hash = "sha256:5c532be17e7b43a223b7500d620edae33795df27c75811ddf32560f7d48ec617", size = 21127, upload-time = "2024-03-11T18:42:59.462Z" },
]
[[package]]
name = "flask-orjson"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "flask" },
{ name = "orjson" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a3/49/575796f6ddca171d82dbb12762e33166c8b8f8616c946f0a6dfbb9bc3cd6/flask_orjson-2.0.0.tar.gz", hash = "sha256:6df6631437f9bc52cf9821735f896efa5583b5f80712f7d29d9ef69a79986a9c", size = 2974, upload-time = "2024-01-15T00:03:22.236Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f3/ca/53e14be018a2284acf799830e8cd8e0b263c0fd3dff1ad7b35f8417e7067/flask_orjson-2.0.0-py3-none-any.whl", hash = "sha256:5d15f2ba94b8d6c02aee88fc156045016e83db9eda2c30545fabd640aebaec9d", size = 3622, upload-time = "2024-01-15T00:03:17.511Z" },
]
[[package]]
name = "flask-restful"
version = "0.3.10"

View File

@ -264,6 +264,15 @@ REDIS_PORT=6379
REDIS_USERNAME=
REDIS_PASSWORD=difyai123456
REDIS_USE_SSL=false
# SSL configuration for Redis (when REDIS_USE_SSL=true)
REDIS_SSL_CERT_REQS=CERT_NONE
# Options: CERT_NONE, CERT_OPTIONAL, CERT_REQUIRED
REDIS_SSL_CA_CERTS=
# Path to CA certificate file for SSL verification
REDIS_SSL_CERTFILE=
# Path to client certificate file for SSL authentication
REDIS_SSL_KEYFILE=
# Path to client private key file for SSL authentication
REDIS_DB=0
# Whether to use Redis Sentinel mode.
@ -861,7 +870,7 @@ WORKFLOW_NODE_EXECUTION_STORAGE=rdbms
# Repository configuration
# Core workflow execution repository implementation
# Options:
# Options:
# - core.repositories.sqlalchemy_workflow_execution_repository.SQLAlchemyWorkflowExecutionRepository (default)
# - core.repositories.celery_workflow_execution_repository.CeleryWorkflowExecutionRepository
CORE_WORKFLOW_EXECUTION_REPOSITORY=core.repositories.sqlalchemy_workflow_execution_repository.SQLAlchemyWorkflowExecutionRepository
@ -1157,6 +1166,9 @@ MARKETPLACE_API_URL=https://marketplace.dify.ai
FORCE_VERIFYING_SIGNATURE=true
PLUGIN_STDIO_BUFFER_SIZE=1024
PLUGIN_STDIO_MAX_BUFFER_SIZE=5242880
PLUGIN_PYTHON_ENV_INIT_TIMEOUT=120
PLUGIN_MAX_EXECUTION_TIMEOUT=600
# PIP_MIRROR_URL=https://pypi.tuna.tsinghua.edu.cn/simple

View File

@ -181,6 +181,8 @@ services:
FORCE_VERIFYING_SIGNATURE: ${FORCE_VERIFYING_SIGNATURE:-true}
PYTHON_ENV_INIT_TIMEOUT: ${PLUGIN_PYTHON_ENV_INIT_TIMEOUT:-120}
PLUGIN_MAX_EXECUTION_TIMEOUT: ${PLUGIN_MAX_EXECUTION_TIMEOUT:-600}
PLUGIN_STDIO_BUFFER_SIZE: ${PLUGIN_STDIO_BUFFER_SIZE:-1024}
PLUGIN_STDIO_MAX_BUFFER_SIZE: ${PLUGIN_STDIO_MAX_BUFFER_SIZE:-5242880}
PIP_MIRROR_URL: ${PIP_MIRROR_URL:-}
PLUGIN_STORAGE_TYPE: ${PLUGIN_STORAGE_TYPE:-local}
PLUGIN_STORAGE_LOCAL_ROOT: ${PLUGIN_STORAGE_LOCAL_ROOT:-/app/storage}

View File

@ -71,6 +71,10 @@ x-shared-env: &shared-api-worker-env
REDIS_USERNAME: ${REDIS_USERNAME:-}
REDIS_PASSWORD: ${REDIS_PASSWORD:-difyai123456}
REDIS_USE_SSL: ${REDIS_USE_SSL:-false}
REDIS_SSL_CERT_REQS: ${REDIS_SSL_CERT_REQS:-CERT_NONE}
REDIS_SSL_CA_CERTS: ${REDIS_SSL_CA_CERTS:-}
REDIS_SSL_CERTFILE: ${REDIS_SSL_CERTFILE:-}
REDIS_SSL_KEYFILE: ${REDIS_SSL_KEYFILE:-}
REDIS_DB: ${REDIS_DB:-0}
REDIS_USE_SENTINEL: ${REDIS_USE_SENTINEL:-false}
REDIS_SENTINELS: ${REDIS_SENTINELS:-}
@ -506,6 +510,8 @@ x-shared-env: &shared-api-worker-env
MARKETPLACE_ENABLED: ${MARKETPLACE_ENABLED:-true}
MARKETPLACE_API_URL: ${MARKETPLACE_API_URL:-https://marketplace.dify.ai}
FORCE_VERIFYING_SIGNATURE: ${FORCE_VERIFYING_SIGNATURE:-true}
PLUGIN_STDIO_BUFFER_SIZE: ${PLUGIN_STDIO_BUFFER_SIZE:-1024}
PLUGIN_STDIO_MAX_BUFFER_SIZE: ${PLUGIN_STDIO_MAX_BUFFER_SIZE:-5242880}
PLUGIN_PYTHON_ENV_INIT_TIMEOUT: ${PLUGIN_PYTHON_ENV_INIT_TIMEOUT:-120}
PLUGIN_MAX_EXECUTION_TIMEOUT: ${PLUGIN_MAX_EXECUTION_TIMEOUT:-600}
PIP_MIRROR_URL: ${PIP_MIRROR_URL:-}
@ -747,6 +753,8 @@ services:
FORCE_VERIFYING_SIGNATURE: ${FORCE_VERIFYING_SIGNATURE:-true}
PYTHON_ENV_INIT_TIMEOUT: ${PLUGIN_PYTHON_ENV_INIT_TIMEOUT:-120}
PLUGIN_MAX_EXECUTION_TIMEOUT: ${PLUGIN_MAX_EXECUTION_TIMEOUT:-600}
PLUGIN_STDIO_BUFFER_SIZE: ${PLUGIN_STDIO_BUFFER_SIZE:-1024}
PLUGIN_STDIO_MAX_BUFFER_SIZE: ${PLUGIN_STDIO_MAX_BUFFER_SIZE:-5242880}
PIP_MIRROR_URL: ${PIP_MIRROR_URL:-}
PLUGIN_STORAGE_TYPE: ${PLUGIN_STORAGE_TYPE:-local}
PLUGIN_STORAGE_LOCAL_ROOT: ${PLUGIN_STORAGE_LOCAL_ROOT:-/app/storage}

View File

@ -6,7 +6,7 @@ LABEL maintainer="takatost@gmail.com"
# RUN sed -i 's/dl-cdn.alpinelinux.org/mirrors.aliyun.com/g' /etc/apk/repositories
RUN apk add --no-cache tzdata
RUN npm install -g pnpm@10.13.1
RUN corepack enable
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
@ -19,6 +19,9 @@ WORKDIR /app/web
COPY package.json .
COPY pnpm-lock.yaml .
# Use packageManager from package.json
RUN corepack install
# if you are located in China, you can use the taobao registry to speed up installation
# RUN pnpm install --frozen-lockfile --registry https://registry.npmmirror.com/

View File

@ -8,7 +8,6 @@ import {
} from '@heroicons/react/24/outline'
import { RiCloseLine, RiEditFill } from '@remixicon/react'
import { get } from 'lodash-es'
import InfiniteScroll from 'react-infinite-scroll-component'
import dayjs from 'dayjs'
import utc from 'dayjs/plugin/utc'
import timezone from 'dayjs/plugin/timezone'
@ -111,7 +110,8 @@ const statusTdRender = (statusCount: StatusCount) => {
const getFormattedChatList = (messages: ChatMessage[], conversationId: string, timezone: string, format: string) => {
const newChatList: IChatItem[] = []
messages.forEach((item: ChatMessage) => {
try {
messages.forEach((item: ChatMessage) => {
const questionFiles = item.message_files?.filter((file: any) => file.belongs_to === 'user') || []
newChatList.push({
id: `question-${item.id}`,
@ -178,7 +178,13 @@ const getFormattedChatList = (messages: ChatMessage[], conversationId: string, t
parentMessageId: `question-${item.id}`,
})
})
return newChatList
return newChatList
}
catch (error) {
console.error('getFormattedChatList processing failed:', error)
throw error
}
}
type IDetailPanel = {
@ -188,6 +194,9 @@ type IDetailPanel = {
}
function DetailPanel({ detail, onFeedback }: IDetailPanel) {
const MIN_ITEMS_FOR_SCROLL_LOADING = 8
const SCROLL_THRESHOLD_PX = 50
const SCROLL_DEBOUNCE_MS = 200
const { userProfile: { timezone } } = useAppContext()
const { formatTime } = useTimestamp()
const { onClose, appDetail } = useContext(DrawerContext)
@ -204,13 +213,19 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
const { t } = useTranslation()
const [hasMore, setHasMore] = useState(true)
const [varValues, setVarValues] = useState<Record<string, string>>({})
const isLoadingRef = useRef(false)
const [allChatItems, setAllChatItems] = useState<IChatItem[]>([])
const [chatItemTree, setChatItemTree] = useState<ChatItemInTree[]>([])
const [threadChatItems, setThreadChatItems] = useState<IChatItem[]>([])
const fetchData = useCallback(async () => {
if (isLoadingRef.current)
return
try {
isLoadingRef.current = true
if (!hasMore)
return
@ -218,8 +233,11 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
conversation_id: detail.id,
limit: 10,
}
if (allChatItems[0]?.id)
params.first_id = allChatItems[0]?.id.replace('question-', '')
// Use the oldest answer item ID for pagination
const answerItems = allChatItems.filter(item => item.isAnswer)
const oldestAnswerItem = answerItems[answerItems.length - 1]
if (oldestAnswerItem?.id)
params.first_id = oldestAnswerItem.id
const messageRes = await fetchChatMessages({
url: `/apps/${appDetail?.id}/chat-messages`,
params,
@ -249,15 +267,20 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
}
setChatItemTree(tree)
setThreadChatItems(getThreadMessages(tree, newAllChatItems.at(-1)?.id))
const lastMessageId = newAllChatItems.length > 0 ? newAllChatItems[newAllChatItems.length - 1].id : undefined
setThreadChatItems(getThreadMessages(tree, lastMessageId))
}
catch (err) {
console.error(err)
console.error('fetchData execution failed:', err)
}
finally {
isLoadingRef.current = false
}
}, [allChatItems, detail.id, hasMore, timezone, t, appDetail, detail?.model_config?.configs?.introduction])
const switchSibling = useCallback((siblingMessageId: string) => {
setThreadChatItems(getThreadMessages(chatItemTree, siblingMessageId))
const newThreadChatItems = getThreadMessages(chatItemTree, siblingMessageId)
setThreadChatItems(newThreadChatItems)
}, [chatItemTree])
const handleAnnotationEdited = useCallback((query: string, answer: string, index: number) => {
@ -344,13 +367,217 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
const fetchInitiated = useRef(false)
// Only load initial messages, don't auto-load more
useEffect(() => {
if (appDetail?.id && detail.id && appDetail?.mode !== 'completion' && !fetchInitiated.current) {
// Mark as initialized, but don't auto-load more messages
fetchInitiated.current = true
// Still call fetchData to get initial messages
fetchData()
}
}, [appDetail?.id, detail.id, appDetail?.mode, fetchData])
const [isLoading, setIsLoading] = useState(false)
const loadMoreMessages = useCallback(async () => {
if (isLoading || !hasMore || !appDetail?.id || !detail.id)
return
setIsLoading(true)
try {
const params: ChatMessagesRequest = {
conversation_id: detail.id,
limit: 10,
}
// Use the oldest answer item ID as the first_id
const answerItems = allChatItems.filter(item => item.isAnswer)
const oldestAnswerItem = answerItems[answerItems.length - 1]
if (oldestAnswerItem?.id) {
params.first_id = oldestAnswerItem.id
}
else if (allChatItems.length > 0 && allChatItems[0]?.id) {
const firstId = allChatItems[0].id.replace('question-', '').replace('answer-', '')
params.first_id = firstId
}
const messageRes = await fetchChatMessages({
url: `/apps/${appDetail.id}/chat-messages`,
params,
})
if (!messageRes.data || messageRes.data.length === 0) {
setHasMore(false)
return
}
if (messageRes.data.length > 0) {
const varValues = messageRes.data.at(-1)!.inputs
setVarValues(varValues)
}
setHasMore(messageRes.has_more)
const newItems = getFormattedChatList(
messageRes.data,
detail.id,
timezone!,
t('appLog.dateTimeFormat') as string,
)
// Check for duplicate messages
const existingIds = new Set(allChatItems.map(item => item.id))
const uniqueNewItems = newItems.filter(item => !existingIds.has(item.id))
if (uniqueNewItems.length === 0) {
if (allChatItems.length > 1) {
const nextId = allChatItems[1].id.replace('question-', '').replace('answer-', '')
const retryParams = {
...params,
first_id: nextId,
}
const retryRes = await fetchChatMessages({
url: `/apps/${appDetail.id}/chat-messages`,
params: retryParams,
})
if (retryRes.data && retryRes.data.length > 0) {
const retryItems = getFormattedChatList(
retryRes.data,
detail.id,
timezone!,
t('appLog.dateTimeFormat') as string,
)
const retryUniqueItems = retryItems.filter(item => !existingIds.has(item.id))
if (retryUniqueItems.length > 0) {
const newAllChatItems = [
...retryUniqueItems,
...allChatItems,
]
setAllChatItems(newAllChatItems)
let tree = buildChatItemTree(newAllChatItems)
if (retryRes.has_more === false && detail?.model_config?.configs?.introduction) {
tree = [{
id: 'introduction',
isAnswer: true,
isOpeningStatement: true,
content: detail?.model_config?.configs?.introduction ?? 'hello',
feedbackDisabled: true,
children: tree,
}]
}
setChatItemTree(tree)
setHasMore(retryRes.has_more)
setThreadChatItems(getThreadMessages(tree, newAllChatItems.at(-1)?.id))
return
}
}
}
}
const newAllChatItems = [
...uniqueNewItems,
...allChatItems,
]
setAllChatItems(newAllChatItems)
let tree = buildChatItemTree(newAllChatItems)
if (messageRes.has_more === false && detail?.model_config?.configs?.introduction) {
tree = [{
id: 'introduction',
isAnswer: true,
isOpeningStatement: true,
content: detail?.model_config?.configs?.introduction ?? 'hello',
feedbackDisabled: true,
children: tree,
}]
}
setChatItemTree(tree)
setThreadChatItems(getThreadMessages(tree, newAllChatItems.at(-1)?.id))
}
catch (error) {
console.error(error)
setHasMore(false)
}
finally {
setIsLoading(false)
}
}, [allChatItems, detail.id, hasMore, isLoading, timezone, t, appDetail])
useEffect(() => {
const scrollableDiv = document.getElementById('scrollableDiv')
const outerDiv = scrollableDiv?.parentElement
const chatContainer = document.querySelector('.mx-1.mb-1.grow.overflow-auto') as HTMLElement
let scrollContainer: HTMLElement | null = null
if (outerDiv && outerDiv.scrollHeight > outerDiv.clientHeight) {
scrollContainer = outerDiv
}
else if (scrollableDiv && scrollableDiv.scrollHeight > scrollableDiv.clientHeight) {
scrollContainer = scrollableDiv
}
else if (chatContainer && chatContainer.scrollHeight > chatContainer.clientHeight) {
scrollContainer = chatContainer
}
else {
const possibleContainers = document.querySelectorAll('.overflow-auto, .overflow-y-auto')
for (let i = 0; i < possibleContainers.length; i++) {
const container = possibleContainers[i] as HTMLElement
if (container.scrollHeight > container.clientHeight) {
scrollContainer = container
break
}
}
}
if (!scrollContainer)
return
let lastLoadTime = 0
const throttleDelay = 200
const handleScroll = () => {
const currentScrollTop = scrollContainer!.scrollTop
const scrollHeight = scrollContainer!.scrollHeight
const clientHeight = scrollContainer!.clientHeight
const distanceFromTop = currentScrollTop
const distanceFromBottom = scrollHeight - currentScrollTop - clientHeight
const now = Date.now()
const isNearTop = distanceFromTop < 30
// eslint-disable-next-line sonarjs/no-unused-vars
const _distanceFromBottom = distanceFromBottom < 30
if (isNearTop && hasMore && !isLoading && (now - lastLoadTime > throttleDelay)) {
lastLoadTime = now
loadMoreMessages()
}
}
scrollContainer.addEventListener('scroll', handleScroll, { passive: true })
const handleWheel = (e: WheelEvent) => {
if (e.deltaY < 0)
handleScroll()
}
scrollContainer.addEventListener('wheel', handleWheel, { passive: true })
return () => {
scrollContainer!.removeEventListener('scroll', handleScroll)
scrollContainer!.removeEventListener('wheel', handleWheel)
}
}, [hasMore, isLoading, loadMoreMessages])
const isChatMode = appDetail?.mode !== 'completion'
const isAdvanced = appDetail?.mode === 'advanced-chat'
@ -378,6 +605,36 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
return () => cancelAnimationFrame(raf)
}, [])
// Add scroll listener to ensure loading is triggered
useEffect(() => {
if (threadChatItems.length >= MIN_ITEMS_FOR_SCROLL_LOADING && hasMore) {
const scrollableDiv = document.getElementById('scrollableDiv')
if (scrollableDiv) {
let loadingTimeout: NodeJS.Timeout | null = null
const handleScroll = () => {
const { scrollTop } = scrollableDiv
// Trigger loading when scrolling near the top
if (scrollTop < SCROLL_THRESHOLD_PX && !isLoadingRef.current) {
if (loadingTimeout)
clearTimeout(loadingTimeout)
loadingTimeout = setTimeout(fetchData, SCROLL_DEBOUNCE_MS) // 200ms debounce
}
}
scrollableDiv.addEventListener('scroll', handleScroll)
return () => {
scrollableDiv.removeEventListener('scroll', handleScroll)
if (loadingTimeout)
clearTimeout(loadingTimeout)
}
}
}
}, [threadChatItems.length, hasMore, fetchData])
return (
<div ref={ref} className='flex h-full flex-col rounded-xl border-[0.5px] border-components-panel-border'>
{/* Panel Header */}
@ -439,8 +696,8 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
siteInfo={null}
/>
</div>
: threadChatItems.length < 8
? <div className="mb-4 pt-4">
: threadChatItems.length < MIN_ITEMS_FOR_SCROLL_LOADING ? (
<div className="mb-4 pt-4">
<Chat
config={{
appId: appDetail?.id,
@ -466,35 +723,27 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
switchSibling={switchSibling}
/>
</div>
: <div
) : (
<div
className="py-4"
id="scrollableDiv"
style={{
display: 'flex',
flexDirection: 'column-reverse',
height: '100%',
overflow: 'auto',
}}>
{/* Put the scroll bar always on the bottom */}
<InfiniteScroll
scrollableTarget="scrollableDiv"
dataLength={threadChatItems.length}
next={fetchData}
hasMore={hasMore}
loader={<div className='system-xs-regular text-center text-text-tertiary'>{t('appLog.detail.loading')}...</div>}
// endMessage={<div className='text-center'>Nothing more to show</div>}
// below props only if you need pull down functionality
refreshFunction={fetchData}
pullDownToRefresh
pullDownToRefreshThreshold={50}
// pullDownToRefreshContent={
// <div className='text-center'>Pull down to refresh</div>
// }
// releaseToRefreshContent={
// <div className='text-center'>Release to refresh</div>
// }
// To put endMessage and loader to the top.
style={{ display: 'flex', flexDirection: 'column-reverse' }}
inverse={true}
>
<div className="flex w-full flex-col-reverse" style={{ position: 'relative' }}>
{/* Loading state indicator - only shown when loading */}
{hasMore && isLoading && (
<div className="sticky left-0 right-0 top-0 z-10 bg-primary-50/40 py-3 text-center">
<div className='system-xs-regular text-text-tertiary'>
{t('appLog.detail.loading')}...
</div>
</div>
)}
<Chat
config={{
appId: appDetail?.id,
@ -519,8 +768,9 @@ function DetailPanel({ detail, onFeedback }: IDetailPanel) {
chatContainerInnerClassName='px-3'
switchSibling={switchSibling}
/>
</InfiniteScroll>
</div>
</div>
)
}
</div>
{showMessageLogModal && (
View File
@ -407,8 +407,8 @@ const AppCard = ({ app, onRefresh }: AppCardProps) => {
}
btnClassName={open =>
cn(
open ? '!bg-black/5 !shadow-none' : '!bg-transparent',
'h-8 w-8 rounded-md border-none !p-2 hover:!bg-black/5',
open ? '!bg-state-base-hover !shadow-none' : '!bg-transparent',
'h-8 w-8 rounded-md border-none !p-2 hover:!bg-state-base-hover',
)
}
popupClassName={
View File
@ -2,6 +2,7 @@
import type { FC } from 'react'
import React from 'react'
import s from './style.module.css'
import cn from '@/utils/classnames'
export type ILoadingAnimProps = {
type: 'text' | 'avatar'
@ -11,7 +12,7 @@ const LoadingAnim: FC<ILoadingAnimProps> = ({
type,
}) => {
return (
<div className={`${s['dot-flashing']} ${s[type]}`}></div>
<div className={cn(s['dot-flashing'], s[type])} />
)
}
export default React.memo(LoadingAnim)
View File
@ -8,6 +8,7 @@ import Modal from '@/app/components/base/modal'
import Button from '@/app/components/base/button'
import Divider from '@/app/components/base/divider'
import ConfirmAddVar from '@/app/components/app/configuration/config-prompt/confirm-add-var'
import PromptEditor from '@/app/components/base/prompt-editor'
import type { OpeningStatement } from '@/app/components/base/features/types'
import { getInputKeys } from '@/app/components/base/block-input'
import type { PromptVariable } from '@/models/debug'
@ -101,7 +102,7 @@ const OpeningSettingModal = ({
<div>·</div>
<div>{tempSuggestedQuestions.length}/{MAX_QUESTION_NUM}</div>
</div>
<Divider bgStyle='gradient' className='ml-3 h-px w-0 grow'/>
<Divider bgStyle='gradient' className='ml-3 h-px w-0 grow' />
</div>
<ReactSortable
className="space-y-1"
@ -178,19 +179,32 @@ const OpeningSettingModal = ({
>
<div className='mb-6 flex items-center justify-between'>
<div className='title-2xl-semi-bold text-text-primary'>{t('appDebug.feature.conversationOpener.title')}</div>
<div className='cursor-pointer p-1' onClick={onCancel}><RiCloseLine className='h-4 w-4 text-text-tertiary'/></div>
<div className='cursor-pointer p-1' onClick={onCancel}><RiCloseLine className='h-4 w-4 text-text-tertiary' /></div>
</div>
<div className='mb-8 flex gap-2'>
<div className='mt-1.5 h-8 w-8 shrink-0 rounded-lg border-components-panel-border bg-util-colors-orange-dark-orange-dark-500 p-1.5'>
<RiAsterisk className='h-5 w-5 text-text-primary-on-surface' />
</div>
<div className='grow rounded-2xl border-t border-divider-subtle bg-chat-bubble-bg p-3 shadow-xs'>
<textarea
<PromptEditor
value={tempValue}
rows={3}
onChange={e => setTempValue(e.target.value)}
className="system-md-regular w-full border-0 bg-transparent px-0 text-text-secondary focus:outline-none"
onChange={setTempValue}
placeholder={t('appDebug.openingStatement.placeholder') as string}
variableBlock={{
show: true,
variables: [
// Prompt variables
...promptVariables.map(item => ({
name: item.name || item.key,
value: item.key,
})),
// Workflow variables
...workflowVariables.map(item => ({
name: item.variable,
value: item.variable,
})),
],
}}
/>
{renderQuestions()}
</div>
View File
@ -137,7 +137,7 @@ const HitTestingPage: FC<Props> = ({ datasetId }: Props) => {
<>
<div className='grow overflow-y-auto'>
<table className={'w-full border-collapse border-0 text-[13px] leading-4 text-text-secondary '}>
<thead className='sticky top-0 h-7 text-xs font-medium uppercase leading-7 text-text-tertiary'>
<thead className='sticky top-0 h-7 text-xs font-medium uppercase leading-7 text-text-tertiary backdrop-blur-[5px]'>
<tr>
<td className='w-[128px] rounded-l-lg bg-background-section-burn pl-3'>{t('datasetHitTesting.table.header.source')}</td>
<td className='bg-background-section-burn'>{t('datasetHitTesting.table.header.text')}</td>
View File
@ -5,7 +5,7 @@ import type { GithubRepo } from '@/models/common'
import { RiLoader2Line } from '@remixicon/react'
const defaultData = {
stargazers_count: 98570,
stargazers_count: 110918,
}
const getStar = async () => {
View File
@ -79,6 +79,9 @@ const SwrInitializer = ({
<SWRConfig value={{
shouldRetryOnError: false,
revalidateOnFocus: false,
dedupingInterval: 60000,
focusThrottleInterval: 5000,
provider: () => new Map(),
}}>
{children}
</SWRConfig>
View File
@ -0,0 +1,15 @@
import { create } from 'zustand'
import type { Label } from './constant'
type State = {
labelList: Label[]
}
type Action = {
setLabelList: (labelList?: Label[]) => void
}
export const useStore = create<State & Action>(set => ({
labelList: [],
setLabelList: labelList => set(() => ({ labelList })),
}))
View File
@ -246,11 +246,11 @@ export const useWorkflow = () => {
const handleOutVarRenameChange = useCallback((nodeId: string, oldValeSelector: ValueSelector, newVarSelector: ValueSelector) => {
const { getNodes, setNodes } = store.getState()
const afterNodes = getAfterNodesInSameBranch(nodeId)
const effectNodes = findUsedVarNodes(oldValeSelector, afterNodes)
if (effectNodes.length > 0) {
const newNodes = getNodes().map((node) => {
if (effectNodes.find(n => n.id === node.id))
const allNodes = getNodes()
const affectedNodes = findUsedVarNodes(oldValeSelector, allNodes)
if (affectedNodes.length > 0) {
const newNodes = allNodes.map((node) => {
if (affectedNodes.find(n => n.id === node.id))
return updateNodeVars(node, oldValeSelector, newVarSelector)
return node
View File
@ -1101,7 +1101,15 @@ export const getNodeUsedVars = (node: Node): ValueSelector[] => {
res = (data as IfElseNodeType).conditions?.map((c) => {
return c.variable_selector || []
}) || []
res.push(...((data as IfElseNodeType).cases || []).flatMap(c => (c.conditions || [])).map(c => c.variable_selector || []))
res.push(...((data as IfElseNodeType).cases || []).flatMap(c => (c.conditions || [])).flatMap((c) => {
const selectors: ValueSelector[] = []
if (c.variable_selector)
selectors.push(c.variable_selector)
// Handle sub-variable conditions
if (c.sub_variable_condition && c.sub_variable_condition.conditions)
selectors.push(...c.sub_variable_condition.conditions.map(subC => subC.variable_selector || []).filter(sel => sel.length > 0))
return selectors
}))
break
}
case BlockEnum.Code: {
@ -1345,6 +1353,26 @@ export const updateNodeVars = (oldNode: Node, oldVarSelector: ValueSelector, new
return c
})
}
if (payload.cases) {
payload.cases = payload.cases.map((caseItem) => {
if (caseItem.conditions) {
caseItem.conditions = caseItem.conditions.map((c) => {
if (c.variable_selector?.join('.') === oldVarSelector.join('.'))
c.variable_selector = newVarSelector
// Handle sub-variable conditions
if (c.sub_variable_condition && c.sub_variable_condition.conditions) {
c.sub_variable_condition.conditions = c.sub_variable_condition.conditions.map((subC) => {
if (subC.variable_selector?.join('.') === oldVarSelector.join('.'))
subC.variable_selector = newVarSelector
return subC
})
}
return c
})
}
return caseItem
})
}
break
}
case BlockEnum.Code: {
View File
@ -90,7 +90,7 @@ const DebugAndPreview = () => {
<div
ref={containerRef}
className={cn(
'relative flex h-full flex-col rounded-l-2xl border border-r-0 border-components-panel-border bg-components-panel-bg shadow-xl',
'relative flex h-full flex-col rounded-l-2xl border border-r-0 border-components-panel-border bg-chatbot-bg shadow-xl',
)}
style={{ width: `${panelWidth}px` }}
>
View File
@ -75,7 +75,7 @@ export type AppContextProviderProps = {
}
export const AppContextProvider: FC<AppContextProviderProps> = ({ children }) => {
const { data: userProfileResponse, mutate: mutateUserProfile } = useSWR({ url: '/account/profile', params: {} }, fetchUserProfile)
const { data: userProfileResponse, mutate: mutateUserProfile, error: userProfileError } = useSWR({ url: '/account/profile', params: {} }, fetchUserProfile)
const { data: currentWorkspaceResponse, mutate: mutateCurrentWorkspace, isLoading: isLoadingCurrentWorkspace } = useSWR({ url: '/workspaces/current', params: {} }, fetchCurrentWorkspace)
const [userProfile, setUserProfile] = useState<UserProfileResponse>(userProfilePlaceholder)
@ -86,15 +86,26 @@ export const AppContextProvider: FC<AppContextProviderProps> = ({ children }) =>
const isCurrentWorkspaceEditor = useMemo(() => ['owner', 'admin', 'editor'].includes(currentWorkspace.role), [currentWorkspace.role])
const isCurrentWorkspaceDatasetOperator = useMemo(() => currentWorkspace.role === 'dataset_operator', [currentWorkspace.role])
const updateUserProfileAndVersion = useCallback(async () => {
if (userProfileResponse && !userProfileResponse.bodyUsed) {
const result = await userProfileResponse.json()
setUserProfile(result)
const current_version = userProfileResponse.headers.get('x-version')
const current_env = process.env.NODE_ENV === 'development' ? 'DEVELOPMENT' : userProfileResponse.headers.get('x-env')
const versionData = await fetchLangGeniusVersion({ url: '/version', params: { current_version } })
setLangGeniusVersionInfo({ ...versionData, current_version, latest_version: versionData.version, current_env })
if (userProfileResponse) {
try {
const clonedResponse = (userProfileResponse as Response).clone()
const result = await clonedResponse.json()
setUserProfile(result)
const current_version = userProfileResponse.headers.get('x-version')
const current_env = process.env.NODE_ENV === 'development' ? 'DEVELOPMENT' : userProfileResponse.headers.get('x-env')
const versionData = await fetchLangGeniusVersion({ url: '/version', params: { current_version } })
setLangGeniusVersionInfo({ ...versionData, current_version, latest_version: versionData.version, current_env })
}
catch (error) {
console.error('Failed to update user profile:', error)
if (userProfile.id === '')
setUserProfile(userProfilePlaceholder)
}
}
}, [userProfileResponse])
else if (userProfileError && userProfile.id === '') {
setUserProfile(userProfilePlaceholder)
}
}, [userProfileResponse, userProfileError, userProfile.id])
useEffect(() => {
updateUserProfileAndVersion()
View File
@ -23,19 +23,14 @@ const translation = {
contractSales: 'تماس با فروش',
contractOwner: 'تماس با مدیر تیم',
startForFree: 'رایگان شروع کنید',
getStartedWith: 'شروع کنید با ',
contactSales: 'تماس با فروش',
talkToSales: 'صحبت با فروش',
modelProviders: 'ارائه‌دهندگان مدل',
teamMembers: 'اعضای تیم',
annotationQuota: 'سهمیه حاشیه‌نویسی',
buildApps: 'ساخت اپلیکیشن‌ها',
vectorSpace: 'فضای وکتور',
vectorSpaceBillingTooltip: 'هر 1 مگابایت می‌تواند حدود 1.2 میلیون کاراکتر از داده‌های وکتور شده را ذخیره کند (براساس تخمین با استفاده از OpenAI Embeddings، متفاوت بر اساس مدل‌ها).',
vectorSpaceTooltip: 'فضای وکتور سیستم حافظه بلند مدت است که برای درک داده‌های شما توسط LLMها مورد نیاز است.',
documentsUploadQuota: 'سهمیه بارگذاری مستندات',
documentProcessingPriority: 'اولویت پردازش مستندات',
documentProcessingPriorityTip: 'برای اولویت پردازش بالاتر مستندات، لطفاً طرح خود را ارتقاء دهید.',
documentProcessingPriorityUpgrade: 'داده‌های بیشتری را با دقت بالاتر و سرعت بیشتر پردازش کنید.',
priority: {
'standard': 'استاندارد',
@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'محیط آزمایشی',
description: '200 بار آزمایش رایگان GPT',
includesTitle: 'شامل:',
for: 'دوره آزمایشی رایگان قابلیت‌های اصلی',
},
professional: {
name: 'حرفه‌ای',
description: 'برای افراد و تیم‌های کوچک برای باز کردن قدرت بیشتر به طور مقرون به صرفه.',
includesTitle: 'همه چیز در طرح رایگان، به علاوه:',
for: 'برای توسعه‌دهندگان مستقل/تیم‌های کوچک',
},
team: {
name: 'تیم',
description: 'همکاری بدون محدودیت و لذت بردن از عملکرد برتر.',
includesTitle: 'همه چیز در طرح حرفه‌ای، به علاوه:',
for: 'برای تیم‌های متوسط',
},
enterprise: {
@ -123,15 +115,15 @@ const translation = {
description: 'دریافت کامل‌ترین قابلیت‌ها و پشتیبانی برای سیستم‌های بزرگ و بحرانی.',
includesTitle: 'همه چیز در طرح تیم، به علاوه:',
features: {
0: 'راهکارهای استقرار مقیاس‌پذیر در سطح سازمانی',
8: 'پشتیبانی فنی حرفه‌ای',
3: 'چندین فضای کاری و مدیریت سازمانی',
5: 'SLA های توافق شده توسط شرکای Dify',
4: 'SSO',
2: 'ویژگی‌های انحصاری سازمانی',
1: 'مجوز صدور مجوز تجاری',
6: 'امنیت و کنترل‌های پیشرفته',
7: 'به‌روزرسانی‌ها و نگهداری توسط دیفی به‌طور رسمی',
4: 'Sso',
1: 'مجوز جواز تجاری',
2: 'ویژگی های انحصاری سازمانی',
8: 'پشتیبانی فنی حرفه ای',
5: 'SLA های مذاکره شده توسط Dify Partners',
6: 'امنیت و کنترل پیشرفته',
3: 'فضاهای کاری چندگانه و مدیریت سازمانی',
7: 'به روز رسانی و نگهداری توسط Dify به طور رسمی',
0: 'راه حل های استقرار مقیاس پذیر در سطح سازمانی',
},
price: 'سفارشی',
btnText: 'تماس با فروش',
@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
0: 'تمام ویژگی‌های اصلی منتشر شده در مخزن عمومی',
2: 'با رعایت مجوز منبع باز دیفی',
1: 'فضای کاری واحد',
2: 'با مجوز منبع باز Dify مطابقت دارد',
0: 'تمام ویژگی های اصلی در مخزن عمومی منتشر شده است',
},
btnText: 'شروع کنید با جامعه',
price: 'رایگان',
@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
1: 'محل کار واحد',
0: 'قابل اطمینان خودمدیریتی توسط ارائه‌دهندگان مختلف ابر',
2: 'شعار و سفارشی‌سازی برند وب‌اپلیکیشن',
3: 'پشتیبانی اولویت ایمیل و چت',
1: 'فضای کاری واحد',
3: 'پشتیبانی از ایمیل و چت اولویت دار',
2: 'لوگوی وب اپلیکیشن و سفارشی سازی برندینگ',
0: 'قابلیت اطمینان خود مدیریت شده توسط ارائه دهندگان مختلف ابر',
},
btnText: 'گرفتن نسخه پریمیوم در',
description: 'برای سازمان‌ها و تیم‌های میان‌رده',
@ -173,8 +165,6 @@ const translation = {
fullSolution: 'طرح خود را ارتقاء دهید تا فضای بیشتری دریافت کنید.',
},
apps: {
fullTipLine1: 'طرح خود را ارتقاء دهید تا',
fullTipLine2: 'اپلیکیشن‌های بیشتری بسازید.',
fullTip2: 'محدودیت طرح به پایان رسید',
contactUs: 'با ما تماس بگیرید',
fullTip1: 'به‌روزرسانی کنید تا برنامه‌های بیشتری ایجاد کنید',
View File
@ -23,18 +23,13 @@ const translation = {
contractSales: 'Contactez les ventes',
contractOwner: 'Contacter le chef d\'équipe',
startForFree: 'Commencez gratuitement',
getStartedWith: 'Commencez avec',
contactSales: 'Contacter les ventes',
talkToSales: 'Parlez aux Ventes',
modelProviders: 'Fournisseurs de Modèles',
teamMembers: 'Membres de l\'équipe',
buildApps: 'Construire des Applications',
vectorSpace: 'Espace Vectoriel',
vectorSpaceBillingTooltip: 'Chaque 1MB peut stocker environ 1,2 million de caractères de données vectorisées (estimé en utilisant les embeddings OpenAI, varie selon les modèles).',
vectorSpaceTooltip: 'L\'espace vectoriel est le système de mémoire à long terme nécessaire pour que les LLMs comprennent vos données.',
documentsUploadQuota: 'Quota de téléchargement de documents',
documentProcessingPriority: 'Priorité de Traitement de Document',
documentProcessingPriorityTip: 'Pour une priorité de traitement de documents plus élevée, veuillez mettre à niveau votre plan.',
documentProcessingPriorityUpgrade: 'Traitez plus de données avec une précision plus élevée à des vitesses plus rapides.',
priority: {
'standard': 'Standard',
@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'Bac à sable',
description: '200 essais gratuits de GPT',
includesTitle: 'Inclus :',
for: 'Essai gratuit des fonctionnalités principales',
},
professional: {
name: 'Professionnel',
description: 'Pour les individus et les petites équipes afin de débloquer plus de puissance à un prix abordable.',
includesTitle: 'Tout ce qui est dans le plan gratuit, plus :',
for: 'Pour les développeurs indépendants / petites équipes',
},
team: {
name: 'Équipe',
description: 'Collaborez sans limites et profitez d\'une performance de premier ordre.',
includesTitle: 'Tout ce qui est inclus dans le plan Professionnel, plus :',
for: 'Pour les équipes de taille moyenne',
},
enterprise: {
@ -123,14 +115,14 @@ const translation = {
description: 'Obtenez toutes les capacités et le support pour les systèmes à grande échelle et critiques pour la mission.',
includesTitle: 'Tout ce qui est inclus dans le plan Équipe, plus :',
features: {
5: 'SLA négociés par Dify Partners',
1: 'Autorisation de Licence Commerciale',
2: 'Fonctionnalités exclusives pour les entreprises',
4: 'SSO',
8: 'Support Technique Professionnel',
3: 'Gestion de plusieurs espaces de travail et d\'entreprise',
6: 'Sécurité et contrôles avancés',
7: 'Mises à jour et maintenance par Dify Officiellement',
3: 'Espaces de travail multiples et gestion d'entreprise',
4: 'SSO',
1: 'Autorisation de licence commerciale',
2: 'Fonctionnalités exclusives à l'entreprise',
5: 'SLA négociés par les partenaires Dify',
8: 'Assistance technique professionnelle',
7: 'Mises à jour et maintenance par Dify officiellement',
0: 'Solutions de déploiement évolutives de niveau entreprise',
},
for: 'Pour les équipes de grande taille',
@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
2: 'Conforme à la licence open source de Dify',
1: 'Espace de travail unique',
0: 'Toutes les fonctionnalités principales publiées dans le référentiel public',
0: 'Toutes les fonctionnalités de base publiées dans le dépôt public',
2: 'Conforme à la licence Open Source Dify',
},
name: 'Communauté',
btnText: 'Commencez avec la communauté',
@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
3: 'Support par e-mail et chat prioritaire',
2: 'Personnalisation du logo et de l'image de marque WebApp',
1: 'Espace de travail unique',
0: 'Fiabilité autogérée par divers fournisseurs de cloud',
2: 'Personnalisation du logo et de la marque de l\'application Web',
3: 'Assistance prioritaire par e-mail et chat',
0: 'Fiabilité autogérée par différents fournisseurs de cloud',
},
for: 'Pour les organisations et les équipes de taille moyenne',
includesTitle: 'Tout de la communauté, en plus :',
@ -173,8 +165,6 @@ const translation = {
fullSolution: 'Mettez à niveau votre plan pour obtenir plus d\'espace.',
},
apps: {
fullTipLine1: 'Mettez à jour votre plan pour',
fullTipLine2: 'construire plus d\'applications.',
fullTip2: 'Limite de plan atteinte',
contactUs: 'Contactez-nous',
fullTip1: 'Mettez à niveau pour créer plus d\'applications',
View File
@ -24,22 +24,15 @@ const translation = {
contractSales: 'बिक्री से संपर्क करें',
contractOwner: 'टीम प्रबंधक से संपर्क करें',
startForFree: 'मुफ्त में शुरू करें',
getStartedWith: 'इसके साथ शुरू करें ',
contactSales: 'बिक्री से संपर्क करें',
talkToSales: 'बिक्री से बात करें',
modelProviders: 'मॉडल प्रदाता',
teamMembers: 'टीम के सदस्य',
annotationQuota: 'एनोटेशन कोटा',
buildApps: 'ऐप्स बनाएं',
vectorSpace: 'वेक्टर स्पेस',
vectorSpaceBillingTooltip:
'प्रत्येक 1MB लगभग 1.2 मिलियन वर्णों के वेक्टराइज्ड डेटा को संग्रहीत कर सकता है (OpenAI एम्बेडिंग का उपयोग करके अनुमानित, मॉडल में भिन्नता होती है)।',
vectorSpaceTooltip:
'वेक्टर स्पेस वह दीर्घकालिक स्मृति प्रणाली है जिसकी आवश्यकता LLMs को आपके डेटा को समझने के लिए होती है।',
documentsUploadQuota: 'दस्तावेज़ अपलोड कोटा',
documentProcessingPriority: 'दस्तावेज़ प्रसंस्करण प्राथमिकता',
documentProcessingPriorityTip:
'उच्च दस्तावेज़ प्रसंस्करण प्राथमिकता के लिए, कृपया अपनी योजना अपग्रेड करें।',
documentProcessingPriorityUpgrade:
'तेजी से गति पर उच्च सटीकता के साथ अधिक डेटा संसाधित करें।',
priority: {
@ -113,21 +106,18 @@ const translation = {
sandbox: {
name: 'सैंडबॉक्स',
description: '200 बार GPT मुफ्त ट्रायल',
includesTitle: 'शामिल हैं:',
for: 'कोर क्षमताओं का मुफ्त परीक्षण',
},
professional: {
name: 'प्रोफेशनल',
description:
'व्यक्तियों और छोटे टीमों के लिए अधिक शक्ति सस्ती दर पर खोलें।',
includesTitle: 'मुफ्त योजना में सब कुछ, साथ में:',
for: 'स्वतंत्र डेवलपर्स/छोटे टीमों के लिए',
},
team: {
name: 'टीम',
description:
'बिना सीमा के सहयोग करें और शीर्ष स्तरीय प्रदर्शन का आनंद लें।',
includesTitle: 'प्रोफेशनल योजना में सब कुछ, साथ में:',
for: 'मध्यम आकार की टीमों के लिए',
},
enterprise: {
@ -136,15 +126,15 @@ const translation = {
'बड़े पैमाने पर मिशन-क्रिटिकल सिस्टम के लिए पूर्ण क्षमताएं और समर्थन प्राप्त करें।',
includesTitle: 'टीम योजना में सब कुछ, साथ में:',
features: {
0: 'उद्योग स्तर के बड़े पैमाने पर वितरण समाधान',
3: 'अनेक कार्यक्षेत्र और उद्यम प्रबंधक',
8: 'प्रोफेशनल तकनीकी समर्थन',
6: 'उन्नत सुरक्षा और नियंत्रण',
2: 'विशेष उद्यम सुविधाएँ',
1: 'Commercial License Authorization',
4: 'SSO',
6: 'उन्नत सुरक्षा और नियंत्रण',
2: 'विशेष उद्यम सुविधाएँ',
3: 'अनेक कार्यक्षेत्र और उद्यम प्रबंधक',
5: 'डिफाई पार्टनर्स द्वारा बातचीत किए गए एसएलए',
8: 'प्रोफेशनल तकनीकी समर्थन',
7: 'डीफाई द्वारा आधिकारिक रूप से अपडेट और रखरखाव',
0: 'उद्योग स्तर के बड़े पैमाने पर वितरण समाधान',
},
price: 'कस्टम',
btnText: 'बिक्री से संपर्क करें',
@ -153,9 +143,9 @@ const translation = {
},
community: {
features: {
1: 'एकल कार्यक्षेत्र',
2: 'डिफी ओपन सोर्स लाइसेंस के अनुपालन में',
0: 'सभी मुख्य सुविधाएं सार्वजनिक संग्रह के तहत जारी की गई हैं।',
1: 'एकल कार्यक्षेत्र',
},
description: 'व्यक्तिगत उपयोगकर्ताओं, छोटे टीमों, या गैर-व्यावसायिक परियोजनाओं के लिए',
for: 'व्यक्तिगत उपयोगकर्ताओं, छोटे टीमों, या गैर-व्यावसायिक परियोजनाओं के लिए',
@ -166,9 +156,9 @@ const translation = {
},
premium: {
features: {
1: 'एकल कार्यक्षेत्र',
2: 'वेब ऐप लोगो और ब्रांडिंग कस्टमाइजेशन',
3: 'प्राथमिकता ईमेल और चैट समर्थन',
1: 'एकल कार्यक्षेत्र',
0: 'विभिन्न क्लाउड प्रदाताओं द्वारा आत्म-प्रबंधित विश्वसनीयता',
},
priceTip: 'क्लाउड मार्केटप्लेस के आधार पर',
@ -186,8 +176,6 @@ const translation = {
fullSolution: 'अधिक स्थान प्राप्त करने के लिए अपनी योजना अपग्रेड करें।',
},
apps: {
fullTipLine1: 'अधिक ऐप्स बनाने के लिए',
fullTipLine2: 'अपनी योजना अपग्रेड करें।',
fullTip1: 'अधिक ऐप्स बनाने के लिए अपग्रेड करें',
fullTip2: 'योजना की सीमा पहुँच गई',
contactUs: 'हमसे संपर्क करें',
View File
@ -24,22 +24,15 @@ const translation = {
contractSales: 'Contatta vendite',
contractOwner: 'Contatta il responsabile del team',
startForFree: 'Inizia gratis',
getStartedWith: 'Inizia con ',
contactSales: 'Contatta le vendite',
talkToSales: 'Parla con le vendite',
modelProviders: 'Fornitori di Modelli',
teamMembers: 'Membri del Team',
annotationQuota: 'Quota di Annotazione',
buildApps: 'Crea App',
vectorSpace: 'Spazio Vettoriale',
vectorSpaceBillingTooltip:
'Ogni 1MB può memorizzare circa 1,2 milioni di caratteri di dati vettoriali (stimato utilizzando OpenAI Embeddings, varia tra i modelli).',
vectorSpaceTooltip:
'Lo Spazio Vettoriale è il sistema di memoria a lungo termine necessario per permettere agli LLM di comprendere i tuoi dati.',
documentsUploadQuota: 'Quota di Caricamento Documenti',
documentProcessingPriority: 'Priorità di Elaborazione Documenti',
documentProcessingPriorityTip:
'Per una maggiore priorità di elaborazione dei documenti, aggiorna il tuo piano.',
documentProcessingPriorityUpgrade:
'Elabora più dati con maggiore precisione a velocità più elevate.',
priority: {
@@ -113,21 +106,18 @@ const translation = {
sandbox: {
name: 'Sandbox',
description: '200 prove gratuite di GPT',
includesTitle: 'Include:',
for: 'Prova gratuita delle capacità principali',
},
professional: {
name: 'Professional',
description:
'Per individui e piccoli team per sbloccare più potenza a prezzi accessibili.',
includesTitle: 'Tutto nel piano gratuito, più:',
for: 'Per sviluppatori indipendenti / piccoli team',
},
team: {
name: 'Team',
description:
'Collabora senza limiti e goditi prestazioni di alto livello.',
includesTitle: 'Tutto nel piano Professional, più:',
for: 'Per team di medie dimensioni',
},
enterprise: {
@@ -136,15 +126,15 @@ const translation = {
'Ottieni tutte le capacità e il supporto per sistemi mission-critical su larga scala.',
includesTitle: 'Tutto nel piano Team, più:',
features: {
6: 'Sicurezza e Controlli Avanzati',
2: 'Funzionalità esclusive per le imprese',
3: 'Spazi di lavoro multipli e gestione aziendale',
2: 'Funzionalità esclusive per le aziende',
1: 'Autorizzazione Licenza Commerciale',
5: 'SLA negoziati dai partner Dify',
4: 'SSO',
8: 'Supporto Tecnico Professionale',
5: 'SLA negoziati da Dify Partners',
0: 'Soluzioni di distribuzione scalabili di livello enterprise',
7: 'Aggiornamenti e manutenzione di Dify ufficialmente',
1: 'Autorizzazione alla Licenza Commerciale',
3: 'Gestione di più spazi di lavoro e imprese',
6: 'Sicurezza e controlli avanzati',
8: 'Supporto tecnico professionale',
7: 'Aggiornamenti e manutenzione da parte di Dify ufficialmente',
0: 'Soluzioni di distribuzione scalabili di livello aziendale',
},
price: 'Personalizzato',
for: 'Per team di grandi dimensioni',
@@ -153,9 +143,9 @@ const translation = {
},
community: {
features: {
1: 'Spazio di Lavoro Unico',
2: 'Rispetta la Licenza Open Source di Dify',
0: 'Tutte le funzionalità principali rilasciate sotto il repository pubblico',
1: 'Area di lavoro singola',
2: 'Conforme alla licenza Open Source Dify',
0: 'Tutte le funzionalità principali rilasciate nel repository pubblico',
},
name: 'Comunità',
btnText: 'Inizia con la comunità',
@@ -166,10 +156,10 @@ const translation = {
},
premium: {
features: {
0: 'Affidabilità autogestita da vari fornitori di cloud',
3: 'Supporto prioritario via Email e Chat',
2: 'Personalizzazione del logo e del marchio dell\'app web',
1: 'Spazio di Lavoro Unico',
3: 'Supporto prioritario via e-mail e chat',
1: 'Area di lavoro singola',
2: 'Personalizzazione del logo e del marchio WebApp',
0: 'Affidabilità autogestita da vari fornitori di servizi cloud',
},
name: 'Premium',
priceTip: 'Basato su Cloud Marketplace',
@@ -186,8 +176,6 @@ const translation = {
fullSolution: 'Aggiorna il tuo piano per ottenere più spazio.',
},
apps: {
fullTipLine1: 'Aggiorna il tuo piano per',
fullTipLine2: 'creare più app.',
fullTip1des: 'Hai raggiunto il limite di costruzione delle app su questo piano.',
fullTip2des: 'Si consiglia di disinstallare le applicazioni inattive per liberare spazio, o contattarci.',
contactUs: 'Contattaci',


@@ -17,7 +17,7 @@ const translation = {
bulkImport: '一括インポート',
bulkExport: '一括エクスポート',
clearAll: 'すべて削除',
clearAllConfirm: 'すべての寸法を削除?',
clearAllConfirm: 'すべての注釈を削除しますか?',
},
},
editModal: {


@@ -565,7 +565,7 @@ const translation = {
overview: '監視',
promptEng: 'オーケストレート',
apiAccess: 'API アクセス',
logAndAnn: 'ログ&アナウンス',
logAndAnn: 'ログ&注釈',
logs: 'ログ',
},
environment: {


@@ -995,6 +995,7 @@ const translation = {
noLastRunFound: '以前の実行が見つかりませんでした。',
copyLastRunError: '最後の実行の入力をコピーできませんでした',
noMatchingInputsFound: '前回の実行から一致する入力が見つかりませんでした。',
lastRunInputsCopied: '前回の実行から{{count}}個の入力をコピーしました',
},
}


@@ -24,21 +24,14 @@ const translation = {
contractSales: 'Skontaktuj się z działem sprzedaży',
contractOwner: 'Skontaktuj się z zarządcą zespołu',
startForFree: 'Zacznij za darmo',
getStartedWith: 'Rozpocznij z ',
contactSales: 'Kontakt z działem sprzedaży',
talkToSales: 'Porozmawiaj z działem sprzedaży',
modelProviders: 'Dostawcy modeli',
teamMembers: 'Członkowie zespołu',
buildApps: 'Twórz aplikacje',
vectorSpace: 'Przestrzeń wektorowa',
vectorSpaceBillingTooltip:
'Każdy 1MB może przechowywać około 1,2 miliona znaków z wektoryzowanych danych (szacowane na podstawie OpenAI Embeddings, różni się w zależności od modelu).',
vectorSpaceTooltip:
'Przestrzeń wektorowa jest systemem pamięci długoterminowej wymaganym dla LLM, aby zrozumieć Twoje dane.',
documentsUploadQuota: 'Limit przesyłanych dokumentów',
documentProcessingPriority: 'Priorytet przetwarzania dokumentów',
documentProcessingPriorityTip:
'Dla wyższego priorytetu przetwarzania dokumentów, ulepsz swój plan.',
documentProcessingPriorityUpgrade:
'Przetwarzaj więcej danych z większą dokładnością i w szybszym tempie.',
priority: {
@@ -112,21 +105,18 @@ const translation = {
sandbox: {
name: 'Sandbox',
description: '200 razy darmowa próba GPT',
includesTitle: 'Zawiera:',
for: 'Darmowy okres próbny podstawowych funkcji',
},
professional: {
name: 'Profesjonalny',
description:
'Dla osób fizycznych i małych zespołów, aby odblokować więcej mocy w przystępnej cenie.',
includesTitle: 'Wszystko w darmowym planie, plus:',
for: 'Dla niezależnych deweloperów/małych zespołów',
},
team: {
name: 'Zespół',
description:
'Współpracuj bez ograniczeń i ciesz się najwyższą wydajnością.',
includesTitle: 'Wszystko w planie Profesjonalnym, plus:',
for: 'Dla średniej wielkości zespołów',
},
enterprise: {
@@ -135,15 +125,15 @@ const translation = {
'Uzyskaj pełne możliwości i wsparcie dla systemów o kluczowym znaczeniu dla misji.',
includesTitle: 'Wszystko w planie Zespołowym, plus:',
features: {
3: 'Wiele przestrzeni roboczych i zarządzanie przedsiębiorstwem',
5: 'Wynegocjowane SLA przez Dify Partners',
0: 'Rozwiązania do wdrożeń na dużą skalę klasy przedsiębiorstw',
8: 'Profesjonalne wsparcie techniczne',
2: 'Ekskluzywne funkcje przedsiębiorstwa',
6: 'Zaawansowane zabezpieczenia i kontrola',
7: 'Aktualizacje i konserwacja przez Dify Oficjalnie',
4: 'SSO',
2: 'Wyjątkowe funkcje dla przedsiębiorstw',
7: 'Aktualizacje i konserwacja przez Dify oficjalnie',
4: 'Usługi rejestracji jednokrotnej',
1: 'Autoryzacja licencji komercyjnej',
0: 'Skalowalne rozwiązania wdrożeniowe klasy korporacyjnej',
5: 'Umowy SLA wynegocjowane przez Dify Partners',
8: 'Profesjonalne wsparcie techniczne',
3: 'Wiele przestrzeni roboczych i zarządzanie przedsiębiorstwem',
6: 'Zaawansowane zabezpieczenia i kontrola',
},
priceTip: 'Tylko roczne fakturowanie',
btnText: 'Skontaktuj się z działem sprzedaży',
@@ -152,9 +142,9 @@ const translation = {
},
community: {
features: {
0: 'Wszystkie funkcje podstawowe wydane w publicznym repozytorium',
1: 'Jedno Miejsce Pracy',
2: 'Zgodne z licencją Dify Open Source',
1: 'Pojedyncza przestrzeń robocza',
2: 'Zgodny z licencją Dify Open Source',
0: 'Wszystkie podstawowe funkcje udostępnione w repozytorium publicznym',
},
includesTitle: 'Darmowe funkcje:',
name: 'Społeczność',
@@ -165,10 +155,10 @@ const translation = {
},
premium: {
features: {
0: 'Samozarządzana niezawodność różnych dostawców chmury',
1: 'Jedno miejsce pracy',
3: 'Priorytetowe wsparcie przez e-mail i czat',
2: 'Logo aplikacji internetowej i dostosowanie marki',
1: 'Pojedyncza przestrzeń robocza',
2: 'Personalizacja logo i brandingu aplikacji internetowej',
3: 'Priorytetowa pomoc techniczna przez e-mail i czat',
0: 'Niezawodność samodzielnego zarządzania przez różnych dostawców usług w chmurze',
},
description: 'Dla średnich organizacji i zespołów',
for: 'Dla średnich organizacji i zespołów',
@@ -185,8 +175,6 @@ const translation = {
fullSolution: 'Ulepsz swój plan, aby uzyskać więcej miejsca.',
},
apps: {
fullTipLine1: 'Ulepsz swój plan, aby',
fullTipLine2: 'tworzyć więcej aplikacji.',
fullTip1des: 'Osiągnąłeś limit tworzenia aplikacji w tym planie.',
fullTip1: 'Zaktualizuj, aby stworzyć więcej aplikacji',
fullTip2: 'Osiągnięto limit planu',


@@ -22,17 +22,13 @@ const translation = {
currentPlan: 'Plano Atual',
contractOwner: 'Entre em contato com o gerente da equipe',
startForFree: 'Comece de graça',
getStartedWith: 'Comece com',
contactSales: 'Fale com a equipe de Vendas',
talkToSales: 'Fale com a equipe de Vendas',
modelProviders: 'Fornecedores de Modelos',
teamMembers: 'Membros da Equipe',
buildApps: 'Construir Aplicações',
vectorSpace: 'Espaço Vetorial',
vectorSpaceBillingTooltip: 'Cada 1MB pode armazenar cerca de 1,2 milhão de caracteres de dados vetorizados (estimado usando OpenAI Embeddings, varia entre os modelos).',
vectorSpaceTooltip: 'O Espaço Vetorial é o sistema de memória de longo prazo necessário para que LLMs compreendam seus dados.',
documentProcessingPriority: 'Prioridade no Processamento de Documentos',
documentProcessingPriorityTip: 'Para maior prioridade no processamento de documentos, faça o upgrade do seu plano.',
documentProcessingPriorityUpgrade: 'Processe mais dados com maior precisão e velocidade.',
priority: {
'standard': 'Padrão',
@@ -53,7 +49,6 @@ const translation = {
dedicatedAPISupport: 'Suporte dedicado à API',
customIntegration: 'Integração e suporte personalizados',
ragAPIRequest: 'Solicitações API RAG',
agentModel: 'Modelo de Agente',
workflow: 'Fluxo de trabalho',
llmLoadingBalancing: 'Balanceamento de carga LLM',
bulkUpload: 'Upload em massa de documentos',
@@ -75,7 +70,6 @@ const translation = {
ragAPIRequestTooltip: 'Refere-se ao número de chamadas de API que invocam apenas as capacidades de processamento da base de conhecimento do Dify.',
receiptInfo: 'Somente proprietários e administradores de equipe podem se inscrever e visualizar informações de cobrança',
customTools: 'Ferramentas personalizadas',
documentsUploadQuota: 'Cota de upload de documentos',
annotationQuota: 'Cota de anotação',
contractSales: 'Entre em contato com a equipe de vendas',
unavailable: 'Indisponível',
@@ -104,19 +98,16 @@ const translation = {
sandbox: {
name: 'Sandbox',
description: '200 vezes GPT de teste gratuito',
includesTitle: 'Inclui:',
for: 'Teste gratuito das capacidades principais',
},
professional: {
name: 'Profissional',
description: 'Para indivíduos e pequenas equipes desbloquearem mais poder de forma acessível.',
includesTitle: 'Tudo no plano gratuito, além de:',
for: 'Para Desenvolvedores Independentes/Pequenas Equipes',
},
team: {
name: 'Equipe',
description: 'Colabore sem limites e aproveite o desempenho de primeira linha.',
includesTitle: 'Tudo no plano Profissional, além de:',
for: 'Para Equipes de Médio Porte',
},
enterprise: {
@@ -124,15 +115,15 @@ const translation = {
description: 'Obtenha capacidades completas e suporte para sistemas críticos em larga escala.',
includesTitle: 'Tudo no plano Equipe, além de:',
features: {
6: 'Segurança e Controles Avançados',
7: 'Atualizações e Manutenção por Dify Oficialmente',
5: 'Acordos de Nível de Serviço negociados pelos Parceiros Dify',
1: 'Autorização de Licença Comercial',
8: 'Suporte Técnico Profissional',
3: 'Vários espaços de trabalho e gerenciamento corporativo',
2: 'Recursos exclusivos da empresa',
6: 'Segurança e controles avançados',
4: 'SSO',
2: 'Recursos Exclusivos da Empresa',
3: 'Múltiplos Espaços de Trabalho e Gestão Empresarial',
0: 'Soluções de Implantação Escaláveis de Nível Empresarial',
8: 'Suporte Técnico Profissional',
0: 'Soluções de implantação escaláveis de nível empresarial',
7: 'Atualizações e manutenção por Dify oficialmente',
1: 'Autorização de Licença Comercial',
5: 'SLAs negociados pela Dify Partners',
},
btnText: 'Contate Vendas',
priceTip: 'Faturamento Anual Apenas',
@@ -141,9 +132,9 @@ const translation = {
},
community: {
features: {
1: 'Espaço de Trabalho Único',
0: 'Todos os recursos principais lançados sob o repositório público',
2: 'Cumpre a Licença de Código Aberto Dify',
0: 'Todos os principais recursos lançados no repositório público',
2: 'Está em conformidade com a licença de código aberto Dify',
1: 'Espaço de trabalho individual',
},
name: 'Comunidade',
description: 'Para Usuários Individuais, Pequenas Equipes ou Projetos Não Comerciais',
@@ -154,10 +145,10 @@ const translation = {
},
premium: {
features: {
1: 'Espaço de Trabalho Único',
3: 'Suporte prioritário por e-mail e chat',
2: 'Customização de Logo e Branding do WebApp',
2: 'Personalização do logotipo e da marca do WebApp',
1: 'Espaço de trabalho individual',
0: 'Confiabilidade autogerenciada por vários provedores de nuvem',
3: 'Suporte prioritário por e-mail e bate-papo',
},
includesTitle: 'Tudo da Comunidade, além de:',
for: 'Para organizações e equipes de médio porte',
@@ -174,8 +165,6 @@ const translation = {
fullSolution: 'Faça o upgrade do seu plano para obter mais espaço.',
},
apps: {
fullTipLine1: 'Faça o upgrade do seu plano para',
fullTipLine2: 'construir mais aplicativos.',
fullTip1: 'Atualize para criar mais aplicativos',
fullTip2: 'Limite do plano alcançado',
fullTip1des: 'Você atingiu o limite de criar aplicativos neste plano.',

View File

@@ -23,18 +23,13 @@ const translation = {
contractSales: 'Contactați vânzările',
contractOwner: 'Contactați managerul echipei',
startForFree: 'Începe gratuit',
getStartedWith: 'Începe cu ',
contactSales: 'Contactați vânzările',
talkToSales: 'Vorbiți cu vânzările',
modelProviders: 'Furnizori de modele',
teamMembers: 'Membri ai echipei',
buildApps: 'Construiește aplicații',
vectorSpace: 'Spațiu vectorial',
vectorSpaceBillingTooltip: 'Fiecare 1MB poate stoca aproximativ 1,2 milioane de caractere de date vectorizate (estimat folosind OpenAI Embeddings, variază în funcție de modele).',
vectorSpaceTooltip: 'Spațiul vectorial este sistemul de memorie pe termen lung necesar pentru ca LLM-urile să înțeleagă datele dvs.',
documentsUploadQuota: 'Cotă de încărcare a documentelor',
documentProcessingPriority: 'Prioritatea procesării documentelor',
documentProcessingPriorityTip: 'Pentru o prioritate mai mare a procesării documentelor, vă rugăm să actualizați planul.',
documentProcessingPriorityUpgrade: 'Procesați mai multe date cu o acuratețe mai mare și la viteze mai rapide.',
priority: {
'standard': 'Standard',
@@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'Sandbox',
description: '200 de încercări gratuite GPT',
includesTitle: 'Include:',
for: 'Proba gratuită a capacităților de bază',
},
professional: {
name: 'Professional',
description: 'Pentru persoane fizice și echipe mici pentru a debloca mai multă putere la un preț accesibil.',
includesTitle: 'Tot ce este în planul gratuit, plus:',
for: 'Pentru dezvoltatori independenți / echipe mici',
},
team: {
name: 'Echipă',
description: 'Colaborați fără limite și bucurați-vă de performanțe de top.',
includesTitle: 'Tot ce este în planul Professional, plus:',
for: 'Pentru echipe de dimensiuni medii',
},
enterprise: {
@@ -123,15 +115,15 @@ const translation = {
description: 'Obțineți capacități și asistență complete pentru sisteme critice la scară largă.',
includesTitle: 'Tot ce este în planul Echipă, plus:',
features: {
3: 'Multiple Spații de lucru și Management Enterprise',
6: 'Securitate avansată și control',
6: 'Securitate și controale avansate',
1: 'Autorizare licență comercială',
2: 'Funcții exclusive pentru întreprinderi',
0: 'Soluții de implementare scalabile la nivel de întreprindere',
5: 'SLA-uri negociate de partenerii Dify',
3: 'Mai multe spații de lucru și managementul întreprinderii',
7: 'Actualizări și întreținere de către Dify oficial',
8: 'Asistență tehnică profesională',
4: 'SSO',
7: 'Actualizări și întreținere de către Dify Oficial',
1: 'Autorizare pentru licență comercială',
5: 'SLA-uri negociate de partenerii Dify',
0: 'Soluții de desfășurare scalabile de nivel enterprise',
},
for: 'Pentru echipe de mari dimensiuni',
price: 'Personalizat',
@@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
2: 'Se conformează Licenței Open Source Dify',
0: 'Toate caracteristicile de bază lansate în depozitul public',
2: 'Respectă licența Dify Open Source',
1: 'Spațiu de lucru unic',
0: 'Toate funcțiile de bază lansate sub depozitul public',
},
description: 'Pentru utilizatori individuali, echipe mici sau proiecte necomerciale',
btnText: 'Începe cu Comunitatea',
@@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
3: 'Asistență prioritară prin e-mail și chat',
1: 'Spațiu de lucru unic',
0: 'Fiabilitate autogestionată de diferiți furnizori de cloud',
2: 'Personalizarea logo-ului și branding-ului aplicației web',
3: 'Suport prioritar prin email și chat',
0: 'Fiabilitate autogestionată de diverși furnizori de cloud',
2: 'Personalizarea logo-ului și brandingului WebApp',
},
btnText: 'Obține Premium în',
description: 'Pentru organizații și echipe de dimensiuni medii',
@@ -173,8 +165,6 @@ const translation = {
fullSolution: 'Actualizați-vă planul pentru a obține mai mult spațiu.',
},
apps: {
fullTipLine1: 'Actualizați-vă planul pentru a',
fullTipLine2: 'construi mai multe aplicații.',
fullTip2des: 'Se recomandă curățarea aplicațiilor inactive pentru a elibera resurse, sau contactați-ne.',
fullTip2: 'Limita planului a fost atinsă',
fullTip1des: 'Ați atins limita de aplicații construite pe acest plan',


@@ -23,19 +23,14 @@ const translation = {
contractSales: 'Связаться с отделом продаж',
contractOwner: 'Связаться с руководителем команды',
startForFree: 'Начать бесплатно',
getStartedWith: 'Начать с ',
contactSales: 'Связаться с отделом продаж',
talkToSales: 'Поговорить с отделом продаж',
modelProviders: 'Поставщики моделей',
teamMembers: 'Участники команды',
annotationQuota: 'Квота аннотаций',
buildApps: 'Создать приложения',
vectorSpace: 'Векторное пространство',
vectorSpaceBillingTooltip: 'Каждый 1 МБ может хранить около 1,2 миллиона символов векторизованных данных (оценка с использованием Embeddings OpenAI, варьируется в зависимости от модели).',
vectorSpaceTooltip: 'Векторное пространство - это система долговременной памяти, необходимая LLM для понимания ваших данных.',
documentsUploadQuota: 'Квота загрузки документов',
documentProcessingPriority: 'Приоритет обработки документов',
documentProcessingPriorityTip: 'Для более высокого приоритета обработки документов, пожалуйста, обновите свой тарифный план.',
documentProcessingPriorityUpgrade: 'Обрабатывайте больше данных с большей точностью и на более высоких скоростях.',
priority: {
'standard': 'Стандартный',
@@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'Песочница',
description: '200 бесплатных пробных использований GPT',
includesTitle: 'Включает:',
for: 'Бесплатная пробная версия основных возможностей',
},
professional: {
name: 'Профессиональный',
description: 'Для частных лиц и небольших команд, чтобы разблокировать больше возможностей по доступной цене.',
includesTitle: 'Все в бесплатном плане, плюс:',
for: 'Для независимых разработчиков/малых команд',
},
team: {
name: 'Команда',
description: 'Сотрудничайте без ограничений и наслаждайтесь высочайшей производительностью.',
includesTitle: 'Все в профессиональном плане, плюс:',
for: 'Для команд среднего размера',
},
enterprise: {
@@ -123,15 +115,15 @@ const translation = {
description: 'Получите полный набор возможностей и поддержку для крупномасштабных критически важных систем.',
includesTitle: 'Все в командном плане, плюс:',
features: {
7: 'Обновления и обслуживание от Dify официально',
4: 'ССО',
5: 'Согласованные SLA от Dify Partners',
8: 'Профессиональная техническая поддержка',
6: 'Современная безопасность и контроль',
2: 'Эксклюзивные функции для предприятий',
1: 'Коммерческая лицензия',
3: 'Множественные рабочие области и управление предприятием',
0: 'Решения для масштабируемого развертывания корпоративного уровня',
5: 'Согласованные Соглашения об Уровне Услуг от Dify Partners',
2: 'Эксклюзивные корпоративные функции',
6: 'Расширенная безопасность и контроль',
7: 'Обновления и обслуживание от Dify официально',
3: 'Несколько рабочих пространств и управление предприятием',
0: 'Масштабируемые решения для развертывания корпоративного уровня',
1: 'Разрешение на коммерческую лицензию',
},
price: 'Пользовательский',
priceTip: 'Только годовая подписка',
@@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
0: 'Все основные функции выпущены в публичном репозитории',
1: 'Единое рабочее пространство',
2: 'Соблюдает Лицензию на открытое программное обеспечение Dify',
2: 'Соответствует лицензии Dify с открытым исходным кодом',
0: 'Все основные функции выпущены в общедоступном репозитории',
},
name: 'Сообщество',
btnText: 'Начните с сообщества',
@@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
3: 'Приоритетная поддержка по электронной почте и чату',
2: 'Настройка логотипа и брендинга WebApp',
1: 'Единое рабочее пространство',
2: 'Настройка логотипа и брендинга веб-приложения',
0: 'Самостоятельное управление надежностью различными облачными провайдерами',
3: 'Приоритетная поддержка по электронной почте и в чате',
0: 'Самостоятельное управление надежностью от различных поставщиков облачных услуг',
},
description: 'Для средних организаций и команд',
includesTitle: 'Всё из Сообщества, плюс:',
@@ -173,8 +165,6 @@ const translation = {
fullSolution: 'Обновите свой тарифный план, чтобы получить больше места.',
},
apps: {
fullTipLine1: 'Обновите свой тарифный план, чтобы',
fullTipLine2: 'создавать больше приложений.',
fullTip2des: 'Рекомендуется удалить неактивные приложения, чтобы освободить место, или свяжитесь с нами.',
fullTip2: 'Достигнут лимит плана',
contactUs: 'Свяжитесь с нами',


@@ -23,19 +23,14 @@ const translation = {
contractSales: 'Kontaktirajte prodajo',
contractOwner: 'Kontaktirajte upravitelja ekipe',
startForFree: 'Začnite brezplačno',
getStartedWith: 'Začnite z ',
contactSales: 'Kontaktirajte prodajo',
talkToSales: 'Pogovorite se s prodajo',
modelProviders: 'Ponudniki modelov',
teamMembers: 'Člani ekipe',
annotationQuota: 'Kvote za označevanje',
buildApps: 'Gradite aplikacije',
vectorSpace: 'Prostor za vektorje',
vectorSpaceBillingTooltip: 'Vsak 1 MB lahko shrani približno 1,2 milijona znakov vektoriziranih podatkov (ocenjeno z uporabo OpenAI Embeddings, odvisno od modelov).',
vectorSpaceTooltip: 'Prostor za vektorje je dolgoročni pomnilniški sistem, potreben za to, da LLM-ji razumejo vaše podatke.',
documentsUploadQuota: 'Kvote za nalaganje dokumentov',
documentProcessingPriority: 'Prioriteta obdelave dokumentov',
documentProcessingPriorityTip: 'Za višjo prioriteto obdelave dokumentov nadgradite svoj načrt.',
documentProcessingPriorityUpgrade: 'Obdelujte več podatkov z večjo natančnostjo in hitrostjo.',
priority: {
'standard': 'Standard',
@@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'Peskovnik',
description: '200 brezplačnih poskusov GPT',
includesTitle: 'Vključuje:',
for: 'Brezplačno preizkušanje osnovnih zmogljivosti',
},
professional: {
name: 'Profesionalni',
description: 'Za posameznike in male ekipe, da odklenete več zmogljivosti po ugodni ceni.',
includesTitle: 'Vse v brezplačnem načrtu, plus:',
for: 'Za neodvisne razvijalce/male ekipe',
},
team: {
name: 'Ekipa',
description: 'Sodelujte brez omejitev in uživajte v vrhunski zmogljivosti.',
includesTitle: 'Vse v profesionalnem načrtu, plus:',
for: 'Za srednje velike ekipe',
},
enterprise: {
@@ -123,15 +115,15 @@ const translation = {
description: 'Pridobite vse zmogljivosti in podporo za velike sisteme kritične za misijo.',
includesTitle: 'Vse v načrtu Ekipa, plus:',
features: {
5: 'Pogajali smo se o SLAs s partnerji Dify',
4: 'SSO',
0: 'Rešitve za razširljivo uvedbo na ravni podjetij',
1: 'Avtorizacija za komercialno licenco',
0: 'Prilagodljive rešitve za uvajanje na ravni podjetij',
2: 'Ekskluzivne funkcije za podjetja',
7: 'Posodobitve in vzdrževanje s strani Dify uradno',
3: 'Več delovnih prostorov in upravljanje podjetij',
7: 'Posodobitve in vzdrževanje s strani Dify Official',
8: 'Strokovna tehnična podpora',
1: 'Dovoljenje za komercialno licenco',
3: 'Več delovnih prostorov in upravljanje podjetja',
5: 'Dogovorjene pogodbe o ravni storitev s strani Dify Partners',
6: 'Napredna varnost in nadzor',
8: 'Profesionalna tehnična podpora',
4: 'SSO',
},
priceTip: 'Letno zaračunavanje samo',
price: 'Po meri',
@@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
2: 'Upošteva Dify odprtokodno licenco',
0: 'Vse ključne funkcije so bile objavljene v javnem repozitoriju',
1: 'Enotno delovno okolje',
1: 'En delovni prostor',
0: 'Vse osnovne funkcije, izdane v javnem repozitoriju',
2: 'Skladen z odprtokodno licenco Dify',
},
includesTitle: 'Brezplačne funkcije:',
price: 'Brezplačno',
@@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
2: 'Prilagoditev logotipa in blagovne znamke spletne aplikacije',
1: 'Enotno delovno okolje',
0: 'Samoobvladovana zanesljivost različnih ponudnikov oblačnih storitev',
3: 'Prednostna e-pošta in podpora za klepet',
1: 'En delovni prostor',
3: 'Prednostna podpora po e-pošti in klepetu',
2: 'Prilagajanje logotipa in blagovne znamke WebApp',
0: 'Samostojna zanesljivost različnih ponudnikov storitev v oblaku',
},
name: 'Premium',
priceTip: 'Na podlagi oblaka Marketplace',
@@ -173,8 +165,6 @@ const translation = {
fullSolution: 'Nadgradite svoj načrt za več prostora.',
},
apps: {
fullTipLine1: 'Nadgradite svoj načrt, da',
fullTipLine2: 'gradite več aplikacij.',
fullTip1des: 'Dosegli ste omejitev za izdelavo aplikacij v tem načrtu.',
fullTip1: 'Nadgradite za ustvarjanje več aplikacij',
fullTip2: 'Dosežena meja načrta',


@@ -23,19 +23,14 @@ const translation = {
contractSales: 'ติดต่อฝ่ายขาย',
contractOwner: 'ติดต่อผู้จัดการทีม',
startForFree: 'เริ่มฟรี',
getStartedWith: 'เริ่มต้นใช้งาน',
contactSales: 'ติดต่อฝ่ายขาย',
talkToSales: 'พูดคุยกับฝ่ายขาย',
modelProviders: 'ผู้ให้บริการโมเดล',
teamMembers: 'สมาชิกในทีม',
annotationQuota: 'โควต้าคําอธิบายประกอบ',
buildApps: 'สร้างแอพ',
vectorSpace: 'พื้นที่เวกเตอร์',
vectorSpaceBillingTooltip: 'แต่ละ 1MB สามารถจัดเก็บข้อมูลแบบเวกเตอร์ได้ประมาณ 1.2 ล้านอักขระ (โดยประมาณโดยใช้ OpenAI Embeddings แตกต่างกันไปตามรุ่น)',
vectorSpaceTooltip: 'Vector Space เป็นระบบหน่วยความจําระยะยาวที่จําเป็นสําหรับ LLM ในการทําความเข้าใจข้อมูลของคุณ',
documentsUploadQuota: 'โควต้าการอัปโหลดเอกสาร',
documentProcessingPriority: 'ลําดับความสําคัญในการประมวลผลเอกสาร',
documentProcessingPriorityTip: 'สําหรับลําดับความสําคัญในการประมวลผลเอกสารที่สูงขึ้น โปรดอัปเกรดแผนของคุณ',
documentProcessingPriorityUpgrade: 'ประมวลผลข้อมูลได้มากขึ้นด้วยความแม่นยําที่สูงขึ้นด้วยความเร็วที่เร็วขึ้น',
priority: {
'standard': 'มาตรฐาน',
@@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'กระบะทราย',
description: 'ทดลองใช้ GPT ฟรี 200 ครั้ง',
includesTitle: 'มี:',
for: 'ทดลองใช้ฟรีของความสามารถหลัก',
},
professional: {
name: 'มืออาชีพ',
description: 'สําหรับบุคคลและทีมขนาดเล็กเพื่อปลดล็อกพลังงานมากขึ้นในราคาย่อมเยา',
includesTitle: 'ทุกอย่างในแผนฟรี รวมถึง:',
for: 'สำหรับนักพัฒนาที่เป็นอิสระ/ทีมขนาดเล็ก',
},
team: {
name: 'ทีม',
description: 'ทํางานร่วมกันอย่างไร้ขีดจํากัดและเพลิดเพลินไปกับประสิทธิภาพระดับสูงสุด',
includesTitle: 'ทุกอย่างในแผน Professional รวมถึง:',
for: 'สำหรับทีมขนาดกลาง',
},
enterprise: {
@@ -123,15 +115,15 @@ const translation = {
description: 'รับความสามารถและการสนับสนุนเต็มรูปแบบสําหรับระบบที่สําคัญต่อภารกิจขนาดใหญ่',
includesTitle: 'ทุกอย่างในแผนทีม รวมถึง:',
features: {
8: 'การสนับสนุนทางเทคนิคระดับมืออาชีพ',
2: 'คุณสมบัติพิเศษขององค์กร',
3: 'หลายพื้นที่ทำงานและการบริหารจัดการองค์กร',
4: 'SSO',
6: 'ความปลอดภัยและการควบคุมขั้นสูง',
5: 'เจรจาข้อตกลงบริการ (SLA) โดย Dify Partners',
7: 'การอัปเดตและการบำรุงรักษาโดย Dify อย่างเป็นทางการ',
1: 'ใบอนุญาตการใช้เชิงพาณิชย์',
0: 'โซลูชันการปรับใช้ที่มีขนาดใหญ่และมีคุณภาพระดับองค์กร',
2: 'คุณสมบัติพิเศษสําหรับองค์กร',
5: 'SLA ที่เจรจาโดย Dify Partners',
1: 'การอนุญาตใบอนุญาตเชิงพาณิชย์',
8: 'การสนับสนุนด้านเทคนิคอย่างมืออาชีพ',
0: 'โซลูชันการปรับใช้ที่ปรับขนาดได้ระดับองค์กร',
7: 'การอัปเดตและบํารุงรักษาโดย Dify อย่างเป็นทางการ',
3: 'พื้นที่ทํางานหลายแห่งและการจัดการองค์กร',
6: 'การรักษาความปลอดภัยและการควบคุมขั้นสูง',
},
btnText: 'ติดต่อฝ่ายขาย',
price: 'ที่กำหนดเอง',
@@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
2: 'ปฏิบัติตามใบอนุญาตโอเพ่นซอร์สของ Dify',
0: 'ฟีเจอร์หลักทั้งหมดถูกปล่อยออกภายใต้ที่เก็บสาธารณะ',
1: 'พื้นที่ทำงานเดียว',
1: 'พื้นที่ทํางานเดียว',
2: 'สอดคล้องกับใบอนุญาตโอเพ่นซอร์ส Dify',
0: 'คุณสมบัติหลักทั้งหมดที่เผยแพร่ภายใต้ที่เก็บสาธารณะ',
},
name: 'ชุมชน',
price: 'ฟรี',
@@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
3: 'การสนับสนุนทางอีเมลและแชทที่มีความสำคัญ',
1: 'พื้นที่ทำงานเดียว',
2: 'การปรับแต่งโลโก้และแบรนดิ้งของเว็บแอป',
0: 'การจัดการความน่าเชื่อถือด้วยตนเองโดยผู้ให้บริการคลาวด์ต่าง ๆ',
2: 'โลโก้ WebApp และการปรับแต่งแบรนด์',
3: 'การสนับสนุนทางอีเมลและแชทลําดับความสําคัญ',
1: 'พื้นที่ทํางานเดียว',
0: 'ความน่าเชื่อถือที่จัดการด้วยตนเองโดยผู้ให้บริการคลาวด์ต่างๆ',
},
priceTip: 'อิงตามตลาดคลาวด์',
for: 'สำหรับองค์กรและทีมขนาดกลาง',
@@ -173,8 +165,6 @@ const translation = {
fullSolution: 'อัปเกรดแผนของคุณเพื่อเพิ่มพื้นที่',
},
apps: {
fullTipLine1: 'อัปเกรดแผนของคุณเป็น',
fullTipLine2: 'สร้างแอปเพิ่มเติม',
contactUs: 'ติดต่อเรา',
fullTip2: 'ถึงขีดจำกัดของแผนแล้ว',
fullTip1: 'อัปเกรดเพื่อสร้างแอปเพิ่มเติม',


@@ -23,19 +23,14 @@ const translation = {
contractSales: 'Satışla iletişime geçin',
contractOwner: 'Takım yöneticisine başvurun',
startForFree: 'Ücretsiz Başla',
getStartedWith: 'ile başlayın',
contactSales: 'Satışlarla İletişime Geçin',
talkToSales: 'Satışlarla Konuşun',
modelProviders: 'Model Sağlayıcılar',
teamMembers: 'Takım Üyeleri',
annotationQuota: 'Ek Açıklama Kotası',
buildApps: 'Uygulamalar Oluştur',
vectorSpace: 'Vektör Alanı',
vectorSpaceBillingTooltip: 'Her 1MB yaklaşık 1.2 milyon karakter vektörize veri depolayabilir (OpenAI Embeddings ile tahmin edilmiştir, modellere göre farklılık gösterebilir).',
vectorSpaceTooltip: 'Vektör Alanı, LLM\'lerin verilerinizi anlaması için gerekli uzun süreli hafıza sistemidir.',
documentsUploadQuota: 'Doküman Yükleme Kotası',
documentProcessingPriority: 'Doküman İşleme Önceliği',
documentProcessingPriorityTip: 'Daha yüksek doküman işleme önceliği için planınızı yükseltin.',
documentProcessingPriorityUpgrade: 'Daha fazla veriyi daha yüksek doğrulukla ve daha hızlı işleyin.',
priority: {
'standard': 'Standart',
@@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'Sandbox',
description: '200 kez GPT ücretsiz deneme',
includesTitle: 'İçerdikleri:',
for: 'Temel Yeteneklerin Ücretsiz Denemesi',
},
professional: {
name: 'Profesyonel',
description: 'Bireyler ve küçük takımlar için daha fazla güç açın.',
includesTitle: 'Ücretsiz plandaki her şey, artı:',
for: 'Bağımsız Geliştiriciler/Küçük Takımlar için',
},
team: {
name: 'Takım',
description: 'Sınırsız işbirliği ve en üst düzey performans.',
includesTitle: 'Profesyonel plandaki her şey, artı:',
for: 'Orta Boyutlu Takımlar İçin',
},
enterprise: {
@@ -123,15 +115,15 @@ const translation = {
description: 'Büyük ölçekli kritik sistemler için tam yetenekler ve destek.',
includesTitle: 'Takım plandaki her şey, artı:',
features: {
3: 'Birden Fazla Çalışma Alanı ve Kurumsal Yönetim',
8: 'Profesyonel Teknik Destek',
4: 'SSO',
2: 'Özel Şirket Özellikleri',
1: 'Ticari Lisans Yetkilendirmesi',
7: 'Dify Tarafından Resmi Güncellemeler ve Bakım',
5: 'Dify Ortakları tarafından müzakere edilen SLA\'lar',
6: 'Gelişmiş Güvenlik ve Kontroller',
5: 'Dify Partners tarafından müzakere edilen SLA\'lar',
4: 'SSO',
2: 'Özel Kurumsal Özellikler',
0: 'Kurumsal Düzeyde Ölçeklenebilir Dağıtım Çözümleri',
7: 'Resmi olarak Dify tarafından Güncellemeler ve Bakım',
3: 'Çoklu Çalışma Alanları ve Kurumsal Yönetim',
},
priceTip: 'Yıllık Faturalama Sadece',
for: 'Büyük boyutlu Takımlar için',
@@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
1: 'Tek İş Alanı',
0: 'Tüm Temel Özellikler Kamu Deposu Altında Yayınlandı',
2: 'Dify Açık Kaynak Lisansına uyar',
1: 'Tek Çalışma Alanı',
0: 'Genel depo altında yayınlanan tüm temel özellikler',
2: 'Dify Açık Kaynak Lisansı ile uyumludur',
},
price: 'Ücretsiz',
includesTitle: 'Ücretsiz Özellikler:',
@@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
1: 'Tek İş Alanı',
0: 'Çeşitli Bulut Sağlayıcıları Tarafından Kendiliğinden Yönetilen Güvenilirlik',
3: 'Öncelikli Email ve Sohbet Desteği',
2: 'Web Uygulaması Logo ve Markalaşma Özelleştirmesi',
1: 'Tek Çalışma Alanı',
0: 'Çeşitli Bulut Sağlayıcıları Tarafından Kendi Kendini Yöneten Güvenilirlik',
2: 'WebApp Logosu ve Marka Özelleştirmesi',
3: 'Öncelikli E-posta ve Sohbet Desteği',
},
name: 'Premium',
includesTitle: 'Topluluktan her şey, artı:',
@@ -173,8 +165,6 @@ const translation = {
fullSolution: 'Daha fazla alan için planınızı yükseltin.',
},
apps: {
fullTipLine1: 'Daha fazla uygulama oluşturmak için',
fullTipLine2: 'planınızı yükseltin.',
contactUs: 'Bizimle iletişime geçin',
fullTip2des: 'Kullanımı serbest bırakmak için etkisiz uygulamaların temizlenmesi önerilir veya bizimle iletişime geçin.',
fullTip1des: 'Bu planda uygulama oluşturma limitine ulaştınız.',

View File

@@ -23,17 +23,13 @@ const translation = {
contractSales: 'Зв\'язатися з відділом продажів',
contractOwner: 'Зв\'язатися з керівником команди',
startForFree: 'Почніть безкоштовно',
getStartedWith: 'Почніть роботу з ',
contactSales: 'Зв\'язатися з відділом продажів',
talkToSales: 'Поговоріть зі службою продажів',
modelProviders: 'Постачальники моделей',
teamMembers: 'Члени команди',
buildApps: 'Створювати додатки',
vectorSpace: 'Векторний простір',
vectorSpaceBillingTooltip: 'Кожен 1 МБ може зберігати близько 1,2 мільйона символів векторизованих даних (оцінка з використанням OpenAI Embeddings, відрізняється в залежності від моделей).',
vectorSpaceTooltip: 'Векторний простір це система довгострокової пам\'яті, необхідна LLM для розуміння ваших даних.',
documentProcessingPriority: 'Пріоритет обробки документів',
documentProcessingPriorityTip: 'Для вищого пріоритету обробки документів оновіть свій план.',
documentProcessingPriorityUpgrade: 'Обробляйте більше даних із вищою точністю та на більших швидкостях.',
priority: {
'standard': 'Стандартний',
@@ -77,7 +73,6 @@ const translation = {
ragAPIRequestTooltip: 'Відноситься до кількості викликів API, що викликають лише можливості обробки бази знань Dify.',
receiptInfo: 'Лише власник команди та адміністратор команди можуть підписуватися та переглядати інформацію про виставлення рахунків',
annotationQuota: 'Квота анотацій',
documentsUploadQuota: 'Квота завантаження документів',
teamMember_one: '{{count,number}} член команди',
teamWorkspace: '{{count,number}} Командний Простір',
apiRateLimit: 'Обмеження швидкості API',
@@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'Пісочниця',
description: '200 безкоштовних пробних версій GPT',
includesTitle: 'Включає в себе:',
for: 'Безкоштовна пробна версія основних функцій',
},
professional: {
name: 'Професійний',
description: 'Щоб окремі особи та невеликі команди могли отримати більше можливостей за доступною ціною.',
includesTitle: 'Все у безкоштовному плані, плюс:',
for: 'Для незалежних розробників/малих команд',
},
team: {
name: 'Команда',
description: 'Співпрацюйте без обмежень і користуйтеся продуктивністю найвищого рівня.',
includesTitle: 'Все, що входить до плану Professional, плюс:',
for: 'Для середніх команд',
},
enterprise: {
@@ -123,15 +115,15 @@ const translation = {
description: 'Отримайте повні можливості та підтримку для масштабних критично важливих систем.',
includesTitle: 'Все, що входить до плану Team, плюс:',
features: {
5: 'Угоди про рівень обслуговування, узгоджені партнерами Dify',
2: 'Ексклюзивні підприємницькі функції',
6: 'Розвинена безпека та контроль',
4: 'Єдиний вхід',
7: 'Оновлення та обслуговування від Dify Official',
1: 'Авторизація комерційної ліцензії',
8: 'Професійна технічна підтримка',
1: 'Комерційна ліцензія на авторизацію',
3: 'Кілька робочих просторів та управління підприємством',
4: 'ССО',
0: 'Рішення для масштабованого розгортання підприємств',
7: 'Оновлення та обслуговування від Dify Офіційно',
2: 'Ексклюзивні функції підприємства',
6: 'Розширені функції безпеки та керування',
3: 'Кілька робочих областей і управління підприємством',
5: 'Угода про рівень обслуговування за домовленістю від Dify Partners',
0: 'Масштабовані рішення для розгортання корпоративного рівня',
},
btnText: 'Зв\'язатися з відділом продажу',
priceTip: 'Тільки річна оплата',
@@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
2: 'Відповідає ліцензії Dify Open Source',
1: 'Єдине робоче місце',
0: 'Усі основні функції випущені під публічним репозиторієм',
2: 'Відповідає ліцензії Dify з відкритим вихідним кодом',
0: 'Усі основні функції випущено в загальнодоступному репозиторії',
},
btnText: 'Розпочніть з громади',
includesTitle: 'Безкоштовні можливості:',
@@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
2: 'Логотип веб-додатку та налаштування брендингу',
1: 'Єдине робоче місце',
3: 'Пріоритетна email та чат підтримка',
0: 'Самостійно керовані надійність різних хмарних постачальників',
2: 'Налаштування логотипу WebApp та брендингу',
3: 'Пріоритетна підтримка електронною поштою та в чаті',
0: 'Самокерована надійність різними хмарними провайдерами',
},
description: 'Для середніх підприємств та команд',
btnText: 'Отримайте Преміум у',
@@ -173,8 +165,6 @@ const translation = {
fullSolution: 'Оновіть свій план, щоб отримати більше місця.',
},
apps: {
fullTipLine1: 'Оновіть свій план, щоб',
fullTipLine2: 'створити більше програм.',
fullTip1des: 'Ви досягли межі створення додатків за цим планом.',
fullTip2: 'Досягнуто ліміту плану',
fullTip1: 'Оновіть, щоб створити більше додатків',

View File

@@ -23,18 +23,13 @@ const translation = {
contractSales: 'Liên hệ bộ phận bán hàng',
contractOwner: 'Liên hệ quản lý nhóm',
startForFree: 'Bắt đầu miễn phí',
getStartedWith: 'Bắt đầu với ',
contactSales: 'Liên hệ Bán hàng',
talkToSales: 'Nói chuyện với Bộ phận Bán hàng',
modelProviders: 'Nhà cung cấp Mô hình',
teamMembers: 'Thành viên Nhóm',
buildApps: 'Xây dựng Ứng dụng',
vectorSpace: 'Không gian Vector',
vectorSpaceBillingTooltip: 'Mỗi 1MB có thể lưu trữ khoảng 1.2 triệu ký tự dữ liệu vector hóa (ước tính sử dụng OpenAI Embeddings, thay đổi tùy theo các mô hình).',
vectorSpaceTooltip: 'Không gian Vector là hệ thống bộ nhớ dài hạn cần thiết cho LLMs để hiểu dữ liệu của bạn.',
documentsUploadQuota: 'Hạn mức Tải lên Tài liệu',
documentProcessingPriority: 'Ưu tiên Xử lý Tài liệu',
documentProcessingPriorityTip: 'Để có ưu tiên xử lý tài liệu cao hơn, vui lòng nâng cấp kế hoạch của bạn.',
documentProcessingPriorityUpgrade: 'Xử lý nhiều dữ liệu với độ chính xác cao và tốc độ nhanh hơn.',
priority: {
'standard': 'Tiêu chuẩn',
@@ -103,19 +98,16 @@ const translation = {
sandbox: {
name: 'Hộp Cát',
description: 'Thử nghiệm miễn phí 200 lần GPT',
includesTitle: 'Bao gồm:',
for: 'Dùng thử miễn phí các tính năng cốt lõi',
},
professional: {
name: 'Chuyên nghiệp',
description: 'Dành cho cá nhân và các nhóm nhỏ để mở khóa nhiều sức mạnh với giá cả phải chăng.',
includesTitle: 'Tất cả trong kế hoạch miễn phí, cộng thêm:',
for: 'Dành cho các nhà phát triển độc lập/nhóm nhỏ',
},
team: {
name: 'Nhóm',
description: 'Hợp tác mà không giới hạn và tận hưởng hiệu suất hạng nhất.',
includesTitle: 'Tất cả trong kế hoạch Chuyên nghiệp, cộng thêm:',
for: 'Dành cho các đội nhóm vừa',
},
enterprise: {
@@ -123,15 +115,15 @@ const translation = {
description: 'Nhận toàn bộ khả năng và hỗ trợ cho các hệ thống quan trọng cho nhiệm vụ quy mô lớn.',
includesTitle: 'Tất cả trong kế hoạch Nhóm, cộng thêm:',
features: {
2: 'Tính năng Doanh nghiệp Độc quyền',
1: 'Giấy phép kinh doanh',
8: 'Hỗ trợ kỹ thuật chuyên nghiệp',
7: 'Cập nhật và Bảo trì bởi Dify Chính thức',
5: 'Thỏa thuận SLA bởi các đối tác Dify',
6: 'An ninh nâng cao và kiểm soát',
3: 'Nhiều không gian làm việc & Quản lý doanh nghiệp',
0: 'Giải pháp triển khai mở rộng quy mô cấp doanh nghiệp',
2: 'Các tính năng dành riêng cho doanh nghiệp',
3: 'Nhiều không gian làm việc & quản lý doanh nghiệp',
7: 'Cập nhật và bảo trì bởi Dify chính thức',
4: 'SSO',
8: 'Hỗ trợ kỹ thuật chuyên nghiệp',
5: 'SLA được đàm phán bởi Dify Partners',
1: 'Ủy quyền giấy phép thương mại',
6: 'Bảo mật & Kiểm soát nâng cao',
0: 'Giải pháp triển khai có thể mở rộng cấp doanh nghiệp',
},
price: 'Tùy chỉnh',
for: 'Dành cho các đội lớn',
@@ -140,9 +132,9 @@ const translation = {
},
community: {
features: {
2: 'Tuân thủ Giấy phép Mã nguồn Mở Dify',
0: 'Tất cả các tính năng cốt lõi được phát hành dưới Kho lưu trữ công khai',
1: 'Không gian làm việc đơn',
0: 'Tất cả các tính năng cốt lõi được phát hành trong kho lưu trữ công cộng',
2: 'Tuân thủ Giấy phép nguồn mở Dify',
},
description: 'Dành cho người dùng cá nhân, nhóm nhỏ hoặc các dự án phi thương mại',
name: 'Cộng đồng',
@@ -153,10 +145,10 @@ const translation = {
},
premium: {
features: {
3: 'Hỗ trợ qua Email & Chat Ưu tiên',
2: 'Tùy chỉnh Logo & Thương hiệu Ứng dụng Web',
1: 'Không gian làm việc đơn',
0: 'Độ tin cậy tự quản lý bởi các nhà cung cấp đám mây khác nhau',
2: 'Logo WebApp & Tùy chỉnh thương hiệu',
3: 'Hỗ trợ email & trò chuyện ưu tiên',
0: 'Độ tin cậy tự quản lý của các nhà cung cấp đám mây khác nhau',
},
comingSoon: 'Hỗ trợ Microsoft Azure & Google Cloud Sẽ Đến Sớm',
priceTip: 'Dựa trên Thị trường Đám mây',
@@ -173,8 +165,6 @@ const translation = {
fullSolution: 'Nâng cấp kế hoạch của bạn để có thêm không gian.',
},
apps: {
fullTipLine1: 'Nâng cấp kế hoạch của bạn để',
fullTipLine2: 'xây dựng thêm ứng dụng.',
contactUs: 'Liên hệ với chúng tôi',
fullTip2: 'Đã đạt giới hạn kế hoạch',
fullTip1des: 'Bạn đã đạt đến giới hạn xây dựng ứng dụng trên kế hoạch này.',

View File

@@ -115,6 +115,15 @@ const translation = {
description: '獲得大規模關鍵任務系統的完整功能和支援。',
includesTitle: 'Team 計劃中的一切,加上:',
features: {
8: '專業技術支持',
3: '多個工作區和企業管理',
0: '企業級可擴展部署解決方案',
1: '商業許可證授權',
7: 'Dify 官方更新和維護',
6: '進階安全與控制',
4: '單一登入',
5: 'Dify 合作夥伴協商的 SLA',
2: '獨家企業功能',
},
price: '自訂',
btnText: '聯繫銷售',
@@ -123,6 +132,9 @@ const translation = {
},
community: {
features: {
0: '所有核心功能在公共存儲庫下發布',
1: '單一工作區',
2: '符合 Dify 開源許可證',
},
includesTitle: '免費功能:',
btnText: '開始使用社區',
@@ -133,6 +145,10 @@ const translation = {
},
premium: {
features: {
3: '優先電子郵件和聊天支持',
2: 'WebApp 標誌和品牌定制',
0: '各種雲端供應商的自我管理可靠性',
1: '單一工作區',
},
for: '適用於中型組織和團隊',
comingSoon: '微軟 Azure 與 Google Cloud 支持即將推出',

View File

@@ -2,6 +2,7 @@
"name": "dify-web",
"version": "1.7.2",
"private": true,
"packageManager": "pnpm@10.14.0",
"engines": {
"node": ">=v22.11.0"
},