mirror of https://github.com/langgenius/dify.git
Merge branch 'main' into feat/rag-pipeline
This commit is contained in:
commit b9f59e3a75
@@ -179,6 +179,7 @@ docker/volumes/pgvecto_rs/data/*
docker/volumes/couchbase/*
docker/volumes/oceanbase/*
docker/volumes/plugin_daemon/*
docker/volumes/matrixone/*
!docker/volumes/oceanbase/init.d

docker/nginx/conf.d/default.conf
@@ -226,6 +226,11 @@ Deploy Dify to AWS with [CDK](https://aws.amazon.com/cdk/)

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Using Alibaba Cloud Computing Nest

Quickly deploy Dify to Alibaba Cloud with [Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -209,6 +209,9 @@ docker compose up -d

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Deploying with Alibaba Cloud

[Quickly deploy Dify to Alibaba Cloud with Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who'd like to contribute, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -225,6 +225,11 @@ Star Dify on GitHub

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Deploy using Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who want to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -221,6 +221,11 @@ docker compose up -d

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Deploy with Alibaba Cloud Computing Nest

Deploy Dify to Alibaba Cloud in one click with [Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=langgenius/dify&type=Date)](https://star-history.com/#langgenius/dify&Date)
@@ -221,6 +221,11 @@ Deploy Dify on AWS with [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

If you would like to contribute code, please read our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md). At the same time, please support Dify by sharing it on social media and presenting it at events and conferences.
@@ -221,6 +221,10 @@ Deploy Dify on AWS using [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who wish to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -219,6 +219,11 @@ Deploy Dify on AWS using [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who wish to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -220,6 +220,10 @@ docker compose up -d

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

If you would like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -219,6 +219,11 @@ wa'logh nIqHom neH ghun deployment toy'wI' [CDK](https://aws.amazon.com/cdk/) lo

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who'd like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -213,6 +213,11 @@ Deploy Dify on Kubernetes and configure premium scaling

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

If you would like to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -218,6 +218,11 @@ Deploy Dify on AWS using [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who wish to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -219,6 +219,11 @@ Deploy Dify to AWS using [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who would like to contribute code, see our contribution guide. At the same time, please support Dify by sharing it on social media and at events and conferences.
@@ -212,6 +212,11 @@ Deploy Dify to the cloud platform with one click using [terraform](https://www.ter

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

Those who want to contribute code can see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -224,6 +224,11 @@ All of Dify's features come with corresponding APIs, so you can easily integrate Dify

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Deploy with Alibaba Cloud Computing Nest

[Alibaba Cloud](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

Developers who want to contribute code should see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -214,6 +214,12 @@ Deploy Dify on AWS with [CDK](https://aws.amazon.com/cdk/)

##### AWS

- [AWS CDK by @KevinZhao](https://github.com/aws-samples/solution-for-deploying-dify-on-aws)

#### Alibaba Cloud

[Alibaba Cloud Computing Nest](https://computenest.console.aliyun.com/service/instance/create/default?type=user&ServiceName=Dify%E7%A4%BE%E5%8C%BA%E7%89%88)

## Contributing

For those who want to contribute code, see our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md).
@@ -137,7 +137,7 @@ WEB_API_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*
CONSOLE_CORS_ALLOW_ORIGINS=http://127.0.0.1:3000,*

# Vector database configuration
# support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector, couchbase, vikingdb, upstash, lindorm, oceanbase, opengauss, tablestore
# support: weaviate, qdrant, milvus, myscale, relyt, pgvecto_rs, pgvector, pgvector, chroma, opensearch, tidb_vector, couchbase, vikingdb, upstash, lindorm, oceanbase, opengauss, tablestore, matrixone
VECTOR_STORE=weaviate
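The VECTOR_STORE value above picks one backend from the supported list. A minimal sketch of how such a name-to-class dispatch can work (this is illustrative, not Dify's actual vector factory; all names here are made up):

```python
# Illustrative registry, not Dify's actual vector factory.
REGISTRY: dict[str, type] = {}

def register(name: str):
    """Class decorator that maps a VECTOR_STORE name to its client class."""
    def deco(cls):
        REGISTRY[name] = cls
        return cls
    return deco

@register("matrixone")
class MatrixoneVector:  # stand-in for the real client class
    pass

def vector_for(name: str):
    """Instantiate the backend configured by VECTOR_STORE."""
    try:
        return REGISTRY[name]()
    except KeyError:
        raise ValueError(f"unsupported vector store: {name}")
```

An unknown name fails fast with a clear error instead of a KeyError deep inside the RAG pipeline.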

# Weaviate configuration
@@ -294,6 +294,13 @@ VIKINGDB_SCHEMA=http
VIKINGDB_CONNECTION_TIMEOUT=30
VIKINGDB_SOCKET_TIMEOUT=30

# Matrixone configuration
MATRIXONE_HOST=127.0.0.1
MATRIXONE_PORT=6001
MATRIXONE_USER=dump
MATRIXONE_PASSWORD=111
MATRIXONE_DATABASE=dify
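Taken together, the MATRIXONE_* settings above form a MySQL-compatible DSN, the same shape the Matrixone vector client later in this commit passes as `connection_string`. A hypothetical helper (not part of Dify) using the defaults from this file:

```python
import os

# Illustrative only: combine the MATRIXONE_* settings into the
# mysql+pymysql DSN used by the Matrixone vector client in this commit.
def matrixone_dsn(env=os.environ):
    return "mysql+pymysql://{user}:{password}@{host}:{port}/{database}".format(
        user=env.get("MATRIXONE_USER", "dump"),
        password=env.get("MATRIXONE_PASSWORD", "111"),
        host=env.get("MATRIXONE_HOST", "127.0.0.1"),
        port=env.get("MATRIXONE_PORT", "6001"),
        database=env.get("MATRIXONE_DATABASE", "dify"),
    )
```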

# Lindorm configuration
LINDORM_URL=http://ld-*******************-proxy-search-pub.lindorm.aliyuncs.com:30070
LINDORM_USERNAME=admin
@@ -332,9 +339,11 @@ PROMPT_GENERATION_MAX_TOKENS=512
CODE_GENERATION_MAX_TOKENS=1024
PLUGIN_BASED_TOKEN_COUNTING_ENABLED=false

# Mail configuration, support: resend, smtp
# Mail configuration, support: resend, smtp, sendgrid
MAIL_TYPE=
# If using SendGrid, use the 'from' field for authentication if necessary.
MAIL_DEFAULT_SEND_FROM=no-reply <no-reply@dify.ai>
# resend configuration
RESEND_API_KEY=
RESEND_API_URL=https://api.resend.com
# smtp configuration
@@ -344,7 +353,8 @@ SMTP_USERNAME=123
SMTP_PASSWORD=abc
SMTP_USE_TLS=true
SMTP_OPPORTUNISTIC_TLS=false

# SendGrid configuration
SENDGRID_API_KEY=
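For context, SENDGRID_API_KEY authenticates against SendGrid's public v3 mail-send API. A hedged sketch of the JSON body that API expects — this is the standard v3 shape, not necessarily what Dify's mail extension builds internally:

```python
def build_sendgrid_payload(from_email: str, to_email: str, subject: str, html: str) -> dict:
    """Build a SendGrid v3 /mail/send JSON body (public API shape; illustrative)."""
    return {
        "personalizations": [{"to": [{"email": to_email}]}],
        "from": {"email": from_email},
        "subject": subject,
        "content": [{"type": "text/html", "value": html}],
    }
```

The request itself is a POST to https://api.sendgrid.com/v3/mail/send with an `Authorization: Bearer <SENDGRID_API_KEY>` header.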
# Sentry configuration
SENTRY_DSN=
@@ -281,6 +281,7 @@ def migrate_knowledge_vector_database():
            VectorType.ELASTICSEARCH,
            VectorType.OPENGAUSS,
            VectorType.TABLESTORE,
            VectorType.MATRIXONE,
        }
        lower_collection_vector_types = {
            VectorType.ANALYTICDB,
@@ -609,7 +609,7 @@ class MailConfig(BaseSettings):
    """

    MAIL_TYPE: Optional[str] = Field(
        description="Email service provider type ('smtp' or 'resend'), default to None.",
        description="Email service provider type ('smtp', 'resend', or 'sendgrid'), default to None.",
        default=None,
    )
@@ -663,6 +663,11 @@ class MailConfig(BaseSettings):
        default=50,
    )

    SENDGRID_API_KEY: Optional[str] = Field(
        description="API key for SendGrid service",
        default=None,
    )


class RagEtlConfig(BaseSettings):
    """
@@ -24,6 +24,7 @@ from .vdb.couchbase_config import CouchbaseConfig
from .vdb.elasticsearch_config import ElasticsearchConfig
from .vdb.huawei_cloud_config import HuaweiCloudConfig
from .vdb.lindorm_config import LindormConfig
from .vdb.matrixone_config import MatrixoneConfig
from .vdb.milvus_config import MilvusConfig
from .vdb.myscale_config import MyScaleConfig
from .vdb.oceanbase_config import OceanBaseVectorConfig
@@ -323,5 +324,6 @@ class MiddlewareConfig(
    OpenGaussConfig,
    TableStoreConfig,
    DatasetQueueMonitorConfig,
    MatrixoneConfig,
):
    pass
@@ -0,0 +1,14 @@
from pydantic import BaseModel, Field


class MatrixoneConfig(BaseModel):
    """Matrixone vector database configuration."""

    MATRIXONE_HOST: str = Field(default="localhost", description="Host address of the Matrixone server")
    MATRIXONE_PORT: int = Field(default=6001, description="Port number of the Matrixone server")
    MATRIXONE_USER: str = Field(default="dump", description="Username for authenticating with Matrixone")
    MATRIXONE_PASSWORD: str = Field(default="111", description="Password for authenticating with Matrixone")
    MATRIXONE_DATABASE: str = Field(default="dify", description="Name of the Matrixone database to connect to")
    MATRIXONE_METRIC: str = Field(
        default="l2", description="Distance metric type for vector similarity search (cosine or l2)"
    )
@@ -56,8 +56,7 @@ class InsertExploreAppListApi(Resource):
        parser.add_argument("position", type=int, required=True, nullable=False, location="json")
        args = parser.parse_args()

        with Session(db.engine) as session:
            app = session.execute(select(App).filter(App.id == args["app_id"])).scalar_one_or_none()
        app = db.session.execute(select(App).filter(App.id == args["app_id"])).scalar_one_or_none()
        if not app:
            raise NotFound(f"App '{args['app_id']}' is not found")
@@ -78,38 +77,38 @@ class InsertExploreAppListApi(Resource):
            select(RecommendedApp).filter(RecommendedApp.app_id == args["app_id"])
        ).scalar_one_or_none()

        if not recommended_app:
            recommended_app = RecommendedApp(
                app_id=app.id,
                description=desc,
                copyright=copy_right,
                privacy_policy=privacy_policy,
                custom_disclaimer=custom_disclaimer,
                language=args["language"],
                category=args["category"],
                position=args["position"],
            )

            db.session.add(recommended_app)

            app.is_public = True
            db.session.commit()

            return {"result": "success"}, 201
        else:
            recommended_app.description = desc
            recommended_app.copyright = copy_right
            recommended_app.privacy_policy = privacy_policy
            recommended_app.custom_disclaimer = custom_disclaimer
            recommended_app.language = args["language"]
            recommended_app.category = args["category"]
            recommended_app.position = args["position"]

            app.is_public = True

            db.session.commit()

            return {"result": "success"}, 200


class InsertExploreAppApi(Resource):
@@ -17,6 +17,8 @@ from libs.login import login_required
from models import Account
from models.model import App
from services.app_dsl_service import AppDslService, ImportStatus
from services.enterprise.enterprise_service import EnterpriseService
from services.feature_service import FeatureService


class AppImportApi(Resource):
@@ -60,7 +62,9 @@ class AppImportApi(Resource):
                app_id=args.get("app_id"),
            )
            session.commit()

            if result.app_id and FeatureService.get_system_features().webapp_auth.enabled:
                # update web app setting as private
                EnterpriseService.WebAppAuth.update_app_access_mode(result.app_id, "private")
            # Return appropriate status code based on result
            status = result.status
            if status == ImportStatus.FAILED.value:
@@ -34,6 +34,20 @@ class WorkflowAppLogApi(Resource):
        parser.add_argument(
            "created_at__after", type=str, location="args", help="Filter logs created after this timestamp"
        )
        parser.add_argument(
            "created_by_end_user_session_id",
            type=str,
            location="args",
            required=False,
            default=None,
        )
        parser.add_argument(
            "created_by_account",
            type=str,
            location="args",
            required=False,
            default=None,
        )
        parser.add_argument("page", type=int_range(1, 99999), default=1, location="args")
        parser.add_argument("limit", type=int_range(1, 100), default=20, location="args")
        args = parser.parse_args()
@@ -57,6 +71,8 @@ class WorkflowAppLogApi(Resource):
            created_at_after=args.created_at__after,
            page=args.page,
            limit=args.limit,
            created_by_end_user_session_id=args.created_by_end_user_session_id,
            created_by_account=args.created_by_account,
        )

        return workflow_app_log_pagination
@@ -686,6 +686,7 @@ class DatasetRetrievalSettingApi(Resource):
                | VectorType.TABLESTORE
                | VectorType.HUAWEI_CLOUD
                | VectorType.TENCENT
                | VectorType.MATRIXONE
            ):
                return {
                    "retrieval_method": [
@@ -733,6 +734,7 @@ class DatasetRetrievalSettingMockApi(Resource):
                | VectorType.TABLESTORE
                | VectorType.TENCENT
                | VectorType.HUAWEI_CLOUD
                | VectorType.MATRIXONE
            ):
                return {
                    "retrieval_method": [
@@ -43,7 +43,6 @@ from core.model_runtime.errors.invoke import InvokeAuthorizationError
from core.plugin.impl.exc import PluginDaemonClientSideError
from core.rag.extractor.entity.extract_setting import ExtractSetting
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from fields.document_fields import (
    dataset_and_document_fields,
    document_fields,
@@ -54,8 +53,6 @@ from libs.login import login_required
from models import Dataset, DatasetProcessRule, Document, DocumentSegment, UploadFile
from services.dataset_service import DatasetService, DocumentService
from services.entities.knowledge_entities.knowledge_entities import KnowledgeConfig
from tasks.add_document_to_index_task import add_document_to_index_task
from tasks.remove_document_from_index_task import remove_document_from_index_task


class DocumentResource(Resource):
@@ -862,77 +859,16 @@ class DocumentStatusApi(DocumentResource):
        DatasetService.check_dataset_permission(dataset, current_user)

        document_ids = request.args.getlist("document_id")
        for document_id in document_ids:
            document = self.get_document(dataset_id, document_id)

            indexing_cache_key = "document_{}_indexing".format(document.id)
            cache_result = redis_client.get(indexing_cache_key)
            if cache_result is not None:
                raise InvalidActionError(f"Document:{document.name} is being indexed, please try again later")
        try:
            DocumentService.batch_update_document_status(dataset, document_ids, action, current_user)
        except services.errors.document.DocumentIndexingError as e:
            raise InvalidActionError(str(e))
        except ValueError as e:
            raise InvalidActionError(str(e))
        except NotFound as e:
            raise NotFound(str(e))

            if action == "enable":
                if document.enabled:
                    continue
                document.enabled = True
                document.disabled_at = None
                document.disabled_by = None
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                # Set cache to prevent indexing the same document multiple times
                redis_client.setex(indexing_cache_key, 600, 1)

                add_document_to_index_task.delay(document_id)

            elif action == "disable":
                if not document.completed_at or document.indexing_status != "completed":
                    raise InvalidActionError(f"Document: {document.name} is not completed.")
                if not document.enabled:
                    continue

                document.enabled = False
                document.disabled_at = datetime.now(UTC).replace(tzinfo=None)
                document.disabled_by = current_user.id
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                # Set cache to prevent indexing the same document multiple times
                redis_client.setex(indexing_cache_key, 600, 1)

                remove_document_from_index_task.delay(document_id)

            elif action == "archive":
                if document.archived:
                    continue

                document.archived = True
                document.archived_at = datetime.now(UTC).replace(tzinfo=None)
                document.archived_by = current_user.id
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                if document.enabled:
                    # Set cache to prevent indexing the same document multiple times
                    redis_client.setex(indexing_cache_key, 600, 1)

                    remove_document_from_index_task.delay(document_id)

            elif action == "un_archive":
                if not document.archived:
                    continue
                document.archived = False
                document.archived_at = None
                document.archived_by = None
                document.updated_at = datetime.now(UTC).replace(tzinfo=None)
                db.session.commit()

                # Set cache to prevent indexing the same document multiple times
                redis_client.setex(indexing_cache_key, 600, 1)

                add_document_to_index_task.delay(document_id)

            else:
                raise InvalidActionError()
        return {"result": "success"}, 200
@@ -15,7 +15,7 @@ class LoadBalancingCredentialsValidateApi(Resource):
    @login_required
    @account_initialization_required
    def post(self, provider: str):
        if not TenantAccountRole.is_privileged_role(current_user.current_tenant.current_role):
        if not TenantAccountRole.is_privileged_role(current_user.current_role):
            raise Forbidden()

        tenant_id = current_user.current_tenant_id

@@ -64,7 +64,7 @@ class LoadBalancingConfigCredentialsValidateApi(Resource):
    @login_required
    @account_initialization_required
    def post(self, provider: str, config_id: str):
        if not TenantAccountRole.is_privileged_role(current_user.current_tenant.current_role):
        if not TenantAccountRole.is_privileged_role(current_user.current_role):
            raise Forbidden()

        tenant_id = current_user.current_tenant_id
@@ -135,6 +135,20 @@ class WorkflowAppLogApi(Resource):
        parser.add_argument("status", type=str, choices=["succeeded", "failed", "stopped"], location="args")
        parser.add_argument("created_at__before", type=str, location="args")
        parser.add_argument("created_at__after", type=str, location="args")
        parser.add_argument(
            "created_by_end_user_session_id",
            type=str,
            location="args",
            required=False,
            default=None,
        )
        parser.add_argument(
            "created_by_account",
            type=str,
            location="args",
            required=False,
            default=None,
        )
        parser.add_argument("page", type=int_range(1, 99999), default=1, location="args")
        parser.add_argument("limit", type=int_range(1, 100), default=20, location="args")
        args = parser.parse_args()

@@ -158,6 +172,8 @@ class WorkflowAppLogApi(Resource):
            created_at_after=args.created_at__after,
            page=args.page,
            limit=args.limit,
            created_by_end_user_session_id=args.created_by_end_user_session_id,
            created_by_account=args.created_by_account,
        )

        return workflow_app_log_pagination
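The two new filters above are plain query-string arguments. A hypothetical client-side helper showing how they combine into a query string (the parameter names come from the parser arguments; the helper itself is not part of Dify):

```python
from urllib.parse import urlencode

# Hypothetical helper: build the query string for the workflow app log endpoint.
def build_log_query(status=None, created_by_end_user_session_id=None,
                    created_by_account=None, page=1, limit=20):
    params = {"page": page, "limit": limit}
    if status is not None:
        params["status"] = status  # one of: succeeded, failed, stopped
    if created_by_end_user_session_id is not None:
        params["created_by_end_user_session_id"] = created_by_end_user_session_id
    if created_by_account is not None:
        params["created_by_account"] = created_by_account
    return urlencode(params)
```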
@@ -4,7 +4,7 @@ from werkzeug.exceptions import Forbidden, NotFound

import services.dataset_service
from controllers.service_api import api
from controllers.service_api.dataset.error import DatasetInUseError, DatasetNameDuplicateError
from controllers.service_api.dataset.error import DatasetInUseError, DatasetNameDuplicateError, InvalidActionError
from controllers.service_api.wraps import (
    DatasetApiResource,
    cloud_edition_billing_rate_limit_check,

@@ -17,7 +17,7 @@ from fields.dataset_fields import dataset_detail_fields
from fields.tag_fields import tag_fields
from libs.login import current_user
from models.dataset import Dataset, DatasetPermissionEnum
from services.dataset_service import DatasetPermissionService, DatasetService
from services.dataset_service import DatasetPermissionService, DatasetService, DocumentService
from services.entities.knowledge_entities.knowledge_entities import RetrievalModel
from services.tag_service import TagService


@@ -329,6 +329,56 @@ class DatasetApi(DatasetApiResource):
            raise DatasetInUseError()


class DocumentStatusApi(DatasetApiResource):
    """Resource for batch document status operations."""

    def patch(self, tenant_id, dataset_id, action):
        """
        Batch update document status.

        Args:
            tenant_id: tenant id
            dataset_id: dataset id
            action: action to perform (enable, disable, archive, un_archive)

        Returns:
            dict: A dictionary with a key 'result' and a value 'success'
            int: HTTP status code 200 indicating that the operation was successful.

        Raises:
            NotFound: If the dataset with the given ID does not exist.
            Forbidden: If the user does not have permission.
            InvalidActionError: If the action is invalid or cannot be performed.
        """
        dataset_id_str = str(dataset_id)
        dataset = DatasetService.get_dataset(dataset_id_str)

        if dataset is None:
            raise NotFound("Dataset not found.")

        # Check user's permission
        try:
            DatasetService.check_dataset_permission(dataset, current_user)
        except services.errors.account.NoPermissionError as e:
            raise Forbidden(str(e))

        # Check dataset model setting
        DatasetService.check_dataset_model_setting(dataset)

        # Get document IDs from request body
        data = request.get_json()
        document_ids = data.get("document_ids", [])

        try:
            DocumentService.batch_update_document_status(dataset, document_ids, action, current_user)
        except services.errors.document.DocumentIndexingError as e:
            raise InvalidActionError(str(e))
        except ValueError as e:
            raise InvalidActionError(str(e))

        return {"result": "success"}, 200


class DatasetTagsApi(DatasetApiResource):
    @validate_dataset_token
    @marshal_with(tag_fields)

@@ -457,6 +507,7 @@ class DatasetTagsBindingStatusApi(DatasetApiResource):

api.add_resource(DatasetListApi, "/datasets")
api.add_resource(DatasetApi, "/datasets/<uuid:dataset_id>")
api.add_resource(DocumentStatusApi, "/datasets/<uuid:dataset_id>/documents/status/<string:action>")
api.add_resource(DatasetTagsApi, "/datasets/tags")
api.add_resource(DatasetTagBindingApi, "/datasets/tags/binding")
api.add_resource(DatasetTagUnbindingApi, "/datasets/tags/unbinding")
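A hypothetical client-side sketch of calling the batch document-status endpoint registered above. The route and body shape come from this diff; the /v1 prefix and the bearer token are assumptions:

```python
import json

# Hypothetical: assemble the pieces of a PATCH to the batch status endpoint.
def build_status_request(dataset_id: str, action: str, document_ids: list[str]):
    assert action in {"enable", "disable", "archive", "un_archive"}
    path = f"/v1/datasets/{dataset_id}/documents/status/{action}"
    body = json.dumps({"document_ids": document_ids})
    headers = {
        "Authorization": "Bearer <dataset-api-key>",  # placeholder token
        "Content-Type": "application/json",
    }
    return path, body, headers
```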
@@ -0,0 +1,233 @@
import json
import logging
import uuid
from functools import wraps
from typing import Any, Optional

from mo_vector.client import MoVectorClient  # type: ignore
from pydantic import BaseModel, model_validator

from configs import dify_config
from core.rag.datasource.vdb.vector_base import BaseVector
from core.rag.datasource.vdb.vector_factory import AbstractVectorFactory
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.embedding.embedding_base import Embeddings
from core.rag.models.document import Document
from extensions.ext_redis import redis_client
from models.dataset import Dataset

logger = logging.getLogger(__name__)


class MatrixoneConfig(BaseModel):
    host: str = "localhost"
    port: int = 6001
    user: str = "dump"
    password: str = "111"
    database: str = "dify"
    metric: str = "l2"

    @model_validator(mode="before")
    @classmethod
    def validate_config(cls, values: dict) -> dict:
        if not values["host"]:
            raise ValueError("config host is required")
        if not values["port"]:
            raise ValueError("config port is required")
        if not values["user"]:
            raise ValueError("config user is required")
        if not values["password"]:
            raise ValueError("config password is required")
        if not values["database"]:
            raise ValueError("config database is required")
        return values


def ensure_client(func):
    @wraps(func)
    def wrapper(self, *args, **kwargs):
        if self.client is None:
            self.client = self._get_client(None, False)
        return func(self, *args, **kwargs)

    return wrapper
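The ensure_client decorator above lazily creates the client on first use. The same pattern, sketched standalone with a toy class and a stand-in factory so the behavior is visible (LazyStore and its _get_client are illustrative, not part of this commit):

```python
from functools import wraps

def ensure_client(func):
    """Create self.client on first call, then delegate (same pattern as above)."""
    @wraps(func)
    def wrapper(self, *args, **kwargs):
        if self.client is None:
            self.client = self._get_client()
        return func(self, *args, **kwargs)
    return wrapper

class LazyStore:
    """Toy class; _get_client stands in for the MoVectorClient constructor."""
    def __init__(self):
        self.client = None

    def _get_client(self):
        return object()

    @ensure_client
    def ping(self):
        return self.client is not None
```

This keeps construction out of `__init__`, so importing or instantiating the vector class never opens a database connection; only the first decorated call does.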
|
||||
|
||||
|
||||
class MatrixoneVector(BaseVector):
|
||||
"""
|
||||
Matrixone vector storage implementation.
|
||||
"""
|
||||
|
||||
def __init__(self, collection_name: str, config: MatrixoneConfig):
|
||||
super().__init__(collection_name)
|
||||
self.config = config
|
||||
self.collection_name = collection_name.lower()
|
||||
self.client = None
|
||||
|
||||
@property
|
||||
def collection_name(self):
|
||||
return self._collection_name
|
||||
|
||||
@collection_name.setter
|
||||
def collection_name(self, value):
|
||||
self._collection_name = value
|
||||
|
||||
def get_type(self) -> str:
|
||||
return VectorType.MATRIXONE
|
||||
|
||||
def create(self, texts: list[Document], embeddings: list[list[float]], **kwargs):
|
||||
if self.client is None:
|
||||
self.client = self._get_client(len(embeddings[0]), True)
|
||||
return self.add_texts(texts, embeddings)
|
||||
|
||||
def _get_client(self, dimension: Optional[int] = None, create_table: bool = False) -> MoVectorClient:
|
||||
"""
|
||||
Create a new client for the collection.
|
||||
|
||||
The collection will be created if it doesn't exist.
|
||||
"""
|
||||
lock_name = f"vector_indexing_lock_{self._collection_name}"
|
||||
with redis_client.lock(lock_name, timeout=20):
|
||||
client = MoVectorClient(
|
||||
connection_string=f"mysql+pymysql://{self.config.user}:{self.config.password}@{self.config.host}:{self.config.port}/{self.config.database}",
|
||||
table_name=self.collection_name,
|
||||
vector_dimension=dimension,
|
||||
create_table=create_table,
|
||||
)
|
||||
collection_exist_cache_key = f"vector_indexing_{self._collection_name}"
|
||||
if redis_client.get(collection_exist_cache_key):
|
||||
return client
|
||||
try:
|
||||
client.create_full_text_index()
|
||||
except Exception as e:
|
||||
logger.exception("Failed to create full text index")
|
||||
redis_client.set(collection_exist_cache_key, 1, ex=3600)
|
||||
return client
|
||||
|
||||
def add_texts(self, documents: list[Document], embeddings: list[list[float]], **kwargs):
|
||||
if self.client is None:
|
||||
self.client = self._get_client(len(embeddings[0]), True)
|
||||
assert self.client is not None
|
||||
ids = []
|
||||
for _, doc in enumerate(documents):
|
||||
if doc.metadata is not None:
|
||||
doc_id = doc.metadata.get("doc_id", str(uuid.uuid4()))
|
||||
ids.append(doc_id)
|
||||
self.client.insert(
|
||||
texts=[doc.page_content for doc in documents],
|
||||
embeddings=embeddings,
|
||||
metadatas=[doc.metadata for doc in documents],
|
||||
ids=ids,
|
||||
)
|
||||
return ids
|
||||
|
||||
@ensure_client
|
||||
def text_exists(self, id: str) -> bool:
|
||||
assert self.client is not None
|
||||
result = self.client.get(ids=[id])
|
||||
return len(result) > 0
|
||||
|
||||
@ensure_client
|
||||
def delete_by_ids(self, ids: list[str]) -> None:
|
||||
assert self.client is not None
|
||||
if not ids:
|
||||
return
|
||||
self.client.delete(ids=ids)
|
||||
|
||||
@ensure_client
|
||||
def get_ids_by_metadata_field(self, key: str, value: str):
|
||||
assert self.client is not None
|
||||
results = self.client.query_by_metadata(filter={key: value})
|
||||
return [result.id for result in results]
|
||||
|
||||
@ensure_client
|
||||
def delete_by_metadata_field(self, key: str, value: str) -> None:
|
||||
assert self.client is not None
|
||||
self.client.delete(filter={key: value})
|
||||
|
||||
@ensure_client
|
||||
def search_by_vector(self, query_vector: list[float], **kwargs: Any) -> list[Document]:
|
||||
assert self.client is not None
|
||||
top_k = kwargs.get("top_k", 5)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
filter = None
|
||||
if document_ids_filter:
|
||||
filter = {"document_id": {"$in": document_ids_filter}}
|
||||
|
||||
results = self.client.query(
|
||||
query_vector=query_vector,
|
||||
k=top_k,
|
||||
filter=filter,
|
||||
)
|
||||
|
||||
docs = []
|
||||
# TODO: add the score threshold to the query
|
||||
for result in results:
|
||||
metadata = result.metadata
|
||||
docs.append(
|
||||
Document(
|
||||
page_content=result.document,
|
||||
metadata=metadata,
|
||||
)
|
||||
)
|
||||
return docs
|
||||
|
||||
@ensure_client
|
||||
def search_by_full_text(self, query: str, **kwargs: Any) -> list[Document]:
|
||||
assert self.client is not None
|
||||
top_k = kwargs.get("top_k", 5)
|
||||
document_ids_filter = kwargs.get("document_ids_filter")
|
||||
filter = None
|
||||
if document_ids_filter:
|
||||
filter = {"document_id": {"$in": document_ids_filter}}
|
||||
score_threshold = float(kwargs.get("score_threshold", 0.0))
|
||||
|
||||
results = self.client.full_text_query(
|
||||
keywords=[query],
|
||||
k=top_k,
|
||||
filter=filter,
|
||||
)
|
||||
|
||||
docs = []
|
||||
for result in results:
|
||||
metadata = result.metadata
|
||||
if isinstance(metadata, str):
|
||||
import json
|
||||
|
||||
metadata = json.loads(metadata)
|
||||
score = 1 - result.distance
|
||||
if score >= score_threshold:
|
||||
metadata["score"] = score
|
||||
docs.append(
|
||||
Document(
|
||||
page_content=result.document,
|
||||
metadata=metadata,
|
||||
)
|
||||
)
|
||||
return docs
|
||||
|
||||
@ensure_client
|
||||
def delete(self) -> None:
|
||||
assert self.client is not None
|
||||
self.client.delete()
|
||||
|
||||
|
||||
class MatrixoneVectorFactory(AbstractVectorFactory):
|
||||
def init_vector(self, dataset: Dataset, attributes: list, embeddings: Embeddings) -> MatrixoneVector:
|
||||
if dataset.index_struct_dict:
|
||||
class_prefix: str = dataset.index_struct_dict["vector_store"]["class_prefix"]
|
||||
collection_name = class_prefix
|
||||
else:
|
||||
dataset_id = dataset.id
|
||||
collection_name = Dataset.gen_collection_name_by_id(dataset_id)
|
||||
dataset.index_struct = json.dumps(self.gen_index_struct_dict(VectorType.MATRIXONE, collection_name))
|
||||
|
||||
config = MatrixoneConfig(
|
||||
host=dify_config.MATRIXONE_HOST or "localhost",
|
||||
port=dify_config.MATRIXONE_PORT or 6001,
|
||||
user=dify_config.MATRIXONE_USER or "dump",
|
||||
password=dify_config.MATRIXONE_PASSWORD or "111",
|
||||
database=dify_config.MATRIXONE_DATABASE or "dify",
|
||||
metric=dify_config.MATRIXONE_METRIC or "l2",
|
||||
)
|
||||
return MatrixoneVector(collection_name=collection_name, config=config)
|
||||
|
|
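The `ensure_client` decorator above defers client construction until a decorated method first runs, so importing the module never opens a connection. A minimal standalone sketch of the same lazy-initialization pattern (`FakeStore` and its `_get_client` are stand-ins for the real `MatrixoneVector` / `MoVectorClient` wiring):

```python
from functools import wraps


def ensure_client(func):
    """Lazily create self.client before the wrapped method runs."""
    @wraps(func)
    def wrapper(self, *args, **kwargs):
        if self.client is None:
            self.client = self._get_client()
        return func(self, *args, **kwargs)
    return wrapper


class FakeStore:
    def __init__(self):
        self.client = None
        self.connects = 0

    def _get_client(self):
        # Stand-in for the real client construction.
        self.connects += 1
        return object()

    @ensure_client
    def ping(self):
        return self.client is not None


store = FakeStore()
assert store.ping() and store.ping()
assert store.connects == 1  # client is created once, then reused
```

The decorator keeps each public method free of connection boilerplate while guaranteeing at most one client per instance.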
@@ -164,6 +164,10 @@ class Vector:
                from core.rag.datasource.vdb.huawei.huawei_cloud_vector import HuaweiCloudVectorFactory

                return HuaweiCloudVectorFactory
            case VectorType.MATRIXONE:
                from core.rag.datasource.vdb.matrixone.matrixone_vector import MatrixoneVectorFactory

                return MatrixoneVectorFactory
            case _:
                raise ValueError(f"Vector store {vector_type} is not supported.")

@@ -29,3 +29,4 @@ class VectorType(StrEnum):
    OPENGAUSS = "opengauss"
    TABLESTORE = "tablestore"
    HUAWEI_CLOUD = "huawei_cloud"
    MATRIXONE = "matrixone"

@@ -45,7 +45,8 @@ class WeaviateVector(BaseVector):
        # by changing the connection timeout to pypi.org from 1 second to 0.001 seconds.
        # TODO: This can be removed once weaviate-client is updated to 3.26.7 or higher,
        # which does not contain the deprecation check.
        weaviate.connect.connection.PYPI_TIMEOUT = 0.001
        if hasattr(weaviate.connect.connection, "PYPI_TIMEOUT"):
            weaviate.connect.connection.PYPI_TIMEOUT = 0.001

        try:
            client = weaviate.Client(

@@ -68,22 +68,17 @@ class MarkdownExtractor(BaseExtractor):
                continue
            header_match = re.match(r"^#+\s", line)
            if header_match:
                if current_header is not None:
                    markdown_tups.append((current_header, current_text))

                markdown_tups.append((current_header, current_text))
                current_header = line
                current_text = ""
            else:
                current_text += line + "\n"
        markdown_tups.append((current_header, current_text))

        if current_header is not None:
            # pass linting, assert keys are defined
            markdown_tups = [
                (re.sub(r"#", "", cast(str, key)).strip(), re.sub(r"<.*?>", "", value)) for key, value in markdown_tups
            ]
        else:
            markdown_tups = [(key, re.sub("\n", "", value)) for key, value in markdown_tups]
        markdown_tups = [
            (re.sub(r"#", "", cast(str, key)).strip() if key else None, re.sub(r"<.*?>", "", value))
            for key, value in markdown_tups
        ]

        return markdown_tups

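The rewritten comprehension in the hunk above keeps a `None` header for any text that appears before the first markdown heading, rather than asserting a header exists. A self-contained sketch of that behavior (simplified from the extractor; names are illustrative):

```python
import re


def markdown_to_tups(text):
    """Collect (header, body) pairs; text before the first header gets key None."""
    tups, header, body = [], None, ""
    for line in text.splitlines():
        if re.match(r"^#+\s", line):
            tups.append((header, body))
            header, body = line, ""
        else:
            body += line + "\n"
    tups.append((header, body))
    # Strip '#' from headers (when present) and HTML tags from bodies.
    return [
        (re.sub(r"#", "", k).strip() if k else None, re.sub(r"<.*?>", "", v))
        for k, v in tups
    ]


tups = markdown_to_tups("intro\n# A\nbody <b>x</b>\n")
# First tuple keeps the pre-header text under a None key.
```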
@@ -6,7 +6,7 @@ import json
import logging
from typing import Optional, Union

from sqlalchemy import func, select
from sqlalchemy import select
from sqlalchemy.engine import Engine
from sqlalchemy.orm import sessionmaker

@@ -146,25 +146,7 @@ class SQLAlchemyWorkflowExecutionRepository(WorkflowExecutionRepository):
        db_model.workflow_id = domain_model.workflow_id
        db_model.triggered_from = self._triggered_from

        # Check if this is a new record
        with self._session_factory() as session:
            existing = session.scalar(select(WorkflowRun).where(WorkflowRun.id == domain_model.id_))
            if not existing:
                # For new records, get the next sequence number
                # in case multiple executions are created concurrently, use for update
                stmt = (
                    select(func.coalesce(func.max(WorkflowRun.sequence_number), 0) + 1)
                    .where(
                        WorkflowRun.app_id == self._app_id,
                        WorkflowRun.tenant_id == self._tenant_id,
                    )
                    .with_for_update()
                )
                next_seq = session.scalar(stmt)
                db_model.sequence_number = int(next_seq) if next_seq is not None else 1
            else:
                # For updates, keep the existing sequence number
                db_model.sequence_number = existing.sequence_number
        # No sequence number generation needed anymore

        db_model.type = domain_model.workflow_type
        db_model.version = domain_model.workflow_version

@@ -57,7 +57,6 @@ class StreamProcessor(ABC):

                    # The branch_identify parameter is added to ensure that
                    # only nodes in the correct logical branch are included.
                    reachable_node_ids.append(edge.target_node_id)
                    ids = self._fetch_node_ids_in_reachable_branch(edge.target_node_id, run_result.edge_source_handle)
                    reachable_node_ids.extend(ids)
                else:

@@ -74,6 +73,8 @@ class StreamProcessor(ABC):
            self._remove_node_ids_in_unreachable_branch(node_id, reachable_node_ids)

    def _fetch_node_ids_in_reachable_branch(self, node_id: str, branch_identify: Optional[str] = None) -> list[str]:
        if node_id not in self.rest_node_ids:
            self.rest_node_ids.append(node_id)
        node_ids = []
        for edge in self.graph.edge_mapping.get(node_id, []):
            if edge.target_node_id == self.graph.root_node_id:

@@ -54,6 +54,15 @@ class Mail:
                    use_tls=dify_config.SMTP_USE_TLS,
                    opportunistic_tls=dify_config.SMTP_OPPORTUNISTIC_TLS,
                )
            case "sendgrid":
                from libs.sendgrid import SendGridClient

                if not dify_config.SENDGRID_API_KEY:
                    raise ValueError("SENDGRID_API_KEY is required for SendGrid mail type")

                self._client = SendGridClient(
                    sendgrid_api_key=dify_config.SENDGRID_API_KEY, _from=dify_config.MAIL_DEFAULT_SEND_FROM or ""
                )
            case _:
                raise ValueError("Unsupported mail type {}".format(mail_type))

@@ -19,7 +19,6 @@ workflow_run_for_log_fields = {

workflow_run_for_list_fields = {
    "id": fields.String,
    "sequence_number": fields.Integer,
    "version": fields.String,
    "status": fields.String,
    "elapsed_time": fields.Float,

@@ -36,7 +35,6 @@ advanced_chat_workflow_run_for_list_fields = {
    "id": fields.String,
    "conversation_id": fields.String,
    "message_id": fields.String,
    "sequence_number": fields.Integer,
    "version": fields.String,
    "status": fields.String,
    "elapsed_time": fields.Float,

@@ -63,7 +61,6 @@ workflow_run_pagination_fields = {

workflow_run_detail_fields = {
    "id": fields.String,
    "sequence_number": fields.Integer,
    "version": fields.String,
    "graph": fields.Raw(attribute="graph_dict"),
    "inputs": fields.Raw(attribute="inputs_dict"),

@@ -0,0 +1,45 @@
import logging

import sendgrid  # type: ignore
from python_http_client.exceptions import ForbiddenError, UnauthorizedError
from sendgrid.helpers.mail import Content, Email, Mail, To  # type: ignore


class SendGridClient:
    def __init__(self, sendgrid_api_key: str, _from: str):
        self.sendgrid_api_key = sendgrid_api_key
        self._from = _from

    def send(self, mail: dict):
        logging.debug("Sending email with SendGrid")

        try:
            _to = mail["to"]

            if not _to:
                raise ValueError("SendGridClient: Cannot send email: recipient address is missing.")

            sg = sendgrid.SendGridAPIClient(api_key=self.sendgrid_api_key)
            from_email = Email(self._from)
            to_email = To(_to)
            subject = mail["subject"]
            content = Content("text/html", mail["html"])
            mail = Mail(from_email, to_email, subject, content)
            mail_json = mail.get()  # type: ignore
            response = sg.client.mail.send.post(request_body=mail_json)
            logging.debug(response.status_code)
            logging.debug(response.body)
            logging.debug(response.headers)

        except TimeoutError:
            logging.exception("SendGridClient Timeout occurred while sending email")
            raise
        except (UnauthorizedError, ForbiddenError):
            logging.exception(
                "SendGridClient Authentication failed. "
                "Verify that your credentials and the 'from' email address are correct"
            )
            raise
        except Exception:
            logging.exception(f"SendGridClient Unexpected error occurred while sending email to {_to}")
            raise

@@ -0,0 +1,66 @@
"""remove sequence_number from workflow_runs

Revision ID: 0ab65e1cc7fa
Revises: 4474872b0ee6
Create Date: 2025-06-19 16:33:13.377215

"""
from alembic import op
import models as models
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '0ab65e1cc7fa'
down_revision = '4474872b0ee6'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    with op.batch_alter_table('workflow_runs', schema=None) as batch_op:
        batch_op.drop_index(batch_op.f('workflow_run_tenant_app_sequence_idx'))
        batch_op.drop_column('sequence_number')

    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###

    # WARNING: This downgrade CANNOT recover the original sequence_number values!
    # The original sequence numbers are permanently lost after the upgrade.
    # This downgrade will regenerate sequence numbers based on created_at order,
    # which may result in different values than the original sequence numbers.
    #
    # If you need to preserve original sequence numbers, use the alternative
    # migration approach that creates a backup table before removal.

    # Step 1: Add sequence_number column as nullable first
    with op.batch_alter_table('workflow_runs', schema=None) as batch_op:
        batch_op.add_column(sa.Column('sequence_number', sa.INTEGER(), autoincrement=False, nullable=True))

    # Step 2: Populate sequence_number values based on created_at order within each app
    # NOTE: This recreates sequence numbering logic but values will be different
    # from the original sequence numbers that were removed in the upgrade
    connection = op.get_bind()
    connection.execute(sa.text("""
        UPDATE workflow_runs
        SET sequence_number = subquery.row_num
        FROM (
            SELECT id, ROW_NUMBER() OVER (
                PARTITION BY tenant_id, app_id
                ORDER BY created_at, id
            ) as row_num
            FROM workflow_runs
        ) subquery
        WHERE workflow_runs.id = subquery.id
    """))

    # Step 3: Make the column NOT NULL and add the index
    with op.batch_alter_table('workflow_runs', schema=None) as batch_op:
        batch_op.alter_column('sequence_number', nullable=False)
        batch_op.create_index(batch_op.f('workflow_run_tenant_app_sequence_idx'), ['tenant_id', 'app_id', 'sequence_number'], unique=False)

    # ### end Alembic commands ###

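The backfill query in `downgrade()` renumbers runs per `(tenant_id, app_id)` partition, ordered by `created_at` with `id` as a tiebreaker. That numbering rule can be checked in isolation with SQLite (assuming the bundled SQLite supports window functions); the `UPDATE ... FROM` form is PostgreSQL-specific, so this sketch exercises only the `ROW_NUMBER()` subquery:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE workflow_runs (id INTEGER, tenant_id TEXT, app_id TEXT, created_at TEXT)"
)
conn.executemany(
    "INSERT INTO workflow_runs VALUES (?, ?, ?, ?)",
    [
        (1, "t1", "a1", "2025-01-01"),
        (2, "t1", "a1", "2025-01-02"),
        (3, "t1", "a2", "2025-01-01"),  # different app_id -> numbering restarts
    ],
)
rows = conn.execute(
    """
    SELECT id, ROW_NUMBER() OVER (
        PARTITION BY tenant_id, app_id
        ORDER BY created_at, id
    ) AS row_num
    FROM workflow_runs
    ORDER BY id
    """
).fetchall()
# Each (tenant_id, app_id) partition is numbered independently from 1.
```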
@@ -386,7 +386,7 @@ class WorkflowRun(Base):
    - id (uuid) Run ID
    - tenant_id (uuid) Workspace ID
    - app_id (uuid) App ID
    - sequence_number (int) Auto-increment sequence number, incremented within the App, starting from 1

    - workflow_id (uuid) Workflow ID
    - type (string) Workflow type
    - triggered_from (string) Trigger source

@@ -419,13 +419,12 @@ class WorkflowRun(Base):
    __table_args__ = (
        db.PrimaryKeyConstraint("id", name="workflow_run_pkey"),
        db.Index("workflow_run_triggerd_from_idx", "tenant_id", "app_id", "triggered_from"),
        db.Index("workflow_run_tenant_app_sequence_idx", "tenant_id", "app_id", "sequence_number"),
    )

    id: Mapped[str] = mapped_column(StringUUID, server_default=db.text("uuid_generate_v4()"))
    tenant_id: Mapped[str] = mapped_column(StringUUID)
    app_id: Mapped[str] = mapped_column(StringUUID)
    sequence_number: Mapped[int] = mapped_column()

    workflow_id: Mapped[str] = mapped_column(StringUUID)
    type: Mapped[str] = mapped_column(db.String(255))
    triggered_from: Mapped[str] = mapped_column(db.String(255))

@@ -485,7 +484,6 @@ class WorkflowRun(Base):
        "id": self.id,
        "tenant_id": self.tenant_id,
        "app_id": self.app_id,
        "sequence_number": self.sequence_number,
        "workflow_id": self.workflow_id,
        "type": self.type,
        "triggered_from": self.triggered_from,

@@ -511,7 +509,6 @@ class WorkflowRun(Base):
            id=data.get("id"),
            tenant_id=data.get("tenant_id"),
            app_id=data.get("app_id"),
            sequence_number=data.get("sequence_number"),
            workflow_id=data.get("workflow_id"),
            type=data.get("type"),
            triggered_from=data.get("triggered_from"),

@@ -18,4 +18,3 @@ ignore_missing_imports=True

[mypy-flask_restful.inputs]
ignore_missing_imports=True

@@ -81,6 +81,7 @@ dependencies = [
    "weave~=0.51.0",
    "yarl~=1.18.3",
    "webvtt-py~=0.5.1",
    "sendgrid~=6.12.3",
]
# Before adding new dependency, consider place it in
# alphabet order (a-z) and suitable group.

@@ -202,4 +203,5 @@ vdb = [
    "volcengine-compat~=1.0.0",
    "weaviate-client~=3.24.0",
    "xinference-client~=1.2.2",
    "mo-vector~=0.1.13",
]

@@ -59,6 +59,7 @@ from services.external_knowledge_service import ExternalDatasetService
from services.feature_service import FeatureModel, FeatureService
from services.tag_service import TagService
from services.vector_service import VectorService
from tasks.add_document_to_index_task import add_document_to_index_task
from tasks.batch_clean_document_task import batch_clean_document_task
from tasks.clean_notion_document_task import clean_notion_document_task
from tasks.deal_dataset_vector_index_task import deal_dataset_vector_index_task

@@ -70,6 +71,7 @@ from tasks.document_indexing_update_task import document_indexing_update_task
from tasks.duplicate_document_indexing_task import duplicate_document_indexing_task
from tasks.enable_segments_to_index_task import enable_segments_to_index_task
from tasks.recover_document_indexing_task import recover_document_indexing_task
from tasks.remove_document_from_index_task import remove_document_from_index_task
from tasks.retry_document_indexing_task import retry_document_indexing_task
from tasks.sync_website_document_indexing_task import sync_website_document_indexing_task

@@ -434,7 +436,7 @@ class DatasetService:
            raise ValueError(ex.description)

        filtered_data["updated_by"] = user.id
        filtered_data["updated_at"] = datetime.datetime.now()
        filtered_data["updated_at"] = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)

        # update Retrieval model
        filtered_data["retrieval_model"] = data["retrieval_model"]

@@ -976,12 +978,17 @@ class DocumentService:
        process_rule = knowledge_config.process_rule
        if process_rule:
            if process_rule.mode in ("custom", "hierarchical"):
                dataset_process_rule = DatasetProcessRule(
                    dataset_id=dataset.id,
                    mode=process_rule.mode,
                    rules=process_rule.rules.model_dump_json() if process_rule.rules else None,
                    created_by=account.id,
                )
                if process_rule.rules:
                    dataset_process_rule = DatasetProcessRule(
                        dataset_id=dataset.id,
                        mode=process_rule.mode,
                        rules=process_rule.rules.model_dump_json() if process_rule.rules else None,
                        created_by=account.id,
                    )
                else:
                    dataset_process_rule = dataset.latest_process_rule
                    if not dataset_process_rule:
                        raise ValueError("No process rule found.")
            elif process_rule.mode == "automatic":
                dataset_process_rule = DatasetProcessRule(
                    dataset_id=dataset.id,

@@ -1402,16 +1409,16 @@ class DocumentService:
                knowledge_config.embedding_model,  # type: ignore
            )
            dataset_collection_binding_id = dataset_collection_binding.id
            if knowledge_config.retrieval_model:
                retrieval_model = knowledge_config.retrieval_model
            else:
                retrieval_model = RetrievalModel(
                    search_method=RetrievalMethod.SEMANTIC_SEARCH.value,
                    reranking_enable=False,
                    reranking_model=RerankingModel(reranking_provider_name="", reranking_model_name=""),
                    top_k=2,
                    score_threshold_enabled=False,
                )
        if knowledge_config.retrieval_model:
            retrieval_model = knowledge_config.retrieval_model
        else:
            retrieval_model = RetrievalModel(
                search_method=RetrievalMethod.SEMANTIC_SEARCH.value,
                reranking_enable=False,
                reranking_model=RerankingModel(reranking_provider_name="", reranking_model_name=""),
                top_k=2,
                score_threshold_enabled=False,
            )
        # save dataset
        dataset = Dataset(
            tenant_id=tenant_id,

@@ -1603,6 +1610,191 @@ class DocumentService:
        if not isinstance(args["process_rule"]["rules"]["segmentation"]["max_tokens"], int):
            raise ValueError("Process rule segmentation max_tokens is invalid")

    @staticmethod
    def batch_update_document_status(dataset: Dataset, document_ids: list[str], action: str, user):
        """
        Batch update document status.

        Args:
            dataset (Dataset): The dataset object
            document_ids (list[str]): List of document IDs to update
            action (str): Action to perform (enable, disable, archive, un_archive)
            user: Current user performing the action

        Raises:
            DocumentIndexingError: If document is being indexed or not in correct state
            ValueError: If action is invalid
        """
        if not document_ids:
            return

        # Early validation of action parameter
        valid_actions = ["enable", "disable", "archive", "un_archive"]
        if action not in valid_actions:
            raise ValueError(f"Invalid action: {action}. Must be one of {valid_actions}")

        documents_to_update = []

        # First pass: validate all documents and prepare updates
        for document_id in document_ids:
            document = DocumentService.get_document(dataset.id, document_id)
            if not document:
                continue

            # Check if document is being indexed
            indexing_cache_key = f"document_{document.id}_indexing"
            cache_result = redis_client.get(indexing_cache_key)
            if cache_result is not None:
                raise DocumentIndexingError(f"Document:{document.name} is being indexed, please try again later")

            # Prepare update based on action
            update_info = DocumentService._prepare_document_status_update(document, action, user)
            if update_info:
                documents_to_update.append(update_info)

        # Second pass: apply all updates in a single transaction
        if documents_to_update:
            try:
                for update_info in documents_to_update:
                    document = update_info["document"]
                    updates = update_info["updates"]

                    # Apply updates to the document
                    for field, value in updates.items():
                        setattr(document, field, value)

                    db.session.add(document)

                # Batch commit all changes
                db.session.commit()
            except Exception as e:
                # Rollback on any error
                db.session.rollback()
                raise e
            # Execute async tasks and set Redis cache after successful commit
            # propagation_error is used to capture any errors for submitting async task execution
            propagation_error = None
            for update_info in documents_to_update:
                try:
                    # Execute async tasks after successful commit
                    if update_info["async_task"]:
                        task_info = update_info["async_task"]
                        task_func = task_info["function"]
                        task_args = task_info["args"]
                        task_func.delay(*task_args)
                except Exception as e:
                    # Log the error but do not rollback the transaction
                    logging.exception(f"Error executing async task for document {update_info['document'].id}")
                    # don't raise the error immediately, but capture it for later
                    propagation_error = e
                try:
                    # Set Redis cache if needed after successful commit
                    if update_info["set_cache"]:
                        document = update_info["document"]
                        indexing_cache_key = f"document_{document.id}_indexing"
                        redis_client.setex(indexing_cache_key, 600, 1)
                except Exception:
                    # Log the error but do not rollback the transaction
                    logging.exception(f"Error setting cache for document {update_info['document'].id}")
            # Raise any propagation error after all updates
            if propagation_error:
                raise propagation_error

    @staticmethod
    def _prepare_document_status_update(document, action: str, user):
        """
        Prepare document status update information.

        Args:
            document: Document object to update
            action: Action to perform
            user: Current user

        Returns:
            dict: Update information or None if no update needed
        """
        now = datetime.datetime.now(datetime.UTC).replace(tzinfo=None)

        if action == "enable":
            return DocumentService._prepare_enable_update(document, now)
        elif action == "disable":
            return DocumentService._prepare_disable_update(document, user, now)
        elif action == "archive":
            return DocumentService._prepare_archive_update(document, user, now)
        elif action == "un_archive":
            return DocumentService._prepare_unarchive_update(document, now)

        return None

    @staticmethod
    def _prepare_enable_update(document, now):
        """Prepare updates for enabling a document."""
        if document.enabled:
            return None

        return {
            "document": document,
            "updates": {"enabled": True, "disabled_at": None, "disabled_by": None, "updated_at": now},
            "async_task": {"function": add_document_to_index_task, "args": [document.id]},
            "set_cache": True,
        }

    @staticmethod
    def _prepare_disable_update(document, user, now):
        """Prepare updates for disabling a document."""
        if not document.completed_at or document.indexing_status != "completed":
            raise DocumentIndexingError(f"Document: {document.name} is not completed.")

        if not document.enabled:
            return None

        return {
            "document": document,
            "updates": {"enabled": False, "disabled_at": now, "disabled_by": user.id, "updated_at": now},
            "async_task": {"function": remove_document_from_index_task, "args": [document.id]},
            "set_cache": True,
        }

    @staticmethod
    def _prepare_archive_update(document, user, now):
        """Prepare updates for archiving a document."""
        if document.archived:
            return None

        update_info = {
            "document": document,
            "updates": {"archived": True, "archived_at": now, "archived_by": user.id, "updated_at": now},
            "async_task": None,
            "set_cache": False,
        }

        # Only set async task and cache if document is currently enabled
        if document.enabled:
            update_info["async_task"] = {"function": remove_document_from_index_task, "args": [document.id]}
            update_info["set_cache"] = True

        return update_info

    @staticmethod
    def _prepare_unarchive_update(document, now):
        """Prepare updates for unarchiving a document."""
        if not document.archived:
            return None

        update_info = {
            "document": document,
            "updates": {"archived": False, "archived_at": None, "archived_by": None, "updated_at": now},
            "async_task": None,
            "set_cache": False,
        }

        # Only re-index if the document is currently enabled
        if document.enabled:
            update_info["async_task"] = {"function": add_document_to_index_task, "args": [document.id]}
            update_info["set_cache"] = True

        return update_info


class SegmentService:
    @classmethod

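`batch_update_document_status` above validates every document and builds an update plan first, commits all updates in one transaction, and only then fires async tasks, so a failed side effect cannot roll back committed state. A minimal, database-free sketch of that two-pass shape (all names here are hypothetical stand-ins, with plain dicts in place of ORM models and strings in place of Celery tasks):

```python
def batch_apply(items, prepare):
    """First pass: build plans; second pass: apply; side effects last."""
    plans = [p for p in (prepare(i) for i in items) if p]  # validate + plan
    applied, side_effects = [], []
    for plan in plans:  # apply all updates together (the "transaction")
        plan["item"].update(plan["updates"])
        applied.append(plan["item"])
    for plan in plans:  # side effects only after the "commit"
        if plan.get("task"):
            side_effects.append(plan["task"])
    return applied, side_effects


docs = [{"id": 1, "enabled": False}, {"id": 2, "enabled": True}]


def prepare_enable(doc):
    # Mirrors _prepare_enable_update: already-enabled documents are a no-op.
    if doc["enabled"]:
        return None
    return {"item": doc, "updates": {"enabled": True}, "task": f"index:{doc['id']}"}


applied, tasks = batch_apply(docs, prepare_enable)
```

Separating "plan", "apply", and "side effects" keeps per-action logic in small pure helpers that are easy to test without a database.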
@@ -101,7 +101,7 @@ class WeightModel(BaseModel):


class RetrievalModel(BaseModel):
    search_method: Literal["hybrid_search", "semantic_search", "full_text_search"]
    search_method: Literal["hybrid_search", "semantic_search", "full_text_search", "keyword_search"]
    reranking_enable: bool
    reranking_model: Optional[RerankingModel] = None
    reranking_mode: Optional[str] = None

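Widening the `Literal` above lets validation accept `"keyword_search"` while still rejecting unknown methods. A dependency-free sketch of the same check using `typing.get_args` (illustrative only, not the Pydantic implementation):

```python
from dataclasses import dataclass
from typing import Literal, get_args

SearchMethod = Literal["hybrid_search", "semantic_search", "full_text_search", "keyword_search"]


@dataclass
class RetrievalModelSketch:
    search_method: str

    def __post_init__(self):
        # Enforce the Literal's allowed values at construction time.
        if self.search_method not in get_args(SearchMethod):
            raise ValueError(f"unsupported search_method: {self.search_method}")


ok = RetrievalModelSketch("keyword_search")
try:
    RetrievalModelSketch("fuzzy_search")
    rejected = False
except ValueError:
    rejected = True
```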
@ -3,7 +3,7 @@ import logging
|
|||
|
||||
import click
|
||||
|
||||
from core.entities import DEFAULT_PLUGIN_ID
|
||||
from core.plugin.entities.plugin import GenericProviderID, ModelProviderID, ToolProviderID
|
||||
from models.engine import db
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
|
@@ -12,17 +12,17 @@ logger = logging.getLogger(__name__)
 class PluginDataMigration:
     @classmethod
     def migrate(cls) -> None:
-        cls.migrate_db_records("providers", "provider_name")  # large table
-        cls.migrate_db_records("provider_models", "provider_name")
-        cls.migrate_db_records("provider_orders", "provider_name")
-        cls.migrate_db_records("tenant_default_models", "provider_name")
-        cls.migrate_db_records("tenant_preferred_model_providers", "provider_name")
-        cls.migrate_db_records("provider_model_settings", "provider_name")
-        cls.migrate_db_records("load_balancing_model_configs", "provider_name")
+        cls.migrate_db_records("providers", "provider_name", ModelProviderID)  # large table
+        cls.migrate_db_records("provider_models", "provider_name", ModelProviderID)
+        cls.migrate_db_records("provider_orders", "provider_name", ModelProviderID)
+        cls.migrate_db_records("tenant_default_models", "provider_name", ModelProviderID)
+        cls.migrate_db_records("tenant_preferred_model_providers", "provider_name", ModelProviderID)
+        cls.migrate_db_records("provider_model_settings", "provider_name", ModelProviderID)
+        cls.migrate_db_records("load_balancing_model_configs", "provider_name", ModelProviderID)
         cls.migrate_datasets()
-        cls.migrate_db_records("embeddings", "provider_name")  # large table
-        cls.migrate_db_records("dataset_collection_bindings", "provider_name")
-        cls.migrate_db_records("tool_builtin_providers", "provider")
+        cls.migrate_db_records("embeddings", "provider_name", ModelProviderID)  # large table
+        cls.migrate_db_records("dataset_collection_bindings", "provider_name", ModelProviderID)
+        cls.migrate_db_records("tool_builtin_providers", "provider_name", ToolProviderID)

     @classmethod
     def migrate_datasets(cls) -> None:
@@ -66,9 +66,10 @@ limit 1000"""
                         fg="white",
                     )
                 )
-                retrieval_model["reranking_model"]["reranking_provider_name"] = (
-                    f"{DEFAULT_PLUGIN_ID}/{retrieval_model['reranking_model']['reranking_provider_name']}/{retrieval_model['reranking_model']['reranking_provider_name']}"
-                )
+                # update google to langgenius/gemini/google etc.
+                retrieval_model["reranking_model"]["reranking_provider_name"] = ModelProviderID(
+                    retrieval_model["reranking_model"]["reranking_provider_name"]
+                ).to_string()
                 retrieval_model_changed = True

         click.echo(
@@ -86,9 +87,11 @@ limit 1000"""
                 update_retrieval_model_sql = ", retrieval_model = :retrieval_model"
                 params["retrieval_model"] = json.dumps(retrieval_model)

+            params["provider_name"] = ModelProviderID(provider_name).to_string()
+
             sql = f"""update {table_name}
             set {provider_column_name} =
-            concat('{DEFAULT_PLUGIN_ID}/', {provider_column_name}, '/', {provider_column_name})
+            :provider_name
             {update_retrieval_model_sql}
             where id = :record_id"""
             conn.execute(db.text(sql), params)
@@ -122,7 +125,9 @@ limit 1000"""
         )

     @classmethod
-    def migrate_db_records(cls, table_name: str, provider_column_name: str) -> None:
+    def migrate_db_records(
+        cls, table_name: str, provider_column_name: str, provider_cls: type[GenericProviderID]
+    ) -> None:
         click.echo(click.style(f"Migrating [{table_name}] data for plugin", fg="white"))

         processed_count = 0
@@ -166,7 +171,8 @@ limit 1000"""
                     )

                     try:
-                        updated_value = f"{DEFAULT_PLUGIN_ID}/{provider_name}/{provider_name}"
+                        # update jina to langgenius/jina_tool/jina etc.
+                        updated_value = provider_cls(provider_name).to_string()
                         batch_updates.append((updated_value, record_id))
                     except Exception as e:
                         failed_ids.append(record_id)

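The migration above replaces the hard-coded `f"{DEFAULT_PLUGIN_ID}/{provider_name}/{provider_name}"` string with provider-ID classes whose `to_string()` also knows about renamed plugins (the comments note that `google` becomes `langgenius/gemini/google`). A minimal sketch of that qualification logic follows; the org name and the alias table are assumptions for illustration, not Dify's actual `ModelProviderID` implementation.

```python
# Hypothetical sketch of plugin-qualified provider IDs of the form
# "<org>/<plugin>/<provider>". Plain names get qualified with a default org;
# an alias table handles providers whose plugin has a different name.
DEFAULT_PLUGIN_ORG = "langgenius"  # assumed value of DEFAULT_PLUGIN_ID

PLUGIN_ALIASES = {"google": "gemini"}  # illustrative only


def to_qualified_id(provider_name: str) -> str:
    if provider_name.count("/") == 2:
        return provider_name  # already fully qualified, leave untouched
    plugin = PLUGIN_ALIASES.get(provider_name, provider_name)
    return f"{DEFAULT_PLUGIN_ORG}/{plugin}/{provider_name}"
```

Passing the class into `migrate_db_records` (rather than formatting the string inline) lets the same batch loop qualify both model providers and tool providers with their own rules.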
@@ -5,7 +5,7 @@ from sqlalchemy import and_, func, or_, select
 from sqlalchemy.orm import Session

 from core.workflow.entities.workflow_execution import WorkflowExecutionStatus
-from models import App, EndUser, WorkflowAppLog, WorkflowRun
+from models import Account, App, EndUser, WorkflowAppLog, WorkflowRun
 from models.enums import CreatorUserRole


@@ -21,6 +21,8 @@ class WorkflowAppService:
         created_at_after: datetime | None = None,
         page: int = 1,
         limit: int = 20,
+        created_by_end_user_session_id: str | None = None,
+        created_by_account: str | None = None,
     ) -> dict:
         """
         Get paginate workflow app logs using SQLAlchemy 2.0 style
@@ -32,6 +34,8 @@ class WorkflowAppService:
         :param created_at_after: filter logs created after this timestamp
         :param page: page number
         :param limit: items per page
+        :param created_by_end_user_session_id: filter by end user session id
+        :param created_by_account: filter by account email
         :return: Pagination object
         """
         # Build base statement using SQLAlchemy 2.0 style
@@ -71,6 +75,26 @@ class WorkflowAppService:
         if created_at_after:
             stmt = stmt.where(WorkflowAppLog.created_at >= created_at_after)

+        # Filter by end user session id or account email
+        if created_by_end_user_session_id:
+            stmt = stmt.join(
+                EndUser,
+                and_(
+                    WorkflowAppLog.created_by == EndUser.id,
+                    WorkflowAppLog.created_by_role == CreatorUserRole.END_USER,
+                    EndUser.session_id == created_by_end_user_session_id,
+                ),
+            )
+        if created_by_account:
+            stmt = stmt.join(
+                Account,
+                and_(
+                    WorkflowAppLog.created_by == Account.id,
+                    WorkflowAppLog.created_by_role == CreatorUserRole.ACCOUNT,
+                    Account.email == created_by_account,
+                ),
+            )
+
         stmt = stmt.order_by(WorkflowAppLog.created_at.desc())

         # Get total count using the same filters

@@ -0,0 +1,25 @@
+from core.rag.datasource.vdb.matrixone.matrixone_vector import MatrixoneConfig, MatrixoneVector
+from tests.integration_tests.vdb.test_vector_store import (
+    AbstractVectorTest,
+    get_example_text,
+    setup_mock_redis,
+)
+
+
+class MatrixoneVectorTest(AbstractVectorTest):
+    def __init__(self):
+        super().__init__()
+        self.vector = MatrixoneVector(
+            collection_name=self.collection_name,
+            config=MatrixoneConfig(
+                host="localhost", port=6001, user="dump", password="111", database="dify", metric="l2"
+            ),
+        )
+
+    def get_ids_by_metadata_field(self):
+        ids = self.vector.get_ids_by_metadata_field(key="document_id", value=self.example_doc_id)
+        assert len(ids) == 1
+
+
+def test_matrixone_vector(setup_mock_redis):
+    MatrixoneVectorTest().run_all_tests()

@@ -1,4 +1,5 @@
 import os
+from unittest.mock import MagicMock, patch

 import pytest
 from flask import Flask
@@ -11,6 +12,24 @@ PROJECT_DIR = os.path.abspath(os.path.join(ABS_PATH, os.pardir, os.pardir))

 CACHED_APP = Flask(__name__)

+# set global mock for Redis client
+redis_mock = MagicMock()
+redis_mock.get = MagicMock(return_value=None)
+redis_mock.setex = MagicMock()
+redis_mock.setnx = MagicMock()
+redis_mock.delete = MagicMock()
+redis_mock.lock = MagicMock()
+redis_mock.exists = MagicMock(return_value=False)
+redis_mock.set = MagicMock()
+redis_mock.expire = MagicMock()
+redis_mock.hgetall = MagicMock(return_value={})
+redis_mock.hdel = MagicMock()
+redis_mock.incr = MagicMock(return_value=1)
+
+# apply the mock to the Redis client in the Flask app
+redis_patcher = patch("extensions.ext_redis.redis_client", redis_mock)
+redis_patcher.start()
+

 @pytest.fixture
 def app() -> Flask:
@@ -21,3 +40,19 @@ def app() -> Flask:
 def _provide_app_context(app: Flask):
     with app.app_context():
         yield
+
+
+@pytest.fixture(autouse=True)
+def reset_redis_mock():
+    """reset the Redis mock before each test"""
+    redis_mock.reset_mock()
+    redis_mock.get.return_value = None
+    redis_mock.setex.return_value = None
+    redis_mock.setnx.return_value = None
+    redis_mock.delete.return_value = None
+    redis_mock.exists.return_value = False
+    redis_mock.set.return_value = None
+    redis_mock.expire.return_value = None
+    redis_mock.hgetall.return_value = {}
+    redis_mock.hdel.return_value = None
+    redis_mock.incr.return_value = 1

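The conftest above installs a module-level `MagicMock` in place of the real Redis client so unit tests never touch Redis, and resets it between tests. A self-contained sketch of the same pattern, using a stand-in module object instead of Dify's `extensions.ext_redis`:

```python
from types import SimpleNamespace
from unittest.mock import MagicMock, patch

# Stand-in for a module that exposes a global client attribute
# (assumption: the real code reads `extensions.ext_redis.redis_client`).
fake_module = SimpleNamespace(redis_client=None)

redis_mock = MagicMock()
redis_mock.get.return_value = None   # cache miss by default
redis_mock.incr.return_value = 1     # first increment yields 1


def cache_get_or_count(key: str) -> int:
    # Code under test: reads through the (patched) global client.
    client = fake_module.redis_client
    if client.get(key) is None:
        return client.incr(key)
    return int(client.get(key))


with patch.object(fake_module, "redis_client", redis_mock):
    result = cache_get_or_count("visits")  # hits the mock, not Redis
```

Because the patch targets the attribute the code actually reads at call time, no test needs a running Redis; the autouse fixture in the diff plays the role of the `with` block's cleanup, restoring default return values before every test.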
@@ -0,0 +1,22 @@
+from core.rag.extractor.markdown_extractor import MarkdownExtractor
+
+
+def test_markdown_to_tups():
+    markdown = """
+this is some text without header
+
+# title 1
+this is balabala text
+
+## title 2
+this is more specific text.
+"""
+    extractor = MarkdownExtractor(file_path="dummy_path")
+    updated_output = extractor.markdown_to_tups(markdown)
+    assert len(updated_output) == 3
+    key, header_value = updated_output[0]
+    assert key == None
+    assert header_value.strip() == "this is some text without header"
+    title_1, value = updated_output[1]
+    assert title_1.strip() == "title 1"
+    assert value.strip() == "this is balabala text"

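The test above pins down the contract of `markdown_to_tups`: the document is split into `(header, body)` tuples, and preamble text before the first header gets `None` as its key. A minimal sketch of that splitting logic, as an illustration of the asserted behavior rather than Dify's actual `MarkdownExtractor`:

```python
def markdown_to_tups(text: str) -> list:
    """Split markdown into (header, body) tuples; preamble gets a None key."""
    tups = []
    current_header = None
    current_lines = []
    for line in text.splitlines():
        if line.lstrip().startswith("#"):
            # flush the section accumulated so far (including the preamble)
            if current_lines or current_header is not None:
                tups.append((current_header, "\n".join(current_lines)))
            current_header = line.lstrip("#").strip()
            current_lines = []
        else:
            current_lines.append(line)
    tups.append((current_header, "\n".join(current_lines)))
    return tups
```

Running this over the test's sample markdown yields three tuples: `(None, preamble)`, `("title 1", …)`, `("title 2", …)`, matching the assertions.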
@@ -163,7 +163,6 @@ def real_workflow_run():
     workflow_run.tenant_id = "test-tenant-id"
     workflow_run.app_id = "test-app-id"
     workflow_run.workflow_id = "test-workflow-id"
     workflow_run.sequence_number = 1
     workflow_run.type = "chat"
     workflow_run.triggered_from = "app-run"
     workflow_run.version = "1.0"

api/uv.lock (4458): File diff suppressed because it is too large

@@ -399,7 +399,7 @@ SUPABASE_URL=your-server-url
 # ------------------------------

 # The type of vector store to use.
-# Supported values are `weaviate`, `qdrant`, `milvus`, `myscale`, `relyt`, `pgvector`, `pgvecto-rs`, `chroma`, `opensearch`, `oracle`, `tencent`, `elasticsearch`, `elasticsearch-ja`, `analyticdb`, `couchbase`, `vikingdb`, `oceanbase`, `opengauss`, `tablestore`,`vastbase`,`tidb`,`tidb_on_qdrant`,`baidu`,`lindorm`,`huawei_cloud`,`upstash`.
+# Supported values are `weaviate`, `qdrant`, `milvus`, `myscale`, `relyt`, `pgvector`, `pgvecto-rs`, `chroma`, `opensearch`, `oracle`, `tencent`, `elasticsearch`, `elasticsearch-ja`, `analyticdb`, `couchbase`, `vikingdb`, `oceanbase`, `opengauss`, `tablestore`,`vastbase`,`tidb`,`tidb_on_qdrant`,`baidu`,`lindorm`,`huawei_cloud`,`upstash`, `matrixone`.
 VECTOR_STORE=weaviate

 # The Weaviate endpoint URL. Only available when VECTOR_STORE is `weaviate`.
@@ -490,6 +490,13 @@ TIDB_VECTOR_USER=
 TIDB_VECTOR_PASSWORD=
 TIDB_VECTOR_DATABASE=dify

+# Matrixone vector configurations.
+MATRIXONE_HOST=matrixone
+MATRIXONE_PORT=6001
+MATRIXONE_USER=dump
+MATRIXONE_PASSWORD=111
+MATRIXONE_DATABASE=dify
+
 # Tidb on qdrant configuration, only available when VECTOR_STORE is `tidb_on_qdrant`
 TIDB_ON_QDRANT_URL=http://127.0.0.1
 TIDB_ON_QDRANT_API_KEY=dify
@@ -719,10 +726,11 @@ NOTION_INTERNAL_SECRET=
 # Mail related configuration
 # ------------------------------

-# Mail type, support: resend, smtp
+# Mail type, support: resend, smtp, sendgrid
 MAIL_TYPE=resend

 # Default send from email address, if not specified
+# If using SendGrid, use the 'from' field for authentication if necessary.
 MAIL_DEFAULT_SEND_FROM=

 # API-Key for the Resend email provider, used when MAIL_TYPE is `resend`.
@@ -738,6 +746,9 @@ SMTP_PASSWORD=
 SMTP_USE_TLS=true
 SMTP_OPPORTUNISTIC_TLS=false

+# SendGrid configuration
+SENDGRID_API_KEY=
+
 # ------------------------------
 # Others Configuration
 # ------------------------------

@@ -617,6 +617,18 @@ services:
     ports:
       - ${MYSCALE_PORT:-8123}:${MYSCALE_PORT:-8123}

+  # Matrixone vector store.
+  matrixone:
+    hostname: matrixone
+    image: matrixorigin/matrixone:2.1.1
+    profiles:
+      - matrixone
+    restart: always
+    volumes:
+      - ./volumes/matrixone/data:/mo-data
+    ports:
+      - ${MATRIXONE_PORT:-6001}:${MATRIXONE_PORT:-6001}
+
   # https://www.elastic.co/guide/en/elasticsearch/reference/current/settings.html
   # https://www.elastic.co/guide/en/elasticsearch/reference/current/docker.html#docker-prod-prerequisites
   elasticsearch:

@@ -195,6 +195,11 @@ x-shared-env: &shared-api-worker-env
   TIDB_VECTOR_USER: ${TIDB_VECTOR_USER:-}
   TIDB_VECTOR_PASSWORD: ${TIDB_VECTOR_PASSWORD:-}
   TIDB_VECTOR_DATABASE: ${TIDB_VECTOR_DATABASE:-dify}
+  MATRIXONE_HOST: ${MATRIXONE_HOST:-matrixone}
+  MATRIXONE_PORT: ${MATRIXONE_PORT:-6001}
+  MATRIXONE_USER: ${MATRIXONE_USER:-dump}
+  MATRIXONE_PASSWORD: ${MATRIXONE_PASSWORD:-111}
+  MATRIXONE_DATABASE: ${MATRIXONE_DATABASE:-dify}
   TIDB_ON_QDRANT_URL: ${TIDB_ON_QDRANT_URL:-http://127.0.0.1}
   TIDB_ON_QDRANT_API_KEY: ${TIDB_ON_QDRANT_API_KEY:-dify}
   TIDB_ON_QDRANT_CLIENT_TIMEOUT: ${TIDB_ON_QDRANT_CLIENT_TIMEOUT:-20}
@@ -322,6 +327,7 @@ x-shared-env: &shared-api-worker-env
   SMTP_PASSWORD: ${SMTP_PASSWORD:-}
   SMTP_USE_TLS: ${SMTP_USE_TLS:-true}
   SMTP_OPPORTUNISTIC_TLS: ${SMTP_OPPORTUNISTIC_TLS:-false}
+  SENDGRID_API_KEY: ${SENDGRID_API_KEY:-}
   INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH: ${INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH:-4000}
   INVITE_EXPIRY_HOURS: ${INVITE_EXPIRY_HOURS:-72}
   RESET_PASSWORD_TOKEN_EXPIRY_MINUTES: ${RESET_PASSWORD_TOKEN_EXPIRY_MINUTES:-5}
@@ -1124,6 +1130,18 @@ services:
     ports:
       - ${MYSCALE_PORT:-8123}:${MYSCALE_PORT:-8123}

+  # Matrixone vector store.
+  matrixone:
+    hostname: matrixone
+    image: matrixorigin/matrixone:2.1.1
+    profiles:
+      - matrixone
+    restart: always
+    volumes:
+      - ./volumes/matrixone/data:/mo-data
+    ports:
+      - ${MATRIXONE_PORT:-6001}:${MATRIXONE_PORT:-6001}
+
   # https://www.elastic.co/guide/en/elasticsearch/reference/current/settings.html
   # https://www.elastic.co/guide/en/elasticsearch/reference/current/docker.html#docker-prod-prerequisites
   elasticsearch:

@@ -8,17 +8,17 @@ import { useRouter } from 'next/navigation'
 import { useEffect } from 'react'

 export default function DatasetsLayout({ children }: { children: React.ReactNode }) {
-  const { isCurrentWorkspaceEditor, isCurrentWorkspaceDatasetOperator } = useAppContext()
+  const { isCurrentWorkspaceEditor, isCurrentWorkspaceDatasetOperator, currentWorkspace, isLoadingCurrentWorkspace } = useAppContext()
   const router = useRouter()

   useEffect(() => {
-    if (typeof isCurrentWorkspaceEditor !== 'boolean' || typeof isCurrentWorkspaceDatasetOperator !== 'boolean')
+    if (isLoadingCurrentWorkspace || !currentWorkspace.id)
       return
-    if (!isCurrentWorkspaceEditor && !isCurrentWorkspaceDatasetOperator)
+    if (!(isCurrentWorkspaceEditor || isCurrentWorkspaceDatasetOperator))
       router.replace('/apps')
-  }, [isCurrentWorkspaceEditor, isCurrentWorkspaceDatasetOperator, router])
+  }, [isCurrentWorkspaceEditor, isCurrentWorkspaceDatasetOperator, isLoadingCurrentWorkspace, currentWorkspace, router])

-  if (!isCurrentWorkspaceEditor && !isCurrentWorkspaceDatasetOperator)
+  if (isLoadingCurrentWorkspace || !(isCurrentWorkspaceEditor || isCurrentWorkspaceDatasetOperator))
     return <Loading type='app' />
   return (
     <ExternalKnowledgeApiProvider>

@@ -10,7 +10,6 @@ import PromptEditorHeightResizeWrap from './prompt-editor-height-resize-wrap'
 import cn from '@/utils/classnames'
 import type { PromptVariable } from '@/models/debug'
 import Tooltip from '@/app/components/base/tooltip'
-import type { CompletionParams } from '@/types/app'
 import { AppType } from '@/types/app'
 import { getNewVar, getVars } from '@/utils/var'
 import AutomaticBtn from '@/app/components/app/configuration/config/automatic/automatic-btn'
@@ -63,7 +62,6 @@ const Prompt: FC<ISimplePromptInput> = ({
   const { eventEmitter } = useEventEmitterContextContext()
   const {
     modelConfig,
-    completionParams,
     dataSets,
     setModelConfig,
     setPrevPromptConfig,
@@ -264,14 +262,6 @@ const Prompt: FC<ISimplePromptInput> = ({
       {showAutomatic && (
         <GetAutomaticResModal
           mode={mode as AppType}
-          model={
-            {
-              provider: modelConfig.provider,
-              name: modelConfig.model_id,
-              mode: modelConfig.mode,
-              completion_params: completionParams as CompletionParams,
-            }
-          }
           isShow={showAutomatic}
           onClose={showAutomaticFalse}
           onFinished={handleAutomaticRes}

@@ -1,6 +1,6 @@
 'use client'
 import type { FC } from 'react'
-import React, { useCallback } from 'react'
+import React, { useCallback, useEffect, useState } from 'react'
 import { useTranslation } from 'react-i18next'
 import { useBoolean } from 'ahooks'
 import {

@@ -22,7 +22,7 @@ import Textarea from '@/app/components/base/textarea'
 import Toast from '@/app/components/base/toast'
 import { generateRule } from '@/service/debug'
 import ConfigPrompt from '@/app/components/app/configuration/config-prompt'
-import type { Model } from '@/types/app'
+import type { CompletionParams, Model } from '@/types/app'
 import { AppType } from '@/types/app'
 import ConfigVar from '@/app/components/app/configuration/config-var'
 import GroupName from '@/app/components/app/configuration/base/group-name'

@@ -33,14 +33,15 @@ import { LoveMessage } from '@/app/components/base/icons/src/vender/features'
 // type
 import type { AutomaticRes } from '@/service/debug'
 import { Generator } from '@/app/components/base/icons/src/vender/other'
-import ModelIcon from '@/app/components/header/account-setting/model-provider-page/model-icon'
-import ModelName from '@/app/components/header/account-setting/model-provider-page/model-name'
+import ModelParameterModal from '@/app/components/header/account-setting/model-provider-page/model-parameter-modal'

 import { ModelTypeEnum } from '@/app/components/header/account-setting/model-provider-page/declarations'
 import { useModelListAndDefaultModelAndCurrentProviderAndModel } from '@/app/components/header/account-setting/model-provider-page/hooks'
+import type { ModelModeType } from '@/types/app'
+import type { FormValue } from '@/app/components/header/account-setting/model-provider-page/declarations'

 export type IGetAutomaticResProps = {
   mode: AppType
-  model: Model
   isShow: boolean
   onClose: () => void
   onFinished: (res: AutomaticRes) => void

@@ -65,16 +66,23 @@ const TryLabel: FC<{

 const GetAutomaticRes: FC<IGetAutomaticResProps> = ({
   mode,
-  model,
   isShow,
   onClose,
   isInLLMNode,
   onFinished,
 }) => {
   const { t } = useTranslation()
+  const localModel = localStorage.getItem('auto-gen-model')
+    ? JSON.parse(localStorage.getItem('auto-gen-model') as string) as Model
+    : null
+  const [model, setModel] = React.useState<Model>(localModel || {
+    name: '',
+    provider: '',
+    mode: mode as unknown as ModelModeType.chat,
+    completion_params: {} as CompletionParams,
+  })
   const {
     currentProvider,
     currentModel,
+    defaultModel,
   } = useModelListAndDefaultModelAndCurrentProviderAndModel(ModelTypeEnum.textGeneration)
   const tryList = [
     {

@@ -115,7 +123,7 @@ const GetAutomaticRes: FC<IGetAutomaticResProps> = ({
     },
   ]

-  const [instruction, setInstruction] = React.useState<string>('')
+  const [instruction, setInstruction] = useState<string>('')
   const handleChooseTemplate = useCallback((key: string) => {
     return () => {
       const template = t(`appDebug.generate.template.${key}.instruction`)

@@ -135,7 +143,25 @@ const GetAutomaticRes: FC<IGetAutomaticResProps> = ({
     return true
   }
   const [isLoading, { setTrue: setLoadingTrue, setFalse: setLoadingFalse }] = useBoolean(false)
-  const [res, setRes] = React.useState<AutomaticRes | null>(null)
+  const [res, setRes] = useState<AutomaticRes | null>(null)
+
+  useEffect(() => {
+    if (defaultModel) {
+      const localModel = localStorage.getItem('auto-gen-model')
+        ? JSON.parse(localStorage.getItem('auto-gen-model') || '')
+        : null
+      if (localModel) {
+        setModel(localModel)
+      }
+      else {
+        setModel(prev => ({
+          ...prev,
+          name: defaultModel.model,
+          provider: defaultModel.provider.provider,
+        }))
+      }
+    }
+  }, [defaultModel])

   const renderLoading = (
     <div className='flex h-full w-0 grow flex-col items-center justify-center space-y-3'>

@@ -154,6 +180,26 @@ const GetAutomaticRes: FC<IGetAutomaticResProps> = ({
     </div>
   )

+  const handleModelChange = useCallback((newValue: { modelId: string; provider: string; mode?: string; features?: string[] }) => {
+    const newModel = {
+      ...model,
+      provider: newValue.provider,
+      name: newValue.modelId,
+      mode: newValue.mode as ModelModeType,
+    }
+    setModel(newModel)
+    localStorage.setItem('auto-gen-model', JSON.stringify(newModel))
+  }, [model, setModel])
+
+  const handleCompletionParamsChange = useCallback((newParams: FormValue) => {
+    const newModel = {
+      ...model,
+      completion_params: newParams as CompletionParams,
+    }
+    setModel(newModel)
+    localStorage.setItem('auto-gen-model', JSON.stringify(newModel))
+  }, [model, setModel])
+
   const onGenerate = async () => {
     if (!isValid())
       return

@@ -198,17 +244,18 @@ const GetAutomaticRes: FC<IGetAutomaticResProps> = ({
           <div className={`text-lg font-bold leading-[28px] ${s.textGradient}`}>{t('appDebug.generate.title')}</div>
           <div className='mt-1 text-[13px] font-normal text-text-tertiary'>{t('appDebug.generate.description')}</div>
         </div>
-        <div className='mb-8 flex items-center'>
-          <ModelIcon
-            className='mr-1.5 shrink-0 '
-            provider={currentProvider}
-            modelName={currentModel?.model}
-          />
-          <ModelName
-            className='grow'
-            modelItem={currentModel!}
-            showMode
-            showFeatures
+        <div className='mb-8'>
+          <ModelParameterModal
+            popupClassName='!w-[520px]'
+            portalToFollowElemContentClassName='z-[1000]'
+            isAdvancedMode={true}
+            provider={model.provider}
+            mode={model.mode}
+            completionParams={model.completion_params}
+            modelId={model.name}
+            setModel={handleModelChange}
+            onCompletionParamsChange={handleCompletionParamsChange}
+            hideDebugWithMultipleModel
           />
         </div>
         <div >

@@ -1,5 +1,5 @@
 import type { FC } from 'react'
-import React from 'react'
+import React, { useCallback, useEffect } from 'react'
 import cn from 'classnames'
 import useBoolean from 'ahooks/lib/useBoolean'
 import { useTranslation } from 'react-i18next'

@@ -7,8 +7,10 @@ import ConfigPrompt from '../../config-prompt'
 import { languageMap } from '../../../../workflow/nodes/_base/components/editor/code-editor/index'
 import { generateRuleCode } from '@/service/debug'
 import type { CodeGenRes } from '@/service/debug'
-import { type AppType, type Model, ModelModeType } from '@/types/app'
+import type { ModelModeType } from '@/types/app'
+import type { AppType, CompletionParams, Model } from '@/types/app'
 import Modal from '@/app/components/base/modal'
+import Textarea from '@/app/components/base/textarea'
 import Button from '@/app/components/base/button'
 import { Generator } from '@/app/components/base/icons/src/vender/other'
 import Toast from '@/app/components/base/toast'

@@ -17,8 +19,9 @@ import Confirm from '@/app/components/base/confirm'
 import type { CodeLanguage } from '@/app/components/workflow/nodes/code/types'
 import { useModelListAndDefaultModelAndCurrentProviderAndModel } from '@/app/components/header/account-setting/model-provider-page/hooks'
 import { ModelTypeEnum } from '@/app/components/header/account-setting/model-provider-page/declarations'
-import ModelIcon from '@/app/components/header/account-setting/model-provider-page/model-icon'
-import ModelName from '@/app/components/header/account-setting/model-provider-page/model-name'
+import ModelParameterModal from '@/app/components/header/account-setting/model-provider-page/model-parameter-modal'
+import type { FormValue } from '@/app/components/header/account-setting/model-provider-page/declarations'

 export type IGetCodeGeneratorResProps = {
   mode: AppType
   isShow: boolean

@@ -36,11 +39,28 @@ export const GetCodeGeneratorResModal: FC<IGetCodeGeneratorResProps> = (
     onFinished,
   },
 ) => {
-  const {
-    currentProvider,
-    currentModel,
-  } = useModelListAndDefaultModelAndCurrentProviderAndModel(ModelTypeEnum.textGeneration)
   const { t } = useTranslation()
+  const defaultCompletionParams = {
+    temperature: 0.7,
+    max_tokens: 0,
+    top_p: 0,
+    echo: false,
+    stop: [],
+    presence_penalty: 0,
+    frequency_penalty: 0,
+  }
+  const localModel = localStorage.getItem('auto-gen-model')
+    ? JSON.parse(localStorage.getItem('auto-gen-model') as string) as Model
+    : null
+  const [model, setModel] = React.useState<Model>(localModel || {
+    name: '',
+    provider: '',
+    mode: mode as unknown as ModelModeType.chat,
+    completion_params: defaultCompletionParams,
+  })
+  const {
+    defaultModel,
+  } = useModelListAndDefaultModelAndCurrentProviderAndModel(ModelTypeEnum.textGeneration)
   const [instruction, setInstruction] = React.useState<string>('')
   const [isLoading, { setTrue: setLoadingTrue, setFalse: setLoadingFalse }] = useBoolean(false)
   const [res, setRes] = React.useState<CodeGenRes | null>(null)

@@ -56,21 +76,27 @@ export const GetCodeGeneratorResModal: FC<IGetCodeGeneratorResProps> = (
     }
     return true
   }
-  const model: Model = {
-    provider: currentProvider?.provider || '',
-    name: currentModel?.model || '',
-    mode: ModelModeType.chat,
-    // This is a fixed parameter
-    completion_params: {
-      temperature: 0.7,
-      max_tokens: 0,
-      top_p: 0,
-      echo: false,
-      stop: [],
-      presence_penalty: 0,
-      frequency_penalty: 0,
-    },
-  }
+
+  const handleModelChange = useCallback((newValue: { modelId: string; provider: string; mode?: string; features?: string[] }) => {
+    const newModel = {
+      ...model,
+      provider: newValue.provider,
+      name: newValue.modelId,
+      mode: newValue.mode as ModelModeType,
+    }
+    setModel(newModel)
+    localStorage.setItem('auto-gen-model', JSON.stringify(newModel))
+  }, [model, setModel])
+
+  const handleCompletionParamsChange = useCallback((newParams: FormValue) => {
+    const newModel = {
+      ...model,
+      completion_params: newParams as CompletionParams,
+    }
+    setModel(newModel)
+    localStorage.setItem('auto-gen-model', JSON.stringify(newModel))
+  }, [model, setModel])
+
   const isInLLMNode = true
   const onGenerate = async () => {
     if (!isValid())

@@ -99,16 +125,40 @@ export const GetCodeGeneratorResModal: FC<IGetCodeGeneratorResProps> = (
   }
   const [showConfirmOverwrite, setShowConfirmOverwrite] = React.useState(false)

+  useEffect(() => {
+    if (defaultModel) {
+      const localModel = localStorage.getItem('auto-gen-model')
+        ? JSON.parse(localStorage.getItem('auto-gen-model') || '')
+        : null
+      if (localModel) {
+        setModel({
+          ...localModel,
+          completion_params: {
+            ...defaultCompletionParams,
+            ...localModel.completion_params,
+          },
+        })
+      }
+      else {
+        setModel(prev => ({
+          ...prev,
+          name: defaultModel.model,
+          provider: defaultModel.provider.provider,
+        }))
+      }
+    }
+  }, [defaultModel])
+
   const renderLoading = (
     <div className='flex h-full w-0 grow flex-col items-center justify-center space-y-3'>
       <Loading />
-      <div className='text-[13px] text-gray-400'>{t('appDebug.codegen.loading')}</div>
+      <div className='text-[13px] text-text-tertiary'>{t('appDebug.codegen.loading')}</div>
     </div>
   )
   const renderNoData = (
     <div className='flex h-full w-0 grow flex-col items-center justify-center space-y-3 px-8'>
-      <Generator className='h-14 w-14 text-gray-300' />
-      <div className='text-center text-[13px] font-normal leading-5 text-gray-400'>
+      <Generator className='h-14 w-14 text-text-tertiary' />
+      <div className='text-center text-[13px] font-normal leading-5 text-text-tertiary'>
         <div>{t('appDebug.codegen.noDataLine1')}</div>
         <div>{t('appDebug.codegen.noDataLine2')}</div>
       </div>

@@ -123,29 +173,30 @@ export const GetCodeGeneratorResModal: FC<IGetCodeGeneratorResProps> = (
       closable
     >
       <div className='relative flex h-[680px] flex-wrap'>
-        <div className='h-full w-[570px] shrink-0 overflow-y-auto border-r border-gray-100 p-8'>
+        <div className='h-full w-[570px] shrink-0 overflow-y-auto border-r border-divider-regular p-8'>
           <div className='mb-8'>
-            <div className={'text-lg font-bold leading-[28px]'}>{t('appDebug.codegen.title')}</div>
-            <div className='mt-1 text-[13px] font-normal text-gray-500'>{t('appDebug.codegen.description')}</div>
+            <div className={'text-lg font-bold leading-[28px] text-text-primary'}>{t('appDebug.codegen.title')}</div>
+            <div className='mt-1 text-[13px] font-normal text-text-tertiary'>{t('appDebug.codegen.description')}</div>
           </div>
-          <div className='flex items-center'>
-            <ModelIcon
-              className='mr-1.5 shrink-0'
-              provider={currentProvider}
-              modelName={currentModel?.model}
-            />
-            <ModelName
-              className='grow'
-              modelItem={currentModel!}
-              showMode
-              showFeatures
+          <div className='mb-8'>
+            <ModelParameterModal
+              popupClassName='!w-[520px]'
+              portalToFollowElemContentClassName='z-[1000]'
+              isAdvancedMode={true}
+              provider={model.provider}
+              mode={model.mode}
+              completionParams={model.completion_params}
+              modelId={model.name}
+              setModel={handleModelChange}
+              onCompletionParamsChange={handleCompletionParamsChange}
+              hideDebugWithMultipleModel
             />
           </div>
-          <div className='mt-6'>
+          <div>
             <div className='text-[0px]'>
-              <div className='mb-2 text-sm font-medium leading-5 text-gray-900'>{t('appDebug.codegen.instruction')}</div>
-              <textarea
-                className="h-[200px] w-full overflow-y-auto rounded-lg bg-gray-50 px-3 py-2 text-sm"
+              <div className='mb-2 text-sm font-medium leading-5 text-text-primary'>{t('appDebug.codegen.instruction')}</div>
+              <Textarea
+                className="h-[200px] resize-none"
                 placeholder={t('appDebug.codegen.instructionPlaceholder') || ''}
                 value={instruction}
                 onChange={e => setInstruction(e.target.value)}

@@ -169,7 +220,7 @@ export const GetCodeGeneratorResModal: FC<IGetCodeGeneratorResProps> = (
           {!isLoading && !res && renderNoData}
           {(!isLoading && res) && (
             <div className='h-full w-0 grow p-6 pb-0'>
-              <div className='mb-3 shrink-0 text-base font-semibold leading-[160%] text-gray-800'>{t('appDebug.codegen.resTitle')}</div>
+              <div className='mb-3 shrink-0 text-base font-semibold leading-[160%] text-text-secondary'>{t('appDebug.codegen.resTitle')}</div>
               <div className={cn('max-h-[555px] overflow-y-auto', !isInLLMNode && 'pb-2')}>
                 <ConfigPrompt
                   mode={mode}

@@ -185,7 +236,7 @@ export const GetCodeGeneratorResModal: FC<IGetCodeGeneratorResProps> = (
                 <>
                   {res?.code && (
                     <div className='mt-4'>
-                      <h3 className='mb-2 text-sm font-medium text-gray-900'>{t('appDebug.codegen.generatedCode')}</h3>
+                      <h3 className='mb-2 text-sm font-medium text-text-primary'>{t('appDebug.codegen.generatedCode')}</h3>
                       <pre className='overflow-x-auto rounded-lg bg-gray-50 p-4'>
                         <code className={`language-${res.language}`}>
                           {res.code}

@@ -202,7 +253,7 @@ export const GetCodeGeneratorResModal: FC<IGetCodeGeneratorResProps> = (
           )}
         </div>

-        <div className='flex justify-end bg-white py-4'>
+        <div className='flex justify-end bg-background-default py-4'>
           <Button onClick={onClose}>{t('common.operation.cancel')}</Button>
           <Button variant='primary' className='ml-2' onClick={() => {
             setShowConfirmOverwrite(true)

@@ -271,9 +271,7 @@ const CodeBlock: any = memo(({ inline, className, children = '', ...props }: any
  const content = String(children).replace(/\n$/, '')
  switch (language) {
    case 'mermaid':
      if (isSVG)
        return <Flowchart PrimitiveCode={content} />
      break
      return <Flowchart PrimitiveCode={content} theme={theme as 'light' | 'dark'} />
    case 'echarts': {
      // Loading state: show loading indicator
      if (chartState === 'loading') {
@@ -428,7 +426,7 @@ const CodeBlock: any = memo(({ inline, className, children = '', ...props }: any
      <div className='flex h-8 items-center justify-between rounded-t-[10px] border-b border-divider-subtle bg-components-input-bg-normal p-1 pl-3'>
        <div className='system-xs-semibold-uppercase text-text-secondary'>{languageShowName}</div>
        <div className='flex items-center gap-1'>
          {(['mermaid', 'svg']).includes(language!) && <SVGBtn isSVG={isSVG} setIsSVG={setIsSVG} />}
          {language === 'svg' && <SVGBtn isSVG={isSVG} setIsSVG={setIsSVG} />}
          <ActionButton>
            <CopyIcon content={String(children).replace(/\n$/, '')} />
          </ActionButton>
@@ -16,7 +16,7 @@ const Link = ({ node, children, ...props }: any) => {
  }
  else {
    const href = props.href || node.properties?.href
    if(!isValidUrl(href))
    if(!href || !isValidUrl(href))
      return <span>{children}</span>

    return <a href={href} target="_blank" className="cursor-pointer underline !decoration-primary-700 decoration-dashed">{children || 'Download'}</a>
@@ -1,5 +1,5 @@
import React, { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import mermaid from 'mermaid'
import mermaid, { type MermaidConfig } from 'mermaid'
import { useTranslation } from 'react-i18next'
import { ExclamationTriangleIcon } from '@heroicons/react/24/outline'
import { MoonIcon, SunIcon } from '@heroicons/react/24/solid'
@@ -68,14 +68,13 @@ const THEMES = {
const initMermaid = () => {
  if (typeof window !== 'undefined' && !isMermaidInitialized) {
    try {
      mermaid.initialize({
      const config: MermaidConfig = {
        startOnLoad: false,
        fontFamily: 'sans-serif',
        securityLevel: 'loose',
        flowchart: {
          htmlLabels: true,
          useMaxWidth: true,
          diagramPadding: 10,
          curve: 'basis',
          nodeSpacing: 50,
          rankSpacing: 70,
@@ -94,10 +93,10 @@ const initMermaid = () => {
        mindmap: {
          useMaxWidth: true,
          padding: 10,
          diagramPadding: 20,
        },
        maxTextSize: 50000,
      })
      }
      mermaid.initialize(config)
      isMermaidInitialized = true
    }
    catch (error) {
@@ -113,7 +112,7 @@ const Flowchart = React.forwardRef((props: {
  theme?: 'light' | 'dark'
}, ref) => {
  const { t } = useTranslation()
  const [svgCode, setSvgCode] = useState<string | null>(null)
  const [svgString, setSvgString] = useState<string | null>(null)
  const [look, setLook] = useState<'classic' | 'handDrawn'>('classic')
  const [isInitialized, setIsInitialized] = useState(false)
  const [currentTheme, setCurrentTheme] = useState<'light' | 'dark'>(props.theme || 'light')
@@ -125,6 +124,7 @@ const Flowchart = React.forwardRef((props: {
  const [imagePreviewUrl, setImagePreviewUrl] = useState('')
  const [isCodeComplete, setIsCodeComplete] = useState(false)
  const codeCompletionCheckRef = useRef<NodeJS.Timeout>()
  const prevCodeRef = useRef<string>()

  // Create cache key from code, style and theme
  const cacheKey = useMemo(() => {
@@ -169,50 +169,18 @@ const Flowchart = React.forwardRef((props: {
   */
  const handleRenderError = (error: any) => {
    console.error('Mermaid rendering error:', error)
    const errorMsg = (error as Error).message

    if (errorMsg.includes('getAttribute')) {
      diagramCache.clear()
      mermaid.initialize({
        startOnLoad: false,
        securityLevel: 'loose',
      })
    // On any render error, assume the mermaid state is corrupted and force a re-initialization.
    try {
      diagramCache.clear() // Clear cache to prevent using potentially corrupted SVGs
      isMermaidInitialized = false // <-- THE FIX: Force re-initialization
      initMermaid() // Re-initialize with the default safe configuration
    }
    else {
      setErrMsg(`Rendering chart failed, please refresh and try again ${look === 'handDrawn' ? 'Or try using classic mode' : ''}`)
    }

    if (look === 'handDrawn') {
      try {
        // Clear possible cache issues
        diagramCache.delete(`${props.PrimitiveCode}-handDrawn-${currentTheme}`)

        // Reset mermaid configuration
        mermaid.initialize({
          startOnLoad: false,
          securityLevel: 'loose',
          theme: 'default',
          maxTextSize: 50000,
        })

        // Try rendering with standard mode
        setLook('classic')
        setErrMsg('Hand-drawn mode is not supported for this diagram. Switched to classic mode.')

        // Delay error clearing
        setTimeout(() => {
          if (containerRef.current) {
            // Try rendering again with standard mode, but can't call renderFlowchart directly due to circular dependency
            // Instead set state to trigger re-render
            setIsCodeComplete(true) // This will trigger useEffect re-render
          }
        }, 500)
      }
      catch (e) {
        console.error('Reset after handDrawn error failed:', e)
      }
    catch (reinitError) {
      console.error('Failed to re-initialize Mermaid after error:', reinitError)
    }

    setErrMsg(`Rendering failed: ${(error as Error).message || 'Unknown error. Please check the console.'}`)
    setIsLoading(false)
  }
@@ -223,51 +191,23 @@ const Flowchart = React.forwardRef((props: {
    setIsInitialized(true)
  }, [])

  // Update theme when prop changes
  // Update theme when prop changes, but allow internal override.
  const prevThemeRef = useRef<string>()
  useEffect(() => {
    if (props.theme)
    // Only react if the theme prop from the outside has actually changed.
    if (props.theme && props.theme !== prevThemeRef.current) {
      // When the global theme prop changes, it should act as the source of truth,
      // overriding any local theme selection.
      diagramCache.clear()
      setSvgString(null)
      setCurrentTheme(props.theme)
      // Reset look to classic for a consistent state after a global change.
      setLook('classic')
    }
    // Update the ref to the current prop value for the next render.
    prevThemeRef.current = props.theme
  }, [props.theme])

  // Validate mermaid code and check for completeness
  useEffect(() => {
    if (codeCompletionCheckRef.current)
      clearTimeout(codeCompletionCheckRef.current)

    // Reset code complete status when code changes
    setIsCodeComplete(false)

    // If no code or code is extremely short, don't proceed
    if (!props.PrimitiveCode || props.PrimitiveCode.length < 10)
      return

    // Check if code already in cache - if so we know it's valid
    if (diagramCache.has(cacheKey)) {
      setIsCodeComplete(true)
      return
    }

    // Initial check using the extracted isMermaidCodeComplete function
    const isComplete = isMermaidCodeComplete(props.PrimitiveCode)
    if (isComplete) {
      setIsCodeComplete(true)
      return
    }

    // Set a delay to check again in case code is still being generated
    codeCompletionCheckRef.current = setTimeout(() => {
      setIsCodeComplete(isMermaidCodeComplete(props.PrimitiveCode))
    }, 300)

    return () => {
      if (codeCompletionCheckRef.current)
        clearTimeout(codeCompletionCheckRef.current)
    }
  }, [props.PrimitiveCode, cacheKey])

  /**
   * Renders flowchart based on provided code
   */
  const renderFlowchart = useCallback(async (primitiveCode: string) => {
    if (!isInitialized || !containerRef.current) {
      setIsLoading(false)
@@ -275,15 +215,11 @@ const Flowchart = React.forwardRef((props: {
      return
    }

    // Don't render if code is not complete yet
    if (!isCodeComplete) {
      setIsLoading(true)
      return
    }

    // Return cached result if available
    const cacheKey = `${primitiveCode}-${look}-${currentTheme}`
    if (diagramCache.has(cacheKey)) {
      setSvgCode(diagramCache.get(cacheKey) || null)
      setErrMsg('')
      setSvgString(diagramCache.get(cacheKey) || null)
      setIsLoading(false)
      return
    }
@@ -294,17 +230,45 @@ const Flowchart = React.forwardRef((props: {
    try {
      let finalCode: string

      // Check if it's a gantt chart or mindmap
      const isGanttChart = primitiveCode.trim().startsWith('gantt')
      const isMindMap = primitiveCode.trim().startsWith('mindmap')
      const trimmedCode = primitiveCode.trim()
      const isGantt = trimmedCode.startsWith('gantt')
      const isMindMap = trimmedCode.startsWith('mindmap')
      const isSequence = trimmedCode.startsWith('sequenceDiagram')

      if (isGanttChart || isMindMap) {
        // For gantt charts and mindmaps, ensure each task is on its own line
        // and preserve exact whitespace/format
        finalCode = primitiveCode.trim()
      if (isGantt || isMindMap || isSequence) {
        if (isGantt) {
          finalCode = trimmedCode
            .split('\n')
            .map((line) => {
              // Gantt charts have specific syntax needs.
              const taskMatch = line.match(/^\s*([^:]+?)\s*:\s*(.*)/)
              if (!taskMatch)
                return line // Not a task line, return as is.

              const taskName = taskMatch[1].trim()
              let paramsStr = taskMatch[2].trim()

              // Rule 1: Correct multiple "after" dependencies ONLY if they exist.
              // This is a common mistake, e.g., "..., after task1, after task2, ..."
              const afterCount = (paramsStr.match(/after /g) || []).length
              if (afterCount > 1)
                paramsStr = paramsStr.replace(/,\s*after\s+/g, ' ')

              // Rule 2: Normalize spacing between parameters for consistency.
              const finalParams = paramsStr.replace(/\s*,\s*/g, ', ').trim()
              return `${taskName} :${finalParams}`
            })
            .join('\n')
        }
        else {
          // For mindmap and sequence charts, which are sensitive to syntax,
          // pass the code through directly.
          finalCode = trimmedCode
        }
      }
      else {
        // Step 1: Clean and prepare Mermaid code using the extracted prepareMermaidCode function
        // This function handles flowcharts appropriately.
        finalCode = prepareMermaidCode(primitiveCode, look)
      }
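The gantt task-line normalization this hunk introduces can be exercised on its own. The sketch below extracts that per-line logic verbatim from the patch into a standalone function; the name `normalizeGanttTaskLine` is mine, not part of the diff.

```typescript
// Standalone sketch of the gantt task-line normalization from the patch above.
// Rule 1 collapses multiple "after" dependencies ("after a, after b" -> "after a b"),
// Rule 2 normalizes comma spacing; non-task lines pass through untouched.
function normalizeGanttTaskLine(line: string): string {
  const taskMatch = line.match(/^\s*([^:]+?)\s*:\s*(.*)/)
  if (!taskMatch)
    return line // Not a task line, return as is.

  const taskName = taskMatch[1].trim()
  let paramsStr = taskMatch[2].trim()

  // Rule 1: correct multiple "after" dependencies only if they exist.
  const afterCount = (paramsStr.match(/after /g) || []).length
  if (afterCount > 1)
    paramsStr = paramsStr.replace(/,\s*after\s+/g, ' ')

  // Rule 2: normalize spacing between parameters.
  const finalParams = paramsStr.replace(/\s*,\s*/g, ', ').trim()
  return `${taskName} :${finalParams}`
}
```

Mermaid's gantt parser accepts only one `after` clause per task, listing the dependency ids space-separated, which is why the patch rewrites `after design, after review` into `after design review`.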
@@ -319,13 +283,12 @@ const Flowchart = React.forwardRef((props: {
        THEMES,
      )

      // Step 4: Clean SVG code and convert to base64 using the extracted functions
      // Step 4: Clean up SVG code
      const cleanedSvg = cleanUpSvgCode(processedSvg)
      const base64Svg = await svgToBase64(cleanedSvg)

      if (base64Svg && typeof base64Svg === 'string') {
        diagramCache.set(cacheKey, base64Svg)
        setSvgCode(base64Svg)
      if (cleanedSvg && typeof cleanedSvg === 'string') {
        diagramCache.set(cacheKey, cleanedSvg)
        setSvgString(cleanedSvg)
      }

      setIsLoading(false)
@@ -334,12 +297,9 @@ const Flowchart = React.forwardRef((props: {
      // Error handling
      handleRenderError(error)
    }
  }, [chartId, isInitialized, cacheKey, isCodeComplete, look, currentTheme, t])
  }, [chartId, isInitialized, look, currentTheme, t])

  /**
   * Configure mermaid based on selected style and theme
   */
  const configureMermaid = useCallback(() => {
  const configureMermaid = useCallback((primitiveCode: string) => {
    if (typeof window !== 'undefined' && isInitialized) {
      const themeVars = THEMES[currentTheme]
      const config: any = {
@@ -361,23 +321,37 @@ const Flowchart = React.forwardRef((props: {
        mindmap: {
          useMaxWidth: true,
          padding: 10,
          diagramPadding: 20,
        },
      }

      const isFlowchart = primitiveCode.trim().startsWith('graph') || primitiveCode.trim().startsWith('flowchart')

      if (look === 'classic') {
        config.theme = currentTheme === 'dark' ? 'dark' : 'neutral'
        config.flowchart = {
          htmlLabels: true,
          useMaxWidth: true,
          diagramPadding: 12,
          nodeSpacing: 60,
          rankSpacing: 80,
          curve: 'linear',
          ranker: 'tight-tree',

        if (isFlowchart) {
          config.flowchart = {
            htmlLabels: true,
            useMaxWidth: true,
            nodeSpacing: 60,
            rankSpacing: 80,
            curve: 'linear',
            ranker: 'tight-tree',
          }
        }

        if (currentTheme === 'dark') {
          config.themeVariables = {
            background: themeVars.background,
            primaryColor: themeVars.primaryColor,
            primaryBorderColor: themeVars.primaryBorderColor,
            primaryTextColor: themeVars.primaryTextColor,
            secondaryColor: themeVars.secondaryColor,
            tertiaryColor: themeVars.tertiaryColor,
          }
        }
      }
      else {
      else { // look === 'handDrawn'
        config.theme = 'default'
        config.themeCSS = `
          .node rect { fill-opacity: 0.85; }
@@ -389,27 +363,17 @@ const Flowchart = React.forwardRef((props: {
        config.themeVariables = {
          fontSize: '14px',
          fontFamily: 'sans-serif',
          primaryBorderColor: currentTheme === 'dark' ? THEMES.dark.connectionColor : THEMES.light.connectionColor,
        }
        config.flowchart = {
          htmlLabels: true,
          useMaxWidth: true,
          diagramPadding: 10,
          nodeSpacing: 40,
          rankSpacing: 60,
          curve: 'basis',
        }
        config.themeVariables.primaryBorderColor = currentTheme === 'dark' ? THEMES.dark.connectionColor : THEMES.light.connectionColor
      }

      if (currentTheme === 'dark' && !config.themeVariables) {
        config.themeVariables = {
          background: themeVars.background,
          primaryColor: themeVars.primaryColor,
          primaryBorderColor: themeVars.primaryBorderColor,
          primaryTextColor: themeVars.primaryTextColor,
          secondaryColor: themeVars.secondaryColor,
          tertiaryColor: themeVars.tertiaryColor,
          fontFamily: 'sans-serif',
        if (isFlowchart) {
          config.flowchart = {
            htmlLabels: true,
            useMaxWidth: true,
            nodeSpacing: 40,
            rankSpacing: 60,
            curve: 'basis',
          }
        }
      }
@@ -425,44 +389,50 @@ const Flowchart = React.forwardRef((props: {
    return false
  }, [currentTheme, isInitialized, look])

  // Effect for theme and style configuration
  // This is the main rendering effect.
  // It triggers whenever the code, theme, or style changes.
  useEffect(() => {
    if (diagramCache.has(cacheKey)) {
      setSvgCode(diagramCache.get(cacheKey) || null)
      setIsLoading(false)
      return
    }

    if (configureMermaid() && containerRef.current && isCodeComplete)
      renderFlowchart(props.PrimitiveCode)
  }, [look, props.PrimitiveCode, renderFlowchart, isInitialized, cacheKey, currentTheme, isCodeComplete, configureMermaid])

  // Effect for rendering with debounce
  useEffect(() => {
    if (diagramCache.has(cacheKey)) {
      setSvgCode(diagramCache.get(cacheKey) || null)
    if (!isInitialized)
      return

    // Don't render if code is too short
    if (!props.PrimitiveCode || props.PrimitiveCode.length < 10) {
      setIsLoading(false)
      setSvgString(null)
      return
    }

    // Use a timeout to handle streaming code and debounce rendering
    if (renderTimeoutRef.current)
      clearTimeout(renderTimeoutRef.current)

    if (isCodeComplete) {
      renderTimeoutRef.current = setTimeout(() => {
        if (isInitialized)
          renderFlowchart(props.PrimitiveCode)
      }, 300)
    }
    else {
      setIsLoading(true)
    }
    setIsLoading(true)

    renderTimeoutRef.current = setTimeout(() => {
      // Final validation before rendering
      if (!isMermaidCodeComplete(props.PrimitiveCode)) {
        setIsLoading(false)
        setErrMsg('Diagram code is not complete or invalid.')
        return
      }

      const cacheKey = `${props.PrimitiveCode}-${look}-${currentTheme}`
      if (diagramCache.has(cacheKey)) {
        setErrMsg('')
        setSvgString(diagramCache.get(cacheKey) || null)
        setIsLoading(false)
        return
      }

      if (configureMermaid(props.PrimitiveCode))
        renderFlowchart(props.PrimitiveCode)
    }, 300) // 300ms debounce

    return () => {
      if (renderTimeoutRef.current)
        clearTimeout(renderTimeoutRef.current)
    }
  }, [props.PrimitiveCode, renderFlowchart, isInitialized, cacheKey, isCodeComplete])
  }, [props.PrimitiveCode, look, currentTheme, isInitialized, configureMermaid, renderFlowchart])

  // Cleanup on unmount
  useEffect(() => {
@@ -471,14 +441,22 @@ const Flowchart = React.forwardRef((props: {
        containerRef.current.innerHTML = ''
      if (renderTimeoutRef.current)
        clearTimeout(renderTimeoutRef.current)
      if (codeCompletionCheckRef.current)
        clearTimeout(codeCompletionCheckRef.current)
    }
  }, [])

  const handlePreviewClick = async () => {
    if (svgString) {
      const base64 = await svgToBase64(svgString)
      setImagePreviewUrl(base64)
    }
  }

  const toggleTheme = () => {
    setCurrentTheme(prevTheme => prevTheme === 'light' ? Theme.dark : Theme.light)
    const newTheme = currentTheme === 'light' ? 'dark' : 'light'
    // Ensure a full, clean re-render cycle, consistent with global theme change.
    diagramCache.clear()
    setSvgString(null)
    setCurrentTheme(newTheme)
  }

  // Style classes for theme-dependent elements
@@ -527,14 +505,26 @@ const Flowchart = React.forwardRef((props: {
        <div
          key='classic'
          className={getLookButtonClass('classic')}
          onClick={() => setLook('classic')}
          onClick={() => {
            if (look !== 'classic') {
              diagramCache.clear()
              setSvgString(null)
              setLook('classic')
            }
          }}
        >
          <div className="msh-segmented-item-label">{t('app.mermaid.classic')}</div>
        </div>
        <div
          key='handDrawn'
          className={getLookButtonClass('handDrawn')}
          onClick={() => setLook('handDrawn')}
          onClick={() => {
            if (look !== 'handDrawn') {
              diagramCache.clear()
              setSvgString(null)
              setLook('handDrawn')
            }
          }}
        >
          <div className="msh-segmented-item-label">{t('app.mermaid.handDrawn')}</div>
        </div>
@@ -544,7 +534,7 @@ const Flowchart = React.forwardRef((props: {

      <div ref={containerRef} style={{ position: 'absolute', visibility: 'hidden', height: 0, overflow: 'hidden' }} />

      {isLoading && !svgCode && (
      {isLoading && !svgString && (
        <div className='px-[26px] py-4'>
          <LoadingAnim type='text'/>
          {!isCodeComplete && (
@@ -555,8 +545,8 @@ const Flowchart = React.forwardRef((props: {
        </div>
      )}

      {svgCode && (
        <div className={themeClasses.mermaidDiv} style={{ objectFit: 'cover' }} onClick={() => setImagePreviewUrl(svgCode)}>
      {svgString && (
        <div className={themeClasses.mermaidDiv} style={{ objectFit: 'cover' }} onClick={handlePreviewClick}>
          <div className="absolute bottom-2 left-2 z-[100]">
            <button
              onClick={(e) => {
@@ -571,11 +561,9 @@ const Flowchart = React.forwardRef((props: {
            </button>
          </div>

          <img
            src={svgCode}
            alt="mermaid_chart"
          <div
            style={{ maxWidth: '100%' }}
            onError={() => { setErrMsg('Chart rendering failed, please refresh and retry') }}
            dangerouslySetInnerHTML={{ __html: svgString }}
          />
        </div>
      )}
@@ -3,52 +3,31 @@ export function cleanUpSvgCode(svgCode: string): string {
}

/**
 * Preprocesses mermaid code to fix common syntax issues
 * Prepares mermaid code for rendering by sanitizing common syntax issues.
 * @param {string} mermaidCode - The mermaid code to prepare
 * @param {'classic' | 'handDrawn'} style - The rendering style
 * @returns {string} - The prepared mermaid code
 */
export function preprocessMermaidCode(code: string): string {
  if (!code || typeof code !== 'string')
export const prepareMermaidCode = (mermaidCode: string, style: 'classic' | 'handDrawn'): string => {
  if (!mermaidCode || typeof mermaidCode !== 'string')
    return ''

  // First check if this is a gantt chart
  if (code.trim().startsWith('gantt')) {
    // For gantt charts, we need to ensure each task is on its own line
    // Split the code into lines and process each line separately
    const lines = code.split('\n').map(line => line.trim())
    return lines.join('\n')
  }
  let code = mermaidCode.trim()

  return code
    // Replace English colons with Chinese colons in section nodes to avoid parsing issues
    .replace(/section\s+([^:]+):/g, (match, sectionName) => `section ${sectionName}:`)
    // Fix common syntax issues
    .replace(/fifopacket/g, 'rect')
    // Ensure graph has direction
    .replace(/^graph\s+((?:TB|BT|RL|LR)*)/, (match, direction) => {
      return direction ? match : 'graph TD'
    })
    // Clean up empty lines and extra spaces
    .trim()
}
  // Security: Sanitize against javascript: protocol in click events (XSS vector)
  code = code.replace(/(\bclick\s+\w+\s+")javascript:[^"]*(")/g, '$1#$2')

/**
 * Prepares mermaid code based on selected style
 */
export function prepareMermaidCode(code: string, style: 'classic' | 'handDrawn'): string {
  let finalCode = preprocessMermaidCode(code)
  // Convenience: Basic BR replacement. This is a common and safe operation.
  code = code.replace(/<br\s*\/?>/g, '\n')

  // Special handling for gantt charts and mindmaps
  if (finalCode.trim().startsWith('gantt') || finalCode.trim().startsWith('mindmap')) {
    // For gantt charts and mindmaps, preserve the structure exactly as is
    return finalCode
  }
  let finalCode = code

  // Hand-drawn style requires some specific clean-up.
  if (style === 'handDrawn') {
    finalCode = finalCode
      // Remove style definitions that interfere with hand-drawn style
      .replace(/style\s+[^\n]+/g, '')
      .replace(/linkStyle\s+[^\n]+/g, '')
      .replace(/^flowchart/, 'graph')
      // Remove any styles that might interfere with hand-drawn style
      .replace(/class="[^"]*"/g, '')
      .replace(/fill="[^"]*"/g, '')
      .replace(/stroke="[^"]*"/g, '')
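The `javascript:` sanitization the new `prepareMermaidCode` adds is easy to verify in isolation. The helper below wraps the exact regex from the patch; the function name `sanitizeClicks` is mine, for illustration only.

```typescript
// Sketch of the XSS guard from the patch: mermaid "click NODE \"url\"" directives
// with a javascript: scheme get their payload replaced by a harmless "#",
// while ordinary http(s) links pass through unchanged.
const sanitizeClicks = (code: string): string =>
  code.replace(/(\bclick\s+\w+\s+")javascript:[^"]*(")/g, '$1#$2')
```

This matters because the component renders with `securityLevel: 'loose'`, so click handlers in model-generated diagram code would otherwise be honored verbatim.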
@@ -82,7 +61,6 @@ export function svgToBase64(svgGraph: string): Promise<string> {
    })
  }
  catch (error) {
    console.error('Error converting SVG to base64:', error)
    return Promise.resolve('')
  }
}
@@ -115,13 +93,11 @@ export function processSvgForTheme(
  }
  else {
    let i = 0
    themes.dark.nodeColors.forEach(() => {
      const regex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
      processedSvg = processedSvg.replace(regex, (match: string) => {
        const colorIndex = i % themes.dark.nodeColors.length
        i++
        return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.dark.nodeColors[colorIndex].bg}"`)
      })
    const nodeColorRegex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
    processedSvg = processedSvg.replace(nodeColorRegex, (match: string) => {
      const colorIndex = i % themes.dark.nodeColors.length
      i++
      return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.dark.nodeColors[colorIndex].bg}"`)
    })

    processedSvg = processedSvg
@@ -139,14 +115,12 @@ export function processSvgForTheme(
      .replace(/stroke-width="1"/g, 'stroke-width="1.5"')
  }
  else {
    themes.light.nodeColors.forEach(() => {
      const regex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
      let i = 0
      processedSvg = processedSvg.replace(regex, (match: string) => {
        const colorIndex = i % themes.light.nodeColors.length
        i++
        return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.light.nodeColors[colorIndex].bg}"`)
      })
    let i = 0
    const nodeColorRegex = /fill="#[a-fA-F0-9]{6}"[^>]*class="node-[^"]*"/g
    processedSvg = processedSvg.replace(nodeColorRegex, (match: string) => {
      const colorIndex = i % themes.light.nodeColors.length
      i++
      return match.replace(/fill="#[a-fA-F0-9]{6}"/, `fill="${themes.light.nodeColors[colorIndex].bg}"`)
    })

    processedSvg = processedSvg
@@ -187,24 +161,10 @@ export function isMermaidCodeComplete(code: string): boolean {
    // Check for basic syntax structure
    const hasValidStart = /^(graph|flowchart|sequenceDiagram|classDiagram|classDef|class|stateDiagram|gantt|pie|er|journey|requirementDiagram|mindmap)/.test(trimmedCode)

    // Check for balanced brackets and parentheses
    const isBalanced = (() => {
      const stack = []
      const pairs = { '{': '}', '[': ']', '(': ')' }

      for (const char of trimmedCode) {
        if (char in pairs) {
          stack.push(char)
        }
        else if (Object.values(pairs).includes(char)) {
          const last = stack.pop()
          if (pairs[last as keyof typeof pairs] !== char)
            return false
        }
      }

      return stack.length === 0
    })()
    // The balanced bracket check was too strict and produced false negatives for valid
    // mermaid syntax like the asymmetric shape `A>B]`. Relying on Mermaid's own
    // parser is more robust.
    const isBalanced = true

    // Check for common syntax errors
    const hasNoSyntaxErrors = !trimmedCode.includes('undefined')
@@ -215,7 +175,7 @@ export function isMermaidCodeComplete(code: string): boolean {
    return hasValidStart && isBalanced && hasNoSyntaxErrors
  }
  catch (error) {
    console.debug('Mermaid code validation error:', error)
    console.error('Mermaid code validation error:', error)
    return false
  }
}
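The false negative that motivated dropping the bracket check is easy to reproduce. Below is the removed stack-based balancer, reconstructed from the old side of the hunk (the name `naiveBalanced` is mine), together with the mermaid input that trips it: the asymmetric node shape `A>B]` closes with `]` but never opens with `[`.

```typescript
// The bracket balancer removed by this patch, as a standalone function.
// It pushes openers and pops on closers; a closer with no matching opener
// (or a leftover opener) makes it report "unbalanced".
function naiveBalanced(code: string): boolean {
  const pairs: Record<string, string> = { '{': '}', '[': ']', '(': ')' }
  const stack: string[] = []
  for (const ch of code) {
    if (ch in pairs) {
      stack.push(ch)
    }
    else if (Object.values(pairs).includes(ch)) {
      const last = stack.pop()
      // Mermaid's asymmetric shape "A>B]" reaches this branch with an
      // empty stack, so valid syntax is rejected.
      if (last === undefined || pairs[last] !== ch)
        return false
    }
  }
  return stack.length === 0
}
```

Since any genuinely malformed code still fails inside `mermaid.render`, delegating bracket validation to Mermaid's own parser removes the false negatives without losing error reporting.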
@@ -162,7 +162,9 @@ const StepTwo = ({

  const isInCreatePage = !datasetId || (datasetId && !currentDataset?.data_source_type)
  const dataSourceType = isInCreatePage ? inCreatePageDataSourceType : currentDataset?.data_source_type
  const [segmentationType, setSegmentationType] = useState<ProcessMode>(ProcessMode.general)
  const [segmentationType, setSegmentationType] = useState<ProcessMode>(
    currentDataset?.doc_form === ChunkingMode.parentChild ? ProcessMode.parentChild : ProcessMode.general,
  )
  const [segmentIdentifier, doSetSegmentIdentifier] = useState(DEFAULT_SEGMENT_IDENTIFIER)
  const setSegmentIdentifier = useCallback((value: string, canEmpty?: boolean) => {
    doSetSegmentIdentifier(value ? escape(value) : (canEmpty ? '' : DEFAULT_SEGMENT_IDENTIFIER))
@@ -208,7 +210,14 @@ const StepTwo = ({
    }
    if (value === ChunkingMode.parentChild && indexType === IndexingType.ECONOMICAL)
      setIndexType(IndexingType.QUALIFIED)

    setDocForm(value)

    if (value === ChunkingMode.parentChild)
      setSegmentationType(ProcessMode.parentChild)
    else
      setSegmentationType(ProcessMode.general)

    // eslint-disable-next-line ts/no-use-before-define
    currentEstimateMutation.reset()
  }
@@ -504,6 +513,20 @@ const StepTwo = ({
      setOverlap(overlap!)
      setRules(rules.pre_processing_rules)
      setDefaultConfig(rules)

      if (documentDetail.dataset_process_rule.mode === 'hierarchical') {
        setParentChildConfig({
          chunkForContext: rules.parent_mode || 'paragraph',
          parent: {
            delimiter: escape(rules.segmentation.separator),
            maxLength: rules.segmentation.max_tokens,
          },
          child: {
            delimiter: escape(rules.subchunk_segmentation.separator),
            maxLength: rules.subchunk_segmentation.max_tokens,
          },
        })
      }
    }
  }

@@ -966,8 +989,8 @@ const StepTwo = ({
        <div className='system-md-semibold mb-0.5 text-text-secondary'>{t('datasetSettings.form.retrievalSetting.title')}</div>
        <div className='body-xs-regular text-text-tertiary'>
          <a target='_blank' rel='noopener noreferrer'
            href={docLink('/guides/knowledge-base/create-knowledge-and-upload-documents')}
            className='text-text-accent'>{t('datasetSettings.form.retrievalSetting.learnMore')}</a>
            href={docLink('/guides/knowledge-base/create-knowledge-and-upload-documents')}
            className='text-text-accent'>{t('datasetSettings.form.retrievalSetting.learnMore')}</a>
          {t('datasetSettings.form.retrievalSetting.longDescription')}
        </div>
      </div>
@@ -1131,7 +1154,7 @@ const StepTwo = ({
        const indexForLabel = index + 1
        return (
          <PreviewSlice
            key={child}
            key={`C-${indexForLabel}-${child}`}
            label={`C-${indexForLabel}`}
            text={child}
            tooltip={`Child-chunk-${indexForLabel} · ${child.length} Characters`}
@@ -1124,6 +1124,63 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi

<hr className='ml-0 mr-0' />

<Heading
url='/datasets/{dataset_id}/documents/status/{action}'
method='PATCH'
title='Update Document Status'
name='#batch_document_status'
/>
<Row>
<Col>
### Path
<Properties>
<Property name='dataset_id' type='string' key='dataset_id'>
Knowledge ID
</Property>
<Property name='action' type='string' key='action'>
- `enable` - Enable document
- `disable` - Disable document
- `archive` - Archive document
- `un_archive` - Unarchive document
</Property>
</Properties>

### Request Body
<Properties>
<Property name='document_ids' type='array[string]' key='document_ids'>
List of document IDs
</Property>
</Properties>
</Col>
<Col sticky>
<CodeGroup
title="Request"
tag="PATCH"
label="/datasets/{dataset_id}/documents/status/{action}"
targetCode={`curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{\n "document_ids": ["doc-id-1", "doc-id-2"]\n}'`}
>
```bash {{ title: 'cURL' }}
curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \
--header 'Authorization: Bearer {api_key}' \
--header 'Content-Type: application/json' \
--data-raw '{
"document_ids": ["doc-id-1", "doc-id-2"]
}'
```
</CodeGroup>

<CodeGroup title="Response">
```json {{ title: 'Response' }}
{
"result": "success"
}
```
</CodeGroup>
</Col>
</Row>
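The batch status endpoint added above mirrors the cURL sample; a minimal Python sketch (stdlib only) that builds the same PATCH request. The base URL and `{api_key}` placeholder are assumptions carried over from the sample, not fixed values.

```python
import json
import urllib.request

API_BASE = "https://api.dify.ai/v1"  # assumed base URL; replace with your deployment's

def build_status_update(dataset_id: str, action: str, document_ids: list) -> urllib.request.Request:
    """Build the PATCH request for /datasets/{dataset_id}/documents/status/{action}."""
    if action not in ("enable", "disable", "archive", "un_archive"):
        raise ValueError(f"unsupported action: {action}")
    url = f"{API_BASE}/datasets/{dataset_id}/documents/status/{action}"
    body = json.dumps({"document_ids": document_ids}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="PATCH",
        headers={
            "Authorization": "Bearer {api_key}",  # placeholder key, as in the cURL sample
            "Content-Type": "application/json",
        },
    )

req = build_status_update("my-dataset", "disable", ["doc-id-1", "doc-id-2"])
```

Sending it is then a single `urllib.request.urlopen(req)` call; the sketch stops before the network so the construction can be checked in isolation.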

<hr className='ml-0 mr-0' />

<Heading
url='/datasets/{dataset_id}/documents/{document_id}/segments'
method='POST'

@@ -881,6 +881,63 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi

<hr className='ml-0 mr-0' />

<Heading
url='/datasets/{dataset_id}/documents/status/{action}'
method='PATCH'
title='ドキュメントステータスの更新'
name='#batch_document_status'
/>
<Row>
<Col>
### パス
<Properties>
<Property name='dataset_id' type='string' key='dataset_id'>
ナレッジ ID
</Property>
<Property name='action' type='string' key='action'>
- `enable` - ドキュメントを有効化
- `disable` - ドキュメントを無効化
- `archive` - ドキュメントをアーカイブ
- `un_archive` - ドキュメントのアーカイブを解除
</Property>
</Properties>

### リクエストボディ
<Properties>
<Property name='document_ids' type='array[string]' key='document_ids'>
ドキュメントIDのリスト
</Property>
</Properties>
</Col>
<Col sticky>
<CodeGroup
title="リクエスト"
tag="PATCH"
label="/datasets/{dataset_id}/documents/status/{action}"
targetCode={`curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{\n "document_ids": ["doc-id-1", "doc-id-2"]\n}'`}
>
```bash {{ title: 'cURL' }}
curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \
--header 'Authorization: Bearer {api_key}' \
--header 'Content-Type: application/json' \
--data-raw '{
"document_ids": ["doc-id-1", "doc-id-2"]
}'
```
</CodeGroup>

<CodeGroup title="レスポンス">
```json {{ title: 'Response' }}
{
"result": "success"
}
```
</CodeGroup>
</Col>
</Row>

<hr className='ml-0 mr-0' />

<Heading
url='/datasets/{dataset_id}/documents/{document_id}/segments'
method='POST'

@@ -2413,3 +2470,4 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi
</tbody>
</table>
+<div className="pb-4" />
@@ -1131,6 +1131,63 @@ import { Row, Col, Properties, Property, Heading, SubProperty, PropertyInstructi

<hr className='ml-0 mr-0' />

<Heading
url='/datasets/{dataset_id}/documents/status/{action}'
method='PATCH'
title='更新文档状态'
name='#batch_document_status'
/>
<Row>
<Col>
### Path
<Properties>
<Property name='dataset_id' type='string' key='dataset_id'>
知识库 ID
</Property>
<Property name='action' type='string' key='action'>
- `enable` - 启用文档
- `disable` - 禁用文档
- `archive` - 归档文档
- `un_archive` - 取消归档文档
</Property>
</Properties>

### Request Body
<Properties>
<Property name='document_ids' type='array[string]' key='document_ids'>
文档ID列表
</Property>
</Properties>
</Col>
<Col sticky>
<CodeGroup
title="Request"
tag="PATCH"
label="/datasets/{dataset_id}/documents/status/{action}"
targetCode={`curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \\\n--header 'Authorization: Bearer {api_key}' \\\n--header 'Content-Type: application/json' \\\n--data-raw '{\n "document_ids": ["doc-id-1", "doc-id-2"]\n}'`}
>
```bash {{ title: 'cURL' }}
curl --location --request PATCH '${props.apiBaseUrl}/datasets/{dataset_id}/documents/status/{action}' \
--header 'Authorization: Bearer {api_key}' \
--header 'Content-Type: application/json' \
--data-raw '{
"document_ids": ["doc-id-1", "doc-id-2"]
}'
```
</CodeGroup>

<CodeGroup title="Response">
```json {{ title: 'Response' }}
{
"result": "success"
}
```
</CodeGroup>
</Col>
</Row>

<hr className='ml-0 mr-0' />

<Heading
url='/datasets/{dataset_id}/documents/{document_id}/segments'
method='POST'
@@ -152,7 +152,6 @@ Chat applications support session persistence, allowing previous chat history to
- `data` (object) detail
  - `id` (string) Unique ID of workflow execution
  - `workflow_id` (string) ID of related workflow
-  - `sequence_number` (int) Self-increasing serial number, self-increasing in the App, starting from 1
  - `created_at` (timestamp) Creation timestamp, e.g., 1705395332
- `event: node_started` node execution started
  - `task_id` (string) Task ID, used for request tracking and the below Stop Generate API

@@ -287,7 +286,7 @@ Chat applications support session persistence, allowing previous chat history to
### Streaming Mode
<CodeGroup title="Response">
```streaming {{ title: 'Response' }}
-data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "sequence_number": 1, "created_at": 1679586595}}
+data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "created_at": 1679586595}}
data: {"event": "node_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "created_at": 1679586595}}
data: {"event": "node_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "execution_metadata": {"total_tokens": 63127864, "total_price": 2.378, "currency": "USD"}, "created_at": 1679586595}}
data: {"event": "workflow_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "total_tokens": 63127864, "total_steps": "1", "created_at": 1679586595, "finished_at": 1679976595}}

@@ -152,7 +152,6 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
- `data` (object) 詳細
  - `id` (string) ワークフロー実行の一意ID
  - `workflow_id` (string) 関連ワークフローのID
-  - `sequence_number` (int) 自己増加シリアル番号、アプリ内で自己増加し、1から始まります
  - `created_at` (timestamp) 作成タイムスタンプ、例:1705395332
- `event: node_started` ノード実行が開始
  - `task_id` (string) タスクID、リクエスト追跡と以下のStop Generate APIに使用

@@ -287,7 +286,7 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
### ストリーミングモード
<CodeGroup title="応答">
```streaming {{ title: '応答' }}
-data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "sequence_number": 1, "created_at": 1679586595}}
+data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "created_at": 1679586595}}
data: {"event": "node_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "created_at": 1679586595}}
data: {"event": "node_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "execution_metadata": {"total_tokens": 63127864, "total_price": 2.378, "currency": "USD"}, "created_at": 1679586595}}
data: {"event": "workflow_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "total_tokens": 63127864, "total_steps": "1", "created_at": 1679586595, "finished_at": 1679976595}}

@@ -153,7 +153,6 @@ import { Row, Col, Properties, Property, Heading, SubProperty } from '../md.tsx'
- `data` (object) 详细内容
  - `id` (string) workflow 执行 ID
  - `workflow_id` (string) 关联 Workflow ID
-  - `sequence_number` (int) 自增序号,App 内自增,从 1 开始
  - `created_at` (timestamp) 开始时间
- `event: node_started` node 开始执行
  - `task_id` (string) 任务 ID,用于请求跟踪和下方的停止响应接口

@@ -297,7 +296,7 @@ import { Row, Col, Properties, Property, Heading, SubProperty } from '../md.tsx'
### 流式模式
<CodeGroup title="Response">
```streaming {{ title: 'Response' }}
-data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "sequence_number": 1, "created_at": 1679586595}}
+data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "created_at": 1679586595}}
data: {"event": "node_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "created_at": 1679586595}}
data: {"event": "node_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "execution_metadata": {"total_tokens": 63127864, "total_price": 2.378, "currency": "USD"}, "created_at": 1679586595}}
data: {"event": "workflow_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "total_tokens": 63127864, "total_steps": "1", "created_at": 1679586595, "finished_at": 1679976595}}

@@ -82,9 +82,9 @@ Chat applications support session persistence, allowing previous chat history to

### ChatCompletionResponse
Returns the complete App result, `Content-Type` is `application/json`.
-- `event` (string) 事件类型,固定为 `message`
-- `task_id` (string) 任务 ID,用于请求跟踪和下方的停止响应接口
-- `id` (string) 唯一ID
+- `event` (string) Event type, always `message` in blocking mode.
+- `task_id` (string) Task ID, used for request tracking and the below Stop Generate API
+- `id` (string) Unique ID, same as `message_id`
- `message_id` (string) Unique message ID
- `conversation_id` (string) Conversation ID
- `mode` (string) App mode, fixed as `chat`

@@ -103,7 +103,6 @@ Workflow applications offers non-session support and is ideal for translation, a
- `data` (object) detail
  - `id` (string) Unique ID of workflow execution
  - `workflow_id` (string) ID of related workflow
-  - `sequence_number` (int) Self-increasing serial number, self-increasing in the App, starting from 1
  - `created_at` (timestamp) Creation timestamp, e.g., 1705395332
- `event: node_started` node execution started
  - `task_id` (string) Task ID, used for request tracking and the below Stop Generate API

@@ -241,7 +240,7 @@ Workflow applications offers non-session support and is ideal for translation, a
### Streaming Mode
<CodeGroup title="Response">
```streaming {{ title: 'Response' }}
-data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "sequence_number": 1, "created_at": 1679586595}}
+data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "created_at": 1679586595}}
data: {"event": "node_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "created_at": 1679586595}}
data: {"event": "node_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "execution_metadata": {"total_tokens": 63127864, "total_price": 2.378, "currency": "USD"}, "created_at": 1679586595}}
data: {"event": "workflow_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "total_tokens": 63127864, "total_steps": "1", "created_at": 1679586595, "finished_at": 1679976595}}

@@ -533,6 +532,12 @@ Workflow applications offers non-session support and is ideal for translation, a
<Property name='limit' type='int' key='limit'>
How many chat history messages to return in one request, default is 20.
</Property>
+<Property name='created_by_end_user_session_id' type='str' key='created_by_end_user_session_id'>
+Created by which endUser, for example, `abc-123`.
+</Property>
+<Property name='created_by_account' type='str' key='created_by_account'>
+Created by which email account, for example, lizb@test.com.
+</Property>
</Properties>
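The log filters added above are plain query-string arguments; a short sketch of assembling them with the standard library (the helper name and the defaulting behaviour are illustrative, not part of the API):

```python
from urllib.parse import urlencode

def build_log_query(limit=20, created_by_end_user_session_id=None, created_by_account=None):
    """Assemble the query string for the new log filters; unset filters are omitted."""
    params = {"limit": limit}
    if created_by_end_user_session_id is not None:
        params["created_by_end_user_session_id"] = created_by_end_user_session_id
    if created_by_account is not None:
        params["created_by_account"] = created_by_account
    return urlencode(params)

qs = build_log_query(limit=10, created_by_end_user_session_id="abc-123")
```

`urlencode` also percent-escapes values such as an email address, so `created_by_account` can be passed verbatim.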

### Response

@@ -104,7 +104,6 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
- `data` (object) 詳細
  - `id` (string) ワークフロー実行の一意の ID
  - `workflow_id` (string) 関連するワークフローの ID
-  - `sequence_number` (int) 自己増加シリアル番号、アプリ内で自己増加し、1 から始まります
  - `created_at` (timestamp) 作成タイムスタンプ、例:1705395332
- `event: node_started` ノード実行開始
  - `task_id` (string) タスク ID、リクエスト追跡と以下の Stop Generate API に使用

@@ -242,7 +241,7 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
### ストリーミングモード
<CodeGroup title="応答">
```streaming {{ title: '応答' }}
-data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "sequence_number": 1, "created_at": 1679586595}}
+data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "created_at": 1679586595}}
data: {"event": "node_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "created_at": 1679586595}}
data: {"event": "node_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "execution_metadata": {"total_tokens": 63127864, "total_price": 2.378, "currency": "USD"}, "created_at": 1679586595}}
data: {"event": "workflow_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "total_tokens": 63127864, "total_steps": "1", "created_at": 1679586595, "finished_at": 1679976595}}

@@ -534,6 +533,12 @@ import { Row, Col, Properties, Property, Heading, SubProperty, Paragraph } from
<Property name='limit' type='int' key='limit'>
1回のリクエストで返すチャット履歴メッセージの数、デフォルトは20。
</Property>
+<Property name='created_by_end_user_session_id' type='str' key='created_by_end_user_session_id'>
+どのendUserによって作成されたか、例えば、`abc-123`。
+</Property>
+<Property name='created_by_account' type='str' key='created_by_account'>
+どのメールアカウントによって作成されたか、例えば、lizb@test.com。
+</Property>
</Properties>

### 応答

@@ -98,7 +98,6 @@ Workflow 应用无会话支持,适合用于翻译/文章写作/总结 AI 等
- `data` (object) 详细内容
  - `id` (string) workflow 执行 ID
  - `workflow_id` (string) 关联 Workflow ID
-  - `sequence_number` (int) 自增序号,App 内自增,从 1 开始
  - `created_at` (timestamp) 开始时间
- `event: node_started` node 开始执行
  - `task_id` (string) 任务 ID,用于请求跟踪和下方的停止响应接口

@@ -232,7 +231,7 @@ Workflow 应用无会话支持,适合用于翻译/文章写作/总结 AI 等
### Streaming Mode
<CodeGroup title="Response">
```streaming {{ title: 'Response' }}
-data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "sequence_number": 1, "created_at": 1679586595}}
+data: {"event": "workflow_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "created_at": 1679586595}}
data: {"event": "node_started", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "created_at": 1679586595}}
data: {"event": "node_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "node_id": "dfjasklfjdslag", "node_type": "start", "title": "Start", "index": 0, "predecessor_node_id": "fdljewklfklgejlglsd", "inputs": {}, "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "execution_metadata": {"total_tokens": 63127864, "total_price": 2.378, "currency": "USD"}, "created_at": 1679586595}}
data: {"event": "workflow_finished", "task_id": "5ad4cb98-f0c7-4085-b384-88c403be6290", "workflow_run_id": "5ad498-f0c7-4085-b384-88cbe6290", "data": {"id": "5ad498-f0c7-4085-b384-88cbe6290", "workflow_id": "dfjasklfjdslag", "outputs": {}, "status": "succeeded", "elapsed_time": 0.324, "total_tokens": 63127864, "total_steps": "1", "created_at": 1679586595, "finished_at": 1679976595}}

@@ -522,6 +521,12 @@ Workflow 应用无会话支持,适合用于翻译/文章写作/总结 AI 等
<Property name='limit' type='int' key='limit'>
每页条数, 默认20.
</Property>
+<Property name='created_by_end_user_session_id' type='str' key='created_by_end_user_session_id'>
+由哪个endUser创建,例如,`abc-123`.
+</Property>
+<Property name='created_by_account' type='str' key='created_by_account'>
+由哪个邮箱账户创建,例如,lizb@test.com.
+</Property>
</Properties>

### Response
@@ -12,6 +12,7 @@ import {
PortalToFollowElemTrigger,
} from '@/app/components/base/portal-to-follow-elem'
import cn from '@/utils/classnames'
+import { useGlobalPublicStore } from '@/context/global-public-context'

type Props = {
source: PluginSource

@@ -40,6 +41,8 @@ const OperationDropdown: FC<Props> = ({
setOpen(!openRef.current)
}, [setOpen])

+const { enable_marketplace } = useGlobalPublicStore(s => s.systemFeatures)
+
return (
<PortalToFollowElem
open={open}

@@ -77,13 +80,13 @@ const OperationDropdown: FC<Props> = ({
className='system-md-regular cursor-pointer rounded-lg px-3 py-1.5 text-text-secondary hover:bg-state-base-hover'
>{t('plugin.detailPanel.operation.checkUpdate')}</div>
)}
-{(source === PluginSource.marketplace || source === PluginSource.github) && (
+{(source === PluginSource.marketplace || source === PluginSource.github) && enable_marketplace && (
<a href={detailUrl} target='_blank' className='system-md-regular flex cursor-pointer items-center rounded-lg px-3 py-1.5 text-text-secondary hover:bg-state-base-hover'>
<span className='grow'>{t('plugin.detailPanel.operation.viewDetail')}</span>
<RiArrowRightUpLine className='h-3.5 w-3.5 shrink-0 text-text-tertiary' />
</a>
)}
-{(source === PluginSource.marketplace || source === PluginSource.github) && (
+{(source === PluginSource.marketplace || source === PluginSource.github) && enable_marketplace && (
<div className='my-1 h-px bg-divider-subtle'></div>
)}
<div
@@ -29,6 +29,7 @@ import { useAppContext } from '@/context/app-context'
import { gte } from 'semver'
import Tooltip from '@/app/components/base/tooltip'
import { getMarketplaceUrl } from '@/utils/var'
+import { useGlobalPublicStore } from '@/context/global-public-context'

type Props = {
className?: string

@@ -75,6 +76,7 @@ const PluginItem: FC<Props> = ({
const getValueFromI18nObject = useRenderI18nObject()
const title = getValueFromI18nObject(label)
const descriptionText = getValueFromI18nObject(description)
+const { enable_marketplace } = useGlobalPublicStore(s => s.systemFeatures)

return (
<div

@@ -165,7 +167,7 @@ const PluginItem: FC<Props> = ({
</a>
</>
}
-{source === PluginSource.marketplace
+{source === PluginSource.marketplace && enable_marketplace
&& <>
<a href={getMarketplaceUrl(`/plugins/${author}/${name}`, { theme })} target='_blank' className='flex items-center gap-0.5'>
<div className='system-2xs-medium-uppercase text-text-tertiary'>{t('plugin.from')} <span className='text-text-secondary'>marketplace</span></div>
@@ -15,6 +15,7 @@ import {
useWorkflowRun,
useWorkflowStartRun,
} from '../hooks'
+import { useWorkflowStore } from '@/app/components/workflow/store'

type WorkflowMainProps = Pick<WorkflowProps, 'nodes' | 'edges' | 'viewport'>
const WorkflowMain = ({

@@ -23,14 +24,28 @@ const WorkflowMain = ({
viewport,
}: WorkflowMainProps) => {
const featuresStore = useFeaturesStore()
+const workflowStore = useWorkflowStore()

const handleWorkflowDataUpdate = useCallback((payload: any) => {
-if (payload.features && featuresStore) {
+const {
+features,
+conversation_variables,
+environment_variables,
+} = payload
+if (features && featuresStore) {
const { setFeatures } = featuresStore.getState()

-setFeatures(payload.features)
+setFeatures(features)
}
-}, [featuresStore])
+if (conversation_variables) {
+const { setConversationVariables } = workflowStore.getState()
+setConversationVariables(conversation_variables)
+}
+if (environment_variables) {
+const { setEnvironmentVariables } = workflowStore.getState()
+setEnvironmentVariables(environment_variables)
+}
+}, [featuresStore, workflowStore])

const {
doSyncWorkflowDraft,
@@ -2,6 +2,7 @@ import { memo } from 'react'
import { useTranslation } from 'react-i18next'
import { useIsChatMode } from '../hooks'
import { useStore } from '../store'
+import { formatWorkflowRunIdentifier } from '../utils'
import { ClockPlay } from '@/app/components/base/icons/src/vender/line/time'

const RunningTitle = () => {

@@ -12,7 +13,7 @@ const RunningTitle = () => {
return (
<div className='flex h-[18px] items-center text-xs text-gray-500'>
<ClockPlay className='mr-1 h-3 w-3 text-gray-500' />
-<span>{isChatMode ? `Test Chat#${historyWorkflowData?.sequence_number}` : `Test Run#${historyWorkflowData?.sequence_number}`}</span>
+<span>{isChatMode ? `Test Chat${formatWorkflowRunIdentifier(historyWorkflowData?.finished_at)}` : `Test Run${formatWorkflowRunIdentifier(historyWorkflowData?.finished_at)}`}</span>
<span className='mx-1'>·</span>
<span className='ml-1 flex h-[18px] items-center rounded-[5px] border border-indigo-300 bg-white/[0.48] px-1 text-[10px] font-semibold uppercase text-indigo-600'>
{t('workflow.common.viewOnly')}
@@ -19,6 +19,7 @@ import {
useWorkflowRun,
} from '../hooks'
import { ControlMode, WorkflowRunningStatus } from '../types'
+import { formatWorkflowRunIdentifier } from '../utils'
import cn from '@/utils/classnames'
import {
PortalToFollowElem,

@@ -196,7 +197,7 @@ const ViewHistory = ({
item.id === historyWorkflowData?.id && 'text-text-accent',
)}
>
-{`Test ${isChatMode ? 'Chat' : 'Run'} #${item.sequence_number}`}
+{`Test ${isChatMode ? 'Chat' : 'Run'}${formatWorkflowRunIdentifier(item.finished_at)}`}
</div>
<div className='flex items-center text-xs leading-[18px] text-text-tertiary'>
{item.created_by_account?.name} · {formatTimeFromNow((item.finished_at || item.created_at) * 1000)}
@ -11,7 +11,6 @@ import { ModelModeType } from '@/types/app'
|
|||
import { Theme } from '@/types/app'
|
||||
import { SchemaGeneratorDark, SchemaGeneratorLight } from './assets'
|
||||
import cn from '@/utils/classnames'
|
||||
import type { ModelInfo } from './prompt-editor'
|
||||
import PromptEditor from './prompt-editor'
|
||||
import GeneratedResult from './generated-result'
|
||||
import { useGenerateStructuredOutputRules } from '@/service/use-common'
|
||||
|
|
@ -19,7 +18,6 @@ import Toast from '@/app/components/base/toast'
|
|||
import { type FormValue, ModelTypeEnum } from '@/app/components/header/account-setting/model-provider-page/declarations'
|
||||
import { useModelListAndDefaultModelAndCurrentProviderAndModel } from '@/app/components/header/account-setting/model-provider-page/hooks'
|
||||
import { useVisualEditorStore } from '../visual-editor/store'
|
||||
import { useTranslation } from 'react-i18next'
|
||||
import { useMittContext } from '../visual-editor/context'
|
||||
|
||||
type JsonSchemaGeneratorProps = {
|
||||
|
|
@ -36,10 +34,12 @@ export const JsonSchemaGenerator: FC<JsonSchemaGeneratorProps> = ({
|
|||
onApply,
|
||||
crossAxisOffset,
|
||||
}) => {
|
||||
const { t } = useTranslation()
|
||||
const localModel = localStorage.getItem('auto-gen-model')
|
||||
? JSON.parse(localStorage.getItem('auto-gen-model') as string) as Model
|
||||
: null
|
||||
const [open, setOpen] = useState(false)
|
||||
const [view, setView] = useState(GeneratorView.promptEditor)
|
||||
const [model, setModel] = useState<Model>({
|
||||
const [model, setModel] = useState<Model>(localModel || {
|
||||
name: '',
|
||||
provider: '',
|
||||
mode: ModelModeType.completion,
|
||||
|
|
@ -58,11 +58,19 @@ export const JsonSchemaGenerator: FC<JsonSchemaGeneratorProps> = ({
|
|||
|
||||
useEffect(() => {
|
||||
if (defaultModel) {
|
||||
setModel(prev => ({
|
||||
...prev,
|
||||
name: defaultModel.model,
|
||||
provider: defaultModel.provider.provider,
|
||||
}))
|
||||
const localModel = localStorage.getItem('auto-gen-model')
|
||||
? JSON.parse(localStorage.getItem('auto-gen-model') || '')
|
||||
: null
|
||||
if (localModel) {
|
||||
setModel(localModel)
|
||||
}
|
||||
else {
|
||||
setModel(prev => ({
|
||||
...prev,
|
||||
name: defaultModel.model,
|
||||
provider: defaultModel.provider.provider,
|
||||
}))
|
||||
}
|
||||
}
|
||||
}, [defaultModel])
|
||||
|
||||
|
|
@@ -77,22 +85,25 @@ export const JsonSchemaGenerator: FC<JsonSchemaGeneratorProps> = ({
     setOpen(false)
   }, [])

-  const handleModelChange = useCallback((model: ModelInfo) => {
-    setModel(prev => ({
-      ...prev,
-      provider: model.provider,
-      name: model.modelId,
-      mode: model.mode as ModelModeType,
-    }))
-  }, [])
+  const handleModelChange = useCallback((newValue: { modelId: string; provider: string; mode?: string; features?: string[] }) => {
+    const newModel = {
+      ...model,
+      provider: newValue.provider,
+      name: newValue.modelId,
+      mode: newValue.mode as ModelModeType,
+    }
+    setModel(newModel)
+    localStorage.setItem('auto-gen-model', JSON.stringify(newModel))
+  }, [model, setModel])

   const handleCompletionParamsChange = useCallback((newParams: FormValue) => {
-    setModel(prev => ({
-      ...prev,
-      completion_params: newParams as CompletionParams,
-    }),
-    )
-  }, [])
+    const newModel = {
+      ...model,
+      completion_params: newParams as CompletionParams,
+    }
+    setModel(newModel)
+    localStorage.setItem('auto-gen-model', JSON.stringify(newModel))
+  }, [model, setModel])

   const { mutateAsync: generateStructuredOutputRules, isPending: isGenerating } = useGenerateStructuredOutputRules()
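The hunks above persist the user's auto-generation model choice under the `auto-gen-model` localStorage key and restore it on mount. A minimal standalone sketch of that read-with-fallback/write-on-change pattern follows; the `KVStore` interface, `StoredModel` shape, and model values are hypothetical stand-ins for illustration, not the component's real types:

```typescript
// `KVStore` is a hypothetical stand-in for window.localStorage.
interface KVStore {
  getItem(key: string): string | null
  setItem(key: string, value: string): void
}

type StoredModel = { name: string; provider: string }

const STORAGE_KEY = 'auto-gen-model'

// Restore: prefer the persisted choice, fall back to a default.
// JSON.parse is guarded so corrupted storage cannot crash the caller.
function loadModel(store: KVStore, fallback: StoredModel): StoredModel {
  const raw = store.getItem(STORAGE_KEY)
  if (!raw)
    return fallback
  try {
    return JSON.parse(raw) as StoredModel
  }
  catch {
    return fallback
  }
}

// Persist: write the full selection back on every change.
function saveModel(store: KVStore, model: StoredModel): void {
  store.setItem(STORAGE_KEY, JSON.stringify(model))
}
```

Writing the whole object on every change (rather than individual fields) is what lets the restore path hand the parsed value straight to `setModel`.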
@@ -9,7 +9,6 @@ import GetAutomaticResModal from '@/app/components/app/configuration/config/auto
 import { AppType } from '@/types/app'
 import type { AutomaticRes } from '@/service/debug'
 import type { ModelConfig } from '@/app/components/workflow/types'
-import type { Model } from '@/types/app'

 type Props = {
   className?: string
@@ -20,7 +19,6 @@ type Props = {
 const PromptGeneratorBtn: FC<Props> = ({
   className,
   onGenerated,
-  modelConfig,
 }) => {
   const [showAutomatic, { setTrue: showAutomaticTrue, setFalse: showAutomaticFalse }] = useBoolean(false)
   const handleAutomaticRes = useCallback((res: AutomaticRes) => {
@@ -40,7 +38,6 @@ const PromptGeneratorBtn: FC<Props> = ({
       isShow={showAutomatic}
       onClose={showAutomaticFalse}
       onFinished={handleAutomaticRes}
-      model={modelConfig as Model}
       isInLLMNode
     />
   )}
@@ -10,6 +10,7 @@ import {
   useWorkflowStore,
 } from '../../store'
 import { useWorkflowRun } from '../../hooks'
+import { formatWorkflowRunIdentifier } from '../../utils'
 import UserInput from './user-input'
 import Chat from '@/app/components/base/chat/chat'
 import type { ChatItem, ChatItemInTree } from '@/app/components/base/chat/types'
@@ -99,7 +100,7 @@ const ChatRecord = () => {
       {fetched && (
         <>
           <div className='flex shrink-0 items-center justify-between p-4 pb-1 text-base font-semibold text-text-primary'>
-            {`TEST CHAT#${historyWorkflowData?.sequence_number}`}
+            {`TEST CHAT${formatWorkflowRunIdentifier(historyWorkflowData?.finished_at)}`}
             <div
               className='flex h-6 w-6 cursor-pointer items-center justify-center'
               onClick={() => {
@@ -4,6 +4,7 @@ import Run from '../run'
 import { useStore } from '../store'
 import { useWorkflowUpdate } from '../hooks'
 import { useHooksStore } from '../hooks-store'
+import { formatWorkflowRunIdentifier } from '../utils'

 const Record = () => {
   const historyWorkflowData = useStore(s => s.historyWorkflowData)
@@ -22,7 +23,7 @@ const Record = () => {
   return (
     <div className='flex h-full w-[400px] flex-col rounded-l-2xl border-[0.5px] border-components-panel-border bg-components-panel-bg shadow-xl'>
       <div className='system-xl-semibold flex items-center justify-between p-4 pb-0 text-text-primary'>
-        {`Test Run#${historyWorkflowData?.sequence_number}`}
+        {`Test Run${formatWorkflowRunIdentifier(historyWorkflowData?.finished_at)}`}
       </div>
       <Run
         runDetailUrl={getWorkflowRunAndTraceUrl(historyWorkflowData?.id).runUrl}
@@ -20,6 +20,7 @@ import { useStore } from '../store'
 import {
   WorkflowRunningStatus,
 } from '../types'
+import { formatWorkflowRunIdentifier } from '../utils'
 import Toast from '../../base/toast'
 import InputsPanel from './inputs-panel'
 import cn from '@/utils/classnames'
@@ -88,7 +89,7 @@ const WorkflowPreview = () => {
         onMouseDown={startResizing}
       />
       <div className='flex items-center justify-between p-4 pb-1 text-base font-semibold text-text-primary'>
-        {`Test Run${!workflowRunningData?.result.sequence_number ? '' : `#${workflowRunningData?.result.sequence_number}`}`}
+        {`Test Run${formatWorkflowRunIdentifier(workflowRunningData?.result.finished_at)}`}
         <div className='cursor-pointer p-1' onClick={() => handleCancelDebugAndPreviewPanel()}>
           <RiCloseLine className='h-4 w-4 text-text-tertiary' />
         </div>
@@ -376,7 +376,6 @@ export type WorkflowRunningData = {
   message_id?: string
   conversation_id?: string
   result: {
-    sequence_number?: number
     workflow_id?: string
     inputs?: string
     process_data?: string
@@ -399,9 +398,9 @@ export type WorkflowRunningData = {

 export type HistoryWorkflowData = {
   id: string
-  sequence_number: number
   status: string
   conversation_id?: string
+  finished_at?: number
 }

 export enum ChangeType {
@@ -86,6 +86,8 @@ const UpdateDSLModal = ({
         graph,
         features,
         hash,
+        conversation_variables,
+        environment_variables,
       } = await fetchWorkflowDraft(`/apps/${app_id}/workflows/draft`)

       const { nodes, edges, viewport } = graph
@@ -122,6 +124,8 @@ const UpdateDSLModal = ({
           viewport,
           features: newFeatures,
           hash,
+          conversation_variables: conversation_variables || [],
+          environment_variables: environment_variables || [],
         },
       } as any)
   }, [eventEmitter])
@@ -33,3 +33,22 @@ export const isEventTargetInputArea = (target: HTMLElement) => {
   if (target.contentEditable === 'true')
     return true
 }
+
+/**
+ * Format workflow run identifier using finished_at timestamp
+ * @param finishedAt - Unix timestamp in seconds
+ * @param fallbackText - Text to show when finishedAt is not available (default: 'Running')
+ * @returns Formatted string like " (14:30:25)" or " (Running)"
+ */
+export const formatWorkflowRunIdentifier = (finishedAt?: number, fallbackText = 'Running'): string => {
+  if (!finishedAt)
+    return ` (${fallbackText})`
+
+  const date = new Date(finishedAt * 1000)
+  const timeStr = date.toLocaleTimeString([], {
+    hour: '2-digit',
+    minute: '2-digit',
+    second: '2-digit',
+  })
+  return ` (${timeStr})`
+}
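The new helper replaces the old `#${sequence_number}` suffix with the run's local finish time. A quick sketch of how its two branches behave, with the function copied from the hunk above (minus the `export`) so it runs standalone:

```typescript
// Copied from the hunk above so the sketch is self-contained.
const formatWorkflowRunIdentifier = (finishedAt?: number, fallbackText = 'Running'): string => {
  if (!finishedAt)
    return ` (${fallbackText})`

  const date = new Date(finishedAt * 1000)
  const timeStr = date.toLocaleTimeString([], {
    hour: '2-digit',
    minute: '2-digit',
    second: '2-digit',
  })
  return ` (${timeStr})`
}

// Undefined (and 0, since the check is falsy-based) falls back to the label:
console.log(`Test Run${formatWorkflowRunIdentifier(undefined)}`) // "Test Run (Running)"
// A finished run renders its wall-clock time; the exact string is locale-dependent.
console.log(`Test Run${formatWorkflowRunIdentifier(1718000000)}`)
```

Because `toLocaleTimeString` is called with an empty locale list, the rendered time follows the viewer's locale and timezone, which is why the panels above can drop the global sequence number without losing a way to tell runs apart.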
@@ -19,19 +19,87 @@ import { CUSTOM_ITERATION_START_NODE } from '../nodes/iteration-start/constants'
 import { CUSTOM_LOOP_START_NODE } from '../nodes/loop-start/constants'

 export const getLayoutByDagre = (originNodes: Node[], originEdges: Edge[]) => {
-  const dagreGraph = new dagre.graphlib.Graph()
+  const dagreGraph = new dagre.graphlib.Graph({ compound: true })
   dagreGraph.setDefaultEdgeLabel(() => ({}))

   const nodes = cloneDeep(originNodes).filter(node => !node.parentId && node.type === CUSTOM_NODE)
   const edges = cloneDeep(originEdges).filter(edge => (!edge.data?.isInIteration && !edge.data?.isInLoop))

+  // The default dagre layout algorithm often fails to correctly order the branches
+  // of an If/Else node, leading to crossed edges.
+  //
+  // To solve this, we employ a "virtual container" strategy:
+  // 1. A virtual, compound parent node (the "container") is created for each If/Else node's branches.
+  // 2. Each direct child of the If/Else node is preceded by a virtual dummy node. These dummies are placed inside the container.
+  // 3. A rigid, sequential chain of invisible edges is created between these dummy nodes (e.g., dummy_IF -> dummy_ELIF -> dummy_ELSE).
+  //
+  // This forces dagre to treat the ordered branches as an unbreakable, atomic group,
+  // ensuring their layout respects the intended logical sequence.
+  const ifElseNodes = nodes.filter(node => node.data.type === BlockEnum.IfElse)
+  let virtualLogicApplied = false
+
+  ifElseNodes.forEach((ifElseNode) => {
+    const childEdges = edges.filter(e => e.source === ifElseNode.id)
+    if (childEdges.length <= 1)
+      return
+
+    virtualLogicApplied = true
+    const sortedChildEdges = childEdges.sort((edgeA, edgeB) => {
+      const handleA = edgeA.sourceHandle
+      const handleB = edgeB.sourceHandle
+
+      if (handleA && handleB) {
+        const cases = (ifElseNode.data as any).cases || []
+        const isAElse = handleA === 'false'
+        const isBElse = handleB === 'false'
+
+        if (isAElse) return 1
+        if (isBElse) return -1
+
+        const indexA = cases.findIndex((c: any) => c.case_id === handleA)
+        const indexB = cases.findIndex((c: any) => c.case_id === handleB)
+
+        if (indexA !== -1 && indexB !== -1)
+          return indexA - indexB
+      }
+      return 0
+    })
+
+    const parentDummyId = `dummy-parent-${ifElseNode.id}`
+    dagreGraph.setNode(parentDummyId, { width: 1, height: 1 })
+
+    const dummyNodes: string[] = []
+    sortedChildEdges.forEach((edge) => {
+      const dummyNodeId = `dummy-${edge.source}-${edge.target}`
+      dummyNodes.push(dummyNodeId)
+      dagreGraph.setNode(dummyNodeId, { width: 1, height: 1 })
+      dagreGraph.setParent(dummyNodeId, parentDummyId)
+
+      const edgeIndex = edges.findIndex(e => e.id === edge.id)
+      if (edgeIndex > -1)
+        edges.splice(edgeIndex, 1)
+
+      edges.push({ id: `e-${edge.source}-${dummyNodeId}`, source: edge.source, target: dummyNodeId, sourceHandle: edge.sourceHandle } as Edge)
+      edges.push({ id: `e-${dummyNodeId}-${edge.target}`, source: dummyNodeId, target: edge.target, targetHandle: edge.targetHandle } as Edge)
+    })
+
+    for (let i = 0; i < dummyNodes.length - 1; i++) {
+      const sourceDummy = dummyNodes[i]
+      const targetDummy = dummyNodes[i + 1]
+      edges.push({ id: `e-dummy-${sourceDummy}-${targetDummy}`, source: sourceDummy, target: targetDummy } as Edge)
+    }
+  })
+
   dagreGraph.setGraph({
     rankdir: 'LR',
     align: 'UL',
     nodesep: 40,
-    ranksep: 60,
+    ranksep: virtualLogicApplied ? 30 : 60,
+    ranker: 'tight-tree',
     marginx: 30,
     marginy: 200,
   })

   nodes.forEach((node) => {
     dagreGraph.setNode(node.id, {
       width: node.width!,
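The comment block in the hunk above describes the ordering rule the comparator encodes: the `else` branch (sourceHandle `'false'`) always sorts last, and case branches follow their position in the node's `cases` array. A dependency-free sketch of just that comparator, with the edge and case shapes simplified to hypothetical minimal types (dagre itself is not involved here):

```typescript
// Simplified stand-ins; the real Edge type and If/Else node data carry much more.
type BranchEdge = { sourceHandle: string }
type Case = { case_id: string }

// Mirrors the sort in the hunk above: 'false' (the else branch) sorts last,
// other handles sort by their index in the If/Else node's case list.
function sortBranchEdges(edges: BranchEdge[], cases: Case[]): BranchEdge[] {
  return [...edges].sort((a, b) => {
    if (a.sourceHandle === 'false') return 1
    if (b.sourceHandle === 'false') return -1
    const ia = cases.findIndex(c => c.case_id === a.sourceHandle)
    const ib = cases.findIndex(c => c.case_id === b.sourceHandle)
    if (ia !== -1 && ib !== -1) return ia - ib
    return 0
  })
}
```

Chaining the dummy nodes in this sorted order with invisible edges (dummy_IF → dummy_ELIF → dummy_ELSE) is what then forces dagre to rank the real branches in the same sequence.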
@@ -1,7 +1,7 @@
 export * from './node'
 export * from './edge'
 export * from './workflow-init'
-export * from './layout'
+export * from './dagre-layout'
 export * from './common'
 export * from './tool'
 export * from './workflow'
@@ -314,14 +314,22 @@ else if (globalThis.document?.body?.getAttribute('data-public-max-iterations-num

 export const MAX_ITERATIONS_NUM = maxIterationsNum

-export const ENABLE_WEBSITE_JINAREADER = process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER !== undefined
-  ? process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER === 'true'
-  : globalThis.document?.body?.getAttribute('data-public-enable-website-jinareader') === 'true' || true
+let enableWebsiteJinaReader = true
+let enableWebsiteFireCrawl = true
+let enableWebsiteWaterCrawl = false

-export const ENABLE_WEBSITE_FIRECRAWL = process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL !== undefined
-  ? process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL === 'true'
-  : globalThis.document?.body?.getAttribute('data-public-enable-website-firecrawl') === 'true' || true
+const getBooleanConfig = (envVar: string | undefined, attr: string) => {
+  if (envVar !== undefined && envVar !== '')
+    return envVar === 'true'
+  const attrValue = globalThis.document?.body?.getAttribute(attr)
+  if (attrValue !== undefined && attrValue !== '')
+    return attrValue === 'true'
+  return false
+}

-export const ENABLE_WEBSITE_WATERCRAWL = process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL !== undefined
-  ? process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL === 'true'
-  : globalThis.document?.body?.getAttribute('data-public-enable-website-watercrawl') === 'true' || true
+enableWebsiteJinaReader = getBooleanConfig(process.env.NEXT_PUBLIC_ENABLE_WEBSITE_JINAREADER, 'data-public-enable-website-jinareader')
+enableWebsiteFireCrawl = getBooleanConfig(process.env.NEXT_PUBLIC_ENABLE_WEBSITE_FIRECRAWL, 'data-public-enable-website-firecrawl')
+enableWebsiteWaterCrawl = getBooleanConfig(process.env.NEXT_PUBLIC_ENABLE_WEBSITE_WATERCRAWL, 'data-public-enable-website-watercrawl')
+export const ENABLE_WEBSITE_JINAREADER = enableWebsiteJinaReader
+export const ENABLE_WEBSITE_FIRECRAWL = enableWebsiteFireCrawl
+export const ENABLE_WEBSITE_WATERCRAWL = enableWebsiteWaterCrawl
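The new `getBooleanConfig` helper replaces three copies of the same ternary with one precedence chain: a non-empty env var wins, then the `data-public-*` body attribute, then `false`. Copied standalone, that precedence can be checked directly; outside a browser `globalThis.document` is undefined, so only the env branch fires:

```typescript
// Copied from the hunk above: env var wins, then the data-public-* body
// attribute, then false.
const getBooleanConfig = (envVar: string | undefined, attr: string) => {
  if (envVar !== undefined && envVar !== '')
    return envVar === 'true'
  const attrValue = globalThis.document?.body?.getAttribute(attr)
  if (attrValue !== undefined && attrValue !== '')
    return attrValue === 'true'
  return false
}

// Outside a browser, document is undefined and the attribute branch is skipped:
console.log(getBooleanConfig('true', 'data-public-enable-website-jinareader'))  // true
console.log(getBooleanConfig('false', 'data-public-enable-website-jinareader')) // false
console.log(getBooleanConfig(undefined, 'data-public-enable-website-jinareader')) // false
```

Note the old expressions ended in `=== 'true' || true`, which made the attribute fallback unconditionally truthy; the helper's explicit `return false` default is the behavioral difference the rewrite introduces.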
@@ -154,10 +154,6 @@ export const ProviderContextProvider = ({
       setIsFetchedPlan(true)
     }

     if (data.model_load_balancing_enabled)
       setModelLoadBalancingEnabled(true)
     if (data.dataset_operator_enabled)
       setDatasetOperatorEnabled(true)
-    if (data.model_load_balancing_enabled)
-      setModelLoadBalancingEnabled(true)
-    if (data.dataset_operator_enabled)
-      setDatasetOperatorEnabled(true)
@@ -137,6 +137,7 @@ const translation = {
     readyToInstall: 'Über die Installation des folgenden Plugins',
     dropPluginToInstall: 'Legen Sie das Plugin-Paket hier ab, um es zu installieren',
     next: 'Nächster',
+    installWarning: 'Dieses Plugin darf nicht installiert werden.',
   },
   installFromGitHub: {
     selectPackagePlaceholder: 'Bitte wählen Sie ein Paket aus',
@@ -173,7 +174,7 @@ const translation = {
     recentlyUpdated: 'Kürzlich aktualisiert',
   },
   viewMore: 'Mehr anzeigen',
-  sortBy: 'Schwarze Stadt',
+  sortBy: 'Sortieren nach',
   discover: 'Entdecken',
   noPluginFound: 'Kein Plugin gefunden',
   difyMarketplace: 'Dify Marktplatz',
@@ -137,6 +137,7 @@ const translation = {
     dropPluginToInstall: 'Suelte el paquete del complemento aquí para instalarlo',
     readyToInstallPackage: 'A punto de instalar el siguiente plugin',
     installedSuccessfully: 'Instalación exitosa',
+    installWarning: 'Este plugin no está permitido para instalar.',
   },
   installFromGitHub: {
     uploadFailed: 'Error de carga',
@@ -175,7 +176,7 @@ const translation = {
   empower: 'Potencie su desarrollo de IA',
   moreFrom: 'Más de Marketplace',
   viewMore: 'Ver más',
-  sortBy: 'Ciudad negra',
+  sortBy: 'Ordenar por',
   noPluginFound: 'No se ha encontrado ningún plugin',
   pluginsResult: '{{num}} resultados',
   discover: 'Descubrir',
@@ -137,6 +137,7 @@ const translation = {
     next: 'Prochain',
     installPlugin: 'Installer le plugin',
     installFailedDesc: 'L’installation du plug-in a échoué.',
+    installWarning: 'Ce plugin n’est pas autorisé à être installé.',
   },
   installFromGitHub: {
     installFailed: 'Échec de l’installation',
@@ -137,6 +137,7 @@ const translation = {
     installing: 'Installazione...',
     install: 'Installare',
     readyToInstallPackages: 'Sto per installare i seguenti plugin {{num}}',
+    installWarning: 'Questo plugin non è consentito essere installato.',
   },
   installFromGitHub: {
     installedSuccessfully: 'Installazione riuscita',
@@ -178,7 +179,7 @@ const translation = {
   pluginsResult: '{{num}} risultati',
   noPluginFound: 'Nessun plug-in trovato',
   empower: 'Potenzia lo sviluppo dell\'intelligenza artificiale',
-  sortBy: 'Città nera',
+  sortBy: 'Ordina per',
   and: 'e',
   viewMore: 'Vedi di più',
   verifiedTip: 'Verificato da Dify',
@@ -137,6 +137,7 @@ const translation = {
     installing: 'Instalar...',
     uploadingPackage: 'Carregando {{packageName}} ...',
     dropPluginToInstall: 'Solte o pacote de plug-in aqui para instalar',
+    installWarning: 'Este plugin não é permitido ser instalado.',
   },
   installFromGitHub: {
     selectVersionPlaceholder: 'Selecione uma versão',
@@ -172,7 +173,7 @@ const translation = {
     recentlyUpdated: 'Atualizado recentemente',
     newlyReleased: 'Recém-lançado',
   },
-  sortBy: 'Cidade negra',
+  sortBy: 'Ordenar por',
   viewMore: 'Ver mais',
   and: 'e',
   pluginsResult: '{{num}} resultados',
@@ -137,6 +137,7 @@ const translation = {
     pluginLoadErrorDesc: 'Acest plugin nu va fi instalat',
     installedSuccessfullyDesc: 'Pluginul a fost instalat cu succes.',
     readyToInstall: 'Despre instalarea următorului plugin',
+    installWarning: 'Acest plugin nu este permis să fie instalat.',
   },
   installFromGitHub: {
     installFailed: 'Instalarea a eșuat',
@@ -173,7 +174,7 @@ const translation = {
     firstReleased: 'Prima lansare',
   },
   noPluginFound: 'Nu s-a găsit niciun plugin',
-  sortBy: 'Orașul negru',
+  sortBy: 'Sortează după',
   discover: 'Descoperi',
   empower: 'Îmbunătățește-ți dezvoltarea AI',
   pluginsResult: '{{num}} rezultate',