dify/api

Latest commit: fix(dataset): dataset tags service_api error "Dataset not found" (#30028) by zhaobingshuang (fa1009b938), 2025-12-26 10:55:42 +08:00
Co-authored-by: zbs <zbs@cailian.onaliyun.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>

| Name | Last commit message | Last commit date |
| --- | --- | --- |
| .idea | fix nltk averaged_perceptron_tagger download and fix score limit is none (#7582) | 2024-08-26 15:14:05 +08:00 |
| .vscode | feat: introduce trigger functionality (#27644) | 2025-11-12 17:59:37 +08:00 |
| agent_skills | feat: introduce trigger functionality (#27644) | 2025-11-12 17:59:37 +08:00 |
| configs | feat: Add OSS-specific parameters for HW and ALI private deployment (#29705) | 2025-12-22 21:59:32 +08:00 |
| constants | feat(i18n): add Tunisian Arabic (ar-TN) translation (#29306) | 2025-12-13 10:55:04 +08:00 |
| contexts | feat: introduce trigger functionality (#27644) | 2025-11-12 17:59:37 +08:00 |
| controllers | fix(dataset): dataset tags service_api error "Dataset not found" (#30028) | 2025-12-26 10:55:42 +08:00 |
| core | fix: validate first then save to db (#30107) | 2025-12-25 19:36:52 +09:00 |
| docker | feat: sandbox retention basic settings (#29842) | 2025-12-18 14:16:23 +08:00 |
| enums | feat: trigger billing (#28335) | 2025-11-20 10:15:23 +08:00 |
| events | fix: delete knowledge pipeline but pipeline and workflow don't delete (#29591) | 2025-12-15 12:00:03 +08:00 |
| extensions | feat: Add OSS-specific parameters for HW and ALI private deployment (#29705) | 2025-12-22 21:59:32 +08:00 |
| factories | fix: webhook node output file as file variable (#29621) | 2025-12-15 19:55:59 +08:00 |
| fields | Feat/support multimodal embedding (#29115) | 2025-12-09 14:41:46 +08:00 |
| libs | refactor: split changes for api/libs/helper.py (#29875) | 2025-12-19 12:00:34 +08:00 |
| migrations | feat: Add "type" field to PipelineRecommendedPlugin model; Add query param "type" to recommended-plugins api. (#29736) | 2025-12-17 11:26:08 +08:00 |
| models | feat: Add "type" field to PipelineRecommendedPlugin model; Add query param "type" to recommended-plugins api. (#29736) | 2025-12-17 11:26:08 +08:00 |
| repositories | Enhanced GraphEngine Pause Handling (#28196) | 2025-11-26 19:59:34 +08:00 |
| schedule | feat: trigger billing (#28335) | 2025-11-20 10:15:23 +08:00 |
| services | feat: add mcp tool display directly (#30019) | 2025-12-26 10:41:10 +08:00 |
| tasks | fix: update Notion credential retrieval in document indexing sync task (#29933) | 2025-12-21 16:51:24 +08:00 |
| templates | Update email templates to improve clarity and consistency in messagin… (#26970) | 2025-10-16 01:42:22 -07:00 |
| tests | feat: add mcp tool display directly (#30019) | 2025-12-26 10:41:10 +08:00 |
| .dockerignore | Enhance Code Consistency Across Repository with `.editorconfig` (#19023) | 2025-04-29 18:04:33 +08:00 |
| .env.example | feat: Add OSS-specific parameters for HW and ALI private deployment (#29705) | 2025-12-22 21:59:32 +08:00 |
| .importlinter | Enhanced GraphEngine Pause Handling (#28196) | 2025-11-26 19:59:34 +08:00 |
| .ruff.toml | feat(api): Implement EventManager error logging and add coverage (#29204) | 2025-12-08 09:40:40 +08:00 |
| AGENTS.md | feat: introduce trigger functionality (#27644) | 2025-11-12 17:59:37 +08:00 |
| Dockerfile | docker: use `COPY --chown` in api Dockerfile to avoid adding layers by explicit `chown` calls (#28756) | 2025-11-28 11:33:06 +08:00 |
| README.md | feat: sandbox retention basic settings (#29842) | 2025-12-18 14:16:23 +08:00 |
| app.py | fix: determine cpu cores determination in baseedpyright-check script on macos (#28058) | 2025-11-12 19:27:27 +08:00 |
| app_factory.py | feat: Add Aliyun SLS (Simple Log Service) integration for workflow execution logging (#28986) | 2025-12-17 13:43:54 +08:00 |
| celery_entrypoint.py | chore(api): adjust monkey patching in gunicorn.conf.py (#26056) | 2025-09-22 18:23:01 +08:00 |
| cnt_base.sh | add cnt script and one more example (#28272) | 2025-11-18 16:44:14 +09:00 |
| commands.py | fix: make remove-orphaned-files-on-storage management command work and safer (#29247) | 2025-12-08 09:48:05 +08:00 |
| dify_app.py | refactor: assembling the app features in modular way (#9129) | 2024-11-30 23:05:22 +08:00 |
| gunicorn.conf.py | docs(api): update docs about gevent setup in app.py (#27611) | 2025-10-30 15:43:08 +08:00 |
| pyproject.toml | chore: bump version to 1.11.2 (#30088) | 2025-12-25 16:16:24 +08:00 |
| pyrefly.toml | refactor: port reqparse to Pydantic model (#28949) | 2025-12-05 13:05:53 +09:00 |
| pyrightconfig.json | rm type ignore (#25715) | 2025-10-21 11:26:58 +08:00 |
| pytest.ini | test: Consolidate API CI test runner (#29440) | 2025-12-15 13:20:31 +08:00 |
| ty.toml | chore: apply ty checks on api code with script and ci action (#24653) | 2025-09-02 16:05:13 +08:00 |
| uv.lock | chore: bump version to 1.11.2 (#30088) | 2025-12-25 16:16:24 +08:00 |

README.md

Dify Backend API

Usage

[!IMPORTANT]

In the v1.3.0 release, poetry has been replaced with uv as the package manager for the Dify API backend service.

  1. Start the docker-compose stack

    The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose (a quick status check is shown below).

    cd ../docker
    cp middleware.env.example middleware.env
    # Change the profile to mysql if you are not using PostgreSQL; change to another vector database profile if you are not using Weaviate
    docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
    cd ../api
    
  2. Copy .env.example to .env

    cp .env.example .env
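
    Optionally, verify that the middleware containers from step 1 are up before continuing. A minimal check, assuming the dify project name used above:

    docker compose -f ../docker/docker-compose.middleware.yaml -p dify ps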
    

[!IMPORTANT]

When the frontend and backend run on different subdomains, set COOKIE_DOMAIN to the site's top-level domain (e.g., example.com). The frontend and backend must be under the same top-level domain to share authentication cookies.
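
For example, a hypothetical setup that serves the web console from app.example.com and the API from api.example.com (placeholder hostnames) would share authentication cookies with a .env entry like:

    # Cookies are scoped to the shared parent domain (placeholder value)
    COOKIE_DOMAIN=example.com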

  3. Generate a SECRET_KEY in the .env file.

    bash for Linux

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    

    bash for Mac

    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
    
  4. Create the environment.

    The Dify API service uses uv to manage dependencies. First, install the uv package manager if you don't have it already.

    pip install uv
    # Or on macOS
    brew install uv
    
  5. Install dependencies

    uv sync --dev
    
  6. Run database migrations

    Before the first launch, migrate the database to the latest version.

    uv run flask db upgrade
    
  7. Start the backend

    uv run flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Start the Dify web service.

  9. Set up your application by visiting http://localhost:3000.

  10. If you need to handle and debug async tasks (e.g. dataset importing and document indexing), start the worker service with the command below.

uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention
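
During local debugging you usually do not need every queue; a trimmed-down sketch that only consumes the dataset-related queues from the command above would be:

uv run celery -A app.celery worker -P threads -c 1 --loglevel DEBUG -Q dataset,priority_dataset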

Additionally, if you want to debug Celery's scheduled tasks, run the following command in another terminal to start the beat service:

uv run celery -A app.celery beat
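
To confirm that the running workers are connected to the broker, Celery's built-in inspect command can be used; for example:

uv run celery -A app.celery inspect ping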

Testing

  1. Install dependencies for both the backend and the test environment

    uv sync --dev
    
  2. Run the tests locally with the mocked system environment variables defined in the tool.pytest_env section of pyproject.toml; see Claude.md for more details.

    uv run pytest                           # Run all tests
    uv run pytest tests/unit_tests/         # Unit tests only
    uv run pytest tests/integration_tests/  # Integration tests
    
    # Code quality
    ../dev/reformat               # Run all formatters and linters
    uv run ruff check --fix ./    # Fix linting issues
    uv run ruff format ./         # Format code
    uv run basedpyright .         # Type checking
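
    To iterate on a single area, the usual pytest selection flags work as well; a sketch (the keyword and path here are only examples):

    uv run pytest tests/unit_tests -k "dataset" -x -q   # Filter by keyword, stop on first failure, quiet output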