**Problem:**
The telemetry system accumulated unnecessary abstraction layers and bad practices
over the last three commits that introduced the gateway implementation:
- TelemetryFacade class wrapper around emit() function
- String literals instead of SignalType enum
- Dictionary mapping enum → string instead of enum → enum
- Unnecessary ENTERPRISE_TELEMETRY_GATEWAY_ENABLED feature flag
- Duplicate guard checks scattered across files
- Non-thread-safe TelemetryGateway singleton pattern
- Missing guard in ops_trace_task.py causing RuntimeError spam
**Solution:**
1. Deleted TelemetryFacade - replaced with thin emit() function in core/telemetry/__init__.py
2. Added SignalType enum ('trace' | 'metric_log') to enterprise/telemetry/contracts.py
3. Replaced CASE_TO_TRACE_TASK_NAME dict with CASE_TO_TRACE_TASK: dict[TelemetryCase, TraceTaskName]
4. Deleted is_gateway_enabled() and _emit_legacy() - using existing ENTERPRISE_ENABLED + ENTERPRISE_TELEMETRY_ENABLED instead
5. Extracted _should_drop_ee_only_event() helper to eliminate duplicate checks
6. Moved TelemetryGateway singleton to ext_enterprise_telemetry.py:
- Init once in init_app() for thread-safety
- Access via get_gateway() function
7. Re-added guard to ops_trace_task.py to prevent RuntimeError when EE=OFF but CE tracing enabled
8. Updated 11 caller files to import 'emit as telemetry_emit' instead of 'TelemetryFacade'
**Result:**
- 322 net lines deleted (533 removed, 211 added)
- All 91 tests pass
- Thread-safe singleton pattern
- Cleaner API surface: from TelemetryFacade.emit() to telemetry_emit()
- Proper enum usage throughout
- No RuntimeError spam in EE=OFF + CE=ON scenario
# Dify Backend API

## Setup and Run
> [!IMPORTANT]
> In the v1.3.0 release, `poetry` has been replaced with `uv` as the package manager for the Dify API backend service.
> `uv` and `pnpm` are required to run the setup and development commands below.
### Using scripts (recommended)

The scripts resolve paths relative to their location, so you can run them from anywhere.

1. Run setup (copies env files and installs dependencies).

   ```bash
   ./dev/setup
   ```

2. Review `api/.env`, `web/.env.local`, and `docker/middleware.env` values (see the `SECRET_KEY` note below).

3. Start middleware (PostgreSQL/Redis/Weaviate).

   ```bash
   ./dev/start-docker-compose
   ```

4. Start backend (runs migrations first).

   ```bash
   ./dev/start-api
   ```

5. Start Dify web service.

   ```bash
   ./dev/start-web
   ```

6. Set up your application by visiting http://localhost:3000.

7. Optional: start the worker service (async tasks, runs from `api`).

   ```bash
   ./dev/start-worker
   ```

8. Optional: start Celery Beat (scheduled tasks).

   ```bash
   ./dev/start-beat
   ```
### Manual commands

These commands assume you start from the repository root.

1. Start the docker-compose stack. The backend requires middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   ```bash
   cp docker/middleware.env.example docker/middleware.env
   # Use mysql or another vector database profile if you are not using postgres/weaviate.
   docker compose -f docker/docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
   ```

2. Copy env files.

   ```bash
   cp api/.env.example api/.env
   cp web/.env.example web/.env.local
   ```

3. Install uv if needed.

   ```bash
   pip install uv
   # Or on macOS:
   brew install uv
   ```

4. Install API dependencies.

   ```bash
   cd api
   uv sync --group dev
   ```

5. Install web dependencies.

   ```bash
   cd web
   pnpm install
   cd ..
   ```

6. Start the backend (runs migrations first, in a new terminal).

   ```bash
   cd api
   uv run flask db upgrade
   uv run flask run --host 0.0.0.0 --port=5001 --debug
   ```

7. Start the Dify web service (in a new terminal).

   ```bash
   cd web
   pnpm dev:inspect
   ```

8. Set up your application by visiting http://localhost:3000.

9. Optional: start the worker service (async tasks, in a new terminal).

   ```bash
   cd api
   uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,enterprise_telemetry
   ```

10. Optional: start Celery Beat (scheduled tasks, in a new terminal).

    ```bash
    cd api
    uv run celery -A app.celery beat
    ```
### Environment notes

> [!IMPORTANT]
> When the frontend and backend run on different subdomains, set `COOKIE_DOMAIN` to the site's top-level domain (e.g., `example.com`). The frontend and backend must be under the same top-level domain in order to share authentication cookies.

- Generate a `SECRET_KEY` in the `.env` file.

  On Linux:

  ```bash
  sed -i "/^SECRET_KEY=/c\\SECRET_KEY=$(openssl rand -base64 42)" .env
  ```

  On macOS:

  ```bash
  secret_key=$(openssl rand -base64 42)
  sed -i '' "/^SECRET_KEY=/c\\
  SECRET_KEY=${secret_key}" .env
  ```
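
To make the cross-subdomain cookie note concrete, here is a minimal illustrative `.env` fragment; the hostnames and domain are placeholders, not values from this repository:

```ini
# Hypothetical setup: frontend at app.example.com, API at api.example.com.
# Both live under the same top-level domain, so scoping the auth cookie
# to that domain lets the two services share it.
COOKIE_DOMAIN=example.com
```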
### Testing

1. Install dependencies for both the backend and the test environment.

   ```bash
   cd api
   uv sync --group dev
   ```

2. Run the tests locally. Mocked system environment variables are defined in the `tool.pytest_env` section of `pyproject.toml`; see Claude.md for more details.

   ```bash
   cd api
   uv run pytest                           # Run all tests
   uv run pytest tests/unit_tests/         # Unit tests only
   uv run pytest tests/integration_tests/  # Integration tests

   # Code quality
   ./dev/reformat                # Run all formatters and linters
   uv run ruff check --fix ./    # Fix linting issues
   uv run ruff format ./         # Format code
   uv run basedpyright .         # Type checking
   ```