# Dify Backend API

## Usage

  1. Start the docker-compose stack

    The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

    cd ../docker
    docker-compose -f docker-compose.middleware.yaml -p dify up -d
    cd ../api
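
    To confirm the middleware containers came up, you can list them under the same project name:

    ```shell
    docker-compose -p dify ps
    ```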
    
  2. Copy .env.example to .env
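
    From the `api` directory, the copy is a single command:

    ```shell
    cp .env.example .env
    ```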

  3. Generate a SECRET_KEY in the .env file.

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env

    On macOS, BSD sed requires an explicit (empty) backup suffix and a literal newline after `c\`:

    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
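
    To see what the `sed` replacement does without touching your real `.env`, here is a self-contained sketch against a throwaway file (GNU sed syntax; the file name and contents are made up for illustration):

    ```shell
    # Create a minimal stand-in for .env with an empty SECRET_KEY line
    printf 'SECRET_KEY=\nDB_HOST=localhost\n' > /tmp/demo.env

    # Generate a random key and splice it in, exactly as in the step above
    secret_key=$(openssl rand -base64 42)
    sed -i "/^SECRET_KEY=/c\\SECRET_KEY=${secret_key}" /tmp/demo.env

    # The SECRET_KEY line now carries the generated value
    grep '^SECRET_KEY=' /tmp/demo.env
    ```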
    
  4. Create environment.

    The Dify API service uses Poetry to manage dependencies. Run `poetry shell` to activate the environment.

    Instructions for using pip instead can be found in the "Usage with pip" section below.

  5. Install dependencies


    poetry env use 3.10
    poetry install

    If a contributor has added a package to requirements.txt without updating pyproject.toml, you can run the following instead:

    poetry shell                                         # activate the current environment
    poetry add $(cat requirements.txt)                   # install production dependencies and update pyproject.toml
    poetry add $(cat requirements-dev.txt) --group dev   # install development dependencies and update pyproject.toml
  6. Run migrations

    Before the first launch, migrate the database to the latest version.

    poetry run python -m flask db upgrade

  7. Start the backend

    poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug

  8. Start the Dify web service.

  9. Set up your application by visiting http://localhost:3000...

  10. If you need to debug local async processing, start the worker service.

    poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail

    The started celery app handles the async tasks, e.g., dataset importing and document indexing.
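
    As a quick health check, you can ping the running worker with celery's standard `inspect ping` subcommand:

    ```shell
    poetry run python -m celery -A app.celery inspect ping
    ```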

## Testing

  1. Install dependencies for both the backend and the test environment

    poetry install --with dev
    
  2. Run the tests locally, with mocked system environment variables defined in the `tool.pytest_env` section of `pyproject.toml`

    cd ../
    poetry run -C api bash dev/pytest/pytest_all_tests.sh
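
    To iterate on a subset of tests rather than the whole suite, you can invoke pytest directly; the path below is illustrative, so substitute any directory or file under `tests/`:

    ```shell
    poetry run -C api python -m pytest api/tests/unit_tests
    ```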
    

## Usage with pip

> [!NOTE]
> In the next version, we will deprecate pip as the primary package management tool for the Dify API service; for now, Poetry and pip coexist.

  1. Start the docker-compose stack

    The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

    cd ../docker
    docker-compose -f docker-compose.middleware.yaml -p dify up -d
    cd ../api
    
  2. Copy .env.example to .env

  3. Generate a SECRET_KEY in the .env file.

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    
  4. Create environment.

    If you use Anaconda, create a new environment and activate it

    conda create --name dify python=3.10
    conda activate dify
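
    If you prefer the standard library's venv module over Anaconda, the equivalent setup is a sketch like the following (assumes a `python3.10` binary on your PATH):

    ```shell
    python3.10 -m venv .venv
    source .venv/bin/activate
    ```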
    
  5. Install dependencies

    pip install -r requirements.txt
    
  6. Run migrate

    Before the first launch, migrate the database to the latest version.

    flask db upgrade
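
    To confirm the database is at the latest revision, Flask-Migrate's `current` subcommand prints the active revision:

    ```shell
    flask db current
    ```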
    
  7. Start backend:

    flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Set up your application by visiting http://localhost:5001/console/api/setup or other APIs...

  9. If you need to debug local async processing, start the worker service.

    celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
    

    The started celery app handles the async tasks, e.g., dataset importing and document indexing.