
Deploy Machine Learning Models with Keras, FastAPI, Redis and Docker --- the alternative solution --- FastAPI + Redis + Celery + Flower for handling time-consuming background tasks

Deploy Machine Learning Models with Keras, FastAPI, Redis and Docker

https://www.cnblogs.com/lightsong/p/18731396

In that example, the author implements a task queue on top of Redis, but the asynchronous message coordination is written by hand; this part of the code can in fact be replaced by Celery.
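
To make the comparison concrete, here is a minimal sketch of the same add task once Celery owns the queue. It assumes Redis on localhost:6379; the commented producer calls are illustrative, not code from the original post.

from celery import Celery

# Celery handles the broker transport, task envelope, delivery and
# result storage -- everything the hand-written Redis code did.
celery = Celery('tasks',
                broker='redis://localhost:6379',
                backend='redis://localhost:6379')

@celery.task(name='tasks.add')
def add(x: int, y: int) -> int:
    return x + y

# Producer side: .delay() replaces a manual rpush onto a Redis list,
# and AsyncResult replaces a hand-written polling loop on a result key.
# result = add.delay(2, 3)
# result.get(timeout=10)   # -> 5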


Docker FastAPI Celery Redis

https://github.com/fanqingsong/docker-fastapi-celery-redis/tree/master

A basic Docker Compose template for orchestrating a FastAPI application and a Celery queue with Redis.

Installation

git clone https://github.com/mattkohl/docker-fastapi-celery-redis

Build & Launch 

docker-compose up -d --build

This will expose the FastAPI application's endpoints on port 5001, as well as a Flower server for monitoring workers on port 5555.
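
Once the stack is up, a quick smoke test against the endpoints defined in the code below:

curl http://localhost:5001/health_check
curl http://localhost:5001/add/3/4

The second call returns a link to /check/{task_id}; fetching that URL reports PENDING until a worker finishes the job, then the result.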

To add more workers: 

docker-compose up -d --scale worker=5 --no-recreate

To shut down: 

docker-compose down

To change the endpoints, update the code in api/app.py.

Task changes should happen in celery-queue/tasks.py.

The two listings below show the API endpoints and the Celery worker code.


# api/app.py -- FastAPI front end: enqueues tasks and polls results
import celery.states as states
from fastapi import FastAPI
from fastapi.responses import HTMLResponse
from worker import celery

app = FastAPI()


@app.get("/add/{param1}/{param2}", response_class=HTMLResponse)
async def add(param1: int, param2: int) -> str:
    # Enqueue by task name; a worker consumes it from the Redis broker.
    task = celery.send_task('tasks.add', args=[param1, param2], kwargs={})
    response = f"<a href='{app.url_path_for('check_task', task_id=task.id)}'>check status of {task.id}</a>"
    return response


@app.get("/check/{task_id}", response_class=HTMLResponse)
async def check_task(task_id: str) -> str:
    res = celery.AsyncResult(task_id)
    if res.state == states.PENDING:
        # Note: unknown task ids also report PENDING.
        return res.state
    else:
        return str(res.result)


@app.get("/health_check")
async def health_check():
    return {"Status": "Ok"}


# Celery worker: app configuration, beat schedule and task definitions
import os
import time
from celery import Celery
from celery.schedules import crontab
from celery.utils.log import get_task_logger
import redis  # only needed by the commented-out Redis counter below



logger = get_task_logger(__name__)


CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://localhost:6379')
CELERY_RESULT_BACKEND = os.environ.get('CELERY_RESULT_BACKEND', 'redis://localhost:6379')

celery = Celery('tasks', broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)



# # Connect Redis db
# redis_db = redis.Redis(
#     host="localhost", port="6379", db=1, charset="utf-8", decode_responses=True
# )

# # Initialize timer in redis
# redis_db.mset({"minute": 0, "second": 0})



# Add periodic tasks
celery_beat_schedule = {
    "time_scheduler": {
        "task": "tasks.timer",
        # Run every minute (crontab granularity is one minute; for a
        # sub-minute schedule, pass seconds directly, e.g. "schedule": 1.0)
        "schedule": crontab(minute='*/1'),
    }
}


celery.conf.update(
    result_backend=CELERY_RESULT_BACKEND,
    broker_url=CELERY_BROKER_URL,
    timezone="Asia/Shanghai",
    task_serializer="json",
    accept_content=["json"],
    result_serializer="json",
    beat_schedule=celery_beat_schedule,
)


@celery.task(name='tasks.timer')
def timer():
    # second_counter = int(redis_db.get("second")) + 1
    # if second_counter >= 59:
    #     # Reset the counter
    #     redis_db.set("second", 0)
    #     # Increment the minute
    #     redis_db.set("minute", int(redis_db.get("minute")) + 1)
    # else:
    #     # Increment the second
    #     redis_db.set("second", second_counter)

    # Placeholder log lines that show the beat schedule firing.
    logger.critical("second")
    logger.critical("222222222222")


@celery.task(name='tasks.add')
def add(x: int, y: int) -> int:
    time.sleep(5)  # simulate a slow job
    return x + y
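
To run this worker outside Docker, with the beat scheduler embedded for the periodic task, the standard Celery CLI can be used; the module name tasks is an assumption about how this file is importable:

celery -A tasks worker -B --loglevel=info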


Application Scenarios

For background logic that is too slow to run inside a request handler, for example (a model-inference sketch follows this list):

(1) Model inference

(2) Model training

(3) Batch jobs
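
As an illustration of case (1), here is a minimal sketch of a Keras inference task following the same pattern as tasks.add above. The model file model.h5 and the input format are assumptions for illustration; the model is loaded at module level so each worker process loads it only once.

import numpy as np
from celery import Celery
from tensorflow import keras

celery = Celery('tasks',
                broker='redis://localhost:6379',
                backend='redis://localhost:6379')

# Assumption: a trained Keras model has been saved to model.h5.
# Loading at import time means one load per worker process, not one per task.
model = keras.models.load_model('model.h5')

@celery.task(name='tasks.predict')
def predict(features: list) -> list:
    # One forward pass per task; the queue absorbs the latency
    # instead of the HTTP request.
    batch = np.array([features], dtype='float32')
    return model.predict(batch)[0].tolist()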

