Detect faces and stream the results, the simple way!
Run the app and go to /docs, or use the endpoints directly:
- POST /image -> create a face job
- GET /jobs/{job_id} -> get data about a job
- GET /jobs/{job_id}/processed-image -> get the processed image
- WS /ws -> connect to a live stream of processed image URLs
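The endpoints above can be exercised from Python with nothing but the stdlib. A minimal client sketch follows; the base URL comes from the compose setup below, while the multipart field name (`file`) and the `id` key in the response are assumptions about the real schema:

```python
import json
import urllib.request

BASE = "http://localhost:8282"  # assumed host/port from the compose setup


def job_url(job_id: str) -> str:
    """Build the job-status URL for a given job id."""
    return f"{BASE}/jobs/{job_id}"


def processed_image_url(job_id: str) -> str:
    """Build the processed-image URL for a given job id."""
    return f"{BASE}/jobs/{job_id}/processed-image"


def create_face_job(image_path: str) -> dict:
    """POST an image to /image and return the created job as a dict.

    The multipart field name ('file') is an assumption.
    """
    # Stdlib-only multipart upload, kept short; httpx/requests would be tidier.
    boundary = "----face-job-boundary"
    with open(image_path, "rb") as f:
        body = (
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="file"; filename="{image_path}"\r\n'
            "Content-Type: application/octet-stream\r\n\r\n"
        ).encode() + f.read() + f"\r\n--{boundary}--\r\n".encode()
    req = urllib.request.Request(
        f"{BASE}/image",
        data=body,
        headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    job = create_face_job("selfie.jpg")
    print(job_url(job["id"]))  # poll this for job status
```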
$ cp example.envs .envs
$ docker compose build
$ docker compose up
app -> localhost:8282/
$ docker compose run face_backend pytest --asyncio-mode=auto
- POST request with an image to process
- backend creates a "face job", persists the original file, and runs all logic/validation on the "face job"
- backend emits a "face job" event as a detection request via the producer
- consumer(s) consume the event - the face detection request
- each consumer runs the face detector and updates the "face job" status/metadata
- when faces are detected, the consumer pushes the detection data to the job ws stream
- currently open websockets iterate over the job ws stream to get the latest detection data
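The flow above can be sketched end to end with an in-memory stand-in for the event stream (`asyncio.Queue` instead of Redis). The `FaceJob` fields and the detector are assumptions for illustration, not the real models:

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class FaceJob:
    job_id: str
    image: bytes
    status: str = "created"
    detections: list = field(default_factory=list)


def fake_detect(image: bytes) -> list:
    # Stand-in for a real face detector: one dummy bounding box.
    return [{"box": [0, 0, 10, 10]}]


async def producer(job: FaceJob, job_stream: asyncio.Queue) -> None:
    # Persisting the file and validation would happen before this;
    # then the detection request is pushed onto the job stream.
    await job_stream.put(job)


async def consumer(job_stream: asyncio.Queue, ws_stream: asyncio.Queue) -> None:
    # Consume the detection request, run the detector, update the
    # job status/metadata, and push detection data to the ws stream.
    job = await job_stream.get()
    job.detections = fake_detect(job.image)
    job.status = "done"
    await ws_stream.put({"job_id": job.job_id, "detections": job.detections})


async def main() -> dict:
    job_stream: asyncio.Queue = asyncio.Queue()
    ws_stream: asyncio.Queue = asyncio.Queue()
    job = FaceJob(job_id="j1", image=b"\x89PNG...")
    await producer(job, job_stream)
    await consumer(job_stream, ws_stream)
    return await ws_stream.get()


if __name__ == "__main__":
    print(asyncio.run(main()))
```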
              1                       2                               3
        +------------+          +------------+          +----------------------------+
 /image |            |  create  |            |   push   |                            |
------->|  face API  |--------->|  producer  |--------->|         JOB STREAM         |
  POST  |            |   job    |            |  event   |                            |
        +------------+          +------------+          +----------------------------+

+------------+                 ^                 ^                 ^
| websocket  |                 |                 |                 |
+-----+------+                 | consume         | consume         | consume
      |  4                     |                 |                 |
      |                 +------------+    +------------+    +------------+
      |                 |            |    |            |    |            |
      |                 |  producer  |    |  producer  |    |  producer  |
      |                 |            |    |            |    |            |
      |                 +------------+    +------------+    +------------+
      |                 +------------+    +------------+    +------------+
      |                 |  detector  |    |  detector  |    |  detector  |
      |                 +------------+    +------------+    +------------+
      |                        |                 |                 |
      |  5                     | push            | push            | push
      |                        v                 v                 v
      |  6                +----------------------------+
      |    read pushed    |                            |
      +------------------>|        JOB WS STREAM       |
           messages       |                            |
                          +----------------------------+
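Steps 4-6 of the diagram (open websockets iterating the job ws stream) can be modeled with a simple in-memory fan-out: every connected websocket gets its own queue, and each pushed message reaches all of them. This is a sketch of the pattern, not the real Redis-backed implementation:

```python
import asyncio


class JobWsStream:
    """In-memory stand-in for the Redis-backed job ws stream:
    each subscriber gets a queue, and every pushed message is
    fanned out to all subscribers (step 6 in the diagram)."""

    def __init__(self) -> None:
        self._subscribers: list[asyncio.Queue] = []

    def subscribe(self) -> asyncio.Queue:
        q: asyncio.Queue = asyncio.Queue()
        self._subscribers.append(q)
        return q

    async def push(self, message: dict) -> None:
        for q in self._subscribers:
            await q.put(message)


async def websocket_reader(stream: JobWsStream, received: list) -> None:
    # Stand-in for an open /ws connection: iterate the stream and
    # forward each detection message to the connected client.
    q = stream.subscribe()
    msg = await q.get()
    received.append(msg)


async def main() -> list:
    stream = JobWsStream()
    received: list = []
    readers = [
        asyncio.create_task(websocket_reader(stream, received)) for _ in range(2)
    ]
    await asyncio.sleep(0)  # let readers subscribe before pushing
    await stream.push({"job_id": "j1", "url": "/jobs/j1/processed-image"})
    await asyncio.gather(*readers)
    return received


if __name__ == "__main__":
    print(asyncio.run(main()))
```

Both simulated websockets receive the same pushed message, which is the behavior the real /ws endpoint needs when several clients watch the same job stream.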
- postgres for job data
- "bucket-like" directory storage for storing and serving files
- FastAPI as the web server
- async python for event handling (producer/consumer/handler)
- nginx to serve files as "static" + simple reverse proxy
- redis as the backend for stream handling + pub/sub architecture
- docker compose because it's simple and fast
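The stream handling above relies on the last-id contract of Redis streams: readers poll for everything after the last entry id they have seen. A toy in-memory model of that contract (not the real redis-py API, just the semantics):

```python
class MiniStream:
    """Toy model of a Redis stream: append-only entries with
    increasing ids; readers ask for everything after the last
    id they have already seen."""

    def __init__(self) -> None:
        self._entries: list[tuple[int, dict]] = []

    def xadd(self, data: dict) -> int:
        # Append an entry and return its id (Redis uses ms-seq ids;
        # a plain counter is enough for the model).
        entry_id = len(self._entries) + 1
        self._entries.append((entry_id, data))
        return entry_id

    def xread(self, last_id: int) -> list[tuple[int, dict]]:
        # Return every entry newer than last_id.
        return [(i, d) for i, d in self._entries if i > last_id]


stream = MiniStream()
stream.xadd({"job_id": "j1", "status": "done"})
stream.xadd({"job_id": "j2", "status": "done"})

last_id = 0
new = stream.xread(last_id)   # both entries
last_id = new[-1][0]
assert stream.xread(last_id) == []  # caught up, nothing new
```

Because reads are id-based rather than destructive, any number of websocket handlers can iterate the same stream independently, each keeping its own `last_id`.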
- create a package for utils
- add more unit tests
- create pyproject.toml + all the pre-commit stuff etc.
- make truly separate webserver/consumer Dockerfiles
- clean up the Dockerfiles
- move to poetry + clean up requirements
- resolve TODO comments