- Project Overview
- Features
- Technologies Used
- Prerequisites
- Installation
- Configuration
- Deployment
- Local Deployment
- Production Deployment
- Usage
- Testing
- Contributing
- License
- Contact
Coffee Queue is a service for managing and load-balancing order requests, with built-in DDoS protection. It is built with FastAPI and containerized with Docker for easy deployment. It exposes multiple APIs that handle different tasks and uses an Nginx web server as a reverse proxy.
- DDoS Protection: Protect your application from DDoS attacks.
- Multiple APIs: Separate endpoints for different functionalities (outer_api, inner_api, queue_api).
- Dockerized: Easily deployable using Docker and Docker Compose.
- Nginx Reverse Proxy: Efficient request handling and load balancing using Nginx.
- Programming Language: Python 3.9
- Framework: FastAPI
- Web Server: Nginx
- Containerization: Docker, Docker Compose
- HTTP Client: AIOHTTP
Before you begin, ensure you have met the following requirements:
- You have installed Docker and Docker Compose.
- You have a machine running Windows, macOS, or Linux.
- You have read the FastAPI documentation for any API-specific information you might need.
- Clone the repository:
git clone https://github.com/bimba-joy/coffee-queue.git
- Navigate to the project directory:
cd coffee-queue
- Dockerfile:
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.9
COPY ./app /app
COPY ./requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install --no-cache-dir -r requirements.txt
# Override the base image's default server startup and run the app directly.
CMD ["python", "main.py"]
- Docker Compose Configuration (docker-compose.yml):
version: '3.8'
services:
  app:
    build: .
    ports:
      - "8000:8000"
      - "8001:8001"
      - "9999:9999"
    networks:
      - app-net
  nginx:
    image: nginx:latest
    ports:
      - "80:80"
      - "81:81"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf
    depends_on:
      - app
    networks:
      - app-net
networks:
  app-net:
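The Compose file above mounts `./nginx/nginx.conf` into the nginx container. The repository's actual config is not reproduced here, but a minimal sketch of such a reverse-proxy config might look like the following (the upstream ports, zone name, and rate limits are assumptions based on the mapped ports and the DDoS-protection feature, not the project's real settings):

```nginx
# Hypothetical nginx.conf sketch; the real file lives at ./nginx/nginx.conf.
events {}

http {
    # Basic per-IP rate limiting as a first line of DDoS protection
    # (zone size and rate are illustrative).
    limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

    server {
        listen 80;
        location / {
            limit_req zone=perip burst=20;
            proxy_pass http://app:8000;  # outer_api
        }
    }

    server {
        listen 81;
        location / {
            limit_req zone=perip burst=20;
            proxy_pass http://app:8001;  # inner_api
        }
    }
}
```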
- Build and run the Docker containers:
docker-compose up --build
- The application will be accessible at http://localhost:80, http://localhost:81, and http://localhost:9999.
The application is deployed at http://51.250.93.255. To deploy the application to another server, follow these steps:
- Ensure Docker and Docker Compose are installed on your server.
- Transfer the repository files to your server.
- Build and run the Docker containers:
docker-compose up --build -d
To use Coffee Queue, follow these steps:
- Access the outer API at http://51.250.93.255:80.
- Access the inner API at http://51.250.93.255:81.
For more detailed instructions on how to use the APIs, refer to the FastAPI documentation.
The Coffee Queue service provides several API endpoints to manage and process orders. Below is a summary of the available endpoints and their usage.
- Create Order
- Method: POST
- URL: http://51.250.93.255:80/order/
- Description: Creates a new order and returns an order_id.
- Response:
{
  "order_id": "<generated_order_id>"
}
- Check Order Status
- Method: GET
- URL: http://51.250.93.255:80/order/{order_id}
- Description: Checks the status of an order using the order_id.
- Response:
{
  "order_id": "<order_id>",
  "status": "<order_status>"
}
- Start Order
- Method: GET
- URL: http://51.250.93.255:81/start/
- Description: Lets a worker start working on an order by fetching the next pending order from the queue.
- Finish Order
- Method: POST
- URL: http://51.250.93.255:81/finish/
- Description: For workers to mark an order as finished.
- Request Body:
{
  "order_id": "<your_order_id>"
}
Here is an example workflow using the provided API endpoints:
- Create an Order:
curl -X POST http://51.250.93.255:80/order/ -H "Content-Type: application/json"
- Check Order Status:
curl -X GET http://51.250.93.255:80/order/{order_id} -H "Content-Type: application/json"
- Worker Starts an Order:
curl -X GET http://51.250.93.255:81/start/ -H "Content-Type: application/json"
- Worker Finishes an Order:
curl -X POST http://51.250.93.255:81/finish/ -H "Content-Type: application/json" -d '{"order_id": "<your_order_id>"}'
These endpoints allow for efficient order management and processing within the Coffee Queue service.
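The order lifecycle behind these endpoints (pending → in progress → done) can be sketched as a minimal in-memory model. This is only an illustration of the workflow; the actual service implements it behind the outer and inner APIs, and the class and status names here are assumptions:

```python
from collections import deque
from uuid import uuid4


class CoffeeQueue:
    """Toy in-memory model of the order lifecycle (illustrative only)."""

    def __init__(self):
        self.statuses = {}      # order_id -> "pending" | "in_progress" | "done"
        self.pending = deque()  # FIFO of order ids awaiting a worker

    def create_order(self):
        """POST /order/ -- create an order and return its id."""
        order_id = str(uuid4())
        self.statuses[order_id] = "pending"
        self.pending.append(order_id)
        return order_id

    def order_status(self, order_id):
        """GET /order/{order_id} -- report the order's current status."""
        return self.statuses[order_id]

    def start_order(self):
        """GET /start/ -- a worker claims the next pending order."""
        order_id = self.pending.popleft()
        self.statuses[order_id] = "in_progress"
        return order_id

    def finish_order(self, order_id):
        """POST /finish/ -- a worker marks the order as done."""
        self.statuses[order_id] = "done"


queue = CoffeeQueue()
oid = queue.create_order()
print(queue.order_status(oid))   # pending
worker_oid = queue.start_order()
queue.finish_order(worker_oid)
print(queue.order_status(oid))   # done
```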
To ensure that Coffee Queue can handle the expected load, we use Locust for performing load tests. The load test scripts are located in the tests directory.
Ensure you have Locust installed. You can install it using pip:
pip install locust
The load test scripts are located in the tests directory:
- inner_api_test.py: Load test for the inner API.
- outer_api_test.py: Load test for the outer API.
- report_1718575177.6180627.html: An example report generated from a previous load test run.
To run the load tests using Locust, follow these steps:
- Navigate to the tests directory:
cd tests
- Run Locust with the desired test script. For example, to run the test for the outer API:
locust -f outer_api_test.py
- Open your browser and navigate to http://localhost:8089 to access the Locust web interface. Here, you can configure the number of users to simulate and the hatch rate (users spawned per second).
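Locust can also run without the web interface via its headless mode; for example (the user count, spawn rate, and run time below are illustrative, not values used by this project):

```shell
locust -f outer_api_test.py --headless -u 100 -r 10 --run-time 2m --host http://51.250.93.255:80
```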
Contributions are always welcome! Please follow the contributing guidelines.
- Fork the repository.
- Create a new branch (git checkout -b feature-branch).
- Make your changes.
- Commit your changes (git commit -m 'Add some feature').
- Push to the branch (git push origin feature-branch).
- Open a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
Maintainer: Apashynskyi Dmytro
Email: [email protected]
For issues, please open an issue on the issue tracker.
Thank you for using Coffee Queue! If you encounter any issues, feel free to contact us through the Contact section. Happy coding!