System for interior design style transfer - repository for the implementation of a Master's Thesis project. Created by Mateusz Ogrodowczyk, Joanna Kurczalska, Jakub Eichner
We need a dockerized version of the demo app for both testing and presentation. In this setup, all models should run locally on the machine hosting the Docker container – no requests to external APIs.
We also need a standard project structure split into back-end and front-end. Both parts should be easy to run inside containers (dockerized), while remaining separate modules that share their functionality and keep the architecture flexible.
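One way to express the back-end/front-end split above is a compose file with one service per module. This is only a sketch under assumptions – the service names, ports, directory layout (`./backend`, `./frontend`, `./models`), and the `BACKEND_URL` variable are all placeholders, not decided parts of the project:

```yaml
services:
  backend:
    build: ./backend          # inference service (FastAPI is the current candidate)
    ports:
      - "8000:8000"
    volumes:
      - ./models:/app/models  # model weights mounted from the host – no external API calls
  frontend:
    build: ./frontend         # client (Streamlit or React, to be decided)
    ports:
      - "8501:8501"
    environment:
      - BACKEND_URL=http://backend:8000
    depends_on:
      - backend
```

Keeping the two modules as separate services means either one can later be swapped (e.g. Streamlit for React) without touching the other.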
Things to consider:
Should we run a separate server for each inference type (one for segmentation, another for generation), or a single large environment that can serve both kinds of models?
Client – stay with Streamlit or switch to another framework (most likely React)?
TODO:
Decide on the backend architecture (single service vs. per-model services),
Create the backend (probably in FastAPI) that performs inference using the available models,
Create a client that communicates with the backend.
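The backend/client pair from the TODO list can be sketched end to end. Since the final stack (FastAPI on the server, Streamlit or React on the client) is still to be decided, this dependency-free sketch uses only the Python standard library: `run_inference` is a stub standing in for a locally loaded model, and the `/infer` route and payload shape are assumptions, not a fixed contract.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib import request as urlrequest


def run_inference(payload: dict) -> dict:
    # Stub standing in for local model inference (segmentation or generation);
    # the real service would run model weights shipped inside the container.
    task = payload.get("task", "segmentation")
    return {"task": task, "result": f"stubbed-{task}-output"}


class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/infer":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(run_inference(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging in this sketch.
        pass


def query_backend(base_url: str, task: str) -> dict:
    # Client-side helper: the call the Streamlit (or React) front-end would make.
    req = urlrequest.Request(
        f"{base_url}/infer",
        data=json.dumps({"task": task}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlrequest.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    server = ThreadingHTTPServer(("127.0.0.1", 0), InferenceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_address[1]}"
    print(query_backend(url, "segmentation"))
    server.shutdown()
```

The same request/response shape carries over directly to a FastAPI implementation, where `run_inference` would become the body of a path-operation function.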
Acceptance criteria:
The inference backend and the client used to interact with the system are both functional,
Services are dockerized and can be spawned with a single script / by docker compose
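The "single script" criterion could be a thin wrapper over docker compose. A hypothetical sketch (the script name and the assumption of a `docker-compose.yml` at the repo root are placeholders); by default it only prints the command, and `RUN=1` actually spawns the containers:

```shell
#!/usr/bin/env bash
# run.sh - single entry point for spawning the dockerized services (hypothetical name).
set -euo pipefail

CMD="docker compose up --build -d"
echo "Would run: $CMD"

# Pass RUN=1 to actually spawn the containers (requires Docker to be installed):
if [ "${RUN:-0}" = "1" ]; then
    exec $CMD
fi
```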