Comments (3)
Hi! Sorry for the late reply!
How much did the larger training data (4.5M images from the Places-Challenge dataset) contribute to the improved quality of Big LaMa-Fourier?
Larger data helps, but significantly less than a larger model and the other training tricks (SegmPL, large masks).
Do you think that similar results could have also been achieved for Big LaMa-Fourier with less data?
Less data means somewhat lower quality, but not dramatically so; reducing the model size, removing SegmPL, or using smaller training masks would each hurt more.
How much did the training and the inference benefit from data augmentation using segmentation masks from Detectron2?
We do not use segmentation masks from Detectron for training. We tried it at the very beginning of the project but ran into technical issues (slow, high GPU memory consumption, a CUDA re-initialization limitation), so we could not use segmentation-based mask generation effectively during training. The code is still there only because we forgot to remove it when preparing the public release. Please note that `segm_proba: 0` in all the data configs.
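For reference, the switch mentioned above sits in the mask-generator section of the data configs. A simplified, illustrative fragment (every key name here other than `segm_proba` is an assumption; check the repo's actual configs for the real layout):

```yaml
# Illustrative data-config fragment -- key names other than segm_proba
# are assumptions, not the repo's real schema.
mask_generator:
  irregular_proba: 1   # irregular stroke masks are used
  segm_proba: 0        # segmentation-based masks are disabled
```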
In our experience, mask widths matter much more than mask shapes
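The point about widths can be seen with a toy stroke-mask generator (a sketch written purely for illustration; it is not LaMa's actual mask sampler, and all names in it are made up). Keeping the stroke centerlines fixed while varying only the width changes the fraction of masked pixels far more than reshaping the strokes would:

```python
import numpy as np

def random_stroke_mask(size=256, num_strokes=4, max_len=60, width=20, rng=None):
    """Draw a few random thick polyline strokes on a square canvas.

    Returns a binary mask (1 = hole). `width` controls stroke thickness;
    with a fixed seed, the centerlines stay identical across widths.
    """
    rng = rng or np.random.default_rng(0)
    mask = np.zeros((size, size), dtype=np.uint8)
    half = width // 2
    for _ in range(num_strokes):
        x, y = rng.integers(0, size, size=2)
        for _ in range(rng.integers(2, 6)):  # segments per stroke
            angle = rng.uniform(0, 2 * np.pi)
            length = rng.integers(10, max_len)
            nx = int(np.clip(x + length * np.cos(angle), 0, size - 1))
            ny = int(np.clip(y + length * np.sin(angle), 0, size - 1))
            # rasterize the segment as a run of filled width x width squares
            steps = max(abs(nx - x), abs(ny - y), 1)
            for t in np.linspace(0.0, 1.0, steps + 1):
                cx = int(x + t * (nx - x))
                cy = int(y + t * (ny - y))
                mask[max(0, cy - half):cy + half + 1,
                     max(0, cx - half):cx + half + 1] = 1
            x, y = nx, ny
    return mask

# Same seed -> same stroke shapes; only the width differs.
thin = random_stroke_mask(width=5, rng=np.random.default_rng(42))
thick = random_stroke_mask(width=40, rng=np.random.default_rng(42))
print(thin.mean(), thick.mean())  # the thick variant covers far more area
```

Training with wide masks like the `thick` variant forces the model to rely on long-range context, which is the regime where LaMa's Fourier convolutions pay off.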
Hi Roman,
awesome!! Thank you for sharing your experience! This really helps a lot and confirms my preliminary findings :)