
Comments (6)

MoonRide303 commented on July 20, 2024

@valdecircarvalho Can you access (error) logs from that server where MRE is running, by any chance?

from fooocus-mre.

valdecircarvalho commented on July 20, 2024

Hello @MoonRide303
Yes, I have full access to the server. Could you please point me to where the Fooocus-MRE logs are stored?


valdecircarvalho commented on July 20, 2024

Below is the output from program start and a successfully generated image when accessing it via ip:port.
When I try to access it via fooocus.domain.com, no error is displayed on the console; the error appears only in the web browser.
Here is the link if you want to try it: https://fooocus.localhostcloud.com/
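A symptom like this (working via ip:port but broken through the domain) often means the reverse proxy in front of the app is not forwarding the WebSocket upgrade that the Gradio UI relies on. If nginx is the proxy (an assumption; the thread does not say what serves the domain), a minimal sketch of a proxying server block might look like:

```nginx
# Hypothetical server block; adjust names and ports to the actual setup.
# TLS directives are omitted for brevity.
server {
    listen 80;
    server_name fooocus.localhostcloud.com;

    location / {
        proxy_pass http://127.0.0.1:7866;
        proxy_http_version 1.1;
        # Forward the WebSocket upgrade headers the Gradio UI needs.
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        # Allow long-running image generations without a proxy timeout.
        proxy_read_timeout 300s;
    }
}
```

The server_name and upstream port mirror the setup described above; this is a sketch of the usual fix for this class of symptom, not a confirmed diagnosis for this server.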

(fooocus-mre) ubuntu@sd-linux:~/ai/Fooocus-MRE$ python launch.py --listen 0.0.0.0 --port 7866
Python 3.10.12 (main, Jul 5 2023, 18:54:27) [GCC 11.2.0]
Fooocus version: 1.0.41 MRE
Inference Engine exists.
Inference Engine checkout finished.
Total VRAM 16151 MB, total RAM 181123 MB
xformers version: 0.0.21
Set vram state to: NORMAL_VRAM
Device: cuda:0 Tesla V100-SXM2-16GB : cudaMallocAsync
Using xformers cross attention
Running on local URL: http://0.0.0.0:7866

To create a public link, set share=True in launch().
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 10 heads.
[... the same "Setting up MemoryEfficientCrossAttention" line repeats for the remaining base-model attention blocks (query dims 640/1280, 10/20 heads) ...]
model_type EPS
adm 2816
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
missing {'cond_stage_model.clip_l.text_projection', 'cond_stage_model.clip_g.transformer.text_model.embeddings.position_ids', 'cond_stage_model.clip_l.logit_scale'}
Base model loaded: sd_xl_base_1.0_0.9vae.safetensors
App started successful. Use the app with http://localhost:7866/ or 0.0.0.0:7866
Setting up MemoryEfficientCrossAttention. Query dim is 768, context_dim is None and using 12 heads.
[... repeats likewise for the remaining refiner-model attention blocks (query dims 768/1536, 12/24 heads) ...]
model_type EPS
adm 2560
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
missing {'cond_stage_model.clip_g.transformer.text_model.embeddings.position_ids'}
Refiner model loaded: sd_xl_refiner_1.0_0.9vae.safetensors
LoRAs loaded: [('sd_xl_offset_example-lora_1.0.safetensors', 0.5), ('None', 0.5), ('None', 0.5), ('None', 0.5), ('None', 0.5)]
loading new
loading new
/home/ubuntu/anaconda3/envs/fooocus-mre/lib/python3.10/site-packages/torch/_utils.py:776: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
return self.fget.get(instance, owner)()
loading new
0%| | 0/30 [00:00<?, ?it/s]/home/ubuntu/anaconda3/envs/fooocus-mre/lib/python3.10/site-packages/torchsde/_brownian/brownian_interval.py:594: UserWarning: Should have tb<=t1 but got tb=14.614643096923828 and t1=14.614643.
warnings.warn(f"Should have {tb_name}<=t1 but got {tb_name}={tb} and t1={self._end}.")
67%|████████████████████████████████████████████████████████████████████████████▋ | 20/30 [00:07<00:02, 3.55it/s]loading new
Refiner swapped.
93%|███████████████████████████████████████████████████████████████████████████████████████████████████████████▎ | 28/30 [00:11<00:00, 2.85it/s]/home/ubuntu/anaconda3/envs/fooocus-mre/lib/python3.10/site-packages/torchsde/_brownian/brownian_interval.py:585: UserWarning: Should have ta>=t0 but got ta=0.02916753850877285 and t0=0.029168.
warnings.warn(f"Should have ta>=t0 but got ta={ta} and t0={self._start}.")
100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 30/30 [00:11<00:00, 2.54it/s]Image generated with private log at: /home/ubuntu/ai/Fooocus-MRE/outputs/2023-08-30/log.html
loading new
67%|████████████████████████████████████████████████████████████████████████████▋ | 20/30 [00:05<00:02, 3.53it/s]loading new
Refiner swapped.
100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 30/30 [00:10<00:00, 2.88it/s]Image generated with private log at: /home/ubuntu/ai/Fooocus-MRE/outputs/2023-08-30/log.html
/home/ubuntu/ai/Fooocus-MRE/outputs/2023-08-30/


MoonRide303 commented on July 20, 2024

@valdecircarvalho I observed a similar issue when trying to use the refiner model in Colab. Please try setting the refiner model to None, or use the updated Colab file from my repository (the updated version starts with the refiner disabled).


valdecircarvalho commented on July 20, 2024

@MoonRide303
Thanks for the prompt response.
I'm running it on a bare-metal server at a cloud provider. I can't / don't know how to run it on Colab (I'll do some research).
Could you please explain how to set the refiner model to None? When I access it via domain:port, the error pops up as soon as I click the Advanced checkbox.
You can check it here: https://fooocus.localhostcloud.com/

(screenshot attached)


MoonRide303 commented on July 20, 2024

@valdecircarvalho You can disable the refiner by default by copying settings-no-refiner.json to settings.json.
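From the Fooocus-MRE checkout, that copy is a couple of commands (a sketch, assuming the file names above and the install path shown in the logs):

```shell
# Run from the Fooocus-MRE checkout (path taken from the logs above).
cd ~/ai/Fooocus-MRE
# Back up the current settings if they exist, then switch to the
# refiner-free preset so the refiner is disabled on startup.
[ -f settings.json ] && cp settings.json settings.json.bak
cp settings-no-refiner.json settings.json
```

Restart launch.py afterwards so the new settings are picked up.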

Using Colab is pretty simple: open the Fooocus-MRE notebook in Colab via this link (also available in the README), press run (setup and model downloads can take a few minutes), and once it finishes you should see a Gradio live link where you can access the app.

