
Comments (8)

MasterJH5574 commented on May 19, 2024

The use of tune_relax can look something like this:

import os

import tvm
from tvm import meta_schedule as ms

ms.relax_integration.tune_relax(
    mod=mod_deploy,
    target=tvm.target.Target("apple/m1-gpu-restricted"),  # for WebGPU 256-thread limitation
    params={},
    builder=ms.builder.LocalBuilder(
        max_workers=os.cpu_count(),
    ),
    runner=ms.runner.LocalRunner(timeout_sec=60),
    work_dir="log_db",
    max_trials_global=50000,
    max_trials_per_task=2000,
)

from web-stable-diffusion.

Civitasv commented on May 19, 2024

After finishing tuning, I use:

with args.target, db, tvm.transform.PassContext(opt_level=3):
    mod_deploy = relax.transform.MetaScheduleApplyDatabase(enable_warning=True)(mod)

It will show many warnings like:

[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: matmul23
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d14_add24_add25
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: take
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d37_add34_add35_divide7
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d24_add10
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d7_add10
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_matmul28_add27_add28
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_matmul11_add11_strided_slice4
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d4_add10_add12
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_matmul9_add8_gelu

Then, when I use relax.build, it shows the typical Did you forget to bind? error.

I don't know why this happens.
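One way to narrow a mismatch like this down is to check, before building, which PrimFuncs the tuning database actually covers. The sketch below is hypothetical and assumes the installed TVM exposes ms.relax_integration.extract_tasks and Database.has_workload (both part of TVM Unity's meta_schedule, though signatures may differ across versions):

```python
from tvm import meta_schedule as ms


def report_missing_records(mod, target, db):
    # Extract tasks the same way tune_relax does, so each TIR function
    # is normalized identically before the database lookup.
    tasks = ms.relax_integration.extract_tasks(mod, target, params={})
    missing = []
    for task in tasks:
        # task.dispatched holds the candidate IRModules for this task;
        # the database keys on the first entry.
        if not db.has_workload(task.dispatched[0]):
            missing.append(task.task_name)
    return missing
```

Any names returned here should match the primfuncs named in the warnings above.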

cc @tqchen @MasterJH5574


nineis7 commented on May 19, 2024

You can use diffusers==0.15.0, and the problems may all be solved. ^^


Civitasv commented on May 19, 2024

You can use diffusers==0.15.0, and the problems may all be solved. ^^

Thanks for your help! But I've tried this, and it still doesn't work.

[two screenshots of the error output omitted]


MasterJH5574 commented on May 19, 2024

Hi @Civitasv, thanks for the question! We used meta_schedule.relax_integration.tune_relax to tune the IRModule mod_deploy.

I guess the mismatch you observed is because both the TIR extraction of tune_relax and MetaScheduleApplyDatabase will “normalize” each TIR function, while tune_tir does not. You can try tune_relax and see if it works for your case.
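For context, the Did you forget to bind? error means some PrimFuncs reached relax.build on a GPU target without their loops bound to thread/block indices, which is exactly what happens to functions that had no tuning record. A minimal sketch of a fallback, assuming the installed TVM ships the tir.transform.DefaultGPUSchedule pass (present in TVM Unity) and that tuning wrote the usual database_workload.json/database_tuning_record.json files under the work directory:

```python
import tvm
from tvm import relax, meta_schedule as ms

# Assumed paths: tune_relax with work_dir="log_db" typically writes these files.
db = ms.database.JSONDatabase(
    path_workload="log_db/database_workload.json",
    path_tuning_record="log_db/database_tuning_record.json",
)
with tvm.target.Target("cuda"), db, tvm.transform.PassContext(opt_level=3):
    mod = relax.transform.MetaScheduleApplyDatabase(enable_warning=True)(mod_deploy)
    # Bind threads for any PrimFunc the database did not cover, so
    # relax.build no longer fails with "Did you forget to bind?".
    mod = tvm.tir.transform.DefaultGPUSchedule()(mod)
ex = relax.build(mod, tvm.target.Target("cuda"))
```

This trades performance for robustness: uncovered functions get a naive schedule instead of aborting the build.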


Civitasv commented on May 19, 2024

Thanks for your reply.
I've tried this, but sadly it still doesn't work. Following your advice, I've changed my configuration as follows:

import os

from tvm import meta_schedule as ms


def do_all_tune(mod, target):
    tuning_dir = "gpu3090_workdir"
    tuning_record = "gpu3090/database_tuning_record.json"
    tuning_workload = "gpu3090/database_workload.json"
    cooldown_interval = 0
    trial_cnt = 100
    trial_per = 2

    local_runner = ms.runner.LocalRunner(cooldown_sec=cooldown_interval, timeout_sec=60)
    database = ms.relax_integration.tune_relax(
        mod=mod,
        target=target,
        work_dir=tuning_dir,
        max_trials_global=trial_cnt,
        max_trials_per_task=trial_per,
        runner=local_runner,
        params={},
    )
    if os.path.exists(tuning_record):
        os.remove(tuning_record)
    if os.path.exists(tuning_workload):
        os.remove(tuning_workload)
    database.dump_pruned(
        ms.database.JSONDatabase(
            path_workload=tuning_workload,
            path_tuning_record=tuning_record,
        )
    )

It still reports the same warnings as in #43 (comment).

I wonder if it is relevant to the max_trials_global and max_trials_per_task options.


Civitasv commented on May 19, 2024

I wonder if it is relevant to the max_trials_global and max_trials_per_task options.

Yes, it is relevant. With trial_cnt=10000 and trial_per=2000, only the take operator still lacks a record.
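This matches how Meta-Schedule spends its budget: max_trials_global is shared across all extracted tasks, so a task only receives its full max_trials_per_task allocation when the global budget is large enough. A back-of-envelope check in plain Python (the task count below is illustrative, not measured):

```python
def min_global_trials(num_tasks: int, trials_per_task: int) -> int:
    """Smallest max_trials_global that lets every extracted task
    receive its full max_trials_per_task budget."""
    return num_tasks * trials_per_task


# Illustrative: a Stable Diffusion module extracts dozens of TIR tasks.
num_tasks = 60  # hypothetical task count
print(min_global_trials(num_tasks, 2))     # 120 > the 100 global trials used above
print(min_global_trials(num_tasks, 2000))  # 120000 >> 10000, so a few tasks can still miss
```

With trial_cnt=100 and trial_per=2, most tasks never get tuned at all, which is consistent with the long list of "Tuning record is not found" warnings.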


MasterJH5574 commented on May 19, 2024

Thanks @Civitasv! Glad that it works :-)


