
InstantID : Zero-shot Identity-Preserving Generation in Seconds | RunPod Serverless Worker

This is the source code for a RunPod Serverless worker for InstantID: Zero-shot Identity-Preserving Generation in Seconds.


Model

YamerMIX_v8

Testing

  1. Local Testing
  2. RunPod Testing

Building the Docker image that will be used by the Serverless Worker

There are two options:

  1. Network Volume
  2. Standalone (without Network Volume)

RunPod API Endpoint

You can send requests to your RunPod API Endpoint using the /run or /runsync endpoints.

Requests sent to the /run endpoint are handled asynchronously and are non-blocking. The first response will always have a status of IN_QUEUE. You then need to poll the /status endpoint for further status updates, and eventually the COMPLETED status will be returned if your request is successful.

Requests sent to the /runsync endpoint are handled synchronously and are blocking. If the request is processed by a worker within 90 seconds, the result is returned in the response. If processing takes longer than 90 seconds, you need to handle the initial response and then poll the /status endpoint for status updates until you receive the COMPLETED status, which indicates that your request was successful.
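As an illustration, a synchronous request can be sent from Python with the requests library as in the sketch below. The endpoint ID, API key and input fields are placeholders, and the exact input schema accepted by this worker is covered in the RunPod API Examples.

    import requests

    # Minimal sketch of a synchronous request to the /runsync endpoint.
    # ENDPOINT_ID, API_KEY and the input payload below are placeholders.
    ENDPOINT_ID = "your-endpoint-id"
    API_KEY = "your-runpod-api-key"

    response = requests.post(
        f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"input": {"prompt": "a watercolor portrait"}},  # placeholder input
        timeout=120,
    )

    result = response.json()
    print(result["status"])       # COMPLETED, or IN_PROGRESS if processing exceeded 90 seconds
    if result["status"] == "COMPLETED":
        print(result["output"])   # the worker's output

If the status is not COMPLETED, keep the returned job id and poll the /status endpoint as described above.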

RunPod API Examples

Endpoint Status Codes

IN_QUEUE: The request is in the queue waiting to be picked up by a worker. You can call the /status endpoint to check for status updates.
IN_PROGRESS: The request is currently being processed by a worker. You can call the /status endpoint to check for status updates.
FAILED: The request failed, most likely due to an error encountered during processing.
CANCELLED: The request was cancelled. This usually happens when you call the /cancel endpoint.
TIMED_OUT: The request timed out. This usually happens when your handler throws an exception and does not return a valid response.
COMPLETED: The request completed successfully and the output is available in the output field of the response.
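The sketch below illustrates how these statuses are typically handled with the asynchronous /run endpoint: submit the job, then poll /status until a terminal status is reached. The endpoint ID, API key and input payload are placeholders.

    import time
    import requests

    ENDPOINT_ID = "your-endpoint-id"   # placeholder
    API_KEY = "your-runpod-api-key"    # placeholder
    HEADERS = {"Authorization": f"Bearer {API_KEY}"}
    BASE_URL = f"https://api.runpod.ai/v2/{ENDPOINT_ID}"

    # Submit the job asynchronously; the first response is normally IN_QUEUE.
    job = requests.post(
        f"{BASE_URL}/run",
        headers=HEADERS,
        json={"input": {"prompt": "a portrait"}},  # placeholder input
    ).json()
    job_id = job["id"]

    # Poll /status until the job reaches a terminal state.
    while True:
        status = requests.get(f"{BASE_URL}/status/{job_id}", headers=HEADERS).json()
        if status["status"] in ("IN_QUEUE", "IN_PROGRESS"):
            time.sleep(2)
            continue
        if status["status"] == "COMPLETED":
            print(status["output"])
        else:  # FAILED, CANCELLED or TIMED_OUT
            print("Job did not complete:", status)
        break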

Serverless Handler

The serverless handler (rp_handler.py) is a Python script that handles the API requests to your Endpoint using the runpod Python library. It defines a function handler(event) that takes an API request (event), runs the inference using InstantID with the input, and returns the output in the JSON response.
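The real rp_handler.py is considerably more involved, but a minimal sketch of the shape such a handler takes with the runpod library looks like this; generate_images is a hypothetical stand-in for the actual InstantID inference.

    import runpod

    def generate_images(job_input):
        # Hypothetical stand-in for the InstantID inference performed by rp_handler.py.
        return ["<base64-encoded image>"]

    def handler(event):
        # The payload sent to the endpoint arrives under event["input"].
        job_input = event["input"]
        images = generate_images(job_input)
        # The returned value is placed in the "output" field of the API response.
        return {"images": images}

    runpod.serverless.start({"handler": handler})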

Acknowledgements

Additional Resources

Community and Contributing

Pull requests and issues on GitHub are welcome. Bug fixes and new features are encouraged.

Appreciate my work?

Buy Me A Coffee

runpod-worker-instantid's Issues

Error when loading custom .safetensors file

After downloading a model from civit.ai using wget inside a pod to a network volume and entering its path in the generate.py file, I get the following error:

RunPod request sync-0ccb3bbb-38b5-4f76-85e5-d94d2c525a46-e1 failed:

{
  "delayTime": 101811,
  "error": "StableDiffusionXLPipeline.__init__() got an unexpected keyword argument 'safety_checker'",
  "executionTime": 21511,
  "id": "sync-0ccb3bbb-38b5-4f76-85e5-d94d2c525a46-e1",
  "status": "FAILED"
}

The traceback contained in output.output:

Traceback (most recent call last):
  File "/runpod-volume/runpod-worker-instantid/src/rp_handler.py", line 313, in handler
    images = generate_image(
  File "/runpod-volume/runpod-worker-instantid/src/rp_handler.py", line 282, in generate_image
    PIPELINE = get_instantid_pipeline(model)
  File "/runpod-volume/runpod-worker-instantid/src/rp_handler.py", line 100, in get_instantid_pipeline
    (tokenizers, text_encoders, unet, _, vae) = load_models_xl(
  File "/runpod-volume/runpod-worker-instantid/src/model_util.py", line 357, in load_models_xl
    (tokenizers, text_encoders, unet, vae) = load_checkpoint_model_xl(
  File "/runpod-volume/runpod-worker-instantid/src/model_util.py", line 325, in load_checkpoint_model_xl
    pipe = StableDiffusionXLPipeline.from_single_file(
  File "/workspace/venv/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/workspace/venv/lib/python3.10/site-packages/diffusers/loaders/single_file.py", line 263, in from_single_file
    pipe = download_from_original_stable_diffusion_ckpt(
  File "/workspace/venv/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 1687, in download_from_original_stable_diffusion_ckpt
    pipe = pipeline_class(
TypeError: StableDiffusionXLPipeline.__init__() got an unexpected keyword argument 'safety_checker'

The model was added to generate.py like this:

MODEL_PATH = '/workspace/runpod-worker-instantid/src/yes.safetensors'

Watercolor is always the style

No matter what I set style_name to ("style_name": "cyberpunk", for example), the generated image always comes back in the watercolor style. I am using the original, unmodified version of the repo.

Error(s) in loading state_dict for Resampler: size mismatch for proj_out.weight

When I replace the model with Justin-Choo/XXMix_9realisticSDXL (https://huggingface.co/Justin-Choo/XXMix_9realisticSDXL), I get an error. The error is:

Error(s) in loading state_dict for Resampler:
    size mismatch for proj_out.weight: copying a param with shape torch.Size([2048, 1280]) from checkpoint, the shape in current model is torch.Size([768, 1280]).
    size mismatch for proj_out.bias: copying a param with shape torch.Size([2048]) from checkpoint, the shape in current model is torch.Size([768]).
    size mismatch for norm_out.weight: copying a param with shape torch.Size([2048]) from checkpoint, the shape in current model is torch.Size([768]).
    size mismatch for norm_out.bias: copying a param with shape torch.Size([2048]) from checkpoint, the shape in current model is torch.Size([768]).

What should I pay attention to when replacing the model?
