Repository of resources to learn how to use Portkey for LLMOps
The resources focus on how to use Portkey's features and why they matter, rather than what they are and how they work internally. Interested in the latter? Visit the Portkey documentation.
In the integrations folder, create a file that accepts crowd-sourced inputs listing providers and the models they support, so developers can use the model identifiers directly in their code.
It can start as a simple table.
It should not go into provider-specific details; it should simply hyperlink to the relevant documentation.
Developers, especially those working with models from different providers, do a lot of context-switching; this file is simply a matter of convenience.
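The crowd-sourced file above could eventually back a machine-readable mapping as well. A minimal sketch, assuming a hypothetical provider-to-models dictionary (the structure and the helper `models_for` are illustrative, not part of any Portkey API; model names are examples only):

```python
# Hypothetical crowd-sourced mapping: provider slug -> supported model IDs.
# Entries here are illustrative examples, not an authoritative list.
SUPPORTED_MODELS = {
    "openai": ["gpt-4o", "gpt-4o-mini"],
    "anthropic": ["claude-3-5-sonnet-20240620"],
    "mistral-ai": ["mistral-large-latest"],
}

def models_for(provider: str) -> list:
    """Return the known model IDs for a provider, or an empty list."""
    return SUPPORTED_MODELS.get(provider, [])

print(models_for("openai"))  # ['gpt-4o', 'gpt-4o-mini']
```

A dictionary like this is trivially rendered into the simple markdown table the file would start as, so both forms can stay in sync.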
In the 101 Gateway Configs cookbook, add one more section that teaches how to use the OpenAI SDK to make an API call to any LLM, and highlight how virtual keys are used while the apiKey argument becomes redundant.
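That section could open with a sketch like the one below. It shows the idea under these assumptions: the gateway URL and the `x-portkey-*` header names follow Portkey's documented convention, and `portkey_headers` is a hypothetical helper written here for illustration. The virtual key carries the provider credentials, which is why the SDK's own api_key argument becomes a placeholder:

```python
# Assumed Portkey gateway endpoint (check the Portkey docs for the current URL).
PORTKEY_GATEWAY_URL = "https://api.portkey.ai/v1"

def portkey_headers(portkey_api_key: str, virtual_key: str) -> dict:
    """Build the extra headers the gateway expects.

    Header names follow Portkey's x-portkey-* convention; the virtual key
    stands in for the provider's real API key.
    """
    return {
        "x-portkey-api-key": portkey_api_key,
        "x-portkey-virtual-key": virtual_key,
    }

# With the openai package installed, the client would then be created like:
#
# client = OpenAI(
#     api_key="not-used",  # redundant: auth travels via the virtual key
#     base_url=PORTKEY_GATEWAY_URL,
#     default_headers=portkey_headers("PORTKEY_API_KEY", "VIRTUAL_KEY"),
# )
#
# and client.chat.completions.create(...) is routed through the gateway.

print(sorted(portkey_headers("pk-test", "vk-test")))
```

Swapping the virtual key is then all it takes to point the same OpenAI SDK code at a different provider.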
Hi, I just wanted to check whether Portkey is compatible with an LLM I host privately inside a VM. I can't use the Mistral API from Mistral Cloud or anywhere else.
The repository home page currently shows "Releases" and "Packages" sections, as if it were a software product. Since this repository will only contain learning material, request the maintainer to remove the "Releases", "Versions", and "Packages" sections.
Many threads on the Discord community point to the need for a 101 on creating prompts on Portkey, especially when working with variables. A guide on this would be very helpful.
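To anchor what such a 101 would cover: Portkey prompt templates take `{{variable}}` placeholders (mustache-style). The toy renderer below illustrates only the substitution idea; `render` is a hypothetical helper written for this sketch, not a Portkey API:

```python
import re

def render(template: str, variables: dict) -> str:
    """Replace every {{name}} placeholder with its value.

    Placeholders with no matching variable are left untouched, which is
    handy for spotting typos while experimenting.
    """
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

prompt = render(
    "Summarize the following {{doc_type}} in {{word_limit}} words.",
    {"doc_type": "blog post", "word_limit": 50},
)
print(prompt)  # Summarize the following blog post in 50 words.
```

In the actual guide, the same template would live in Portkey's prompt library and the variables would be passed at call time.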