Comments (4)
It's unclear where I'm supposed to get it. Create it empty? Create it with some defaults?
Running lwe config files will also show you the location of the config file, should you wish to edit it manually.
To run lwe (or even lwe config!) I still need OPENAI_API_KEY, but why? What if I don't have one?
OpenAI is used by default to auto-generate titles for conversations. This can be overridden in the config by setting backend_options.title_generation.provider if needed.
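For example, a sketch of that override in the config file -- the key path follows the option name above, but the nesting and the chat_anthropic provider name are illustrative assumptions, so check your own config for the exact layout:

```yaml
# config.yaml -- sketch only; chat_anthropic is a hypothetical
# example of a non-OpenAI provider for title generation
backend_options:
  title_generation:
    provider: chat_anthropic
```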
The chat_openai provider is currently always loaded by LWE core, which is probably why you'd get an error about a missing OpenAI API key. I think this approach is sensible for most users, and I'm not interested in doing the work to change it. I would consider a well-formed PR to allow the chat_openai provider to be completely optional.
And when I have the googleai plugin installed, I also need the GOOGLE_API_KEY env var to always be set, otherwise lwe does not even start.
I fail to see why this is an issue -- the API key is required to use the plugin. If you need to start LWE before you have a Google API key, then disable the plugin until you have properly exported the key to your environment.
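Concretely, disabling a plugin means removing it from the enabled plugins list in the config. A hedged sketch -- the plugins.enabled layout and the Google provider plugin name are assumptions, so match them against your actual config and installed plugin:

```yaml
# config.yaml -- sketch; key layout and plugin name are assumptions
plugins:
  enabled:
    - echo
    # - provider_chat_googleai  # hypothetical name: re-enable once
    #                           # GOOGLE_API_KEY is exported
```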
So: I suppose there is no sense in demanding API keys before they are actually used in a request.
The keys are required by the underlying libraries that LWE leverages, and these libraries must be loaded via the plugin system on bootstrap.
Though I have managed to achieve that via the OPENAI_BASE_URL env var, that is not enough, as I would like to have several different profiles, each with its own base_url and api_key.
The way to do this is to use presets: https://llm-workflow-engine.readthedocs.io/en/latest/presets.html
Presets will allow you to set up and save different configurations, and easily switch between them, or use them in templates/workflows.
There was already support for adding openai_api_key and openai_organization to presets, and I just pushed the feature to be able to add openai_api_base to a preset for OpenAI providers as well.
Keep in mind this new feature is completely untested, but as the underlying library supports it, I do think it should work.
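An untested sketch of what such a preset file might look like, assuming the metadata/model_customizations layout from the presets documentation linked above; the file name, endpoint URL, and key value are placeholders:

```yaml
# presets/local-endpoint.yaml -- hypothetical example
metadata:
  name: local-endpoint
  provider: chat_openai
model_customizations:
  openai_api_base: http://localhost:8000/v1
  openai_api_key: sk-placeholder
```

Each profile (base_url/api_key pair) then becomes its own preset, switchable at runtime.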
from llm-workflow-engine.
> Running lwe config files will also show you the location of the config file, should you wish to edit it manually.
Right, and there is a complete directory structure, but it's empty. Only directories, without files.
> I fail to see why this is an issue -- the API key is required to use the plugin. If you need to start LWE before you have a Google API key, then disable the plugin until you have properly exported the key to your environment.
Well, this is just not user-friendly. Sure, some persistent user (like me) can overcome all such minor obstacles, but other users just won't spend time on this (much less report it).
For my case: I want to have all plugins installed and available to use.
But my environment is not persistent; I set API keys only when I use tools that require them.
I will try to use presets instead, but I still see this as an issue. Maybe a minor one.
> Right, and there is a complete directory structure, but it's empty. Only directories, without files.
LWE doesn't need a config file to run; it will run with defaults if one does not exist, IIRC. Once you create a config file, either via /config edit or manually, LWE will use that to override the defaults.
> Well, this is just not user-friendly. Sure, some persistent user (like me) can overcome all such minor obstacles, but other users just won't spend time on this (much less report it).
You are the first user to ever report this issue. All the installation instructions tell you to set the API key first.
> But my environment is not persistent; I set API keys only when I use tools that require them.
This is clearly not typical use, but more in line with a power user. And a power user can easily adjust their workflow to automate around an issue like this, such as setting dummy API keys by default, then overriding them when needed.
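A minimal sketch of that dummy-key workaround in a shell profile. The `${VAR:-default}` expansion only substitutes the placeholder when the variable is unset or empty, so a real key exported earlier in the environment is never clobbered; the placeholder value itself is arbitrary:

```shell
# ~/.bashrc (sketch): export dummy keys so provider plugins can load,
# but keep any real key that is already set in the environment.
export OPENAI_API_KEY="${OPENAI_API_KEY:-dummy-key}"
export GOOGLE_API_KEY="${GOOGLE_API_KEY:-dummy-key}"
```

With this in place LWE starts even in a fresh shell; you export the real keys only when you actually need to make requests.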
And again, to clarify, it is NOT LWE that is enforcing the requirement of API keys -- it's the underlying libraries. Sure, it's theoretically possible to re-architect things to overcome that, but it would most likely make the code much more complicated, bug-prone, and harder to reason about. IMO the juice isn't worth the squeeze, especially given that I provide this code for free, in my spare time ;)
I think I've addressed everything in this issue. As I said, I'll consider a well-formed PR to create better abstractions for loading components if one is submitted.