llama-lab's Issues
Expose as APIs on local/cloud using langchain-serve?
Integrate with langchain-serve
- Exposes APIs from function definitions locally as well as on the cloud.
- Very few lines of code changes, ease of development remains the same as local.
- Supports both REST & WebSocket endpoints
- Serverless/autoscaling endpoints with automatic TLS certs on cloud
- Real-time streaming, human-in-the-loop support
- Local to cloud-ready in just one command -
lc-serve deploy local/jcloud <app>
Disclaimer: I'm the primary author of langchain-serve.
Auto/BabyAGI Programmatic access
Are there any plans to enable "calling" AutoGPT/BabyAGI through an endpoint or a function, so that we can receive results in tidbits that can be used programmatically, like doing a .query() call?
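A minimal sketch of the shape such an interface could take. Everything here is hypothetical: neither an AgentRunner class nor a run_iter() method exists in llama-lab; a generator is just one natural way to hand back results incrementally instead of blocking until the whole objective finishes.

```python
# Hypothetical sketch only: AgentRunner and run_iter() are invented names,
# not part of llama-lab. Illustrates incremental, programmatic results.
class AgentRunner:
    def __init__(self, objective):
        self.objective = objective
        # A real agent would plan tasks with an LLM; hardcoded here.
        self.tasks = [f"plan: {objective}", f"execute: {objective}"]

    def run_iter(self):
        """Yield one completed task at a time, so the caller can consume
        results as they arrive, similar to a .query()-style call."""
        for task in self.tasks:
            yield {"task": task, "result": f"done: {task}"}

# Usage: consume results step by step instead of waiting for the end.
steps = list(AgentRunner("summarize docs").run_iter())
```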
JSONDecodeError: Extra data: line 1 column 4 (char 3) on import statement
I don't understand why this app requires an OpenAI API key.
As far as I understand, this project is an Auto-GPT copycat using the open-source model, Llama.
Then why do we need an OpenAI API key? And why should we use this one instead of Auto-GPT?
I'm confused.
llama
Error converting string to JSON object when using Spanish options
Problem description
I'm using the following Spanish options for the script:
- -it: "Crea los pasos para ingresar" ("Create the steps to apply")
- -o: "Subsidio DS49" ("DS49 subsidy")
To debug the issue, I added these print lines before initial_task_list = json.loads(initial_task_list_str):
print("initial_task_list_str:", initial_task_list_str)
print("initial_task_list_str tipo:", type(initial_task_list_str))
print("initial_task_list_str longitud:", len(initial_task_list_str))
The output was:
[1. Revisar los requisitos para postular al subsidio del DS49,
2. Obtener los documentos necesarios para postular,
3. Completar la solicitud de subsidio del DS49,
4. Enviar la solicitud de subsidio del DS49,
5. Esperar la respuesta de la solicitud de subsidio del DS49.]
initial_task_list_str tipo: <class 'str'>
initial_task_list_str longitud: 275
It appears there is an issue with the JSON formatting in initial_task_list_str. The simple_execution_agent.execute_task function produces an output with incorrect formatting that causes the following error when attempting to convert it into a JSON object:
json.decoder.JSONDecodeError: Expecting ',' delimiter: line 2 column 3 (char 3)
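The output printed above is a bracketed, numbered list without quotes around the items, which is not valid JSON, hence the JSONDecodeError. One possible workaround (a sketch, not the repo's code) is to try strict JSON first and fall back to extracting the numbered items with a regex:

```python
import json
import re

def parse_task_list(raw: str) -> list[str]:
    """Try strict JSON first; fall back to pulling '1. ...' items out of
    the bracketed, unquoted list the model actually produced."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Capture each "N. text" item up to the next item, ']' or end.
        items = re.findall(r"\d+\.\s*(.+?)(?=,\s*\d+\.|\]|$)", raw, re.S)
        return [item.strip().rstrip(".,") for item in items]

# Shortened version of the model output shown in this issue:
raw = """[1. Revisar los requisitos para postular al subsidio del DS49,
2. Obtener los documentos necesarios para postular,
3. Completar la solicitud de subsidio del DS49.]"""
tasks = parse_task_list(raw)
```

This keeps valid JSON working unchanged while degrading gracefully when the model emits a plain numbered list, which seems more common with non-English prompts.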
__init__() takes exactly 1 positional argument (2 given) in llama_agi/examples/auto_runner_example.py
Dear Community,
Can someone please suggest what's the error and how to solve it?
Would be awesome to connect it with LLama_Hub context loaders.
Traceback (most recent call last):
File "/Users/mac-pro/llama-lab/llama_agi/examples/auto_runner_example.py", line 45, in <module>
task_manager = LlamaTaskManager(
^^^^^^^^^^^^^^^^^
File "/Users/mac-pro/llama-lab/llama_agi/llama_agi/task_manager/LlamaTaskManager.py", line 40, in __init__
super().__init__(
File "/Users/mac-pro/llama-lab/llama_agi/llama_agi/task_manager/base.py", line 41, in __init__
self.current_tasks = [Document(x) for x in tasks]
^^^^^^^^^^^
File "pydantic/main.py", line 332, in pydantic.main.BaseModel.__init__
def __init__(__pydantic_self__, **data: Any) -> None:
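The traceback bottoms out in pydantic's BaseModel.__init__, which accepts keyword arguments only, so Document(x) (positional) raises the TypeError. A minimal stdlib stand-in below mimics that behavior; the assumption is that the installed llama-index Document is a pydantic model whose text field must be passed by keyword, i.e. Document(text=x):

```python
# Minimal stand-in mimicking pydantic's keyword-only BaseModel.__init__;
# the real llama-index Document is assumed to behave the same way.
class Document:
    def __init__(self, **data):
        self.__dict__.update(data)

tasks = ["Review requirements", "Gather documents"]

try:
    docs = [Document(t) for t in tasks]       # positional: what base.py does
except TypeError as err:
    failed = str(err)                         # same shape as the reported error

docs = [Document(text=t) for t in tasks]      # keyword form works
```

If that assumption holds, changing base.py line 41 to use Document(text=x) (or pinning an older llama-index where positional worked) should clear the error.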
Fields missing
I exported my keys but am getting this:
Reasoning: I think that using a web scraper to extract the relevant information from the documentation and storing it in a structured format would be a good approach because it would allow you to easily search and query the documentation using your custom openAI application. Additionally, using an API client library would also be a good approach because it would provide a programmatic interface to the SP-API documentation, which would allow you to easily integrate the documentation into your custom openAI application.
Plan: My current plan of action is to provide you with more information on how to use a web scraper to extract the relevant information from the SP-API documentation and how to use an API client library to provide a programmatic interface to the documentation.
Command:
{
"action": "search",
"args": {
"search_terms": "ways to make custom openAI application aware of SP-API documentation"
}
}. Got: 4 validation errors for Response
remember
  field required (type=value_error.missing)
thoughts
  field required (type=value_error.missing)
reasoning
  field required (type=value_error.missing)
command
  field required (type=value_error.missing)
(base) maximiliandoelle@Maximilians-MBP auto_llama %
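The four validation errors line up with the completion shown above: the model returned only "action" and "args" keys, while the parser's Response model (judging from the error output) requires remember, thoughts, reasoning, and command. A stdlib sketch of that required-field check, with the field names taken from the errors rather than from the repo's actual model definition:

```python
import json

# Field names inferred from the validation errors above; the real repo
# presumably defines these on a pydantic Response model.
REQUIRED = ("remember", "thoughts", "reasoning", "command")

def validate_response(payload: dict) -> list[str]:
    """Return one 'field required' error per missing key, mirroring the
    pydantic output pasted in this issue."""
    return [f"{field}: field required" for field in REQUIRED
            if field not in payload]

completion = json.loads(
    '{"action": "search", "args": {"search_terms": "SP-API docs"}}'
)
errors = validate_response(completion)
```

So the exported keys are likely fine; the model simply answered in the wrong schema, which usually means the prompt's format instructions were ignored and a retry or a stricter format reminder is needed.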
Exceptions
I want you to write a summary of how I can store the SP-API documentation using LlamaIndex
Thinking...
Traceback (most recent call last):
File "/Users/maximiliandoelle/miniconda3/lib/python3.10/site-packages/langchain/output_parsers/pydantic.py", line 25, in parse
json_object = json.loads(json_str)
File "/Users/maximiliandoelle/miniconda3/lib/python3.10/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
File "/Users/maximiliandoelle/miniconda3/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/Users/maximiliandoelle/miniconda3/lib/python3.10/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/maximiliandoelle/miniconda3/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/Users/maximiliandoelle/miniconda3/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/Users/maximiliandoelle/Projects/llama-lab/auto_llama/auto_llama/main.py", line 52, in <module>
main()
File "/Users/maximiliandoelle/Projects/llama-lab/auto_llama/auto_llama/main.py", line 31, in main
response = agent.get_response()
File "/Users/maximiliandoelle/Projects/llama-lab/auto_llama/auto_llama/agent.py", line 54, in get_response
response_obj = parser.parse(output.content)
File "/Users/maximiliandoelle/miniconda3/lib/python3.10/site-packages/langchain/output_parsers/pydantic.py", line 31, in parse
raise OutputParserException(msg)
langchain.schema.OutputParserException: Failed to parse Response from completion To store the SP-API documentation, you can use the following steps:
- Search for the SP-API documentation on the web using the "search" command with the search terms "SP-API documentation".
- Download the contents of the web page using the "download" command with the URL of the documentation page and a name for the downloaded document.
- Query the downloaded document to extract the relevant information using the "query" command with the name of the downloaded document and a query string that specifies the information you want to extract.
- Write the extracted information to a file using the "write" command with a file name and the extracted data.
By following these steps, you can store the SP-API documentation in a file for future refe
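Here the completion is plain prose, so json.loads fails at char 0 before the pydantic parser even gets a schema to check. One workaround (a sketch, not the repo's code) is a pre-parse step that extracts the first {...} block from a chatty completion and treats a completion with no JSON at all as a signal to re-prompt rather than crash:

```python
import json
import re

def extract_json(completion: str):
    """Pull the first {...} block out of a chatty completion; return None
    when the model produced no JSON at all (the case in this traceback)."""
    match = re.search(r"\{.*\}", completion, re.S)
    if match is None:
        return None
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        return None

prose = "To store the SP-API documentation, you can use the following steps: ..."
chatty = 'Sure! Here you go: {"command": "search", "args": {"q": "SP-API"}}'

assert extract_json(prose) is None   # no JSON: caller should retry/re-prompt
obj = extract_json(chatty)           # JSON wrapped in chatter still parses
```

The greedy {.*} match is deliberately crude; it handles the common "JSON surrounded by chatter" case but not multiple separate objects in one completion.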
option to use local models vs. openai key?
I want to be able to pass in a "local-mode" env var or similar that uses a local model you put in a particular folder, etc.
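One possible shape for this, as a hedged sketch: the variable name LLAMA_LOCAL_MODEL_PATH is invented for illustration, and OPENAI_API_KEY is the key the project currently requires. Nothing here is existing llama-lab code.

```python
import os

def pick_backend(env=os.environ):
    """Hypothetical switch: LLAMA_LOCAL_MODEL_PATH (invented name) selects
    a local model folder; otherwise fall back to the OpenAI key the
    project currently requires; fail loudly when neither is set."""
    local = env.get("LLAMA_LOCAL_MODEL_PATH")
    if local:
        return ("local", local)
    key = env.get("OPENAI_API_KEY")
    if key:
        return ("openai", key)
    raise RuntimeError("set LLAMA_LOCAL_MODEL_PATH or OPENAI_API_KEY")
```

The agent setup code would then instantiate either a local Llama loader or the OpenAI client from the returned tuple, keeping the rest of the pipeline unchanged.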
Python Error auto llama
I followed the steps to install the requirements for auto_llama. When I tried to run it:
python -m auto_llama
I am always getting the following python error:
Traceback (most recent call last):
File "/Users/david/miniforge3/envs/llmstack/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/Users/david/miniforge3/envs/llmstack/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/Users/david/Project/llama-lab/auto_llama/auto_llama/main.py", line 52, in <module>
main()
File "/Users/david/Project/llama-lab/auto_llama/auto_llama/main.py", line 42, in main
action_results = run_command(user_query, action, args, openaichat)
File "/Users/david/Project/llama-lab/auto_llama/auto_llama/actions.py", line 26, in run_command
response = analyze_search_results(
File "/Users/david/Project/llama-lab/auto_llama/auto_llama/actions.py", line 96, in analyze_search_results
doc = Document(json.dumps(results))
File "pydantic/main.py", line 332, in pydantic.main.BaseModel.__init__
TypeError: __init__() takes exactly 1 positional argument (2 given)
I looked it up on Stack Overflow, where someone saw the same problem with Pydantic:
Pydantic error
I tried asking different questions and got the same error. Not sure what is wrong.