
streamlit-agent's People

Contributors

amjadraza, baskaryan, bruno-uy, lukasmasuch, muhtasham, sfc-gh-jcarroll, shashankdeshpande, tconkling


streamlit-agent's Issues

Hello, I modified pyproject.toml as you suggested and reinstalled the corresponding package versions, but I still get an error that looks like a version-compatibility issue.

[tool.poetry]
name = "streamlit-agent"
version = "0.1.0"
description = ""
authors = ["Tim Conkling [email protected]", "Joshua Carroll [email protected]"]
license = "Apache 2.0"
readme = "README.md"
packages = [{include = "streamlit_agent"}]

[tool.poetry.dependencies]
python = ">=3.10,<4.0"
langchain = {version = ">=0.0.252", extras = ["docarray"]}
openai = "^0.27.8"
duckduckgo-search = "^3.8.3"
pypdf = "^3.12.2"
sentence-transformers = "^2.2.2"
torch = ">=2.0.0, !=2.0.1"
tabulate = "^0.9.0"
streamlit-feedback = "^0.0.9"
langchain-experimental = "^0.0.10"
streamlit = ">=1.26"

[tool.poetry.group.dev.dependencies]
black = "^23.3.0"
mypy = "^1.4.1"
pre-commit = "^3.3.3"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

PS C:\Users\39627\Downloads\streamlit-agent-main> & c:/Users/39627/Downloads/streamlit-agent-main/langchain_env/Scripts/python.exe c:/Users/39627/Downloads/streamlit-agent-main/streamlit_agent/chat_with_documents.py
Traceback (most recent call last):
  File "c:\Users\39627\Downloads\streamlit-agent-main\streamlit_agent\chat_with_documents.py", line 4, in <module>
    from langchain.chat_models import ChatOpenAI
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\__init__.py", line 6, in <module>
    from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\agents\__init__.py", line 31, in <module>
    from langchain.agents.agent import (
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\agents\agent.py", line 14, in <module>
    from langchain.agents.agent_iterator import AgentExecutorIterator
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\agents\agent_iterator.py", line 21, in <module>
    from langchain.callbacks.manager import (
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\callbacks\__init__.py", line 10, in <module>
    from langchain.callbacks.aim_callback import AimCallbackHandler
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\callbacks\aim_callback.py", line 5, in <module>
    from langchain.schema import AgentAction, AgentFinish, LLMResult
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\schema\__init__.py", line 3, in <module>
    from langchain.schema.cache import BaseCache
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\schema\cache.py", line 6, in <module>
    from langchain.schema.output import Generation
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\schema\output.py", line 7, in <module>
    from langchain.load.serializable import Serializable
  File "C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\load\serializable.py", line 4, in <module>
    from langchain.pydantic_v1 import BaseModel, PrivateAttr
ImportError: cannot import name 'PrivateAttr' from 'langchain.pydantic_v1' (C:\Users\39627\Downloads\streamlit-agent-main\langchain_env\Lib\site-packages\langchain\pydantic_v1\__init__.py)

Not sure how to initialize `st.session_state['messages']` in my chat code


Hitting this error:

Traceback (most recent call last):
  File "/Users/josoroma/Library/Caches/pypoetry/virtualenvs/etl-Fd0mW_QZ-py3.11/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
  File "/Users/josoroma/sites/etl-pipeline-for-langchain-docs/chat.py", line 111, in <module>
    ChatApp().main()
  File "/Users/josoroma/sites/etl-pipeline-for-langchain-docs/chat.py", line 83, in main
    for msg in st.session_state['messages']:
               ~~~~~~~~~~~~~~~~^^^^^^^^^^^^
  File "/Users/josoroma/Library/Caches/pypoetry/virtualenvs/etl-Fd0mW_QZ-py3.11/lib/python3.11/site-packages/streamlit/runtime/state/session_state_proxy.py", line 90, in __getitem__
    return get_session_state()[key]
           ~~~~~~~~~~~~~~~~~~~^^^^^
  File "/Users/josoroma/Library/Caches/pypoetry/virtualenvs/etl-Fd0mW_QZ-py3.11/lib/python3.11/site-packages/streamlit/runtime/state/safe_session_state.py", line 111, in __getitem__
    raise KeyError(key)
KeyError: 'messages'

My code:

import os
import weaviate
import openai
from pydantic import BaseModel
from langchain.callbacks.base import BaseCallbackHandler
# from langchain.chat_models import ChatOpenAI
from langchain.schema import ChatMessage
import streamlit as st

class Document(BaseModel):
    content: str

class QueryResult(BaseModel):
    document: Document

class StreamHandler(BaseCallbackHandler):
    def __init__(self, container, initial_text=""):
        self.container = container
        self.text = initial_text

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.text += token
        self.container.markdown(self.text)

class ChatApp:
    def __init__(self):
        self.client = None
        self.get_env_variables()
        self.client = self.get_client()

    def get_env_variable(self, var_name):
        var_value = os.getenv(var_name)
        return var_value

    def get_env_variables(self):
        with st.sidebar:
            self.OPENAI_API_KEY = self.get_env_variable("OPENAI_API_KEY") or st.text_input("OpenAI API Key", type="password")
            self.WEAVIATE_HOST = self.get_env_variable("WEAVIATE_HOST") or st.text_input("Weaviate Host")
            self.WEAVIATE_AUTH_API_KEY = self.get_env_variable("WEAVIATE_AUTH_API_KEY") or st.text_input("Bearer Token", type="password")

        if not self.OPENAI_API_KEY or not self.WEAVIATE_HOST or not self.WEAVIATE_AUTH_API_KEY:
            st.info("Please add your OpenAI API Key, Weaviate Host, and Bearer Token to continue.")
            st.stop()

        openai.api_key = self.OPENAI_API_KEY

    def get_client(self):
        try:
            client = weaviate.Client(
                url=self.WEAVIATE_HOST,
                auth_client_secret=weaviate.AuthApiKey(self.WEAVIATE_AUTH_API_KEY),
                additional_headers={"X-OpenAI-Api-Key": self.OPENAI_API_KEY},
            )
        except Exception as e:
            st.error(f"Error occurred while creating the Weaviate client: {str(e)}")
            st.stop()

        return client

    def client_query(self, question: str):
        generatePrompt = "Respond to the human as helpfully and accurately as possible: {text}"
        nearText = {"concepts": [f"{question}"]}

        try:
            response = (
                self.client.query
                .get("Document", ["content"])
                .with_generate(single_prompt=generatePrompt)
                .with_near_text(nearText)
                .with_limit(1)
                .do()
            )
        except Exception as e:
            st.error(f"Error occurred while querying the Weaviate client: {str(e)}")
            st.stop()

        return response

    def main(self):
        if 'messages' not in st.session_state:
            st.session_state['messages'] = [ChatMessage(role="assistant", content="How can I help you?")]

        for msg in st.session_state['messages']:
            st.chat_message(msg.role).write(msg.content)

        if prompt := st.text_input("Your input:"):
            st.session_state['messages'].append(ChatMessage(role="user", content=prompt))
            st.chat_message("user").write(prompt)

            response = self.client_query(prompt)
            if response:
                try:
                    for document in response['data']['Get']['Document']:
                        try:
                            generativeOpenAI = document['_additional']['generate']["singleResult"]
                            content = document['content']
                        except KeyError as ke:
                            st.markdown(f"Error: Expected keys not found in the document. {ke}")
                            continue

                        if generativeOpenAI:
                            st.session_state['messages'].append(ChatMessage(role="assistant", content=generativeOpenAI))
                            st.chat_message("assistant").write(generativeOpenAI)
                        if content:
                            st.session_state['messages'].append(ChatMessage(role="assistant", content=content))
                            st.chat_message("assistant").write(content)
                except KeyError as ke:
                    st.markdown(f"Error: Expected keys not found in the response. {ke}")

if __name__ == "__main__":
    ChatApp().main()
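For the KeyError above: the guard at the top of main() is the right pattern, but it must run before any code path reads st.session_state['messages']. Since st.session_state behaves like a mapping, the guard can be factored into a small helper; a minimal sketch on a plain dict (the helper name and greeting are illustrative):

```python
def ensure_messages(state, greeting="How can I help you?"):
    # Create the key only if it does not exist yet, so Streamlit
    # reruns do not wipe the conversation history; always read the
    # list through this helper instead of indexing state directly.
    if "messages" not in state:
        state["messages"] = [{"role": "assistant", "content": greeting}]
    return state["messages"]
```

In the app, calling `ensure_messages(st.session_state)` at the top of `main()` and iterating over its return value guarantees the key exists before it is read.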

Error message in console when running chat_with_documents.py

When running chat_with_documents.py, I get the following error message:

Error in PrintRetrievalHandler.on_retriever_start callback: PrintRetrievalHandler.on_retriever_start() takes 2 positional arguments but 3 were given

Apparently the app still works fine, and the offending PrintRetrievalHandler.on_retriever_start() callback can simply be commented out.
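The usual cause of "takes 2 positional arguments but 3 were given" is a handler whose on_retriever_start was defined with only (self, query); newer LangChain versions call it with two positional arguments, the serialized retriever and the query string. A sketch of a signature that accepts both (the return value is added here only to make the example checkable):

```python
class RetrieverStartHandler:
    # Newer LangChain versions pass both the serialized retriever
    # config and the query string positionally, so accept both and
    # soak up any further keyword arguments with **kwargs.
    def on_retriever_start(self, serialized: dict, query: str, **kwargs):
        return f"Question: {query}"
```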

support rate limit

OpenAI accounts have rate limits, and the application outputs an error when a limit is hit. Is there any way to handle this?
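One common workaround is to wrap the OpenAI call in an exponential-backoff retry. The helper below is a hypothetical sketch (in real use, narrow the except clause to the client's rate-limit exception rather than catching everything):

```python
import random
import time

def with_retries(fn, max_attempts=5, base_delay=1.0):
    # Retry fn() with exponentially growing, jittered delays;
    # re-raise on the final attempt so real failures still surface.
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, base_delay))
```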

Is there a way to prompt the user with the "human" tool?

I love this repo, and it is very useful. However, I'm wondering whether there's an implementation that actually prompts the user for the "human" tool. As of now everything happens in the background, and when the human is "prompted" nothing happens.

How to hold all the status containers generated during retrieving?

Thanks for your amazing work.

class PrintRetrievalHandler(BaseCallbackHandler):
    def __init__(self, container):
        self.status = container.status("Context Retrieval")

    def on_retriever_start(self, serialized: dict, query: str, **kwargs):
        self.status.write(f"**Question:** {query}")
        self.status.update(label=f"**Context Retrieval:** {query}")

    def on_retriever_end(self, documents, **kwargs):
        for idx, doc in enumerate(documents):
            source = os.path.basename(doc.metadata["source"])
            self.status.write(f"**Document {idx} from {source}**")
            self.status.markdown(doc.page_content)
        self.status.update(state="complete")

Here a status container is initialized, but the previous one disappears when a new retrieval starts. I know that in Streamlit, session_state is used to store objects, but I am new to Streamlit and couldn't get an implementation working.

Thanks for your attention. I appreciate it a lot.
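One way to keep older retrievals visible is not to rely on the containers themselves surviving a rerun, but to append each retrieval's data to session_state and redraw a fresh st.status block per entry on every rerun. A sketch of the bookkeeping half, written against a plain mapping so it also works with st.session_state (the key and helper name are illustrative):

```python
def record_retrieval(state, query, documents):
    # Append one retrieval's results to a persistent history list;
    # the app can then replay every entry into a new status
    # container on each rerun instead of mutating old ones.
    history = state.setdefault("retrievals", [])
    history.append({"query": query, "documents": list(documents)})
    return history
```

On the rendering side, the app would loop over st.session_state["retrievals"] and open one status container per entry before handling the new question.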

trouble working with simple requests

I was trying the app just for demo purposes, to see how it goes. I used my own OpenAI API key and typed "describe the first table" in the text box, and got the error shown in the attached screenshot. Not sure what the issue is.

Use of Extra Libraries

Do we really need these dependencies? I don't see any explicit use of them; they just add extra space and time when creating an environment.

sentence-transformers = "^2.2.2"
torch = ">=2.0.0, !=2.0.1"

mrkl and search-and-chat examples not working

I just tried the two examples mentioned above with my own freshly created OpenAI API key and got error messages. One was:

duckduckgo_search.exceptions.DuckDuckGoSearchException: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you're on Streamlit Cloud, click on 'Manage app' in the lower right of your app). [...]

Is there a way to prevent thoughts from disappearing?

I.e., the intermediate thoughts appear in the most recent answer from the agent, but when more messages are added, the intermediate thoughts disappear from the older answers (leaving just the final answer). I would like to be able to leave the intermediate thoughts in place, so I can go back and inspect them.
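A possible approach, in line with how the examples already use session_state: save the intermediate steps alongside each final answer when it is produced, then re-render both on every rerun instead of only the final text. A minimal sketch of the storage side on a plain mapping (names are hypothetical):

```python
def append_answer(state, answer, steps):
    # Keep the intermediate thoughts attached to the message that
    # produced them, so redrawing the chat history can show both.
    state.setdefault("messages", []).append(
        {"role": "assistant", "content": answer, "steps": list(steps)}
    )
    return state["messages"]
```

When replaying the history, each stored message's "steps" list can be rendered inside its own expander or status block before the final answer.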
