
cookbook's Introduction

Welcome to Chainlit by Literal AI πŸ‘‹

Build production-ready Conversational AI applications in minutes, not weeks ⚑️

Chainlit is an open-source async Python framework that lets developers build scalable Conversational AI and agentic applications.

  • βœ… ChatGPT-like application
  • βœ… Embedded Chatbot & Software Copilot
  • βœ… Slack & Discord
  • βœ… Custom frontend (build your own agentic experience)
  • βœ… API Endpoint

Full documentation is available here. You can ask Chainlit-related questions to Chainlit Help, an app built with Chainlit!

Note

Contact us here for Enterprise Support. Check out Literal AI, our product for monitoring and evaluating LLM applications! It works with any Python or TypeScript application and integrates seamlessly with Chainlit once you add a LITERAL_API_KEY to your project.

Installation

Open a terminal and run:

$ pip install chainlit
$ chainlit hello

If this opens the hello app in your browser, you're all set!

πŸš€ Quickstart

🐍 Pure Python

Create a new file demo.py with the following code:

import chainlit as cl


@cl.step(type="tool")
async def tool():
    # Fake tool
    await cl.sleep(2)
    return "Response from the tool!"


@cl.on_message  # this function will be called every time a user inputs a message in the UI
async def main(message: cl.Message):
    """
    This function is called every time a user inputs a message in the UI.
    It sends back an intermediate response from the tool, followed by the final answer.

    Args:
        message: The user's message.

    Returns:
        None.
    """

    final_answer = await cl.Message(content="").send()

    # Call the tool
    final_answer.content = await tool()

    await final_answer.update()

Now run it!

$ chainlit run demo.py -w

πŸŽ‰ Key Features and Integrations

Full documentation is available here.

Chainlit is compatible with all Python programs and libraries. On top of that, it ships with integrations for several popular LLM libraries and frameworks; see the documentation for the full list.

πŸ“š More Examples - Cookbook

You can find various examples of Chainlit apps here that leverage tools and services such as OpenAI, Anthropic, LangChain, LlamaIndex, ChromaDB, Pinecone and more.

Tell us what you would like to see added in Chainlit using the GitHub issues or on Discord.

πŸ’ Contributing

As an open-source initiative in a rapidly evolving domain, we welcome contributions, be it through the addition of new features or the improvement of documentation.

For detailed information on how to contribute, see here.

πŸ“ƒ License

Chainlit is open-source and licensed under the Apache 2.0 license.

cookbook's People

Contributors

antoineross, bbmilan, constantinidan, felipearosr, fenglui, kiibo382, mathiasspanhove, nobilix, owe1n, philipkiely-baseten, rachittshah, ronfromhp, skt7, tpatel, willydouhard, xke

cookbook's Issues

chrome-qa-chat - The model `gpt-4` does not exist or you do not have access to it.

2024-02-26 20:59:20 - HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 404 Not Found"
2024-02-26 20:59:20 - Error code: 404 - {'error': {'message': 'The model `gpt-4` does not exist or you do not have access to it. Learn more: https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Traceback (most recent call last):
  File "/home/user/.local/lib/python3.11/site-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)

Is anyone facing a similar issue with the chrome-qa-chat demo?

I have access to GPT-4 and I am unable to track down the issue.
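
A quick way to check (a sketch, not from the cookbook) is to list the models the API key can actually see; if "gpt-4" is missing from the list, the key or project does not have access yet:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
available = [m.id for m in client.models.list()]
print("gpt-4" in available)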

Copilot doesn't work: 'TypeError: window.mountChainlitWidget'

I am trying to build a Chainlit app with Copilot, but the copilot widget never loads.

I only installed Chainlit and ran the file, following the tutorial.

Uncaught TypeError: Failed to construct 'URL': Invalid URL
at index.js:275:3885
at index.js:4208:91586
index.html:12
Uncaught TypeError: window.mountChainlitWidget is not a function
at index.html:12:12


Code:

import chainlit as cl


@cl.on_message
async def on_message(msg: cl.Message):
    if cl.context.session.client_type == "copilot":
        fn = cl.CopilotFunction(name="test", args={"msg": msg.content})
        res = await fn.acall()
        await cl.Message(content=res).send()
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Document</title>
</head>
<body>
  <script src="http://localhost:8000/copilot/index.js"></script>
  <script>
    window.mountChainlitWidget({
      chainlitServer: "http://localhost:8000",
    });
  </script>
</body>
</html>

How to get the "source_documents" with Chainlit 0.7?

With Chainlit 0.6 I get the source_documents like this:

@cl.on_message
async def main(message: cl.Message):
    chain = cl.user_session.get("chain")  # type: ConversationalRetrievalChain
    cb = cl.AsyncLangchainCallbackHandler()

    res = await chain.acall(message.content, callbacks=[cb])
    answer = res["answer"]
    source_documents = res["source_documents"]  # type: List[Document]
    ...

Now I use Chainlit 0.7, with this code to get the answer:

async for chunk in runnable.astream(
    message.content,
    config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
):
    await res.stream_token(chunk)

The result res is a Message; how do I get the source_documents?
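
One possible approach (a sketch of LangChain's documented "returning sources" pattern; format_docs and the variable names are illustrative) is to run the retriever in parallel with the answer chain, so the output is a dict holding both the answer and the retrieved documents:

from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnableParallel, RunnablePassthrough


def format_docs(docs):
    return "\n\n".join(d.page_content for d in docs)


# Format the documents for the prompt, but keep the raw ones in the output.
answer_chain = (
    RunnablePassthrough.assign(context=lambda x: format_docs(x["context"]))
    | prompt
    | model
    | StrOutputParser()
)
rag_chain = RunnableParallel(
    {"context": retriever, "question": RunnablePassthrough()}
).assign(answer=answer_chain)

result = await rag_chain.ainvoke(message.content)
answer = result["answer"]             # the model's text
source_documents = result["context"]  # the retrieved Documents

Streaming should still work by iterating rag_chain.astream(...) and reading tokens off the "answer" key of each chunk.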

Fix for custom front end , playground.tsx

When the backend example provided in the cookbook is used, the chat UI works well. However, as soon as other libraries that use their own callbacks are added to the backend, a blank, empty text box appears every time the LLM responds. E.g.:

[screenshot of the bug]

This is fixed by adding a small condition to the render-message function within playground.tsx:

[screenshot of the fix]

Chainlit Autogen example with open-source LLM Qwen: openai.PermissionDenied error

I have started the Qwen API locally using openai_api.py: https://github.com/QwenLM/Qwen/blob/main/openai_api.py

I tried both examples in:
https://github.com/Chainlit/cookbook/tree/main/pyautogen

The following works fine using OpenAIWrapper, but initiate_chat with agents as in the examples does not work:

config_list = [
    {
        "model": "Qwen",
        "base_url": "http://localhost:8787/v1",
        "api_key": "NULL",
    }
]
client = OpenAIWrapper(config_list=config_list)
response = client.create(messages=[{"role": "user", "content": "2+9="}])
print(client.extract_text_or_completion_object(response))

Error:
openai.PermissionDeniedError:
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<body>
</body>
</html>

2024-01-21 10:35:13 - Loaded .env file
2024-01-21 10:35:15 - Your app is available at http://localhost:8081
user_proxy (to assistant):

hello.


2024-01-21 10:35:18 -

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "/workspace/chainlit-autogen/app5.py", line 108, in on_chat_start
    await cl.make_async(user_proxy.initiate_chat)(
  File "/usr/local/lib/python3.10/dist-packages/asyncer/_main.py", line 358, in wrapper
    return await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/dist-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/usr/lib/python3.10/asyncio/futures.py", line 285, in __await__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.10/asyncio/tasks.py", line 304, in __wakeup
    future.result()
  File "/usr/lib/python3.10/asyncio/futures.py", line 201, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 672, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/workspace/chainlit-autogen/app5.py", line 86, in send
    super(ChainlitUserProxyAgent, self).send(
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 420, in send
    recipient.receive(message, self, request_reply, silent)
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 578, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 1241, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 761, in generate_oai_reply
    response = client.create(
  File "/usr/local/lib/python3.10/dist-packages/autogen/oai/client.py", line 266, in create
    response = self._completions_create(client, params)
  File "/usr/local/lib/python3.10/dist-packages/autogen/oai/client.py", line 531, in _completions_create
    response = completions.create(**params)
  File "/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py", line 271, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py", line 648, in create
    return self._post(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1167, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 856, in request
    return self._request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 947, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError:

QA pinecone issue

Hey, I tried to make this work from an existing Pinecone database but can't get it running. Any suggestions?


import os
from langchain.document_loaders import PyPDFLoader, TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Pinecone
from langchain.chains import RetrievalQAWithSourcesChain
from langchain.chat_models import ChatOpenAI
import pinecone
import chainlit as cl
from chainlit.types import AskFileResponse

pinecone.init(
    api_key=os.environ.get("PINECONE_API_KEY"),
    environment=os.environ.get("PINECONE_ENV"),
)

index_name = "test"
text_splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
embeddings = OpenAIEmbeddings()

namespaces = "research"

welcome_message = """Welcome to the Chainlit PDF QA demo! To get started:
1. Upload a PDF or text file
2. Ask a question about the file
"""


@cl.langchain_factory
def langchain_factory():
    # Set a fixed namespace "research"
    namespace = "research"

    docsearch = Pinecone.from_existing_index(
        index_name=index_name, embedding=embeddings, namespace=namespace
    )

    chain = RetrievalQAWithSourcesChain.from_chain_type(
        ChatOpenAI(temperature=0, streaming=True),
        chain_type="stuff",
        retriever=docsearch.as_retriever(max_tokens_limit=4097),
    )

  
    cl.send_message("The system is ready, you can now ask questions!")

    return chain


@cl.langchain_postprocess
def process_response(res):
    answer = res["answer"]
    sources = res.get("sources", "").strip()  # Use the get method with a default value
    source_elements = []

   
    docs = cl.user_session.get("docs")

    if docs:
        metadatas = [doc.metadata for doc in docs]
        all_sources = [m["source"] for m in metadatas]

        if sources:
            found_sources = []

            
            for source in sources.split(","):
                source_name = source.strip().replace(".", "")
               
                try:
                    index = all_sources.index(source_name)
                except ValueError:
                    continue
                text = docs[index].page_content
                found_sources.append(source_name)
               
                source_elements.append(cl.Text(text=text, name=source_name))

            if found_sources:
                answer += f"\nSources: {', '.join(found_sources)}"
            else:
                answer += "\nNo sources found"
    else:
        answer += "\nNo documents found in the user session"

    cl.send_message(answer, elements=source_elements)

autogen

Hi, how can I create an Autogen group chat with a manager?
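
A minimal sketch of a group chat with a manager in plain pyautogen (the agent names and the message are illustrative); the Chainlit-specific part is then wrapping initiate_chat with cl.make_async, as the pyautogen cookbook example does:

import autogen

config_list = [{"model": "gpt-4"}]  # plus api_key etc.
llm_config = {"config_list": config_list}

user_proxy = autogen.UserProxyAgent(
    "user_proxy", human_input_mode="NEVER", code_execution_config=False
)
coder = autogen.AssistantAgent("coder", llm_config=llm_config)
critic = autogen.AssistantAgent("critic", llm_config=llm_config)

groupchat = autogen.GroupChat(agents=[user_proxy, coder, critic], messages=[], max_round=12)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)

user_proxy.initiate_chat(manager, message="Write and review a haiku about Chainlit.")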

OpenAI Assistant Example

Hello,

I am starting the app.py file as shown in the given example. Everything works, but the final generated image is not displayed.

/image-gen doesn't work with Python 3.11

When running the image-gen demo on Python 3.11, it throws an error:
ModuleNotFoundError: No module named 'stability_sdk'

The workaround is to run everything on Python 3.10.

When the OpenAI assistant is running, there is nothing in the UI to let the user know it's working

I am testing the cool OpenAI assistant example:

https://github.com/Chainlit/cookbook/tree/main/openai-assistant

One thing I've noticed is that on multi-step tasks, where the assistant first responds with text and then goes on to do more steps, there is nothing in the Chainlit UI to indicate that the assistant is working: no spinner, no "Running", even though the assistant is in fact working and will output something. This has confused users. I have looked through the documentation for something to display, but can't seem to find anything.

Is there any way to put something in the UI to indicate the assistant is working?
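
One option (a sketch; cl.Step is available in recent Chainlit versions, and poll_assistant_run is a hypothetical stand-in for the example's existing polling loop) is to wrap the polling in a step, which the UI renders as in-progress until the block exits:

import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    # The step shows as running in the UI until the `async with` block exits.
    async with cl.Step(name="Assistant run", type="run") as step:
        result = await poll_assistant_run()  # hypothetical: your polling loop
        step.output = result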

Working with open source models

Is it possible to work with open-source models like Llama 2 using Ollama? I have tried, but somehow it did not work and constantly asks for an OpenAI key.
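
It should be possible, since Ollama exposes an OpenAI-compatible endpoint. A minimal sketch (the base_url and model name assume a local `ollama pull llama2`; the api_key just has to be a non-empty string):

import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

@cl.on_message
async def main(message: cl.Message):
    resp = await client.chat.completions.create(
        model="llama2",
        messages=[{"role": "user", "content": message.content}],
    )
    await cl.Message(content=resp.choices[0].message.content).send()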

Error running openinterpreter example

I get the following error when running the openinterpreter example, using OpenAI (any model gives the same result). I'm not fluent with Chainlit (nor Open Interpreter), so maybe I am doing something wrong?

'Interpreter' object has no attribute 'active_block'
Traceback (most recent call last):
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/chainlit/utils.py", line 41, in wrapper
    return user_function(**params_values)
  File "/Users/flavioc/Codes/GitHub/chainlit-cookbook/openinterpreter/app.py", line 109, in main
    interpreter.chat(message)
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/interpreter/core/core.py", line 65, in chat
    for _ in self._streaming_chat(message=message, display=display):
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/interpreter/core/core.py", line 75, in _streaming_chat
    validate_llm_settings(self)
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/interpreter/terminal_interface/validate_llm_settings.py", line 97, in validate_llm_settings
    display_markdown_message(f"> Model set to `{interpreter.model.upper()}`")
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/interpreter/utils/display_markdown_message.py", line 18, in display_markdown_message
    rich_print(Markdown(line))
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/rich/__init__.py", line 74, in print
    return write_console.print(*objects, sep=sep, end=end)
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/rich/console.py", line 1672, in print
    with self:
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/rich/console.py", line 864, in __exit__
    self._exit_buffer()
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/rich/console.py", line 822, in _exit_buffer
    self._check_buffer()
  File "/opt/miniconda3/envs/ai/lib/python3.10/site-packages/rich/console.py", line 2059, in _check_buffer
    self.file.write(text)
  File "/Users/flavioc/Codes/GitHub/chainlit-cookbook/openinterpreter/app.py", line 33, in write
    if interpreter.active_block and type(interpreter.active_block).__name__ == "CodeBlock":
AttributeError: 'Interpreter' object has no attribute 'active_block'

Unable to parse output in `langchain-ask-human` example

I tried running the langchain-ask-human example provided at https://github.com/Chainlit/cookbook/blob/b0e0a0c6f78c5ab69c10d32d6763bea9871ee69e/langchain-ask-human/app.py

However, when I ask the question "I need to order pizza for a hackathon. How many should I order?" just like in the video ask-human.mp4, I get the error:

An output parsing error occurred. In order to pass this error back to the agent and have it try again, pass handle_parsing_errors=True to the AgentExecutor. This is the error: Could not parse LLM output: Thought: I should ask a human for guidance.
Action:
{
"action": "human",
"action_input": "How many people are attending the hackathon?"
}

Does anyone know the solution to this? Thanks!


My packages:

chainlit                                 0.7.700
langchain                                0.0.352
langchain-community                      0.0.6
langchain-core                           0.1.3
openai                                   0.27.4
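
The error message itself suggests the usual fix: pass handle_parsing_errors=True so the parse failure is fed back to the agent instead of raising. A sketch (the agent type here is illustrative; use whichever one the example constructs):

from langchain.agents import AgentType, initialize_agent

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    handle_parsing_errors=True,  # retry instead of raising on unparsable LLM output
)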

Assistants API import error

I keep getting this error when trying to run the Assistants API example. I have already run pip install openai. Any tips?

ImportError: cannot import name 'MessageContentImageFile' from 'openai.types.beta.threads'
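
Newer openai releases renamed the thread content types, so one fix is to pin the openai version the example was written against. Alternatively, a compatibility import along these lines may work (the new name is my recollection of the rename, so treat it as an assumption):

try:
    from openai.types.beta.threads import MessageContentImageFile
except ImportError:
    # assumed new name in later openai releases
    from openai.types.beta.threads import ImageFileContentBlock as MessageContentImageFile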

Copilot example with React

Is it possible to convert this example to run with my React app (Next.js)? I am pretty new to both React and Python. Take this line of code, for example: <script src="http://localhost:8000/copilot/index.js"></script>. There is no index.js file, and I don't understand how it works without it.

The values obtained within async def on_action(action): cannot be returned to async def main(message: str)

After invoking the action, the value obtained within async def on_action(action) cannot be returned to async def main(message: str). This is my code:

@cl.action_callback("selector")
async def on_action(action):
    await cl.Message(content=f"you are picking {action.label}").send()
    await cl.Message(content=f"looking into {action.label},wait.....").send()
    user_choice = int(action.value)
    cl.user_session.set("user_choice", user_choice)

@cl.on_message
async def main(message: str):
    # Set Avatar
    await cl.Avatar(
        name="EMM",
        path="./Avatar/avatar.png",
    ).send()
    pick_template = []
    for i, document in enumerate(law_doc_titles):
        pick_template.append(cl.Action(name="selector", value=str(i), label=document.page_content, description=document.page_content))
    
    await cl.Message(content="pick an option:", actions=pick_template).send()
    pick = cl.user_session.get("user_choice")
    chosen_document = law_doc_titles[pick]

I have tried global variables, but async def main won't wait until the correct value of user_choice has been set. If I put

while pick is None:
    await cl.sleep(1)

after await cl.Message(content="pick an option:", actions=pick_template).send(), it stops the button from being clickable as well. I also tried asyncio.Event(), with the same problem as the while loop: whatever I use to pause until the right value is processed freezes the button.
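
A pattern that avoids polling altogether (a sketch using cl.AskActionMessage, which blocks until the user clicks or the prompt times out):

res = await cl.AskActionMessage(
    content="Pick an option:",
    actions=[
        cl.Action(name="selector", value=str(i), label=doc.page_content)
        for i, doc in enumerate(law_doc_titles)
    ],
).send()

if res is not None:
    pick = int(res.get("value"))
    chosen_document = law_doc_titles[pick]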

`chain_type_kwargs` not used in chroma-qa-chat

Hello,

In the file chroma-qa-chat/app.py, the whole block with system_template/messages/prompt/chain_type_kwargs is never passed to RetrievalQAWithSourcesChain.from_chain_type:

    # Create a chain that uses the Chroma vector store
    chain = RetrievalQAWithSourcesChain.from_chain_type(
        ChatOpenAI(temperature=0, streaming=True),
        chain_type="stuff",
        retriever=docsearch.as_retriever(),
    )

It needs to change to:

    # Create a chain that uses the Chroma vector store
    chain = RetrievalQAWithSourcesChain.from_chain_type(
        ChatOpenAI(temperature=0, streaming=True),
        chain_type="stuff",
        retriever=docsearch.as_retriever(),
        chain_type_kwargs=chain_type_kwargs,
    )

Otherwise it uses the default prompts, with no history messages.

Need an example for querying from an existing vector DB with the RetrievalQAWithSourcesChain

We have this example for querying a single document by constructing a vector DB on the fly. I have multiple PDFs that I want to query, and I have already created a Weaviate vectorstore (here). How do I modify this to query my existing vector DB?

Here is my attempt; I'm getting stuck at extracting the sources. Perhaps the structure of the sources output is different between Chroma DB and Weaviate, I'm not sure. Any help is appreciated. This is the error I get:

File "C:\Users\username\Documets\app.py", line 122, in main
    index = all_sources.index(source_name)
            ^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'index'


OpenAI Assistants - can't access the assistant_id

I run the OpenAI Assistant cookbook with:

  1. python create_assistant.py > correctly generates assistants.json with {"math_tutor_and_weather_bot": "asst_uhU...Z3"}
  2. chainlit run app.py

The web UI generates an error (screenshot no longer available).

It works after adding a snippet at the top of app.py that reads the assistant_id directly from the JSON:

with open('assistants.json') as f:
    assistant_id = json.load(f)["math_tutor_and_weather_bot"]
print("assistant_id = ", assistant_id)

My question: how should I set this up to deploy the app as a Docker container on Google Cloud Run?
I mean: how do I set up the sequence 1. create_assistant.py, then 2. chainlit run app.py?
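
One approach (a sketch; OPENAI_ASSISTANT_ID is an illustrative variable name): run create_assistant.py once, outside the container, and hand the resulting id to the Cloud Run service as an environment variable, falling back to the JSON file for local runs:

import json
import os

assistant_id = os.environ.get("OPENAI_ASSISTANT_ID")
if not assistant_id:
    with open("assistants.json") as f:
        assistant_id = json.load(f)["math_tutor_and_weather_bot"]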

OpenAI Assistant functions

error:
Traceback (most recent call last):
  File "D:\Documendz\code-python\LLM chat app\aienv\lib\site-packages\chainlit\utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "D:\Documendz\code-python\LLM chat app\app2.py", line 120, in run_conversation
    if tool_call.type == "code_interpreter":
AttributeError: 'dict' object has no attribute 'type'

code:

    # Periodically check for updates
    while True:
        run = await client.beta.threads.runs.retrieve(
            thread_id=thread.id, run_id=run.id
        )

        
        # Fetch the run steps
        run_steps = await client.beta.threads.runs.steps.list(
            thread_id=thread.id, run_id=run.id, order="asc"
        )

        for step in run_steps.data:
            # Fetch step details
            run_step = await client.beta.threads.runs.steps.retrieve(
                thread_id=thread.id, run_id=run.id, step_id=step.id
            )
            step_details = run_step.step_details
            # Update step content in the Chainlit UI
            if step_details.type == "message_creation":
                thread_message = await client.beta.threads.messages.retrieve(
                    message_id=step_details.message_creation.message_id,
                    thread_id=thread.id,
                )
            
                await process_thread_message(message_references, thread_message)

            if step_details.type == "tool_calls":
                for tool_call in step_details.tool_calls:
                    if tool_call.type == "code_interpreter":
                        if not tool_call.id in message_references:
                            message_references[tool_call.id] = cl.Message(
                                author=tool_call.type,
                                content=tool_call.code_interpreter.input
                                or "# Generating code...",
                                language="python",
                                parent_id=context.session.root_message.id,
                            )
                            await message_references[tool_call.id].send()
                        else:
                            message_references[tool_call.id].content = (
                                tool_call.code_interpreter.input
                                or "# Generating code..."
                            )
                            await message_references[tool_call.id].update()

                        tool_output_id = tool_call.id + "output"

                        if not tool_output_id in message_references:
                            message_references[tool_output_id] = cl.Message(
                                author=f"{tool_call.type}_result",
                                content=str(tool_call.code_interpreter.outputs) or "",
                                language="json",
                                parent_id=context.session.root_message.id,
                            )
                            await message_references[tool_output_id].send()
                        else:
                            message_references[tool_output_id].content = (
                                str(tool_call.code_interpreter.outputs) or ""
                            )
                            await message_references[tool_output_id].update()
                            
                    elif tool_call.type == "retrieval":
                        if not tool_call.id in message_references:
                            message_references[tool_call.id] = cl.Message(
                                author=tool_call.type,
                                content="Retrieving information",
                                parent_id=context.session.root_message.id,
                            )
                            await message_references[tool_call.id].send()
                    
                    # the part below doesn't work yet; it crashes GPT for some reason
                    elif tool_call.type == "function":
                        function_name = tool_call.function.name
                        function_to_call = get_taxi_booking_information
                        function_args = json.loads(tool_call.function.arguments)
                        function_response = function_to_call(**function_args)
                        print(function_response)
                        if not tool_call.id in message_references:
                            message_references[tool_call.id] = cl.Message(
                                author=tool_call.type,
                                content=function_response,
                                parent_id=context.session.root_message.id,
                            )
                            await message_references[tool_call.id].send()
              

        await cl.sleep(1)  # Refresh every second

        if run.status in ["cancelled", "failed", "completed", "expired"]:
            break

All I tried to do was add a function-type tool call in the same format as the other tool calls. To reproduce this error/bug, just add a function to the tools while initialising an assistant and then use this code.
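
The AttributeError suggests the function tool call arrives as a plain dict rather than a typed object, so one defensive option (an illustrative sketch) is a small accessor that tolerates both shapes:

def tool_call_field(tool_call, field):
    # Tolerate both the typed object and the plain-dict shape.
    if isinstance(tool_call, dict):
        return tool_call.get(field)
    return getattr(tool_call, field)

# e.g. instead of `tool_call.type == "code_interpreter"`:
# if tool_call_field(tool_call, "type") == "code_interpreter": ...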

Custom React frontend with Chainlit not working

I was able to use Chainlit as a chatbot, but when I try to integrate React with it, the React frontend cannot fetch messages from the Chainlit backend while running the custom-frontend example repo by Chainlit.

Link to repo : https://github.com/Chainlit/cookbook/tree/main/custom-frontend

Exception in ASGI application
Traceback (most recent call last):
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/applications.py", line 116, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/middleware/cors.py", line 91, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/middleware/cors.py", line 146, in simple_response
    await self.app(scope, receive, send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/routing.py", line 746, in __call__
    await route.handle(scope, receive, send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/routing.py", line 75, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/starlette/routing.py", line 70, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
    raise e
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/Desktop/cookbook-main/custom-frontend/chainlit-backend/app.py", line 25, in custom_auth
    token = create_jwt(cl.User(identifier="Test User"))
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/chainlit/auth.py", line 58, in create_jwt
    encoded_jwt = jwt.encode(to_encode, get_jwt_secret(), algorithm="HS256")
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/jwt/api_jwt.py", line 73, in encode
    return api_jws.encode(
           ^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/jwt/api_jws.py", line 160, in encode
    key = alg_obj.prepare_key(key)
          ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/jwt/algorithms.py", line 265, in prepare_key
    key_bytes = force_bytes(key)
                ^^^^^^^^^^^^^^^^
  File "/Users/srichandankota/anaconda3/lib/python3.11/site-packages/jwt/utils.py", line 22, in force_bytes
    raise TypeError("Expected a string value")
TypeError: Expected a string value

(the same traceback repeats several more times and is omitted)
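
The TypeError appears to come from get_jwt_secret() returning None, i.e. CHAINLIT_AUTH_SECRET is not set in the backend's environment. A minimal check (sketch):

import os

# Generate a secret with `chainlit create-secret` and export it before starting
# the backend; jwt.encode needs a string key, not None.
assert os.environ.get("CHAINLIT_AUTH_SECRET"), "CHAINLIT_AUTH_SECRET is not set"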

duplicate calls

I have added an assistant but I constantly get the duplicate-calls error, even though I have not made any changes. How can I allow duplicate calls?

Request to Add Google Gemini Pro and Gemini Pro Vision Examples

I would like to kindly request the inclusion of examples for Google Gemini Pro and Gemini Pro Vision in the Chainlit Cookbook repository. Specifically, I would like to add:

  • a text generation and chatbot example with gemini-pro
  • a multimodal image-and-text question-answering example with gemini-pro-vision

Both models are noteworthy in their respective domains and can greatly benefit the community by showcasing their capabilities and integration potential within the Chainlit framework.

how to write @cl.on_chat_resume with retrieval?

I use the code from the example for @cl.on_chat_resume and the chat works, but I want to use retrieval. Can you give an example?

I use the code from the langchain example, but it fails!

template = """Answer the question based only on the following context:
{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
model = ChatOpenAI()

retrieval_chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)
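
A sketch of what on_chat_resume could look like (build_retrieval_chain is a hypothetical helper that recreates the retriever/prompt/model chain above; the point is to rebuild the chain and stash it in the user session, just as on_chat_start would):

import chainlit as cl

@cl.on_chat_resume
async def on_chat_resume(thread):
    retrieval_chain = build_retrieval_chain()  # hypothetical helper
    cl.user_session.set("chain", retrieval_chain)

@cl.on_message
async def on_message(message: cl.Message):
    chain = cl.user_session.get("chain")
    answer = await chain.ainvoke(message.content)
    await cl.Message(content=answer).send()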

Copilot example not working

The copilot example is not working for me. The problem seems to be that the URL http://localhost:8000/copilot/index.js always gets redirected back to the root URL http://localhost:8000, so we get an HTML response instead of JavaScript.

OpenAI assistant example

  1. Images are not working:
    'MessageContentImageFile' object has no attribute 'text'
    (see the sketch after this list)

  2. Please show all steps with the code interpreter.

  3. Enable streaming.
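
A minimal sketch for item 1, assuming the openai 1.x Assistants beta types named in the error: message content is a list that mixes text and image-file parts, so the part type has to be checked before reading .text. thread_message is a hypothetical Assistants API message object.

import chainlit as cl
from openai.types.beta.threads import MessageContentImageFile, MessageContentText


async def process_message(thread_message) -> None:
    for content in thread_message.content:
        if isinstance(content, MessageContentText):
            await cl.Message(content=content.text.value).send()
        elif isinstance(content, MessageContentImageFile):
            # Image parts have no .text; fetch the bytes with
            # client.files.content(content.image_file.file_id) and send a
            # cl.Image element instead.
            ...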

azure-openai-pinecone-pdf-qa - Found document with no 'text' key. Skipping

Hey, I'm currently running azure-openai-pinecone-pdf-qa locally on my machine. I've set everything up correctly, and the indexing of the PDFs works as expected:

(screenshot)

But once the app is running and I try to ask questions about the document, I get this message:

(screenshot)

So the RAG isn't working, and the response it gives me is not grounded in the document. I've tried with multiple PDFs stored in my ./pdfs directory, even the one from the repository, but nothing changes; it's always the same.

If anyone could help me I'd be really grateful. Thanks!
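
For what it's worth, this warning usually means the metadata key that holds the raw text does not match the vector store's text_key. A minimal sketch, assuming the LangChain Pinecone wrapper (the index name below is hypothetical):

from langchain_community.vectorstores import Pinecone
from langchain_openai import AzureOpenAIEmbeddings

embeddings = AzureOpenAIEmbeddings()  # the same embeddings used at indexing time

docsearch = Pinecone.from_existing_index(
    index_name="pdf-qa",  # hypothetical index name
    embedding=embeddings,
    text_key="text",  # must match the metadata key written when upserting
)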

llama-index not working

I am running on Google Colab with ngrok. I got the following error when running the llama-index cookbook:

Traceback (most recent call last):
  File "/content/app.py", line 24, in <module>
    storage_context = StorageContext.from_defaults(persist_dir="./storage")
  File "/usr/local/lib/python3.10/dist-packages/llama_index/core/storage/storage_context.py", line 107, in from_defaults
    docstore = docstore or SimpleDocumentStore.from_persist_dir(
  File "/usr/local/lib/python3.10/dist-packages/llama_index/core/storage/docstore/simple_docstore.py", line 57, in from_persist_dir
    return cls.from_persist_path(persist_path, namespace=namespace, fs=fs)
  File "/usr/local/lib/python3.10/dist-packages/llama_index/core/storage/docstore/simple_docstore.py", line 74, in from_persist_path
    simple_kvstore = SimpleKVStore.from_persist_path(persist_path, fs=fs)
  File "/usr/local/lib/python3.10/dist-packages/llama_index/core/storage/kvstore/simple_kvstore.py", line 97, in from_persist_path
    with fs.open(persist_path, "rb") as f:
  File "/usr/local/lib/python3.10/dist-packages/fsspec/spec.py", line 1241, in open
    f = self._open(
  File "/usr/local/lib/python3.10/dist-packages/fsspec/implementations/local.py", line 184, in _open
    return LocalFileOpener(path, mode, fs=self, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/fsspec/implementations/local.py", line 315, in __init__
    self._open()
  File "/usr/local/lib/python3.10/dist-packages/fsspec/implementations/local.py", line 320, in _open
    self.f = open(self.path, mode=self.mode)
FileNotFoundError: [Errno 2] No such file or directory: '/content/storage/docstore.json'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/chainlit", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/chainlit/cli/__init__.py", line 154, in chainlit_run
    run_chainlit(target)
  File "/usr/local/lib/python3.10/dist-packages/chainlit/cli/__init__.py", line 56, in run_chainlit
    load_module(config.run.module_name)
  File "/usr/local/lib/python3.10/dist-packages/chainlit/config.py", line 379, in load_module
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/content/app.py", line 28, in <module>
    documents = SimpleDirectoryReader("./data").load_data(show_progress=True)
  File "/usr/local/lib/python3.10/dist-packages/llama_index/core/readers/file/base.py", line 220, in __init__
    raise ValueError(f"Directory {input_dir} does not exist.")
ValueError: Directory ./data does not exist.
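
Both tracebacks point at missing local directories: ./storage does not exist yet (first error), and the fallback then fails because ./data is missing too (second error). A minimal sketch of the usual guard, using the llama_index.core API shown in the traceback; on the first run, ./data must exist and contain the source documents:

import os

from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

PERSIST_DIR = "./storage"

if os.path.exists(PERSIST_DIR):
    # Re-open a previously persisted index.
    storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
    index = load_index_from_storage(storage_context)
else:
    # First run: read the documents and persist the index for next time.
    documents = SimpleDirectoryReader("./data").load_data(show_progress=True)
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=PERSIST_DIR)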

PGVECTOR support

This is an awesome library - so thanks!
Any chance you could add support for PGVECTOR, or point to a resource that would help make the switch from Pinecone or Chroma easy?

Thanks
AISEE
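
One way to make the switch, sketched under the assumption that LangChain's community PGVector integration replaces Pinecone/Chroma (the connection string and collection name are hypothetical, and Postgres needs the pgvector extension enabled):

from langchain_community.vectorstores.pgvector import PGVector
from langchain_openai import OpenAIEmbeddings

CONNECTION_STRING = "postgresql+psycopg2://user:pass@localhost:5432/vectordb"

store = PGVector.from_documents(
    documents=docs,  # the same chunks you would send to Pinecone or Chroma
    embedding=OpenAIEmbeddings(),
    collection_name="cookbook-demo",
    connection_string=CONNECTION_STRING,
)
retriever = store.as_retriever()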

Update Anthropic Client

Anthropic changed their Python SDK, making this line of code outdated.

c = anthropic.Client(os.environ["ANTHROPIC_API_KEY"])
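
For reference, a minimal sketch of the updated client, assuming a recent anthropic SDK (the key is read from the ANTHROPIC_API_KEY environment variable; the model name is illustrative):

import anthropic

client = anthropic.Anthropic()  # replaces anthropic.Client(api_key)

message = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello, Claude"}],
)
print(message.content[0].text)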

Would love to know if this might help - https://github.com/BerriAI/litellm

~100 lines of code that standardize all LLM API calls to the OpenAI format.

import os

from litellm import completion

## set ENV variables
# ENV variables can be set in a .env file, too. Example in .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["ANTHROPIC_API_KEY"] = "anthropic key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# anthropic call
response = completion(model="claude-2", messages=messages)

Example [llava] file.content missing in uploaded files

Greetings,

I've been testing Chainlit and the Llava cookbook.

file.content is empty when using the llava example from the cookbook:

    image = next(
        (
            Image.open(io.BytesIO(file.content))
            for file in message.elements or []
            if "image" in file.mime
        ),
        None,
    )

I had to change it to read the full file path instead:

import chainlit as cl
from PIL import Image


@cl.on_message
async def main(message: cl.Message):
    image = next(
        (
            Image.open(file.path)
            for file in message.elements or []
            if "image" in file.mime and file.path is not None
        ),
        None,
    )

Are the file.content bytes supposed to be stored in the file object as well? If so, there is an issue saving it:

element.py

@dataclass
class Element:
    # The byte content of the element.
    content: Optional[Union[bytes, str]] = None

emitter.py
https://github.com/Chainlit/chainlit/blob/5f7f104bb66d7b94dfc692cd8f27bfad17bac179/backend/chainlit/emitter.py#L213

file_elements Image content is empty:

[
    Image(
        name="back.jpg",
        id="1b800a32-8bfd-4f46-ab41-aa4450d3f6d9",
        chainlit_key="1b800a32-8bfd-4f46-ab41-aa4450d3f6d9",
        url=None,
        object_key=None,
        path="cookbook/custom-frontend/.files/86e16506-67c8-4aa4-bdb4-4be5c26e4691/1b800a32-8bfd-4f46-ab41-aa4450d3f6d9.jpg",
        content=None,
        display="inline",
        size="medium",
        for_id=None,
        language=None,
        mime="image/jpeg",
    )
]

Thanks!

Use cl.Action on custom frontend

Hey, I'm trying to use Chainlit actions inside the custom frontend like this:

(screenshot)

But the two actions are not displayed in the frontend when I run it. I guess I have to apply changes directly to the frontend to make it work, but I wonder whether the cl.Action method is supported in the @chainlit/react-client module and, if so, what I should do to make it work.

Thanks!

How can I access user feedback in the main python script?

I'm just getting to know it, but I already think Chainlit is awesome!

However, I would like to access user feedback in the main Python script for further analysis. I cannot find anything about this in the documentation or the issues. Data persistence is enabled, and I can see the feedback in the Literal AI dashboard.

How can I access the feedback in the main Python script?

Thanks!

Add LangGraph example

from dotenv import load_dotenv
load_dotenv()
import chainlit as cl

from langchain_community.tools.tavily_search import TavilySearchResults

tools = [TavilySearchResults(max_results=1)]

from langgraph.prebuilt import ToolExecutor

tool_executor = ToolExecutor(tools)

from langchain_openai import ChatOpenAI

# We will set streaming=True so that we can stream tokens
# See the streaming section for more information on this.
model = ChatOpenAI(temperature=0, streaming=True)

from langchain.tools.render import format_tool_to_openai_function

functions = [format_tool_to_openai_function(t) for t in tools]
model = model.bind_functions(functions)

from typing import TypedDict, Annotated, Sequence
import operator
from langchain_core.messages import BaseMessage


class AgentState(TypedDict):
    messages: Annotated[Sequence[BaseMessage], operator.add]


from langgraph.prebuilt import ToolInvocation
import json
from langchain_core.messages import FunctionMessage

# Define the function that determines whether to continue or not
def should_continue(state):
    messages = state['messages']
    last_message = messages[-1]
    # If there is no function call, then we finish
    if "function_call" not in last_message.additional_kwargs:
        return "end"
    # Otherwise if there is, we continue
    else:
        return "continue"

# Define the function that calls the model
def call_model(state):
    messages = state['messages']
    response = model.invoke(messages)
    # We return a list, because this will get added to the existing list
    return {"messages": [response]}

# Define the function to execute tools
def call_tool(state):
    messages = state['messages']
    # Based on the continue condition
    # we know the last message involves a function call
    last_message = messages[-1]
    # We construct a ToolInvocation from the function_call
    action = ToolInvocation(
        tool=last_message.additional_kwargs["function_call"]["name"],
        tool_input=json.loads(last_message.additional_kwargs["function_call"]["arguments"]),
    )
    # We call the tool_executor and get back a response
    response = tool_executor.invoke(action)
    # We use the response to create a FunctionMessage
    function_message = FunctionMessage(content=str(response), name=action.tool)
    # We return a list, because this will get added to the existing list
    return {"messages": [function_message]}


from langgraph.graph import StateGraph, END
# Define a new graph
workflow = StateGraph(AgentState)

# Define the two nodes we will cycle between
workflow.add_node("agent", call_model)
workflow.add_node("action", call_tool)

# Set the entrypoint as `agent`
# This means that this node is the first one called
workflow.set_entry_point("agent")

# We now add a conditional edge
workflow.add_conditional_edges(
    # First, we define the start node. We use `agent`.
    # This means these are the edges taken after the `agent` node is called.
    "agent",
    # Next, we pass in the function that will determine which node is called next.
    should_continue,
    # Finally we pass in a mapping.
    # The keys are strings, and the values are other nodes.
    # END is a special node marking that the graph should finish.
    # What will happen is we will call `should_continue`, and then the output of that
    # will be matched against the keys in this mapping.
    # Based on which one it matches, that node will then be called.
    {
        # If `continue`, then we call the tool node.
        "continue": "action",
        # Otherwise we finish.
        "end": END
    }
)

# We now add a normal edge from `action` back to `agent`.
# This means that after the tools run, the `agent` node is called next.
workflow.add_edge('action', 'agent')

# Finally, we compile it!
# This compiles it into a LangChain Runnable,
# meaning you can use it as you would any other runnable
app = workflow.compile()


from langchain_core.messages import HumanMessage
from langchain_core.runnables import RunnableConfig

@cl.on_message
async def run_convo(message: cl.Message):
    #"what is the weather in sf"
    inputs = {"messages": [HumanMessage(content=message.content)]}
    
    res = app.invoke(inputs, config=RunnableConfig(callbacks=[
        cl.LangchainCallbackHandler(
            to_ignore=["ChannelRead", "RunnableLambda", "ChannelWrite", "__start__", "_execute"]
            # can add more into the to_ignore: "agent:edges", "call_model"
            # to_keep=

        )]))

    await cl.Message(content=res["messages"][-1].content).send()
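
A possible refinement, assuming the compiled graph exposes LangChain's async runnable interface: awaiting ainvoke instead of calling the blocking invoke keeps Chainlit's event loop unblocked while the graph runs.

@cl.on_message
async def run_convo(message: cl.Message):
    inputs = {"messages": [HumanMessage(content=message.content)]}
    res = await app.ainvoke(
        inputs,
        config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
    )
    await cl.Message(content=res["messages"][-1].content).send()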

human feedback not working

Exception: [{'message': 'Unknown type "FeedbackPayloadInput". Did you mean "ThreadPayloadInput", "GenerationPayloadInput", or "ScorePayloadInput"?', 'locations': [{'line': 14, 'column': 22}]}, {'message': 'Unknown argument "feedback" on field "Mutation.ingestStep".', 'locations': [{'line': 31, 'column': 9}]}]
2024-04-29 12:46:12 - HTTP Request: POST https://cloud.getliteral.ai/api/graphql "HTTP/1.1 200 OK"
2024-04-29 12:46:12 - Failed to create feedback: [{'message': 'Unknown type "FeedbackStrategy".', 'locations': [{'line': 5, 'column': 24}]}, {'message': 'Cannot query field
"createFeedback" on type "Mutation".', 'locations': [{'line': 8, 'column': 13}]}]

On autogen example

Somehow I'm getting an error, "[Errno 2] No such file or directory: 'workspace/tmp_code_99073d0c5dccadd93da41584f422b19a.py'", which leads to the chatbot hanging on the frontend while it keeps working in the backend.

Frontend hanging:
(screenshot)

Backend completion:
(screenshot)
