Comments (8)

n3d1117 commented on May 22, 2024

Hi @k3it, good catch. Yeah, I think it's showing the sum of the prompt tokens used within a conversation.

I would say that's the expected behavior, since I'm not doing any calculations myself, just printing the usage object returned by the API. Anyway, I'll keep this issue open in case anyone has more knowledge on this!

from chatgpt-telegram-bot.

k3it commented on May 22, 2024

I did some more checking, and it looks like "Token used" is an individual counter for each prompt/response exchange, not a cumulative total for the current conversation. So each new question within the same conversation becomes more expensive to ask.

After a long session without a /reset, it may be cheaper to buy and weigh the banana yourself than to ask the bot about it :)

edit: I see now that the message query and answer history are sent with each completion. I think that explains the growth:

self.__add_to_history(chat_id, role="user", content=query)
response = openai.ChatCompletion.create(
    model=self.config['model'],
    messages=self.sessions[chat_id],  # the entire session history is sent every time
    temperature=self.config['temperature'],
)
answer = response.choices[0]['message']['content']  # (elided in the original) extract the reply
self.__add_to_history(chat_id, role="assistant", content=answer)
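Since the full history is resent on every turn, the prompt for turn n contains roughly n times the per-turn tokens, so total usage grows quadratically with conversation length. A toy illustration (the per-exchange token count is made up):

```python
# Illustration only: assume each exchange (question + answer) adds
# roughly t tokens to the history, and the whole history is resent
# as the prompt on every turn.

t = 50  # hypothetical tokens added per exchange
cumulative = 0
for n in range(1, 6):
    prompt_tokens = n * t      # prompt size grows linearly per turn
    cumulative += prompt_tokens  # total billed tokens grow quadratically
    print(f"turn {n}: prompt={prompt_tokens}, cumulative={cumulative}")
# after turn 5: prompt_tokens=250, cumulative=750
```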

edit2: here is what the bot thinks about this (might not be accurate?)
The GPT completion API can remember the context of the conversation by itself using its internal memory, without needing to send the full message history back each time a new message is sent.

When you create a new completion request using the GPT API, you can include some context that the endpoint can use to better understand the request. This context can come from the last few messages in the conversation, as well as any additional information you provide. The GPT model can then use this context to generate a more accurate and relevant response.

n3d1117 commented on May 22, 2024

Indeed, it looks like sending the history also consumes tokens.

Here's what the docs say:

Including the conversation history helps when user instructions refer to prior messages. [...] Because the models have no memory of past requests, all relevant information must be supplied via the conversation. If a conversation cannot fit within the model’s token limit, it will need to be shortened in some way.

So I agree that the history needs to be truncated or summarized somehow. I commented on your PR with possible solutions.
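One simple way to truncate is to drop the oldest non-system messages until the conversation fits a token budget. This is only a minimal sketch, not the actual fix from this repo; the whitespace word count is a stand-in for a real tokenizer such as tiktoken, and all names here are hypothetical:

```python
# Hypothetical sketch: trim oldest user/assistant messages until the
# conversation fits under a token budget. Word count approximates
# token count; a real bot would use an actual tokenizer.

def truncate_history(messages, max_tokens=100):
    """Drop oldest non-system messages until under max_tokens."""
    def count(msgs):
        return sum(len(m["content"].split()) for m in msgs)

    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and count(system + rest) > max_tokens:
        rest.pop(0)  # remove the oldest exchange first
    return system + rest

# Build a long fake conversation: 50 question/answer pairs
history = [{"role": "system", "content": "You are a helpful assistant."}]
for i in range(50):
    history.append({"role": "user", "content": f"question {i}"})
    history.append({"role": "assistant", "content": f"answer {i}"})

trimmed = truncate_history(history, max_tokens=40)
print(len(trimmed))  # the system prompt and most recent messages survive
```

The system message is always kept, since dropping it would change the bot's behavior mid-conversation.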

n3d1117 commented on May 22, 2024

@k3it This should be fixed in 946f6a4. Feel free to reopen if the issue persists.

em108 commented on May 22, 2024

Hello, and thanks a lot for the updates and fixes!
I've seen embeddings applied in another bot (linked below) to solve the token issue. Would it be a good idea to implement them here to further reduce token consumption?

https://github.com/LagPixelLOL/ChatGPTCLIBot

n3d1117 commented on May 22, 2024

Hi @em108, I'm not familiar with embeddings or how they can be implemented in Python. What are the advantages?

em108 commented on May 22, 2024

From what I've gathered from multiple sources, including the article below, embeddings can provide a form of long-term memory, i.e. a way to store and recall conversation data. Depending on cost and application, they can be roughly 5 to 9 times cheaper than resending the full chat history.

Article:
https://towardsdatascience.com/generative-question-answering-with-long-term-memory-c280e237b144

Also an example of a notebook utilizing embeddings:
https://github.com/openai/openai-cookbook/blob/main/examples/Question_answering_using_embeddings.ipynb
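The core idea in those links is retrieval: embed each past message once, then at question time resend only the few most similar messages instead of the whole history. A minimal sketch with toy hand-written 3-d vectors standing in for real embedding API output (all names and vectors here are made up for illustration):

```python
# Hypothetical sketch of embedding-based recall: store one vector per
# past message, then retrieve the most similar message for a new query
# instead of resending the entire conversation.

import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# (message, fake embedding) pairs; a real bot would call an embeddings API
memory = [
    ("the banana weighs 120 grams", [0.9, 0.1, 0.0]),
    ("my dog is named Rex",         [0.0, 0.9, 0.1]),
    ("I live in Rome",              [0.1, 0.0, 0.9]),
]

# Pretend embedding of the query "how heavy was the banana?"
query_embedding = [0.8, 0.2, 0.1]

# Retrieve the single most relevant stored message
best = max(memory, key=lambda m: cosine(m[1], query_embedding))
print(best[0])  # the banana fact is the closest match
```

Only the retrieved message(s) plus the new question would then be sent to the completion API, which is where the token savings come from.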

n3d1117 commented on May 22, 2024

Very interesting, thanks @em108. That would be great, although I currently don't have the time to implement it.
