
Comments (3)

TechnoRahmon commented on July 24, 2024

@yogeshhk Thank you for replying.

Actually, it has been solved by feeding the prompt to the LLMChain as a PromptTemplate. My issue was that I passed the prompt as a plain string to the LLMChain; once I changed it to a PromptTemplate, it worked:

    # prepare the prompt
    prompt = PromptTemplate(
        input_variables=give_assistance_input_variables,
        template=give_assistance_prompt
    )
    # connect to the LLM
    llm_chain = LLMChain(prompt=prompt, llm=llm_davinci)
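
The give_assistance_prompt string and give_assistance_input_variables list are not shown in the thread; for completeness, they would be defined roughly along these lines (the template text below is only an assumption for illustration):

    # hypothetical definitions for the names used above
    give_assistance_input_variables = ["context", "command"]
    give_assistance_prompt = (
        "You are a coding assistant.\n"
        "Context: {context}\n"
        "Command: {command}\n"
        "Answer:"
    )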


TechnoRahmon commented on July 24, 2024

I have a similar situation. Here is where I store my LLM object, in a separate file:

# Create an instance of OpenAI LLM with desired configuration
llm_davinci = OpenAI(
    model_name=models_names["completions-davinci"],
    temperature=0,
    max_tokens=256,
    top_p=1.0,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    n=1,
    best_of=1,
    request_timeout=None
)
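
In a layout like that, the instance is simply imported into whichever module needs it; a minimal sketch, assuming the file is named llm_setup.py:

    # in the module that defines ask_llm (file name is an assumption)
    from llm_setup import llm_davinci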

Then I use the llm_davinci instance in another function like this:

def ask_llm(query: str, filename: str):

    # prepare the prompt
    prompt = code_assistance.format(context="this is a test", command=query)
    tokens = tiktoken_len(prompt)
    print(f"prompt  : {prompt}")
    print(f"prompt tokens : {tokens}")

    # connect to the LLM
    llm_chain = LLMChain(prompt=prompt, llm=llm_davinci)

    # run the LLM
    with get_openai_callback() as cb:
        response = llm_chain.run()

    return jsonify({'query': query,
                    'response': str(response),
                    'usage': cb})

The issue is with this line:

    # connect to the LLM
    llm_chain = LLMChain(prompt=prompt, llm=llm_davinci)

Error:

    llm_chain = LLMChain(prompt=prompt, llm=llm_davinci)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
    pydantic.error_wrappers.ValidationError: 1 validation error for LLMChain
    prompt
      value is not a valid dict (type=type_error.dict)

Any idea how to solve this?


yogeshhk commented on July 24, 2024

@TechnoRahmon in my case the confusion came from the "prompt" variable... try renaming "prompt" inside ask_llm() to something else, like "llm_prompt".
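
Putting the two comments together (renaming the variable and, more importantly, passing a PromptTemplate rather than a pre-formatted string), a minimal sketch of a working ask_llm could look like this; the template text and the plain-dict return value are assumptions:

    from langchain import LLMChain, PromptTemplate
    from langchain.callbacks import get_openai_callback

    # hypothetical template standing in for code_assistance; the real text is not shown
    llm_prompt = PromptTemplate(
        input_variables=["context", "command"],
        template="Context: {context}\nCommand: {command}\nAnswer:"
    )

    def ask_llm(query: str, filename: str):
        # build the chain from the PromptTemplate, not from a formatted string
        # (llm_davinci is the instance configured above)
        llm_chain = LLMChain(prompt=llm_prompt, llm=llm_davinci)

        # supply the template's input variables when running the chain
        with get_openai_callback() as cb:
            response = llm_chain.run(context="this is a test", command=query)

        return {"query": query,
                "response": str(response),
                "usage": str(cb)}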

