
Comments (7)

toshiakit commented on May 26, 2024

Hello, MatGPT runs on the LLMs with MATLAB library, and unless that library supports custom API endpoints, MatGPT cannot. Please open an issue on that repo.

toshiakit commented on May 26, 2024

Hi @Mingzefei , can you provide more details about your use case?

jonasendc commented on May 26, 2024

Hi @toshiakit ,
maybe something like:
import openai

# azure_endpoint and api_key must point to your own Azure OpenAI resource.
client = openai.AzureOpenAI(
    api_version="2024-03-01-preview",
    azure_endpoint="",  # your Azure OpenAI endpoint URL, not OpenAI's
    api_key=api_key,
)

Here azure_endpoint is a URL other than OpenAI's.
So somewhere in this app the URL is hardcoded; it should be configurable.
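
To make the request concrete, here is a minimal sketch of how such an Azure client could be used end to end; the endpoint, key, and deployment name below are placeholders for illustration, not values from this project:

from openai import AzureOpenAI

# Sketch only: endpoint, key, and deployment name are placeholders.
client = AzureOpenAI(
    api_version="2024-03-01-preview",
    azure_endpoint="https://my-resource.openai.azure.com",  # hypothetical endpoint
    api_key="my-azure-key",                                 # hypothetical key
)

# With the Azure client, "model" refers to the name of your deployment.
response = client.chat.completions.create(
    model="my-gpt-4-deployment",  # hypothetical deployment name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)

Apart from constructing the client, the chat call itself is identical to the regular OpenAI one, which is why a configurable endpoint would be enough.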

toshiakit commented on May 26, 2024

@jonasendc

I passed your comment to the maintainer of LLMs with MATLAB.

matlab-deep-learning/llms-with-matlab#14

Mingzefei commented on May 26, 2024

> Hi @Mingzefei, can you provide more details about your use case?

Hi, sorry for the late reply. @jonasendc has provided a case for AzureOpenAI, and I'd like to add another case: locally deployed LLMs.

For example, projects like Ollama allow for easy local deployment and use of many open-source LLMs. After a successful deployment, Ollama by default starts an API service on local port 11434. This API service is very similar to the OpenAI API, as shown below:

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required by the client, but unused by Ollama
)

response = client.chat.completions.create(
    model="llama3",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The LA Dodgers won in 2020."},
        {"role": "user", "content": "Where was it played?"},
    ],
)
print(response.choices[0].message.content)

As you can see, generally only the base_url needs to be modified.
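
To illustrate this, here is a minimal sketch of what a configurable endpoint could look like; the make_client helper and the ENDPOINTS table are purely illustrative (with placeholder keys) and are not part of MatGPT or LLMs with MATLAB:

from openai import OpenAI

# Hypothetical provider table for illustration; API keys are placeholders.
ENDPOINTS = {
    "openai": ("https://api.openai.com/v1", "sk-..."),   # official API
    "ollama": ("http://localhost:11434/v1", "ollama"),   # local server, key unused
}

def make_client(provider: str) -> OpenAI:
    """Build an OpenAI-compatible client; only base_url and api_key differ."""
    base_url, api_key = ENDPOINTS[provider]
    return OpenAI(base_url=base_url, api_key=api_key)

client = make_client("ollama")
response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)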

Additionally, being able to call Ollama or other locally deployed LLMs might have the following advantages:

  1. The ability to use LLMs fine-tuned for specific tasks such as MATLAB coding, which might yield better performance and lower costs than ChatGPT.
  2. Providing a solution for countries or regions where ChatGPT is not available.

I hope this adds to the discussion and look forward to your thoughts.

toshiakit commented on May 26, 2024

@Mingzefei and @jonasendc

As I said earlier, MatGPT depends on the library called "LLMs with MATLAB", and the endpoint is defined in that library. There is nothing I can do unless the maintainer of LLMs with MATLAB makes the requested change.

Even if that library supports custom endpoints, MatGPT may not support them, because that goes beyond the intended scope of the project.

The endpoint is hard-coded in callOpenAIChatAPI.m:

MatGPT
├── helpers
│   └── llms-with-matlab
│       ├── openAIChat.m
│       └── +llms
│           └── +internal
│               └── callOpenAIChatAPI.m

I suggest that you fork "LLMs with MATLAB" and customize the code there. I don't think you need MatGPT for that.

toshiakit commented on May 26, 2024

@jonasendc You expressed frustration in the other thread about the pace of progress. I am just a contributor to that thread, but my concern about how the custom API endpoint support was implemented is that it is so open-ended and therefore untestable. I would be much more comfortable if it were implemented in a testable way. Hence, I am advocating a more use-case-specific approach.
