Comments (7)
Hello, MatGPT runs on the LLMs with MATLAB library, and unless that library supports custom API endpoints, MatGPT cannot. Please open an issue on that repo.
from matgpt.
Hi @Mingzefei , can you provide more details about your use case?
Hi @toshiakit ,
maybe something like:
import openai

client = openai.AzureOpenAI(
    api_version="2024-03-01-preview",
    azure_endpoint="",  # an Azure endpoint URL, not OpenAI's
    api_key=api_key,
)
Here, azure_endpoint is a URL other than OpenAI's. So somewhere in this app the URL is hardcoded; it should be dynamic.
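To illustrate the request, a dynamic endpoint could be as simple as reading the base URL from the environment instead of hard-coding it. This is only a minimal sketch; the function name resolve_base_url and the OPENAI_BASE_URL variable are hypothetical, not part of MatGPT or LLMs with MATLAB:

```python
import os

def resolve_base_url(default="https://api.openai.com/v1"):
    """Pick the chat-completions endpoint from the environment,
    falling back to the official OpenAI URL when unset."""
    return os.environ.get("OPENAI_BASE_URL", default)
```

A caller would then pass resolve_base_url() as base_url when constructing the client, so Azure or any other provider works without code changes.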
I passed your comment to the maintainer of LLMs with MATLAB.
matlab-deep-learning/llms-with-matlab#14
Hi, sorry for the late reply. @jonasendc has provided a case for AzureOpenAI, and I'd like to add another case about locally deployed LLMs.
For example, projects like Ollama allow for easy local deployment of many open-source LLMs. After a successful deployment, Ollama by default starts an API service on local port 11434. This API service closely mirrors the OpenAI API, as shown below:
from openai import OpenAI

client = OpenAI(
    base_url='http://localhost:11434/v1',
    api_key='ollama',  # required, but unused
)
response = client.chat.completions.create(
    model="llama3",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The LA Dodgers won in 2020."},
        {"role": "user", "content": "Where was it played?"},
    ],
)
print(response.choices[0].message.content)
As you can see, generally only the base_url needs to be modified.
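To make that concrete, here is a sketch that builds (but does not send) the underlying HTTP request, showing that the same /chat/completions route is hit whether base_url points at api.openai.com or a local Ollama server. The helper build_chat_request is hypothetical, written here only to illustrate the wire format:

```python
import json
import urllib.request

def build_chat_request(base_url, model, messages, api_key="ollama"):
    """Construct an OpenAI-style chat-completions request.

    Only base_url differs between the official API and a local
    Ollama deployment; the path, payload, and headers are the same.
    """
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # Ollama ignores the key
        },
        method="POST",
    )
```

Swapping "http://localhost:11434/v1" for "https://api.openai.com/v1" (and a real key) is the only change needed to target OpenAI instead.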
Additionally, being able to call ollama or other locally deployed LLMs might have the following advantages:
- The ability to use LLMs fine-tuned for specific tasks such as MATLAB coding, which might yield better performance and lower costs than ChatGPT.
- Providing a solution for countries or regions where ChatGPT is not available.
I hope this adds to the discussion and look forward to your thoughts.
@Mingzefei and @jonasendc
As I said earlier, MatGPT depends on the library called "LLMs with MATLAB", and the endpoint is defined in that library. There is nothing I can do unless the maintainer of the LLMs with MATLAB makes the requested change.
Even if that library supported custom endpoints, MatGPT might not adopt them, because that goes beyond the intended scope of the project.
The endpoint is hard-coded in callOpenAIChatAPI.m
MatGPT
├── helpers
│   ├── llms-with-matlab
│   │   ├── openAIChat.m
│   │   ├── +llms
│   │   │   ├── +internal
│   │   │   │   ├── callOpenAIChatAPI.m
I suggest that you fork "LLMs with MATLAB" and customize the code there. I don't think you need MatGPT for that.
@jonasendc You expressed frustration in the other thread about the pace of progress. I am just a contributor to that thread, but my concern about the requested custom API endpoint support is that it is so open-ended that it is untestable. I would be much more comfortable if it were implemented in a testable way. Hence, I am advocating a more use-case-specific approach.