
chatbot-ollama's People

Contributors

coolaj86, davlgd, ivanfioravanti, mskec, ryansereno, willyw0nka


chatbot-ollama's Issues

Can't choose a custom model when creating a new conversation

I changed the .env.local file:

DEFAULT_MODEL="Mistral-7B:latest"
NEXT_PUBLIC_DEFAULT_SYSTEM_PROMPT="You are a helpful, respectful and honest assistant. Help humans as much as you can."

and ran the app with:

OLLAMA_HOST="http://0.0.0.0:11434" DEFAULT_MODEL='Mistral-7B:latest'  HOST=0.0.0.0 PORT=80 npm run dev

When creating a new conversation, there is no option to choose the custom model, and I get this error:

[OllamaError: model 'mistral:latest' not found, try pulling it first] {
  name: 'OllamaError'
}

How can I solve this?
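
For what it's worth, the model name has to match what the Ollama server reports exactly, tag and capitalization included. A quick way to see what the server actually exposes is to query its tags endpoint; a minimal TypeScript sketch, assuming Ollama's standard GET /api/tags API:

// List the model names the Ollama server knows about, so DEFAULT_MODEL can
// be checked against them exactly (names are case-sensitive, tag included).
const OLLAMA_HOST = process.env.OLLAMA_HOST ?? 'http://127.0.0.1:11434';

async function listModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_HOST}/api/tags`);
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const body = (await res.json()) as { models: { name: string }[] };
  return body.models.map((m) => m.name);
}

listModels().then((names) => console.log(names)); // e.g. [ 'mistral-7b:latest' ]

If the names printed here differ from the DEFAULT_MODEL value (for example mistral-7b:latest vs. Mistral-7B:latest), the lookup will fail, which would explain the app falling back to mistral:latest.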


Timeout exception: Error: failed to pipe response

A few seconds after I request a response from ollama, it fails.
The error log:

⨯ Error: failed to pipe response
    at pipeToNodeResponse (/app/node_modules/next/dist/server/pipe-readable.js:111:15)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async NextNodeServer.runEdgeFunction (/app/node_modules/next/dist/server/next-server.js:1225:13)
    at async NextNodeServer.handleCatchallRenderRequest (/app/node_modules/next/dist/server/next-server.js:247:37)
    at async NextNodeServer.handleRequestImpl (/app/node_modules/next/dist/server/base-server.js:807:17)
    at async invokeRender (/app/node_modules/next/dist/server/lib/router-server.js:163:21)
    at async handleRequest (/app/node_modules/next/dist/server/lib/router-server.js:342:24)
    at async requestHandlerImpl (/app/node_modules/next/dist/server/lib/router-server.js:366:13)
    at async Server.requestListener (/app/node_modules/next/dist/server/lib/start-server.js:140:13) {
  [cause]: SyntaxError: Expected ',' or ']' after array element in JSON at position 1454
      at JSON.parse (<anonymous>)
      at Object.start (/app/.next/server/pages/api/chat.js:1:822)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
}
Error: aborted
    at connResetException (node:internal/errors:787:14)
    at abortIncoming (node:_http_server:793:17)
    at socketOnClose (node:_http_server:787:3)
    at Socket.emit (node:events:530:35)
    at TCP.<anonymous> (node:net:337:12) {
  code: 'ECONNRESET'
}

In the ollama console, everything looks fine.
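
For context, the [cause] line points at JSON.parse inside the streaming handler. Ollama streams newline-delimited JSON, so a network chunk that arrives split mid-object will fail to parse with exactly this kind of SyntaxError. A minimal sketch of line-buffered parsing, assuming Ollama's NDJSON stream format (the function and callback names are illustrative, not the project's actual code):

// Buffer partial chunks and only JSON.parse complete lines.
async function readOllamaStream(
  stream: ReadableStream<Uint8Array>,
  onToken: (text: string) => void,
) {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffer = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    let newline: number;
    while ((newline = buffer.indexOf('\n')) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (!line) continue;
      // Safe to parse: the line is a complete JSON object.
      const parsed = JSON.parse(line) as { response?: string; done?: boolean };
      if (parsed.response) onToken(parsed.response);
    }
  }
}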

Unable to change model

I used ollama pull to download new models. I can see them in the dropdown menu, but they won't generate anything. (mistral works fine, btw.)

Can't connect to the ollama server: ECONNREFUSED

I set the OLLAMA_HOST like so:

docker run -d -p 3000:3000 -e OLLAMA_HOST="http://127.0.0.1:11434/" --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main

and the app can't connect to the server. docker logs chatbot-ollama reads:

> [email protected] start
> next start

  ▲ Next.js 13.5.4
  - Local:        http://localhost:3000

 ✓ Ready in 236ms
 [TypeError: fetch failed] {
  cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
    errno: -111,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '127.0.0.1',
    port: 11434
  }
}
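
Worth noting: inside a container, 127.0.0.1 refers to the container itself, not the host where Ollama listens, which would explain the refused connection. A hedged TypeScript probe to check which address the app can actually reach (the candidate list is illustrative; host.docker.internal is Docker Desktop's alias for the host):

// Probe candidate Ollama addresses from the app's network namespace.
const candidates = [
  'http://127.0.0.1:11434',
  'http://host.docker.internal:11434',
];

async function probe() {
  for (const base of candidates) {
    try {
      const res = await fetch(`${base}/api/tags`);
      console.log(base, '->', res.status);
    } catch (err) {
      console.log(base, '-> unreachable:', (err as Error).message);
    }
  }
}

probe();

On Linux, --network=host (or the host's LAN IP) is the usual alternative, since host.docker.internal is not defined there by default.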

Prompt variable documentation?

When creating a new prompt, it shows a little bit of guidance on how to use variables using double curly braces.

Is there any documentation anywhere (even in the code) that expands upon this?
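
For reference, the double-curly syntax is ordinarily plain placeholder substitution. A small sketch of how {{variable}} extraction and filling typically works; the regex and helper names here are illustrative, not necessarily what the project implements:

// Find {{name}} placeholders in a prompt template and substitute values.
const VARIABLE_RE = /{{\s*([^{}]+?)\s*}}/g;

function extractVariables(template: string): string[] {
  return [...template.matchAll(VARIABLE_RE)].map((m) => m[1]);
}

function fillTemplate(template: string, values: Record<string, string>): string {
  return template.replace(VARIABLE_RE, (_match, name: string) => values[name] ?? '');
}

// extractVariables('Translate {{text}} into {{language}}') -> ['text', 'language']
// fillTemplate('Translate {{text}} into {{language}}', { text: 'hola', language: 'English' })
//   -> 'Translate hola into English'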

Running the ollama local model on a MacBook failed

[TypeError: fetch failed] {
  cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
    errno: -111,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '127.0.0.1',
    port: 11434
  }
}

CPU pegged to 100%

Running the container is completely fine, but as soon as I open localhost:3000 in any graphical browser, the tab pegs one CPU core to 100% and refuses to let go. I cannot scroll up with the scroll wheel; the site immediately scrolls back to the bottom (dragging the scrollbar at the side by hand works).
It feels like the site doesn't sync to vblank and just tries to render as many frames as the CPU allows. Either way, no sleep of any kind seems to happen, making the project almost unusable.

Unfortunately I'm not a web dev, so as much as I wish I could help, I can't really.

Maybe the attached profiler call stack helps some; probably not, given the obfuscation.
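
For anyone digging in: this behavior is consistent with an unconditional scroll-to-bottom effect that re-fires on every render and fights the user's scroll. A hedged sketch of the usual guard, in the project's React/TypeScript idiom (hook name and threshold are illustrative):

// Only auto-scroll when the user is already near the bottom of the chat.
import { useEffect, useRef } from 'react';

function useAutoScroll<T extends HTMLElement>(latestMessage: unknown) {
  const ref = useRef<T>(null);
  useEffect(() => {
    const el = ref.current;
    if (!el) return;
    const nearBottom = el.scrollHeight - el.scrollTop - el.clientHeight < 40;
    if (nearBottom) el.scrollTop = el.scrollHeight; // leave the user alone otherwise
  }, [latestMessage]);
  return ref;
}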

Sometimes "Unexpected non-whitespace character" ruins all the fun

SyntaxError: Unexpected non-whitespace character after JSON at position 104 (line 2 column 1)
at JSON.parse (<anonymous>)
at Object.start (webpack-internal:///(middleware)/./utils/server/index.ts:46:45)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)

I think this error is similar to one the GPT API has too...

Deprecated deps

During an npm install I get 3 deprecation notices:

npm WARN deprecated [email protected]: Use your platform's native atob() and btoa() methods instead
npm WARN deprecated @vitest/[email protected]: v8 coverage is moved to @vitest/coverage-v8 package
npm WARN deprecated [email protected]: Use your platform's native DOMException instead

no matching manifest for windows/amd64 10.0.22631

Hi guys, when I use the docker command docker run -p 3000:3000 ghcr.io/ivanfioravanti/chatbot-ollama:main to pull the image, it shows the error no matching manifest for windows/amd64 10.0.22631 in the manifest list entries. What should I do? Thank you!


Streaming text?

Hi,

I noticed that there is no option to enable streaming. Is it planned to add the feature?

It would be much appreciated.
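
For context, reading a streamed response incrementally in the browser is straightforward with the Fetch API; a hedged sketch (the /api/chat endpoint shape and plain-text chunks are assumptions about this app, not its confirmed behavior):

// Render a streamed chat response as it arrives instead of all at once.
async function streamChat(prompt: string, onUpdate: (text: string) => void) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
    onUpdate(text); // e.g. setState in a React component
  }
}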

support for "custom"/alternative endpoints

The following only partially works:
podman run -it --rm -e OLLAMA_HOST=http://somewhere.com -p 3000:3000 -e DEFAULT_MODEL=mistral:7b-instruct ghcr.io/ivanfioravanti/chatbot-ollama:main

The UI acts as if it will honor the env var, but then falls back to mistral:latest and blows up with errors:

 [OllamaError: model 'mistral:latest' not found, try pulling it first] {
  name: 'OllamaError'
}


New Feature Request: Support multiple file types for importing data

Currently:

I noticed that importing data accepts only JSON files at the moment, as seen in Import.tsx.

My Thoughts For Scaling:

For scaling this feature, we could go further and support multiple types of files to import, such as:

  • PDFs
  • Images (png, svg, jpeg, etc.)
  • Language files (.html, .js, .ts, .py, .cpp, etc.)

We can update this list as needed, but these were what I had in mind at the moment; a rough sketch of extension-based dispatch follows below.
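
If this were extended, the import path could dispatch on the file extension. A hedged sketch (Import.tsx today accepts only .json; the parser table below is illustrative):

// Route an imported File to a parser based on its extension.
type Importer = (file: File) => Promise<string>;

const importers: Record<string, Importer> = {
  json: async (f) => {
    const text = await f.text();
    JSON.parse(text); // validate, as the current JSON-only importer effectively does
    return text;
  },
  txt: async (f) => f.text(),
  md: async (f) => f.text(),
  // PDFs and images would need real parsers (e.g. pdf.js, OCR) -- out of scope here.
};

async function importFile(file: File): Promise<string> {
  const ext = file.name.split('.').pop()?.toLowerCase() ?? '';
  const importer = importers[ext];
  if (!importer) throw new Error(`Unsupported file type: .${ext}`);
  return importer(file);
}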

Docker run instruction is not working

The docker run instruction documented for running the pre-built container is not working.

Steps to reproduce

$ docker run -e -p 3000:3000 ghcr.io/ivanfioravanti/chatbot-ollama:main

Unable to find image '3000:3000' locally

Workaround

Drop the stray -e flag; it has no value, so Docker consumes the following -p as its argument and then parses 3000:3000 as the image name:

$ docker run -p 3000:3000 ghcr.io/ivanfioravanti/chatbot-ollama:main

Unable to connect to model when ollama is hosted on a different server

I am hosting ollama remotely on one server and trying to deploy chatbot-ollama on a different server. Accordingly, I added the OLLAMA_HOST env variable pointing to the address of my ollama server.

When I start chatbot-ollama and input something in the GUI, the logs say:
[OllamaError: stat /root/.ollama/models/manifests/registry.ollama.ai/library/llama2/latest: no such file or directory] { name: 'OllamaError' }

Also, weirdly, I set the DEFAULT_MODEL env variable to llama-2-70b-chat-nous-hermes in my Dockerfile, but it is searching for the llama2 model instead. The llama-2-70b-chat-nous-hermes model is up and running on the ollama server, btw (created from a Modelfile).

How do I fix this?

PDF file support

Right now the only file type I can import seems to be JSON. I wanted something that could help me chat with my PDFs using ollama with a beautiful UI. Hoping you can add that feature.

`punycode` module is deprecated

This is the log after installing with docker (command from the README).

> [email protected] start
> next start

  ▲ Next.js 13.5.4
  - Local:        http://localhost:3000

 ✓ Ready in 455ms
(node:24) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
 [TypeError: fetch failed] {
  cause: [Error: getaddrinfo ENOTFOUND host.docker.internal] {
    errno: -3008,
    code: 'ENOTFOUND',
    syscall: 'getaddrinfo',
    hostname: 'host.docker.internal'
  }
}

Error: connect ECONNREFUSED 127.0.0.1:11434

Hello everyone!
My ollama runs in Docker. The command I use to start it is:

docker run -e OLLAMA_HOST=0.0.0.0:11434 -d -v ollama serve -p 11434:11434 --name ollama ollama/ollama

Then I open chatbot-ollama in VS Code, run npm run dev, and it reports an error.

Here is the error log:

PS G:\AI\chatbot-ollama> npm run dev

[email protected] dev
next dev

▲ Next.js 13.5.6

✓ Ready in 2.9s
○ Compiling / ...
✓ Compiled / in 3.3s (1652 modules)
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
✓ Compiled in 1699ms (1652 modules)
✓ Compiled in 519ms (1652 modules)
✓ Compiled /api/models in 245ms (68 modules)
[TypeError: fetch failed] {
cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
errno: -4078,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '127.0.0.1',
port: 11434
}
}
✓ Compiled in 620ms (1720 modules)
(The same [TypeError: fetch failed] / ECONNREFUSED block repeats three more times.)

Build failed

I tried to build the chat app; the build produced these warnings and then failed to compile:

216:6  Warning: React Hook useEffect has a missing dependency: 'dispatch'. Either include it or remove the dependency array.  react-hooks/exhaustive-deps
221:6  Warning: React Hook useEffect has a missing dependency: 'dispatch'. Either include it or remove the dependency array.  react-hooks/exhaustive-deps
231:6  Warning: React Hook useEffect has a missing dependency: 'defaultModelId'. Either include it or remove the dependency array.  react-hooks/exhaustive-deps
305:6  Warning: React Hook useEffect has missing dependencies: 'conversations' and 't'. Either include them or remove the dependency array.  react-hooks/exhaustive-deps

./components/Chat/Chat.tsx
232:5  Warning: React Hook useCallback has a missing dependency: 'homeDispatch'. Either include it or remove the dependency array.  react-hooks/exhaustive-deps

./components/Chat/ChatInput.tsx
222:6  Warning: React Hook useEffect has a missing dependency: 'textareaRef'. Either include it or remove the dependency array.  react-hooks/exhaustive-deps

./components/Chatbar/Chatbar.tsx
158:6  Warning: React Hook useEffect has a missing dependency: 'chatDispatch'. Either include it or remove the dependency array.  react-hooks/exhaustive-deps

./components/Promptbar/Promptbar.tsx
117:6  Warning: React Hook useEffect has a missing dependency: 'promptDispatch'. Either include it or remove the dependency array.  react-hooks/exhaustive-deps

info  - Need to disable some ESLint rules? Learn more here: https://nextjs.org/docs/basic-features/eslint#disabling-rules
info  - Linting and checking validity of types ...
Failed to compile.

./pages/_app.tsx:26:10
Type error: 'Component' cannot be used as a JSX component.
  Its element type 'Component<any, any, any> | ReactElement<any, any> | null' is not a valid JSX element.
    Type 'Component<any, any, any>' is not assignable to type 'Element | ElementClass | null'.
      Type 'Component<any, any, any>' is not assignable to type 'ElementClass'.
        The types returned by 'render()' are incompatible between these types.
          Type 'React.ReactNode' is not assignable to type 'import("/var/www/chat.moving-bytes.at/node_modules/@types/react-dom/node_modules/@types/react/ts5.0/index").ReactNode'.
            Type 'ReactElement<any, string | JSXElementConstructor<any>>' is not assignable to type 'ReactNode'.
              Property 'children' is missing in type 'ReactElement<any, string | JSXElementConstructor<any>>' but required in type 'ReactPortal'.

  24 |       />
  25 |       <QueryClientProvider client={queryClient}>
> 26 |         <Component {...pageProps} />
     |          ^
  27 |       </QueryClientProvider>
  28 |     </div>
  29 |   );
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.

UI flickering and scroll is broken

Issues:

  1. Word streaming is not working anymore; the whole response is shown at once at the end
  2. The UI keeps flickering until the response is streamed
  3. After the response is received and rendered on screen, scrolling is not smooth and jumps from one position to another.

Recording:

2024-04-03.10-17-39.mp4

System:

Running via docker, node 20

Multi modal support

LLaVA was recently released, and progress is being made upstream to support it.
Ollama has yet to begin discussions about this (at least that I can find), and I'm going to explore what it will take on that end too.
But I think it would be advantageous to at least start thinking about how you might want to implement this; I'd be happy to help.

Of course it makes sense to wait on updates from ollama first, but I just wanted to put it on the radar.
I think we could really put local LMMs on the map if we got this implemented :)

Doesn't work if internet is turned off.

When the internet is turned off, it throws the error below:

⨯ Error: failed to pipe response
    at pipeToNodeResponse (/app/node_modules/next/dist/server/pipe-readable.js:111:15)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async NextNodeServer.runEdgeFunction (/app/node_modules/next/dist/server/next-server.js:1225:13)
    at async NextNodeServer.handleCatchallRenderRequest (/app/node_modules/next/dist/server/next-server.js:247:37)
    at async NextNodeServer.handleRequestImpl (/app/node_modules/next/dist/server/base-server.js:807:17)
    at async invokeRender (/app/node_modules/next/dist/server/lib/router-server.js:163:21)
    at async handleRequest (/app/node_modules/next/dist/server/lib/router-server.js:342:24)
    at async requestHandlerImpl (/app/node_modules/next/dist/server/lib/router-server.js:366:13)
    at async Server.requestListener (/app/node_modules/next/dist/server/lib/start-server.js:140:13) {
  [cause]: SyntaxError: Unexpected non-whitespace character after JSON at position 101
      at JSON.parse (<anonymous>)
      at Object.start (/app/.next/server/pages/api/chat.js:1:822)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
}
Error: aborted
    at connResetException (node:internal/errors:787:14)
    at abortIncoming (node:_http_server:793:17)
    at socketOnClose (node:_http_server:787:3)
    at Socket.emit (node:events:530:35)
    at TCP.<anonymous> (node:net:337:12) {
  code: 'ECONNRESET'
}

(The same "failed to pipe response" / ECONNRESET pair repeats twice more, with the JSON error at position 99.)

Feature Request : Compile LaTeX code

Hi, it looks like chatbot-ui has LaTeX support but chatbot-ollama seems to have it disabled/deleted.
Is it intentional? It's my main motivation for quitting the ollama CLI 😄
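
For reference, messages in chatbot-ui-style apps are rendered with react-markdown, and LaTeX there is usually wired up through the remark-math and rehype-katex plugins. A hedged sketch of what re-enabling it might look like (whether chatbot-ollama's markdown component matches this is an assumption):

// Render chat markdown with LaTeX support via remark-math + rehype-katex.
import ReactMarkdown from 'react-markdown';
import remarkMath from 'remark-math';
import rehypeKatex from 'rehype-katex';
import 'katex/dist/katex.min.css';

export function MessageMarkdown({ content }: { content: string }) {
  return (
    <ReactMarkdown remarkPlugins={[remarkMath]} rehypePlugins={[rehypeKatex]}>
      {content}
    </ReactMarkdown>
  );
}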

How to set the port

I use the script below to run chatbot-ollama. The default port is 3000; how do I change it to 80?

OLLAMA_HOST="http://127.0.0.1:5050"  npm run dev

Ollama API returned an error 404: 404 page not found

ollama is working, but chatbot-ollama reports a 404 error

docker run --rm -it -e OLLAMA_HOST="http://127.0.0.1:11434/" --net=host -e DEFAULT_MODEL=llama2 --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main


An error happened when I entered my input after "ollama serve" and "npm run dev"

The error information is:
dell@dell:/mnt/data/chatbot-ollama$ npm run dev

[email protected] dev
next dev

▲ Next.js 14.1.0

✓ Ready in 5.1s
✓ Compiled /api/models in 202ms (77 modules)
○ Compiling /_error ...
✓ Compiled /_error in 3.9s (304 modules)
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
○ Compiling / ...
✓ Compiled / in 980ms (1659 modules)
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
(node:1207575) [DEP0040] DeprecationWarning: The punycode module is deprecated. Please use a userland alternative instead.
(Use node --trace-deprecation ... to show where the warning was created)
✓ Compiled /api/chat in 103ms (80 modules)
Error: aborted
at abortIncoming (node:_http_server:806:17)
at socketOnClose (node:_http_server:800:3)
at Socket.emit (node:events:532:35)
at TCP.<anonymous> (node:net:339:12) {
code: 'ECONNRESET'
}
⨯ uncaughtException: Error: aborted
at abortIncoming (node:_http_server:806:17)
at socketOnClose (node:_http_server:800:3)
at Socket.emit (node:events:532:35)
at TCP.<anonymous> (node:net:339:12) {
code: 'ECONNRESET'
}
⨯ uncaughtException: Error: aborted
at abortIncoming (node:_http_server:806:17)
at socketOnClose (node:_http_server:800:3)
at Socket.emit (node:events:532:35)
at TCP.<anonymous> (node:net:339:12) {
code: 'ECONNRESET'
}
⨯ Error: failed to pipe response
at pipeToNodeResponse (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/pipe-readable.js:111:15)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async DevServer.runEdgeFunction (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/next-server.js:1225:13)
at async NextNodeServer.handleCatchallRenderRequest (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/next-server.js:247:37)
at async DevServer.handleRequestImpl (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/base-server.js:807:17)
at async /mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/dev/next-dev-server.js:331:20
at async Span.traceAsyncFn (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/trace/trace.js:151:20)
at async DevServer.handleRequest (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/dev/next-dev-server.js:328:24)
at async invokeRender (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/router-server.js:163:21)
at async handleRequest (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/router-server.js:342:24)
at async requestHandlerImpl (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/router-server.js:366:13)
at async Server.requestListener (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/start-server.js:140:13) {
[cause]: SyntaxError: Unexpected non-whitespace character after JSON at position 97 (line 2 column 1)
at JSON.parse (<anonymous>)
at Object.start (webpack-internal:///(middleware)/./utils/server/index.ts:46:45)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
}
(A screenshot of the chatbot UI was attached.)

Can you help me solve this problem? I would really appreciate it.

New Feature Request: description of each model

I have a handful of models I use and test all the time. However, it is difficult to keep track of which model is for which purpose. Adding even a one-line description to each model, shown when selecting a new chat and model, would help a ton.

For example:

Mistral 7B: Small, yet very powerful, for a variety of use cases. English and code, 8k context window.
Mixtral 8x7B: A sparse Mixture-of-Experts (SMoE). Fluent in English, French, Italian, German, and Spanish, and strong in code; 32k context window.
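
One hedged way this could be sketched: attach a description to the model metadata and surface it in the picker. The field and type names below are illustrative, not the project's actual types:

// Extend the model info with a one-line description for the model dropdown.
interface ModelInfo {
  id: string;           // e.g. 'mistral:7b'
  name: string;
  description?: string; // shown next to the name when starting a new chat
}

const MODEL_NOTES: Record<string, string> = {
  'mistral:7b': 'Small, yet very powerful. English and code, 8k context window.',
  'mixtral:8x7b': 'Sparse Mixture-of-Experts; EN/FR/IT/DE/ES and code, 32k context window.',
};

function withDescriptions(models: ModelInfo[]): ModelInfo[] {
  return models.map((m) => ({ ...m, description: MODEL_NOTES[m.id] }));
}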

How can I package it into static files?

Hello, I am a beginner who is not familiar with front-end development. I have an nginx service. Could you please tell me how to deploy the chatbot-ollama project on a server that is completely offline? Thank you very much for your reply!
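
Hedged answer: Next.js can emit static files with output: 'export', but this app's API routes (pages/api/*) are server code and are dropped from a static export, so the browser would have to call Ollama directly. The config change itself, assuming a recent Next.js, would look like:

// next.config.js -- 'output: export' makes `next build` emit static files
// into ./out, which nginx can serve. pages/api/* will NOT work this way.
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: 'export',
};

module.exports = nextConfig;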

Chatbar.tsx:153 Warning: Maximum update depth exceeded

Chatbar.tsx:153 Warning: Maximum update depth exceeded. This can happen when a component calls setState inside useEffect, but useEffect either doesn't have a dependency array, or one of the dependencies changes on every render.
at Chatbar (webpack-internal:///./components/Chatbar/Chatbar.tsx:44:79)
at div
at main
at Home (webpack-internal:///./pages/api/home/home.tsx:54:11)
at QueryClientProvider (webpack-internal:///./node_modules/react-query/es/react/QueryClientProvider.js:39:21)
at div
at App (webpack-internal:///./pages/_app.tsx:18:11)
at I18nextProvider (webpack-internal:///./node_modules/react-i18next/dist/es/I18nextProvider.js:10:19)
at AppWithTranslation (webpack-internal:///./node_modules/next-i18next/dist/esm/appWithTranslation.js:34:22)
at PathnameContextProviderAdapter (webpack-internal:///./node_modules/next/dist/shared/lib/router/adapters.js:80:11)
at ErrorBoundary (webpack-internal:///./node_modules/next/dist/compiled/@next/react-dev-overlay/dist/client.js:26:6864)
at ReactDevOverlay (webpack-internal:///./node_modules/next/dist/compiled/@next/react-dev-overlay/dist/client.js:26:9247)
at Container (webpack-internal:///./node_modules/next/dist/client/index.js:80:1)
at AppContainer (webpack-internal:///./node_modules/next/dist/client/index.js:213:11)
at Root (webpack-internal:///./node_modules/next/dist/client/index.js:437:11)
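
This class of warning usually means a useEffect in Chatbar.tsx sets state while depending on a value that is recreated on every render. A hedged sketch of the standard fix, with illustrative names (memoize the derived value so the effect only fires when its inputs actually change):

import { useEffect, useMemo, useState } from 'react';

function useFilteredConversations(conversations: { name: string }[], searchTerm: string) {
  const [filtered, setFiltered] = useState(conversations);
  // Without useMemo, `next` would be a new array every render: the effect
  // re-runs, setFiltered triggers another render, and the loop never settles.
  const next = useMemo(
    () => conversations.filter((c) => c.name.toLowerCase().includes(searchTerm.toLowerCase())),
    [conversations, searchTerm],
  );
  useEffect(() => setFiltered(next), [next]);
  return filtered;
}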
