ollama / ollama

Get up and running with Llama 3, Mistral, Gemma, and other large language models.

Home Page: https://ollama.com

License: MIT License

Languages: TypeScript 2.56%, JavaScript 0.04%, CSS 0.07%, HTML 0.02%, Dockerfile 0.89%, Go 84.40%, Shell 5.45%, Python 0.36%, C 2.66%, PowerShell 2.64%, Inno Setup 0.84%, Objective-C 0.06%
Topics: llama, llm, llama2, llms, go, golang, ollama, mistral

ollama's Introduction

Ollama

Get up and running with large language models locally.

macOS

Download

Windows preview

Download

Linux

curl -fsSL https://ollama.com/install.sh | sh

Manual install instructions

Docker

The official Ollama Docker image ollama/ollama is available on Docker Hub.
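For example, a typical invocation (a minimal sketch; the volume and container names here are arbitrary choices) persists downloaded models in a named volume and exposes the default API port:

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

Models can then be run inside the container, e.g. docker exec -it ollama ollama run llama3.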

Libraries

Quickstart

To run and chat with Llama 3:

ollama run llama3

Model library

Ollama supports a library of models available at ollama.com/library

Here are some example models that can be downloaded:

| Model              | Parameters | Size  | Download                     |
| ------------------ | ---------- | ----- | ---------------------------- |
| Llama 3            | 8B         | 4.7GB | ollama run llama3            |
| Llama 3            | 70B        | 40GB  | ollama run llama3:70b        |
| Phi-3              | 3.8B       | 2.3GB | ollama run phi3              |
| Mistral            | 7B         | 4.1GB | ollama run mistral           |
| Neural Chat        | 7B         | 4.1GB | ollama run neural-chat       |
| Starling           | 7B         | 4.1GB | ollama run starling-lm       |
| Code Llama         | 7B         | 3.8GB | ollama run codellama         |
| Llama 2 Uncensored | 7B         | 3.8GB | ollama run llama2-uncensored |
| LLaVA              | 7B         | 4.5GB | ollama run llava             |
| Gemma              | 2B         | 1.4GB | ollama run gemma:2b          |
| Gemma              | 7B         | 4.8GB | ollama run gemma:7b          |
| Solar              | 10.7B      | 6.1GB | ollama run solar             |

Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.

Customize a model

Import from GGUF

Ollama supports importing GGUF models in the Modelfile:

  1. Create a file named Modelfile, with a FROM instruction pointing to the local file path of the model you want to import.

    FROM ./vicuna-33b.Q4_0.gguf
    
  2. Create the model in Ollama

    ollama create example -f Modelfile
    
  3. Run the model

    ollama run example
    

Import from PyTorch or Safetensors

See the guide on importing models for more information.

Customize a prompt

Models from the Ollama library can be customized with a prompt. For example, to customize the llama3 model:

ollama pull llama3

Create a Modelfile:

FROM llama3

# set the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1

# set the system message
SYSTEM """
You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
"""

Next, create and run the model:

ollama create mario -f ./Modelfile
ollama run mario
>>> hi
Hello! It's your friend Mario.

For more examples, see the examples directory. For more information on working with a Modelfile, see the Modelfile documentation.

CLI Reference

Create a model

ollama create is used to create a model from a Modelfile.

ollama create mymodel -f ./Modelfile

Pull a model

ollama pull llama3

This command can also be used to update a local model. Only the diff will be pulled.

Remove a model

ollama rm llama3

Copy a model

ollama cp llama3 my-model

Multiline input

For multiline input, you can wrap text with """:

>>> """Hello,
... world!
... """
I'm a basic program that prints the famous "Hello, world!" message to the console.

Multimodal models

>>> What's in this image? /Users/jmorgan/Desktop/smile.png
The image features a yellow smiley face, which is likely the central focus of the picture.
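Images can also be passed to multimodal models over the REST API described below; a sketch assuming the generate endpoint's images field, which takes base64-encoded image data:

curl http://localhost:11434/api/generate -d '{
  "model": "llava",
  "prompt": "What is in this picture?",
  "images": ["<base64-encoded image data>"]
}'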

Pass the prompt as an argument

$ ollama run llama3 "Summarize this file: $(cat README.md)"
 Ollama is a lightweight, extensible framework for building and running language models on the local machine. It provides a simple API for creating, running, and managing models, as well as a library of pre-built models that can be easily used in a variety of applications.

List models on your computer

ollama list

Start Ollama

ollama serve is used when you want to start ollama without running the desktop application.
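For example (a minimal sketch; OLLAMA_HOST is the environment variable Ollama reads for its bind address, shown here only for a non-default setup):

# start the server on the default address (127.0.0.1:11434)
ollama serve

# or bind to all interfaces instead
OLLAMA_HOST=0.0.0.0:11434 ollama serve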

Building

Install cmake and go:

brew install cmake go

Then generate dependencies:

go generate ./...

Then build the binary:

go build .

More detailed instructions can be found in the developer guide.

Running local builds

Next, start the server:

./ollama serve

Finally, in a separate shell, run a model:

./ollama run llama3

REST API

Ollama has a REST API for running and managing models.

Generate a response

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt":"Why is the sky blue?"
}'
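By default the response is streamed back as a series of JSON objects. If a single reply is preferable, the API also accepts a stream flag (a minimal sketch, following the documented behavior of the generate endpoint):

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'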

Chat with a model

curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" }
  ]
}'
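The chat endpoint is stateless, so conversation context is carried by resending prior turns in the messages array. A sketch (the assistant message here is an abbreviated, illustrative reply, not real model output):

curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" },
    { "role": "assistant", "content": "Shorter blue wavelengths scatter more in the atmosphere." },
    { "role": "user", "content": "how does that change at sunset?" }
  ]
}'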

See the API documentation for all endpoints.

Community Integrations

Web & Desktop

Terminal

Database

Package managers

Libraries

Mobile

Extensions & Plugins

Supported backends

  • llama.cpp project founded by Georgi Gerganov.

ollama's People

Contributors

bmizerany, brucemacd, brycereitano, cmiller01, danemadsen, dansreis, deichbewohner, dhiltgen, eliben, eltociear, fpreiss, ggozad, hoyyeva, isbkch, jamesbraza, jmorganca, mchiang0610, mofanke, mraiser, mxyng, pdevine, pepperoni21, remy415, ruecat, sqs, technovangelist, tjbck, tusharhero, vinjn, xyproto


ollama's Issues

blinking cursor is ambiguous

When I ask a question, I just see a blinking cursor. Is the model loading? Is it thinking? Is something else going on? It would be nice to see some sort of status showing what it is doing. Or do I need to kill the app?

Feedback: path not added

It looks like %localappdata%\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\Scripts doesn't get added to the PATH with pip install ollama. The last pip install step could print a note about adding this directory to the PATH.

ollama run without model error should be caught

PS C:\Users\mail> ollama run
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\mail\AppData\Local\Programs\Python\Python311\Scripts\ollama.exe\__main__.py", line 4, in <module>
  File "C:\Users\mail\AppData\Local\Programs\Python\Python311\Lib\site-packages\ollama\cmd\cli.py", line 8, in <module>
    from ollama.cmd import server
  File "C:\Users\mail\AppData\Local\Programs\Python\Python311\Lib\site-packages\ollama\cmd\server.py", line 1, in <module>
    from aiohttp import web
ModuleNotFoundError: No module named 'aiohttp'
PS C:\Users\mail>

memory

When I ask a few questions in a conversation, there is no context between questions. So I have to include all the details in every prompt manually. I asked a question about Bainbridge Island, but after the response, it forgot and couldn't refer to it later in the conversation:

[screenshot of the conversation omitted]

generate pauses after about 50 tokens

Generation pauses after roughly 50 tokens have been produced:

% ollama run orca
>>> Write a review of the restaurant "five guys"
 As an AI assistant, I cannot write a biased or subjective review, but I can provide you with some general information about the restaurant "Five Guys". Five Guys is an American fast-food chain that primarily serves hamburgers, fries, <pause here>

layer pulling issue when connection drops and comes back

I ran ollama run nous-hermes and it started downloading the model. I quit when I saw I was on Wi-Fi, connected over wired, and tried again. It hung for a minute, and then when I asked a question it errored.

❯ ./ollama run library/nous-hermes:latest                                                                                                                                                                                                             (base)
pulling manifest
pulling d1735b93e1dc503f...   4% |███████                                                                                                                                                                             | (289 MB/6.8 GB, 13 MB/s) [20s:8m36s]^C⏎

 main                                                                                                                                                                                                                                                  26s
❯ ./ollama run library/nous-hermes:latest                                                                                                                                                                                                             (base)
pulling manifest

>>> Where is justin bieber form
⠋   Error: 400 Bad Request: couldn't open file '/Users/matt/.ollama/models/manifests/library/nous-hermes:latest'

So I tried again and it started downloading

This is similar to #61. I no longer see the error, but I kind of wish there were an error message.

Start/stop tokens seem to bug out sometimes in long-winded sessions

stuff like:

>>> ... user prompt ...
...some response here...
<<SYS>>

You are an expert at summarizing text documents step by step and preserving
information. Between each of our interactions, summarize my message in a bullet
point summary, including all previously summarized information.

<</SYS>>

>>> ...

I've also seen it have conversations back and forth with itself, but that might have been my fault due to mucking with instruction formats.

model selection

I want to change the model. The only way to do it now is to kill it and restart. Is there another way?

Crashed on M2 Air 8GB

llama.cpp: loading model from /Users/sasank/.ollama/models/blobs/sha256:8daa9615cce30c259a9555b1cc250d461d1bc69980a274b44d7eda0be78076d8
llama_model_load_internal: format     = ggjt v3 (latest)
llama_model_load_internal: n_vocab    = 32000
llama_model_load_internal: n_ctx      = 2048
llama_model_load_internal: n_embd     = 4096
llama_model_load_internal: n_mult     = 256
llama_model_load_internal: n_head     = 32
llama_model_load_internal: n_layer    = 32
llama_model_load_internal: n_rot      = 128
llama_model_load_internal: ftype      = 2 (mostly Q4_0)
llama_model_load_internal: n_ff       = 11008
llama_model_load_internal: model size = 7B
llama_model_load_internal: ggml ctx size =    0.08 MB
llama_model_load_internal: mem required  = 5407.72 MB (+ 1026.00 MB per state)
llama_new_context_with_model: kv self size  = 1024.00 MB
ggml_metal_init: allocating
ggml_metal_init: using MPS
ggml_metal_init: loading '/Users/sasank/code/llama/ollama/ggml-metal.metal'
ggml_metal_init: loaded kernel_add                            0x12aa075a0
ggml_metal_init: loaded kernel_mul                            0x12ab05ee0
ggml_metal_init: loaded kernel_mul_row                        0x12ab06530
ggml_metal_init: loaded kernel_scale                          0x12aa07de0
ggml_metal_init: loaded kernel_silu                           0x12aa08300
ggml_metal_init: loaded kernel_relu                           0x12ab06930
ggml_metal_init: loaded kernel_gelu                           0x12ab06e50
ggml_metal_init: loaded kernel_soft_max                       0x12ab076b0
ggml_metal_init: loaded kernel_diag_mask_inf                  0x12ab07d30
ggml_metal_init: loaded kernel_get_rows_f16                   0x12aa089e0
ggml_metal_init: loaded kernel_get_rows_q4_0                  0x12aa091a0
ggml_metal_init: loaded kernel_get_rows_q4_1                  0x12aa09b30
ggml_metal_init: loaded kernel_get_rows_q2_K                  0x12ab082b0
ggml_metal_init: loaded kernel_get_rows_q3_K                  0x12ab08a70
ggml_metal_init: loaded kernel_get_rows_q4_K                  0x12aa0a0b0
ggml_metal_init: loaded kernel_get_rows_q5_K                  0x12aa0a8b0
ggml_metal_init: loaded kernel_get_rows_q6_K                  0x12aa0af50
ggml_metal_init: loaded kernel_rms_norm                       0x12ab09140
ggml_metal_init: loaded kernel_norm                           0x12ab09920
ggml_metal_init: loaded kernel_mul_mat_f16_f32                0x12aa0b9f0
ggml_metal_init: loaded kernel_mul_mat_q4_0_f32               0x12aa0be30
ggml_metal_init: loaded kernel_mul_mat_q4_1_f32               0x12aa0c530
ggml_metal_init: loaded kernel_mul_mat_q2_K_f32               0x12ab0a350
ggml_metal_init: loaded kernel_mul_mat_q3_K_f32               0x12ab0af40
ggml_metal_init: loaded kernel_mul_mat_q4_K_f32               0x12ab0b5c0
ggml_metal_init: loaded kernel_mul_mat_q5_K_f32               0x12aa0c930
ggml_metal_init: loaded kernel_mul_mat_q6_K_f32               0x12ab0bba0
ggml_metal_init: loaded kernel_rope                           0x12ab0ca80
ggml_metal_init: loaded kernel_alibi_f32                      0x12ab0d360
ggml_metal_init: loaded kernel_cpy_f32_f16                    0x12ab0dc10
ggml_metal_init: loaded kernel_cpy_f32_f32                    0x12ab0e4c0
ggml_metal_init: loaded kernel_cpy_f16_f16                    0x12aa0d550
ggml_metal_init: recommendedMaxWorkingSetSize =  5461.34 MB
ggml_metal_init: hasUnifiedMemory             = true
ggml_metal_init: maxTransferRate              = built-in GPU
llama_new_context_with_model: max tensor size =    70.31 MB
ggml_metal_add_buffer: allocated 'data            ' buffer, size =  3616.08 MB, ( 3616.47 /  5461.34)
ggml_metal_add_buffer: allocated 'eval            ' buffer, size =   768.00 MB, ( 4384.47 /  5461.34)
ggml_metal_add_buffer: allocated 'kv              ' buffer, size =  1026.00 MB, ( 5410.47 /  5461.34)
ggml_metal_add_buffer: allocated 'scr0            ' buffer, size =   512.00 MB, ( 5922.47 /  5461.34), warning: current allocated size is greater than the recommended max working set size
ggml_metal_add_buffer: allocated 'scr1            ' buffer, size =   512.00 MB, ( 6434.47 /  5461.34), warning: current allocated size is greater than the recommended max working set size
ggml_metal_graph_compute: command buffer 0 failed with status 5
GGML_ASSERT: ggml-metal.m:1013: false
SIGABRT: abort
PC=0x19296c724 m=5 sigcode=0
signal arrived during cgo execution

goroutine 6 [syscall]:
runtime.cgocall(0x100c920c0, 0x140000bd298)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/cgocall.go:157 +0x54 fp=0x140000bd260 sp=0x140000bd220 pc=0x100799994
github.com/jmorganca/ollama/llama._Cfunc_llama_eval(0x144008a00, 0x14000486c88, 0x1, 0x0, 0x8)
	_cgo_gotypes.go:208 +0x38 fp=0x140000bd290 sp=0x140000bd260 pc=0x100c81e18
github.com/jmorganca/ollama/llama.New.func4(0x99?, {0x14000486c88, 0x1, 0x14000178540?}, {0xffffffffffffffff, 0x0, 0x800, 0x200, 0x1, 0x0, ...})
	/Users/sasank/code/llama/ollama/llama/llama.go:141 +0x7c fp=0x140000bd2e0 sp=0x140000bd290 pc=0x100c82c2c
github.com/jmorganca/ollama/llama.New({0x140007fc310, 0x6a}, {0xffffffffffffffff, 0x0, 0x800, 0x200, 0x1, 0x0, 0x0, 0x1, ...})
	/Users/sasank/code/llama/ollama/llama/llama.go:141 +0x278 fp=0x140000bd4a0 sp=0x140000bd2e0 pc=0x100c829e8
github.com/jmorganca/ollama/server.generate(0x140000b4300)
	/Users/sasank/code/llama/ollama/server/routes.go:70 +0x700 fp=0x140000bd6e0 sp=0x140000bd4a0 pc=0x100c8d6b0
github.com/gin-gonic/gin.(*Context).Next(...)
	/Users/sasank/go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174
github.com/gin-gonic/gin.CustomRecoveryWithWriter.func1(0x140000b4300)
	/Users/sasank/go/pkg/mod/github.com/gin-gonic/[email protected]/recovery.go:102 +0x7c fp=0x140000bd730 sp=0x140000bd6e0 pc=0x100c7950c
github.com/gin-gonic/gin.(*Context).Next(...)
	/Users/sasank/go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174
github.com/gin-gonic/gin.LoggerWithConfig.func1(0x140000b4300)
	/Users/sasank/go/pkg/mod/github.com/gin-gonic/[email protected]/logger.go:240 +0xac fp=0x140000bd8e0 sp=0x140000bd730 pc=0x100c7878c
github.com/gin-gonic/gin.(*Context).Next(...)
	/Users/sasank/go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174
github.com/gin-gonic/gin.(*Engine).handleHTTPRequest(0x14000145ba0, 0x140000b4300)
	/Users/sasank/go/pkg/mod/github.com/gin-gonic/[email protected]/gin.go:620 +0x54c fp=0x140000bda70 sp=0x140000bd8e0 pc=0x100c7789c
github.com/gin-gonic/gin.(*Engine).ServeHTTP(0x14000145ba0, {0x100f019c0?, 0x140004ee1c0}, 0x140000b4200)
	/Users/sasank/go/pkg/mod/github.com/gin-gonic/[email protected]/gin.go:576 +0x1d4 fp=0x140000bdab0 sp=0x140000bda70 pc=0x100c771a4
net/http.serverHandler.ServeHTTP({0x100effa38?}, {0x100f019c0, 0x140004ee1c0}, 0x140000b4200)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:2936 +0x2d8 fp=0x140000bdb60 sp=0x140000bdab0 pc=0x100a152a8
net/http.(*conn).serve(0x1400017a900, {0x100f02038, 0x1400046e060})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:1995 +0x560 fp=0x140000bdfa0 sp=0x140000bdb60 pc=0x100a10fa0
net/http.(*Server).Serve.func3()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:3089 +0x30 fp=0x140000bdfd0 sp=0x140000bdfa0 pc=0x100a15ad0
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000bdfd0 sp=0x140000bdfd0 pc=0x1007fc324
created by net/http.(*Server).Serve
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:3089 +0x520

goroutine 1 [IO wait, 14 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400011f860 sp=0x1400011f840 pc=0x1007ccaa4
runtime.netpollblock(0x1400031f8f8?, 0x87f1a4?, 0x1?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x1400011f8a0 sp=0x1400011f860 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ada18, 0x72)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x1400011f8d0 sp=0x1400011f8a0 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x1400044a580?, 0x0?, 0x0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x1400011f900 sp=0x1400011f8d0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x1400044a580)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:614 +0x250 fp=0x1400011f9b0 sp=0x1400011f900 pc=0x10087f290
net.(*netFD).accept(0x1400044a580)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_unix.go:172 +0x28 fp=0x1400011fa70 sp=0x1400011f9b0 pc=0x1008be278
net.(*TCPListener).accept(0x1400000edb0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/tcpsock_posix.go:148 +0x28 fp=0x1400011faa0 sp=0x1400011fa70 pc=0x1008d3878
net.(*TCPListener).Accept(0x1400000edb0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/tcpsock.go:297 +0x2c fp=0x1400011fae0 sp=0x1400011faa0 pc=0x1008d29ec
net/http.(*onceCloseListener).Accept(0x1400017a900?)
	<autogenerated>:1 +0x30 fp=0x1400011fb00 sp=0x1400011fae0 pc=0x100a39250
net/http.(*Server).Serve(0x14000366ff0, {0x100f017b0, 0x1400000edb0})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:3059 +0x304 fp=0x1400011fc30 sp=0x1400011fb00 pc=0x100a15774
github.com/jmorganca/ollama/server.Serve({0x100f017b0, 0x1400000edb0})
	/Users/sasank/code/llama/ollama/server/routes.go:238 +0x250 fp=0x1400011fca0 sp=0x1400011fc30 pc=0x100c8f4e0
github.com/jmorganca/ollama/cmd.RunServer(0x14000419200?, {0x100ce1dcb?, 0x0?, 0x0?})
	/Users/sasank/code/llama/ollama/cmd/cmd.go:272 +0x114 fp=0x1400011fd20 sp=0x1400011fca0 pc=0x100c91454
github.com/spf13/cobra.(*Command).execute(0x14000419200, {0x101365c48, 0x0, 0x0})
	/Users/sasank/go/pkg/mod/github.com/spf13/[email protected]/command.go:940 +0x5c8 fp=0x1400011fe60 sp=0x1400011fd20 pc=0x100aaf628
github.com/spf13/cobra.(*Command).ExecuteC(0x14000418900)
	/Users/sasank/go/pkg/mod/github.com/spf13/[email protected]/command.go:1068 +0x35c fp=0x1400011ff20 sp=0x1400011fe60 pc=0x100aafd7c
github.com/spf13/cobra.(*Command).Execute(...)
	/Users/sasank/go/pkg/mod/github.com/spf13/[email protected]/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(0x14000054768?, {0x100f01fc8?, 0x140000280b0?})
	/Users/sasank/go/pkg/mod/github.com/spf13/[email protected]/command.go:985 +0x50 fp=0x1400011ff40 sp=0x1400011ff20 pc=0x100aaf910
main.main()
	/Users/sasank/code/llama/ollama/main.go:10 +0x34 fp=0x1400011ff70 sp=0x1400011ff40 pc=0x100c91e94
runtime.main()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:250 +0x248 fp=0x1400011ffd0 sp=0x1400011ff70 pc=0x1007cc678
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400011ffd0 sp=0x1400011ffd0 pc=0x1007fc324

goroutine 2 [force gc (idle), 14 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000054fa0 sp=0x14000054f80 pc=0x1007ccaa4
runtime.goparkunlock(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:387
runtime.forcegchelper()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:305 +0xb8 fp=0x14000054fd0 sp=0x14000054fa0 pc=0x1007cc8e8
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000054fd0 sp=0x14000054fd0 pc=0x1007fc324
created by runtime.init.6
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:293 +0x24

goroutine 3 [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000055760 sp=0x14000055740 pc=0x1007ccaa4
runtime.goparkunlock(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:387
runtime.bgsweep(0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgcsweep.go:319 +0x110 fp=0x140000557b0 sp=0x14000055760 pc=0x1007b9960
runtime.gcenable.func1()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:178 +0x28 fp=0x140000557d0 sp=0x140000557b0 pc=0x1007ae408
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000557d0 sp=0x140000557d0 pc=0x1007fc324
created by runtime.gcenable
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:178 +0x74

goroutine 4 [GC scavenge wait]:
runtime.gopark(0x12b0f92?, 0x1291938?, 0x0?, 0x0?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000055f50 sp=0x14000055f30 pc=0x1007ccaa4
runtime.goparkunlock(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:387
runtime.(*scavengerState).park(0x1012aa960)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgcscavenge.go:400 +0x5c fp=0x14000055f80 sp=0x14000055f50 pc=0x1007b776c
runtime.bgscavenge(0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgcscavenge.go:633 +0xac fp=0x14000055fb0 sp=0x14000055f80 pc=0x1007b7d4c
runtime.gcenable.func2()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:179 +0x28 fp=0x14000055fd0 sp=0x14000055fb0 pc=0x1007ae3a8
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000055fd0 sp=0x14000055fd0 pc=0x1007fc324
created by runtime.gcenable
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:179 +0xb8

goroutine 5 [finalizer wait, 12 minutes]:
runtime.gopark(0x0?, 0x1400048a138?, 0x20?, 0x1?, 0x1000000010?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000065d80 sp=0x14000065d60 pc=0x1007ccaa4
runtime.runfinq()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mfinal.go:193 +0x10c fp=0x14000065fd0 sp=0x14000065d80 pc=0x1007ad49c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000065fd0 sp=0x14000065fd0 pc=0x1007fc324
created by runtime.createfing
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mfinal.go:163 +0x84

goroutine 26 [select]:
runtime.gopark(0x1400051ff80?, 0x2?, 0xa0?, 0x61?, 0x1400051ff24?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400051fdb0 sp=0x1400051fd90 pc=0x1007ccaa4
runtime.selectgo(0x1400051ff80, 0x1400051ff20, 0x14000282680?, 0x0, 0x0?, 0x1)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/select.go:327 +0x690 fp=0x1400051fed0 sp=0x1400051fdb0 pc=0x1007dd1a0
net/http.(*persistConn).writeLoop(0x14000128d80)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:2410 +0x9c fp=0x1400051ffb0 sp=0x1400051fed0 pc=0x100a2a74c
net/http.(*Transport).dialConn.func6()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1766 +0x28 fp=0x1400051ffd0 sp=0x1400051ffb0 pc=0x100a27458
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400051ffd0 sp=0x1400051ffd0 pc=0x1007fc324
created by net/http.(*Transport).dialConn
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1766 +0x1214

goroutine 13 [GC worker (idle), 1 minutes]:
runtime.gopark(0x4f330c0464e0f?, 0x1?, 0x27?, 0xdf?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000056f40 sp=0x14000056f20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x14000056fd0 sp=0x14000056f40 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000056fd0 sp=0x14000056fd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 20 [GC worker (idle)]:
runtime.gopark(0x1013673a0?, 0x1?, 0x16?, 0xeb?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000057740 sp=0x14000057720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x140000577d0 sp=0x14000057740 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000577d0 sp=0x140000577d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 21 [GC worker (idle)]:
runtime.gopark(0x4f347a631f1b8?, 0x3?, 0xc3?, 0x8e?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000050740 sp=0x14000050720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x140000507d0 sp=0x14000050740 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000507d0 sp=0x140000507d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 14 [GC worker (idle)]:
runtime.gopark(0x4f347a634141b?, 0x3?, 0x77?, 0xc?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000057f40 sp=0x14000057f20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x14000057fd0 sp=0x14000057f40 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000057fd0 sp=0x14000057fd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 22 [GC worker (idle)]:
runtime.gopark(0x4f3473d29e65d?, 0x1?, 0x9f?, 0x19?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000050f40 sp=0x14000050f20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x14000050fd0 sp=0x14000050f40 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000050fd0 sp=0x14000050fd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 15 [GC worker (idle)]:
runtime.gopark(0x1013673a0?, 0x3?, 0x2?, 0x4c?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400047c740 sp=0x1400047c720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x1400047c7d0 sp=0x1400047c740 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400047c7d0 sp=0x1400047c7d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 23 [GC worker (idle)]:
runtime.gopark(0x4f3472b8156b1?, 0x3?, 0x93?, 0x2d?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000051740 sp=0x14000051720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x140000517d0 sp=0x14000051740 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000517d0 sp=0x140000517d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 16 [GC worker (idle)]:
runtime.gopark(0x4f3474e2b3524?, 0x3?, 0xe3?, 0x7b?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400047cf40 sp=0x1400047cf20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x1400047cfd0 sp=0x1400047cf40 pc=0x1007b034c
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400047cfd0 sp=0x1400047cfd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28

goroutine 56 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x10080e540?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000063580 sp=0x14000063560 pc=0x1007ccaa4
runtime.netpollblock(0x0?, 0x0?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x140000635c0 sp=0x14000063580 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ad838, 0x72)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x140000635f0 sp=0x140000635c0 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x1400064c000?, 0x140001c4800?, 0x0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x14000063620 sp=0x140000635f0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x1400064c000, {0x140001c4800, 0x1800, 0x1800})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:167 +0x200 fp=0x140000636c0 sp=0x14000063620 pc=0x10087bb50
net.(*netFD).Read(0x1400064c000, {0x140001c4800?, 0x14000063878?, 0x100000ece?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_posix.go:55 +0x28 fp=0x14000063710 sp=0x140000636c0 pc=0x1008bc5d8
net.(*conn).Read(0x140004ba028, {0x140001c4800?, 0x140000637c8?, 0x1007a2304?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/net.go:183 +0x34 fp=0x14000063760 sp=0x14000063710 pc=0x1008cabe4
net.(*TCPConn).Read(0x140000637d8?, {0x140001c4800?, 0x1400000e828?, 0x18?})
	<autogenerated>:1 +0x2c fp=0x14000063790 sp=0x14000063760 pc=0x1008dd12c
crypto/tls.(*atLeastReader).Read(0x1400000e828, {0x140001c4800?, 0x1400000e828?, 0x0?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:788 +0x40 fp=0x140000637e0 sp=0x14000063790 pc=0x10096f760
bytes.(*Buffer).ReadFrom(0x140004aa290, {0x100efd580, 0x1400000e828})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/bytes/buffer.go:202 +0x90 fp=0x14000063840 sp=0x140000637e0 pc=0x100831860
crypto/tls.(*Conn).readFromUntil(0x140004aa000, {0x128a27fc8?, 0x140004ba028}, 0x1009c421c?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:810 +0xd4 fp=0x14000063880 sp=0x14000063840 pc=0x10096f954
crypto/tls.(*Conn).readRecordOrCCS(0x140004aa000, 0x0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:617 +0xd8 fp=0x14000063bf0 sp=0x14000063880 pc=0x10096d7a8
crypto/tls.(*Conn).readRecord(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:583
crypto/tls.(*Conn).Read(0x140004aa000, {0x140000a1000, 0x1000, 0x1009e1418?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:1316 +0x178 fp=0x14000063c60 sp=0x14000063bf0 pc=0x1009726f8
bufio.(*Reader).Read(0x140006bc900, {0x14000420580, 0x9, 0x10079bfbc?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/bufio/bufio.go:237 +0x1e0 fp=0x14000063ca0 sp=0x14000063c60 pc=0x10083e7b0
io.ReadAtLeast({0x100efd3e0, 0x140006bc900}, {0x14000420580, 0x9, 0x9}, 0x9)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/io/io.go:332 +0xa0 fp=0x14000063cf0 sp=0x14000063ca0 pc=0x100827fa0
io.ReadFull(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/io/io.go:351
net/http.http2readFrameHeader({0x14000420580?, 0x9?, 0x14000063d98?}, {0x100efd3e0?, 0x140006bc900?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:1567 +0x58 fp=0x14000063d40 sp=0x14000063cf0 pc=0x1009d8548
net/http.(*http2Framer).ReadFrame(0x14000420540)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:1831 +0x84 fp=0x14000063df0 sp=0x14000063d40 pc=0x1009d8d44
net/http.(*http2clientConnReadLoop).run(0x14000063f88)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:9187 +0xfc fp=0x14000063f40 sp=0x14000063df0 pc=0x1009fa06c
net/http.(*http2ClientConn).readLoop(0x14000175080)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:9082 +0x5c fp=0x14000063fb0 sp=0x14000063f40 pc=0x1009f952c
net/http.(*http2Transport).newClientConn.func1()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:7779 +0x28 fp=0x14000063fd0 sp=0x14000063fb0 pc=0x1009f26b8
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000063fd0 sp=0x14000063fd0 pc=0x1007fc324
created by net/http.(*http2Transport).newClientConn
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:7779 +0xad0

goroutine 39 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x10080e540?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400047ad40 sp=0x1400047ad20 pc=0x1007ccaa4
runtime.netpollblock(0x0?, 0x0?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x1400047ad80 sp=0x1400047ad40 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ad928, 0x72)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x1400047adb0 sp=0x1400047ad80 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x1400044a600?, 0x1400046e161?, 0x0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x1400047ade0 sp=0x1400047adb0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x1400044a600, {0x1400046e161, 0x1, 0x1})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:167 +0x200 fp=0x1400047ae80 sp=0x1400047ade0 pc=0x10087bb50
net.(*netFD).Read(0x1400044a600, {0x1400046e161?, 0x0?, 0x0?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_posix.go:55 +0x28 fp=0x1400047aed0 sp=0x1400047ae80 pc=0x1008bc5d8
net.(*conn).Read(0x14000010d10, {0x1400046e161?, 0x0?, 0x0?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/net.go:183 +0x34 fp=0x1400047af20 sp=0x1400047aed0 pc=0x1008cabe4
net.(*TCPConn).Read(0x0?, {0x1400046e161?, 0x0?, 0x0?})
	<autogenerated>:1 +0x2c fp=0x1400047af50 sp=0x1400047af20 pc=0x1008dd12c
net/http.(*connReader).backgroundRead(0x1400046e150)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:674 +0x44 fp=0x1400047afb0 sp=0x1400047af50 pc=0x100a0b454
net/http.(*connReader).startBackgroundRead.func2()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:670 +0x28 fp=0x1400047afd0 sp=0x1400047afb0 pc=0x100a0b378
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400047afd0 sp=0x1400047afd0 pc=0x1007fc324
created by net/http.(*connReader).startBackgroundRead
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:670 +0xcc

goroutine 25 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x10080e540?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000062580 sp=0x14000062560 pc=0x1007ccaa4
runtime.netpollblock(0x0?, 0x0?, 0x0?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x140000625c0 sp=0x14000062580 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ad748, 0x72)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x140000625f0 sp=0x140000625c0 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x14000480200?, 0x140002d0000?, 0x0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x14000062620 sp=0x140000625f0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x14000480200, {0x140002d0000, 0xa000, 0xa000})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:167 +0x200 fp=0x140000626c0 sp=0x14000062620 pc=0x10087bb50
net.(*netFD).Read(0x14000480200, {0x140002d0000?, 0x14000062878?, 0x10096df7c?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_posix.go:55 +0x28 fp=0x14000062710 sp=0x140000626c0 pc=0x1008bc5d8
net.(*conn).Read(0x140004ba000, {0x140002d0000?, 0x100ce6ad4?, 0x5?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/net.go:183 +0x34 fp=0x14000062760 sp=0x14000062710 pc=0x1008cabe4
net.(*TCPConn).Read(0x140000627d8?, {0x140002d0000?, 0x140006d00d8?, 0x18?})
	<autogenerated>:1 +0x2c fp=0x14000062790 sp=0x14000062760 pc=0x1008dd12c
crypto/tls.(*atLeastReader).Read(0x140006d00d8, {0x140002d0000?, 0x140006d00d8?, 0x0?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:788 +0x40 fp=0x140000627e0 sp=0x14000062790 pc=0x10096f760
bytes.(*Buffer).ReadFrom(0x14000452290, {0x100efd580, 0x140006d00d8})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/bytes/buffer.go:202 +0x90 fp=0x14000062840 sp=0x140000627e0 pc=0x100831860
crypto/tls.(*Conn).readFromUntil(0x14000452000, {0x128a27fc8?, 0x140004ba000}, 0x7fffffffffffffff?)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:810 +0xd4 fp=0x14000062880 sp=0x14000062840 pc=0x10096f954
crypto/tls.(*Conn).readRecordOrCCS(0x14000452000, 0x0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:617 +0xd8 fp=0x14000062bf0 sp=0x14000062880 pc=0x10096d7a8
crypto/tls.(*Conn).readRecord(...)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:583
crypto/tls.(*Conn).Read(0x14000452000, {0x140004df000, 0x1000, 0x140003e8180?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:1316 +0x178 fp=0x14000062c60 sp=0x14000062bf0 pc=0x1009726f8
net/http.(*persistConn).Read(0x14000128d80, {0x140004df000?, 0x10079b930?, 0x1400049e780?})
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1943 +0x50 fp=0x14000062cc0 sp=0x14000062c60 pc=0x100a27e60
bufio.(*Reader).fill(0x140004fc4e0)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/bufio/bufio.go:106 +0xfc fp=0x14000062d00 sp=0x14000062cc0 pc=0x10083e18c
bufio.(*Reader).Peek(0x140004fc4e0, 0x1)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/bufio/bufio.go:144 +0x60 fp=0x14000062d20 sp=0x14000062d00 pc=0x10083e300
net/http.(*persistConn).readLoop(0x14000128d80)
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:2107 +0x144 fp=0x14000062fb0 sp=0x14000062d20 pc=0x100a28d14
net/http.(*Transport).dialConn.func5()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1765 +0x28 fp=0x14000062fd0 sp=0x14000062fb0 pc=0x100a274b8
runtime.goexit()
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000062fd0 sp=0x14000062fd0 pc=0x1007fc324
created by net/http.(*Transport).dialConn
	/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1765 +0x11c8

r0      0x0
r1      0x0
r2      0x0
r3      0x0
r4      0x0
r5      0x171672c10
r6      0xa
r7      0x0
r8      0x6b684de7b1e616cc
r9      0x6b684de6c08ea6cc
r10     0x2
r11     0xfffffffd
r12     0x10000000000
r13     0x0
r14     0x0
r15     0x0
r16     0x148
r17     0x1f292cf20
r18     0x0
r19     0x6
r20     0x17168b000
r21     0x1a03
r22     0x17168b0e0
r23     0x8
r24     0x7
r25     0x8
r26     0x1ede07460
r27     0x100cd3094
r28     0x100df50c0
r29     0x171672bc0
lr      0x1929a3c28
sp      0x171672ba0
pc      0x19296c724
fault   0x19296c724      

error on `ollama run`

ollama run sometimes shows a malformed HTTP response error:

ollama run orca
Error: Post "http://127.0.0.1:11434/api/pull": net/http: HTTP/1.x transport connection broken: malformed HTTP response "{\"total\":2142590208,\"completed\":2142590208,\"percent\":100}"

Performance question?

This is just a request for info rather than a bug.

What kind of performance/latency should we expect on prompts when running on an M2 Pro? It seems to take up to 10 seconds to generate answers using the llama2 model. Is that something that can improve in the future?

macOS Intel support

Upon unzipping the Ollama download, I'm unable to launch the app. I get the following error: "You can’t open the application “Ollama” because this application is not supported on this Mac."

Mac is a MacBook Pro 15" from summer 2020 (w/ 64GB RAM on board - 32 of which is available)

Consistent GiB / GB usage

We need consistent usage of GiB vs. GB.

E.g., pulling wizard-vicuna shows 6.8GB, but when running ollama list, it shows as 7.3GB.

Support for Intel Macs

Hi, when can we expect to hear feedback regarding the future of this? Maybe I could help out?

`ollama create` should automatically pull base models

Currently, when a Modelfile is defined with a base model such as llama2:

FROM llama2

Running ollama create mymodel -f Modelfile first requires ollama pull llama2 to be run. We should do this automatically on ollama create.
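Until then, the workaround implied above is to pull the base model explicitly before creating:

ollama pull llama2
ollama create mymodel -f Modelfile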

build issue server/routes.go:70:20: undefined: llama.New

I'm not able to build from git source:

root@debian:~/Downloads/ollama# go build .

github.com/jmorganca/ollama/server

server/routes.go:70:20: undefined: llama.New

root@debian:~/Downloads/ollama# cat /etc/debian_version
12.0

root@debian:~/Downloads/ollama# go version
go version go1.20.6 linux/amd64

Error trying to create custom model, fresh install

First off, this is awesome. Thank you for creating this. I'm running into an Error: 400 Bad Request when trying to follow the README and create a custom model.

Steps:

  1. Download Apple Silicon app from https://ollama.ai/download & install to CLI
  2. Run ollama run llama2 successfully
  3. Create a Modelfile and copy/paste the README example verbatim
  4. Run ollama create mario -f ./Modelfile
  5. Receive the Error: 400 Bad Request

I attempted a few of the other examples as well but couldn't get any of them to run.

`ollama run` doesn't continue after one response

Here is how to reproduce:

Hello! How can I assist you today?Error: stream: EOF

$ logs  ollama run orca "why is the sky blue"
The sky appears blue because of a process called scattering. 
When sunlight enters the Earth's atmosphere, it collides with gas molecules 
such as oxygen and nitrogen. These collisions cause the light to scatter in 
all directions. Blue light has a shorter wavelength and is scattered more 
easily than other colors, so it is scattered more widely across the sky, 
making it appear blue. This effect is also why the sky is usually darker 
during sunrise and sunset when the sun is below the horizon and 
cannot be seen.Error: stream: EOF


`ollama run` shows no error if the model failed to load

2023/07/07 11:30:34 routes.go:145: Listening on 127.0.0.1:11434
llama.cpp: loading model from /Users/jmorgan/Downloads/gpt2-medium-alpaca-355m-ggml-f16.bin
error loading model: unexpectedly reached end of file
llama_load_model_from_file: failed to load model
Loading the model failed: failed loading model

On the client, ollama run keeps spinning

Error after prompting Llama2 on M1.

After starting Ollama and running Llama2, any prompt results in:
Error: Post "http://127.0.0.1:11434/api/generate": EOF
Installed on an M1 MacBook Pro.
Ollama reports that it is running.
'Ollama list' reports llama2:latest.

Is this a memory issue?

No punctuation

For orca models, the responses don't end with punctuation.

app server should restart if it errors

Currently, if the server errors, it will stop running but the Mac app will continue. We should ensure that if the Mac app is running, the server is also always running.
