Comments (14)
This issue is because the default lancedb vector db does not have support for (or cannot determine) the binary it needs to run lancedb.
[email protected]:
version "0.4.11"
resolved "https://registry.yarnpkg.com/vectordb/-/vectordb-0.4.11.tgz#e5b209fc15fb100d8495c132d0e4cc817dc9065a"
integrity sha512-hbN1qO08xdhEpoGecWR6UuvfnNFYDgZLvuEDvgq5nFVbA4er5qXTcEyxjnlTzKsVqFGLlPF+91I+JIigneX24w==
dependencies:
"@apache-arrow/ts" "^14.0.2"
"@neon-rs/load" "^0.0.74"
apache-arrow "^14.0.2"
axios "^1.4.0"
optionalDependencies:
"@lancedb/vectordb-darwin-arm64" "0.4.11"
"@lancedb/vectordb-darwin-x64" "0.4.11"
"@lancedb/vectordb-linux-arm64-gnu" "0.4.11"
"@lancedb/vectordb-linux-x64-gnu" "0.4.11"
"@lancedb/vectordb-win32-x64-msvc" "0.4.11"
I would imagine Linux x64 would be right?
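The optionalDependencies in the lockfile above map one native package per platform/arch/libc combination. A hedged sketch of how a loader might pick the right package name from Node's runtime values (the helper is an illustration, not vectordb's actual resolution code):

```javascript
// Hypothetical sketch: derive the platform-specific package name from
// process.platform and process.arch, mirroring the lockfile entries above.
// The libc/toolchain suffixes (-gnu, -msvc) are taken from those entries.
function nativePackageFor(platform, arch) {
  const suffix =
    platform === "linux" ? "-gnu" : platform === "win32" ? "-msvc" : "";
  return `@lancedb/vectordb-${platform}-${arch}${suffix}`;
}

// On a typical Linux x64 host this resolves to the -linux-x64-gnu build.
console.log(nativePackageFor(process.platform, process.arch));
```

On Linux x64 this yields `@lancedb/vectordb-linux-x64-gnu`, which is exactly the package the debug log below shows being loaded just before the crash.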
from anything-llm.
Hi Tim,
I am also experiencing this issue in a VM, but using Hyper-V. This showed up when I pulled the :latest Docker image two days ago. I don't build the project, I just use the image. I tried removing the /app/server/storage volume from the docker-compose file to be sure it's not my folder causing permission issues, and the issue persists even with that folder freshly populated by the image.
I've turned on debug logging for Node; here are the last lines before the error:
anything-llm | MODULE 102: looking for ["/app/server/node_modules/apache-arrow"]
anything-llm | MODULE 102: Module._load REQUEST ./util/buffer.js parent: /app/server/node_modules/apache-arrow/Arrow.js
anything-llm | MODULE 102: RELATIVE: requested: ./util/buffer.js from parent.id /app/server/node_modules/apache-arrow/Arrow.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/apache-arrow"]
anything-llm | MODULE 102: Module._load REQUEST ./util/vector.js parent: /app/server/node_modules/apache-arrow/Arrow.js
anything-llm | MODULE 102: Module._load REQUEST ./util/pretty.js parent: /app/server/node_modules/apache-arrow/Arrow.js
anything-llm | MODULE 102: RELATIVE: requested: ./util/pretty.js from parent.id /app/server/node_modules/apache-arrow/Arrow.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/apache-arrow"]
anything-llm | MODULE 102: Module._load REQUEST ./visitor/typecomparator.js parent: /app/server/node_modules/apache-arrow/Arrow.js
anything-llm | MODULE 102: Module._load REQUEST ./Arrow.js parent: /app/server/node_modules/apache-arrow/Arrow.dom.js
anything-llm | MODULE 102: Module._load REQUEST ./arrow parent: /app/server/node_modules/vectordb/dist/index.js
anything-llm | MODULE 102: RELATIVE: requested: ./arrow from parent.id /app/server/node_modules/vectordb/dist/index.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/vectordb/dist"]
anything-llm | MODULE 102: load "/app/server/node_modules/vectordb/dist/arrow.js" for module "/app/server/node_modules/vectordb/dist/arrow.js"
anything-llm | MODULE 102: Module._load REQUEST apache-arrow parent: /app/server/node_modules/vectordb/dist/arrow.js
anything-llm | MODULE 102: Module._load REQUEST ./remote parent: /app/server/node_modules/vectordb/dist/index.js
anything-llm | MODULE 102: RELATIVE: requested: ./remote from parent.id /app/server/node_modules/vectordb/dist/index.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/vectordb/dist"]
anything-llm | MODULE 102: load "/app/server/node_modules/vectordb/dist/remote/index.js" for module "/app/server/node_modules/vectordb/dist/remote/index.js"
anything-llm | MODULE 102: Module._load REQUEST ../index parent: /app/server/node_modules/vectordb/dist/remote/index.js
anything-llm | MODULE 102: RELATIVE: requested: ../index from parent.id /app/server/node_modules/vectordb/dist/remote/index.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/vectordb/dist/remote"]
anything-llm | MODULE 102: Module._load REQUEST ../query parent: /app/server/node_modules/vectordb/dist/remote/index.js
anything-llm | MODULE 102: RELATIVE: requested: ../query from parent.id /app/server/node_modules/vectordb/dist/remote/index.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/vectordb/dist/remote"]
anything-llm | MODULE 102: load "/app/server/node_modules/vectordb/dist/query.js" for module "/app/server/node_modules/vectordb/dist/query.js"
anything-llm | MODULE 102: Module._load REQUEST apache-arrow parent: /app/server/node_modules/vectordb/dist/query.js
anything-llm | MODULE 102: Module._load REQUEST ../native.js parent: /app/server/node_modules/vectordb/dist/query.js
anything-llm | MODULE 102: RELATIVE: requested: ../native.js from parent.id /app/server/node_modules/vectordb/dist/query.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/vectordb/dist"]
anything-llm | MODULE 102: load "/app/server/node_modules/vectordb/native.js" for module "/app/server/node_modules/vectordb/native.js"
anything-llm | MODULE 102: Module._load REQUEST @neon-rs/load parent: /app/server/node_modules/vectordb/native.js
anything-llm | MODULE 102: looking for "@neon-rs/load" in ["/app/server/node_modules/vectordb/node_modules","/app/server/node_modules","/app/node_modules","/node_modules","/app/.node_modules","/app/.node_libraries","/usr/lib/node"]
anything-llm | MODULE 102: load "/app/server/node_modules/@neon-rs/load/dist/index.js" for module "/app/server/node_modules/@neon-rs/load/dist/index.js"
anything-llm | MODULE 102: Module._load REQUEST ./index.node parent: /app/server/node_modules/vectordb/native.js
anything-llm | MODULE 102: RELATIVE: requested: ./index.node from parent.id /app/server/node_modules/vectordb/native.js
anything-llm | MODULE 102: looking for ["/app/server/node_modules/vectordb"]
anything-llm | MODULE 102: Module._load REQUEST @lancedb/vectordb-linux-x64-gnu parent: /app/server/node_modules/vectordb/native.js
anything-llm | MODULE 102: looking for "@lancedb/vectordb-linux-x64-gnu" in ["/app/server/node_modules/vectordb/node_modules","/app/server/node_modules","/app/node_modules","/node_modules","/app/.node_modules","/app/.node_libraries","/usr/lib/node"]
anything-llm | MODULE 102: load "/app/server/node_modules/@lancedb/vectordb-linux-x64-gnu/index.node" for module "/app/server/node_modules/@lancedb/vectordb-linux-x64-gnu/index.node"
anything-llm | /usr/local/bin/docker-entrypoint.sh: line 7: 102 Illegal instruction (core dumped) node /app/server/index.js
anything-llm exited with code 132
Let me know if there's anything you'd like me to run or test to help figure out the issue
Thanks!
from anything-llm.
This now seems more like a support issue for https://github.com/lancedb/lancedb
That being said, we are on package 0.4.11,
so it is possible this could have been fixed since then in the SDK. Obviously, with a piece as critical as the vector db, we have to bump very carefully.
from anything-llm.
Hi @timothycarambat,
I can confirm that I am on Linux x64 as well and the project is running in a podman container.
Could I suggest tagging the Docker images with release versions, like the desktop app? Simply so we could roll back in case of issues like this arising. Right now our instance is broken; even with no documents uploaded, it crashes because of the default integrated LanceDB. The best option to "fix" this seems to be setting up an alternative vector db, which is the route I am currently taking. Two issues throwing the same error have been open on the LanceDB repo since Oct/Nov 2023 (lancedb/lancedb#592 and lancedb/lancedb#631), so I'm not sure AnythingLLM will start working for us with LanceDB anytime soon.
Let me know if there is anything I can do to help. I'd like to echo iFloris and thank you and the community for the amazing open source project that is Anything-LLM!
from anything-llm.
What does your docker-compose look like? This seems like some directory is being mounted that should not be; likely a library that should exist but isn't?
from anything-llm.
This has been reported before, and that was the case; see this SO comment.
from anything-llm.
Hi, thanks for getting back to me @timothycarambat. You are right, it was docker-compose related. I had a very simple custom docker-compose setup, which apparently no longer works:
version: '3.8'
services:
anythingllm:
image: mintplexlabs/anythingllm:master
ports:
- "3001:3001"
volumes:
- ./storage:/app/server/storage
- "./storage/.env:/app/server/.env"
cap_add:
- SYS_ADMIN
restart: always
So I used the 'official' docker compose with build steps. I have copied over my sqlite db and assets dir to the server/storage folder. After fiddling with permissions, that now works too.
However, with the new docker compose configuration, I still get this:
[CommunicationKey] RSA key pair generated for signed payloads within AnythingLLM services.
Primary server in HTTP mode listening on port 3001
**/usr/local/bin/docker-entrypoint.sh: line 7: 163 Illegal instruction (core dumped) node /app/server/index.js**
Collector hot directory and tmp storage wiped!
Document processor app listening on port 8888
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
My docker compose now looks like this, which allows me to successfully complete the build steps and start the application:
➜ docker git:(master) ✗ cat docker-compose.yml
version: "3.9"
name: anythingllm
networks:
anything-llm:
driver: bridge
services:
anything-llm:
restart: always
container_name: anything-llm
image: anything-llm:latest
build:
context: ../.
dockerfile: ./docker/Dockerfile
args:
ARG_UID: ${UID:-1000}
ARG_GID: ${GID:-1000}
cap_add:
- SYS_ADMIN
volumes:
- "./.env:/app/server/.env"
- "../server/storage:/app/server/storage"
- "../collector/hotdir/:/app/collector/hotdir"
- "../collector/outputs/:/app/collector/outputs"
user: "${UID:-1000}:${GID:-1000}"
ports:
- "3001:3001"
env_file:
- .env
networks:
- anything-llm
extra_hosts:
- "host.docker.internal:host-gateway"
I could well be wrong, but it does not seem to me that I'm mounting directories incorrectly, as the build process completes successfully for x86/x64?
➜ docker git:(master) ✗ docker compose build
WARN[0000] docker-compose.yml: `version` is obsolete
[+] Building 0.9s (38/38) FINISHED docker:default
=> [anything-llm internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 7.21kB 0.0s
=> [anything-llm internal] load metadata for docker.io/library/ubuntu:jammy-20230916 0.4s
=> [anything-llm internal] load .dockerignore 0.0s
=> => transferring context: 418B 0.0s
=> [anything-llm internal] load build context 0.1s
=> => transferring context: 56.59kB 0.1s
=> [anything-llm base 1/1] FROM docker.io/library/ubuntu:jammy-20230916@sha256:9b8dec3bf938bc80fbe758d856e96fdfab5f56c39d44b0cff351e847bb1b01ea 0.0s
=> CACHED [anything-llm build-amd64 1/7] RUN echo "Preparing build of AnythingLLM image for non-ARM architecture" 0.0s
=> CACHED [anything-llm build-amd64 2/7] RUN DEBIAN_FRONTEND=noninteractive apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -yq --no-install-recommen 0.0s
=> CACHED [anything-llm build-amd64 3/7] RUN groupadd -g "1000" anythingllm && useradd -l -u "1000" -m -d /app -s /bin/bash -g anythingllm anythingllm && mkdir -p 0.0s
=> CACHED [anything-llm build-amd64 4/7] COPY ./docker/docker-entrypoint.sh /usr/local/bin/ 0.0s
=> CACHED [anything-llm build-amd64 5/7] COPY ./docker/docker-healthcheck.sh /usr/local/bin/ 0.0s
=> CACHED [anything-llm build-amd64 6/7] COPY --chown=anythingllm:anythingllm ./docker/.env.example /app/server/.env 0.0s
=> CACHED [anything-llm build-amd64 7/7] RUN chmod +x /usr/local/bin/docker-entrypoint.sh && chmod +x /usr/local/bin/docker-healthcheck.sh 0.0s
=> CACHED [anything-llm build 1/2] RUN echo "Running common build flow of AnythingLLM image for all architectures" 0.0s
=> CACHED [anything-llm build 2/2] WORKDIR /app 0.0s
=> CACHED [anything-llm server-deps 1/7] COPY ./server/package.json ./server/yarn.lock ./server/ 0.0s
=> CACHED [anything-llm server-deps 2/7] WORKDIR /app/server 0.0s
=> CACHED [anything-llm server-deps 3/7] RUN yarn install --production --network-timeout 100000 && yarn cache clean 0.0s
=> CACHED [anything-llm server-deps 4/7] WORKDIR /app 0.0s
=> CACHED [anything-llm server-deps 5/7] WORKDIR /app/server 0.0s
=> CACHED [anything-llm server-deps 6/7] RUN npx --no node-llama-cpp download 0.0s
=> CACHED [anything-llm server-deps 7/7] WORKDIR /app 0.0s
=> CACHED [anything-llm production-stage 1/8] COPY --chown=anythingllm:anythingllm ./server/ ./server/ 0.0s
=> CACHED [anything-llm frontend-deps 1/4] COPY ./frontend/package.json ./frontend/yarn.lock ./frontend/ 0.0s
=> CACHED [anything-llm frontend-deps 2/4] WORKDIR /app/frontend 0.0s
=> CACHED [anything-llm frontend-deps 3/4] RUN yarn install --network-timeout 100000 && yarn cache clean 0.0s
=> CACHED [anything-llm frontend-deps 4/4] WORKDIR /app 0.0s
=> CACHED [anything-llm build-stage 1/4] COPY ./frontend/ ./frontend/ 0.0s
=> CACHED [anything-llm build-stage 2/4] WORKDIR /app/frontend 0.0s
=> CACHED [anything-llm build-stage 3/4] RUN yarn build && yarn cache clean 0.0s
=> CACHED [anything-llm build-stage 4/4] WORKDIR /app 0.0s
=> CACHED [anything-llm production-stage 2/8] COPY --chown=anythingllm:anythingllm --from=build-stage /app/frontend/dist ./server/public 0.0s
=> CACHED [anything-llm production-stage 3/8] COPY --chown=anythingllm:anythingllm ./collector/ ./collector/ 0.0s
=> CACHED [anything-llm production-stage 4/8] WORKDIR /app/collector 0.0s
=> CACHED [anything-llm production-stage 5/8] RUN yarn install --production --network-timeout 100000 && yarn cache clean 0.0s
=> CACHED [anything-llm production-stage 6/8] WORKDIR /app/server 0.0s
=> CACHED [anything-llm production-stage 7/8] RUN npx prisma generate --schema=./prisma/schema.prisma && npx prisma migrate deploy --schema=./prisma/schema.prisma 0.0s
=> CACHED [anything-llm production-stage 8/8] WORKDIR /app 0.0s
=> [anything-llm] exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:28085e5fb9233031a138385e109aa178205db9df8ff94afb7a9f756f022c615b 0.0s
=> => naming to docker.io/library/anything-llm:latest
from anything-llm.
No, that seems correct to me. Are you building on an ARM device? I think by default, unless specified, docker compose will build in compatibility mode, but since our Dockerfile supports both architectures it should enable autodetection.
What chip architecture are you on?
from anything-llm.
Hello @timothycarambat this is on a Xeon server.
from anything-llm.
Then that makes sense; Xeon is x86, not ARM, so it makes sense to only build for amd64.
from anything-llm.
So this is interesting: I copied my (original, simple) docker-compose file to another server, and AnythingLLM works there.
The difference:
This currently works:
OS: Debian GNU/Linux 10 (buster) x86_64
Host: KVM RHEL-8.6.0 PC (Q35 + ICH9, 2009)
CPU: Intel Xeon Silver 4216 (24) @ 2.099GHz
Memory: 10240MiB / 257696 MiB
Docker Compose version v2.27.0
Docker version 26.1.1, build 4cf5afa
This does not:
OS: Proxmox VE 8.2.2 x86_64
Host: BANFF12 [baremetal]
CPU: Intel Xeon E5-2670 0 (64) @ 2.600GHz
Memory: 92153MiB / 257572 MiB
Docker Compose version v2.27.0
Docker version 26.1.1, build 4cf5afa
So I updated my DNS to point to this server instead, but it's still a pretty weird problem.
from anything-llm.
While I can't nail down exactly why this occurs, I can say that we have had issues reported with Proxmox before. I don't have that system available to us, so I cannot replicate it to debug remotely, but I recall it was either permissions or port related.
from anything-llm.
This issue is because the default lancedb vector db does not have support for (or cannot determine) the binary it needs to run lancedb.
I would imagine Linux x64 would be right?
That's interesting, so the problem originated in the vector db. That would explain why the web UI crashed when opening workspace settings.
Yes, I believe that Linux x64 is right.
The machine that doesn't work (bare metal):
Linux 6.8.4-2-pve #1 SMP PREEMPT_DYNAMIC PMX 6.8.4-2 (2024-04-10T17:36Z) x86_64 GNU/Linux
and the KVM that does work:
Linux 4.19.0-26-amd64 #1 SMP Debian 4.19.304-1 (2024-01-09) x86_64 GNU/Linux
from anything-llm.
Yes, I think you are right @timothycarambat. Thanks for the help investigating and the great tool!
from anything-llm.