anwarpy / llocalsearch

This project forked from nilsherzig/llocalsearch


LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.

License: Apache License 2.0

JavaScript 1.32% Go 53.93% TypeScript 3.79% CSS 0.05% Makefile 1.59% HTML 0.30% Dockerfile 0.96% Svelte 38.06%


LLocalSearch

What it is and what it does

Demo screencast: `Screencast.from.2024-05-03.16-56-58.webm`

LLocalSearch is a wrapper around locally running Large Language Models (like ChatGPT, but a lot smaller and less "smart") which allows them to choose from a set of tools. These tools let them search the internet for current information about your question. The process is iterative, meaning the running LLM can freely choose to use tools (even multiple times) based on the information it gets from you and from earlier tool calls.

Here is a rough representation of what this looks like.

```mermaid
flowchart TB
	You
	LLM
	WebSearch
	WebScrape
	Database
	FinalAnswer

	You -- asking a question --> LLM
	LLM --> WebSearch
	LLM --> WebScrape
	LLM --> Database
	LLM -- answer --> FinalAnswer

	WebSearch --> LLM
	WebScrape --> LLM
	Database --> LLM

	FinalAnswer -- send to --> You
```

Features

  • ๐Ÿ•ตโ€โ™€ Completely local (no need for API keys) and thus a lot more privacy respecting
  • ๐Ÿ’ธ Runs on "low end" hardware (the demo video uses a 300โ‚ฌ GPU)
  • ๐Ÿค“ Live logs and links in the answer allow you do get a better understanding about what the agent is doing and what information the answer is based on. Allowing for a great starting point to dive deeper into your research.
  • ๐Ÿค” Supports follow up questions
  • ๐Ÿ“ฑ Mobile friendly design
  • ๐ŸŒ“ Dark and light mode

Road-map

I'm currently working on 👷

Support for Llama 3 🦙

The langchaingo library I'm using does not respect the Llama 3 stop words, which results in Llama 3 starting to hallucinate at the end of a turn. I have a working patch (check out the experiments branch), but since I'm unsure whether my approach is the right way to solve this, I'm still waiting for a response from the langchaingo team.

Interface overhaul 🌟

An interface overhaul, allowing for more flexible panels and more efficient use of space, inspired by the current layout of Obsidian.

Support for chat histories / recent conversations 🕵️‍♀️

This still needs a lot of work, like refactoring many of the internal data structures to allow for better and more flexible ways to expand the functionality in the future, without having to rewrite the whole data-transmission and interface layer again.

Planned (near future)

User Accounts 🙆

Groundwork for private information inside the RAG chain, like uploading your own documents, or connecting LLocalSearch to services like Google Drive or Confluence.

Long-term memory 🧠

I'm not sure yet whether there is a single right way to implement this, but the idea is to provide the main agent chain with information about the user, like preferences, and to give each user an extra vector-DB namespace for persistent information.
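A per-user namespace could look roughly like this in Go. The types below are purely hypothetical, for illustration only, and are not the project's actual code; a real version would wrap the vector database rather than an in-memory map:

```go
package main

import "fmt"

// memoryStore maps a user ID to that user's private namespace, so
// persistent facts never leak between users (hypothetical sketch; a real
// implementation would key vector-DB collections the same way).
type memoryStore struct {
	namespaces map[string][]string // userID -> stored memory snippets
}

// Remember appends a fact to the namespace belonging to userID.
func (m *memoryStore) Remember(userID, fact string) {
	if m.namespaces == nil {
		m.namespaces = map[string][]string{}
	}
	m.namespaces[userID] = append(m.namespaces[userID], fact)
}

// Recall returns only the facts stored under userID's namespace.
func (m *memoryStore) Recall(userID string) []string {
	return m.namespaces[userID]
}

func main() {
	var store memoryStore
	store.Remember("alice", "prefers metric units")
	store.Remember("bob", "lives in Berlin")
	fmt.Println(store.Recall("alice")) // alice's memories only
}
```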

Install Guide

Docker 🐳

  1. Clone the GitHub repository
git clone git@github.com:nilsherzig/LLocalSearch.git
cd LLocalSearch
  2. Create and edit an .env file, if you need to change some of the default settings. This is typically only needed if you have Ollama running on a different device, or if you want to build a more complex setup (for more than personal use, for example). Please read the Ollama Setup Guide if you struggle to get the Ollama connection running.
touch .env
code .env # open the file with VS Code
nvim .env # or open it with Neovim
  3. Run the containers
docker-compose up -d
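For example, if Ollama runs on another machine, the `.env` file might point the backend at it. The variable name and address below are assumptions for illustration; check the repository's env example file and the Ollama Setup Guide for the actual names:

```shell
# Hypothetical example — verify the variable name against the project's docs.
OLLAMA_HOST=http://192.168.0.109:11434
```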

llocalsearch's People

Contributors

nilsherzig, dependabot[bot], popey, jpoz, sinwoobang, xiaoconstantine
