This is a demonstration of how to use the ReAct (reason and act) pattern with llama.cpp and an LLM to pose plain-English queries to a SQLite database, using one of two strategies:
- Actions that mimic interaction with a frontend like Datasette. Actions: list tables, list table columns, facet, filter
- Let the LLM use SQLite queries directly. Actions: list tables, list table schema, execute sql
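The actions in the second strategy map directly onto SQLite introspection queries. As a minimal sketch (using only the standard-library `sqlite3` module; the function names here are illustrative, not the ones in the scripts), the three actions might look like:

```python
import sqlite3

def list_tables(conn):
    """Return the names of all user tables in the database."""
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"
    ).fetchall()
    return [r[0] for r in rows]

def table_schema(conn, table):
    """Return the CREATE TABLE statement for a single table."""
    row = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type='table' AND name=?",
        (table,),
    ).fetchone()
    return row[0] if row else None

def execute_sql(conn, query):
    """Run an arbitrary query and return the result rows."""
    return conn.execute(query).fetchall()

# Example against an in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plants (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO plants (name) VALUES ('fern'), ('cactus')")
print(list_tables(conn))
print(execute_sql(conn, "SELECT name FROM plants"))
```

Each action's result is fed back to the model as an observation, so the LLM can chain them: list the tables, inspect a schema, then write and run a query.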
The things you'll need to do are:
- Provide a SQLite database (named `example.db`, or change the name in the Python files).
- Change the prompts in both Python scripts (the `prompt` string inside the `execute` functions) to be specific to your data and problems. You'll also want to update the `DATA_HELP` table and column descriptions in `run-sql-queries.py`.
- Download a GGUF model, and change the `MODEL_PATH` variable in both scripts to point at its location.
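At each step of the loop, the scripts have to pull the requested action out of the model's free-text reply before dispatching it. A common ReAct convention is an `Action: name: input` line; the sketch below assumes that grammar (the exact prompt format in these scripts may differ):

```python
import re

# Matches a line like "Action: execute_sql: SELECT * FROM plants"
# (assumed grammar: action name, colon, optional input).
ACTION_RE = re.compile(r"^Action: (\w+): ?(.*)$", re.MULTILINE)

def parse_action(model_output):
    """Extract the first Action line from the model's reply, if any."""
    m = ACTION_RE.search(model_output)
    return (m.group(1), m.group(2)) if m else None

print(parse_action("Thought: I should check the schema.\n"
                   "Action: list_tables: "))
print(parse_action("Answer: there are 2 plants."))
```

When no action is found, the reply is treated as the final answer and the loop ends; otherwise the action runs and its observation is appended to the prompt for the next turn.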
There are minimal dependencies for this project: just sqlite-utils and llama-cpp-python. You can install them with pip:

```shell
pip install -r requirements.txt
```
Once you have everything installed and configured, you can kick off a session by coming up with a question and asking it on the command line:
```shell
python run-interface.py "What kind of data do I have available?"
python run-sql-queries.py "What are some interesting records in the database?"
```
The model output will be printed to stdout.