Neoj_LLMs is a tool for interacting with your Neo4j database, whether it is hosted locally or on a sandbox. It lets users ask natural language questions about their database, which are then translated into Neo4j Cypher queries using Large Language Models (LLMs), making querying intuitive and accessible.
- **Database Connection:** Connect to your Neo4j database, whether it is hosted locally or on a sandbox.
- **Natural Language Queries:** Ask questions about your database in plain language, simplifying interaction with the graph.
- **LLM Translation:** Use Large Language Models to translate natural language questions into Neo4j Cypher queries.
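The translation step can be sketched roughly as follows. This is a minimal, hypothetical example, not the project's actual implementation: the `build_prompt` and `translate_to_cypher` helpers, and the prompt wording, are assumptions standing in for whatever prompt the repository really sends to its Hugging Face model.

```python
def build_prompt(question: str, schema: str) -> str:
    """Compose a schema-aware prompt asking an LLM for a Cypher query.

    Illustrative sketch; the real project may phrase its prompt differently.
    """
    return (
        "You are a Neo4j expert. Given this graph schema:\n"
        f"{schema}\n"
        "Translate the following question into a single Cypher query.\n"
        f"Question: {question}\n"
        "Cypher:"
    )


def translate_to_cypher(question: str, schema: str, llm) -> str:
    """Send the prompt to any callable LLM and return its raw completion."""
    return llm(build_prompt(question, schema)).strip()


# Example with a stub LLM standing in for a hosted model:
fake_llm = lambda prompt: "MATCH (m:Movie) RETURN m.title LIMIT 10"
query = translate_to_cypher(
    "List ten movie titles", "(:Movie {title: STRING})", fake_llm
)
print(query)  # MATCH (m:Movie) RETURN m.title LIMIT 10
```

Keeping the LLM as a plain callable makes it easy to swap the stub for a real Hugging Face inference call without changing the prompt logic.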
Follow these steps to get started with Neoj_LLMs:
- Clone the repository:

  ```shell
  git clone https://github.com/elmondhir/Neo4j-LLMS.git
  cd Neoj-LLMs
  ```

- Install dependencies:

  ```shell
  pip install -r requirements.txt
  ```
- Configure your Neo4j database connection details in the `config.py` file.
- Set up the Hugging Face API key for LLM translation.
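A `config.py` along these lines is typical for this kind of setup. The variable names below are assumptions for illustration only, so match them to whatever the repository's file actually expects:

```python
# config.py -- illustrative sketch; variable names are assumed, not taken
# from the repository. Reading from environment variables avoids committing
# credentials to version control.
import os

# Neo4j connection details (local instance or sandbox URI)
NEO4J_URI = os.getenv("NEO4J_URI", "bolt://localhost:7687")
NEO4J_USER = os.getenv("NEO4J_USER", "neo4j")
NEO4J_PASSWORD = os.getenv("NEO4J_PASSWORD", "password")

# Hugging Face API key used for the LLM translation step
HUGGINGFACE_API_KEY = os.getenv("HUGGINGFACE_API_KEY", "")
```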
- Run the application:

  ```shell
  python -B manage.py runserver
  ```

- Open your browser and go to http://localhost:8000 to access the Neoj_LLMs interface.
- Connect to your Neo4j database by providing the necessary connection details.
- Ask natural language questions about your database in the provided interface.
- Neoj_LLMs translates your questions into Neo4j Cypher queries using Large Language Models.
- View and execute the generated Cypher queries on your Neo4j database.
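Before executing a generated query, it is common to strip the markdown code fences that LLMs often wrap around their output. The helper below is an illustrative sketch, not code from this repository; the driver usage in the trailing comment assumes the official `neo4j` Python package:

```python
def extract_cypher(llm_output: str) -> str:
    """Strip optional markdown code fences from an LLM completion,
    leaving only the Cypher text. Illustrative helper, not from the repo."""
    text = llm_output.strip()
    if text.startswith("```"):
        # Drop the opening fence (with its optional language tag) and the
        # closing fence, keeping everything in between.
        lines = [ln for ln in text.splitlines() if not ln.startswith("```")]
        text = "\n".join(lines).strip()
    return text


print(extract_cypher("```cypher\nMATCH (n) RETURN n LIMIT 5\n```"))
# MATCH (n) RETURN n LIMIT 5

# Running the cleaned query with the official `neo4j` driver might look like:
#
#   from neo4j import GraphDatabase
#   driver = GraphDatabase.driver(uri, auth=(user, password))
#   with driver.session() as session:
#       records = list(session.run(extract_cypher(llm_output)))
```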
To run Neoj_LLMs using Docker, make sure Docker is installed on your machine. If not, you can download and install Docker from the official Docker website.
Follow these steps:
- Build the Docker image:

  ```shell
  docker-compose build
  ```

- Start the application:

  ```shell
  docker-compose up
  ```

- Open your browser and go to http://localhost:8000 to access the Neoj_LLMs interface.
We welcome contributions! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.