This sample demonstrates a few approaches for creating ChatGPT-like experiences over your own data. It uses Azure OpenAI Service to access the ChatGPT model (gpt-35-turbo and GPT-3), and a vector store (Pinecone, Redis, and others) or Azure Cognitive Search for data indexing and retrieval.
The repo provides a way to upload your own data, so it's ready to try end to end.
- Upload (PDF and text documents, as well as web pages)
- Chat
- Q&A interfaces
- Explores various options to help users evaluate the trustworthiness of responses, with citations, tracking of source content, etc.
- Shows possible approaches for data preparation, prompt construction, and orchestration of the interaction between the model (ChatGPT) and the retriever
- Integration with Cognitive Search and vector stores (Redis, Pinecone)
NOTE: To deploy and run this sample, you'll need an Azure subscription with access enabled for the Azure OpenAI Service. You can request access here.
- Azure Developer CLI
- Python 3.9
- Node.js
- Git
- Azure Functions extension for VS Code
- Azure Functions Core Tools
- PowerShell 7+ (pwsh) - For Windows users only. Important: Ensure you can run `pwsh.exe` from a PowerShell command. If this fails, you likely need to upgrade PowerShell.
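The prerequisites above can be sanity-checked from a terminal. A minimal sketch (the command names `azd`, `func`, and `pwsh` are the CLI entry points assumed for the Azure Developer CLI, Azure Functions Core Tools, and PowerShell 7+):

```shell
# Report which of the required command-line tools are missing from PATH.
missing=""
for tool in azd python node git func pwsh; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -z "$missing" ]; then
  echo "all prerequisites found"
else
  echo "missing:$missing"
fi
```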
Deploy the required Azure services using the scripts and steps below:
- Git clone the repo
- Install the prerequisites listed above
- Run `azd login` to sign in to Azure with your credentials
- Run `azd init` to initialize the environment; enter an environment name and select a subscription and location
- Run `azd env set AZURE_PREFIX <PrefixName>`, replacing `<PrefixName>` with the prefix to use during deployment
- Run `azd up` to deploy the infrastructure code (Azure services) and deploy the Azure Functions as well as the backend app

Note: Ensure that the location you select is one where the Azure OpenAI Service is available to deploy (https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/models#model-summary-table-and-region-availability)
The `azd up` command will deploy the following services:
- Azure App Service Plan (Linux, B1 tier)
- Azure Cognitive Search service (Standard tier)
- Azure App Service (hosts the backend service)
- Azure Function App (hosts all the Python APIs)
- Storage account (stores all your files) and a Function storage account
- Azure OpenAI Service
- Azure Application Insights

Note: External vector stores (Pinecone, Redis) are not deployed; you will need to deploy them manually.
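End to end, the `azd` steps above look like the following in a terminal (`<PrefixName>` is a placeholder you choose; `azd init` prompts interactively for the environment name, subscription, and location):

```shell
azd login                               # authenticate with your Azure credentials
azd init                                # enter environment name, subscription, location
azd env set AZURE_PREFIX <PrefixName>   # prefix used to name the deployed resources
azd up                                  # provision the Azure services and deploy the apps
```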
Alternatively, deploy the following services manually:
- Azure OpenAI Service. Please be aware of the model and region availability documented [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/models#model-summary-table-and-region-availability)
- Storage account with a container
- One of the document stores:
  - Pinecone (Starter tier). Note: Make sure you create the index in Pinecone with dimensions set to 1536 and the metric set to cosine
  - Cognitive Search
  - Redis
- Function App (https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-function-app-portal)
- Azure Web App
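If you use Pinecone, the index must be created with 1536 dimensions (the output size of OpenAI's text-embedding-ada-002 model) and the cosine metric. You can do this from the Pinecone console, or as a sketch against Pinecone's controller REST API of that era (the index name `oai-embeddings` and the two environment variables here are placeholders, not values this repo defines):

```shell
# Create a Pinecone index matching what this sample expects:
# 1536 dimensions, cosine similarity. Assumes PINECONE_API_KEY and
# PINECONE_ENVIRONMENT are set from your Pinecone console.
curl -s -X POST "https://controller.${PINECONE_ENVIRONMENT}.pinecone.io/databases" \
  -H "Api-Key: ${PINECONE_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"name": "oai-embeddings", "dimension": 1536, "metric": "cosine"}'
```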
- Git clone the repo
- Open the cloned repo folder in VS Code
- Open a new terminal and go to the /app/frontend directory
- Run `npm install` to install all the packages
- Go to the /api/Python directory
- Run `pip install -r requirements.txt` to install all the required Python packages
- Copy sample.settings.json to local.settings.json
- Update the configuration (at a minimum you need OpenAI, one of the document stores, and the storage account)
- Deploy the Python API to the Function App
- Open a new terminal and go to the /app/frontend directory
- Run `npm run build` to create a production build and copy the static files to the app/backend/static directory
- Open a new terminal and go to the /app/backend directory
- Copy env.example to a .env file and edit it to enter the Python API and the storage configuration
- Deploy the app/backend Azure Web App
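The build and deploy steps above can be sketched as shell commands. This assumes the Azure CLI and Azure Functions Core Tools are installed and you are signed in; `<function-app-name>` and `<web-app-name>` are placeholders for the resources you created:

```shell
# Build the frontend and stage its static files into the backend.
cd app/frontend
npm install
npm run build                     # emits static files into app/backend/static

# Publish the Python API to your Function App.
cd ../../api/Python
pip install -r requirements.txt
func azure functionapp publish <function-app-name>

# Deploy the backend web app.
cd ../../app/backend
az webapp up --name <web-app-name> --runtime PYTHON:3.9
```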
- Git clone the repo
- Open the cloned repo folder in VS Code
- Open a new terminal and go to the /app/frontend directory
- Run `npm install` to install all the packages
- Go to the /api/Python directory
- Run `pip install -r requirements.txt` to install all the required Python packages
- Copy sample.settings.json to local.settings.json
- Update the configuration (at a minimum you need OpenAI, one of the document stores, and the storage account)
- Start the Python API by running `func host start`
- Open a new terminal and go to the /app/backend directory
- Copy env.example to a .env file and edit it to enter the Python localhost API and the storage configuration
- Run `py app.py` to start the backend locally (on port 5000)
- Open a new terminal and go to the /app/frontend directory
- Run `npm run dev` to start the local dev server (on port 5173)
- Browse to localhost:5173 to open the web app
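For local development, the three long-running processes above each need their own terminal. As a sketch (run each group in a separate terminal, from the repo root):

```shell
# Terminal 1: start the Python API on the Functions local host.
cd api/Python
func host start

# Terminal 2: start the backend, which listens on port 5000.
cd app/backend
py app.py        # use `python app.py` on macOS/Linux

# Terminal 3: start the frontend dev server, which listens on port 5173.
cd app/frontend
npm run dev
```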
Once in the web app:
- Try different topics in chat or Q&A context. For chat, try follow up questions, clarifications, ask to simplify or elaborate on answer, etc.
- Explore citations and sources
- Click on "settings" to try different options, tweak prompts, etc.
- Revolutionize your Enterprise Data with ChatGPT: Next-gen Apps w/ Azure OpenAI and Cognitive Search
- Azure Cognitive Search
- Azure OpenAI Service
Adapted from the Azure OpenAI Search repo at OpenAI-CogSearch