AI Assistant
Please check the user guide to see what the AI assistant features look like.

The AI assistant plugin is powered by an LLM (Large Language Model), such as ChatGPT from OpenAI. We use Langchain to build the plugin.
AI Assistant Plugin
The AI Assistant plugin allows users to do:

- title generation
- text-to-SQL
- query auto fix
- SQL completion
Please follow the steps below to enable the AI assistant plugin:

- Add the `langchain` package dependency by adding `-r ai/langchain.txt` to `requirements/local.txt`.
- [Optional] Create your own AI assistant provider if needed. Please refer to `querybook/server/lib/ai_assistant/openai_assistant.py` as an example.
- Add your provider in `plugins/ai_assistant_plugin/__init__.py`.
- Add configs in `querybook_config.yaml`. Please refer to `containers/bundled_querybook_config.yaml` as an example, and check the model's official docs for all available model args.
  - Don't forget to set the proper environment variables for your provider, e.g. for OpenAI you'll need `OPENAI_API_KEY`.
- Enable it in `querybook/config/querybook_public_config.yaml`.
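The provider step above boils down to exposing a named assistant class that the config can select. Below is a minimal, self-contained sketch of that registration pattern; the class and function names (`BaseAIAssistant`, `ALL_PLUGIN_AI_ASSISTANTS`, `get_ai_assistant`) are illustrative assumptions, not Querybook's actual API — refer to `querybook/server/lib/ai_assistant/openai_assistant.py` for the real interface.

```python
# Hypothetical sketch of the provider-registration pattern; names are
# illustrative, not Querybook's actual API.

class BaseAIAssistant:
    """Minimal interface an assistant provider might implement."""

    name = "base"

    def generate_title(self, query: str) -> str:
        raise NotImplementedError


class MyAssistant(BaseAIAssistant):
    name = "my_provider"

    def generate_title(self, query: str) -> str:
        # A real provider would call an LLM here (e.g. via Langchain).
        return f"Title for: {query[:20]}"


# The plugin module exposes the available providers so the config
# (querybook_config.yaml) can pick one by name.
ALL_PLUGIN_AI_ASSISTANTS = [MyAssistant]


def get_ai_assistant(name: str) -> BaseAIAssistant:
    for cls in ALL_PLUGIN_AI_ASSISTANTS:
        if cls.name == name:
            return cls()
    raise ValueError(f"Unknown AI assistant provider: {name}")
```

The key idea is that the plugin module only needs to make your class discoverable by name; the rest of Querybook resolves it from the config.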
Vector Store Plugin
The vector store plugin supports embedding-based table search using natural language. It requires an embeddings provider and a vector store. Please check the Langchain docs for details on the available embeddings and vector stores.

How to set up and host a vector store, or use a cloud vector store solution, is not covered here; you can choose your own vector DB solution.
- [Optional] Create your own embeddings or vector store if needed. Please refer to `querybook/server/lib/vector_store/stores/opensearch.py` as an example.
- Add the providers in `plugins/vector_store_plugin/__init__.py`.
- Add configs in `querybook_config.yaml`. Please refer to `containers/bundled_querybook_config.yaml` as an example, and check the Langchain docs for the configs each vector store requires.
  - Also don't forget to set the proper environment variables for your provider, e.g. for OpenAI embeddings you'll need `OPENAI_API_KEY`.
- Enable it in `querybook/config/querybook_public_config.yaml`.
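Conceptually, embedding-based table search embeds each table's summary and the user's natural-language query into the same vector space, then returns the nearest tables. The following toy, self-contained sketch illustrates the idea using bag-of-words vectors in place of a real embeddings provider and an in-memory dict in place of a real vector store (all names and data here are made up for illustration):

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Stand-in for a real embeddings provider: bag-of-words counts.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# Stand-in "vector store": table name -> embedded table summary.
tables = {
    "sales.orders": embed("customer orders with order date and total amount"),
    "hr.employees": embed("employee records with name department and salary"),
}


def search_tables(query: str, top_k: int = 1) -> list:
    q = embed(query)
    ranked = sorted(tables, key=lambda t: cosine(q, tables[t]), reverse=True)
    return ranked[:top_k]


print(search_tables("which table has customer order totals"))
# → ['sales.orders']
```

A real deployment replaces `embed` with the configured embeddings provider and the dict with the configured vector store, but the retrieval flow is the same.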
With the vector store plugin enabled, text-to-SQL will also use it to find tables if the user does not provide them.
Initialize the Vector Index
In Docker-based deployments, attach to the `web` or `worker` component and run:

`python ./querybook/server/scripts/init_vector_store.py`

This will add summaries of all tables, along with sample-query summaries for those tables, to the vector store. If you'd like to index only a subset of the tables, you can follow the example of ingest_vector_index
to create your own script.
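A partial-indexing script could be adapted along these lines. Everything in this sketch is a hypothetical placeholder (`fetch_all_tables`, `ingest_table`, and the sample table names are invented, not Querybook's actual helpers) — check `scripts/init_vector_store.py` and `ingest_vector_index` for the real functions to call.

```python
# Hypothetical sketch of a partial-indexing script; helper names are
# placeholders, not Querybook's actual functions.


def fetch_all_tables() -> list:
    # Placeholder: in Querybook this would query the metastore.
    return ["sales.orders", "sales.refunds", "hr.employees"]


def ingest_table(full_name: str) -> None:
    # Placeholder: in Querybook this would embed the table summary
    # and write it to the configured vector store.
    print(f"indexed {full_name}")


def index_subset(prefix: str) -> list:
    """Index only tables whose full name starts with the given prefix."""
    selected = [t for t in fetch_all_tables() if t.startswith(prefix)]
    for t in selected:
        ingest_table(t)
    return selected


print(index_subset("sales."))
# → ['sales.orders', 'sales.refunds']
```

The filtering step is the only part you'd change; the ingestion calls stay whatever the real script uses.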