Building a Context-Aware Chatbot with Python and BigQuery
Most chatbots are either clever or useful — rarely both. What if you could build a lightweight assistant that actually knew what mattered to your business — without fine-tuning or a full RAG stack?
In this post, we’ll walk through how to build a context-aware chatbot in Python that:
- Uses prompt templates
- Pulls live context from BigQuery
- Generates smart responses using an LLM (like OpenAI or Gemini)
It’s a simple but powerful setup — one that I’ve used in real production tools.
🧠 The Idea
We’re building a chatbot that answers:
“What’s the most viewed product page this week?”
Instead of just guessing or hallucinating, the bot will query BigQuery for relevant context, inject that into a prompt, and give the user a helpful response.
🔁 High-Level Architecture
```mermaid
graph TD
    A[User Prompt] --> B[Chat Controller]
    subgraph CONTROLLER
        B --> C[Intent Classification / Routing]
        C --> D[Query BigQuery for Context]
        D --> E[Fill Prompt Template]
        E --> F[Call LLM API]
    end
    F --> G[Formatted Response]
    G --> H[User]
```
🧪 Example Dataset
Let’s say we have a table in BigQuery, `dataset.pageviews`, with the following schema:
- `page_url` (STRING)
- `views` (INTEGER)
- `timestamp` (TIMESTAMP)
We’ll write a small Python chatbot that answers:
“What was the top product page this week?”
🧰 Tools You’ll Need
- `google-cloud-bigquery`
- `openai` (or another LLM SDK)
- `jinja2` for templating

Install them via pip:

```bash
pip install openai google-cloud-bigquery jinja2
```
🧠 Step 1: Set Up the Prompt Template
```python
from jinja2 import Template

prompt_template = Template("""
You are a helpful analytics assistant.

Using the following data:
{{ context }}

Answer the user's question:
"{{ question }}"
""")
```
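Before wiring anything else up, it's worth rendering the template with stub values to see exactly what the model will receive. A quick sanity check (the context string and question below are made up; the template is reproduced so the snippet runs on its own):

```python
from jinja2 import Template

# Same shape as the template above, repeated here so this snippet is standalone
prompt_template = Template("""
You are a helpful analytics assistant.

Using the following data:
{{ context }}

Answer the user's question:
"{{ question }}"
""")

# Render with sample values to inspect the final prompt text
preview = prompt_template.render(
    context="Top page: /products/widget-9000, Views: 1234",
    question="What was the top product page this week?",
)
print(preview)
```

Printing the rendered prompt like this catches templating mistakes (a missing variable, stray whitespace) before you spend API calls on them.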
🔎 Step 2: Query BigQuery for Context
```python
from google.cloud import bigquery

def get_top_product_page():
    client = bigquery.Client()
    query = """
        SELECT page_url, views
        FROM `your-project.dataset.pageviews`
        WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
        ORDER BY views DESC
        LIMIT 1
    """
    results = client.query(query).result()
    # RowIterator is an iterable, not an iterator, so wrap it in iter()
    row = next(iter(results))
    return f"Top page: {row.page_url}, Views: {row.views}"
```
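If you later want to feed the model more than one row (say, the top five pages via `LIMIT 5`), a small formatter keeps the context readable. A sketch: the rows here are simulated with namedtuples, but BigQuery `Row` objects support the same attribute access, so the helper works unchanged on real results:

```python
from collections import namedtuple

def format_context(rows):
    """Turn an iterable of rows with page_url/views into prompt-ready lines."""
    return "\n".join(
        f"{i}. {row.page_url}: {row.views} views"
        for i, row in enumerate(rows, start=1)
    )

# Simulated result set; in production these would come from client.query(...)
Row = namedtuple("Row", ["page_url", "views"])
rows = [Row("/products/a", 900), Row("/products/b", 750), Row("/products/c", 420)]
print(format_context(rows))
# 1. /products/a: 900 views
# 2. /products/b: 750 views
# 3. /products/c: 420 views
```

Numbered lines like this give the LLM an unambiguous ranking to quote from, which reduces hallucinated orderings.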
🧩 Step 3: Combine Context and Prompt
```python
def build_prompt(user_input):
    context = get_top_product_page()
    return prompt_template.render(
        context=context,
        question=user_input,
    )
```
🔮 Step 4: Call the LLM
```python
from openai import OpenAI

# The v1 SDK reads OPENAI_API_KEY from the environment by default
client = OpenAI()

def ask_chatbot(user_input):
    prompt = build_prompt(user_input)
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": prompt,
        }],
    )
    return response.choices[0].message.content
```
✅ Final Output
```python
if __name__ == "__main__":
    user_question = "What was the most viewed product page this week?"
    print(ask_chatbot(user_question))
```
🛠️ Extend This Further
- Route more questions to different queries
- Return visuals (e.g. `matplotlib`)
- Schedule reports or use a Slackbot
- Add guardrails for safe outputs
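The first bullet, routing, can start as plain keyword matching before you reach for a real intent classifier. A minimal sketch; the handler names and canned strings are hypothetical stand-ins for query helpers like `get_top_product_page()`:

```python
def top_page_context():
    return "Top page: /products/widget, Views: 1234"  # stand-in for get_top_product_page()

def traffic_trend_context():
    return "Views are up 12% week over week"  # stand-in for another BigQuery helper

# Routing table: if any keyword appears in the question, run the matching fetcher
ROUTES = [
    (("top", "most viewed"), top_page_context),
    (("trend", "traffic"), traffic_trend_context),
]

def route(question):
    q = question.lower()
    for keywords, handler in ROUTES:
        if any(k in q for k in keywords):
            return handler()
    return "No matching data source"  # let the LLM answer without injected context

print(route("What was the most viewed product page this week?"))
```

When keyword lists stop scaling, swap `route()` for an LLM-based classifier without touching the handlers: the routing-table shape stays the same.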
💡 Wrap Up
You don’t need a full RAG stack to build useful tools. Prompt templates + BigQuery + Python gets you far.
Want to build one for your team? andre@promptloop.dev