You can find various examples of Chainlit apps here that leverage tools and services such as OpenAI, Anthropic, LangChain, LlamaIndex, ChromaDB, Pinecone and more.
📃 License
Chainlit is open-source and licensed under the Apache 2.0 license.
[CML] tux@camelot …/ai-agents-crash-course/multi_agent_chatbot $ sh run_local.sh
run_local.sh: line 2: chainlit: command not found
[CML] tux@camelot …/ai-agents-crash-course/multi_agent_chatbot $ which python
/usr/bin/which: no python in (/home/tux/.local/bin:/home/tux/bin:/usr/local/bin:/usr/bin)
[CML] tux@camelot …/ai-agents-crash-course/multi_agent_chatbot $ which python3
/usr/bin/python3
SET UP venv LOCALLY
=====================
[CML] tux@camelot …/ai-agents-crash-course/multi_agent_chatbot $ python3 -m venv venv
[CML] tux@camelot …/ai-agents-crash-course/multi_agent_chatbot $ ls -lsa
total 32
0 drwxr-xr-x. 4 tux tux 167 Mar 12 11:45 .
4 drwxr-xr-x. 18 tux tux 4096 Mar 11 22:09 ..
4 -rw-r--r--. 1 tux tux 2344 Mar 11 21:32 agentic_chatbot.py
0 drwxr-xr-x. 3 tux tux 45 Mar 11 21:32 .chainlit
4 -rw-r--r--. 1 tux tux 761 Mar 11 21:32 chainlit.md
0 -rw-r--r--. 1 tux tux 0 Mar 11 21:32 __init__.py
12 -rw-r--r--. 1 tux tux 9796 Mar 11 21:32 nutrition_agent.py
4 -rwxr-xr-x. 1 tux tux 76 Mar 11 21:32 run_chatbot.sh
4 -rwxr-xr-x. 1 tux tux 59 Mar 12 11:43 run_local.sh
0 drwxr-xr-x. 5 tux tux 92 Mar 12 11:45 venv
(venv)
=======
[CML] tux@camelot …/ai-agents-crash-course/multi_agent_chatbot $ source venv/bin/activate
(venv) [CML] tux@camelot …/ai-agents-crash-course/multi_agent_chatbot $ which python
~/camelot/ai-agents-crash-course/multi_agent_chatbot/venv/bin/python

requirements.txt exists
=======================
pip install -r ../requirements.txt

Load the data
=============
Status check
- The local venv Python environment works.
- Chainlit works.
- The source dataset exists: data/calories.csv.
- The Chroma database exists but is empty.

So the one task left is simply: load the CSV into the Chroma collection called nutrition_db. Once that exists, your chatbot’s lookup tool will work.
What you need to do (conceptually)
- Read data/calories.csv.
- For each row, extract:
  - food name
  - category
  - calories per 100g
- Store those as metadata in Chroma.
- Save them under a collection called nutrition_db.

Then your agent can query it.
The only command you’ll run after creating the loader script
From the multi_agent_chatbot directory:
python build_nutrition_db.py
That will populate ../chroma.
How you'll know it worked
When you check the collections, you should see:

nutrition_db

After that you can run your chatbot normally.
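A quick way to do that check, again as a sketch assuming chromadb and the ../chroma path from this session: newer chromadb releases return plain collection names from list_collections() while older ones return collection objects, so the snippet normalizes both.

```python
# check_nutrition_db.py - sanity check sketch (path and collection name
# assumed from this session: ../chroma and nutrition_db).

def check_collection(names, expected="nutrition_db"):
    """Return True if the expected collection name is in the list."""
    return expected in names

if __name__ == "__main__":
    import chromadb  # assumed dependency: pip install chromadb
    client = chromadb.PersistentClient(path="../chroma")
    # Normalize: newer chromadb returns names (str), older returns objects.
    names = [c if isinstance(c, str) else c.name
             for c in client.list_collections()]
    if check_collection(names):
        print(client.get_collection("nutrition_db").count(), "rows loaded")
    else:
        print("nutrition_db missing - run build_nutrition_db.py first")
```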
And just to say it explicitly: your instinct earlier—
“Where does the data come from?”
—is exactly the right question. That’s the difference between running an AI demo and building a system you control.
If you'd like, next we can also make your dataset pipeline cleaner so your repo structure becomes something like:
data/
vector_db/
build_scripts/
agents/

which is the pattern used in production RAG systems.