Llamabot Documentation Read-through
Library User
- Use the library to build stuff
- Experience: SimpleBot
- Basic: can write Python
- Some experience in interacting with LLMs (Desktop apps)
- Experience: QueryBot, StructuredBot
- People who have interacted with RAG (e.g. asking ChatGPT to answer questions over documents), or, for StructuredBot, have asked an LLM to produce templated text
- AgentBot (Advanced Level)
- People interested in building agents that perform autonomous actions through executed function calls
- Need to understand how to write Python functions
- Have had experience with coding agents (Claude Code)
Developer
- Code contribution
- Knows the basics of git: forking, branching (git specifically)
- Will need to work with mkdocs after reading the docs, and needs to be able to add documentation here
- Code base structure
Two links in the Table of Contents state “LlamaBot: A Pythonic bot interface to LLMs”.

Installation notes: LlamaBot: A Pythonic bot interface to LLMs
- It’s always best if hyperlinks open in new tabs rather than replacing the current tab
- If linking to a section inside the docs, there is no need to open a new tab
- What application do I need to use to install LlamaBot?
- If the user does not have pip, they are blocked on the first install instruction. How would they get pip?
- Use Marimo Notebook instead of pip install
- Used https://github.com/ericmjl/building-with-llms-made-simple/blob/main/notebooks/00_intro.py#L139 instead of Ollama.
- For a SimpleBot with a local Ollama model, ollama_chat needs to be specified as the provider rather than as a prefix (see the sketch after this list)
- For the “Simplest memory” example, the docs imply that assigning the response to a variable is important/required, but that is not true; the variable does not need to be declared (see the sketch after this list)
- For the extra document demonstrations, it would be nice to provide users with example docs/images to download and test with
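
A minimal sketch of the two SimpleBot notes above, assuming SimpleBot’s `(system_prompt, model_name=...)` constructor and the LiteLLM-style `ollama_chat/<model>` provider string; `gemma2` is a placeholder model name, and the exact provider syntax should follow whatever the corrected docs say:

```python
import llamabot as lmb

# SimpleBot pointed at a local Ollama model. The "ollama_chat/..." string is
# the provider/model spec passed through to the underlying LLM client;
# gemma2 is a placeholder, not a recommendation.
bot = lmb.SimpleBot(
    "You are a helpful assistant.",
    model_name="ollama_chat/gemma2",
)

# Calling the bot without assigning the response to a variable works fine;
# the reply is still generated and displayed.
bot("What is the capital of France?")

# Assign the response only if you want to reuse it later in the notebook.
response = bot("And what about Germany?")
```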
Prepare Llamabot installation with Marimo:
- Install uv (https://docs.astral.sh/uv/getting-started/installation/)
- cd to the directory in which you want the new notebook (created by the command in step 3) to live
- Create a new notebook with “uvx marimo edit --sandbox notebook.py” (use whatever filename you like). This will also launch Marimo.
Once in Marimo:
- We will need to configure Marimo to download all of LlamaBot’s tooling
- First, bring up the LlamaBot installation pop-up by executing “import llamabot as lmb” in a code cell (see the notebook sketch after this list)
- Configure it to download all of LlamaBot’s tooling by clicking the “+” button next to the “Llamabot” dropdown and selecting “all”. (An image would be best.)
- Click “Install with UV”
- Users will try to speed through the pop-up as soon as they see it; show them these steps before they get the prompt
- DEBUG text showing in red means success (for Eric: can we add text that says “Success” and/or make the color green? Red reads as an error, and users asked how to resolve the “Debug error”)
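
For reference, a minimal sketch of what the Marimo notebook file looks like on disk once the import cell from the first step is in place; the cell layout follows Marimo’s standard generated format (version-specific headers omitted), and the SimpleBot call reuses the placeholder ollama_chat/gemma2 model name from the earlier sketch:

```python
import marimo

app = marimo.App()


@app.cell
def _():
    # Executing this import in a cell is what triggers Marimo's
    # "install missing package" pop-up for llamabot.
    import llamabot as lmb

    return (lmb,)


@app.cell
def _(lmb):
    # Once the installation succeeds, LlamaBot is usable in later cells.
    # gemma2 is a placeholder; point this at whatever local Ollama model
    # you have pulled.
    bot = lmb.SimpleBot(
        "You are a helpful assistant.",
        model_name="ollama_chat/gemma2",
    )
    bot("Say hello!")
    return


if __name__ == "__main__":
    app.run()
```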
Developer Readme:
- The developer readme seems like it should include “Hack on Llamabot with Development Container” inline, rather than having it hidden in a separate section/document