
Botbot

Botbot, your not-so-friendly Bot

Botbot aims to be a multi-service chat assistant for Matrix that captures room conversations, stores them, and uses large language models to provide concise summaries, answer follow-up questions, and surface action points.

Roadmap / Aspirational goals

  • Persistent conversation store (PostgreSQL) for long-range context.
  • Pluggable LLM backends (local models, Ollama, etc.).
  • Structured meeting summaries with action items and deadlines.
  • Additional chat frontends (Telegram, WhatsApp, …).

Architecture

flowchart LR
  MS["matrix_service<br/>(Python & nio)"]
  AI["ai_service<br/>(FastAPI)"]
  Redis["Redis<br/>Deduplicates replies / guarantees idempotency"]

  MS -- "HTTP (Bearer token)" --> AI
  AI -- "Matrix Events" --> MS

  MS -.-> Redis
  AI -.-> Redis

| Component      | Image / Entry-point                    | Purpose                                                                            |
|----------------|----------------------------------------|------------------------------------------------------------------------------------|
| matrix_service | python, `matrix_service/main.py`       | Listens to Matrix rooms, forwards each message to ai_service, posts the reply back. |
| ai_service     | python, `ai_service/main.py` (FastAPI) | Builds a prompt, calls the configured LLM (OpenAI today), caches the reply in Redis. |
| redis          | redis:7                                | Reply cache & simple key/value store.                                              |

The services talk to each other over the internal Docker network. Authentication between them is a static bearer token (AI_HANDLER_TOKEN).
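
The bearer-token check could be as simple as comparing the incoming `Authorization` header against `AI_HANDLER_TOKEN`. The sketch below is illustrative (the helper name and structure are assumptions, not the actual code); it uses a constant-time comparison, which is good practice for shared secrets:

```python
import hmac

def is_authorized(auth_header: str, expected_token: str) -> bool:
    """Return True if the Authorization header carries the expected bearer token.

    Hypothetical helper mirroring how ai_service could validate requests
    from matrix_service.
    """
    prefix = "Bearer "
    if not auth_header.startswith(prefix):
        return False
    supplied = auth_header[len(prefix):]
    # hmac.compare_digest avoids leaking information via timing differences.
    return hmac.compare_digest(supplied, expected_token)
```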


Current Features

  • Auto-join on invite: secure E2EE join, including device verification.
  • Stateless AI handler: FastAPI endpoint /api/v1/message receiving a JSON payload.
  • Idempotent replies: duplicate Matrix events reuse the cached answer.
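
The idempotency idea can be sketched as follows: the reply is cached under the Matrix event ID, so a redelivered event reuses the stored answer instead of triggering another LLM call. A plain dict stands in for Redis here, and the function name is illustrative, not the actual code:

```python
def answer(event_id: str, content: str, cache: dict, generate) -> str:
    """Return a reply for a Matrix event, reusing the cached one on redelivery."""
    cached = cache.get(event_id)
    if cached is not None:
        return cached            # duplicate event: reuse the cached answer
    reply = generate(content)    # first delivery: call the (expensive) LLM
    cache[event_id] = reply      # remember it, keyed by the unique event ID
    return reply
```

In the real service the dict would be Redis (e.g. with a TTL on each key), but the control flow is the same.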

Quick Start (development)

# clone & cd
$ git clone https://gitea.alluna.pt/jfig/botbot.git
$ cd botbot

# copy environment template
$ cp .env.example .env
# edit .env with your homeserver, credentials and OpenAI key

# start everything (hot-reload volumes mounted)
$ docker compose up --build

The default compose file launches three containers:

  • matrix_service watches your rooms
  • ai_service handles AI prompts
  • redis reply cache

Stop with Ctrl-C or docker compose down.
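
A filled-in .env might look roughly like this (placeholder values only; the actual .env.example in the repo is authoritative):

```
MATRIX_HOMESERVER_URL=https://matrix.example.org
MATRIX_USER_ID=@botbot:example.org
MATRIX_PASSWORD=change-me
AI_HANDLER_TOKEN=some-long-random-secret
OPENAI_API_KEY=your-openai-key
REDIS_URL=redis://redis:6379
```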

Production (single image per service)

Build once, then deploy with your orchestrator of choice:

$ docker compose -f docker-compose.yml --profile prod build
$ docker compose -f docker-compose.yml --profile prod up -d

Configuration

All settings are environment variables. The table below reflects the current codebase (commit ae27a2c).

| Variable                     | Service        | Default                | Description                          |
|------------------------------|----------------|------------------------|--------------------------------------|
| LOG_LEVEL                    | both           | INFO                   | Python logging level.                |
| MATRIX_HOMESERVER_URL        | matrix_service |                        | Matrix homeserver base URL.          |
| MATRIX_USER_ID               | matrix_service |                        | Full user ID of the bot.             |
| MATRIX_PASSWORD              | matrix_service |                        | Password for the bot account.        |
| MATRIX_LOGIN_TRIES           | matrix_service | 5                      | Number of login attempts before exit. |
| MATRIX_LOGIN_DELAY_INCREMENT | matrix_service | 5                      | Seconds added per retry.             |
| AI_HANDLER_URL               | matrix_service | http://ai_service:8000 | Where to POST messages.              |
| AI_HANDLER_TOKEN             | both           |                        | Shared bearer token (keep secret).   |
| OPENAI_API_KEY               | ai_service     |                        | Key for the openai Python SDK.       |
| REDIS_URL                    | ai_service     | redis://redis:6379     | Connection string used by redis-py.  |
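
Reading these variables in matrix_service could look like the sketch below, with the defaults from the table. The loader function itself is an assumption for illustration; only the variable names and defaults come from the table:

```python
import os

def load_matrix_settings(env=os.environ) -> dict:
    """Hypothetical settings loader for matrix_service (names are illustrative)."""
    return {
        # Required: KeyError here means the variable is missing from .env.
        "homeserver_url": env["MATRIX_HOMESERVER_URL"],
        "user_id": env["MATRIX_USER_ID"],
        "password": env["MATRIX_PASSWORD"],
        # Optional, with the defaults documented in the table above.
        "login_tries": int(env.get("MATRIX_LOGIN_TRIES", "5")),
        "login_delay_increment": int(env.get("MATRIX_LOGIN_DELAY_INCREMENT", "5")),
        "ai_handler_url": env.get("AI_HANDLER_URL", "http://ai_service:8000"),
    }
```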

API (ai_service)

POST /api/v1/message
Authorization: Bearer <AI_HANDLER_TOKEN>
Content-Type: application/json

{
  "roomId": "!foo:matrix.org",
  "userId": "@alice:matrix.org",
  "eventId": "$abc123",
  "serverTimestamp": 1714821630123,
  "content": "Hello there"
}

Returns {"reply": "Hello Alice!"} or HTTP 401/500 on error.
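
A client for this endpoint only needs to assemble the headers and JSON body shown above. The helper below builds both (so the example runs without a live server); the function name is illustrative, and any HTTP library can send the result:

```python
import json

def build_message_request(token: str, room_id: str, user_id: str,
                          event_id: str, server_ts: int, content: str):
    """Build headers and JSON body for POST /api/v1/message (illustrative helper)."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "roomId": room_id,
        "userId": user_id,
        "eventId": event_id,
        "serverTimestamp": server_ts,
        "content": content,
    })
    return headers, body
```

With requests installed, sending it would be something like `requests.post(f"{AI_HANDLER_URL}/api/v1/message", headers=headers, data=body)`.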


Contributing

Pull requests are welcome! Please open an issue first to discuss what you want to change. All source code is formatted with ruff/black; run pre-commit run --all-files before pushing.


License

MIT
