# Botbot
*Botbot, your not-so-friendly Bot*
**Botbot** aims to be a multi‑service chat assistant for [Matrix](https://matrix.org) that captures room conversations, stores them, and uses large language models (LLMs) to provide concise summaries, answer follow‑up questions, and surface action points.
## Roadmap / Aspirational goals
* Persistent conversation store (PostgreSQL) for long‑range context.
* Pluggable LLM back‑ends (local models, Ollama, etc.).
* Structured meeting summaries with action‑items and deadlines.
* Additional chat front‑ends (Telegram, WhatsApp, …).
---
## Architecture
```mermaid
flowchart LR
    MS["matrix_service<br/>(Python & nio)"]
    AI["ai_service<br/>(FastAPI)"]
    Redis["Redis<br/>Deduplicates replies / guarantees idempotency"]

    MS -- "HTTP (Bearer token)" --> AI
    AI -- "Matrix Events" --> MS

    MS -.-> Redis
    AI -.-> Redis
```
| Component           | Image / Entry‑Point                   | Purpose                                                                               |
| ------------------- | ------------------------------------- | ------------------------------------------------------------------------------------- |
| **matrix\_service** | `python matrix_service/main.py`       | Listens to Matrix rooms, forwards each message to `ai_service`, posts the reply back.  |
| **ai\_service**     | `python ai_service/main.py` (FastAPI) | Builds a prompt, calls the configured LLM (OpenAI today), caches the reply in Redis.   |
| **redis**           | `redis:7`                             | Reply cache & simple key/value store.                                                  |
The services talk to each other over the internal Docker network. Authentication between them is a static bearer token (`AI_HANDLER_TOKEN`).
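
As a minimal sketch of that token check (the helper name and structure are illustrative, not the actual `ai_service` code):

```python
# Hedged sketch of a shared bearer-token check; `is_authorized` is an
# illustrative helper, not the real ai_service implementation.
import hmac


def is_authorized(auth_header, expected_token):
    """Return True when the Authorization header carries the shared token."""
    if not auth_header or not auth_header.startswith("Bearer "):
        return False
    presented = auth_header[len("Bearer "):]
    # hmac.compare_digest keeps the comparison constant-time
    return hmac.compare_digest(presented, expected_token)
```

In FastAPI this would typically live in a dependency that raises `HTTPException(status_code=401)` when the check fails.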
---
## Current Features
* **Auto‑Join on Invite** – secure E2EE join including device verification.
* **Stateless AI handler** – FastAPI endpoint `/api/v1/message` receiving a JSON payload.
* **Idempotent replies** – duplicate Matrix events reuse the cached answer.
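
The idempotency guarantee can be sketched as a cache-aside lookup keyed by the Matrix event id. This is a sketch under assumed semantics: `reply_for_event`, the key format, and the TTL are illustrative, and `cache` only needs Redis-style `get`/`set`.

```python
# Illustrative idempotent-reply pattern: duplicate events hit the cache
# instead of triggering a second LLM call. Names are hypothetical.

REPLY_TTL = 24 * 3600  # assumed retention for cached replies, in seconds


def reply_for_event(cache, event_id, generate_reply):
    """Return the reply for event_id, generating it at most once."""
    key = f"reply:{event_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached  # duplicate Matrix event: reuse the stored answer
    reply = generate_reply()
    cache.set(key, reply, ex=REPLY_TTL)
    return reply
```

With `redis-py`, `cache` would be a `redis.Redis` client and `set(..., ex=...)` sets the expiry.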
---
## Quick Start (development)
```bash
# clone & cd
$ git clone https://gitea.alluna.pt/jfig/botbot.git
$ cd botbot

# copy environment template
$ cp .env.example .env
# edit .env with your homeserver, credentials and OpenAI key

# start everything (hot‑reload volumes mounted)
$ docker compose up --build
```
The default compose file launches three containers:
* `matrix_service` – watches your rooms
* `ai_service` – handles AI prompts
* `redis` – reply cache
Stop with <kbd>Ctrl‑C</kbd> or `docker compose down`.
### Production (single image per service)
Build once, then deploy with your orchestrator of choice:
```bash
$ docker compose -f docker-compose.yml --profile prod build
$ docker compose -f docker-compose.yml --profile prod up -d
```
---
## Configuration
All settings are environment variables. The table below reflects the current codebase (commit `ae27a2c`).
| Variable                       | Service         | Default                  | Description                              |
| ------------------------------ | --------------- | ------------------------ | ---------------------------------------- |
| `LOG_LEVEL`                    | both            | `INFO`                   | Python logging level.                    |
| `MATRIX_HOMESERVER_URL`        | matrix\_service | –                        | Matrix homeserver base URL.              |
| `MATRIX_USER_ID`               | matrix\_service | –                        | Full user id of the bot.                 |
| `MATRIX_PASSWORD`              | matrix\_service | –                        | Password for the bot account.            |
| `MATRIX_LOGIN_TRIES`           | matrix\_service | `5`                      | Number of login attempts before exit.    |
| `MATRIX_LOGIN_DELAY_INCREMENT` | matrix\_service | `5`                      | Seconds added per retry.                 |
| `AI_HANDLER_URL`               | matrix\_service | `http://ai_service:8000` | Where to POST messages.                  |
| `AI_HANDLER_TOKEN`             | both            | –                        | Shared bearer token (keep secret).       |
| `OPENAI_API_KEY`               | ai\_service     | –                        | Key for `openai` Python SDK.             |
| `REDIS_URL`                    | ai\_service     | `redis://redis:6379`     | Connection string used by `redis-py`.    |
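
Assuming `MATRIX_LOGIN_DELAY_INCREMENT` adds a fixed number of seconds before each successive attempt (an interpretation of the table above, not confirmed against the code), the retry schedule works out like this:

```python
# Illustrative reading of MATRIX_LOGIN_TRIES / MATRIX_LOGIN_DELAY_INCREMENT:
# each retry waits `increment` seconds longer than the previous one.
def login_delays(tries=5, increment=5):
    """Seconds to sleep before retries 2..tries (the first attempt is immediate)."""
    return [increment * n for n in range(1, tries)]
```

With the defaults this yields waits of 5, 10, 15 and 20 seconds, then the service exits after the fifth failed attempt.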
---
## API (ai\_service)
```http
POST /api/v1/message
Authorization: Bearer <AI_HANDLER_TOKEN>
Content-Type: application/json

{
  "roomId": "!foo:matrix.org",
  "userId": "@alice:matrix.org",
  "eventId": "$abc123",
  "serverTimestamp": 1714821630123,
  "content": "Hello there"
}
```
Returns `{"reply": "Hello Alice!"}` on success; errors surface as HTTP 401 (missing or wrong bearer token) or HTTP 500 (internal failure).
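
For illustration, a caller could assemble that request like this. `build_message_request` is a hypothetical helper, not part of the codebase; only the payload shape and headers come from the spec above.

```python
# Hypothetical helper that builds the headers and JSON body for
# POST /api/v1/message, matching the payload documented above.
import json


def build_message_request(token, room_id, user_id, event_id, ts, content):
    """Return (headers, body) ready to POST to the ai_service endpoint."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "roomId": room_id,
        "userId": user_id,
        "eventId": event_id,
        "serverTimestamp": ts,
        "content": content,
    })
    return headers, body
```

The resulting pair can be handed to any HTTP client (`httpx`, `requests`, `urllib`) aimed at `AI_HANDLER_URL` + `/api/v1/message`.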
---
## Contributing
Pull requests are welcome! Please open an issue first to discuss what you want to change. All source code is formatted with **ruff** / **black** – run `pre-commit run --all-files` before pushing.
---
## License
[MIT](LICENSE)