*Botbot, your not so friendly Bot*

**Botbot** aims to be a multi‑service chat assistant for [Matrix](https://matrix.org) that captures room conversations, stores them, and uses Large Language Models to provide concise summaries, answer follow‑up questions, and surface action points.

### Roadmap / Aspirational goals

* Persistent conversation store (PostgreSQL) for long‑range context.
* Pluggable LLM back‑ends (local models, Ollama, etc.).
* Structured meeting summaries with action‑items and deadlines.
* Additional chat front‑ends (Telegram, WhatsApp, …).

---

## Architecture

```mermaid
flowchart LR
    MS["matrix_service<br/>(Python & nio)"]
    AI["ai_service<br/>(FastAPI)"]
    Redis["Redis<br/>Deduplicates replies / guarantees idempotency"]

    MS -- "HTTP (Bearer token)" --> AI
    AI -- "Matrix Events" --> MS

    MS -.-> Redis
    AI -.-> Redis
```

| Component | Image / Entry‑Point | Purpose |
| ------------------- | ------------------------------------- | ------------------------------------------------------------------------------------ |
| **matrix\_service** | `python matrix_service/main.py` | Listens to Matrix rooms, forwards each message to `ai_service`, posts the reply back. |
| **ai\_service** | `python ai_service/main.py` (FastAPI) | Builds a prompt, calls the configured LLM (OpenAI today), caches the reply in Redis. |
| **redis** | `redis:7` | Reply cache & simple key/value store. |

The services talk to each other over the internal Docker network. Authentication between them is a static bearer token (`AI_HANDLER_TOKEN`).
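The bearer-token check can be sketched as a small helper; `is_authorized` is illustrative rather than code from this repository, and assumes the token arrives in a standard `Authorization: Bearer <token>` header:

```python
import hmac

# Illustrative sketch only: how ai_service could validate the shared
# AI_HANDLER_TOKEN on each incoming request. The helper name and
# signature are assumptions, not code from this repository.
def is_authorized(authorization_header: str, expected_token: str) -> bool:
    """Return True when the header carries the expected bearer token."""
    scheme, _, token = authorization_header.partition(" ")
    if scheme != "Bearer" or not token:
        return False
    # compare_digest performs a constant-time comparison, avoiding timing leaks
    return hmac.compare_digest(token, expected_token)
```

In the FastAPI service such a check would typically live in a dependency that returns HTTP 401 when it fails.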

---

## Current Features

* **Auto‑Join on Invite** – secure E2EE join including device verification.
* **Stateless AI handler** – FastAPI endpoint `/api/v1/message` receiving a JSON payload.
* **Idempotent replies** – duplicate Matrix events reuse the cached answer.
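The idempotency scheme can be illustrated with a short sketch; a plain dict stands in for Redis here, and `generate_reply` is a hypothetical stand-in for the LLM call:

```python
# Sketch of a reply cache keyed by Matrix event id. In the real stack the
# cache lives in Redis; a dict stands in here, and generate_reply is a
# hypothetical stand-in for the LLM call.
reply_cache: dict[str, str] = {}

def handle_event(event_id: str, content: str, generate_reply) -> str:
    cached = reply_cache.get(event_id)
    if cached is not None:
        return cached  # duplicate delivery: reuse the stored answer
    reply = generate_reply(content)
    reply_cache[event_id] = reply
    return reply
```

A redelivered event id therefore never triggers a second LLM call.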

---

## Quick Start (development)

```bash
# clone & cd
$ git clone https://gitea.alluna.pt/jfig/botbot.git
$ cd botbot

# copy environment template
$ cp .env.example .env
# edit .env with your homeserver, credentials and OpenAI key

# start everything (hot‑reload volumes mounted)
$ docker compose up --build
```

The default compose file launches three containers:

* `matrix_service` – watches your rooms
* `ai_service` – handles AI prompts
* `redis` – reply cache

Stop with <kbd>Ctrl‑C</kbd> or `docker compose down`.

### Production (single image per service)

Build once, then deploy with your orchestrator of choice:

```bash
$ docker compose -f docker-compose.yml --profile prod build
$ docker compose -f docker-compose.yml --profile prod up -d
```

---

## Configuration

All settings are environment variables. The table below reflects the current codebase (commit `ae27a2c`).

| Variable | Service | Default | Description |
| ------------------------------ | --------------- | ------------------------ | ---------------------------------------- |
| `LOG_LEVEL` | both | `INFO` | Python logging level. |
| `MATRIX_HOMESERVER_URL` | matrix\_service | – | Matrix homeserver base URL. |
| `MATRIX_USER_ID` | matrix\_service | – | Full user id of the bot. |
| `MATRIX_PASSWORD` | matrix\_service | – | Password for the bot account. |
| `MATRIX_LOGIN_TRIES` | matrix\_service | `5` | Number of login attempts before exit. |
| `MATRIX_LOGIN_DELAY_INCREMENT` | matrix\_service | `5` | Seconds added per retry. |
| `AI_HANDLER_URL` | matrix\_service | `http://ai_service:8000` | Where to POST messages. |
| `AI_HANDLER_TOKEN` | both | – | Shared bearer token (keep secret). |
| `OPENAI_API_KEY` | ai\_service | – | Key for `openai` Python SDK. |
| `REDIS_URL` | ai\_service | `redis://redis:6379` | Connection string used by `redis-py`. |
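Assembled from the table above, a development `.env` could look like the following; every value is a placeholder, and variables with defaults may be omitted:

```ini
LOG_LEVEL=INFO
MATRIX_HOMESERVER_URL=https://matrix.org
MATRIX_USER_ID=@botbot_user:matrix.org
MATRIX_PASSWORD=change-me
AI_HANDLER_URL=http://ai_service:8000
AI_HANDLER_TOKEN=generate-a-long-random-string
OPENAI_API_KEY=your-openai-key
REDIS_URL=redis://redis:6379
```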

---

## API (ai\_service)

```http
POST /api/v1/message
Authorization: Bearer <AI_HANDLER_TOKEN>
Content-Type: application/json

{
  "roomId": "!foo:matrix.org",
  "userId": "@alice:matrix.org",
  "eventId": "$abc123",
  "serverTimestamp": 1714821630123,
  "content": "Hello there"
}
```

Returns `{"reply": "Hello Alice!"}` or HTTP 401/500 on error.
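As an illustration of the payload contract, the calling side might assemble the request like this (`build_message_request` is a hypothetical helper, not code from this repository):

```python
import json

def build_message_request(token: str, room_id: str, user_id: str,
                          event_id: str, server_ts: int, content: str):
    """Build the headers and JSON body for POST /api/v1/message."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # Field names mirror the documented payload above.
    body = json.dumps({
        "roomId": room_id,
        "userId": user_id,
        "eventId": event_id,
        "serverTimestamp": server_ts,
        "content": content,
    })
    return headers, body
```

The pair can then be handed to any HTTP client, e.g. `requests.post(f"{AI_HANDLER_URL}/api/v1/message", headers=headers, data=body)`.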

---

## Contributing

Pull requests are welcome! Please open an issue first to discuss what you want to change. All source code is formatted with **ruff** / **black** – run `pre-commit run --all-files` before pushing.

---
## License

[MIT](LICENSE)