Compare commits


13 Commits

Louis Simoneau
79279595ac Keep downloaded EPUBs so kobodl can skip them on future syncs
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 21:16:04 +10:00
Louis Simoneau
5197f92685 Fix calibre sync to import books as correct user
Prevents root-owned files in the library volume that
calibre-web can't write to.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 21:11:53 +10:00
Louis Simoneau
bcdc0c6cef Add WireGuard VPN, kobodl, and calibre-web
WireGuard for private service access (kobodl behind VPN).
kobodl downloads and de-DRMs Kobo store purchases.
calibre-web serves the library at books.monotrope.au.
sync.sh script handles ongoing download + import workflow.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 20:56:26 +10:00
Louis Simoneau
6a54777c5c Move Hermes config into volume, add pre-deploy sync check
Config.yaml was bind-mounted, blocking runtime writes (/sethome).
Move it into the Docker volume via docker cp instead. Add
hermes-sync Makefile target that diffs remote config against local
before deploying, to catch runtime changes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 17:06:19 +10:00
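The hermes-sync check introduced here boils down to: snapshot the remote file, diff it against the local copy, and refuse to deploy on a mismatch. A standalone sketch of that pattern using local stand-in files (paths and contents are illustrative; the real target fetches the remote copy over SSH via `docker cp ... - | tar -xO`):

```shell
#!/bin/sh
# Abort a deploy when the "remote" config no longer matches the local one.
set -eu

local_cfg=$(mktemp)
remote_cfg=$(mktemp)

# Stand-ins for infra/hermes/config.yaml and the snapshot pulled off the server.
printf 'model: openrouter/auto\n' > "$local_cfg"
printf 'model: openrouter/auto\n' > "$remote_cfg"

if diff -q "$local_cfg" "$remote_cfg" >/dev/null 2>&1; then
    echo "Config in sync."
else
    echo "Remote config differs from local:"
    diff -u "$local_cfg" "$remote_cfg" || true
    exit 1
fi
```

The `|| true` on the verbose diff matters in the real Makefile recipe too: `diff -u` exits non-zero on differences, which would otherwise abort the shell before the prompt is shown.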
Louis Simoneau
66b0588f52 Rewrite Miniflux plugin to use requests, add filter and bookmark tools
Drop the miniflux pip client in favour of requests (already in the
container). Add update_feed_filters (keeplist/blocklist regex),
toggle_bookmark, get_entry (full content), and category filtering.
Remove the pip install step from Ansible.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 16:45:46 +10:00
Louis Simoneau
9b83d56932 Fix Hermes plugin config: use config file instead of env vars
Hermes plugins don't inherit container env vars. Switch the Miniflux
plugin to read credentials from a config.json written by Ansible,
and drop requires_env / container env vars.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 16:28:23 +10:00
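The config-file approach can be sketched as follows; the loader name is illustrative, but the filename and keys mirror the Ansible task added in this change:

```python
import json
from pathlib import Path


def load_plugin_config(plugin_dir: Path) -> dict:
    """Read credentials from config.json beside the plugin module.

    Hermes plugins don't inherit the container's env vars, so Ansible
    writes this file into the mounted plugin directory instead.
    """
    cfg_path = plugin_dir / "config.json"
    with cfg_path.open() as f:
        cfg = json.load(f)
    # Fail loudly at load time rather than mid-tool-call.
    for key in ("base_url", "api_key"):
        if key not in cfg:
            raise KeyError(f"config.json missing required key: {key}")
    return cfg


# Usage inside a plugin module:
#   config = load_plugin_config(Path(__file__).parent)
```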
Louis Simoneau
bbeecde448 Add shared Docker network and Miniflux plugin for Hermes
- Create external 'monotrope' Docker network so services can
  communicate by container name
- Add Miniflux to the shared network (db stays on internal network)
- Add Hermes Miniflux plugin with list_feeds and get_unread_entries tools
- Mount plugin directory and pass Miniflux API key to Hermes container

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 16:16:34 +10:00
Louis Simoneau
3a9e3a7916 Add Hermes agent, self-host fonts, new blog post
- Add Hermes (Nous Research LLM agent) with Telegram gateway,
  Ansible provisioning, and Makefile targets
- Self-host JetBrains Mono and Spectral fonts (remove Google Fonts)
- Add "An Experiment in Self-Hosting" blog post
- Update CLAUDE.md with high-level server overview

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 16:06:48 +10:00
Louis Simoneau
ab050fddd7 Pin image versions, add security headers, log limits, unattended upgrades
- Pin Miniflux to 2.2.19, Gitea to 1.25 (from :latest)
- Add security headers (X-Content-Type-Options, X-Frame-Options,
  Referrer-Policy, Permissions-Policy) to all Caddy sites
- Add Docker JSON log rotation (10m x 3 files) to all containers
- Add SHA256 checksum verification for GoatCounter binary download
- Install and configure unattended-upgrades for security patches

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 08:31:41 +10:00
Louis Simoneau
a9e063867a Harden SSH, add fail2ban, remove redundant setup.sh
Disable password auth, restrict root login, limit auth retries.
Add fail2ban with SSH jail (3 retries, 1hr ban). Remove setup.sh
which predated Ansible and was no longer used.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 08:29:15 +10:00
Louis Simoneau
0d4050c58c Update site content, styling, and review layout
Revise about page, hello world post, and The Compound review. Add book
cover support to review template, favicon, typographer config, and
cover image enrichment script.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 08:14:29 +10:00
Louis Simoneau
6c7afecce1 Trim CLAUDE.md to non-discoverable context only
Remove project structure, tech stack, services, Makefile targets, DNS,
and other sections that duplicate what's already in the code.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 08:14:15 +10:00
Louis Simoneau
0d7287dce1 Add Gitea self-hosted git server
Docker Compose stack (Gitea + Postgres) on port 3000, SSH on 2222,
reverse-proxied via Caddy at git.monotrope.au.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-10 08:14:11 +10:00
35 changed files with 1278 additions and 207 deletions

CLAUDE.md

@@ -2,97 +2,36 @@
 Personal blog and server infrastructure for monotrope.au.
-## Project Structure
-```
-monotrope/
-  site/                  # Hugo site (content, templates, CSS)
-  infra/
-    ansible/playbook.yml # Single playbook for all server provisioning
-    miniflux/            # Docker Compose for Miniflux RSS reader
-    Caddyfile            # Copied to /etc/caddy/Caddyfile by Ansible
-  deploy.sh              # Build + rsync to production
-  Makefile               # Common tasks
-  .env                   # Local secrets (not committed)
-```
-## Tech Stack
-- **Static site generator:** Hugo (no theme — templates built from scratch)
-- **Web server:** Caddy (automatic HTTPS via Let's Encrypt)
-- **Hosting:** DigitalOcean droplet (Sydney region, Ubuntu 24.04 LTS)
-- **Deployment:** `hugo --minify` then `rsync` to `/var/www/monotrope`
-- **Provisioning:** Ansible (`infra/ansible/playbook.yml`)
-## Services
-| Service     | URL                         | How it runs    | Port |
-|-------------|-----------------------------|----------------|------|
-| Blog        | https://monotrope.au        | Static files   | —    |
-| Miniflux    | https://reader.monotrope.au | Docker Compose | 8080 |
-| GoatCounter | https://stats.monotrope.au  | systemd binary | 8081 |
-## Ansible Playbook
-**All server changes must go through Ansible.** Everything must be idempotent — no ad-hoc SSH changes.
-The playbook is at `infra/ansible/playbook.yml`. Tags let individual services be re-provisioned without touching the rest.
-| Tag           | What it covers                                       |
-|---------------|------------------------------------------------------|
-| `miniflux`    | Miniflux Docker Compose + Caddyfile update           |
-| `goatcounter` | GoatCounter binary, systemd service + Caddyfile      |
-| (no tag)      | Full provisioning (system, Caddy, Docker, UFW, users)|
-### Secrets
-Pulled from environment variables, loaded from `.env` via Makefile:
-```
-MONOTROPE_HOST
-MINIFLUX_DB_PASSWORD
-MINIFLUX_ADMIN_USER
-MINIFLUX_ADMIN_PASSWORD
-GOATCOUNTER_ADMIN_EMAIL
-GOATCOUNTER_ADMIN_PASSWORD
-```
-### GoatCounter
-Runs as a systemd service (not Docker) using the pre-built binary from GitHub releases.
-Version is pinned via `goatcounter_version` var in the playbook.
-Initial site/user creation is gated on a `/var/lib/goatcounter/.admin_created` marker file
-so re-running the playbook never attempts to recreate the user.
-## Makefile Targets
-```
-make build       # hugo --minify
-make serve       # hugo server --buildDrafts (local dev)
-make deploy      # build + rsync to production
-make ssh         # SSH as deploy user
-make setup       # Full Ansible provisioning (fresh droplet)
-make miniflux    # Ansible --tags miniflux
-make goatcounter # Ansible --tags goatcounter
-```
-## DNS
-- `monotrope.au` → droplet IP (A record)
-- `www.monotrope.au` → droplet IP (A record, redirects to apex via Caddy)
-- `reader.monotrope.au` → droplet IP (A record)
-- `stats.monotrope.au` → droplet IP (A record)
-## Site Layout
-Content lives in `site/content/`:
-- `posts/` — writing
-- `reviews/` — book reviews
-- `about.md` — about page (uses `layouts/page/single.html`)
-Templates are in `site/layouts/`. No JavaScript unless strictly necessary.
-The GoatCounter analytics script is injected in `baseof.html` and only loads
-in production builds (`hugo.IsProduction`).
+## Theme & Concept
+The name is a play on [monotropism](https://en.wikipedia.org/wiki/Monotropism) —
+the theory of autistic cognition as deep, singular focus. The site is built around
+that idea: deep attention, flow states, and resisting the fragmentation of modern
+(especially AI-mediated) work. It's also an exercise in ownership — writing and
+reviews live here instead of on corporate platforms.
+The tone is personal and reflective. Content includes writing (posts) and book
+reviews across all genres.
+The terminal/CRT visual aesthetic is deliberate, not just decorative — it
+reinforces the themes of simplicity, focus, and rejecting modern web bloat.
+No JavaScript unless strictly necessary. No images or decorative elements beyond
+CSS. The design should feel minimal, typographic, and monospaced-first.
+## Hosting
+DigitalOcean droplet, Sydney region, Ubuntu 24.04 LTS.
+### What's on the server
+- **Hugo static site** — built locally, rsynced to `/var/www/monotrope`
+- **Caddy** — reverse proxy and TLS for all services
+- **Miniflux** — RSS reader (Docker, PostgreSQL)
+- **Gitea** — self-hosted git server (Docker, PostgreSQL, SSH on port 2222)
+- **GoatCounter** — privacy-friendly analytics (native binary, SQLite)
+- **Hermes Agent** — Nous Research's LLM agent (`nousresearch/hermes-agent`),
+  exposed via Telegram bot. Routes through OpenRouter. Used as a personal
+  assistant reachable from mobile. Docker, config in `infra/hermes/`.
 ## Conventions

Makefile

@@ -1,4 +1,4 @@
-.PHONY: build serve deploy ssh setup miniflux goatcounter
+.PHONY: build serve deploy ssh setup miniflux gitea goatcounter hermes hermes-sync hermes-chat enrich wireguard calibre calibre-sync
 # Load .env if it exists
 -include .env
@@ -7,7 +7,7 @@ export
 DEPLOY_USER := deploy
 MONOTROPE_HOST ?=
-build:
+build: enrich
 	cd site && hugo --minify
 serve:
@@ -29,6 +29,54 @@ miniflux:
 	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
 	ansible-playbook -i "$(MONOTROPE_HOST)," -u root infra/ansible/playbook.yml --tags miniflux
+gitea:
+	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
+	ansible-playbook -i "$(MONOTROPE_HOST)," -u root infra/ansible/playbook.yml --tags gitea
 goatcounter:
 	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
 	ansible-playbook -i "$(MONOTROPE_HOST)," -u root infra/ansible/playbook.yml --tags goatcounter
+hermes: hermes-sync
+	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
+	ansible-playbook -i "$(MONOTROPE_HOST)," -u root infra/ansible/playbook.yml --tags hermes
+hermes-sync:
+	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
+	@echo "Checking for remote config changes..."
+	@ssh root@$(MONOTROPE_HOST) docker cp hermes:/opt/data/config.yaml - 2>/dev/null | tar -xO > /tmp/hermes-remote-config.yaml || true
+	@if ! diff -q infra/hermes/config.yaml /tmp/hermes-remote-config.yaml >/dev/null 2>&1; then \
+		echo ""; \
+		echo "Remote config.yaml differs from local:"; \
+		echo "─────────────────────────────────────"; \
+		diff -u infra/hermes/config.yaml /tmp/hermes-remote-config.yaml || true; \
+		echo "─────────────────────────────────────"; \
+		echo ""; \
+		read -p "Overwrite remote with local? [y/N] " ans; \
+		if [ "$$ans" != "y" ] && [ "$$ans" != "Y" ]; then \
+			echo "Aborting. Merge remote changes into infra/hermes/config.yaml first."; \
+			exit 1; \
+		fi; \
+	else \
+		echo "Config in sync."; \
+	fi
+hermes-chat:
+	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
+	ssh -t root@$(MONOTROPE_HOST) docker exec -it hermes hermes chat
+wireguard:
+	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
+	ansible-playbook -i "$(MONOTROPE_HOST)," -u root infra/ansible/playbook.yml --tags wireguard
+calibre:
+	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
+	ansible-playbook -i "$(MONOTROPE_HOST)," -u root infra/ansible/playbook.yml --tags calibre
+calibre-sync:
+	@test -n "$(MONOTROPE_HOST)" || (echo "Error: MONOTROPE_HOST is not set"; exit 1)
+	ssh root@$(MONOTROPE_HOST) /opt/calibre/sync.sh
+enrich:
+	uv run enrich.py

enrich.py (new file)

@@ -0,0 +1,133 @@
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "python-frontmatter",
# ]
# ///
"""Enrich book reviews with ISBN and cover images from OpenLibrary."""

import json
import sys
import time
import urllib.request
import urllib.parse
from pathlib import Path

import frontmatter

REVIEWS_DIR = Path(__file__).parent / "site" / "content" / "reviews"
COVERS_DIR = Path(__file__).parent / "site" / "static" / "covers"
OL_SEARCH = "https://openlibrary.org/search.json"
OL_COVER = "https://covers.openlibrary.org/b/isbn/{isbn}-L.jpg"


def search_isbn(title: str, author: str) -> str | None:
    """Search OpenLibrary for an ISBN by title and author."""
    params = {"title": title, "author": author, "limit": "3", "fields": "isbn"}
    url = f"{OL_SEARCH}?{urllib.parse.urlencode(params)}"
    req = urllib.request.Request(url, headers={"User-Agent": "monotrope-enrich/1.0"})
    with urllib.request.urlopen(req, timeout=15) as resp:
        data = json.loads(resp.read())
    for doc in data.get("docs", []):
        for isbn in doc.get("isbn", []):
            if len(isbn) == 13:
                return isbn
    # fall back to ISBN-10 if no 13
    for doc in data.get("docs", []):
        for isbn in doc.get("isbn", []):
            if len(isbn) == 10:
                return isbn
    return None


def fetch_cover(isbn: str, dest: Path) -> bool:
    """Download a cover image for the given ISBN. Returns True on success."""
    url = OL_COVER.format(isbn=isbn)
    req = urllib.request.Request(url, headers={"User-Agent": "monotrope-enrich/1.0"})
    with urllib.request.urlopen(req, timeout=15) as resp:
        data = resp.read()
    # OpenLibrary returns a tiny 1x1 placeholder when no cover exists
    if len(data) < 1000:
        return False
    dest.write_bytes(data)
    return True


def enrich(path: Path, dry_run: bool = False) -> None:
    """Enrich a single review file with ISBN and cover."""
    post = frontmatter.load(path)
    title = post.get("title", "")
    author = post.get("book_author", "")
    has_isbn = bool(post.get("isbn"))
    has_cover = bool(post.get("cover"))
    if has_isbn and has_cover:
        print(f"  skip {path.name} (already enriched)")
        return

    # ── ISBN lookup ──────────────────────────────────
    isbn = post.get("isbn", "")
    if not isbn:
        print(f"  search title={title!r} author={author!r}")
        isbn = search_isbn(title, author)
        if not isbn:
            print(f"  ✗ no ISBN found for {path.name}")
            return
        print(f"  found ISBN {isbn}")

    # ── Cover download ──────────────────────────────
    slug = path.stem
    cover_file = COVERS_DIR / f"{slug}.jpg"
    if not has_cover and not cover_file.exists():
        print(f"  fetch cover → {cover_file.relative_to(Path(__file__).parent)}")
        if not dry_run:
            COVERS_DIR.mkdir(parents=True, exist_ok=True)
            if not fetch_cover(isbn, cover_file):
                print(f"  ✗ no cover image available for ISBN {isbn}")
                cover_file = None
        else:
            cover_file = None
    elif cover_file.exists():
        print(f"  ok cover already exists")

    # ── Update frontmatter ──────────────────────────
    changed = False
    if not has_isbn:
        post["isbn"] = isbn
        changed = True
    if not has_cover and cover_file and cover_file.exists():
        post["cover"] = f"/covers/{slug}.jpg"
        changed = True
    if changed and not dry_run:
        path.write_text(frontmatter.dumps(post) + "\n")
        print(f"  ✓ updated {path.name}")
    elif changed:
        print(f"  (dry run) would update {path.name}")


def main() -> None:
    dry_run = "--dry-run" in sys.argv
    reviews = sorted(REVIEWS_DIR.glob("*.md"))
    reviews = [r for r in reviews if r.name != "_index.md"]
    if not reviews:
        print("No reviews found.")
        return
    print(f"Enriching {len(reviews)} review(s)...\n")
    for path in reviews:
        print(f"  ── {path.stem} ──")
        try:
            enrich(path, dry_run=dry_run)
        except Exception as e:
            print(f"  ✗ error: {e}")
        print()


if __name__ == "__main__":
    main()

Caddyfile

@@ -2,6 +2,14 @@ monotrope.au {
 	root * /var/www/monotrope
 	file_server
+	# Security headers
+	header {
+		X-Content-Type-Options "nosniff"
+		X-Frame-Options "DENY"
+		Referrer-Policy "strict-origin-when-cross-origin"
+		Permissions-Policy "camera=(), microphone=(), geolocation=()"
+	}
 	# Compression
 	encode zstd gzip
@@ -16,6 +24,8 @@ monotrope.au {
 		path *.html / /posts/ /posts/*
 	}
 	header @html Cache-Control "public, max-age=0, must-revalidate"
 }
 # Redirect www to apex
@@ -27,6 +37,40 @@ www.monotrope.au {
 reader.monotrope.au {
 	reverse_proxy localhost:8080
+	header {
+		X-Content-Type-Options "nosniff"
+		X-Frame-Options "DENY"
+		Referrer-Policy "strict-origin-when-cross-origin"
+		Permissions-Policy "camera=(), microphone=(), geolocation=()"
+	}
+	encode zstd gzip
+}
+# Gitea
+git.monotrope.au {
+	reverse_proxy localhost:3000
+	header {
+		X-Content-Type-Options "nosniff"
+		Referrer-Policy "strict-origin-when-cross-origin"
+		Permissions-Policy "camera=(), microphone=(), geolocation=()"
+	}
+	encode zstd gzip
+}
+# Calibre-web
+books.monotrope.au {
+	reverse_proxy localhost:8083
+	header {
+		X-Content-Type-Options "nosniff"
+		X-Frame-Options "DENY"
+		Referrer-Policy "strict-origin-when-cross-origin"
+		Permissions-Policy "camera=(), microphone=(), geolocation=()"
+	}
 	encode zstd gzip
 }
@@ -34,5 +78,12 @@ reader.monotrope.au {
 stats.monotrope.au {
 	reverse_proxy localhost:8081
+	header {
+		X-Content-Type-Options "nosniff"
+		X-Frame-Options "DENY"
+		Referrer-Policy "strict-origin-when-cross-origin"
+		Permissions-Policy "camera=(), microphone=(), geolocation=()"
+	}
 	encode zstd gzip
 }

infra/ansible/playbook.yml

@@ -10,9 +10,15 @@
miniflux_db_password: "{{ lookup('env', 'MINIFLUX_DB_PASSWORD') }}"
miniflux_admin_user: "{{ lookup('env', 'MINIFLUX_ADMIN_USER') | default('admin') }}"
miniflux_admin_password: "{{ lookup('env', 'MINIFLUX_ADMIN_PASSWORD') }}"
gitea_db_password: "{{ lookup('env', 'GITEA_DB_PASSWORD') }}"
goatcounter_version: "2.7.0"
goatcounter_admin_email: "{{ lookup('env', 'GOATCOUNTER_ADMIN_EMAIL') }}"
goatcounter_admin_password: "{{ lookup('env', 'GOATCOUNTER_ADMIN_PASSWORD') }}"
hermes_openrouter_api_key: "{{ lookup('env', 'HERMES_OPENROUTER_API_KEY') }}"
hermes_telegram_bot_token: "{{ lookup('env', 'HERMES_TELEGRAM_BOT_TOKEN') }}"
hermes_telegram_allowed_users: "{{ lookup('env', 'HERMES_TELEGRAM_ALLOWED_USERS') }}"
hermes_miniflux_api_key: "{{ lookup('env', 'HERMES_MINIFLUX_API_KEY') }}"
wg_client_pubkey: "{{ lookup('env', 'WG_CLIENT_PUBKEY') }}"
tasks:
@@ -32,8 +38,35 @@
- apt-transport-https
- curl
- ufw
- unattended-upgrades
state: present
- name: Configure unattended-upgrades
copy:
dest: /etc/apt/apt.conf.d/50unattended-upgrades
owner: root
group: root
mode: '0644'
content: |
Unattended-Upgrade::Allowed-Origins {
"${distro_id}:${distro_codename}-security";
"${distro_id}ESMApps:${distro_codename}-apps-security";
"${distro_id}ESM:${distro_codename}-infra-security";
};
Unattended-Upgrade::Remove-Unused-Kernel-Packages "true";
Unattended-Upgrade::Remove-Unused-Dependencies "true";
Unattended-Upgrade::Automatic-Reboot "false";
- name: Enable automatic updates
copy:
dest: /etc/apt/apt.conf.d/20auto-upgrades
owner: root
group: root
mode: '0644'
content: |
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
# ── Caddy ───────────────────────────────────────────────────────────────
- name: Add Caddy GPG key
@@ -66,7 +99,9 @@
notify: Restart Caddy
tags:
- miniflux
- gitea
- goatcounter
- calibre
- name: Enable and start Caddy
systemd:
@@ -84,6 +119,48 @@
shell: /usr/sbin/nologin
state: present
# ── SSH hardening ───────────────────────────────────────────────────────
- name: Harden SSH configuration
copy:
dest: /etc/ssh/sshd_config.d/hardening.conf
owner: root
group: root
mode: '0644'
content: |
PasswordAuthentication no
PermitRootLogin prohibit-password
MaxAuthTries 3
notify: Restart sshd
# ── Fail2ban ────────────────────────────────────────────────────────────
- name: Install fail2ban
apt:
name: fail2ban
state: present
- name: Configure fail2ban SSH jail
copy:
dest: /etc/fail2ban/jail.local
owner: root
group: root
mode: '0644'
content: |
[sshd]
enabled = true
port = ssh
maxretry = 3
bantime = 3600
findtime = 600
notify: Restart fail2ban
- name: Enable and start fail2ban
systemd:
name: fail2ban
enabled: true
state: started
# ── UFW ─────────────────────────────────────────────────────────────────
- name: Set UFW default incoming policy to deny
@@ -113,10 +190,85 @@
port: '443'
proto: tcp
- name: Allow Gitea SSH
ufw:
rule: allow
port: '2222'
proto: tcp
- name: Enable UFW
ufw:
state: enabled
# ── WireGuard ───────────────────────────────────────────────────────────
- name: Install WireGuard
apt:
name: wireguard
state: present
tags: wireguard
- name: Generate WireGuard server private key
shell: wg genkey > /etc/wireguard/server_privatekey && chmod 600 /etc/wireguard/server_privatekey
args:
creates: /etc/wireguard/server_privatekey
tags: wireguard
- name: Generate WireGuard server public key
shell: cat /etc/wireguard/server_privatekey | wg pubkey > /etc/wireguard/server_publickey
args:
creates: /etc/wireguard/server_publickey
tags: wireguard
- name: Read server private key
slurp:
src: /etc/wireguard/server_privatekey
register: wg_server_privkey
tags: wireguard
- name: Read server public key
slurp:
src: /etc/wireguard/server_publickey
register: wg_server_pubkey
tags: wireguard
- name: Write WireGuard config
copy:
dest: /etc/wireguard/wg0.conf
owner: root
group: root
mode: '0600'
content: |
[Interface]
PrivateKey = {{ wg_server_privkey.content | b64decode | trim }}
Address = 10.100.0.1/24
ListenPort = 51820
[Peer]
PublicKey = {{ wg_client_pubkey }}
AllowedIPs = 10.100.0.2/32
notify: Restart WireGuard
tags: wireguard
- name: Allow WireGuard UDP port
ufw:
rule: allow
port: '51820'
proto: udp
tags: wireguard
- name: Enable and start WireGuard
systemd:
name: wg-quick@wg0
enabled: true
state: started
tags: wireguard
- name: Display server public key
debug:
msg: "WireGuard server public key: {{ wg_server_pubkey.content | b64decode | trim }}"
tags: wireguard
# ── Docker ──────────────────────────────────────────────────────────────
- name: Create Docker keyring directory
@@ -158,6 +310,15 @@
enabled: true
state: started
- name: Create shared Docker network
command: docker network create monotrope
register: docker_net
changed_when: docker_net.rc == 0
failed_when: docker_net.rc != 0 and 'already exists' not in docker_net.stderr
tags:
- miniflux
- hermes
# ── Miniflux ────────────────────────────────────────────────────────────
- name: Create Miniflux directory
@@ -197,6 +358,201 @@
chdir: /opt/miniflux
tags: miniflux
# ── Gitea ───────────────────────────────────────────────────────────────
- name: Create Gitea directory
file:
path: /opt/gitea
state: directory
owner: root
group: root
mode: '0750'
tags: gitea
- name: Copy Gitea docker-compose.yml
copy:
src: ../gitea/docker-compose.yml
dest: /opt/gitea/docker-compose.yml
owner: root
group: root
mode: '0640'
tags: gitea
- name: Write Gitea .env
copy:
dest: /opt/gitea/.env
owner: root
group: root
mode: '0600'
content: |
GITEA_DB_PASSWORD={{ gitea_db_password }}
no_log: true
tags: gitea
- name: Pull and start Gitea
command: docker compose up -d --pull always
args:
chdir: /opt/gitea
tags: gitea
# ── Hermes Agent ────────────────────────────────────────────────────────
- name: Create Hermes directory
file:
path: /opt/hermes
state: directory
owner: root
group: root
mode: '0750'
tags: hermes
- name: Copy Hermes docker-compose.yml
copy:
src: ../hermes/docker-compose.yml
dest: /opt/hermes/docker-compose.yml
owner: root
group: root
mode: '0640'
tags: hermes
- name: Stage Hermes config.yaml
copy:
src: ../hermes/config.yaml
dest: /opt/hermes/config.yaml
owner: root
group: root
mode: '0640'
tags: hermes
- name: Copy config.yaml into Hermes volume
command: docker cp /opt/hermes/config.yaml hermes:/opt/data/config.yaml
notify: Restart Hermes
tags: hermes
- name: Copy Hermes plugins
copy:
src: ../hermes/plugins/
dest: /opt/hermes/plugins/
owner: root
group: root
mode: '0640'
directory_mode: '0750'
notify: Restart Hermes
tags: hermes
- name: Write Miniflux plugin config
copy:
dest: /opt/hermes/plugins/miniflux/config.json
owner: root
group: root
mode: '0600'
content: |
{
"base_url": "http://miniflux:8080",
"api_key": "{{ hermes_miniflux_api_key }}"
}
no_log: true
notify: Restart Hermes
tags: hermes
- name: Write Hermes .env
copy:
dest: /opt/hermes/.env
owner: root
group: root
mode: '0600'
content: |
OPENROUTER_API_KEY={{ hermes_openrouter_api_key }}
TELEGRAM_BOT_TOKEN={{ hermes_telegram_bot_token }}
TELEGRAM_ALLOWED_USERS={{ hermes_telegram_allowed_users }}
no_log: true
tags: hermes
- name: Pull and start Hermes
command: docker compose up -d --pull always
args:
chdir: /opt/hermes
tags: hermes
# ── Calibre (kobodl + calibre-web) ────────────────────────────────────
- name: Create Calibre directory
file:
path: /opt/calibre
state: directory
owner: root
group: root
mode: '0750'
tags: calibre
- name: Copy Calibre docker-compose.yml
copy:
src: ../calibre/docker-compose.yml
dest: /opt/calibre/docker-compose.yml
owner: root
group: root
mode: '0640'
tags: calibre
- name: Pull and start Calibre services
command: docker compose up -d --pull always
args:
chdir: /opt/calibre
tags: calibre
- name: Fix downloads volume ownership
command: >
docker compose exec -T kobodl
chown 1000:1000 /downloads
args:
chdir: /opt/calibre
tags: calibre
- name: Check if Calibre library exists
command: >
docker compose exec -T calibre-web
test -f /library/metadata.db
args:
chdir: /opt/calibre
register: calibre_db_check
changed_when: false
failed_when: false
tags: calibre
- name: Initialise Calibre library
command: >
docker compose exec -T --user abc calibre-web
calibredb add --empty --with-library /library/
args:
chdir: /opt/calibre
when: calibre_db_check.rc != 0
tags: calibre
- name: Install calibre-sync script
copy:
dest: /opt/calibre/sync.sh
owner: root
group: root
mode: '0755'
content: |
#!/bin/bash
set -euo pipefail
cd /opt/calibre
# Download all books from Kobo
docker compose exec -T kobodl kobodl --config /home/config/kobodl.json book get --get-all --output-dir /downloads
# Import any new EPUBs into Calibre library
# Files are kept in /downloads so kobodl can skip them next run
docker compose exec -T --user abc calibre-web sh -c '
for f in /downloads/*.epub; do
[ -f "$f" ] || continue
calibredb add "$f" --with-library /library/ || true
done
'
tags: calibre
# ── GoatCounter ─────────────────────────────────────────────────────────
- name: Create goatcounter system user
@@ -222,6 +578,7 @@
url: "https://github.com/arp242/goatcounter/releases/download/v{{ goatcounter_version }}/goatcounter-v{{ goatcounter_version }}-linux-amd64.gz" url: "https://github.com/arp242/goatcounter/releases/download/v{{ goatcounter_version }}/goatcounter-v{{ goatcounter_version }}-linux-amd64.gz"
dest: /tmp/goatcounter.gz dest: /tmp/goatcounter.gz
mode: '0644' mode: '0644'
checksum: "sha256:98d221cb9c8ef2bf76d8daa9cca647839f8d8b0bb5bc7400ff9337c5da834511"
tags: goatcounter tags: goatcounter
- name: Decompress GoatCounter binary - name: Decompress GoatCounter binary
@@ -336,3 +693,23 @@
systemd:
name: goatcounter
state: restarted
- name: Restart sshd
systemd:
name: ssh
state: restarted
- name: Restart fail2ban
systemd:
name: fail2ban
state: restarted
- name: Restart Hermes
command: docker compose restart
args:
chdir: /opt/hermes
- name: Restart WireGuard
systemd:
name: wg-quick@wg0
state: restarted

infra/calibre/docker-compose.yml (new file)

@@ -0,0 +1,50 @@
services:
  kobodl:
    image: ghcr.io/subdavis/kobodl
    restart: unless-stopped
    user: "1000:1000"
    command: --config /home/config/kobodl.json serve --host 0.0.0.0 --output-dir /downloads
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
    ports:
      - "10.100.0.1:5100:5000"
    volumes:
      - kobodl_config:/home/config
      - downloads:/downloads
    networks:
      - default
      - monotrope

  calibre-web:
    image: lscr.io/linuxserver/calibre-web:latest
    restart: unless-stopped
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
    ports:
      - "127.0.0.1:8083:8083"
    volumes:
      - calibre_config:/config
      - library:/library
      - downloads:/downloads
    environment:
      PUID: "1000"
      PGID: "1000"
      TZ: "Australia/Sydney"
      DOCKER_MODS: "linuxserver/mods:universal-calibre"

networks:
  default:
  monotrope:
    external: true

volumes:
  kobodl_config:
  calibre_config:
  library:
  downloads:

infra/gitea/docker-compose.yml (new file)

@@ -0,0 +1,55 @@
services:
  gitea:
    image: gitea/gitea:1.25
    restart: unless-stopped
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
    depends_on:
      db:
        condition: service_healthy
    ports:
      - "127.0.0.1:3000:3000"
      - "2222:22"
    volumes:
      - gitea_data:/data
    environment:
      GITEA__database__DB_TYPE: postgres
      GITEA__database__HOST: db:5432
      GITEA__database__NAME: gitea
      GITEA__database__USER: gitea
      GITEA__database__PASSWD: "${GITEA_DB_PASSWORD}"
      GITEA__server__ROOT_URL: "https://git.monotrope.au/"
      GITEA__server__DOMAIN: "git.monotrope.au"
      GITEA__server__SSH_DOMAIN: "git.monotrope.au"
      GITEA__server__SSH_PORT: 2222
    env_file:
      - .env

  db:
    image: postgres:16-alpine
    restart: unless-stopped
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
    volumes:
      - gitea_db:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: gitea
      POSTGRES_USER: gitea
      POSTGRES_PASSWORD: "${GITEA_DB_PASSWORD}"
    env_file:
      - .env
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "gitea"]
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  gitea_data:
  gitea_db:

infra/hermes/config.yaml (new file)

@@ -0,0 +1,9 @@
model:
  provider: openrouter
  default: openrouter/auto
memory:
  memory_enabled: true
  user_profile_enabled: true
agent:
  max_turns: 70
TELEGRAM_HOME_CHANNEL: '8455090116'

infra/hermes/docker-compose.yml (new file)

@@ -0,0 +1,29 @@
services:
  hermes:
    image: nousresearch/hermes-agent:latest
    container_name: hermes
    restart: unless-stopped
    command: gateway run
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
    networks:
      - monotrope
    volumes:
      - hermes_data:/opt/data
      - ./plugins:/opt/data/plugins:ro
    environment:
      OPENROUTER_API_KEY: "${OPENROUTER_API_KEY}"
      TELEGRAM_BOT_TOKEN: "${TELEGRAM_BOT_TOKEN}"
      TELEGRAM_ALLOWED_USERS: "${TELEGRAM_ALLOWED_USERS}"
    env_file:
      - .env

networks:
  monotrope:
    external: true

volumes:
  hermes_data:

infra/hermes/plugins/miniflux/__init__.py (new file)

@@ -0,0 +1,40 @@
from . import schemas, tools


def register(ctx):
    ctx.register_tool(
        name="list_feeds",
        toolset="miniflux",
        schema=schemas.LIST_FEEDS,
        handler=tools.list_feeds,
    )
    ctx.register_tool(
        name="get_unread_entries",
        toolset="miniflux",
        schema=schemas.GET_UNREAD_ENTRIES,
        handler=tools.get_unread_entries,
    )
    ctx.register_tool(
        name="get_entry",
        toolset="miniflux",
        schema=schemas.GET_ENTRY,
        handler=tools.get_entry,
    )
    ctx.register_tool(
        name="toggle_bookmark",
        toolset="miniflux",
        schema=schemas.TOGGLE_BOOKMARK,
        handler=tools.toggle_bookmark,
    )
    ctx.register_tool(
        name="update_feed_filters",
        toolset="miniflux",
        schema=schemas.UPDATE_FEED_FILTERS,
        handler=tools.update_feed_filters,
    )
    ctx.register_tool(
        name="mark_as_read",
        toolset="miniflux",
        schema=schemas.MARK_AS_READ,
        handler=tools.mark_as_read,
    )


@@ -0,0 +1,10 @@
name: miniflux
version: 2.0.0
description: Read and manage feeds and entries from the local Miniflux RSS reader
provides_tools:
- list_feeds
- get_unread_entries
- get_entry
- toggle_bookmark
- update_feed_filters
- mark_as_read


@@ -0,0 +1,116 @@
LIST_FEEDS = {
    "name": "list_feeds",
    "description": (
        "List all subscribed RSS feeds from Miniflux. "
        "Returns feed titles, URLs, and unread counts."
    ),
    "parameters": {
        "type": "object",
        "properties": {},
        "required": [],
    },
}

GET_UNREAD_ENTRIES = {
    "name": "get_unread_entries",
    "description": (
        "Get unread entries from Miniflux. "
        "Optionally filter by feed ID and limit the number of results."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            "feed_id": {
                "type": "integer",
                "description": "Filter to a specific feed. Omit for all feeds.",
            },
            "category_id": {
                "type": "integer",
                "description": "Filter to a specific category. Omit for all categories.",
            },
            "limit": {
                "type": "integer",
                "description": "Maximum number of entries to return. Defaults to 20.",
            },
        },
        "required": [],
    },
}

GET_ENTRY = {
    "name": "get_entry",
    "description": (
        "Get a single entry from Miniflux by ID, including its full content. "
        "Use this to read an article's text."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            "entry_id": {
                "type": "integer",
                "description": "The entry ID to retrieve.",
            },
        },
        "required": ["entry_id"],
    },
}

TOGGLE_BOOKMARK = {
    "name": "toggle_bookmark",
    "description": "Toggle the bookmark/star status of a Miniflux entry.",
    "parameters": {
        "type": "object",
        "properties": {
            "entry_id": {
                "type": "integer",
                "description": "The entry ID to bookmark or unbookmark.",
            },
        },
        "required": ["entry_id"],
    },
}

UPDATE_FEED_FILTERS = {
    "name": "update_feed_filters",
    "description": (
        "Update the keep or block filter rules on a Miniflux feed. "
        "Rules are case-insensitive regexes matched against entry titles and URLs. "
        "keeplist_rules: only entries matching are kept. "
        "blocklist_rules: entries matching are excluded. "
        "Pass an empty string to clear a rule."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            "feed_id": {
                "type": "integer",
                "description": "The feed ID to update.",
            },
            "keeplist_rules": {
                "type": "string",
                "description": "Regex pattern. Only matching entries are kept. Omit to leave unchanged.",
            },
            "blocklist_rules": {
                "type": "string",
                "description": "Regex pattern. Matching entries are excluded. Omit to leave unchanged.",
            },
        },
        "required": ["feed_id"],
    },
}

MARK_AS_READ = {
    "name": "mark_as_read",
    "description": "Mark one or more Miniflux entries as read.",
    "parameters": {
        "type": "object",
        "properties": {
            "entry_ids": {
                "type": "array",
                "items": {"type": "integer"},
                "description": "List of entry IDs to mark as read.",
            },
        },
        "required": ["entry_ids"],
    },
}


@@ -0,0 +1,144 @@
import json
from pathlib import Path

import requests

_PLUGIN_DIR = Path(__file__).parent
with open(_PLUGIN_DIR / "config.json") as _f:
    _CONFIG = json.loads(_f.read())

_BASE = _CONFIG.get("base_url", "http://miniflux:8080").rstrip("/")
_HEADERS = {"X-Auth-Token": _CONFIG.get("api_key", "")}


def _get(path, **params):
    resp = requests.get(f"{_BASE}/v1{path}", headers=_HEADERS, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()


def _put(path, body):
    resp = requests.put(f"{_BASE}/v1{path}", headers=_HEADERS, json=body, timeout=10)
    resp.raise_for_status()
    return resp


def list_feeds(args: dict, **kwargs) -> str:
    try:
        feeds = _get("/feeds")
        counters = _get("/feeds/counters")
        unreads = counters.get("unreads", {})
        result = []
        for f in feeds:
            result.append({
                "id": f["id"],
                "title": f["title"],
                "site_url": f.get("site_url", ""),
                "category": f.get("category", {}).get("title", ""),
                "unread": unreads.get(str(f["id"]), 0),
            })
        result.sort(key=lambda x: x["unread"], reverse=True)
        return json.dumps({"feeds": result, "total": len(result)})
    except Exception as e:
        return json.dumps({"error": str(e)})


def get_unread_entries(args: dict, **kwargs) -> str:
    try:
        params = {
            "status": "unread",
            "limit": args.get("limit", 20),
            "direction": "desc",
            "order": "published_at",
        }
        if args.get("feed_id"):
            path = f"/feeds/{args['feed_id']}/entries"
        elif args.get("category_id"):
            path = f"/categories/{args['category_id']}/entries"
        else:
            path = "/entries"
        data = _get(path, **params)
        entries = []
        for e in data.get("entries", []):
            entries.append({
                "id": e["id"],
                "title": e["title"],
                "url": e.get("url", ""),
                "feed": e.get("feed", {}).get("title", ""),
                "category": e.get("feed", {}).get("category", {}).get("title", ""),
                "author": e.get("author", ""),
                "published_at": e.get("published_at", ""),
                "reading_time": e.get("reading_time", 0),
            })
        return json.dumps({
            "entries": entries,
            "total": data.get("total", len(entries)),
        })
    except Exception as e:
        return json.dumps({"error": str(e)})


def get_entry(args: dict, **kwargs) -> str:
    try:
        entry = _get(f"/entries/{args['entry_id']}")
        return json.dumps({
            "id": entry["id"],
            "title": entry["title"],
            "url": entry.get("url", ""),
            "author": entry.get("author", ""),
            "feed": entry.get("feed", {}).get("title", ""),
            "category": entry.get("feed", {}).get("category", {}).get("title", ""),
            "published_at": entry.get("published_at", ""),
            "reading_time": entry.get("reading_time", 0),
            "content": entry.get("content", ""),
        })
    except Exception as e:
        return json.dumps({"error": str(e)})


def toggle_bookmark(args: dict, **kwargs) -> str:
    try:
        _put(f"/entries/{args['entry_id']}/bookmark", {})
        return json.dumps({"ok": True, "entry_id": args["entry_id"]})
    except Exception as e:
        return json.dumps({"error": str(e)})


def update_feed_filters(args: dict, **kwargs) -> str:
    try:
        feed_id = args["feed_id"]
        body = {}
        if "keeplist_rules" in args:
            body["keeplist_rules"] = args["keeplist_rules"]
        if "blocklist_rules" in args:
            body["blocklist_rules"] = args["blocklist_rules"]
        if not body:
            return json.dumps({"error": "Provide keeplist_rules and/or blocklist_rules"})
        resp = requests.put(
            f"{_BASE}/v1/feeds/{feed_id}",
            headers=_HEADERS, json=body, timeout=10,
        )
        resp.raise_for_status()
        feed = resp.json()
        return json.dumps({
            "ok": True,
            "feed_id": feed["id"],
            "title": feed["title"],
            "keeplist_rules": feed.get("keeplist_rules", ""),
            "blocklist_rules": feed.get("blocklist_rules", ""),
        })
    except Exception as e:
        return json.dumps({"error": str(e)})


def mark_as_read(args: dict, **kwargs) -> str:
    try:
        entry_ids = args.get("entry_ids", [])
        if not entry_ids:
            return json.dumps({"error": "No entry_ids provided"})
        _put("/entries", {"entry_ids": entry_ids, "status": "read"})
        return json.dumps({"ok": True, "marked_read": entry_ids})
    except Exception as e:
        return json.dumps({"error": str(e)})


@@ -1,12 +1,20 @@
 services:
   miniflux:
-    image: miniflux/miniflux:latest
+    image: miniflux/miniflux:2.2.19
     restart: unless-stopped
+    logging:
+      driver: json-file
+      options:
+        max-size: "10m"
+        max-file: "3"
     depends_on:
       db:
         condition: service_healthy
     ports:
       - "127.0.0.1:8080:8080"
+    networks:
+      - default
+      - monotrope
     environment:
       DATABASE_URL: "postgres://miniflux:${MINIFLUX_DB_PASSWORD}@db/miniflux?sslmode=disable"
       RUN_MIGRATIONS: "1"
@@ -20,6 +28,11 @@ services:
   db:
     image: postgres:16-alpine
     restart: unless-stopped
+    logging:
+      driver: json-file
+      options:
+        max-size: "10m"
+        max-file: "3"
     volumes:
       - miniflux_db:/var/lib/postgresql/data
     environment:
@@ -34,5 +47,10 @@ services:
       timeout: 5s
       retries: 5
+networks:
+  default:
+  monotrope:
+    external: true
 volumes:
   miniflux_db:


@@ -1,97 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail
# setup.sh — Provision a fresh Ubuntu 24.04 droplet for monotrope.au
# Run as root via: ssh root@<DROPLET_IP> 'bash -s' < infra/setup.sh
DEPLOY_USER="deploy"
SITE_DIR="/var/www/monotrope"
DEPLOY_PUBKEY="${DEPLOY_PUBKEY:-}" # Set this env var before running, or edit below
echo "==> Updating packages"
apt-get update -y
apt-get upgrade -y
# ── Caddy ─────────────────────────────────────────────────────────────────
echo "==> Installing Caddy"
apt-get install -y debian-keyring debian-archive-keyring apt-transport-https curl
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/gpg.key' \
| gpg --dearmor -o /usr/share/keyrings/caddy-stable-archive-keyring.gpg
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/debian.deb.txt' \
| tee /etc/apt/sources.list.d/caddy-stable.list
apt-get update -y
apt-get install -y caddy
# ── Site directory ─────────────────────────────────────────────────────────
echo "==> Creating www user and site directory"
id -u www &>/dev/null || useradd --system --no-create-home --shell /usr/sbin/nologin www
mkdir -p "$SITE_DIR"
chown www:www "$SITE_DIR"
chmod 755 "$SITE_DIR"
# ── Caddyfile ──────────────────────────────────────────────────────────────
echo "==> Installing Caddyfile"
cp "$(dirname "$0")/Caddyfile" /etc/caddy/Caddyfile
chown root:caddy /etc/caddy/Caddyfile
chmod 640 /etc/caddy/Caddyfile
systemctl enable caddy
systemctl restart caddy
# ── UFW ────────────────────────────────────────────────────────────────────
echo "==> Configuring UFW"
apt-get install -y ufw
ufw default deny incoming
ufw default allow outgoing
ufw allow ssh
ufw allow http
ufw allow https
ufw --force enable
# ── Docker ────────────────────────────────────────────────────────────────
echo "==> Installing Docker"
install -m 0755 -d /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg \
| gpg --dearmor -o /etc/apt/keyrings/docker.gpg
chmod a+r /etc/apt/keyrings/docker.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] \
https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" \
| tee /etc/apt/sources.list.d/docker.list
apt-get update -y
apt-get install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
systemctl enable docker
# ── Deploy user ───────────────────────────────────────────────────────────
echo "==> Creating deploy user"
id -u "$DEPLOY_USER" &>/dev/null || useradd --create-home --shell /bin/bash "$DEPLOY_USER"
# Give deploy user write access to the site directory
chown -R "$DEPLOY_USER":www "$SITE_DIR"
chmod 775 "$SITE_DIR"
# Set up SSH key auth
DEPLOY_HOME="/home/$DEPLOY_USER"
mkdir -p "$DEPLOY_HOME/.ssh"
chmod 700 "$DEPLOY_HOME/.ssh"
touch "$DEPLOY_HOME/.ssh/authorized_keys"
chmod 600 "$DEPLOY_HOME/.ssh/authorized_keys"
chown -R "$DEPLOY_USER":"$DEPLOY_USER" "$DEPLOY_HOME/.ssh"
if [[ -n "$DEPLOY_PUBKEY" ]]; then
echo "$DEPLOY_PUBKEY" >> "$DEPLOY_HOME/.ssh/authorized_keys"
echo "==> Deploy public key installed"
else
echo "WARNING: DEPLOY_PUBKEY not set. Add your public key to $DEPLOY_HOME/.ssh/authorized_keys manually."
fi
echo ""
echo "==> Done. Checklist:"
echo " - Point DNS A records for monotrope.au and www.monotrope.au to this server's IP"
echo " - If DEPLOY_PUBKEY was not set, add your key to $DEPLOY_HOME/.ssh/authorized_keys"
echo " - Run 'make deploy' from your local machine to push the site"


@@ -3,8 +3,8 @@ title: About
 type: page
 ---
-Monotrope is a play on the idea of [monotropism](https://en.wikipedia.org/wiki/Monotropism). I created this site as an experiment in writing regularly, and also in trying to own things instead of putting them on corporate platforms. Monotropism relates to the autistic experience, but I also just like the idea of deep singular focus and flow states. Modern work, especially with the use of AI, is increasingly fragmented and alienating.
+Monotrope is a play on the idea of [monotropism](https://en.wikipedia.org/wiki/Monotropism) -- the theory of autistic cognition as deep, singular focus. I created this site as an experiment in writing regularly, and also in trying to own my content instead of putting it on corporate platforms.
-I read across all genres, and post my reviews here because I want to own them.
+I read across all genres, and post my reviews here.
 I live on Djadjawurrung and Taungurong land.


@@ -4,4 +4,4 @@ date: 2026-04-08
 draft: false
 ---
-This is the first post on Monotrope. My intent is to write here as frequently as possible. In part this is to practice the craft of writing directly, and in part it is to ensure I own the stuff I'm writing.
+This is the first post on Monotrope. My intent is to write here as frequently as possible. In part this is to practice the craft of writing directly, and in part as an experiment in direct ownership of my writing.


@@ -0,0 +1,19 @@
---
title: "An Experiment in Self-Hosting"
date: 2026-04-10T00:00:00+10:00
draft: false
---
One of the things I wanted to do with this site is to see how much tooling I could self-host on a small VPS, in particular with the acceleration afforded by AI coding through Claude Code.
So far I have:
* A [Hugo](https://gohugo.io/) static site
* [Caddy](https://caddyserver.com/) webserver
* Self-hosted feed reader with [Miniflux](https://miniflux.app/)
* Analytics using [Goatcounter](https://www.goatcounter.com/)
* Git server using [Gitea](https://about.gitea.com/) (you can check out the source for the whole project, inception-style, at [git.monotrope.au/louis/monotrope](https://git.monotrope.au/louis/monotrope))
The only external dependency for the whole setup is the server itself (a DigitalOcean droplet), and I'm sure I'll come up with more tools I can add to the server over time.
All of this I would estimate took less than 4 hours to set up and deploy. I think previously even the small amount of effort required to deploy a static blog would have pushed me towards free platforms like Medium or GitHub Pages. This mode of production has the potential to be a great thing for the Web if more tinkerers can build and host their own stuff instead of relying on centralised platforms where you are the product.


@@ -1,26 +1,26 @@
 ---
-title: "The Compound"
-book_author: "Aisling Rawle"
-date: 2026-03-01
-date_read: "March 2026"
+book_author: Aisling Rawle
+cover: /covers/the-compound.jpg
+date: 2026-04-08
+date_read: March 2026
+isbn: '9780008710088'
 rating: 4
-tags: ["dystopia", "satire"]
+tags:
+  - dystopia
+  - satire
+title: The Compound
 ---
-I liked this book _a lot_. I'm giving it 4 stars because I'm not sure I can fully decipher what it was trying to say, or if it was trying to say anything.
-The pitch is "Love Island meets Lord of the Flies", and that probably about sums it up. Its setting is a realtiy TV show in a very ambiguous future dystopia (there are references to "the wars", but really we get no detail of the world outside the show).
-In a sense maybe it's a bit of a mess, it's a satire but it's not obvious exactly what it's satirising. There's an ambient theme of consumerism and late-stage capitalism, but it's never really a plot point, it's just there. It doesn't really make you think so much as make you vaguely uneasy. As an example, the contestants on the show get rewards for completing tasks, and when they get the reward they go to the camera and thank the brand that provided the reward. Nothing else happens about this, and the narrator (whose name is Lilly) doesn't really reflect on it much either. But it creates this sense of the absurd that contrasts with the darkness of what's going on between the characters.
-The narrator is an interesting point of view as well, because she's deliberately written as a bit dumb, which isn't something I've encountered before in a first person narrative.
-Most of the time all of the internal monologue just relates to the immediate situation in the compound, but there are a small number of times when we get a bit more reflection. There's a bit I highlighted where Lilly talks about dreading leaving the show and going home, and it ends with:
+_The Compound_ is a hell of a vibe. It's compulsively readable while also being creepy as fuck.
+It takes place in an indeterminate dystopian near-future, where we get only the tiniest glimpses of life outside the eponymous compound. Characters make reference to "the wars," and there is a general sense that life as a whole has gone downhill. But it's entirely possible to imagine that these characters live in our world; nothing they describe about the dystopia is wholly incompatible with life in 2026, it's just a bit magnified or brought into focus.
+The setting is a reality TV show set in an isolated compound in the desert, a bit of Love Island meets Survivor. The characters are paired off for romantic drama, but also pushed to extremes by the producers, who withhold food or water in the interest of creating tension or pushing specific tasks or challenges.
+The satire is more ambient than directly stated. To give one example, the contestants on the show get rewards for completing tasks, after which they're expected to walk up to a camera and thank the brand that provided the reward. This is never really commented on or used to make a point, it's simply there, an absurd reflection of late-stage consumer capitalism contrasted with the darkness of what the characters are being put through.
+The protagonist and narrator, Lilly, is at once extremely shallow and more than a little stupid, while also having good intuitions about people and their motivations, and more self-awareness than you'd give her credit for were you a viewer of the show. Most of her inner monologue is in the moment, focused on the people and situations immediately in front of her, but very occasionally she dips into reflection. Here's the final few sentences of a longer passage I highlighted, in which Lilly talks about her dread of leaving the show and going home:
 > What did it matter to wake up at the same time every morning and wear the same clothes and try to eat more protein but less sugar, when an earthquake or a tsunami or a bomb might end it all at any minute? Or maybe we would all continue to boil, slowly but surely, in the mess that we pretended was an acceptable place to live.
-I loved the writing style: tight pace, not overly flowery, but it had this creeping sense of unease or dread that made me want to keep reading, even when not much was happening. I haven't read that many thriller type books but I think this is something I really enjoy, when something manages to be a page turner without relying on plot twists or crazy stuff happening all the time.
+The style isn't flowery but it is sharp and expressive, and personally I found the pacing to be perfect: tense without the need for plot twists or constant action.
+Didn't overstay its welcome, the ending was ambiguous and open ended but I thought it was everything it needed to be.
+In writing this I've _very nearly_ convinced myself to bump it up to five stars. Maybe I'll come back to this one.


@@ -2,6 +2,9 @@ baseURL = "https://monotrope.au"
 languageCode = "en-au"
 title = "Monotrope"
+[markup.goldmark.extensions.typographer]
+  disable = false
 [markup.goldmark.renderer]
   unsafe = false
@@ -10,4 +13,4 @@ title = "Monotrope"
   section = ["HTML", "RSS"]
 [params]
-description = "writing on software, books, and ideas."
+description = "writing on technology, books, and ideas."


@@ -6,9 +6,9 @@
 <meta name="viewport" content="width=device-width, initial-scale=1.0">
 <title>{{ if not .IsHome }}{{ .Title }} · {{ end }}{{ .Site.Title }}</title>
 <meta name="description" content="{{ with .Description }}{{ . }}{{ else }}{{ .Site.Params.description }}{{ end }}">
-<link rel="preconnect" href="https://fonts.googleapis.com">
-<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
-<link href="https://fonts.googleapis.com/css2?family=JetBrains+Mono:ital,wght@0,300;0,400;0,500;1,400&family=Spectral:ital,wght@0,400;0,600;1,400&display=swap" rel="stylesheet">
+<link rel="preload" href="/fonts/jetbrains-mono-latin.woff2" as="font" type="font/woff2" crossorigin>
+<link rel="preload" href="/fonts/spectral-400-latin.woff2" as="font" type="font/woff2" crossorigin>
+<link rel="icon" href="/favicon.svg" type="image/svg+xml">
 <link rel="stylesheet" href="/css/main.css">
 {{ range .AlternativeOutputFormats -}}
 {{ printf `<link rel="%s" type="%s" href="%s" title="%s">` .Rel .MediaType.Type .Permalink $.Site.Title | safeHTML }}


@@ -17,8 +17,13 @@
       <span class="rating-blocks">{{- strings.Repeat $rating "▓" -}}{{- strings.Repeat (sub 5 $rating) "░" -}}</span>
       <span class="rating-num">{{ $rating }}/5</span>
     </span>
   </div>
+  {{ with .Params.cover }}
+    <img class="book-cover" src="{{ . }}" alt="Cover of {{ $.Title }}">
+  {{ end }}
   {{ .Content }}
   {{ with .Params.tags }}

Binary file not shown (image, 34 KiB).


@@ -1,5 +1,96 @@
 /* ── Fonts ─────────────────────────────────────── */
-@import url('https://fonts.googleapis.com/css2?family=JetBrains+Mono:ital,wght@0,300;0,400;0,500;1,400&family=Spectral:ital,wght@0,400;0,600;1,400&display=swap');
+/* JetBrains Mono — latin-ext */
+@font-face {
+  font-family: 'JetBrains Mono';
+  font-style: normal;
+  font-weight: 300 500;
+  font-display: swap;
+  src: url('/fonts/jetbrains-mono-latin-ext.woff2') format('woff2');
+  unicode-range: U+0100-02BA, U+02BD-02C5, U+02C7-02CC, U+02CE-02D7, U+02DD-02FF, U+0304, U+0308, U+0329, U+1D00-1DBF, U+1E00-1E9F, U+1EF2-1EFF, U+2020, U+20A0-20AB, U+20AD-20C0, U+2113, U+2C60-2C7F, U+A720-A7FF;
+}
+/* JetBrains Mono — latin */
+@font-face {
+  font-family: 'JetBrains Mono';
+  font-style: normal;
+  font-weight: 300 500;
+  font-display: swap;
+  src: url('/fonts/jetbrains-mono-latin.woff2') format('woff2');
+  unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+0304, U+0308, U+0329, U+2000-206F, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
+}
+/* JetBrains Mono italic — latin-ext */
+@font-face {
+  font-family: 'JetBrains Mono';
+  font-style: italic;
+  font-weight: 400;
+  font-display: swap;
+  src: url('/fonts/jetbrains-mono-italic-400-latin-ext.woff2') format('woff2');
+  unicode-range: U+0100-02BA, U+02BD-02C5, U+02C7-02CC, U+02CE-02D7, U+02DD-02FF, U+0304, U+0308, U+0329, U+1D00-1DBF, U+1E00-1E9F, U+1EF2-1EFF, U+2020, U+20A0-20AB, U+20AD-20C0, U+2113, U+2C60-2C7F, U+A720-A7FF;
+}
+/* JetBrains Mono italic — latin */
+@font-face {
+  font-family: 'JetBrains Mono';
+  font-style: italic;
+  font-weight: 400;
+  font-display: swap;
+  src: url('/fonts/jetbrains-mono-italic-400-latin.woff2') format('woff2');
+  unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+0304, U+0308, U+0329, U+2000-206F, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
+}
+/* Spectral — latin-ext */
+@font-face {
+  font-family: 'Spectral';
+  font-style: normal;
+  font-weight: 400;
+  font-display: swap;
+  src: url('/fonts/spectral-400-latin-ext.woff2') format('woff2');
+  unicode-range: U+0100-02BA, U+02BD-02C5, U+02C7-02CC, U+02CE-02D7, U+02DD-02FF, U+0304, U+0308, U+0329, U+1D00-1DBF, U+1E00-1E9F, U+1EF2-1EFF, U+2020, U+20A0-20AB, U+20AD-20C0, U+2113, U+2C60-2C7F, U+A720-A7FF;
+}
+/* Spectral — latin */
+@font-face {
+  font-family: 'Spectral';
+  font-style: normal;
+  font-weight: 400;
+  font-display: swap;
+  src: url('/fonts/spectral-400-latin.woff2') format('woff2');
+  unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+0304, U+0308, U+0329, U+2000-206F, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
+}
+/* Spectral 600 — latin-ext */
+@font-face {
+  font-family: 'Spectral';
+  font-style: normal;
+  font-weight: 600;
+  font-display: swap;
+  src: url('/fonts/spectral-600-latin-ext.woff2') format('woff2');
+  unicode-range: U+0100-02BA, U+02BD-02C5, U+02C7-02CC, U+02CE-02D7, U+02DD-02FF, U+0304, U+0308, U+0329, U+1D00-1DBF, U+1E00-1E9F, U+1EF2-1EFF, U+2020, U+20A0-20AB, U+20AD-20C0, U+2113, U+2C60-2C7F, U+A720-A7FF;
+}
+/* Spectral 600 — latin */
+@font-face {
+  font-family: 'Spectral';
+  font-style: normal;
+  font-weight: 600;
+  font-display: swap;
+  src: url('/fonts/spectral-600-latin.woff2') format('woff2');
+  unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+0304, U+0308, U+0329, U+2000-206F, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
+}
+/* Spectral italic — latin-ext */
+@font-face {
+  font-family: 'Spectral';
+  font-style: italic;
+  font-weight: 400;
+  font-display: swap;
+  src: url('/fonts/spectral-italic-400-latin-ext.woff2') format('woff2');
+  unicode-range: U+0100-02BA, U+02BD-02C5, U+02C7-02CC, U+02CE-02D7, U+02DD-02FF, U+0304, U+0308, U+0329, U+1D00-1DBF, U+1E00-1E9F, U+1EF2-1EFF, U+2020, U+20A0-20AB, U+20AD-20C0, U+2113, U+2C60-2C7F, U+A720-A7FF;
+}
+/* Spectral italic — latin */
+@font-face {
+  font-family: 'Spectral';
+  font-style: italic;
+  font-weight: 400;
+  font-display: swap;
+  src: url('/fonts/spectral-italic-400-latin.woff2') format('woff2');
+  unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02BB-02BC, U+02C6, U+02DA, U+02DC, U+0304, U+0308, U+0329, U+2000-206F, U+20AC, U+2122, U+2191, U+2193, U+2212, U+2215, U+FEFF, U+FFFD;
+}
 /* ── Reset ─────────────────────────────────────── */
 *, *::before, *::after {
@@ -336,6 +427,14 @@ article hr {
 }
 /* ── Book review ───────────────────────────────── */
+.book-cover {
+  float: right;
+  width: 180px;
+  margin: 0 0 1rem 1.5rem;
+  border: 1px solid var(--border);
+  border-radius: 2px;
+}
 .book-meta {
   font-family: var(--font-mono);
   font-size: 0.75rem;
@@ -528,6 +627,13 @@ article hr {
   padding-top: 0;
 }
+.book-cover {
+  float: none;
+  display: block;
+  width: 100px;
+  margin: 0 0 1.25rem 0;
+}
 .book-meta {
   grid-template-columns: 1fr;
   gap: 0.5rem;

site/static/favicon.svg Normal file

@@ -0,0 +1,16 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32">
<rect width="32" height="32" fill="#0b0d0a"/>
<!-- Pixel art M — each "pixel" is a 2x2 block on the 32x32 grid -->
<!-- Left vertical stroke -->
<rect x="6" y="8" width="4" height="18" fill="#5bff8f"/>
<!-- Right vertical stroke -->
<rect x="22" y="8" width="4" height="18" fill="#5bff8f"/>
<!-- Left diagonal -->
<rect x="10" y="10" width="2" height="4" fill="#5bff8f"/>
<rect x="12" y="14" width="2" height="4" fill="#5bff8f"/>
<!-- Right diagonal -->
<rect x="20" y="10" width="2" height="4" fill="#5bff8f"/>
<rect x="18" y="14" width="2" height="4" fill="#5bff8f"/>
<!-- Center peak -->
<rect x="14" y="16" width="4" height="4" fill="#5bff8f"/>
</svg>


Binary files not shown (8 files).