# stream-docs
This repo hosts all STREAM documentation — both the internal developer docs (this site) and the external customer-facing docs — as two separate MkDocs projects living side-by-side.
```
stream-docs/
├── internal/   # MkDocs site you are reading right now
└── external/   # Public customer docs (DE + EN)
```
## Running locally
Both sites are served via Docker Compose. Start everything with:
```shell
just up
```
| Site | URL | Engine |
|---|---|---|
| External | http://localhost:8000 | MkDocs |
| Internal | http://localhost:8001 | Zensical |
Both servers watch for file changes and reload automatically, so you can edit Markdown and see the result in the browser immediately.
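Under the hood, `just up` is most likely a thin wrapper around Docker Compose. A sketch of what the recipe might look like (the recipe body and flags are assumptions; check the actual `justfile` in the repo root):

```just
# Start both docs servers with live reload (sketch; the real recipe may differ)
up:
    docker compose up --build
```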
## Adding a page
### Internal docs
- Create a `.md` file under `internal/docs/`.
- Add it to the `nav` section in `internal/mkdocs.yml`.
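For example, adding a hypothetical page `guides/release-checklist.md` (the file name and nav labels here are purely illustrative) would look roughly like this in `internal/mkdocs.yml`:

```yaml
nav:
  - Home: index.md
  - Guides:
      - Release checklist: guides/release-checklist.md
```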
### External docs
External docs are bilingual (German default, English secondary). The folder structure mirrors the locale:
```
external/docs/
├── de/   # German content (default)
└── en/   # English content
```
- Create the `.md` file in both `de/` and `en/` at the same relative path.
- Add the page to the `nav` in `external/mkdocs.yml`.
- If the nav label is new, add a German translation under `plugins.i18n.languages[de].nav_translations` in `external/mkdocs.yml`.
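A sketch of the relevant part of `external/mkdocs.yml`, using an illustrative page label (the key structure follows the mkdocs-static-i18n plugin; the exact options in this repo may differ):

```yaml
plugins:
  - i18n:
      languages:
        de:
          name: Deutsch
          default: true
          nav_translations:
            Release checklist: Release-Checkliste
        en:
          name: English
```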
## Diagrams
Both sites support Mermaid (fenced code blocks tagged `mermaid`) and PlantUML (via the `plantuml_markdown` extension). The PlantUML server is included in the Compose stack and available at `http://plantuml:8080` inside the container network, so no additional setup is required.
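For reference, wiring the extension to that server usually looks like this in `mkdocs.yml` (a sketch based on the `plantuml_markdown` extension's documented `server` option; the repo's actual config may differ):

```yaml
markdown_extensions:
  - plantuml_markdown:
      server: http://plantuml:8080
```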
### Mermaid
Use a fenced code block with the language tag `mermaid`:
```mermaid
sequenceDiagram
    participant Client
    participant API
    participant DB
    Client->>API: POST /products
    API->>DB: INSERT product
    DB-->>API: ok
    API-->>Client: 201 Created
```
### PlantUML
Use a fenced code block with the language tag `plantuml`:
```plantuml
@startuml
actor User
User -> API : request
API -> Cache : lookup
alt cache hit
    Cache --> API : data
else cache miss
    API -> DB : query
    DB --> API : data
    API -> Cache : store
end
API --> User : response
@enduml
```
## Publishing
Both sites are hosted on Cloudflare Workers and deploy automatically. There is nothing to build or push manually; just get your commit onto `main`.
- Make your changes.
- Create a conventional commit following the pattern used in this repo:
  - `chore(internal): <description>` for changes to the internal docs
  - `chore(external): <description>` for changes to the external docs
- Push to `main` directly, or open a PR and merge it.
The Cloudflare Workers deployment picks up the new commit automatically.
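Putting the steps together, a typical publish of an internal-docs change looks like this (the file path and commit message are illustrative):

```shell
# Stage the new page, commit with the repo's conventional-commit pattern, and push
git add internal/docs/release-checklist.md
git commit -m "chore(internal): add release checklist page"
git push origin main
```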