# Architecture Overview
Kairos follows Clean Architecture with a Cargo workspace separating concerns into independent crates. Multiple frontends (CLI, Web) share a single API backend.
## High-Level Architecture
```mermaid
graph TB
    subgraph "Frontend Layer"
        CLI["kairos-cli\nclap + reqwest"]
        Web["Web App\n(React)"]
    end
    subgraph "API Layer"
        API["kairos-api\naxum + tower"]
    end
    subgraph "Application Layer"
        Core["kairos-core\nEntities + Traits + Services"]
    end
    subgraph "Infrastructure Layer"
        Platform["kairos-platform\nSeek / Indeed / LinkedIn"]
        LLM["kairos-llm\nrig-core + Claude"]
        DB["kairos-db\nrusqlite + SQLite"]
    end
    CLI --> API
    Web --> API
    API --> Core
    Platform --> Core
    LLM --> Core
    DB --> Core
```
## Dependency Flow
```text
kairos-api (HTTP server — the unified backend)
+-- kairos-core (domain: entities, traits, errors)
+-- kairos-platform (implements JobSearchService)
|   +-- kairos-core
+-- kairos-llm (implements JdAnalysisService, ResumeTailorService)
|   +-- kairos-core
+-- kairos-db (implements *Repository traits)
    +-- kairos-core

kairos-client (shared HTTP client library)
+-- kairos-core (shared types for deserialization)

kairos-cli (HTTP client via kairos-client)
+-- kairos-core
+-- kairos-client
```
**Rule:** Infrastructure crates depend on kairos-core only for trait definitions. kairos-core depends on nothing — it is the innermost layer. All frontends go through kairos-api — no exceptions.
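The rule above can be sketched in code. This is a minimal, self-contained illustration, not the real Kairos API: `Job`, `JobRepository`, and `InMemoryJobRepository` are hypothetical names, and an in-memory map stands in for the rusqlite-backed implementation in kairos-db.

```rust
// --- kairos-core (sketch): domain entity + trait, no infrastructure deps ---
#[derive(Debug, Clone, PartialEq)]
pub struct Job {
    pub id: u64,
    pub title: String,
}

pub trait JobRepository {
    fn save(&mut self, job: Job);
    fn find(&self, id: u64) -> Option<Job>;
}

// --- kairos-db (sketch): an in-memory stand-in for the rusqlite impl ---
use std::collections::HashMap;

pub struct InMemoryJobRepository {
    jobs: HashMap<u64, Job>,
}

impl InMemoryJobRepository {
    pub fn new() -> Self {
        Self { jobs: HashMap::new() }
    }
}

// The infrastructure crate depends on kairos-core for the trait; the
// dependency arrow points inward, never the other way around.
impl JobRepository for InMemoryJobRepository {
    fn save(&mut self, job: Job) {
        self.jobs.insert(job.id, job);
    }

    fn find(&self, id: u64) -> Option<Job> {
        self.jobs.get(&id).cloned()
    }
}

fn main() {
    let mut repo = InMemoryJobRepository::new();
    repo.save(Job { id: 1, title: "Rust Engineer".to_string() });
    println!("found: {:?}", repo.find(1).map(|j| j.title));
}
```

Because the trait lives in the core crate, kairos-api can be tested against any implementation — including a fake like this one — without touching SQLite.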
## Crate Responsibilities
| Crate | Responsibility | Key Trait |
|---|---|---|
| kairos-core | Domain entities, service traits, error types | All trait definitions |
| kairos-api | HTTP API server, routes, auth, WebSocket | — (binary) |
| kairos-client | Shared HTTP client library for CLI (and future frontends) | — (library) |
| kairos-cli | Command-line HTTP client | — (binary) |
| kairos-platform | Job scraping from Seek/Indeed/LinkedIn | JobSearchService |
| kairos-llm | JD analysis + resume tailoring via Claude | JdAnalysisService, ResumeTailorService |
| kairos-db | SQLite persistence | JobRepository, ResumeRepository, ApplicationRepository |
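Since kairos-api is the only crate that sees concrete implementations, it can hold them behind trait objects in its shared state. The sketch below shows one plausible wiring pattern (`Arc<dyn Trait>`); `JobSearchService`'s signature, `SeekScraper`, and `AppState` are all illustrative names, not the actual Kairos types.

```rust
use std::sync::Arc;

// Hypothetical core trait (in the real workspace this lives in kairos-core).
pub trait JobSearchService {
    fn search(&self, query: &str) -> Vec<String>;
}

// Hypothetical infrastructure impl (would live in kairos-platform).
struct SeekScraper;

impl JobSearchService for SeekScraper {
    fn search(&self, query: &str) -> Vec<String> {
        vec![format!("seek: {query}")]
    }
}

// In an axum server, a struct like this would be the shared application
// state handed to every route handler.
#[derive(Clone)]
struct AppState {
    search: Arc<dyn JobSearchService + Send + Sync>,
}

fn main() {
    // Swapping SeekScraper for another impl (or a test fake) changes
    // nothing downstream: handlers only see the trait.
    let state = AppState { search: Arc::new(SeekScraper) };
    let hits = state.search.search("rust engineer");
    println!("{hits:?}");
}
```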
## Async Architecture
```mermaid
flowchart LR
    subgraph "API Server"
        Routes[axum Routes]
        Scrape["Scrape Jobs\ntokio::spawn"]
        Analyze["Analyze JD\ntokio::spawn"]
        Tailor["Tailor Resume\ntokio::spawn"]
    end
    Clients["CLI / Web"] -->|HTTP/WS| Routes
    Routes --> Scrape
    Routes --> Analyze
    Routes --> Tailor
    Scrape -->|WebSocket| Clients
    Analyze -->|WebSocket| Clients
    Tailor -->|WebSocket| Clients
```
All I/O-heavy operations (scraping, LLM calls) run as `tokio::spawn` tasks in the API server; frontends receive progress updates over WebSocket.
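The spawn-and-report pattern can be sketched as follows. To keep the example self-contained, std threads and an mpsc channel stand in for `tokio::spawn` and the WebSocket connection; `Progress` and `spawn_scrape` are illustrative names, and the site list is taken from the platforms above.

```rust
use std::sync::mpsc;
use std::thread;

// Progress events the background task streams back to the client
// (over WebSocket in the real server).
#[derive(Debug, PartialEq)]
enum Progress {
    Started,
    Step(&'static str),
    Done,
}

// Plays the role of a spawned task (`tokio::spawn` in the API server):
// the route handler returns immediately while this runs to completion.
fn spawn_scrape(tx: mpsc::Sender<Progress>) -> thread::JoinHandle<()> {
    thread::spawn(move || {
        tx.send(Progress::Started).unwrap();
        for site in ["seek", "indeed", "linkedin"] {
            // ...fetch and parse listings for `site` here...
            tx.send(Progress::Step(site)).unwrap();
        }
        tx.send(Progress::Done).unwrap();
    })
}

fn main() {
    // The channel plays the role of the WebSocket back to the frontend.
    let (tx, rx) = mpsc::channel();
    let handle = spawn_scrape(tx);

    // The frontend renders progress events as they arrive.
    for event in rx {
        println!("{event:?}");
    }
    handle.join().unwrap();
}
```

The key property is the same in both versions: the long-running work is decoupled from the request that triggered it, so clients get incremental progress instead of one blocking response.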