StreamCart Real-Time Analytics
Process clickstream events on the fly. Build a low-latency architecture to power live Black Friday sales dashboards.
| TX_ID | USER | STATUS | LATENCY |
|---|---|---|---|
| #101 | u_882 | CLEAN | 0.8ms |
| #102 | u_431 | FLAGGED | 1.2ms |
| #103 | u_219 | CLEAN | 0.5ms |
| #104 | u_667 | FLAGGED | 1.1ms |
fig 1 — real-time fraud detection monitor

Key figures:
- Latency: <100ms fraud detection
- Semantics: exactly-once (EOS)
- State store: RocksDB-optimized
- Reliability: replication factor 3
- Brokers: 3 (KRaft cluster)
- Observability: 15+ Grafana dashboard panels
Kafka Topology
The streaming pipeline from ingestion to production deployment.
What You'll Build
A complete fraud detection system — from local Kafka cluster to production Kubernetes deployment with full observability.
Multi-Broker Cluster
3-broker KRaft mode Kafka cluster with Schema Registry, Avro schemas, and Kafka UI for real-time monitoring
Real-Time Detection
Windowed aggregations, velocity checks, geographic anomaly detection, and dead letter queue routing
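The velocity check described above can be sketched in plain Java without Kafka, assuming a per-user sliding window over event timestamps; the class and threshold names here are hypothetical, and in the real pipeline this state would live in a RocksDB-backed store rather than an in-memory deque:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch of a per-user velocity check: flag a user who
// submits more than maxEvents transactions inside a sliding time window.
class VelocityChecker {
    private final long windowMs;
    private final int maxEvents;
    private final Deque<Long> timestamps = new ArrayDeque<>();

    VelocityChecker(long windowMs, int maxEvents) {
        this.windowMs = windowMs;
        this.maxEvents = maxEvents;
    }

    /** Record one event; return true if the user should be FLAGGED. */
    boolean onEvent(long eventTimeMs) {
        timestamps.addLast(eventTimeMs);
        // Evict events that have fallen out of the window.
        while (!timestamps.isEmpty()
                && eventTimeMs - timestamps.peekFirst() > windowMs) {
            timestamps.removeFirst();
        }
        return timestamps.size() > maxEvents;
    }
}
```

A checker configured as `new VelocityChecker(60_000, 3)` flags any user with four or more events inside one minute, then clears itself as old events age out.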
Stream Enrichment
KStream-KTable joins for customer/merchant enrichment, Interactive Queries REST API, and Kafka Connect sinks
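The KStream-KTable join pattern can be illustrated with a plain-Java stand-in (class and method names are hypothetical): the stream side is a sequence of transactions, the table side is the latest customer record per key. In Kafka Streams the table would be a changelog-backed KTable; a HashMap stands in here:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Plain-Java stand-in for a KStream-KTable left join: each transaction
// event (stream) is enriched with the latest customer profile (table).
class EnrichmentJoin {
    private final Map<String, String> customerTable = new HashMap<>();

    /** Table side: upsert the latest customer profile, keyed by user id. */
    void upsertCustomer(String userId, String profile) {
        customerTable.put(userId, profile);
    }

    /** Stream side: left-join one transaction against the table. */
    String enrich(String userId, String txPayload) {
        String profile = Optional.ofNullable(customerTable.get(userId))
                                 .orElse("UNKNOWN_CUSTOMER");
        return txPayload + " | customer=" + profile;
    }
}
```

As in a real left join, a transaction whose user has no table entry still flows through, tagged with a sentinel value instead of being dropped.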
K8s + Chaos Labs
Strimzi operator deployment, HPA auto-scaling, Prometheus/Grafana (15+ panels), and failure recovery scenarios
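For orientation, a Strimzi deployment of the 3-broker cluster is declared as a `Kafka` custom resource. This is an illustrative minimal sketch (resource name and listener layout are assumptions; the exact spec depends on your Strimzi operator version):

```yaml
# Illustrative Strimzi Kafka resource for a 3-broker cluster.
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: streamguard-cluster   # hypothetical name
spec:
  kafka:
    replicas: 3
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false
    config:
      default.replication.factor: 3
      min.insync.replicas: 2
```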
Progressive Build Path
Four parts, each building on the last, taking you from local setup to production Kubernetes.
Infrastructure Standards
Production patterns you'll implement across the streaming platform.
- Exactly-once semantics with idempotent producers and transactional consumers
- Horizontal scaling via the Strimzi operator with auto-scaling pod replicas
- Chaos labs covering broker failures, state corruption, and network-partition recovery
- Sub-millisecond state store lookups with RocksDB and changelog topics
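The exactly-once item above rests on a handful of standard Kafka producer configs. A minimal sketch (the `transactional.id` value is a hypothetical example; property names are the standard Kafka client config keys):

```java
import java.util.Properties;

// Producer settings behind exactly-once semantics.
class ExactlyOnceConfig {
    static Properties producerProps() {
        Properties props = new Properties();
        // Idempotence de-duplicates retried sends within a producer session.
        props.put("enable.idempotence", "true");
        // acks=all is required for idempotent producers.
        props.put("acks", "all");
        // A stable transactional.id enables atomic, cross-session writes.
        props.put("transactional.id", "streamguard-fraud-tx-1");
        return props;
    }
}
```

On the consumer side, the matching half of the contract is reading with `isolation.level=read_committed` so that aborted transactions are never observed.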
Environment Setup
Launch the Kafka cluster and register your first Avro schema.
```shell
# Clone StreamGuard & launch Kafka cluster
$ git clone https://github.com/aide-hub/streamguard.git
$ cd streamguard

# Start 3-broker KRaft cluster + Schema Registry + Kafka UI
$ docker-compose -f docker-compose.kafka.yml up -d

# Register Avro schema for transactions
# (an Avro record must declare a "fields" array)
$ curl -X POST http://localhost:8081/subjects/transactions-value/versions \
    -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    -d '{"schema": "{\"type\": \"record\", \"name\": \"Transaction\", \"fields\": [{\"name\": \"tx_id\", \"type\": \"string\"}]}"}'
```
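A single-field schema is enough to exercise the registry, but a fuller Transaction record is more realistic. The field names and namespace below are illustrative assumptions, loosely mirroring the columns in fig 1:

```json
{
  "type": "record",
  "name": "Transaction",
  "namespace": "com.streamguard.events",
  "fields": [
    {"name": "tx_id", "type": "string"},
    {"name": "user_id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "timestamp",
     "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}
```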
Tech Stack
Prerequisites
- Java fundamentals (classes, streams API, lambda expressions)
- Docker basics (containers, docker-compose commands)
- Kafka basics (topics, producers, consumers)
- Kubernetes concepts (pods, services, deployments)
Related Learning Path
Master Kafka architecture, stream processing patterns, and production deployment strategies before tackling this capstone project.
Kafka Streams Learning Path

Ready to build production stream processing?
Start with Part 1: Ingestion & Schema Registry