Discover your next Milestone.

Choose from industry-vetted challenges. Build local, push to GitHub, and earn cryptographic proof of your engineering skills.

Streaming Chat API Endpoint

Backend · Beginner · 365-day access
99 onwards

Build the server side of a streaming chat: an SSE endpoint that proxies LLM chunks to the client in real time. Learn async generators, backpressure, and stream piping.

  • Understand the Server-Sent Events (SSE) protocol and its use cases
  • Build a streaming endpoint that proxies LLM response chunks in real time
  • Use Python async generators or Node.js streams for efficient chunk forwarding
  • Detect client disconnections and cancel upstream requests to avoid waste
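The core of this challenge can be sketched without any web framework: an async generator that wraps upstream chunks in SSE `data:` frames and uses a `finally` block as the hook for cancelling the upstream request when the client disconnects. Everything here is a minimal sketch; `fake_llm_chunks` is a hypothetical stand-in for a real LLM streaming call, and in a real service the generator would be handed to your framework's streaming response type.

```python
import asyncio

async def fake_llm_chunks():
    # Hypothetical upstream: yields text chunks the way an LLM
    # streaming API would. A real implementation would stream
    # from the provider over HTTP.
    for chunk in ["Hel", "lo, ", "world"]:
        await asyncio.sleep(0)  # simulate network latency
        yield chunk

async def sse_stream(upstream):
    """Wrap upstream chunks in Server-Sent Events frames.

    SSE frames are 'data: <payload>' followed by a blank line.
    If the client disconnects, an ASGI framework cancels this
    generator; the finally block is where you would cancel the
    upstream request to avoid wasted tokens.
    """
    try:
        async for chunk in upstream:
            yield f"data: {chunk}\n\n"
        yield "data: [DONE]\n\n"   # common end-of-stream sentinel
    finally:
        # Cleanup runs on normal completion *and* on cancellation,
        # so upstream teardown belongs here.
        pass

async def collect():
    # In production the framework consumes the generator; here we
    # just collect the frames to show what goes over the wire.
    return [frame async for frame in sse_stream(fake_llm_chunks())]

frames = asyncio.run(collect())
```

The key design point is that backpressure comes for free: each `yield` suspends until the client has consumed the previous frame, so slow clients never force the server to buffer the whole response.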

LLM Proxy API

Backend · Beginner · 365-day access
99 onwards

Build an API wrapper around OpenAI — add request logging, API key auth, response caching, and basic error handling. Your first AI-aware backend service.

  • Build a reverse proxy API that wraps a third-party LLM service
  • Implement API key authentication with middleware
  • Log structured request data (latency, tokens, model) to a database
  • Cache identical LLM requests using SHA-256 hashed keys in Redis
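The caching bullet above hinges on one idea: canonicalise the request, hash it, and use the hash as the cache key so byte-identical requests hit the cache. A minimal sketch, using a plain dict as a hypothetical stand-in for Redis (a real service would swap it for a Redis client with a TTL):

```python
import hashlib
import json

def cache_key(model, messages, temperature=0.0):
    # Serialise with sorted keys and fixed separators so that
    # logically identical requests produce identical bytes,
    # and therefore identical SHA-256 digests.
    payload = json.dumps(
        {"model": model, "messages": messages, "temperature": temperature},
        sort_keys=True,
        separators=(",", ":"),
    )
    return "llm:" + hashlib.sha256(payload.encode("utf-8")).hexdigest()

cache = {}  # stand-in for Redis; a real proxy would use SETEX for TTLs

def cached_completion(model, messages, call_llm):
    key = cache_key(model, messages)
    if key in cache:
        return cache[key]       # cache hit: no upstream call, no spend
    result = call_llm(model, messages)
    cache[key] = result         # cache miss: store for next time
    return result
```

Note that temperature belongs in the key: two requests that differ only in sampling parameters should not share a cached response.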
12k+ Verified Developers
150+ Active Projects
450+ Companies Hiring
14 Days Avg. Completion

Got questions?

Every challenge includes detailed documentation, technical constraints, and automated evaluation scripts to ensure you have everything you need to succeed.