Backend

LLM Proxy API

Build an API wrapper around OpenAI — add request logging, API key auth, response caching, and basic error handling. Your first AI-aware backend service.

365 days access
Beginner
Total Fee: 99

Project Overview

You will learn to:

  • Build a reverse proxy API that wraps a third-party LLM service
  • Implement API key authentication with middleware
  • Log structured request data (latency, tokens, model) to a database
  • Cache identical LLM requests using SHA-256 hashed keys in Redis
  • Handle upstream API errors and timeouts gracefully
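The caching objective above hinges on one detail: two logically identical requests must map to the same cache key. A minimal sketch of that idea, assuming the key is derived from the model, messages, and temperature (the function name and key prefix are illustrative, not part of the project spec):

```python
import hashlib
import json

def cache_key(model: str, messages: list, temperature: float = 0.0) -> str:
    """Derive a deterministic Redis key from the request parameters."""
    # sort_keys makes the JSON serialization canonical, so requests that
    # differ only in dict ordering still hash to the same key.
    payload = json.dumps(
        {"model": model, "messages": messages, "temperature": temperature},
        sort_keys=True,
    )
    return "llmcache:" + hashlib.sha256(payload.encode("utf-8")).hexdigest()

key = cache_key("gpt-4o-mini", [{"role": "user", "content": "Hello"}])
```

In the proxy, you would check Redis for this key before calling the upstream API and store the response under it afterward, typically with a TTL.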

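The authentication milestone can be prototyped framework-free before wiring it into middleware. A hedged sketch, assuming keys arrive as a `Bearer` token in the `Authorization` header (the key store and error shape here are placeholders, not the project's required design):

```python
# Hypothetical in-memory key store; a real service would load keys
# from a database and compare hashed values.
VALID_KEYS = {"demo-key-123"}

def authenticate(headers: dict):
    """Return None if the request is authorized, else a (status, body) error."""
    token = headers.get("Authorization", "").removeprefix("Bearer ").strip()
    if token not in VALID_KEYS:
        return (401, {"error": "invalid or missing API key"})
    return None
```

Wrapping this check in middleware means every proxied route rejects unauthenticated traffic before any upstream LLM call is made.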
Technologies You'll Use

Python · API · Express · SQL

What's Included

  • Detailed Project Requirements
  • Implementation Milestones
  • Submission Checklist
  • Review Guidance
  • Certificate of Completion