
Global Sports Digital Platform

Re-architected high-traffic platform to serve 2M+ concurrent users at peak with edge caching, auto-scaling, and aggressive mobile optimization.

  • 2.1M concurrent users served at peak
  • 94 Lighthouse performance score
  • 99.99% uptime during the event
  • 40% infrastructure cost savings
  • 60% faster page loads
Overview

The system, in plain terms.

A global sports organization was preparing for a major international event expected to attract millions of concurrent visitors. Their existing platform suffered from slow page loads, frequent crashes during traffic spikes, and poor mobile performance. The platform needed to deliver a flawless experience to a worldwide audience under extreme load.

We completely re-architected the platform with performance as the primary focus. This included implementing edge caching, optimizing asset delivery, database query optimization, and building a resilient infrastructure that could auto-scale to meet demand. Special attention was paid to mobile performance, as over 70% of traffic came from mobile devices.

The platform successfully handled peak traffic of 2+ million concurrent users during the event, with dramatically improved performance metrics and zero downtime throughout the competition.

The challenge

What needed to be solved.

Re-engineered a high-traffic sports platform to handle millions of concurrent users during major events with significantly improved performance.

  • Optimizing database queries for real-time score updates
  • Delivering fast page loads to global audience
  • Handling traffic spikes during key moments
  • Optimizing mobile performance on slow networks
Performance optimization is about making smart tradeoffs at every layer of the stack.
— From the engagement retrospective
Objectives

What we set out to do.

  1. Reduce page load time by at least 50%
  2. Support 2+ million concurrent users during peak events
  3. Achieve 90+ Lighthouse performance scores
  4. Implement a global CDN for the worldwide audience
  5. Ensure zero downtime during high-traffic events
Our approach

How we built it.

Optimizing database queries for real-time score updates
Implemented materialized views, query optimization, and a Redis caching layer for frequently accessed data.
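
The caching layer described above follows a standard cache-aside pattern: check the cache, fall back to the database, then populate the cache with a short TTL so live scores stay fresh. A minimal sketch, with an in-memory `Map` standing in for Redis and a hypothetical `fetchScore` loader standing in for the real query against the materialized view:

```typescript
// Cache-aside with TTL expiry. A Map stands in for Redis here; the
// real system would use a Redis client with the same get/set semantics.
type Entry = { value: string; expiresAt: number };

class TtlCache {
  private store = new Map<string, Entry>();

  get(key: string): string | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(key); // expired: evict lazily on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: string, ttlMs: number): void {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }
}

// fetchScore is a hypothetical loader, not part of the original system.
async function getScore(
  cache: TtlCache,
  matchId: string,
  fetchScore: (id: string) => Promise<string>
): Promise<string> {
  const hit = cache.get(`score:${matchId}`);
  if (hit !== undefined) return hit; // cache hit: skip the database
  const fresh = await fetchScore(matchId);
  cache.set(`score:${matchId}`, fresh, 2_000); // short TTL for live data
  return fresh;
}
```

The short TTL is the tradeoff: scores may lag by up to two seconds, but repeated reads within that window never touch the database.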

Delivering fast page loads to a global audience
Deployed an edge CDN with intelligent caching, image optimization, and static asset pre-compression.
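
"Intelligent caching" at the edge mostly comes down to per-asset-type `Cache-Control` policies. A sketch of the idea, with illustrative values rather than the exact production config:

```typescript
// Per-asset-type Cache-Control headers for an edge CDN.
type AssetClass = "hashed-static" | "html" | "live-api";

function cacheControlFor(asset: AssetClass): string {
  switch (asset) {
    case "hashed-static":
      // Filename changes on every deploy, so the edge can cache forever.
      return "public, max-age=31536000, immutable";
    case "html":
      // Browser always revalidates; the edge holds a copy briefly.
      return "public, max-age=0, s-maxage=60, must-revalidate";
    case "live-api":
      // Serve slightly stale data while the edge refetches in background.
      return "public, s-maxage=2, stale-while-revalidate=10";
  }
}
```

`stale-while-revalidate` is what keeps perceived latency flat during refreshes: viewers get the cached response immediately while the edge fetches the new one.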

Handling traffic spikes during key moments
Built auto-scaling infrastructure with pre-warming capabilities and intelligent load distribution.
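
Pre-warming means computing capacity from a traffic forecast ahead of time instead of waiting for CPU metrics to react. A minimal sketch of that calculation; the field names and numbers are illustrative, not the production values:

```typescript
// Proactive scaling: size the fleet for a forecast spike, with spare
// headroom, instead of reacting to metrics after degradation starts.
interface Forecast {
  expectedRps: number;   // requests/sec expected at the spike
  rpsPerReplica: number; // measured capacity of a single replica
  headroom: number;      // e.g. 0.3 = keep 30% spare capacity
}

function desiredReplicas(f: Forecast, minReplicas = 3): number {
  const needed = Math.ceil(
    (f.expectedRps * (1 + f.headroom)) / f.rpsPerReplica
  );
  return Math.max(needed, minReplicas); // never scale below the floor
}
```

Scheduling this ahead of a known kickoff time, rather than letting a reactive autoscaler catch up, is what keeps latency flat through the first seconds of a spike.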

Optimizing mobile performance on slow networks
Implemented progressive loading, critical CSS inlining, and aggressive code splitting for faster time to interactive (TTI).
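
Critical CSS inlining puts above-the-fold rules directly in the document head so first paint does not wait on a stylesheet request; the full stylesheet loads afterwards. A minimal sketch, assuming the critical rules were already extracted at build time:

```typescript
// Inline pre-extracted critical CSS into the document head so the
// first paint does not block on an external stylesheet request.
function inlineCriticalCss(html: string, criticalCss: string): string {
  const styleTag = `<style>${criticalCss}</style>`;
  return html.replace("</head>", `${styleTag}</head>`);
}
```

On slow mobile networks this saves a round trip on the render-critical path, which is typically worth far more than the few kilobytes added to the HTML payload.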


Tech stack

What we used.

Next.js
React
TypeScript
PostgreSQL
Redis
Cloudflare
AWS
Kubernetes
GraphQL
Outcomes

What changed in production.

  1. 60% reduction in average page load time (3.2s to 1.3s)
  2. Successfully served 2.1M concurrent users at peak
  3. Lighthouse performance score improved from 45 to 94
  4. 99.99% uptime during the entire event period
  5. 40% reduction in infrastructure costs through optimization

What we learned

Lessons from shipping it.

Performance optimization is about making smart tradeoffs at every layer of the stack. We learned that edge caching and CDN strategy have the biggest impact on perceived performance for global audiences. However, cache invalidation strategies need careful planning—we implemented a hierarchical caching approach with different TTLs for different data types.
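
The hierarchical caching approach above can be sketched as a per-data-type TTL table, with separate values for the edge and the application cache. Categories and numbers here are illustrative of the idea, not the exact production config:

```typescript
// Hierarchical TTLs: each data type gets its own lifetime at each
// cache layer, keyed by how quickly the underlying data changes.
const ttlSeconds: Record<string, { edge: number; app: number }> = {
  "live-score":  { edge: 2, app: 1 },        // changes constantly
  "schedule":    { edge: 300, app: 60 },     // changes a few times a day
  "athlete-bio": { edge: 86400, app: 3600 }, // effectively static
};

function ttlFor(dataType: string, layer: "edge" | "app"): number {
  const entry = ttlSeconds[dataType];
  return entry ? entry[layer] : 0; // unknown types stay uncached
}
```

Making the TTL a function of data volatility, rather than one global number, is what lets live scores stay fresh without constantly invalidating the long-lived assets that dominate bandwidth.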

Database optimization was critical but often overlooked. Simple query optimizations and proper indexing provided massive improvements. We also learned that auto-scaling needs to be proactive, not reactive—by the time metrics trigger scaling, you're already experiencing degradation. Pre-warming instances before anticipated traffic spikes proved essential for maintaining performance.

Have a similar system to ship?

30-minute scoping call. We'll tell you if your use case is a fit and what shipping it actually looks like.

Start the conversation