Don’t Let That Consistent Launch Time Slip Away
Article Summary
Dream11's 110M+ users join fantasy sports contests minutes before deadlines. A slow app launch means lost revenue and unfilled contests.
Dream11's engineering team built an automated system to monitor and maintain consistent app launch times across iOS and Android. They needed insights before release, not after, to keep performance in check during rapid weekly releases.
Key Takeaways
- Automated Jenkins + Fastlane pipeline tests launch time on every build
- Firebase Performance (Android) and custom metrics (iOS) capture real data
- Historical metrics in Kibana with Slack alerts catch regressions early
- Launch time data available pre-release enables fixes without delaying timelines
- Threshold-based automation removes manual review from the release process
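The threshold gate in the last takeaway can be sketched as a simple pass/fail check that a CI job runs after collecting launch-time samples. This is a minimal illustration, not Dream11's actual implementation; the baseline value, margin, and function names are hypothetical:

```python
import statistics
import sys

# Hypothetical baseline: median launch time (ms) from the previous release,
# plus an allowed regression margin before the build is flagged.
BASELINE_MEDIAN_MS = 1200
ALLOWED_REGRESSION_PCT = 10

def launch_time_gate(samples_ms: list) -> bool:
    """Return True if the build's median launch time is within the threshold."""
    median = statistics.median(samples_ms)
    limit = BASELINE_MEDIAN_MS * (1 + ALLOWED_REGRESSION_PCT / 100)
    return median <= limit

if __name__ == "__main__":
    # Example: launch times collected from repeated cold starts on a simulator
    samples = [1180, 1225, 1190, 1210, 1205]
    if not launch_time_gate(samples):
        sys.exit(1)  # non-zero exit fails the CI build, blocking the regression
```

Failing the build via the exit code is what lets a Jenkins job block a release automatically, with no human in the loop.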
Dream11 maintains consistent app launch times across releases by automating performance testing before code ships, protecting user experience during peak traffic.
About This Article
Dream11's app launch times kept getting slower with each weekly release, and during major tournaments like the IPL this became a real problem: users would join contests in the final 30 minutes before deadlines, and the app couldn't handle the load.
The Dream11 Engineering team set up automated Jenkins jobs that used Fastlane scripts to build and test the app repeatedly on simulators. They collected launch time data including minimum, maximum, average, and median values, then posted the results to Slack for immediate visibility.
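The aggregation step described above can be illustrated with a short sketch: reduce the raw samples to the minimum, maximum, average, and median, then format a message for a Slack post. The function and field names here are assumptions for illustration, not Dream11's code:

```python
import statistics

def summarize_launch_times(samples_ms: list) -> dict:
    """Reduce raw launch-time samples (ms) to the summary metrics."""
    return {
        "min": min(samples_ms),
        "max": max(samples_ms),
        "avg": round(statistics.mean(samples_ms), 1),
        "median": statistics.median(samples_ms),
    }

def format_slack_message(build: str, stats: dict) -> str:
    """Plain-text body suitable for posting to a Slack webhook."""
    return (
        f"Launch time for build {build}: "
        f"min {stats['min']}ms, max {stats['max']}ms, "
        f"avg {stats['avg']}ms, median {stats['median']}ms"
    )
```

Posting a compact summary like this after every build is what gives the team the "immediate visibility" the article describes, without anyone opening a dashboard.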
The team could see launch time data before releasing the app instead of waiting hours after release. This let them spot problems early and fix them without delaying timelines. Performance stayed consistent across different device configurations.