Enhancing Chat Performance on PhonePe Android - Part 1
Article Summary
Ayush Bagaria from PhonePe set out to cut chat latency by 50% in their P2P payment flow. The journey taught some expensive lessons about the gap between POC results and production reality.
PhonePe's Android team tackled performance issues in their peer-to-peer payment chat feature, one of their most critical user flows. This first installment covers three optimization approaches: debugging slow database queries, experimenting with deserialization libraries, and implementing pagination.
Key Takeaways
- Protobuf showed 5x faster deserialization in the POC but matched GSON in production, where caching masked the difference
- Pagination reduced chat roster load time by 30-33% at P90 (from 3-3.4 seconds to 2-2.3 seconds)
- SQLite's EXPLAIN QUERY PLAN revealed that most slow queries came from complex joins, not missing indexes
- GSON result caching cuts repeat deserialization from roughly 200ms to 25-35ms, negating the benefit of swapping libraries
These three approaches collectively improved latency by 20%, with the biggest lesson being that POC performance gains don't always translate to production environments.
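The caching effect behind the GSON takeaway can be illustrated with a small sketch. This is not PhonePe's implementation (the article doesn't show one); it is a hypothetical memoizing wrapper that keys parsed objects by their raw payload, so a repeat deserialization becomes a map lookup, assuming payloads are immutable and identical bytes mean identical objects:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch: memoize deserialization results keyed by the raw payload.
// A payload seen before skips parsing entirely (assumes immutable payloads).
class CachingDeserializer<T> {
    private final Function<String, T> parse;
    private final Map<String, T> cache = new HashMap<>();
    private int parseCount = 0;

    CachingDeserializer(Function<String, T> parse) {
        this.parse = parse;
    }

    T deserialize(String raw) {
        return cache.computeIfAbsent(raw, key -> {
            parseCount++;          // incremented only on a cache miss
            return parse.apply(key);
        });
    }

    int getParseCount() { return parseCount; }
}
```

Once most reads hit a cache like this, the underlying parser's speed stops mattering, which is why a 5x-faster library showed no production gain.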
About This Article
PhonePe's chat roster was loading the entire chat list from the database and rendering it all in a RecyclerView, even though users only see 6-8 threads at a time. For users with over 10,000 chat threads, the app wasted resources transforming data that would never appear on screen.
Ayush Bagaria's team adopted Android's Jetpack Paging library to load chat data in chunks instead of all at once. This cut down on database queries and reduced the number of views the app had to create and bind.
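In Jetpack Paging, a `PagingSource` feeds pages to a `PagingDataAdapter` on the RecyclerView side; the core idea can be sketched independently of Android as an on-demand page loader. All names below are illustrative, not PhonePe's code:

```java
import java.util.List;
import java.util.function.BiFunction;

// Illustrative sketch of the paging idea (not the Jetpack Paging API):
// fetch only the window of rows the UI is about to display, on demand,
// instead of materializing all 10,000+ threads up front.
class ChatPager<T> {
    private final BiFunction<Integer, Integer, List<T>> query; // (offset, limit) -> rows
    private final int pageSize;

    ChatPager(BiFunction<Integer, Integer, List<T>> query, int pageSize) {
        this.query = query;
        this.pageSize = pageSize;
    }

    /** Loads the given zero-based page, e.g. backed by a `... LIMIT ? OFFSET ?` query. */
    List<T> loadPage(int page) {
        return query.apply(page * pageSize, pageSize);
    }
}
```

With Room, the equivalent shape is a DAO method returning `PagingSource<Int, ChatThread>` collected by a `PagingDataAdapter`, so the database is only queried for pages the user scrolls near.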
Chat roster load time dropped by 30-33% at P90. Latency fell from 3-3.4 seconds to 2-2.3 seconds. The change opened the door for further performance improvements down the line.