Mobile Lens Profiler: Case Study
Article Summary
Snap's engineering team just dropped a masterclass in mobile performance debugging. Ever wonder why your AR lens stutters on the first frame?
Snap published a detailed case study walking through real-world profiling of a Snapchat Lens built with Lens Studio 5.1. The team used Google's Perfetto UI to trace performance on an iPhone 13, examining everything from lens activation timing to dropped frames.
Key Takeaways
- Lens activation took 173ms, with 125ms spent on the main rendering thread
- First frame slowdown traced to deferred GPU texture uploading for jacket and hair assets
- Achieved a steady 30 FPS after the initial load, up from 28 FPS when the slow first frame is included
- ShapeTrack face-detection retries drain the battery through expensive neural-network calls
- Shader caching prevented major slowdowns during lens loading phase
The case study demonstrates how to pinpoint GPU driver stutters and texture upload bottlenecks that cause frame drops in AR experiences.
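The kind of frame-drop detection described can be sketched with a few lines of Python. The timestamps below are illustrative values, not data from Snap's trace; only the ~33ms budget follows from the 30 FPS target.

```python
# Minimal sketch: flag frames whose duration exceeds the 30 FPS budget.
# Timestamps are invented for illustration, not taken from the case study.
BUDGET_MS = 1000.0 / 30.0  # ~33.3 ms per frame at 30 FPS

def dropped_frames(timestamps_ms):
    """Return indices of frame intervals that blew the budget."""
    durations = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return [i for i, d in enumerate(durations) if d > BUDGET_MS]

frames = [0.0, 173.0, 206.0, 239.0, 272.0]  # slow first frame, then steady
print(dropped_frames(frames))  # → [0]
```

Only the first interval (173ms) exceeds the budget here, mirroring the slow-first-frame pattern the case study describes.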
About This Article
Snapchat's Lens Studio developers ran into a problem where frames took longer than 33ms (the per-frame budget for 30 FPS) to process when using the front camera. The first frame was especially slow compared to the rest, which they traced back to GPU driver behavior.
The team used Google's Perfetto UI to trace what was happening on an iPhone 13. They found that the Leather Jacket and Messy Hair assets were deferring their texture uploads to the GPU, which blocked the rendering thread when the initial draw calls referenced those textures.
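The cost of deferred uploads can be illustrated with a toy model. The asset names echo the article, but the millisecond costs and the simulation itself are assumptions for illustration, not measurements from the trace: deferring the upload pushes its cost into the first frame, while eager (prewarmed) uploads pay it during loading.

```python
# Toy model of deferred vs. eager GPU texture upload.
# Upload costs and the base frame cost are invented for illustration.
UPLOAD_MS = {"leather_jacket": 40.0, "messy_hair": 25.0}
BASE_FRAME_MS = 8.0  # assumed steady-state render cost per frame

def simulate_frames(deferred_uploads: bool, n_frames: int = 4):
    """Return per-frame times; deferred uploads stall the first draw."""
    pending = dict(UPLOAD_MS) if deferred_uploads else {}
    times = []
    for _ in range(n_frames):
        cost = BASE_FRAME_MS
        if pending:  # upload happens on first use, blocking this frame
            cost += sum(pending.values())
            pending.clear()
        times.append(cost)
    return times

print(simulate_frames(True))   # → [73.0, 8.0, 8.0, 8.0]
print(simulate_frames(False))  # → [8.0, 8.0, 8.0, 8.0]
```

In the deferred case only the first frame blows past a 33ms budget, which matches the shape of the slowdown the team observed.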
Once the first frame finished, the lens ran at a steady 30 FPS. Over 12.7 seconds, it rendered 351 frames without a single dropped frame. Fixing the texture-upload bottleneck let the lens handle complex assets without frame-rate issues.
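Steady-state figures like these come from simple arithmetic over the trace. The helper below is a generic sketch; the numbers in the usage line are hypothetical, not the case study's.

```python
# Generic sketch: average FPS from a frame count and wall-clock duration.
def average_fps(frame_count: int, duration_s: float) -> float:
    if duration_s <= 0:
        raise ValueError("duration must be positive")
    return frame_count / duration_s

print(average_fps(300, 10.0))  # → 30.0 (hypothetical numbers)
```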