Trusting Metrics at Pinterest
Article Summary
Pinterest relies on data to drive decisions and ML models. But what happens when a metric as simple as Daily Active Users gets counted wrong?
Pinterest's engineering team built a three-step certification process to ensure their core metrics stay accurate. Ryan Cooke, Engineering Manager, shares how they caught edge cases like browser extensions auto-logging users or activity being double-counted across platforms.
Key Takeaways
- Created product and technical specs to define exact metric behavior across platforms
- Built data checkers that automatically flag unexplained DAU and other metric anomalies
- Implemented UI tests that block releases if metrics fail validation
- Run quarterly health checks to catch new use cases and edge cases
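The release-blocking tests in the list above amount to asserting that a scripted UI flow logs exactly the events the certified metric spec expects. A minimal sketch of that gate, with hypothetical event names and function names (the article doesn't describe Pinterest's actual test harness):

```python
def release_gate(ui_logged_events, expected_events):
    """Release-blocking check: the UI flow must emit exactly the
    expected metric events -- no missing logs, no extras.

    Returns True when the build may ship. Event names here are
    hypothetical placeholders, not Pinterest's real event taxonomy.
    """
    # Order-insensitive comparison, but duplicates still fail:
    # an extra auto-logged event is exactly the kind of bug to catch.
    return sorted(ui_logged_events) == sorted(expected_events)

# A test drives the app through a pin view, then compares what was
# actually logged against the spec.
assert release_gate(["app_open", "pin_view"], ["pin_view", "app_open"])
assert not release_gate(["app_open", "app_open", "pin_view"],
                        ["app_open", "pin_view"])
```

Comparing sorted lists rather than sets is a deliberate choice: a duplicated event (double-counting) should fail the gate just as a missing one does.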
Pinterest only certifies metrics that need near-perfect accuracy, using cross-functional teams of iOS, Android, Web, and data engineers to maintain trust.
About This Article
Daily Active Users metrics can fail silently across platforms. Browser extensions auto-log users every day, activity around midnight gets double-counted on some platforms but not others, and non-qualifying activity inflates the numbers.
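The midnight double-counting problem comes down to how the day boundary and per-user deduplication are defined. A minimal sketch of a consistent definition, bucketing all platforms by UTC day and counting each user at most once per day (data shapes and names are hypothetical, not Pinterest's pipeline):

```python
from datetime import datetime, timezone

def daily_active_users(events):
    """Count each user at most once per UTC day, across all platforms.

    If each platform instead bucketed days in its own local timezone,
    one session straddling midnight could land in two different "days"
    -- the cross-platform double-counting described above.
    """
    seen = set()
    for user_id, ts, _platform in events:
        seen.add((user_id, ts.date()))  # one entry per user per UTC day
    per_day = {}
    for user_id, day in seen:
        per_day[day] = per_day.get(day, 0) + 1
    return per_day

# Hypothetical events: (user_id, event_time_utc, platform).
# u1's session straddles midnight across two platforms.
events = [
    ("u1", datetime(2024, 1, 1, 23, 59, tzinfo=timezone.utc), "ios"),
    ("u1", datetime(2024, 1, 2, 0, 1, tzinfo=timezone.utc), "web"),
    ("u2", datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc), "web"),
]
counts = daily_active_users(events)  # {Jan 1: 2, Jan 2: 1}
```

Note u1 is still counted on both Jan 1 and Jan 2 (the activity genuinely spans two days), but never twice within one day, regardless of platform.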
Ryan Cooke's team wrote product and technical specification documents that defined exact behavior. They built data checkers that alert when something looks wrong, such as users with active events but no pin views. UI tests now block releases if the checkers fail.
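A checker like the "active events but no pin views" example can be framed as a set difference between two daily event tables. A hedged sketch with hypothetical data shapes (in practice this would be a warehouse query over event logs, not in-memory sets):

```python
def find_unexplained_actives(active_user_ids, pin_view_user_ids):
    """Return users flagged as active who logged no qualifying pin views.

    A non-empty result suggests something -- e.g. a browser extension
    auto-logging users -- is generating activity without real engagement,
    and should trigger an alert for investigation.
    """
    return set(active_user_ids) - set(pin_view_user_ids)

# Hypothetical day of data: u3 shows activity but no pin views.
flagged = find_unexplained_actives(
    active_user_ids={"u1", "u2", "u3"},
    pin_view_user_ids={"u1", "u2"},
)
```

Here `flagged` would contain only `u3`, the user whose activity the metric definition can't explain.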
Every audited metric had definitional issues that this process caught. The team runs quarterly health checks to make sure the checkers still work and that new use cases don't break the certified metrics.