Diff Authoring Time (DAT)

Software development is the foundation of modern innovation, yet many companies still measure productivity with outdated metrics: lines of code, number of commits, or hours spent coding. None of these tells you anything useful. That's why Meta focuses on Diff Authoring Time (DAT), a metric that actually reflects how fast engineers can create, refine, and commit meaningful code changes.

A “diff” is just a code change: the smallest unit of software evolution. DAT tracks how long a change takes from initial writing through testing and debugging to commit. Unlike superficial metrics that can be gamed, DAT is data-driven: it pulls real-time information from integrated development environments (IDEs), version control systems, and even operating-system-level tracking to understand where developers get stuck.
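To make the definition concrete, here is a minimal sketch of how a DAT-style measurement could be derived from telemetry timestamps. The event names and schema are illustrative assumptions, not Meta's actual instrumentation:

```python
from datetime import datetime

# Hypothetical event log for one diff, stitched together from IDE and
# version-control telemetry. Event names are invented for illustration.
events = [
    ("first_keystroke", datetime(2025, 2, 10, 9, 0)),
    ("tests_run",       datetime(2025, 2, 10, 9, 25)),
    ("diff_committed",  datetime(2025, 2, 10, 9, 50)),
]

def diff_authoring_time(events):
    """DAT = minutes from the first edit to the commit of the diff."""
    times = dict(events)
    delta = times["diff_committed"] - times["first_keystroke"]
    return delta.total_seconds() / 60

print(diff_authoring_time(events))  # 50.0
```

The point of the sketch is that DAT spans the whole authoring loop (writing, testing, debugging), not just typing time, which is what makes it harder to game than commit counts.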

Why does this matter? Because speed matters. Bottlenecks kill momentum, and DAT helps identify friction in the system. The average DAT at Meta? 50 minutes for 87% of diffs. That's an insight into how its engineers operate at scale, ensuring the company builds and deploys efficiently.

Optimizing developer experience (DevEx) through DAT

Software engineers solve problems. Every minute spent wrestling with inefficient tools is a minute not spent innovating. Meta uses DAT to pinpoint these inefficiencies and make targeted improvements.

Take tool investments: if a process slows developers down, they test alternatives. A/B testing, widely used in product development, is applied internally to developer tools. One recent test looked at Meta's Hack programming language (an optimized dialect of PHP). Engineers found that running type checks earlier in the coding process reduced disruptions and cut DAT by 14%. That's a huge gain for something as simple as changing when an automated check runs.

The key takeaway? Small friction points, repeated thousands of times across an engineering team, add up to massive productivity losses. DAT helps cut through noise and focus on fixing what actually matters.

Evolving the definition of DAT without breaking it

Metrics are only useful if they remain consistent. Change them too much, and you lose comparability. Keep them static, and they become outdated. Meta is now on its fifth iteration of DAT, refining how it’s measured without disrupting its core definition.

Consistency is key. If different teams measure productivity differently, comparisons become meaningless. Meta’s next step is standardizing how DAT is measured across all engineering groups, making sure every diff is tracked the same way.

But here’s the challenge: the metric must evolve. Software development isn’t static; tools, processes, and best practices change. Moritz Beller, a software engineering researcher at Meta, sums it up well: “You don’t want the definition of your metric to change, but you also want the metric to evolve.” It’s about precision without rigidity, something all high-growth, engineering-driven companies should consider.

Why Meta doesn’t use DAT to measure individual developers

You might be wondering: why not use DAT to evaluate individual developers? Wouldn’t that boost performance?

Simple answer: it backfires.

Meta understands that individual tracking can damage morale and kill creativity. Software engineering isn’t just about speed; it’s about problem-solving. If developers feel like they’re being constantly watched, they start optimizing for the metric instead of the mission. That’s why DAT is only measured at an organizational level.

Sarita Mohanty, a Meta data scientist, acknowledges that DAT isn’t always 100% accurate at an individual level anyway. The real danger? Psychological safety. Meta has cut 23% of its workforce over the last three years, and another 5% reduction is coming. In an environment like that, tracking individual DAT could create unnecessary anxiety, reducing productivity instead of improving it.

“DAT is used as a top-line metric, a way to see broad trends and make sure the entire engineering organization is moving faster, not just a select few.”

BlueOptima vs. Meta

Meta isn’t the only company measuring developer performance, but they take a different approach than some of the more aggressive tracking tools out there.

Take BlueOptima, an enterprise benchmarking tool that aggregates development data, including coding hours and individual productivity metrics. Unlike Meta, BlueOptima does allow tracking at an individual level, but with limited managerial access. The idea? Team leads can use the data to improve workflows without penalizing developers.

Adam Minter, account director at BlueOptima, explains it like this: “You don’t hire bad developers. If something’s happening, you need to identify it and help them. We’re trying to give that power back to developers.”

It’s a different philosophy. Meta avoids individual tracking to protect team dynamics, while BlueOptima sees transparency as a way to empower developers and optimize teams. Which approach works best? It depends on company culture, but one thing is clear: bad developer metrics do more harm than good.

The future of developer productivity

Speed and efficiency matter. But engineering isn’t about metrics; it’s about people.

Developer productivity isn’t just about writing code quickly; it’s about solving problems creatively. That’s why metrics like DAT must be balanced with qualitative insights. If you only focus on numbers, you risk optimizing for speed at the cost of innovation.

Henri Verroken, a Meta data scientist, puts it best: “Developer delight is not just looking at data and saying ‘we know best what developers want.’ Listen to developers to make sure they have a delightful experience.”

This is the key takeaway: use data to inform decisions, not dictate them. The best engineering teams don’t just move fast; they move smart. And the best companies don’t track developers; they support them.

Final thoughts

Measuring software productivity isn’t about counting lines of code. It’s about removing friction, investing in the right tools, and optimizing for long-term efficiency.

DAT is a powerful metric, but like all metrics, it’s just one piece of the puzzle. The real goal? Enable developers to do their best work.

Move fast. Optimize wisely. Innovate relentlessly.

Key takeaways

  • Enhanced developer metrics: Meta’s Diff Authoring Time (DAT) provides a data-driven measure of the time from code creation to commit, helping leaders to pinpoint bottlenecks and improve overall engineering efficiency.

  • Targeted tool investments: DAT data supports A/B testing for tool optimizations, such as early type-check implementations that have shown up to a 14% improvement in cycle times, guiding smarter investments.

  • Evolving yet consistent metrics: Meta continuously refines DAT to adapt to changing development practices while maintaining consistency across teams. Decision-makers should favor adaptable metrics that remain comparable over time.

  • Fostering psychological safety: By avoiding individual performance tracking with DAT, Meta protects team morale and creativity. Leaders should focus on organization-wide metrics that support a collaborative, innovative culture.

Alexander Procter

February 10, 2025

6 Min