Why Developers Insist on Being Measured on Software Performance

Posted by Load Impact on Mar 6, 2018

How do you set your performance metrics for your personal performance reviews? What do other developers do? It turns out that two of developers' most-preferred metrics are directly tied to software performance.

Every organization, every team, is a little bit different. Your personal job performance metrics might be different from your colleague’s, or your friend’s at another company.

But one of our favorite annual surveys suggested something that reinforces why we think you're smart to emphasize performance testing (and load testing) as a developer. In the 2017 Stack Overflow Developer Survey, more than 25,000 users answered a question about their preferred job performance metrics. (We're all still anxiously awaiting the 2018 survey results; in the meantime, you can find the 2017 results here.)

The top four job performance metrics developers preferred were:

  1. Customer satisfaction (71.7%)
  2. On time/in budget (66.4%)
  3. Peers' rating (54.8%)
  4. Benchmarked product performance (41.4%)


Interesting. Things like "lines of code" and "commit frequency" were way at the bottom of the list, at 6% and 9%, respectively.

Since we focus on software performance around here, we were particularly intrigued by the first and the fourth job performance metrics. In both cases, software performance drives those numbers.

Customer satisfaction is largely dependent on performance, assuming the site is at least functional. Many surveys report that site speed and performance are the number-one driver of customer and visitor satisfaction. (This is well established; here's just one of the many references pointing out that "the main driver of satisfaction for streaming mobile sites and apps is actually performance.")

And, of course, as we strongly believe, developer-driven, benchmarked product performance is the key to happiness.

(As a side note, we could make a good argument that even the second job performance metric on the list, “on time/in budget,” has significant performance elements as well. When regular testing of all kinds, from unit to load testing, is performed as part of the development process, issues are caught earlier and nasty surprises are less likely to crop up.)

We were fascinated to see that so many developers feel as we do: that performance-related metrics are some of the most important things a developer can focus on. It's also important to note that the way the question is phrased in the Stack Overflow survey captures not necessarily how developers are measured, but how they feel their job performance should be measured.

As more developers are involved with the full stack and devops, we suggest that this survey indicates an important trend: software performance (measured in performance benchmarks and customer satisfaction) should be part of developers’ job performance measurements.

Happy testing!

Topics: Performance testing, Load Testing, developer experience, developer centric, software testing, software performance, Stack Overflow
