How The Way We Measure “Productivity” Makes Us Take Bad Bets

Encouraging news from Oxford as researcher Sarah Gilbert says she’s “80% confident” the COVID-19 vaccine her team has been testing will work and may be ready by the autumn.


As a software developer, the “80%” makes my heart sink. I know from bitter experience that 80% Done on solving a problem is about as meaningless a measure as you can get. The vaccine will either solve the problem – allowing us to get back to normal – or it won’t.

In software development, I apply similarly harsh logic. Teams may tell me that they’re “80% done” when they mean “We’ve built 80% of the features” or “We’ve written 80% of the code”. More generally: “We’re 80% through a plan”.

A plan is not necessarily a solution. Still, several promising vaccines are undergoing human trials as we speak. So while Gilbert's 80% Done may eventually turn out to be the 20% Not Done after more extensive real-world testing, there are enough possible solutions out there to give me hope that a vaccine will be forthcoming within a year.

Think of “80% done” as a 4/5 chance that it’ll work. There are several 4/5 chances – several rolls of the dice – which together give the world pretty good cumulative odds. Bill Gates’s plan to build multiple factories and start manufacturing multiple vaccine candidates before the winner’s been identified will no doubt speed things up. And there are more efforts going on around the world if those all fail.

Software, not so much. Typically, a software development effort is the only game in town – all the eggs in a single basket, if you like. And this has always struck me as irrational behaviour on the part of organisations. At best, the design of a solution is complete guesswork as to whether or not it will solve the customer’s problem. It’s a coin toss. But a lot of organisations plan to toss a single coin, and only once. Two coin tosses would give them 75% odds. Three would give them 87.5%. Four would give them 93.75%. And so on.
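Those coin-toss odds compound in a simple way – a solution fails only if every attempt fails. A minimal sketch (the 50/50 odds per attempt are the article's assumption, not a measured figure):

```python
# Odds that at least one of n independent attempts succeeds,
# assuming each attempt is a 50/50 coin toss.
def odds_of_success(attempts, p_success=0.5):
    return 1 - (1 - p_success) ** attempts

for n in range(1, 5):
    print(f"{n} attempt(s): {odds_of_success(n):.2%}")
# 1 attempt(s): 50.00%
# 2 attempt(s): 75.00%
# 3 attempt(s): 87.50%
# 4 attempt(s): 93.75%
```

The same function also covers attempts that are better (or worse) than a coin toss, by passing a different `p_success`.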

It’s more complex than that, of course. In real life, there are significant odds that we’re barking up completely the wrong tree. We can’t fix a fundamentally flawed solution by refining it. So iterating only helps when we’re in the ballpark to begin with.

Software solutions – to have the best odds of succeeding in the real world – need to start with populations of possible solutions, just as the COVID-19 effort starts with a population of potential vaccines. If there were only one team working on one vaccine, I’d be very, very worried right now.

Smart organisations – of which there are sadly very few, it would seem – start projects by inviting teams or individuals to propose solutions. The most promising of those are given a little bit of budget to develop further, so they can at least go through some preliminary testing with small groups of end users. These Minimum Viable Products are whittled down further to the most promising, and more budget is assigned to evolve them into potential products. Eventually, one of those products will win out, and the rest of the budget is assigned to that to take it to market (which could mean rolling it out as an internal system into the business, if that’s the kind of problem being solved.)

We know from decades of experience and some big studies that the bulk of the cost of software is downstream. For example, my team was given a budget of £200,000 to develop a job site in the late 90s. The advertising campaign for the site’s launch cost £4 million. The team of sales, marketing, and admin people who ran the site cost £2.5 million a year. The TCO of the software itself was about £2.8 million over 5 years.

Looking back, it seems naive in the extreme that the business went with the first and only idea for the design of the site that was presented to them, given the size of the bet they were about to place. (Even more naive that the design was presented as a database schema, with no use cases – but that’s a tale for another day.)

Had I been investing that kind of money, I would have spent maybe £10,000 each on the four most promising prototypes – assigning one pair of developers to each one. After 2 weeks, I would have jettisoned the two least promising – based on real end user testing – and folded the freed-up pairs into the two remaining teams, then given them £40,000 each for further development. After another 4 weeks, and more user testing, I would have selected the better of the two, merged the two teams into one, and invested the remaining £80,000 in four more weeks of development to turn it into a product.
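The staged funding above can be tallied as a quick sanity check. This sketch just encodes the figures from the text (four prototypes, two survivors, one winner) to confirm the stages add up to the original £200,000:

```python
# Staged funding plan: prototypes are culled at each gate
# and the budget flows to the survivors.
stages = [
    # (solutions in play, budget per solution in £)
    (4, 10_000),   # 2 weeks: four prototypes, one pair of devs each
    (2, 40_000),   # 4 weeks: two survivors, merged teams
    (1, 80_000),   # 4 weeks: the winner, productised
]
total = sum(count * budget for count, budget in stages)
print(f"Total committed: £{total:,}")  # Total committed: £200,000
```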

Four throws of the dice buys you a 93.75% chance of success. Continuous user feedback on the most promising solution raises the odds even further.

But what most managers hear when I say “Start with 8 developers working on 4 potential solutions” is WASTE. 75% of the effort in the first two weeks is “wasted”. 50% of the effort in the next 4 weeks is “wasted”. The total waste is only 35%, though – measured in weeks spent by developers on software that ultimately won’t get used.

Almost two-thirds of the time invested is devoted to the winning solution. That’s in exchange for much higher odds of success. If we forecast waste as time spent multiplied by odds of failure, then having all 8 developers work on a single possible solution – a toss of a coin – presents a risk of wasting 40 weeks of developer time (or half our budget).

Starting with 4 possible solutions uses the exact same amount of developer time and budget for a 93.75%+ chance of succeeding. The risk-weighted waste is actually – in real terms – only 6.25% of that total budget, even though we know that around a third of the software written won’t get used.
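The comparison between the two bets can be sketched directly. This assumes 8 developers over the ten weeks described (80 developer-weeks in total) and the coin-toss odds used throughout the article:

```python
# Risk-weighted waste = developer time spent x odds the whole effort fails.
# Assumes 8 developers over 10 weeks (80 dev-weeks) in both scenarios.
DEV_WEEKS = 8 * 10

def risk_weighted_waste(p_fail, dev_weeks=DEV_WEEKS):
    return p_fail * dev_weeks

single_bet = risk_weighted_waste(0.5)          # one solution: a coin toss
parallel_bets = risk_weighted_waste(0.5 ** 4)  # four solutions: all must fail
print(f"Single bet:    {single_bet:.0f} dev-weeks at risk")    # 40
print(f"Parallel bets: {parallel_bets:.0f} dev-weeks at risk") # 5
```

The same spend, but the risk-weighted exposure drops from half the budget to one-sixteenth of it.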

But that’s only if you measure waste in terms of problems not solved instead of software not delivered.

The same investment: £200,000. Starkly different odds of success. Far lower risk of wasting time and money.

And that’s just the money we spent on writing the software. Now think about my job site. Many millions more were spent on the business operation that was built on that software. Had we delivered the wrong solution – spoiler alert: we had – then that’s where the real waste would have been.

Focusing on solving problems makes us more informed gamblers.


Author: codemanship

Founder of Codemanship Ltd and code craft coach and trainer
