“Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist.” John Maynard Keynes
Most of you are not only familiar with the idea of economies of scale, you expect economies of scale. Much of our market economy operates on the assumption that when you buy or spend more you get more per unit of spending.
At some stage in our education – even if you never studied economics or operational research – you have assimilated the idea that if Henry Ford builds 1,000,000 identical black cars and sells 1,000,000 cars, then each car will cost less than if Henry Ford manufactures one car, sells one car, builds another very similar car, sells that car, and so continues. The net result is that Henry Ford produces cars more cheaply and sells cars more cheaply, so buyers benefit.
(Indeed the idea and history of mass production and economies of scale are intertwined. Today I’m not discussing mass production, I’m talking Economies of Scale.)
You expect that if you go to your local supermarket to buy milk, then buying one large carton – say 4 pints in one go – will be cheaper than buying 4 cartons each holding one pint of milk.
In my #NoProjects talk I use this slide, it always gets a laugh:
Yesterday I put this theory to a test in my local Sainsbury’s, here is the proof:
- 1 pint of milk costs 49p (marginal cost of one more pint 49p)
- 2 pints of milk cost 85p, or 42.5p per pint (marginal cost of one more pint 36p)
- 4 pints of milk cost £1, or 25p per pint (marginal cost of one more pint 7.5p)
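As a sanity check, the marginal-cost arithmetic above can be reproduced in a few lines of Python (a sketch; the prices are the Sainsbury's prices quoted above, in pence):

```python
# Milk prices as quoted above, in pence: pints -> total price
prices = {1: 49, 2: 85, 4: 100}

# Price per pint falls as the carton size grows
per_pint = {pints: total / pints for pints, total in prices.items()}
# per_pint == {1: 49.0, 2: 42.5, 4: 25.0}

# Marginal cost of each additional pint between carton sizes
marginal_1_to_2 = prices[2] - prices[1]        # 36p for the second pint
marginal_2_to_4 = (prices[4] - prices[2]) / 2  # 7.5p each for pints 3 and 4

print(per_pint, marginal_1_to_2, marginal_2_to_4)
```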
(And if you don’t know, the UK is a proudly bi-measurement country. Countries like Canada, The Netherlands and Switzerland teach their people to speak two languages. In the UK we teach our people to use two systems of measurement!)
So ingrained is this idea that when supermarkets don’t charge less for buying more, complaints are made (see The Guardian from a few months back.)
Buying milk from Sainsbury’s isn’t just about the milk: Sainsbury’s needs the store there, the store needs staffing, it needs products to sell, and they need to get me into the store. Those costs are the same whether I buy one pint or four. That’s why the marginal costs fall.
Economies of scale are often cited as the reason for corporate mergers: to extract concessions from suppliers, to manufacture more items for lower overall costs. Purchasing departments expect economies of scale.
But…. and this is a big BUT…. get ready….
Software development does not have economies of scale.
In all sorts of ways software development has diseconomies of scale.
If software were sold by the pint then a four pint carton of software would not just cost four times the price of a one pint carton, it would cost far, far more.
The diseconomies are all around us:
- Small teams frequently outperform large teams: five people working as a tight team will be far more productive per person than a team of 50, or even 15. (The Quattro Pro development team in the early 1990s is probably the best documented example of this.)
- The more lines of code a piece of software has the more difficult it is to add an enhancement or fix a bug. Putting a fix into a system with 1 million lines can easily be more than 10 times harder than fixing a system with 100,000 lines.
- Projects which set out to be BIG have far higher costs and lower productivity (per unit of deliverable) than small systems. (Capers Jones’ 2008 book contains some tables of productivity per function point which illustrate this. It is worth noting that the biggest systems are usually military and they have an atrocious productivity rate – an F35 or A400 anyone?)
- Waiting longer – and probably writing more code – before you ask for feedback or user validation causes more problems than asking for it sooner when the product is smaller.
The examples could go on.
And there is another factor: working in the large increases risk.
Suppose 100ml of milk is off. If the 100ml is in one small carton then you have lost 1 pint of milk. If the 100ml is in a 4 pint carton you have lost 4 pints.
Suppose your developers write one bug a year which slips through test and crashes the users’ machines. Suppose you know this, so in an effort to catch the bug you do more testing. To keep testing costs low you test more software at once, so you do a bigger release with more changes – economies of scale thinking. That actually makes the testing harder, but never mind. Suppose you do one release a year. That release blue-screens the machine. The user now sees that every release you do crashes his machine: 100% of your releases screw up.
If instead you release weekly, one release a year still crashes the machine but the user sees 51 releases a year which don’t. Less than 2% of your releases screw up.
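The arithmetic behind the two scenarios can be sketched in Python (assuming, as above, that exactly one crashing bug slips through per year regardless of release cadence):

```python
# Assumption from the text: one machine-crashing bug escapes per year,
# no matter how many releases that year is divided into.
BAD_RELEASES_PER_YEAR = 1

def failure_rate(releases_per_year: int) -> float:
    """Fraction of releases the user sees fail."""
    return BAD_RELEASES_PER_YEAR / releases_per_year

print(f"{failure_rate(1):.0%}")   # annual release: 100% of releases crash
print(f"{failure_rate(52):.1%}")  # weekly releases: 1.9% of releases crash
```

The same one bug per year goes from being every release to being less than 2% of releases, purely by shrinking the batch.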
Yes I’m talking about batch size. Software development works best in small batch sizes. (Don Reinertsen has some figures on batch size in The Principles of Product Development Flow which also support the diseconomies of scale argument.)
Ok, there are a few places where software development does exhibit economies of scale but on most occasions diseconomies of scale are the norm.
This happens because each time you add to a piece of software work the marginal cost per unit increases:
- Add a fourth team member to a team of three and the communication paths increase from 3 to 6.
- Add one feature to a release and you have one feature to test, add two features and you have 3 tests to run: two features to test plus the interaction between the two.
In part this is because human minds can only hold so much complexity. As the complexity increases (more changes, more code) our cognitive load increases, we slow down, we make mistakes, we take longer.
(Economies of scope and specialisation are also closely related to economies of scale, and again on the whole software development has diseconomies of scope (narrower scope is usually better) and diseconomies of specialisation (generalists are usually preferable to specialists).)
However, be careful: once the software is developed, economies of scale are rampant. The world switches. Software which has been built probably exhibits more economies of scale than any other product known to man. (In economic terms, the marginal cost of producing the first instance is extremely high, but the marginal cost of producing an identical copy is so close to zero as to be zero: Ctrl-C, Ctrl-V.)
What does this all mean?
Firstly, you need to rewire your brain: almost everyone in the advanced world has been brought up with economies of scale since school. You need to start thinking diseconomies of scale.
Second, whenever you are faced with a problem and feel the urge to go bigger, run in the opposite direction: go smaller.
Third, take each and every opportunity to go small.
Fourth, get good at working in the small: optimise your processes, tools and approaches to do lots of small things rather than a few big things.
Fifth, and this is the killer: know that most people don’t get this at all. In fact it’s worse…
In any existing organization, particularly a large corporation, the majority of people who make decisions are out-and-out economies of scale people. They expect that going big is cheaper than going small and they force this view on others – especially software technology people. (Hence large companies trying to be Agile remind me of middle-aged men buying sports cars.)
Many of these people got to where they are today because of economies of scale, and many of these companies exist because of economies of scale; they are good at doing what they do because they are good at economies of scale.
But in the world of software development this mindset is a recipe for failure and underperformance. The conflict between economies of scale thinking and diseconomies of scale working will create tension and conflict.
Have I convinced you?
Finally, I increasingly wonder where else diseconomies of scale rule? They can’t be unique to software development. In my more fanciful moments I wonder if diseconomies of scale are the norm in all knowledge work.
Even if they aren’t, as more and more work comes to resemble software development – because of the central role of individual knowledge and the use of software tools – then I would expect to see more and more examples of diseconomies of scale.
(PS, if you didn’t see my last post I’ve started a newsletter list, please subscribe.)
8 thoughts on “Software has diseconomies of scale – not economies of scale”
Pingback: Money Talks: A Tale of Two Change Programs – My Blog
“If software was sold by the pint then a four pint carton of software would not just cost four times the price of a one pint carton it would cost far far more”
A better analogy of mass production would be:
cp linux.iso linux2.iso – voila, two Linux operating systems for the price of one (which was free anyway, but that’s beside the point). Software is MORE scalable. It costs virtually nothing to make as many copies as you like.
What you’re really discussing is that building complex software systems is harder than building simple ones, but that’s also true in the physical world, as you point out with military aircraft. An extreme example: building a space station costs vastly more than building a whole university, even though the latter is vastly larger.
Even if you’re building software, if you have already built a working system, then adding features that make it 4 times larger might be cheaper, because the engineers are already familiar with the system and have already solved the difficult core features. E.g. adding drivers to an operating system can increase its size, but may be less difficult to achieve.
These things matter: 1) the quality of the requirements and analysis; 2) the quality of the management; 3) the quality of the developers and testers; 4) budget; 5) time; 6) a (usually agility-driven) process for continually verifying the above.
“Small teams frequently outperform large teams” – I believe that’s Brooks’s law, from The Mythical Man-Month. It’s mainly due to the rapidly increasing number of communication pathways. But some projects require large teams. Linux has many thousands of developers and other contributors. It works because of a fantastic manager, Linus Torvalds.
P.S. In the UK they moved away from pounds and ounces to kilograms. “Pints” seem to be sticking around, I think because a pint is a convenient hand-held quantity compared to a litre 🙂
Straight up, I have never ever denied the scalability of built software.
It is absolutely true that the cost of making an identical copy of an existing software product is virtually zero, and that built software is therefore probably the most scalable product humans have ever produced.
My entire argument concerns the building and enhancement of that thing.
Space stations are an interesting analogy, I wonder if they too have diseconomies of scale? – The seven Salyuts (well, 8 if you count Mir and 5 if you take away the 3 military Almaz stations) cost a fraction of the cost of the ISS. Similarly, flying multiple (disposable) Soyuz is a damn sight cheaper than “reusing” one space shuttle.
As for “projects” which “require large teams”, I can only point out that successful large teams inevitably grow from successful small teams. Our industry too frequently jumps to a big team without even trying a small team. Hence my recommendation to start with an MVT: a Minimally Viable Team.
There are some questions that came to mind: when you build small SW packages, these are often connected to multiple other small or large SW entities.
The more you develop these small packages, the more challenging it is to keep up a repository (i.e. documentation) of what they do and how they interact with each other. Yes, developing code in small chunks might be cost effective for a certain period of time, until you hit a critical mass where these numerous small parts form a complex ecosystem and connect with the existing ecosystem in more and more complex ways.
The biggest challenge is keeping this under control: making sure what is done is 1) cost effective, 2) documented, 3) supportable and 4) can be handled by operations, who many times are dumped deliverables which meet none of the first three criteria.
As any software system grows in size it becomes more expensive to continue working on (maintenance, enhancement, whatever.)
Therefore the engineers are in a constant battle to keep the characteristics of small. In all likelihood the engineers will lose the battle eventually. Commercial success usually demands growth – we could debate strategies which avoid that but it’s a big topic.
Although the engineers may eventually lose the battle, there is much they can do to maintain the characteristics of small:
- there are lots of modularisation techniques available which can help; converting the “monolith” to micro-services is popular right now
- improved repository/source code control
- automation in the build, release pipeline and test suites
- executable documentation
And there are more.
Keeping it small is a challenge for engineers, but if this was easy they would have done it already!
Pingback: Estimation, planning, teams and money, some data | Iwantings|Article, media, sports, TV, conversations &more
Pingback: The Lean Wisdom at the heart of Agile Software Development | Extreme Uncertainty