The “Combinatorial Explosion” of Software Testing – and Why You Need to Change Your Test Strategy
Companies that develop digital products and services need to test them. Testing and QA are unglamorous work, but they are critical to the usability and success of any product, and ultimately to your company’s brand. Testing also costs money, so it’s worth considering how your volume of testing will change as the years go by. Three key trends mean the amount of testing needed is increasing every year, and taken together they lead to an exponential increase. Ignore this, and either your product quality will suffer or your test costs will become unsustainable. The smart option is to fundamentally re-evaluate your test strategy.
But first, what are the three trends?
- The pace of technological change is ever increasing. This is such a cliché that we take it for granted, but its implications are profound. Technology isn’t changing linearly; it’s changing exponentially. Moore’s Law is perhaps the best-known example, and there are so many others that the futurist Ray Kurzweil coined the term “the Law of Accelerating Returns”. It means that new products and services are introduced, and become obsolete, at ever-increasing rates. Where 20 years ago you introduced a new product every five years, today you introduce one every year, and in 20 years’ time you will be introducing new products several times a year. At the same time, these products are becoming more complex, containing more software and more features, all of which need testing.
- Devices and services are ever more inter-connected. Consider an electronic device around your home, such as a TV. Forty years ago it had two interfaces: a power lead and an analogue tuner. Then came remote controls, and later SCART input. These days an inexpensive TV can have upwards of twenty interfaces, including several digital tuners, Wi-Fi, Ethernet, HDMI, USB, and more. This connectivity trend doesn’t just affect devices; online services face the same challenge. Whereas an early website might have been used with a couple of browser types, and its backend connectivity didn’t go much beyond a database, a modern website will support tens of browser variants on a huge variety of screen sizes and will integrate with third-party services for data exchange, advertising, analytics, user authentication, and so on. All of these interfaces need careful testing, one by one. Yet it isn’t enough to test that each interface (whether an HDMI port, a Wi-Fi link or a REST API) is technically conformant; you also need to test that it works correctly with a range of typical devices or services on the other side of that interface. Whether your service works properly on the most popular phone your customers use cannot be left to chance. In other words, interoperability testing is increasingly important and increasingly expensive.
- The need to release software updates more and more often. While not every company deploys software every few seconds, as Amazon reportedly does, there are very few domains where the pace of software releases isn’t relentlessly increasing. This means that, for a given product, testing has to be done more often and more quickly. You can’t ignore this pressure, or your product will rapidly fall behind its competition while your users complain about unfixed bugs on social media.
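To make the interoperability point concrete, here is a quick back-of-the-envelope count. The browser, screen and integration lists below are hypothetical stand-ins, not drawn from any real product; the point is that the test matrix grows multiplicatively with every new interface dimension.

```python
from itertools import product

# Hypothetical interoperability matrix for a web service.
# The specific entries are illustrative only.
browsers = ["Chrome", "Firefox", "Safari", "Edge"]        # 4 browser families
screens = ["phone", "tablet", "laptop", "desktop", "TV"]  # 5 screen classes
integrations = ["payments", "ads", "analytics", "auth"]   # 4 third-party services

# Exhaustive interoperability testing means one run per combination.
combinations = list(product(browsers, screens, integrations))
print(len(combinations))  # 4 * 5 * 4 = 80 runs, before adding OS versions
```

Add one more dimension, say five operating-system versions, and the matrix jumps to 400 runs; this multiplicative growth is exactly why test-selection techniques such as pairwise (all-pairs) testing exist.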
So, why is this a combinatorial explosion? Let’s say at year zero you spend $100,000 a year on testing. The following numbers, which are illustrative only, demonstrate the effect of combining the three increases.
| | Year 5 | Year 10 | Year 20 |
| --- | --- | --- | --- |
| Technical complexity (exponential: +10% each year) | × 1.6 | × 2.6 | × 6.7 |
| Interconnectivity (linear: +50% every 5 years) | × 1.5 | × 2 | × 3 |
| Release cadence (linear: +50% every 5 years) | × 1.5 | × 2 | × 3 |
| Combined multiplier | × 3.6 | × 10.4 | × 60.5 |
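A short sketch, using the illustrative growth rates from the table above and the $100,000 year-zero budget from the example, shows how the three multipliers compound:

```python
BASE_BUDGET = 100_000  # illustrative year-zero annual test spend ($)

def combined_multiplier(year: int) -> float:
    complexity = 1.10 ** year                 # exponential: +10% per year
    interconnectivity = 1 + 0.5 * (year / 5)  # linear: +50% every 5 years
    cadence = 1 + 0.5 * (year / 5)            # linear: +50% every 5 years
    return complexity * interconnectivity * cadence

for year in (5, 10, 20):
    print(f"Year {year}: x {combined_multiplier(year):.1f} "
          f"-> ${BASE_BUDGET * combined_multiplier(year):,.0f}")
# Year 5 is roughly x 3.6; Year 10 roughly x 10.4; Year 20 roughly x 60.5
```

Even if the real growth rates were half these illustrative figures, the compounding effect would still overwhelm a static test budget within a decade.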
These trends affect software design and development just as much, and numerous approaches have emerged to address them: Agile, continuous integration and delivery, and DevOps on the one hand, and increased use of software componentisation and third-party software on the other. Together these approaches have offset some of the development cost increases and made them more manageable. But far less attention has been given to improving approaches to testing, so testing costs are spiralling while software quality worsens.
From a test perspective, companies need to address the combinatorial explosion challenge in the following ways:
- Emphasise better test design and test reuse. Testing is often seen as the easy part of the development life-cycle, and much less attention is given to the selection and design of test strategies. Do you pay your QA lead and your lead architect the same amount? Does your software QA plan receive the same level of review and scrutiny as your product design? For a given team, there is often a glaring gap between the maturity of its software development practices and that of its testing practices. This isn’t magic: the problem is not that better test practices don’t exist, but that they’re not being used. Careful design of test harnesses, and ensuring that tests can be reused across different devices and product variants (via templating and parameterisation, among other means), can yield real benefits.
- Test automation. You can’t hope to keep releasing your software regularly, or testing it on an ever-wider range of devices, if you are still testing manually. Test automation is a must. Doing it well is not easy: done badly, it produces an expensively created set of automated tests that are too brittle to reuse and so fall into disuse. Selecting tools and techniques that are proven in the market is a good first step; don’t be tempted to build home-grown test automation tools in-house, as it’s not as simple as it might first appear.
- Use third-party test solutions. A company’s test challenges are rarely unique; there are nearly always other companies with the same test needs at the same time. You need to test your apps on fifty mobile phones? Why buy those phones when you can access a richer set of devices through one of the many companies offering mobile interoperability testing as a cloud service? You need a tool to rigorously check your compliance with a particular protocol? So do a hundred other companies, which is why someone already offers such a tool, and it’s unlikely to be more cost-effective to build your own. Testing tends to suffer from a do-it-yourself mentality, much as software development did 25 years ago.
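As a minimal sketch of the templating idea above: one test definition is expanded across a device/feature matrix, so adding a device means adding a list entry rather than writing new tests. The device names and the `check_stream` helper are hypothetical stand-ins; a real harness would drive actual hardware, for example via a framework like pytest and its `parametrize` feature.

```python
from itertools import product

DEVICES = ["tv_model_a", "tv_model_b", "settop_box_c"]
CODECS = ["h264", "h265"]

def check_stream(device: str, codec: str) -> bool:
    """Placeholder: a real check would exercise playback on the device."""
    return True

def generate_test_cases():
    # Template expansion: one logical test yields one concrete case per
    # combination, so coverage scales with the lists, not with hand-written code.
    return list(product(DEVICES, CODECS))

results = {case: check_stream(*case) for case in generate_test_cases()}
print(len(results), all(results.values()))  # 6 cases, all passing
```

The same expansion pattern is what makes tests reusable across product variants: the test logic is written once, and the matrix of devices and features is data.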
At Eurofins Digital Testing we address these challenges daily with our clients. Testing may not be glamorous, but we’re passionate about it: the art of testing and the science of testing. That is why we offer both testing expertise, through our consultancy and staffing services, and off-the-shelf test tools that address specific technical areas as well as general test automation. Get in touch for more information on our testing solutions.