How do you test the Internet of Things?
If we think traditional software is hard, just wait until all the ugly details of the physical world start to pollute our perfect digital platforms.
[embed]https://www.youtube.com/watch?v=7odT95kmf_w[/embed]
What is the IoT?
The Internet of Things (IoT) is a global network of digital devices that exchange data with each other and cloud systems. I'm not Wikipedia, and I'm not a history book, so I'll just skip past some things in this definitions section.
Where is the IoT?
It's everywhere, not just in high-tech houses. Internet providers handing out new cable modems that act as their own WiFi hotspots have created a new "backbone" for these devices to connect over, in almost every urban neighborhood now.
Enter the Mind of an IoT Tester
How far back should we go? How long do you have? I'll keep it short: the simpler the system, the less there is to test. Now ponder the staggering complexity of the low-cost Raspberry Pi. Multiply that by the number of humans on Earth who like to tinker, educated or not, throw in some APIs and ubiquitous internet access for fun, and now we have a landscape, a view of the magnitude of possibility that the IoT represents. It's a huge amount of worry for me personally.
Compositionality as a Design Constraint
Good designers will often put constraints in their own way purposely to act as a sort of scaffolding for their traversal of a problem space. Only three colors, no synthetic materials, exactly 12 kilos, can I use it without tutorials, fewer materials. Sometimes the unyielding makes you yield in places you wouldn't otherwise, flex muscles you normally don't, reach farther.
IoT puts compositionality right up in our faces, just like APIs, but with hardware and in ways that are both very physical and personal. It forces us to consider how things will be combined in the wild. For testing, this is the nightmare scenario.
Dr. Strangetest, or How I Learned to Stop Worrying and Accept the IoT
The only way out of this conundrum is in the design. You need to design things to very discrete specifications and target very focused scenarios. That moves the matter of quality up a bit into the space of orchestration testing, which by definition is scenario-based. Lots of little things are easy to prove working independently of each other, but once you do that, the next challenges lie in the realm of how you intend to use them. Therein lie both the known and unknown, the business cases and the business risks.
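To make that concrete, here's a toy sketch of a scenario-based test. Everything in it is hypothetical — `SmartLight` stands in for some real device (or a simulator of one), and the "evening" scenario stands in for a business case you actually care about. The point is that the test exercises a whole intended usage, not just individual functions:

```python
class SmartLight:
    """Simulated device: tracks power state and brightness (0-100)."""

    def __init__(self):
        self.on = False
        self.brightness = 0

    def power_on(self):
        self.on = True
        self.brightness = 100  # assume the device restores full brightness

    def power_off(self):
        self.on = False
        self.brightness = 0

    def dim(self, level):
        if not self.on:
            raise RuntimeError("cannot dim a light that is off")
        # clamp to the valid 0-100 range
        self.brightness = max(0, min(100, level))


def run_evening_scenario(light):
    """One business scenario: on at dusk, dimmed for a movie, off at night."""
    light.power_on()
    assert light.on and light.brightness == 100
    light.dim(30)
    assert light.brightness == 30
    light.power_off()
    assert not light.on and light.brightness == 0
    return "scenario passed"


if __name__ == "__main__":
    print(run_evening_scenario(SmartLight()))
```

Each step here is trivial to prove in isolation; the value of the scenario is in the sequence, because that sequence is where the business risk lives.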
If you code or build, find someone else to test it too
As a developer, I can always pick up a device I just flashed with my new code, try it out, and prove that it works. Sort of. It sounds quick, but rarely is. There's lots of plugging and unplugging, uploading, waiting, debugging, and fiddling with things to get them to just work. I get sick of it all; I just want things to work. And when they finally *do* work, I move on quickly.
If I'm the one building something to work a certain way, I have a sort of programming myopia, where all I ever want is for it to work. Confirmation bias.
What do experts say?
I'm re-reading Code Complete by Steve McConnell, written more than 20 years ago now, eons in the digital age. Section 22.1:
"Testing requires you to assume that you'll find errors in your code. If you assume you won't, you probably won't."
"You must hope to find errors in your code. Such hope might feel like an unnatural act, but you should hope that it's you who find the errors and not someone else."
True that, for code, for IoT devices, and for life.