
Some vendors look at analyst reports on API testing and all they see is dollar signs. Yes, API testing and virtualization have blown up over the past five years, and that's why the companies who were first to the game have the lead. Lead position comes from sweat and tears; that's how leaders catch analysts' attention in the first place. The companies that created the API testing industry, earned the community's and analysts' attention, and built the most comprehensive products are the ones that win. Every time.

There are snakes in the grass no matter what field you're in

I recently had the opportunity to socialize informally with a number of "competitors", and as people they're great people to eat tacos and burn airport wait time with. Unfortunately, their scrappy position in the market pushes them to do things you'd only expect from lawyers and loan sharks. They say they're about one thing in person, but their press releases and website copy betray their willingness to lie, cheat, and deceive actual people trying to get real things done.

In other words, some vendors proselytize about "API testing" without a solid product to back up their claims.

I don't like lying, and neither do you

One of my current job responsibilities is to make sure that the story my employer tells about its products accurately portrays what those products can do, because if it doesn't, real people (i.e. developers, testers, engineers, "implementers") will find out quickly and not only decline to become customers, but in the worst cases tell others that the story isn't true. Real people doing real things is my litmus test, not analysts, not some theoretical BS meter.

Speaking of BS meters, a somewhat recent report lumped API "testing" together with "virtualization" to produce a pie chart that distorts vendors' market share, both by combining two semi-related topics and by measuring share using revenue figures reported by the vendors themselves. When analysts ask for something like revenue in a particular field, they generally don't leave the answer solely up to the vendor; they do some basic research of their own to verify that the reported revenue accurately reflects the product(s) directly related to the subject of the report. After pondering this report for months, I'm not entirely sure the combination of the "testing" and "virtualization" markets is anything but a blatant buy-off by one or two of the vendors involved to fake dominance in two areas where they have none. Money, meet influence.

I can't prove it, but I can easily tell when someone has left a rotting fish in the back seat of my car simply by smelling it.

What this means for API testing

It means watch out for BS. Watch really closely. The way some companies use "API testing" (especially in Google Ads) has no grounding in their actual product capabilities. What they mean by "testing" is not what you know is necessary to ship great software. Every time I see those kinds of vendors say "we do API testing", an insult to actual API testing, I seriously worry that they're selling developers the illusion of having sufficient testing over their APIs when in reality it's not even close.

Why your API matters to me

On the off-chance that I actually use it, I want your API to have been tested more thoroughly than what a developer using a half-assed "testing" tool from a fledgling vendor can cover. I want you to write solid code, prove that it's solid, and present me with a solid solution to my problem. I also want you to have fun doing it.

The API vendor ecosystem is not what it seems from the outside. If you have questions, I have honesty; you can't say that about too many other players. Let's talk if you need an accurate read on an analyst report or a vendor statement.