Why responsive web developers must learn Selenium

Developers need fluency in the technologies their team uses for UI testing. Conversely, testing has become more a programmer’s job than a point-and-click task. As the world’s most widely adopted web UI testing framework, Selenium is a must-know technology for responsive web development teams that want to maintain the pace of continuous delivery.

Mostly Working, Somewhat Broken ‘Builds’

I use the term ‘build’ loosely here when it comes to web apps, since it’s often more like packaging (particularly with Docker containers), but the idea is the same. You have code in a repo that gets sucked in by your CI system, built/packaged, tested/validated, then deployed somewhere (a temp stack or whatnot).

If the build succeeds but tests fail, you need to quickly assess whether it’s the test’s fault or the new build’s. The only way to do that is to maintain proficiency in the technologies and practices used to validate the build.

Code in production is where money is made; broken code doesn’t make any money, it just makes people move on. Everyone is responsible for the result.

We have a QA person. That’s their job.

Not so fast. On continuous delivery teams, developers don’t have the luxury of leaving it to someone else. When build verification tests (BVTs) fail, a developer needs to track down the source of the problem, which typically means parsing the test failure logs, referring to the code and the context/data used for the test, then making some adjustment and re-running the tests.
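Most of that triage starts with the failure log. As a sketch of the idea, here’s a tiny Node helper that pulls failing tests out of a log; the `FAIL suite > test: message` format is invented for illustration, not any particular runner’s output.

```javascript
// Sketch: extract failing tests from a CI log so you can quickly judge
// whether the test or the new build is at fault.
function parseFailures(log) {
  const failures = [];
  for (const line of log.split('\n')) {
    // Assumed format: "FAIL <suite> > <test>: <message>"
    const m = line.match(/^FAIL\s+(.+?)\s*>\s*(.+?):\s*(.+)$/);
    if (m) failures.push({ suite: m[1], test: m[2], message: m[3] });
  }
  return failures;
}

const log = [
  'PASS checkout > applies coupon',
  'FAIL checkout > rejects expired card: expected 402, got 500',
].join('\n');
console.log(parseFailures(log));
```

Ten minutes with a helper like this beats scrolling a raw CI log every time a BVT goes red.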

Though your test engineer can fix the test, you still have to wait for a developer to fix problems in code. Why not make them the same person?

Of course I’m not suggesting that every developer write ALL their own UI/UX tests; there’s a division-of-labor and skills-match decision here that each team needs to make for itself. I also think that in larger teams it’s important to have separation of concerns between those who create and those who validate.

How can web developers find time for UI testing?

The answer is simple: do less. That’s right, prioritize work. Include a simple test or two in your estimation for a feature. Put another way, do more of the right things.

You may get push-back on this. Explain that you’re purposely improving your ability to turn things around faster and to be more transparent about how long something really takes. Product owners and QA will get it.

Now that you have a sliver of breathing room, learn how to write a basic Selenium script over what you just created. Now you have a deliverable that can be included in CI, act as the basis for additional validation like load testing and production monitoring, and you look very good doing it.

You can do it!

Obviously, you can go to the main site, but a quick way for those with Node already installed is through WebDriver:

[full tutorial here]
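If you want a taste before the tutorial, here’s a minimal sketch using the selenium-webdriver npm package. It assumes Chrome and a matching chromedriver are installed and on your PATH; the URL and selector are placeholders, not anything from the tutorial.

```javascript
// Minimal responsive smoke-test sketch (selenium-webdriver 4.x assumed).
const { Builder, By, until } = require('selenium-webdriver');

(async function smokeTest() {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get('https://example.com/'); // placeholder URL
    // Responsive check: shrink to a phone-sized viewport first
    await driver.manage().window().setRect({ width: 375, height: 667 });
    const heading = await driver.wait(
      until.elementLocated(By.css('h1')), // placeholder selector
      5000
    );
    console.log('Heading on mobile viewport:', await heading.getText());
  } finally {
    await driver.quit();
  }
})();
```

A script like this is exactly the kind of deliverable that can slot straight into a CI stage after the build step.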

For those who have never worked with Selenium before, start with this tutorial. Then go through the above.

Even if you aren’t a Node.js fan, it’s an easy way to grasp the concepts. If you want to go the Java or C# route, there are plenty of tutorials for those too.

More reading:

 

What Can You Do Without an Office?

Meme == fun! Make one yourself! Tag #SummerOfSelenium

I rarely go to the beach. When I do, I like to do what I like to do: surprise, tech stuff. We all relax in different ways, no judgement here. We also all need to work flexibly, so I’m also busy with a new co-working space for my locals.

In a back-and-forth with a colleague of mine, we started talking about the strange places where we find ourselves writing Selenium scripts. All sorts of weird things have been happening in mobile this summer, Pokemon Go (which I call pokenomics) for example, and Eran was busy creating a cross-platform test script for his upcoming webinar that tests their download and installation process.

At work, we’re doing this #SummerOfSelenium thing, and I thought that it would be cool to start a meme competition themed around summer-time automated testing from strange places. Fun times, or at least a distraction from the 7th circle of XPath hell that we’re still regularly subjected to via test frameworks.

If you want to build your own meme, use the following links…

940x450_phone-computer-on-beach
a.baa-With-computer-on-the-beach
businessman-surfboard-9452686

Reply to my tweet with your image and I’ll figure out a way to get our marketing team to send you some schwag. ;D

Mine was:

[meme image]

Then Eran responded with:

[meme image]

We think we’re funny at least.


Side-note: co-working spaces are really important. As higher-paid urban jobs overtake local employment in technical careers, we need to respond to the demand for work-life balance and encourage businesses to do the same. Co-working spaces create economic stickiness and foster creativity through social engagement. My thoughts + a local survey are here, in case you want to learn more. It’s a research area of mine and one I’ll be speaking on in the next year.

How do you test the Internet of Things?

If we think traditional software is hard, just wait until all the ugly details of the physical world start to pollute our perfect digital platforms.

What is the IoT?

The Internet of Things (IoT) is a global network of digital devices that exchange data with each other and cloud systems. I’m not Wikipedia, and I’m not a history book, so I’ll just skip past some things in this definitions section.

Where is the IoT?

It’s everywhere, not just in high-tech houses. Internet providers handing out new cable modems that act as their own WiFi hotspots have created a new “backbone” for these devices to connect over in almost every urban neighborhood.

Enter the Mind of an IoT Tester

How far back should we go? How long do you have? I’ll keep it short: the simpler the system, the less there is to test. Now ponder the staggering complexity of the low-cost Raspberry Pi. Multiply that by the number of humans on Earth who like to tinker, educated or not, throw in some APIs and ubiquitous internet access for fun, and now we have a landscape, a view of the magnitude of possibility that the IoT represents. It’s a huge amount of worry for me personally.

Compositionality as a Design Constraint

Good designers will often put constraints in their own way purposely to act as a sort of scaffolding for their traversal of a problem space. Only three colors, no synthetic materials, exactly 12 kilos, can I use it without tutorials, fewer materials. Sometimes the unyielding makes you yield in places you wouldn’t otherwise, flex muscles you normally don’t, reach farther.

IoT puts compositionality right up in our faces, just like APIs, but with hardware and in ways that are both very physical and personal. It forces us to consider how things will be combined in the wild. For testing, this is the nightmare scenario.

Dr. Strangetest, or How I Learned to Stop Worrying and Accept the IoT

The only way out of this conundrum is in the design. You need to design things to very discrete specifications and target very focused scenarios. It moves the matter of quality up a bit into the space of orchestration testing, which by definition is scenario based. Lots of little things are easy to prove working independently of each other, but once you do that, the next challenges lie in the realm of how you intend to use it. Therein lies both the known and unknown, the business cases and the business risks.
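To make “scenario based” concrete, here’s a sketch (in Node, with invented dimensions and values) of enumerating the combinations an orchestration test would have to cover:

```javascript
// Sketch: enumerate the scenario matrix for an IoT orchestration test.
// Each combination is one end-to-end scenario to validate.
function scenarios(dimensions) {
  // Cartesian product of the named dimensions
  return Object.entries(dimensions).reduce(
    (acc, [name, values]) =>
      acc.flatMap(combo => values.map(v => ({ ...combo, [name]: v }))),
    [{}]
  );
}

const matrix = scenarios({
  device: ['rpi3', 'rpi4'],
  firmware: ['1.0', '1.1'],
  network: ['wifi', 'ethernet'],
});
console.log(matrix.length); // 8 scenarios
```

Even this toy matrix explodes quickly, which is exactly why you design to discrete specifications and then test only the focused scenarios the business actually cares about.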

If you code or build, find someone else to test it too

As a developer, I can always pick up a device I just flashed with my new code, try it out, and prove that it works. Sort of. It sounds quick, but rarely is. There’s lots of plugging and unplugging, uploading, waiting, debugging, and fiddling with things to get them to just work. I get sick of it all; I just want things to work. And when they finally *do* work, I move on quickly.

If I’m the one building something to work a certain way, I have a sort of programming myopia: I only ever want it to work. Confirmation bias.

What do experts say?

I’m re-reading Code Complete by Steve McConnell, written more than 20 years ago now, eons in the digital age. Section 22.1:

“Testing requires you to assume that you’ll find errors in your code. If you assume you won’t, you probably won’t.”

“You must hope to find errors in your code. Such hope might feel like an unnatural act, but you should hope that it’s you who find the errors and not someone else.”

True that, for code, for IoT devices, and for life.

[Talk] API Strategy: The Next Generation

I took the mic at APIStrat Austin 2015 last week.

A few weeks back, Kin Lane (sup) emailed and asked if I could fill in a spot, talk about something that was not all corporate slides. After being declined two weeks before that and practically interrogating Mark Boyd when he graciously called me to tell me that my talk wasn’t accepted, I was like “haal no!” (in my head) as I wrote back “haal yes” because duh.

I don’t really know if it was apparent during, but I didn’t practice. Last year at APIStrat Chicago, I practiced my 15 minute talk for about three weeks before. At APIdays Mediterranea in May I used a fallback notebook and someone tweeted that using notes is bullshit. Touché, though some of us keep our instincts in check with self-deprecation and self-doubt. Point taken: don’t open your mouth unless you know something deep enough where you absolutely must share it.

I don’t use notes anymore. I live what I talk about. I talk about what I live. APIs.

I live with two crazy people and a superhuman. It’s kind of weird. My children are young and creative, and my wife and I do whatever we can to feed them. So when some asshole single developer tries to tell me that they know more about how to build something amazing with their bare hands, I’m like “psh, please, do you have kids?” (again, in my head).

Children are literally the only way our race carries on. You want to tell me how to carry on about APIs, let me see how much brain-power for API design nuance you have left after a toddler carries on in your left ear for over an hour.

My life is basically APIs + Kids + Philanthropy + Sleep.

That’s where my talk at APIstrat came from. Me. For those who don’t follow, imagine that you’ve committed to a long-term project for how to make everyone’s life a little easier by contributing good people to the world, people with hearts and minds at least slightly better than your own. Hi.

It was a testing and monitoring track, so for people coming to see bullet lists of the latest ways to ignore important characteristics and system behaviors that only come from working closely with a distributed system, it may have been disappointing. But based on the number of conversations afterwards, I don’t think that’s what happened for most of the audience. My message was:

Metrics <= implementation <= design <= team <= people

If you don’t get people right, you’re doomed to deal with overly complicated metrics from dysfunctional systems born of hasty design by scattered teams of ineffective people.

My one piece of advice: consider that each person you work with when designing things was also once a child, and like you, has developed their own form of learning. Learn from them, and they will learn from you.

 

Don’t Insult Technical Professionals

Some vendors look at analyst reports on API testing and all they see is dollar signs. Yes, API testing and virtualization have blown up over the past 5 years, and that’s why some companies who were first to the game have the lead. Lead position comes from sweat and tears; that’s how leaders catch the analysts’ attention in the first place. Those who created the API testing industry, gained the community’s and analysts’ attention, and have the most comprehensive products win. Every time.

There are snakes in the grass no matter what field you’re in

I recently had the opportunity to informally socialize with a number of “competitors” who, as people, are great to eat tacos and burn airport wait time with. Unfortunately, their scrappy position in the market pushes them to do things you’d only expect from lawyers and loan sharks. They say they’re about one thing in person, but their press releases and website copy betray their willingness to lie, cheat, and deceive actual people trying to get real things done.

In other words, some vendors proselytize about “API testing” without solid product to back up their claims.

I don’t like lying, and neither do you

One of my current job responsibilities is to make sure that the story my employer tells around its products accurately portrays the capabilities of those products. If it doesn’t, real people (i.e. developers, testers, engineers, “implementers”) will find out quickly and not only won’t become customers, but in the worst cases will tell others that the story is not true. Real people doing real things is my litmus test, not analysts, not some theoretical BS meter.

Speaking of BS meters, a somewhat recent report lumped API “testing” together with “virtualization” to produce a pie chart that disproportionately compares vendors’ market share, both by combining these two semi-related topics and by measuring share by revenue as reported by the vendors. When analysts ask for things like revenue in a particular field, they generally don’t leave the answer solely up to the vendor; they do some basic research of their own to prove that the reported revenue is an accurate reflection of the product(s) directly relating to the nature of the report. After pondering this report for months, I’m not entirely sure the combination of the “testing” and “virtualization” markets is anything but a blatant buy-off by one or two of the vendors involved to fake dominance in areas where there is none. Money, meet influence.

I can’t prove it, but I can easily prove when you’ve left a rotting fish in the back seat of my car simply by smelling it.

What this means for API testing

It means watch out for BS. Watch really closely. The way some companies use “API testing” (especially in Google Ads) is unfounded in their actual product capabilities. What they mean by “testing” is not what you know as necessary to ship great software. Every time I see those kinds of vendors say “we do API testing”, which is an insult to actual API testing, I seriously worry that they’re selling developers the illusion of having sufficient testing over their APIs when in reality it’s not even close.

Why your API matters to me

On the off-chance that I actually use it, I want your API to have been tested more than what a developer using a half-ass “testing” tool from a fledgling vendor can cover. I want you to write solid code, prove that it’s solid, and present me with a solid solution to my problem. I also want you to have fun doing that.

The API vendor ecosystem is not what it seems from the outside. If you have questions, I have honesty. You can’t say that about too many other players. Let’s talk if you need an accurate read on an analyst report or vendor statement.