The “Click Trepidation” Test of Software Quality

“Banner” by Stefan Parnarov for the Noun Project.

When I was 17, I entered a programming competition for high school students. The exact nature of the brief has gotten a bit fuzzy (I think it was something to do with creating a tool to manage and track student attendance - you know, the sexy sort of thing school programming competitions generally seem to cover), but the thing I remember most happened well after we’d shipped off our entries.

As part of the judging process, the top-ranked submissions were required to meet with the judges to give a demo. During mine, I can remember proudly firing up my Visual Basic app, stepping through the feature set, and stumbling through an explanation of how I’d intuitively used a relational database (without ever actually knowing or using the term “relational database”).

After placing second overall, I was at the awards ceremony chatting to one of the judges — a “developer evangelist” at Microsoft. (A job title that no one would blink an eye at now, but which sounded very exotic and impressive at the time.)

I wanted to know what had impressed them: was it my tasteful choice of colour palette (blue), or my nascent understanding of relational databases?

He replied that, actually, the thing that stood out for him during my demo was simply the confidence with which I could navigate around my app and enter, manipulate and retrieve data. He continued:

“A lot of programs, even in a professional setting, don’t have that… people are afraid of what’s going to happen when they click a button in their own applications.”

(Heavily paraphrased competition judge.)

Considering it now, I think this sort of “click trepidation” test is actually a really telling indicator of software design and quality. If you’re showing off your app to a bunch of people for the first time, you often know in your gut whether it’s any good or not. Having that confident feeling as you’re putting software through its paces indicates:

  • it’s been designed well; you know instinctively what the results of your interactions with the UI will be;
  • it’s been well tested; you’re not worried about any embarrassing error screens (or at the least, you’re expecting errors to be handled gracefully);
  • you care about the software you’ve built.

Of course, this isn’t a be-all and end-all mark of software quality (no throwing out unit/coverage/integration tests just yet, sorry). However, looking back at the various software demonstrations I’ve given since that programming competition 11 years ago, the more successful among them would definitely have scored better on the Click Trepidation™ test.