For years coding tests have been an accepted part of technical recruiting. They seem so obvious - we need people who can write code, so we'll make them write some code.

I've done a few take-home coding tests as an applicant, created several and evaluated more than I can remember. They suck for everyone involved, but happily there is a better way.

A few reasons code tests suck

It's difficult to find a coding task that is...

  • big enough to demonstrate some skill
  • not so big it's unreasonable to ask people to do it
  • representative of the work you do
  • not already well known and ready for copy+paste from CodePen

From the applicant side...

  • they take heaps of time, particularly as you want to show your best work
  • they rarely have opportunities to ask questions to clarify scope
  • they are very specific and rigid, so you often can't showcase your strengths
  • every company you're talking to has given you their test, giving you days of work on top of your existing job and personal commitments

From the hiring side...

  • you don't know if you're really seeing the applicant's own work
  • you know recruiters are coaching applicants on your test
  • tests are time-consuming to set and review
  • you know you're asking people to do work for free
  • you can lose great applicants because they don't have time to do your test

The breaking point

A few years ago Clark Pan and I were looking at these issues while creating a new frontend coding test at Ansarada. We had a great mini project with a realistic design brief and a dummy API. It had scope for a range of solutions so we could give it to all of our interface developer applicants, from junior to senior.

But we still had a lot of concerns.

We needed a scalable way to anonymise tests, review them in a code review tool, collate the feedback and then link it back to the HR system. The easier it was to involve more reviewers, the harder it was to protect the applicant's identity. It was taking a lot of time, and we were hiring specifically because we were so busy!

We were also increasingly uncomfortable with setting people up for many hours of unpaid work. The project could be done in a solid evening (we had done it ourselves), but it could easily blow out to a whole weekend. Either way we knew this was unfair for many of our applicants - particularly those with young families. We also knew we'd lost some great candidates, because they had received an offer before they'd had time to finish all the tests they'd been given.

We did consider online coding test platforms that offer anonymisation, plagiarism detection and so on. They lessen some concerns, but you still have to spend time processing the tests, and you are still making people do unpaid work just to come in for an interview.

The breakthrough

We went back to the basics of what we were trying to get out of the test. We wanted to evaluate...

  • their experience level (junior, mid, senior)
  • how they think and solve problems
  • how they communicate

...and the way we were doing this was getting the applicant to talk us through their code. Some of the best insights came from asking why they'd chosen a certain framework or methodology; how they'd change it to add a new feature; or the questions they'd take back to stakeholders. None of that is evident in the code itself, unless the applicant has spent significant time writing comments.

The epiphany was: we didn't need the code to have the conversation. We only needed to set the scenario, not actually get it built, to be able to discuss it. Thus the non-coding code test was born.

The non-coding code test

(or, "no coding code test" if you prefer)

As the name implies, the non-coding code test (NCCT) is a coding test without the code. We would give applicants the scenario and dummy API, but tell them not to actually write any code. We had to make the "don't write code" aspect really clear, even reassuring some candidates that we weren't joking! Typically we'd supply the scenario the day before the interview, although a couple of applicants did it with as little as fifteen minutes to read through and prepare.
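
To make the hand-out concrete, here's a minimal sketch (in TypeScript) of the sort of thing a dummy API brief might contain. Everything here is hypothetical, invented for illustration rather than taken from Ansarada's actual test; the point is simply that applicants get something specific enough to reason about, without being asked to build it.

  // Hypothetical NCCT hand-out: endpoint and field names are invented for illustration.
  // The brief might say "GET /api/documents returns a paged list shaped like this."
  interface DocumentSummary {
    id: string;
    title: string;
    updatedAt: string; // ISO 8601 timestamp
    status: 'draft' | 'published';
  }

  interface DocumentListResponse {
    items: DocumentSummary[];
    total: number;
    page: number;
  }

  // Applicants talk through how they'd build a UI over this API
  // (components, state, loading and error handling) rather than implementing it.
  async function fetchDocuments(page = 1): Promise<DocumentListResponse> {
    const res = await fetch(`/api/documents?page=${page}`);
    if (!res.ok) throw new Error(`Request failed: ${res.status}`);
    return res.json() as Promise<DocumentListResponse>;
  }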

During the tech interview we would get people to talk us through how they would build it.

It worked really well. We got to see things like...

  • how the applicant broke down a problem
  • how they clarified scope and requirements
  • the choices they made and how they justified them
  • what areas they were focused on
  • what challenges they anticipated
  • whether they were comfortable challenging the status quo

It's a conversation. By the end we would have an idea of someone's focus, passion, strengths and weaknesses.

We didn't waste any time with the applicant having to explain where they'd rushed and where they'd run out of time. We didn't make anyone spend a weekend building throw-away code; and the solution to the task could never be copy+pasted from someone else.

What we got were top-of-mind responses to a realistic scenario, complete with all the human elements that go along with coding.

What if...?

A common question is: what if you hire someone who can talk a good game but can't actually code?

The first thing to note is that a take-home coding test can't stop this either; if anything, it's easier to fake. The really hard-nosed answer is "that's what probation is for". There's no ethical concern about firing someone who's been totally dishonest.

We also had screening in place to catch people who simply didn't have the right skills for the role. Both our in-house and contracted recruiters properly understood the tech. If you don't have this, you'll need to do some kind of screening test before going to a full tech interview.

Essentially we chose not to optimise for something we considered an edge case. Everyone puts their best spin on themselves in an interview anyway; part of the process is working out how you rank the applicant's skill.

Plus we were frequently talking to people who were switching from another framework or stack anyway, so upskilling on our specific stack was an open part of the discussion already. Many things are easier during recruiting if you expect to support people's learning and development after they join.

Bias considerations

The NCCT does remove a rare chance for someone to have their code evaluated anonymously. However, a take-home test requires the applicant to have significant spare time available to complete it. You need to evaluate your own organisation's specific processes to determine whether either option works for your applicants.

Isn't this just a whiteboard test?

The NCCT is similar to whiteboard tests, and we often had people sketch something. You could categorise it as an option somewhere between the take-home test and on-the-spot whiteboard test.

But in practice most whiteboard tests seem to focus on absolutely on-the-spot reactions rather than a prepared response. Also, due to time constraints, the task has to be either very small or very high level (or the test takes all day). Ultimately whiteboard tests just feel a bit weird and combative.

Anecdotally the NCCT seems to reduce the chances of an interview stress brain freeze, and it allows discussion of a more realistic scenario for a range of developer roles. Instead of firing questions at someone's back as they scribble with shaking hands, you sit down and have a conversation.

Plus, when we started using the NCCT we had a lot of feedback that it was unusual. People didn't think of it as a whiteboard test; and they certainly appreciated the fact we hadn't asked them to spend a weekend writing code. Even if it is nothing more than a way to frame a whiteboard test, it seems positive.

Is it right for you?

The NCCT worked really well and I still prefer it over more traditional tests. But as with anything else, you should consider whether you have the problems the NCCT tries to solve.

If your system is working perfectly, you probably don't need to mess with it! If you still want to see actual code or need total anonymity, a well-tuned hosted test may be the sweet spot for you.

But if you are spending a heap of time slogging through applicant code, or missing out on great people who don't have time for yet another code test, maybe you should consider adding a non-coding code test to your recruiting toolkit.