Interviews as no-code zones

During the last three years, as Head of Engineering at Placemeter, I built a product and an engineering team from scratch, and one of the toughest challenges I wrestled with was hiring. I had little experience with it initially. How do you get good candidates to even apply at an early-stage startup? How do you decide who to interview? I thought at least I knew how to conduct interviews, since I had spent five years at Google where I averaged about one interview per week.

The specific format of the Google software engineer interview during that time (2007 – 2012) was impressed on me quite strongly: The interviewer states a problem (generally some form of algorithmic challenge), asks the candidate to describe possible solutions verbally and then to write out a solution in code on a whiteboard or on paper. One interview had to fit into 45 minutes, so these problems couldn’t be all that involved.

That last part of writing code in an interview is colloquially known as whiteboarding. I like to use the term live coding since it may happen in a code editor or specialized tool rather than on a literal whiteboard.

Because Google was obviously very successful with it and it was part of my routine for so long, I never questioned this interview format. When I found myself in charge of the interview process at a startup, when I actually had to stop and think about the qualities I want to evaluate in a software engineer, I soon realized the limitations of this format, particularly of the live coding aspect.

My main objection to live coding is that it’s a highly artificial activity which does not resemble anything a software engineer does day-to-day. In the real world, code is written on a keyboard with the support of an IDE and StackOverflow. In the real world, code is not written under time pressure, especially not while also being watched over. In the real world, code is not written in small isolated chunks; it almost always has to integrate with an existing large system. Live coding is so unnatural, I think it very weakly predicts how well someone is going to perform at a tech startup.

I’m not claiming this as an original insight. Google has reportedly stopped relying so much on whiteboard coding, with more focus on just problem solving. Other people have raised issues with live coding, for example Laurie Voss in this excellent post on interviewing.

Programming is easy

Apart from the artifice of live coding, there is a deeper problem with placing coding at the core of interviewing: Programming is easy.

Many people bristle at this idea, so let me explain. Here is a non-exhaustive list of things that I think are harder than programming but are often just as important when creating software:

  • Collaborating in a team. Working in a team requires nuanced communication on an ever-growing list of channels (meetings, email, Slack, git, issue trackers and so on). Being part of a team means making compromises, and that may be hard to swallow.
  • Product design and UI design. An engineer is rarely just a voiceless executing arm of a product manager, especially not in a startup. There has to be a feedback loop, with engineers for example reporting early when a feature is too hard to implement or a UI is too complex or underspecified. Developing a sense for product and UI design (or API design, in backend systems) doesn’t come easy to many.
  • Reading code. In any non-trivial codebase you are going to read a lot more code than you’re going to write. The corollary is that every bit of code is read many more times than it is written, so knowing how to organize and write code for readability is also important (and hard).
  • Debugging. The natural state of software is buggy. Reproducing bugs locally is hard, so is mastering debugging and profiling tools. It gets even trickier with production systems in the mix where you have to interpret live data, logging and monitoring output.

I can illustrate this well with my experience as an intern host. I’ve personally mentored about a dozen interns over the years. They were generally Computer Science undergrads, they were very bright and motivated, and they were already good programmers. I believe that interns get the most out of an internship if you put them to work on real features that are going to be launched to real users (no toy projects or internal-only tools). When I say they were good programmers, I mean that I would give them specs for such a feature, they would spend a couple of weeks or so implementing it, and then they would call me over to their computer to proudly demo what they presumed was the complete feature.

None of them initially realized that they may have completed most of the coding — which they were already capable of without much help from me — but they were far from done. Once they merged their code they would discover that it conflicted with somebody else’s ongoing work, and/or it broke an integration test, and/or when running against production data their code couldn’t handle some users’ messy data, and/or QA revealed their UI component can get into an inconsistent state, and/or spurious exceptions started happening when their code was deployed to production. Sorting out these issues would take at least as much time as they initially spent “programming”.
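The “messy data” failure mode in particular is easy to illustrate. Here is a hypothetical sketch in Python (the field names and records are invented for illustration): a parser that works perfectly on clean demo records but breaks on real users’ data until it is hardened:

```python
def parse_age_naive(record):
    # Works on the clean demo data the feature was developed against.
    return int(record["age"])

def parse_age_hardened(record):
    # What production data actually requires: missing keys,
    # empty strings, and stray whitespace all occur in the wild.
    raw = record.get("age")
    if raw is None:
        return None
    try:
        return int(str(raw).strip())
    except ValueError:
        return None

records = [{"age": "42"}, {"age": " 7 "}, {"age": ""}, {}]
print([parse_age_hardened(r) for r in records])  # [42, 7, None, None]
# parse_age_naive(records[2]) would raise ValueError,
# and parse_age_naive(records[3]) would raise KeyError.
```

The naive version is exactly what a local demo rewards; the hardened version is what the weeks between demo and launch produce.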

It’s a long way from a local demo of a feature to getting it into the hands of real users. Most of the actual mentoring I ever did happened during this stage of the software lifecycle, between the first local demo and the public launch. Not by accident this is precisely the stage into which most of the hard activities from above fall.

I see the advancement of a junior engineer to a senior engineer mainly as a shift of focus away from programming to those other, harder activities. Organizing a team of engineers for high productivity is about finding ways to emphasize those concerns in the overall development process.

In the no-code zone

If the correlation between live coding skills and actual engineering skills is weak and testing for coding will give you a one-dimensional picture of a candidate anyway — how should you interview engineers? Here are some techniques I found useful.

Take-home coding challenge. Obviously it’s necessary to evaluate coding skills somehow, but the test should be designed to minimize the drawbacks of live coding and to recreate realistic conditions as much as possible. I like to give candidates a coding challenge that a) they do at home in their own preferred development environment, b) has no deadline, and c) builds on some existing code. I think 2 to 4 hours is appropriate, so it takes some effort to design a challenge that is not too trivial but can be completed in that timeframe.
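To sketch what “builds on some existing code” can look like (everything below is invented for illustration, not an actual challenge I used): the repo ships a small working module plus a stub the candidate fills in, so they have to read and integrate rather than start from a blank file:

```python
from collections import Counter

# Shipped with the challenge repo, already working:
def load_events(lines):
    """Parse 'user_id,action' lines into (user_id, action) tuples."""
    return [tuple(line.strip().split(",")) for line in lines if line.strip()]

# The candidate implements this, building on load_events.
# (Shown here solved; in the real repo only the docstring would exist.)
def top_actions(lines, n=3):
    """Return the n most common actions, most frequent first."""
    events = load_events(lines)
    counts = Counter(action for _user, action in events)
    return [action for action, _ in counts.most_common(n)]

sample = ["1,click", "2,click", "1,view", "3,click", "2,view"]
print(top_actions(sample, n=2))  # ['click', 'view']
```

Even a stub this small forces the candidate to read existing code, respect its data format, and fit their work into it.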

Take-home challenges have become pretty common in tech startups and can be controversial because of the time commitment required. I have never had a candidate refuse a challenge because I frame it like this: We think working in your own environment without a deadline is the best way for you to demonstrate your coding skills and we will not make you write code at any other point during the hiring process.

Various services have sprung up for administering take-home challenges, none of which I’ve tried. A very simple model worked for me: I share a new private GitHub repo with every candidate, containing instructions and the code they have to build on.

Design discussions. Whiteboards are better for drawing than for coding. I find a good way to ground a design discussion is to base it on a real system that a candidate would work on if hired. For example, I would sketch part of a system on the whiteboard and ask the candidate to design some extension to it. During the discussion I try to move from high-level architecture to very specific low-level details, including possibly UI or API design. If the conversation stalls I ask about tradeoffs and compromises. Under what circumstances would you use C++ as opposed to Python? How do your design decisions change if you are constrained by memory or by bandwidth?
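To make the memory-constraint question concrete (a hypothetical example of the kind of tradeoff such a discussion might surface, not a prescribed answer): processing a large input eagerly versus as a stream, and when each design is appropriate:

```python
def total_eager(path):
    # Simple, but holds every line in memory at once --
    # fine for small inputs, fatal for a multi-gigabyte log.
    with open(path) as f:
        lines = f.readlines()
    return sum(int(line) for line in lines)

def total_streaming(path):
    # Constant memory: one line at a time. The tradeoff is
    # giving up random access and easy multiple passes.
    total = 0
    with open(path) as f:
        for line in f:
            total += int(line)
    return total
```

A candidate who can articulate when the eager version is actually the better choice (small data, need for multiple passes, simpler code) is showing exactly the judgment these discussions are meant to probe.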

Code reading. I have experimented with showing candidates a piece of code and asking concrete questions about what it does and how they could improve on it. Almost everybody I’ve tried this on has struggled with it, underscoring my point that reading code is really hard. I still think this can lead to revealing discussions.
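For what it’s worth, here is the shape such an exercise can take (this snippet is invented for illustration, not one I actually used): working code with concrete questions attached, each of which has a real answer:

```python
def dedupe(items):
    # Candidate questions: What does this return for []?
    # Why is `seen` a set rather than a list? Does the result
    # preserve input order? How would you rename or restructure
    # this for readability?
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

print(dedupe([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

The point is not trick code; it is whether the candidate can trace ordinary code accurately and talk about improving it.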

* * *

Writing code during interviews is appealing because it produces something we can supposedly judge objectively. I think a carefully designed take-home challenge can satisfy this desire for a code artifact more effectively. Even then, code only says so much about how a candidate is going to fare at software engineering, which includes many activities other than programming. This leaves you with mostly design discussions during in-person interviews, the results of which are unfortunately more nuanced, more subjective, and harder to interpret.

Nobody says hiring is easy. Evaluating a person’s potential in a complex job like software engineering, after spending just a few hours with them, is inherently hard. It’s better to acknowledge that than to act as if there are fixed rules that always lead to the right hiring decision. You should be constantly questioning if your process is working.

I’ve rarely met an interviewee who likes live coding, so explicitly renouncing live coding may work to your advantage as a startup competing for scarce talent. I know I will push for that at my next job.

One thought on “Interviews as no-code zones”

  1. I’m just about to start a hiring process for data scientists. My idea for the coding aspect was to pose a ‘find the mistake’ exercise, which would be a test of code reading more than a test of code writing, but would still require some knowledge of the language involved. In particular my concern is to ensure that any syntax I was checking would be contained in the question, so that it wasn’t a test of memory: if the exercise was to check the ability to write and apply a for loop (it won’t be), there would be a for loop that either needed a small change to work without error, or that ran successfully without doing the thing required by the challenge. Obviously for a data scientist, rather than a software engineer, coding is less core compared to data analysis, which poses a separate challenge.
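[To make the commenter’s idea concrete, my own illustration of the second variant they describe: code that runs without error but doesn’t do what the exercise asked for.]

```python
# Exercise: "return the sum of the squares of the numbers".
def sum_of_squares(numbers):
    total = 0
    for n in numbers:
        total += n  # runs fine, but sums the numbers, not their squares
    return total

print(sum_of_squares([1, 2, 3]))  # prints 6, but 14 was required
# The fix the candidate should spot: total += n * n
```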
