How do you assess technical competency in an interview?
Take-home tests are useless and put off your best candidates; try these methods instead.
I am a big detractor of take-home tests. I think they test available free time, not actual ability. They always take longer than expected and are easily cheated by using more time, getting help from a friend or, now, AI.
Instead, I test competency in three ways:
Thorough questioning - asking questions, but going deep to test understanding rather than just surface knowledge
Case study - going through a real, recently solved project to see how they think
Pairing interview - pair with them to solve a real problem
Thorough questioning can be used for most hires. The case study better suits data science and product hires. Pairing better suits engineers. Data engineers can go either way.
Let's go through them.
Thorough questioning
I have two main go-tos here: first, asking about a previous project, and second, asking about a technology they have used that I'd like to know more about.
The goal is to unearth the nuanced decisions that only a hands-on contributor would know, and to understand how they make decisions.
What project are you most proud of and why?
Ask, “What project are you most proud of and why?” Then (very) importantly, dig into the details. Inquire about their data and decision-making process: Why MongoDB? Why a random forest? What features did you choose, and why? Did you consider alternatives? Which features were most impactful, and how do you know? What would you do next to enhance the results?
There’s no set script here; it’s about exploring something they should know intimately. If candidates can’t answer these questions, it suggests they either mindlessly followed a process or are taking credit for work they weren’t very involved in.
Not every decision requires exhaustive analysis (most don't), but thoughtful responses like, “I chose a random forest because it performs well on our data with minimal tuning” or “Doing that would add complexity but only affect 4% of users, which wasn’t worth the effort” indicate genuine engagement with the project.
If someone struggles to explain a project they claim to be proud of, that’s a significant red flag.
I have been asked how to use a scorecard for this question. I may share one in future, but briefly, the kinds of things you might score are: communication, understanding of what they did, evidence of good reasoning, evidence of product thinking, and tying their decisions to user value.
Ask deeply about technologies they claim expertise in
If you met the candidate at a conference, what questions would you ask them to learn from their experience? Treat candidates as experts in the technologies and frameworks they list on their CV.
For example:
You've used MongoDB a lot? When have you found it to be a better choice than SQL? Some of our older infrastructure is on Mongo, and we've had to recreate a lot of relational logic in the application code (see the sketch below).
Have you faced query optimisation issues with GraphQL? We love the flexibility but have heard users can end up running suboptimal queries that can be really tough to fix.
You say you have deployed LLM-based features? How did you ensure safety and avoid hallucinations? How did you evaluate how they were performing?
These questions work best when they're genuine – when the candidate should be better versed in the subject than you, but you know enough to ask good questions. You can quickly gauge someone's real-world experience by how well they can teach you the nuances of using it.
They might not always know the solution to your specific problem, but if they have real experience, it should lead to an insightful conversation. It’s a red flag when a candidate has only surface knowledge about the things they have highlighted in their CV.
As an aside for candidates – don't put things on your CV that you don't want to talk about in an interview!
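On that Mongo point above, here is a minimal sketch of what "recreating relational logic in the application code" tends to look like – the collection and field names are hypothetical, not a real schema:

```python
# Hypothetical collections: "orders" references "customers" by customer_id.
# In SQL this would be a single join:
#   SELECT o.id, c.email FROM orders o JOIN customers c ON c.id = o.customer_id;
# With a document store, the join often ends up in application code instead.
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["shop"]

orders = list(db.orders.find({}, {"customer_id": 1}))
customer_ids = {o["customer_id"] for o in orders}
customers = {
    c["_id"]: c
    for c in db.customers.find({"_id": {"$in": list(customer_ids)}})
}

# Pair each order with its customer's email, skipping orphaned references.
order_emails = [
    (o["_id"], customers[o["customer_id"]]["email"])
    for o in orders
    if o["customer_id"] in customers
]
```

A candidate with real Mongo experience will recognise this pattern straight away and have opinions on when it is and isn't worth it.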
Case study
Pick a project you have recently solved, then present the goal, data and other relevant information, and work with them to solve it. Try to remember what you uncovered along the way and present only the information you knew at the start of the project, not the end. You can reveal more as the candidate asks relevant questions or as you move the interview along.
A recent project you worked on is key, as you will be able to deal with questions or approaches you weren't expecting. Forget creating artificial exercises with deliberate traps; they are hard to engineer and don’t reflect the messiness of real projects.
As you start, remember that the candidate is not a domain expert in your company. So you need to provide context, and may need to offer prompts or jump in if they have the wrong idea about something.
I like to use Miro or a similar tool to capture ideas and write down any extra information I give them as they navigate the problem. It also helps the session feel more collaborative.
Through the exercise, you’ll see how they think and approach problems. Are they asking the right questions? Can they modify their approach when they get new information? You should be looking for the qualities your team values.
Everyone will likely get stuck at some point and need some hints or extra information. But if you have to hand-hold them through the whole exercise, it's not a good sign.
You should have an idea of the types of things you expect a candidate to discover through questioning, what good solutions to the project look like, and how much prompting and help from you should be needed. This can all form part of the scorecard.
Pairing interview
Choose a task from your backlog, or one from the past that you can turn into a regular exercise if you interview often. Pair with the candidate to get it done.
Choosing a real but previously completed task allows you to use the same exercise for all candidates and reduce the noise in your process – though it can be time-consuming to set up and is a less genuine experience. You will have to choose what's most important to you.
Ideally, they will be driving, but you may help out more if they aren't familiar with your stack. Here you will get an idea of their problem-solving and how they work in a real scenario.
Importantly, unlike with a take-home test, the candidate gets the same information from you: do they want to work here? It's mutually beneficial and naturally respects their time, as you put in the same hours as them.
Bonus for platform hires
Provide a diagram of your current infrastructure, deployment process, etc. It can be simplified, but you want to leave in the bits that aren't great or need work. Ask the candidate to review it, ask questions, and give their thoughts on improvements.
Wrap up
So please, try to avoid take-home tests. They don't test what you think they do.
An exception may be made for graduates, but in that case, I keep it to a simple, short, low bar of "can they actually code?". Generally, I would do these on a time-controlled platform so you can see them develop an answer, and copy-pasting or available free time isn't an issue.
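To be concrete about that bar, here is the sort of exercise I have in mind – the task itself is a hypothetical example, not one I actually use – something a graduate can write in a few minutes while you watch the answer develop:

```python
# Hypothetical screening task: return the n most frequent words in a text,
# most frequent first. Small enough for a few minutes on a timed platform,
# but enough to show whether someone can actually code.
def top_words(text: str, n: int = 3) -> list[str]:
    counts: dict[str, int] = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)[:n]

assert top_words("the cat sat on the mat the end", 2) == ["the", "cat"]
```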
Try these out and let me know your feedback. If you are looking for more questions you can ask in an interview, check out my recent update to the Data Leaders Handbook.