It's difficult to accurately judge the depth of a candidate's practical technical experience in the context of an interview process.
You're in a software development organization that, for whatever reason, needs to hire someone with a specific set of technical skills required to perform the job.
- Interviewers may not have the technical skills required for the job.
- Interviewers often ask esoteric or obscure questions that make their assessments more subjective than objective.
- Even if a candidate has a certain skill, they may get a question whose exact answer they cannot recall on the spot.
- The cumulative time it takes to interview candidates may be too costly for the organization given deadline constraints.
Create a written or computer-based test that asks a series of comprehensive technical questions intended to gauge the candidate's technical abilities. Before the test, clearly communicate that you are interested in exact answers to all questions, not guesses. Also communicate that if the exact answer to a question cannot be recalled, the candidate is free -- in fact, encouraged -- to juxtapose that question with a related question of equal value and answer that instead.
It's good to put an unrelated example on the front page for demonstration purposes (e.g., a question about how you home-brew beer, together with a juxtaposed question and its answer). Give the candidate 30-45 minutes, then approach them and ask how things are going, and allow a final 15-30 minutes to complete the work.
You've got real evidence as to how well a candidate measures up to the skills required for a job. If the candidate doesn't give up and adequately meets the requirements of the job, additional interviews can follow depending on scheduling constraints.
This approach illustrates how well the candidate listens to directions, whether they have the technical skills required for the job, how well they operate under pressure, and how creative they can be on their feet.
I'm not convinced this is the best idea ... it seems like this approach will test how good the candidate is at reading and writing an exam. If he's given a suitable exam, we might find out something about what he knows. We won't find out how well he interacts with humans, and we won't find out whether he can do any particular job (other than writing exams).
The quality of the result is some kind of dot product of the quality of the exam (and how good is the average manager at writing exams) times the ability of the candidate to take exams while wearing his good clothes. --RonJeffries
In addition to the specific language and platform skills required to do the job, are there a set of questions that every decent programmer should know the answer to? Or is the job of programmer too broad for any such list to be of value? Like the book CulturalLiteracy, which attempts to list everything a literate American should know, can such a list be created, say ProgrammerLiteracy, that lists stuff that every programmer should know? -- GlennWilson
Technical skills are important, but are of limited value without human skills like communicating, cooperating, encouraging, helping, teaching, learning. How can you test for these?
Short answer: You don't.
Long answer: Note that the "Resulting Context" mentioned "If the candidate ... adequately meets the requirements of the job, additional interviews can follow...". Seems to me that the test is intended as a filter to weed out those who do not qualify for the job (in terms of technical skills, assuming the test is well designed). With the technically qualified candidates short-listed, you can spend more time with each in the following interviews to find out the rest of what you want to know.
Anecdote: Several years ago I applied for a job for which I was not qualified (they wanted an 8051 guy, and I only had 8080/Z80). After letting me down gently, and explaining that they had chosen another candidate who was a ringer for the job, the interviewing manager said, "Well, as long as you've taken the time to come down, let me show you around the place."
We walked through the fabrication floor, the design and programming areas, and the testing lab. As we strolled, I asked questions at every turn, trying to get a better handle on what they did and what they needed. At one point, he mentioned a problem they were having with one of their systems. I asked a series of troubleshooting questions and proposed a couple of approaches. We concluded and I left.
Two weeks later, I got a call: they wanted to make me an offer for a job that hadn't yet been listed. It seems that the roving conversation showed him I could grasp their business and apply useful techniques toward the solutions they needed.
Ever since then, when I interview a candidate, I "walk the beat" with them, involving them in what we're doing, listening for their questions and comments, and using that as a gauge to supplement any other selection process used. Been real pleased with the results.