
Tech Sector Job Interviews Assess Anxiety, Not Software Skills

A person writes on a whiteboard. Photo credit: Christina Morillo.

For Immediate Release

Chris Parnin

A new study from North Carolina State University and Microsoft finds that the technical interviews currently used in hiring for many software engineering positions test whether a job candidate has performance anxiety rather than whether the candidate is competent at coding. The interviews may also be used to exclude groups or favor specific job candidates.

“Technical interviews are feared and hated in the industry, and it turns out that these interview techniques may also be hurting the industry’s ability to find and hire skilled software engineers,” says Chris Parnin, an assistant professor of computer science at NC State and co-author of a paper on the work. “Our study suggests that a lot of well-qualified job candidates are being eliminated because they’re not used to working on a whiteboard in front of an audience.”

Technical interviews in the software engineering sector generally take the form of giving a job candidate a problem to solve, then requiring the candidate to write out a solution in code on a whiteboard – explaining each step of the process to an interviewer.
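
As a purely illustrative sketch (the release does not name the problems used in the study, so the question and the Python below are hypothetical, not taken from the paper), a whiteboard question is often a short algorithmic puzzle along these lines: given a list of numbers and a target, find two entries that add up to the target. A candidate might be expected to write and explain something like:

    # Hypothetical whiteboard exercise, not one of the problems from the study.
    def two_sum(numbers, target):
        """Return indices of two entries that sum to target, or None."""
        seen = {}  # maps each value seen so far to its index
        for i, value in enumerate(numbers):
            complement = target - value
            if complement in seen:  # have we already seen the matching value?
                return seen[complement], i
            seen[value] = i
        return None

    print(two_sum([2, 7, 11, 15], 9))  # prints (0, 1)

In the setting the study describes, the candidate would write code like this by hand on the whiteboard while explaining each decision (why a dictionary, why a single pass suffices) to the interviewer in real time.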

Previous research found that many developers in the software engineering community felt the technical interview process was deeply flawed. So the researchers decided to run a study aimed at assessing the effect of the interview process on aspiring software engineers.

For this study, researchers conducted technical interviews of 48 computer science undergraduates and graduate students. Half of the study participants were given a conventional technical interview, with an interviewer looking on. The other half of the participants were asked to solve their problem on a whiteboard in a private room. The private interviews did not require study participants to explain their solutions aloud, and had no interviewers looking over their shoulders.

Researchers measured each study participant’s interview performance by assessing the accuracy and efficiency of each solution. In other words, they wanted to know whether the code each participant wrote would work, and how much computing it would take to run.
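
To make that distinction concrete (the release does not describe the study’s actual scoring rubric, so this simply continues the hypothetical example above), the brute-force variant below returns the same answers as the one-pass version, so both would count as accurate, but it makes roughly n*n/2 comparisons instead of n, so it would score worse on efficiency for long inputs:

    # Hypothetical brute-force alternative to the two_sum sketch above:
    # equally accurate, but it checks every pair of entries rather than
    # making a single pass, so it needs far more computation on long lists.
    def two_sum_brute_force(numbers, target):
        for i in range(len(numbers)):
            for j in range(i + 1, len(numbers)):
                if numbers[i] + numbers[j] == target:
                    return i, j
        return None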

“People who took the traditional interview performed half as well as people that were able to interview in private,” Parnin says. “In short, the findings suggest that companies are missing out on really good programmers because those programmers aren’t good at writing on a whiteboard and explaining their work out loud while coding.”

The researchers note that the current format of technical interviews may also be used to exclude certain job candidates.

“For example, interviewers may give easier problems to candidates they prefer,” Parnin says. “But the format may also serve as a barrier to entire classes of candidates. For example, in our study, all of the women who took the public interview failed, while all of the women who took the private interview passed. Our study was limited, and a larger sample size would be needed to draw firm conclusions, but the idea that the very design of the interview process may effectively exclude an entire class of job candidates is troubling.”

What’s more, the specific nature of the technical interview process means that many job candidates spend weeks or months training specifically for the technical interview, rather than for the actual job they’d be doing.

“The technical interview process gives people with industry connections an advantage,” says Mahnaz Behroozi, first author of the study and a Ph.D. student at NC State. “But it gives a particularly large advantage to people who can afford to take the time to focus solely on preparing for an interview process that has very little to do with the nature of the work itself.

“And the problems this study highlights are in addition to a suite of other problems associated with the hiring process in the tech sector, which we presented at ICSE-SES [the International Conference on Software Engineering, Software Engineering In Society],” adds Behroozi. “If the tech sector can address all of these challenges in a meaningful way, it will make significant progress in becoming more fair and inclusive. More to the point, the sector will be drawing from a larger and more diverse talent pool, which would contribute to better work.”

The study on technical interviews, “Does Stress Impact Technical Interview Performance?,” will be presented at the ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering, being held virtually from Nov. 8–13. The study was co-authored by Shivani Shirolkar, a Ph.D. student at NC State who worked on the project while an undergraduate; and by Titus Barik, a researcher at Microsoft and former Ph.D. student at NC State.

-shipman-

Note to Editors: The study abstract follows.

“Does Stress Impact Technical Interview Performance?”

Authors: Mahnaz Behroozi, Shivani Shirolkar and Chris Parnin, North Carolina State University; and Titus Barik, Microsoft

Presented: ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering, Nov. 8-13

Abstract: Software engineering candidates commonly participate in whiteboard technical interviews as part of a hiring assessment. During these sessions, candidates write code while thinking aloud as they work towards a solution, under the watchful eye of an interviewer. While technical interviews should allow for an unbiased and inclusive assessment of problem-solving ability, surprisingly, another possibility is that technical interviews are instead a procedure for identifying candidates who best handle and mitigate stress solely caused by being examined by an interviewer (performance anxiety). To understand if coding interviews—as administered today—can induce stress that significantly hinders performance, we conducted a randomized controlled trial with 48 Computer Science students, comparing them in private and public whiteboard settings. We found that performance is reduced by more than half, by simply being watched by an interviewer. We also observed that stress and cognitive load were significantly higher in a traditional technical interview when compared with our private interview. Consequently, interviewers may be filtering out qualified candidates by confounding assessment of problem-solving ability with unnecessary stress. We propose interview modifications to make problem-solving assessment more equitable and inclusive, such as through private focus sessions and retrospective think-aloud, allowing companies to hire from a larger and more diverse pool of talent.

Responses

  1. After further review of the research paper, it seems that the “public setting” interview is flawed. It does not seem to accurately model interviews at the top tech companies. Even though they used a question that has also been used at top tech companies, the way in which they responded to candidates was deliberately “brief.” They provided no further details about how they conducted these interviews. They provided no evidence that this is in fact how the top tech companies conduct interviews. There is nothing in this paper to indicate that they surveyed top tech companies to determine how those companies actually conduct their interviews. They have done some previous research on opinions from GlassDoor.com, but how accurate are those opinions? Are people as likely to compliment as they are to complain? I have interviewed at Google. The interviewers were not only responsive to my questions, but they also encouraged me to find better solutions and to consider exception conditions. There were no “brief” responses. Also, the interviewers were the same people who designed the research. They could have let their own biases about these kinds of interviews interfere with how they responded to the candidates. The interviewer should have been a neutral party.

  2. Looks like this research is highly biased and was made specifically for hype.

    I have conducted hundreds of interviews myself, on both sides of the table, as interviewer and as candidate. I can’t say that stress is the main obstacle, or that tasks are easier to solve alone, without hints or clarifications from the interviewer. There are specific tasks, like the ones from LeetCode or HackerRank, that could be solved this way, but at least half of an interview is about how people approach tasks and how they reason, not how they solve purely algorithmic problems.

    Another strange point is about the women. ALL of them failed the public interview and ALL passed when solving tasks solo. Really? ALL?
    This raises the question of what kind of tasks there were and how “success” was measured.

    To me it seems there is too much bias in this research. However, I’d like to have a look at the tasks that were solved, and at the solutions, to draw my own conclusions.

    ===
    And as mentioned by Joe, a programming job is not about working alone; it’s about working with people. Right now 50% of my time is spent communicating and aligning with others, not writing code. Nobody will give you well-defined algorithmic problems in a real job.

  3. There are so many flaws in the claims of this article that I wouldn’t even know where to start.

    The problem isn’t whether or not an interviewer is observing the interviewee; it’s in a dozen other key contextual factors in framing and executing the “technical interview.” That’s in quotes because a realistic, simulated work-sample test has been shown to be the most accurate predictor of on-the-job performance. Not just in tech, but in ANY position.

    The two approaches mentioned are sad, lazy, and completely useless, but at least they serve the pre-determined narrative.

    The conjecture by Mr. Parnin is also a riot. All the study “suggests” is that these 48 CS students should have been better prepared for collaborative problem-solving by the institutions they’re paying 5-6 figures to, institutions that are NOT setting them up for a successful career in Software Engineering.

  4. I’m not going to call out specific people in the community; I’ll let them speak up if they want to. On a very popular engineering forum this sentiment has been echoed by many, if not most, high-level FAANG company employees and hundreds of others. It’s ridiculous that we are shutting out people who want to solve problems, and more importantly CAN, by creating these artificial barriers.

  5. As an entrepreneur, I was very excited about the results described in this article. I’d really like better ways to interview and hire the best candidates regardless of their demographic. Unfortunately, I was disappointed once I read the actual research paper.

    The first paragraph of this article ends with, “The interviews may also be used to exclude groups or favor specific job candidates.” This is pure speculation; there is no evidence in the study to support this statement. Information regarding race and ethnicity was not collected.

    “Our study suggests that a lot of well-qualified job candidates are being eliminated because they’re not used to working on a whiteboard in front of an audience.” This depends on your definition of well-qualified. Software is built by teams–not individuals. Whiteboard collaboration has proven to be very beneficial for software teams. Software is also built in high-stress environments with deadlines. Social skills and coping skills are important for all jobs. Software is not an exception.

    The study unfortunately did not use a within-subjects design, so there are many questions that we can’t answer. For example, was poor performance in private correlated with high performance in public? This could be true for extroverts. Also, to what degree was poor performance in public related to high performance in private? It’s possible that private interviewing made no improvement for people with high performance in public. This is the assumption of current popular interviews, and this paper has provided no evidence to confirm or reject it.

    “…all of the women who took the public interview failed, while all of the women who took the private interview passed. Our study was limited, and a larger sample size would be needed to draw firm conclusions…” I am very much in favor of hiring more qualified women into the tech industry, but this statement is irresponsible. It is irresponsible to suggest that public interviews are discriminating against women while at the same time saying that you didn’t have enough data to support this claim. You are scientific researchers in computer science. The social sciences and the general public are listening. This study was mentioned in this week’s ACM TechNews. Think carefully about what you say before you say it, otherwise you run the risk of being misunderstood.

  6. How might we make interviewing an equitable process for people with various mental health issues and from disparate socioeconomic backgrounds? The current hiring practices are demeaning to those who don’t fit the mold of charisma and privileged academic upbringing.