For context, interviewing.io is a platform where people can practice technical interviewing anonymously and, in the process, find jobs: do well in practice, and you get guaranteed technical interviews at companies like Uber, Twitch, Lyft, and more.

Over the course of our existence, we’ve amassed performance data from thousands of real and practice interviews.

Data from these interviews sets us up nicely to look at what signals from an interviewee’s background might matter when it comes to performance.

Interview questions on the platform tend to fall into the category of what you’d encounter at a phone screen for a back-end software engineering role, and interviewers typically come from a mix of large companies like Google, Facebook, and Uber, as well as engineering-focused startups like Asana, Mattermark, KeepSafe, and more.

Why did schooling matter in this iteration of the data but didn't matter when I was looking at resumes? I expect the answer lies in the disparity between performance in an isolated technical phone screen and what happens when a candidate actually goes on site.

With the right preparation, the technical phone interview is manageable, and top schools often have rigorous algorithms classes and a culture of preparing for technical phone screens.

Having been a founder, on the other hand, didn't matter at all when it came to technical interview performance.