We’ve heard that artificial intelligence can make many things better while reducing the amount of manual effort required. But have you ever considered that AI can help in hiring as well? Not just to make the process more efficient, but also to remove biases. Pymetrics, a company founded in 2013 by neuroscientist Frida Polli, is doing exactly that. And at a time when the pandemic and social unrest in the US are showing that our world will be changing sooner rather than later, it was opportune for us to talk to Ms Polli on the sidelines of the Collision From Home event.
Note: this interview has been edited for brevity.
1) You have had an unusual journey, so before we talk about Pymetrics itself, we would love to know: how did you transition from academia to founding a company?
She replied, “I was an academic to begin with. I was a cognitive neuroscientist for 10 years at Harvard and MIT, and I became interested in building recruiting software. After that, I went to business school to get an MBA and saw recruiting first-hand for two years, because that’s what MBA students do. That’s how I transitioned from being an academic neuroscientist to being an entrepreneur: through the discovery of a problem. That is, how do we select the right people for jobs? Once I saw this problem in business school, I realized that there was a software company that could be built out of it.”
2) While there are so many hiring solutions available, be it backed by humans or machines, how does Pymetrics stand out?
Ms Polli said, “There are a couple of ways that we stand out. The first is that we measure people differently. Historical ways of measuring people, unfortunately, aren’t particularly good, and they lead to a lot of discrimination. What I mean by that is that personality tests and traditional cognitive tests measure people in ways that were based on systems designed in the early 1900s. So you can imagine that they’re not very up to date, specifically from a diversity perspective. For example, those old personality tests might say that emotionally stable people are always better employees than people who are not emotionally stable. Well, it turns out that women on average tend to be less emotionally stable than men, as defined by this kind of personality test. Hence, it’s not very gender-friendly: if you are using a personality test based on that, you’re going to be excluding women. A lot of these tools weren’t created with a modern sensibility in mind, and when they’re still used today for hiring, they create a lot of challenges to gender equality, racial equality, socio-economic equality, and so on.”
“One of the main things that we did differently was to measure people in a new way. That was my background as a cognitive scientist: realizing that we had developed all these new ways to measure people, and that those ways are far more precise and accurate, but also much more diversity-friendly. So that’s one of the core components of the platform that makes us quite different from everything else that’s out there,” she added.
3) While there’s no denying that AI is useful, there’s also a lot of research that suggests that algorithms aren’t free of bias either, so how do you ensure that your dataset is devoid of any such biases?
She explained, “We create all our algorithms with a dual optimization process in mind, meaning we optimize for both the performance of the algorithms and fairness. There are legal definitions of what constitutes fair hiring in most countries, whereas a lot of older technologies in the space really didn’t take fairness into consideration at all. If you think about it, 10, 20, 30, 40 years ago when they were developed, people were not as concerned about gender, racial, and socioeconomic equality.”
She added, “Having said that, we can only know for sure that our algorithms aren’t biased with respect to the things that we check for. To give you a silly example, I don’t know whether our algorithms are biased when it comes to hair color. It might be that blond people have a better outcome than brown-haired people. That might be a bias that I’m introducing without being aware of it. However, I’m not too concerned about this aspect because hair color is irrelevant. What we care a lot about are biases that relate to gender, ethnicity, age, disability status, and socioeconomic status. Those are the main categories of bias that we think are important to mitigate, because those are the ones we see in the workforce.”
“For ethnicity and gender, we audit our algorithms in real time to make sure that they do not show bias. For age and disability status, it’s not a real-time monitoring system, but we do have a periodic check to make sure that bias hasn’t been introduced with those two things. My point is that you can’t check for every single attribute, because there are just millions of possibilities. But we do test for the things that are important both from a fairness perspective for the workforce and from a legal perspective. And for those things, I feel extremely confident in saying that our system works better. We have data from over a million people at this point confirming that our platform does not produce bias. So I feel extremely comfortable in saying that, from a gender and ethnicity perspective, our algorithms don’t have any bias,” she concluded.
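Pymetrics has not published its audit code, but the kind of automated bias check Ms Polli describes can be sketched with the “four-fifths rule”, a common legal benchmark for adverse impact in US hiring: each group’s selection rate should be at least 80% of the highest group’s rate. A minimal illustration in Python (the group labels and data here are hypothetical):

```python
from collections import Counter

def selection_rates(outcomes):
    """Compute per-group selection rates.

    outcomes: list of (group, was_selected) tuples.
    """
    totals, selected = Counter(), Counter()
    for group, was_selected in outcomes:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def passes_four_fifths(outcomes, threshold=0.8):
    """Adverse-impact check: every group's selection rate must be
    at least `threshold` (4/5) of the highest group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return all(rate / best >= threshold for rate in rates.values())

# Hypothetical data: group A selected 40/100, group B selected 20/100.
# B's rate (0.20) is only half of A's (0.40), so the check fails.
data = ([("A", True)] * 40 + [("A", False)] * 60 +
        [("B", True)] * 20 + [("B", False)] * 80)
print(passes_four_fifths(data))
```

A real monitoring system would run this kind of check continuously over live selection data and trigger review or retraining whenever a group falls below the threshold, rather than relying on a one-off test.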
4) Why did you choose SaaS as a business model for Pymetrics?
She mentioned, “The reason SaaS made sense is that we’re providing software that mitigates bias at an early stage of the recruiting funnel, where other software platforms or services typically sit: you might be using an applicant tracking system or interviewing software there. SaaS also made sense because the platforms we were trying to disintermediate were available on a subscription-based model. And another thing that attracted us to it is that it’s more scalable.”
5) Pymetrics doesn’t replace hiring, it just makes it better for the companies to hire the right people without having any biases. So where exactly does it fit in terms of the hiring stack for your clients?
“It’s introduced at the very beginning, i.e. at the top of the funnel. That’s because, with traditional methods, about 90 percent of candidates can get excluded from any further consideration. So you can imagine that if you’re not impacting the recruiting funnel at the top, you’re going to have a lot of downstream effects that you can’t really solve for. And that’s why it’s available at the first stage of hiring,” Ms Polli stated.
6) Can you share the evolution of Pymetrics since 2013?
Answering the question, she said, “It’s been exciting to continue to see the company grow and take on new clients and new challenges. We’ve grown from just two people to now 150, with offices around the world, and being able to serve more clients is just exciting.”
7) How many companies are currently using Pymetrics? Is it being used by a specific category of companies, or does its use span all categories?
According to Ms Polli, “About 100 companies are on the platform today, and it really does span many, many different industries. We do hourly workers as well as white-collar workers. There’s really no one type of company or job that we are best suited for.”
8) One of the biggest differentiators for Pymetrics is its focus on diversity, and in the current context of social unrest in the US, it’s even more relevant than before. Could you share what companies are missing out on, and how it could be improved going forward?
Giving an insightful perspective, she said, “Well, I don’t think that human beings are bad. I’m an optimist, and human beings are good-natured. However, it’s back to this idea that for too long we have assumed that human beings could remove bias from their own processes, their own brains. We’ve been trying that for 15 to 20 years now, and it has proven unsuccessful. So we must start looking at other ways of doing this.”
“There’s been some great academic research, which we were lucky to be a part of, that describes how process and technology are really the only two things that are going to make a difference. People have to become more comfortable with the idea that human beings are wonderful, but they are also biased. Assuming that they are not biased just because they’re nice people is going to continue to lead to these situations of civil unrest. There are many reasons for the civil unrest, and I think this is part of it. It’s evident that people of color, specifically African Americans and Latinos, are at a strong disadvantage in the hiring process. If you use a resumé, they’re at a disadvantage, and in other traditional forms of testing, they’re at a disadvantage. We must change those hiring systems, because otherwise the civil unrest will continue: fundamentally, these people are not being treated fairly by the hiring system,” she added.
Not ignoring the harsh reality of the situation, she stated, “We can think of other things that need to change, like police brutality and mass incarceration, and I completely agree with all of that too. But there’s no doubt in our mind that we should look at hiring as a major cause of this type of unrest.”
9) With respect to the ongoing COVID-19 crisis, has the usage of your platform increased or decreased?
Ms Polli told us, “Our usage has increased, though there are obviously some shifts in terms of clients’ budgets and everything else. But overall, people are moving towards more digital processes anyway, for obvious reasons: they want to do more things remotely, and our platform allows them to do that. On top of that, since we offer bias-mitigation properties, it has become even more compelling to most clients. The platform also has features such as video interviews, which make it even more useful in the current situation.”
10) What’s the future looking like for Pymetrics?
Highlighting the need for industry standards, she replied, “I think what we really need are standards that say if you’re going to use artificial intelligence in this context or that context, you need to have these types of monitoring systems in place. I say that because otherwise, what’s going to end up happening is that someone hears that artificial intelligence can remove human bias, and they just buy any kind of platform that claims to use AI. But if that platform doesn’t actually remove bias, it will potentially just amplify it, and at scale. So as these platforms become more prevalent, we really need well-defined standards, and those standards must include reporting on the outcomes of your platform. That’s really what we’re pushing for. We’re involved in a number of efforts to change hiring legislation so that platforms like ours are more regulated and not given a pass.”
In addition, she said, “Having a broader geographical spread is definitely important in terms of improving the algorithms. The more data we have, the better it will be, specifically for remote hiring, which will become the norm.”
11) What are your favorite SaaS products out there?
“Well, the one product that I’m using almost the entire day is Zoom, and considering that we have all been sitting at home, it’s been a great way for me to stay connected with colleagues,” she replied, finishing the interview.