The entire recruiting landscape is undergoing a profound transformation as organizations grapple with the implications of AI and the economic disruption 2025 is bringing. Talent acquisition teams are drowning in applications while simultaneously being asked to do more with fewer resources. Candidates find themselves in increasingly dehumanized processes where ghosting is now the norm. At the same time, regulatory bodies are developing laws to ensure fairness and transparency around the use of AI in hiring.
So, how can employers navigate this challenging terrain while creating fair, accessible, and effective hiring processes?
My guest this week is Ruth Miller, a talent acquisition and HR consultant who works across the public and private sectors. Ruth is an advisor to the Better Hiring Institute, working with the UK Government on developing legislation around AI in recruiting. In our conversation, she shares her insights into how organizations can proactively develop strategies that balance innovation with compliance while enhancing rather than diminishing the human elements of hiring.
– Different perceptions and reactions to AI among employers across sectors
– The paradox of AI both introducing and potentially removing bias from hiring processes
– Neurodivergent candidates and AI in job applications
– Common misconceptions job seekers have about employers’ AI usage
– Strategic advice for organizations implementing AI in recruitment
– The future of recruitment and the evolving balance between AI and human interaction
Follow this podcast on Apple Podcasts.
Follow this podcast on Spotify.
00:00
Matt Alder
TA teams are overwhelmed. Job seekers feel invisible. And while AI promises efficiency, it also threatens to dehumanise the process entirely. As organizations race to implement technology that could transform hiring, they’re colliding with regulators determined to prevent discrimination in this volatile landscape. Who’s getting it right and what’s the real cost of getting it wrong? Keep listening to find out.
00:30
Matt Alder
Support for this podcast comes from SmartRecruiters. Are you looking to supercharge your hiring? Meet Winston, SmartRecruiters’ AI-powered companion. I’ve had a demo of Winston. The capabilities are extremely powerful, and it’s been crafted to elevate hiring to a whole new level. This AI sidekick goes beyond the usual assistant, handling all the time-consuming admin work so you can focus on connecting with top talent and making better hiring decisions. From screening candidates to scheduling interviews, Winston manages it all with AI precision, keeping the hiring process fast, smart and effective. Head over to smartrecruiters.com and see how Winston can deliver superhuman results.
01:35
Matt Alder
Hi there. Welcome to episode 699 of Recruiting Future with me, Matt Alder. The entire recruiting landscape is undergoing a profound transformation as organizations grapple with the implications of AI and the economic disruption 2025 is bringing. Talent acquisition teams are drowning in applications while simultaneously being asked to do more with fewer resources. Candidates find themselves in increasingly dehumanised processes where ghosting is now the norm. At the same time, regulatory bodies are developing laws to ensure fairness and transparency around the use of AI in hiring. So how can employers navigate this challenging terrain while creating fair, accessible and effective hiring processes? My guest this week is Ruth Miller, a talent acquisition and HR consultant who works across the public and private sectors. Ruth is an advisor to the Better Hiring Institute, working with the UK government on developing legislation around AI in recruiting.
02:42
Matt Alder
In our conversation, she shares her insights into how organizations can proactively develop strategies that balance innovation with compliance while enhancing rather than diminishing the human elements of hiring. Hi Ruth, and welcome to the podcast.
03:01
Ruth Miller
Thank you. Nice to be here.
03:03
Matt Alder
An absolute pleasure to have you on the show. Please, could you introduce yourself and tell everyone what you do?
03:10
Ruth Miller
Yeah. Thank you, Matt. I was the head of Talent Acquisition within global retail and banking organizations and for the last 15 years I’ve been a talent acquisition and HR consultant, consulting with and training a wide range of private, public and nonprofit sector clients to navigate employment challenges.
03:28
Matt Alder
Fantastic. And I know you speak to a really wide range of people. I’m just really interested in the kind of challenges you’re seeing in the market at the moment for talent acquisition people, for hiring managers and also the job seekers themselves.
03:43
Ruth Miller
Yeah, absolutely. And I’m sure this mirrors a lot of podcasts you’ve already done. I’ve heard some of them saying that most organizations, whether they’re small or large, are getting massive numbers of candidate applications for both talent acquisition and hiring managers to sift and screen. And how do they screen applications generated by AI? So that could be the same problem. It could be two problems. You know, a lot of applications, some of them are AI, but not all of them. And also, I think if we’re looking at businesses, and I will come on to candidates in a minute, but businesses, I think they’re thinking, gosh, you know, I should be introducing AI into my recruitment strategy, but how do I do it effectively? You know, how do I avoid biases? When’s the law going to come into effect and how should I even start?
04:31
Ruth Miller
And then on the candidate side, they find it both demoralising and frustrating to look for jobs, as often they’re getting no replies to their job applications, they often get no feedback or replies after multiple interviews and assessments, and particularly at early careers level, I hear of a very automated and dehumanised process where they may not speak to an actual person until they’re three or four stages into it. Then I think there’s a challenge we both hear about, where companies that are restructuring at the moment have much smaller talent acquisition teams to handle hiring.
05:06
Matt Alder
Yeah, no, it’s interesting. I think that graduate recruitment thing’s interesting as well. I was talking to someone the other day whose son was kind of going through that sort of process, and they were sort of shortlisted for four different companies and they ended up going with a completely different company outside of their sort of target market, because that company actually phoned them up and everyone else had just been automated. So it was an interesting kind of insight into what that must be like. I want to zoom in on some of the things that you talked about there in a second, but before we do, you’ve got quite a unique insight into what’s going on with AI and hiring through the work that you’re doing with the Better Hiring Institute. Tell us a bit about that.
05:46
Ruth Miller
Okay, so if you haven’t heard of it already, I’m an advisor to the Better Hiring Institute. It’s basically an independent body working with the UK government, and I was involved with developing the AI and recruitment toolkit with them. You can download it for free from their website, if you’re interested. So a couple of weeks ago I was at a meeting with the cross-party parliamentary group who are developing AI legislation, along with the main peer who leads on AI for the government. And, you know, I see the data about use of AI by organizations and job candidates. UK AI legislation hasn’t been finalised yet, as I’m sure you’re aware, but it’s likely that facial recognition software will be illegal. And we know a lot of organisations are conducting video interviews where you screen candidates based on facial behaviour. That won’t be legal anymore.
06:43
Ruth Miller
As we know, current legislation, particularly GDPR, already covers that: you can’t make a hiring decision with AI. But what I see is the UK government is probably going to land somewhere between some of the US states, which allow more use of AI, and the EU, where it’s more tightly regulated. So I guess, you know, the view on AI from government level, in terms of answering your question, is that they are hearing from lots of different interested parties, whether that’s organizations or candidates looking for jobs or public sector bodies, and they’re looking at what other countries are doing. So they haven’t finalised anything yet. They’re of course aware of all the concerns, but they’re also aware that things can’t go on as they are, where there really isn’t enough regulation about what companies do.
07:31
Matt Alder
Yeah, I mean, that’s kind of interesting, and I think that squares with what I was thinking and other things that I’ve heard, that the UK has been very much waiting to see what happens with America, what happens with the EU, and potentially plotting a course in between. So, yeah, it’ll be interesting to see how that plays out. But by the sounds of it, I don’t think there’ll be any kind of big surprises.
07:50
Ruth Miller
No, I don’t think there will be either. Yeah, I think it’ll also depend on, I think some of the lobby groups and some of the discussions we hear in the media about, you know, how effective they are in lobbying the government to be more or less regulated, really. So we’ll see.
08:06
Matt Alder
Yeah, it kind of reminds me of my politics teacher back at school who said the thing about laws is they’re like sausages, you know, you kind of consume them but you don’t want to see how they’re made.
08:15
Ruth Miller
Yeah, absolutely. But it’s certainly very interesting, and I think one of the most interesting things, apart from hearing from the government people themselves that are working on this, is hearing from all these different organizations, and hearing perhaps some of the challenges they’re coming up with that not every single organization might deal with. But, you know, it certainly gives me, I suppose, quite a wide perspective on different organizations and where they’re going to go with this.
08:43
Matt Alder
Give us kind of a bit more detail on that. You know, what are the perceptions, what are the reactions to AI amongst employers? Is there a difference between public and private sector? What are people saying? What are people concerned about?
08:54
Ruth Miller
That’s a good question. I think there are a couple of differences between public and private sectors in the use and perception of AI. I guess money is one of them, in that particularly the larger private sector organizations who are perhaps already introducing AI into hiring tend to have bigger budgets, don’t they, than the public sector, which is struggling with its cost base. So those public sector organizations have got less room to introduce new AI-powered technology, I suppose from an ATS or CRM point of view. But interestingly, they are starting to use generative AI platforms more in the public sector, as they’re free of course, aren’t they? Well, you can pay for some of them, but a lot of them are free. So I think in answer to that question, less use of bought technology, more use of generative AI.
09:46
Ruth Miller
But the second difference is that, while all organizations are concerned about potential bias in the use of AI when hiring, public sector roles are often more regulated, aren’t they, such as teaching or social work, which runs into how they hire in general, and there’s a lot more compliance and checking in terms of DBS and other types of background checks around how they hire. So that feeds into whether an organization is more or less risk averse, doesn’t it? Which includes their use of AI.
10:21
Matt Alder
Do you see that sort of thinking around bias developing? Because obviously the kind of accepted narrative at the moment is like, oh, AI can have this sort of bias built into it and we need to be, you know, quite rightly careful about that. But there are also a lot of voices saying, well actually AI offers us the opportunity to remove bias in the recruitment process where we’re kind of relying on humans or there isn’t a kind of a level playing field for everyone. I mean, how do you see that developing?
10:47
Ruth Miller
Absolutely, I think that’s a good question. So yes, absolutely, AI can bring in bias because it’s, you know, drawing on data and learning from humans who are biased. But then that leads into humans actually doing any sort of assessment and selection, and we are biased too. So I suppose my view is both are biased. Our job is to ensure that we have evidence-based recruiting, whether it’s AI or whether it’s humans doing it. So to me it’s all around, you know, testing and reviewing and piloting any new process, whether that process is AI driven and bringing in new technology or whether it’s human driven. So yeah, my view is that shouldn’t be a reason for not bringing in AI technology, that concern about bias, because actually, you know, if you’re doing it properly, you will have reviewed that.
11:39
Ruth Miller
And yeah, you know, if you actually hire properly, you know, evidence-based, and you’ve got a robust process at the moment, then you’re hoping to avoid human bias. So you would do the same with AI.
11:55
Matt Alder
In terms of bias. There’s also accessibility and things like that. What’s the situation with kind of AI and things like neurodiversity?
12:06
Ruth Miller
So going back to the Better Hiring Institute, the data they produced with a body called Arctic Shores shows that neurodivergent applicants, those from lower socioeconomic backgrounds and people of color are more likely to use AI to apply for jobs. And as we know, those are often the groups that organisations are keen to encourage to apply, aren’t they? You know, we’ve already talked about how we avoid bias when we introduce AI software. Well, you know, organisations already have to make sure that any testing of software meets the requirements of the Equality Act 2010 in the UK and doesn’t discriminate. But I’ve heard of several examples recently.
12:51
Ruth Miller
You might have seen them too, either in the media or on things like LinkedIn, where several companies’ hiring managers rejected candidates because they thought they were using AI on their applications when actually the candidate was neurodivergent. So, you know, I suppose where I’m going with this is that neurodivergent candidates might use AI more to apply for jobs. But, you know, as managers or as humans, we are not able to tell with the human eye, and no software is able to tell, whether a candidate has used AI for their job application. And even if they have, you know, it may be that somebody is neurodivergent in terms of how they communicate. It may not be AI. So all of these things do come back to the conversation we’re having about putting barriers and biases into your process.
13:46
Matt Alder
Yeah, that’s really interesting, and I’ve heard stories of people submitting essays at school or university and being failed for using AI when they didn’t use AI; they just used perhaps some of the words that are most associated with some of these models when they’re used badly. And I suppose that brings us nicely on to a broader conversation about job seekers. I mean, how is AI being used, and also what kind of misconceptions might job seekers have about how employers are currently using AI?
14:17
Ruth Miller
So I do pro bono work with sixth formers in schools applying for university and apprenticeships, and also with recent graduates looking for jobs. And the common misconception is that all employers now use AI to screen job applications. I think this is driven by the fact that candidates rarely get replies to job applications, and also the media, which is fanning the flames of AI running absolutely everything, including, you know, whether you get a job. So many candidates are using AI for job applications, particularly those who are neurodivergent, as we’ve said, or who have English as a second language, but they do worry about whether it will disadvantage them. And then others are using the new AI bots, and there seem to be more of these every day, don’t there, to set up job alerts and apply for jobs.
15:06
Ruth Miller
So they run the risk of applying for jobs which aren’t suitable for them or sending out very generic applications, I guess. You know, some hiring managers like candidates using AI as they’ve got forward-facing skills, but some really don’t. But the main issue here, as we said, is that no software or human eye can 100% identify where AI is being used. So, you know, we need to make sure we’re using only objective criteria to screen and not people’s personal views on AI. We need to communicate to candidates, don’t we, about what effective and acceptable use of AI for applications involves in our organisation, in all our communications with them, whether that’s, you know, on all our socials and our career sites or whether that’s when we’re actually communicating with them about interviews and assessment. It’s a two-way street.
15:59
Ruth Miller
So we can’t expect to communicate to candidates what we expect without letting them know how we as an organization are using AI or not. I think we should even communicate if we are not. And it is also likely to be illegal to use AI in hiring in the future without being clear on all your communications that you are using it.
16:22
Matt Alder
Yeah, I mean that makes perfect sense. And I think the thing about this aspect of it is that, to me, it kind of starts to blow the whole recruitment process open, and I’m hoping that it means that we’re going to build a better, fairer recruitment process. And the communication around whether people are using AI or not is just so broken. I mean, I read an article yesterday on the BBC news site, obviously an organization that grounds itself on impartiality and both sides of the story. And it was a former Tory MP, someone who’d lost their seat in the last election, complaining that they couldn’t find a job because they’d applied for hundreds of places, but AI was screening out their application because they didn’t go to university. We know that’s not the case, and there was no comeback in the article.
17:09
Matt Alder
It was just opinion presented as fact. And I was like, I’m in two minds about it, because obviously that’s not true. But actually the truth, that it could be someone’s human bias or literally just not being able to deal with the volume of applications, I wonder whether that would be worse to tell someone. So, yeah, it’s a tricky one.
17:28
Ruth Miller
It really is. Yeah, absolutely, I saw that too. And I must say I did think, gosh, if you’re applying for hundreds of jobs, you know, is the issue that actually you’re not adapting your application specifically enough to a job, so therefore you get rejected by a human because you don’t have a great application? But that was just my initial thought.
17:47
Matt Alder
Yeah, no, exactly. Yeah. And, you know, it sounds like there is perhaps legislation on the way to help, but I think there’s a real onus on employers to really explain how their recruitment processes work and what’s being used and what’s not being used. And that kind of communication just isn’t really out there at the level it should be at the moment. I suppose, following on from that, what advice would you give employers around AI right now in terms of how they should be thinking about it and what they should be doing? You know, I know there’s lots of confusion and lots of inertia out there because things seem very difficult or complicated, or people are waiting for legislation, but what would your advice be for what employers should be doing right now?
18:27
Ruth Miller
Well, I guess, you know, any strategy for AI has to be the organization’s strategy for AI. I hear a lot of talent acquisition teams talking about, oh, what are we going to do? Well, your organization is going to have a strategy of where AI fits in or doesn’t, or, you know, how you’re going to review its use, so HR should not be doing something on a separate runway. But I guess, you know, if an organization’s looking to bring in some sort of AI technology, which, as we know, is very different from general automation, an ATS is not AI, is it?
19:02
Ruth Miller
But if they’re actually looking at bringing in genuine AI, they need to work with the technology suppliers to find out how that AI tool has been tested, to make sure that, you know, it has been tested on sources which don’t introduce bias, and then pilot it internally to test it themselves before launching it, because the risks are, for example, the technology shortlisting a very narrow pool of candidates for a vacancy or rejecting people from different backgrounds, as we’ve already talked about earlier. And going back to what’s likely to be the legislation from the UK government, actually, it will be that even if you have bought technology from a supplier and they say, tick box, it’s all tested and fine to go legally, it’ll be the employer who is responsible.
19:55
Ruth Miller
And if a candidate then complains and says, you’ve discriminated against me by using AI in the hiring process, it won’t be the supplier.
20:03
Matt Alder
So, which is identical to how it
20:04
Ruth Miller
Is in exactly now. And I hear a lot of suppliers going, oh, it’s the test, they do absolutely fine, you know, actually avoid bias. But, you know, I. It’s not lack of trust, it’s more that, well, as my compliance process is that, you know, I’m responsible as an employer, so I need to test it, I need to review it myself, and then it’s about data and auditing, as it. As it is at the moment, really, with regular reviews and being clear about what successful use of AI is within your business. And then if we’re moving away from AI technology and we’re looking just at generative AI, if you’re using something like ChatGPT or Copilot, then you need to make sure you’ve got clear policies on their use.
20:47
Ruth Miller
At the moment, I’m not necessarily seeing that; often it’s driven by individuals and their knowledge of IT and, you know, testing it out. And then, of course, I would say this as a trainer, but that, you know, HR teams receive training on how to use generative AI to create documents for use within a company.
21:07
Matt Alder
Yeah, no, 100%. I think that’s kind of the thing that’s missing at the moment: really helping people, you know, understand this above and beyond the very necessary act of them exploring it and experimenting themselves. So as a final question, what do you think the future looks like?
21:24
Matt Alder
I mean, how might things look
21:25
Matt Alder
in sort of two or three years’ time?
21:28
Ruth Miller
Okay, interesting one. Yeah. Well, I think we’ll see the UK legislation launch, which we’ve discussed, so organizations will be clear about what they can and can’t do. And I think we’ll move to every organization having a business strategy regarding AI, which HR will be included in. And I think, while there has been a focus on HR and talent acquisition teams losing jobs because of AI, or a worry about that, to me the positives about introducing AI are to manage the administrative and time-consuming parts of the hiring process for candidates, hiring managers and recruitment teams. We know that no candidate is going to join a business because you’re such a great user of AI. It’s about human discussion and engagement. So during the interview and offer process, there has to be that human contact.
22:21
Ruth Miller
And any AI implementation should involve the total remapping of your hiring process to identify where AI should be involved and where there should be that human interaction. So no, it definitely shouldn’t be that organizations farm the whole process out to AI, as they’ll start to have a lower level of candidate engagement and people dropping out of the process. So I suppose where I’m going is that, yes, I think there is a risk of HR in general losing people because of AI. But, you know, essentially we are a human-driven function, and we will need humans to engage people to join or work for an organization. And while the economy isn’t in great shape at the moment, once it picks up, it will be a more challenging time to attract candidates and retain those employees.
23:09
Ruth Miller
And then, you know, those human skills will be more and more important. I think companies will also be remapping the skills they want to attract and retain, and they’ll need creativity, communication, resilience, adaptability, all things that, at the moment, AI isn’t as good at as humans are. I know you’ve had guests on talking about skills-based hiring, and I agree with them that focusing on these skills and identifying them will become more and more important.
23:38
Matt Alder
Ruth, thank you very much for talking to me.
23:40
Ruth Miller
It’s been an absolute pleasure. Thanks for having me on.
23:44
Matt Alder
My thanks to Ruth. You can follow this podcast on Apple Podcasts, on Spotify, or wherever you get your podcasts. You can search all the past episodes at recruitingfuture.com. On that site, you can also subscribe to our weekly newsletter, Recruiting Future Feast, and get the inside track on everything that’s coming up on the show.
24:07
Matt Alder
Thanks very much for listening.
24:09
Matt Alder
I’ll be back next time, and I hope you’ll join me.