AI is revolutionizing the hiring process. From automated screening to video interview analysis, many tools promise faster, smarter recruitment. But faster isn’t always fairer.
Recent studies, including one from researchers in Australia (The Guardian), show that AI-based hiring tools may inadvertently discriminate against candidates with non-standard accents, speech patterns, or cultural expressions. This raises uncomfortable - but necessary - questions: Is your hiring AI judging more than just qualifications? And how can HR reclaim visibility and fairness in a process that's becoming increasingly opaque?
🎙 The accent bias problem
In the Melbourne study (The Guardian), researchers found that speech recognition models used in AI hiring systems showed up to 22% higher error rates for people with regional accents or English as a second language. Some candidates were even misclassified as “uncooperative” or “incoherent” based on voice tone or phrasing - despite giving completely reasonable answers.
This kind of bias isn’t just unfair. It’s a legal, ethical, and reputational risk.
🚫 Ulla doesn’t run the interview. It observes it.
Unlike many tools, Ulla HR is not an automated hiring decision-maker. It doesn’t grade, rank, or filter candidates.
Instead, Ulla acts as a silent observer - an AI that listens during interview conversations (online or offline), providing clear and GDPR-compliant insights to HR professionals, not judgments to algorithms.
Where most tools might generate a pass/fail output, Ulla empowers HR and recruiters with rich behavioral data:
- Speaking time balance (is the candidate given enough space to speak?)
- Clarity of communication (was the message understandable?)
- Engagement signals (was the tone confident, hesitant, disengaged?)
- Supportiveness & responsiveness (for internal panel interviews)
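To make the first of these metrics concrete: speaking-time balance can be computed from diarized interview timestamps. The sketch below is illustrative only - the segment format, speaker labels, and the 40% threshold are assumptions for this example, not a description of Ulla's actual pipeline.

```python
# Illustrative sketch of a speaking-time balance metric.
# Assumes diarized segments as (speaker, start_sec, end_sec) tuples;
# the format and threshold are hypothetical, not Ulla's implementation.

def speaking_time_balance(segments):
    """Return each speaker's share of total speaking time."""
    totals = {}
    for speaker, start, end in segments:
        totals[speaker] = totals.get(speaker, 0.0) + (end - start)
    total = sum(totals.values())
    return {speaker: t / total for speaker, t in totals.items()}

# Example: a 10-minute interview where the interviewer talks for 7 minutes.
segments = [
    ("interviewer", 0, 180), ("candidate", 180, 240),
    ("interviewer", 240, 480), ("candidate", 480, 600),
]
shares = speaking_time_balance(segments)
print(shares)  # {'interviewer': 0.7, 'candidate': 0.3}
if shares.get("candidate", 0) < 0.4:
    print("Flag: candidate had limited speaking time")
```

A metric like this surfaces a pattern (the candidate spoke for only 30% of the interview) without making any judgment about the candidate - the hiring team decides what it means.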
No one is rejected because Ulla “thinks” they sounded off. Instead, Ulla helps HR teams notice patterns, biases, or even interviewer issues - without replacing human judgment.
✅ Recognising accents ≠ Penalising them
Ulla is trained to handle a wide range of English accents and international speech patterns. Because it doesn’t use its analysis to make hiring decisions, but rather to support reflection and fairness, it avoids the trap of turning language diversity into a technical “problem.”
In fact, Ulla can highlight when:
- An interviewer dominates the conversation
- A candidate isn’t given enough time to elaborate
- Non-verbal cues (like long pauses or interruptions) affect candidate comfort
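Cues like long pauses and interruptions can also be detected from the same kind of diarized timestamps. Again, this is a minimal sketch under stated assumptions - the 2-second pause threshold and segment format are hypothetical, chosen just to illustrate the idea.

```python
# Illustrative sketch: flagging long pauses and interruptions from
# diarized (speaker, start_sec, end_sec) segments. The 2-second pause
# threshold is an assumption for this example, not Ulla's actual logic.

def conversation_flags(segments, pause_threshold=2.0):
    """Return (flag_type, time, duration) tuples for pauses and overlaps."""
    flags = []
    ordered = sorted(segments, key=lambda s: s[1])  # sort by start time
    for prev, cur in zip(ordered, ordered[1:]):
        gap = cur[1] - prev[2]  # silence (or overlap, if negative)
        if gap >= pause_threshold:
            flags.append(("long_pause", prev[2], gap))
        elif gap < 0 and cur[0] != prev[0]:
            # Next speaker started before the previous one finished.
            flags.append(("interruption", cur[1], -gap))
    return flags

# Example: the interviewer starts 1.5s before the candidate finishes,
# then a 5-second silence follows the interviewer's answer.
segments = [
    ("candidate", 0.0, 10.0),
    ("interviewer", 8.5, 20.0),
    ("candidate", 25.0, 30.0),
]
print(conversation_flags(segments))
# [('interruption', 8.5, 1.5), ('long_pause', 20.0, 5.0)]
```

As above, the output is an observation for the hiring team to reflect on - it flags an interviewer behavior (the interruption) as readily as anything about the candidate.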
This shifts the responsibility back where it belongs: on the hiring team to reflect, improve, and correct their own biases.
🧠 What HR can do with Ulla
With Ulla, HR leaders and talent teams can:
- Audit interviews for bias, tone, balance, and clarity - without judgment
- Use structured observations to train recruiters and hiring managers
- Identify whether certain questions or moments consistently lead to lower engagement
- Track inclusivity over time - are some candidate groups less heard?
By combining ethical observation with privacy-first AI, Ulla helps you fix your own house before AI burns it down.
TL;DR
Some AI tools talk too much - and listen too little.
Others listen, but judge.
Ulla listens, analyzes, and gives the power back to people.
If your hiring AI feels like a black box, maybe it’s time to switch to something that sees nuance, respects context, and keeps your hiring team - not your algorithm - in charge.