Algorithms – they’re everywhere. Some are so clever they take your breath away. Others are surprisingly stupid. Does it matter? In the case of the algorithms that run our world and make decisions about us as individuals, yes. Is machine learning bias adversely affecting your business or your life? Would you even notice?
The weird world of algorithms
Algorithms are not always trustworthy. Thanks to runaway machine learning, even their creators rarely fully understand how they work. Some algorithms have nasties like racial and gender bias built in, learned from fatally flawed training data. As an article from New Scientist puts it:
“Machine learning is increasingly being used to make sensitive decisions, says Matt Kusner at the Alan Turing Institute in London. In some US states, judges make sentencing decisions and set bail conditions using algorithms that calculate the likelihood that someone will reoffend. Other algorithms assess whether a person should be offered a loan or a job interview.
But it is often unclear how these systems come to their conclusions, which makes it impossible to tell if they are fair ones. An algorithm might conclude that people from a certain demographic are less likely to pay back a loan, for example, if it is trained on a data set in which loans were unfairly distributed in the first place.”
As Chris Russell, also from the Alan Turing Institute, says, “In machine learning, we have this problem of racism in and racism out.” There’s no reason to believe other algorithmic decisions – financial, sexual, criminal, medical and more – are unbiased.
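The "racism in and racism out" problem is easy to sketch. In this deliberately tiny, invented example, both groups repay loans at exactly the same rate, but a naive model trained on historically skewed approvals simply learns the skew (all data and names here are hypothetical, not from any real lender):

```python
# Hypothetical history, unfairly skewed against group "B":
# (applicant_group, repaid_in_full, loan_approved)
history = [
    ("A", True,  True),  ("A", True,  True),  ("A", False, True),
    ("B", True,  False), ("B", True,  False), ("B", False, False),
]

# "Training": learn the historical approval rate per group
approval_rate = {}
for group in {g for g, _, _ in history}:
    approvals = [approved for g, _, approved in history if g == group]
    approval_rate[group] = sum(approvals) / len(approvals)

def recommend(group, threshold=0.5):
    """Approve a new applicant if their group was usually approved before."""
    return approval_rate[group] >= threshold

print(recommend("A"))  # True  -- group A keeps getting loans
print(recommend("B"))  # False -- group B is refused, regardless of repayment
```

Note that repayment behaviour is identical in both groups (two out of three repaid), yet the model's decisions track the biased approvals, not creditworthiness. That is the unfairly distributed training set from the quote above, reproduced in miniature.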
LinkedIn’s job recommendation algorithm – A bit thick
As a content creator, I understand why my recruitment clients are concerned about HR algorithms that promise to pick the best candidates but actually do nothing of the sort: they leave suitable candidates out because their applications and CVs don't contain the 'right' keywords. So it shouldn't really be a surprise that LinkedIn's job algorithm is a bit crap at recommending roles.
I have ticked the relevant box in my LinkedIn account several times in an effort to stop it suggesting jobs, because I’m not looking for a job. It still makes recommendations, but that’s another story. The point is, it suggests seriously irrelevant roles. This morning’s was for a junior copywriting vacancy. I have almost thirty years’ marketing industry experience and have been a freelance copywriter for the last twelve. And that somehow makes me a junior?
It’s disturbing because LinkedIn has full access to my CV and career experience. Any human being who can read would glance at my CV and see I’m not suited to a junior copywriting role. It isn’t rocket science, but the algorithm struggles with the basics. It’s a bit thick.
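To be clear, I have no idea how LinkedIn's matcher actually works, but a crude keyword-overlap recommender shows how this kind of mistake can happen. In this invented sketch, a senior copywriter's profile shares more raw keywords with a junior vacancy than with a better-fitting senior one, because seniority words carry no special weight:

```python
# Hypothetical profile and vacancies -- not LinkedIn's real algorithm
profile = {"copywriter", "marketing", "freelance", "senior", "content"}

jobs = {
    "Junior Copywriter":  {"junior", "copywriter", "content", "marketing"},
    "Marketing Director": {"senior", "marketing", "strategy", "leadership"},
}

def score(job_keywords):
    # Pure overlap count: "junior" vs "senior" gets no special treatment
    return len(profile & job_keywords)

best = max(jobs, key=lambda title: score(jobs[title]))
print(best)  # "Junior Copywriter" -- three shared keywords beat a better fit
```

Any human reader weighs "almost thirty years' experience" instantly; a matcher that only counts shared words cannot, which is one plausible reason the basics go wrong.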
Why unbiased machine learning matters
In a world increasingly run by algorithms – think Google's search algorithm, the matching systems behind online dating sites, and the software that decides who gets a mortgage or loan – unbiased machine learning matters. So does regulation. Algorithms can already outperform specialists at diagnosing some diseases, for example, but we still don't have clear guidelines on what data they're allowed to use.
Most of the time algorithms are created to do good. Google’s neural networks, for example, deliver a new way of translating speech without transcribing it, something that could soon outperform conventional machine translation. One US-based start-up is training algorithms to find signs of Alzheimer’s and Parkinson’s in people’s voices, teaching the AI using data from millions of phone calls to a health insurer. And a Japanese tech firm has trained its algorithm to act as an agony aunt, taught using 190,000 questions and 770,000 answers lifted from a forum.
Algorithms and AI aren’t going away. They’re fast taking over all sorts of vital tasks and decisions previously handled by humans. We deserve unbiased algorithms and genuinely intelligent AI, but right now that’s not what we always get. If you’re getting silly job recommendations from LinkedIn, it might just be the tip of a much more sinister algorithmic iceberg.