
Book Review: New tools of oppression

JAMES WALSH is appalled by the implications of introducing AI to the workplace 

The Algorithm: How AI Can Hijack Your Career and Steal Your Future
Hilke Schellmann, Hurst, £22

A FRIEND of mine once wrote a sketch about a management training guru who claimed to have invented a tool that could tell “whether you’re a blue, a green, a red or... a dickhead.” Compared to the products and procedures documented in Hilke Schellmann’s The Algorithm: How AI Can Hijack Your Career and Steal Your Future, the dickhead test is positively scientific.

One can usually tell a scam by its acolytes: Rishi Sunak, Elon Musk and Wes Streeting count among the cheerleaders of “artificial intelligence”. It will, apparently, automate many jobs, turbocharge the economy, and save the NHS. To paraphrase Blackadder, there is just one problem with this narrative: it is bollocks.

AI isn’t revolutionary. We are not on the cusp of robots with feelings or computers capable of independent thought. Instead, it’s theft: large language models trained on the work of writers and artists who never consented to its being used in this way, with the cheap-trick ability to regurgitate uncanny-valley approximations of human creativity.

AI is also an environmental catastrophe: machine learning consumes an extraordinary amount of computing power, with data centres around the world burning through more fossil fuel than the planet can sustain.

Schellmann’s book focuses on one specific aspect of the AI con: its deployment in the world of work, the promises made, the grim reality, and the consequences for the future. She has interviewed scores of true believers and whistleblowers, alongside ordinary people whose working lives have been “disrupted” by anonymous, unaccountable algorithms.

As ever, these latest tools of oppression were first used on the working classes: delivery drivers sacked by automated message; warehouse workers disciplined by algorithmic surveillance; and even bus drivers in China wearing brainwave-monitoring hats that use AI to detect “emotional spikes such as depression, anxiety or rage.” Even white-collar workers are no longer safe: millions are already being filleted by keywords with profoundly discriminatory implications, or working in jobs where they have to wiggle the mouse every 30 seconds to convince the machine that they’re “working.”

Ken Willner, an employment attorney, found many troubling things when given access to the black box of data that these AI tools rely on. “The algorithm could look for shoe size, and people with a larger shoe size may be associated with doing well... that’s not related logically to job performance, it just happens to be a correlation that may exist. This may happen to have an adverse impact on women, who have smaller shoe sizes on average.”

Willner also found “Afric*” being used as a keyword to score job applicants. Zip codes are another non-job-related demographic used by the algorithms: in a country with the United States’s history of segregation, they are “in practice, a proxy and could have a disparate impact on racial minorities,” according to Ifeoma Ajunwa, an expert in AI and hiring.

A lot of what is documented here is management consultant phrenology, the discredited theory that the shape of your cranium describes the contents of your brain. “We’re worried about the snake oil or the bad science in our industry,” says the CEO of a company that makes bullshit “good employee” games, with no apparent irony. And in the case of techniques like facial expression recognition to figure out if someone is “confident” or “happy” — owners of resting bitch face need not apply — we’re seeing a technology with few non-dystopian uses trying desperately to find a purpose.

The only criticism of this by turns enjoyable and disturbing read is that the author can be a little too forgiving of the shysters and frauds she exposes through their own words, and that her conclusions are feeble given the depths of what she so diligently reports.

Without regulation, and fast, all of this is going to get a lot worse. But the solution isn’t to make AI “more fair”; it’s to challenge the very existence of the outsourced, surveilled and corporate model of employment.
