AI, Ethics and the Social Enterprises of the Future
While getting the technology right is undoubtedly a key part of AI, the ethics of its application are just as significant. Priya Kantaria talks to Richard Skellett about the ambitions of his social enterprise project Digital Anthropology, from ethics to economics, as he warns that “this is a huge subject”
Digital Anthropology is the study of the relationship between people and digital-era technology. It is also the name of tech entrepreneur Richard Skellett’s social enterprise outsourcing company, which aims to bring awareness of and solutions to some of the fundamental sociological questions being created around tech.
One such question is: how can AI and automation benefit people instead of making them redundant from their jobs? Skellett goes one step further to ask: “What role does AI have in helping to enable your dreams and aspirations?”
“We’ve got a situation that exists at the minute,” he says, “where AI, robotics, is definitely displacement-based. It’s not about augmentation, it’s about displacement.”
He uses Amazon as a hypothetical example: if it had a business case for automating a warehouse and running it with no people at a lower cost than employing staff on the minimum wage, Amazon would take it.
The rise of social enterprises
The conundrums of efficiency and convenience versus people and lifestyle are not new, but Skellett wants to know who is asking these questions and finding the solutions. The problem, he believes, is that efficiency and shareholder returns drive business decisions.
“Companies say people are assets, but they don’t treat them like assets; they treat them like liabilities,” he says.
The problem might be considered an economic one, or one of business models, and Skellett could be heard as an anti-capitalist, in spite of his entrepreneurial past with tech company Allied Worldwide.
But he says: “A social enterprise isn’t necessarily just about going off and doing good; a social enterprise business is one where there’s no shareholder return. So basically what the social enterprise is about is supporting the people and the communities.”
He believes a trend towards social enterprises is coming, especially if consumers really are growing more aware of the threat posed by money-driven, efficiency-driven shareholder business models that will turn to technology before protecting employees and their right to work for a livelihood.
He predicts an ethical component to our consumer choices that will affect companies in the future, because we will have to choose between companies that value people and those that opt for the technology solution.
“You have to use your moral compass,” he says. His moral compass has guided him to choose not to use Amazon, as an individual and within his businesses. Convenience might be high up in one person’s value system, but he chooses to use the shops and pay a little more, in the interests of people.
The message of technology companies
Problems around the relationship between tech and people cross into politics too. Skellett points to the gap that would open up between a proposed digital tax and the income tax revenue lost when robots and automation replace people.
The purpose of Skellett’s Digital Anthropology enterprise, then, is to tell the corporate organisations that work in tech to take responsibility for the questions they are creating around the displacement or augmentation of people.
“I want to get to the source of the solutions, i.e. looking at the AI and robotics companies and saying, hey, this is about the people. Because generally, being in the world, it’s about the impact and the differences we make with people.
“There’s a social impact piece that’s not being understood at the moment and Digital Anthropology wants to bridge that gap.”
To be clear, Skellett believes in progress and in technology. But he has seen the part-time jobs he held at the age of 13 replaced by tech.
“I want to fight to have those jobs existing or have the choice of them not existing. The choice isn’t there now because they’ve been displaced by tech,” he adds.
“I think we ought to have a situation where there’s an option for an organisation and the individual to decide: do we want tech or do we want people, and understand the consequences and costs and make a conscious decision.”