Digital transformation, privacy and the 'dark side' of the Metaverse
Technology is growing more and more sophisticated with each passing year. However, as the metaverse begins to unfold, we're only just scratching the surface to leverage data in new and innovative ways.
The metaverse is coming. But the question is — are we really ready?
According to Chris Wylie, the Cambridge Analytica whistleblower whose revelations spurred a massive overhaul of privacy practices and legislation across the globe, the answer to the above question is a resounding no. In fact, it might be best to start pumping the brakes on the metaverse now, before it infiltrates our day-to-day lives, he warns.
Chris recently spoke with Cheetah Digital's CMO Richard Jones about the metaverse, marketing and the future of privacy. Notably, he questioned the metaverse's long-term social impact and its power to transform the environments we live in.
"People have largely become who they are by navigating freely through their world. It's through your experience in life, dabbling and random happenstance, that allows you to grow and develop as a person. But what happens when, all of a sudden, the environment decides to get involved – classifying you, influencing your every move, and ultimately grooming you into the ideal consumer?"
Likening the metaverse to a physical entity like a skyscraper or an aeroplane, Chris paints a picture of how digital worlds should be architected with so-called fire exits or other protections. It should have a building code of sorts, protecting its users, their privacy, and, importantly, their mental health, he says.
A sobering realisation, Chris says, is that privacy is only one small piece of the puzzle. The matter at hand is far larger and more complex than what data is collected and how.
"When we are looking at some of the consequences of algorithmic harm, whether that's mental health and mental wellbeing — particularly in young, developing men and women — to social cohesion across the globe where actual harm is stemming from these systems, it's critical that we address these consequences prior to the metaverse becoming mainstream," Chris says.
Constructing a new paradigm
When there are more safety regulations for a toaster in a kitchen than for a platform that touches a billion people, Chris says, it's time for a change. The best way to realise that change, he says, is to start at the source.
"A big part of the issue is that we are not framing the conversation around those who are responsible — the engineers and architects. The things that are causing harm are the products of architecture and engineering," he says.
"When you look at how we regulate other products of technology and other products of engineering, whether that's in aerospace, civil engineering, pharmaceuticals, etc., there are safety standards. There are inspections. We need to start scrutinising the technological constructions on the internet to ensure that there are regulatory frameworks in place to create a safer environment."
Digital transformation, bias and the 'desired reality'
Chris questions what happens when people begin personalising their experiences in the metaverse to suit their preferences. So, for example, racists could eliminate people of colour from their view. Or people could create a society where they walk down the street and no longer see homeless people; they no longer see significant societal problems.
"What happens to a society when we no longer fully understand what's happening around us, and the only people who do are those in charge of augmenting it?" he asks. "That's a really important question, and it's not far-fetched."
At Face(book) value
Despite the attention-grabbing algorithms of Facebook (now called Meta) that encourage one-sided views and fuel disinformation, marketers continue to pump large sums of money into its platform every year. But is it worth it? Is Facebook's data truly as valuable as marketers believe?
According to Chris, absolutely. "From a purely functional standpoint, yes. It's incredibly valuable data," he admits.
However, brands using that data to create personalised adverts isn't the problem. The problem, he says, is a bit more involved than that. "There's a difference between personalised advertising and creating an entire ecosystem using that logic," Chris points out.
"When you look at the news feed that Facebook and other social media platforms provide to users, it extends this logic that originated for advertising, showing content that is relevant to you.
"It's not just the basic things that make your ads more efficient, and also less annoying for the people receiving them. It extends to, 'You should only see things that engage you the most, full stop, in all information that you consume,' to the point where the only information that you consume is the thing that usually makes you really angry, because that's what's going to make you click on stuff. And that's different from marketing."
Don't trust a wolf in sheep's clothing
To businesses that want to transform and adapt to the brave new metaverse reality, Chris offers a word of advice: don't trust a wolf in sheep's clothing.
"The tool that you're using, that you love so much, is probably one of your biggest threats. While advertisers cringe at the idea of regulations that limit what they're allowed to do, in the long run, regulations might be beneficial for the viability of the industry," he says.
"Don't get dragged down by bad practices within an industry that is behaving badly. There is a substantial loss of trust in platforms like Facebook because it continuously doesn't listen to consumers. It doesn't respect consumers. Siding yourself with that industry could backfire in the long run."