If there’s one thing we’ve learned about new modes of communication over the past century, it’s that when technology draws people’s eyes and ears, advertisers won’t be far behind.
That has been true of radio, cinema, television, the internet, and social media, and it seems unlikely to be any different with the so-called metaverse – the next fully realized, shared reality that firms like Meta propose to build.
A number of businesses have already dipped their toes into gaming metaverses, organizing virtual fashion shows and releasing exclusive collections in-game, which might be a harbinger of things to come.
To name a few, luxury fashion brands such as Louis Vuitton, Valentino, and Marc Jacobs have all produced digital products for the social simulation game Animal Crossing, while Balenciaga has collaborated with Fortnite on an exclusive drop of wearable skins for in-game characters.
However, now that Meta, a targeted-advertising behemoth, has staked its claim to the metaverse, several experts are voicing concerns about the implications immersive advertising will have for user privacy, safety, and consent.
“When you think about advertising in XR, you should think about it as placement in the product instead of product placement,” said Brittan Heller, counsel with American law firm Foley Hoag.
“The way that advertising works in these contexts is a little different because you seek out the experiences. You like the experiences,” she added.
“An ad in virtual reality may look like buying a designer jacket for your digital avatar [but] that’s an ad for a clothing company that you are wearing on your body”.
“It may look like buying a game that puts you into Jurassic Park – [but] what better way to advertise the movie franchise than to actually put you in the experience of being in Jurassic Park?”
What exactly is biometric psychography?
According to Heller, the difficulty is that in the metaverse, the capacity to harvest biometric data and exploit that sensitive data to target ads at you goes far beyond the already significant amount of data Facebook uses to build our consumer profiles.
If the technology that Meta is proposing becomes a reality, a form of targeted advertising that tracks involuntary bodily responses could flourish.
Heller believes that for VR headsets to work in this setting, they must be able to track your eyes and pupils.
This means that ads could be tailored to what draws or holds your visual attention, as well as to how you physically react to it.
Heller has coined the term “biometric psychography” to describe the combination of one’s biometric information with targeted advertising.
If a company had access to biometric data such as pupil dilation, skin moisture, EKG readings, or heart rate – biological signals that occur involuntarily in response to stimuli – and merged it with existing targeted-advertising datasets, it would be akin to reading your thoughts, Heller said.
“The type of information you can get from somebody’s pupil dilation, for example – that can tell you whether or not somebody is telling the truth. It can tell you whether or not somebody is sexually attracted to the person that they’re seeing,” she added.
“We’re rapidly moving into a space where your intentions and your thoughts are substantial data sets that have technological importance in a way that they didn’t before”.
“The risk that I think we’ve learnt from Cambridge Analytica is that privacy risks come into play when you have the combination of unanticipated data sets, especially when you’re looking at emerging technology”.
Controlling the metaverse
Heller says that biometric rules in the United States are insufficient to safeguard users from the use or misuse of this type of data because “biometrics laws in the United States are defined by protecting your identity, not protecting your thoughts or impulses.”
The worry with the metaverse is that the pace of technological development will outstrip institutions’ ability to govern it effectively, as has arguably happened with social media platforms.
Given that the organizations intending to construct the metaverse are international and operate across borders, Heller argues that a “human rights-based approach” is the most effective way to address user-protection issues.
“There are many stakeholders in this, there’s civil society, there are public groups, there are governments and then there are intergovernmental organisations as well,” she said.
“A human rights approach has been the way that we’ve been able to bring all of these players and their concerns together and make sure that everybody is heard”.
But what can businesses do to keep people safe in the metaverse?
If tech companies are serious about protecting people’s digital rights in immersive environments, they must be transparent about the technology they are building.
“I would want companies to be more transparent with the functionality of their technologies, not just their intentions and their business plans, but how this will work,” Heller added.
“That will help lawmakers ask the questions that they need to protect the public and to cooperate with each other for trans border technology”.
Image Credit: Getty