
Virtual reality telemetry means you can virtually kiss goodbye to privacy

Exclusive: Virtual reality presently looks like it will offer virtually no privacy for those looking to hide online.

That forecast comes not just from knowing that the leading cash-burner in the race to establish a metaverse is Meta. It also follows from academic studies enumerating the more than two dozen private data attributes available for harvesting from players in corporate VR panopticons.

And now that lack of privacy looks even more certain in light of the latest research from a group of computer scientists with ties to UC Berkeley, RWTH Aachen University in Germany, and Unanimous AI.

In a paper provided to The Register in advance of its publication on arXiv, academics Vivek Nair, Wenbo Guo, Justus Mattern, Rui Wang, James O’Brien, Louis Rosenberg, and Dawn Song set out to test the extent to which individuals in VR environments can be identified by body movement data.

The boffins gathered telemetry data from more than 55,000 people who played Beat Saber, a VR rhythm game in which players wave hand controllers to music. They then digested 3.96TB of data from the game leaderboard service BeatLeader, consisting of 2,669,886 game replays from 55,541 users across 713,013 separate play sessions.

These Beat Saber Open Replay (BSOR) files contained metadata (devices and game settings), telemetry (measurements of the position and orientation of players’ hands, head, and so on), context info (type, location, and timing of in-game stimuli), and performance stats (responses to in-game stimuli).
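To make that concrete, a single frame of such telemetry might be modeled as below. This is a rough illustration only: the field names and layout are our assumptions, not the actual BSOR format.

```python
# Illustrative sketch of one VR telemetry frame; field names are
# assumptions, not the real Beat Saber Open Replay (BSOR) layout.
from dataclasses import dataclass

@dataclass
class Pose:
    # Position in metres plus orientation as a quaternion.
    x: float
    y: float
    z: float
    qx: float
    qy: float
    qz: float
    qw: float

@dataclass
class TelemetryFrame:
    time: float        # seconds since the session started
    head: Pose         # headset pose
    left_hand: Pose    # left controller pose
    right_hand: Pose   # right controller pose
```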

From this, the researchers focused on the data derived from the head and hand movements of Beat Saber players. Just five minutes of data from those three tracked points proved enough to train a classification model that, given 100 seconds of motion data from the game, could uniquely identify the player 94 percent of the time. And with just 10 seconds of motion data, the classification model still managed accuracy of 73 percent.
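The Register hasn't seen the team's code, but the general recipe is familiar machine-learning fare: summarize short windows of head and hand motion as feature vectors, then train a classifier to predict which user produced each window. The toy sketch below uses made-up data and a stock scikit-learn classifier standing in for whatever the researchers actually used; it demonstrates the shape of the approach, not their pipeline.

```python
# Toy sketch of motion-based user identification. Synthetic data and a
# stock classifier stand in for the paper's actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def featurize(window: np.ndarray) -> np.ndarray:
    """window: (frames, 21) array of head/hand position+orientation values
    (3 tracked points x 7 values each). Returns per-channel mean, std,
    and mean absolute frame-to-frame delta."""
    deltas = np.abs(np.diff(window, axis=0)).mean(axis=0)
    return np.concatenate([window.mean(axis=0), window.std(axis=0), deltas])

# Synthetic stand-in data: 50 "users", 40 ten-second windows each at 30 fps.
rng = np.random.default_rng(0)
users, windows, frames, channels = 50, 40, 300, 21
X = np.array([
    featurize(rng.normal(loc=u, scale=1.0, size=(frames, channels)))
    for u in range(users) for _ in range(windows)
])
y = np.repeat(np.arange(users), windows)  # one label per window, in order

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"identification accuracy: {clf.score(X_te, y_te):.2%}")
```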

“The study demonstrates that over 55k ‘anonymous’ VR users can be de-anonymized back to the exact individual just by watching their head and hand movements for a few seconds,” said Vivek Nair, a UC Berkeley doctoral student and one of the authors of the paper, in an email to The Register.

“We have known for a long time that motion reveals information about people, but what this study newly shows is that movement patterns are so unique to an individual that they could serve as an identifying biometric, on par with facial or fingerprint recognition. This really changes how we think about the notion of ‘privacy’ in the metaverse, as just by moving around in VR, you might as well be broadcasting your face or fingerprints at all times!”

Asked whether this technique could be generalized to be useful outside of VR environments, Nair said he expects you’d be able to learn more about people from general motion tracking in the real world because more information can be observed.

“There have been papers as early as the 1970s which showed that individuals can identify the motion of their friends,” said Nair. “A 2000 paper from Berkeley even showed that with motion capture data, you can recreate a model of a person’s entire skeleton.”

“What hasn’t been shown, until now, is that the motion of just three tracked points in VR (head and hands) is enough to identify users on a huge (and maybe even global) scale. It’s likely true that you can identify and profile users with even greater accuracy outside of VR when more tracked objects are available, such as with full-body tracking that some 3D cameras are able to do.”

Nair expressed skepticism at the possibility that privacy laws might be crafted to restrict the collection of all identifiable information. “For example, everyone uses slightly different vocabulary and structure when writing, and this is enough to identify an individual,” he explained. “On that basis, should we restrict the collection of all written text in general?”

Nair said he remains optimistic about the potential of systems like MetaGuard – a VR incognito mode project he and colleagues have been working on – to address privacy threats by altering VR in a privacy-preserving way rather than trying to prevent data collection.

The paper suggests similar data defense tactics: “We hope to see future works which intelligently corrupt VR replays to obscure identifiable properties without impeding their original purpose (e.g., scoring or cheating detection).”
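What might "intelligently corrupting" a replay look like? One simple move, loosely in the spirit of MetaGuard's approach, is to shift an identifying attribute such as head height by a single random offset per session, so relative motion, and therefore scoring, survives while the absolute value is masked. The sketch below is our illustration, not code from either project.

```python
# Toy privacy defense: mask a player's true height with one per-session
# Laplace-distributed offset. Scale and design are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def mask_height(head_y: np.ndarray, scale: float = 0.05) -> np.ndarray:
    """Shift all head-height samples by one fixed random offset: absolute
    player height is hidden, but relative motion within the replay survives."""
    offset = rng.laplace(loc=0.0, scale=scale)
    return head_y + offset

replay_heights = np.array([1.71, 1.72, 1.70, 1.69, 1.71])  # metres, made up
print(mask_height(replay_heights))
```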

One reason to prefer data alteration over data denial is that there may be VR applications (e.g., motion-based medical diagnostics) that justify further investment in the technology, as opposed to propping up pretend worlds just for the sake of privacy pillaging.

“Perhaps we’ll see more sophisticated versions of privacy-preserving approaches in VR develop over time,” said Nair, punctuating his message with a smiley face emoticon as a sign of optimism.

It’s either that or storing VR headsets in the trash bin. ®
