Virtual reality has the potential to enable amazing utopian futures, but it could also become one of the most intimate surveillance technologies ever created, enabling a Big Brother dystopia of political and economic control. But if privacy considerations are built into virtual reality technologies from the beginning, Accomplice investor Sarah Downey argues, then the metaverse could actually become one of the last bastions of privacy left in our lives.
Downey has a legal background in privacy and previously worked at a privacy start-up. She's currently investing in virtual reality technologies with privacy implications, such as Neurable, a brain-control interface company whose technology can detect user intent.
Downey believes that privacy is a fundamental requirement for freedom of expression. To realize the full potential of the First Amendment's guarantee of free speech in the United States, you need the Fourth Amendment's protection of a reasonable expectation of privacy. She makes the point that our digital footprints are starting to bleed into our real lives, and that this will lead to less authentic interactions in the real world.
The advertising business models of Google and Facebook rely upon creating a unified profile that connects your online behavior to your actual identity, producing a persistent digital footprint that follows you wherever you go. As search histories are subpoenaed in child custody cases and social media posts get people fired, an online environment has emerged where shared content is treated as public and permanent, not ephemeral.
This has a chilling effect that creates what Downey calls a "fraudulent shell that limits authenticity." The erosion of truly private contexts has created a stale and boring environment that has limited her authentic expression on sites like Facebook, and she warns that our unified digital footprints will start to spread into the real world as augmented reality technologies with facial recognition proliferate. As we start to lose the feeling of anonymity in public spaces, we'll all be living out "Nosedive," the first episode of season three of Black Mirror, where every human interaction is rated on a five-star scale.
Downey also argues that the Fourth Amendment is based upon a culturally reasonable expectation of privacy, which means that our cultural use of mobile and web technologies has had a very real legal effect on our constitutional rights. The law's subjective interpretation is constantly evolving as we use technology to share the more intimate parts of our lives. If we feel confident enough to share something with a third-party company, then it's not really legally private and can be subpoenaed and used in a court of law.
There are different classes of private information, and so companies like Google and Facebook are able to collect massive behavioral histories of individuals as long as they don't share access to personally identifiable information. They can provide anonymized and aggregated behavioral information to their advertising customers, which enables a business model based upon detailed surveillance of all of our online behavior.
Google recently, and quietly, created an opt-in that links all of your historical web browsing data from DoubleClick to your personally identifiable account, and Facebook buys external financial and mortgage data to tie into your web browsing and social media interactions.
As of right now, none of the information gathered by virtual reality technologies has been definitively classified as "personally identifiable information," which allows VR hardware companies and application developers to capture and store whatever they like. But once eye-tracking technologies arrive with more sophisticated facial detection, or one day brain-control interfaces, VR technology will be able to capture and store deeply intimate data: facial expressions, eye movements, eye gaze, gait, hand and head movements, engagement, speech patterns, emotional states and, from EEG, brainwaves, attention, interest, intent, and perhaps eventually even our thoughts. The rules we make today about how basic information like head and hand movements in VR can be used will set a precedent ahead of these more sophisticated and precise tracking technologies.
There are existing biometric identifiers, information gathered from your body that can personally identify you, including facial features, fingerprints, hand geometry, retina, iris, gait, signature, vein patterns, DNA, voice, and typing rhythm. Right now your gait, your voice, or the retina or iris captured by an eye-tracking camera could prove to be personally identifiable biometric data. It's also likely that a combination of other factors, like your body, hand, and head movements taken together, may form a unique kinematic fingerprint that could personally identify you with the proper machine learning algorithm. This could mean that data being stored anonymously today could eventually be aggregated to personally identify you, a special class of PII that requires special legal protections.
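To make the kinematic-fingerprint risk concrete, here is a minimal pure-Python sketch showing how even two coarse movement statistics (average head height and average per-frame movement speed) could re-identify an "anonymous" VR session by matching it to the nearest enrolled profile. All names, numbers, and the two-feature model are invented for illustration; real re-identification research uses far richer features and real classifiers.

```python
import random
import statistics

random.seed(42)

def simulate_session(height_m, speed_mps, frames=500):
    """Generate noisy per-frame (head_height, movement_speed) samples."""
    return [(random.gauss(height_m, 0.02), random.gauss(speed_mps, 0.05))
            for _ in range(frames)]

def features(session):
    """Reduce a session to a tiny feature vector: mean height, mean speed."""
    heights = [h for h, _ in session]
    speeds = [s for _, s in session]
    return (statistics.mean(heights), statistics.mean(speeds))

def nearest_user(profile, enrolled):
    """Match a session's features to the closest enrolled profile."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(enrolled, key=lambda name: dist(profile, enrolled[name]))

# Enroll two (hypothetical) users from one recorded session each.
enrolled = {
    "alice": features(simulate_session(1.62, 0.30)),
    "bob": features(simulate_session(1.80, 0.55)),
}

# A fresh session stored with no name attached is still re-identified.
anonymous = features(simulate_session(1.62, 0.30))
print(nearest_user(anonymous, enrolled))  # alice
```

The point of the sketch is that "anonymized" movement logs stay linkable: anyone holding both the logs and a labeled sample can re-attach identities after the fact.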
OpenBCI co-founder Conor Russomanno told me that EEG brainwave data may turn out to have a unique fingerprint that can never be fully anonymized and could potentially be traced back to individuals. What are the implications of storing massive troves of physical data gathered from VR headsets and hand-tracked controllers if that data turns out to be personally identifiable? Downey suggests that the best answer from a privacy perspective is to not record and store the information in the first place.
There is a set of self-regulatory principles for online behavioral advertising that companies have collectively agreed to follow, which supports the Federal Trade Commission's oversight of how companies protect individuals' privacy. But up to this point, none of the major virtual reality companies has taken a proactive approach to educating users, being transparent, and providing consumer controls to opt out of what may be recorded and stored from a VR system.
The site VRHeads did a great comparison of the privacy policies of different VR companies, pointing out some of the commonalities and differences. They also flagged Oculus' privacy policy as concerning, saying, "The company states that all of that information is necessary to help make your game experience more immersive; they also use the data to make improvements on future games. But permanently storing that data, and then sharing it? That's a bit invasive."
Oculus made this statement about privacy in response to an UploadVR report from April 2016:
Lastly, Facebook owns Oculus and helps run some Oculus services, such as elements of our infrastructure, but we’re not sharing information with Facebook at this time. We don’t have advertising yet and Facebook is not using Oculus data for advertising – though these are things we may consider in the future.
Just because Oculus hadn't shared information with Facebook as of early 2016 doesn't mean that they won't, or that they don't plan to in the near or far future. In fact, it's likely that they will; otherwise they wouldn't have included the legal language allowing it.
The boundaries of independence between Oculus and Facebook have been fading lately. Facebook has been taking a more and more active role in running Oculus, as shown by the Oculus logo now mentioning Facebook, CEO Brendan Iribe recently stepping down, and Mark Zuckerberg giving a much more in-depth demo about the future of VR and Facebook at the recent Oculus Connect 3.
As these online profiles start to merge into our real world through augmented reality technologies, our sense of privacy could be vastly reduced. That's why Downey is optimistic that a virtual reality metaverse could become one of the last bastions of privacy we have, if VR technologies are architected with privacy in mind.
Downey encourages VR application and hardware developers to minimize data collection and to retain as little data as possible. She also suggests not personally identifying people, and using decentralized payment options like Bitcoin or other cryptocurrencies so as not to tie information back to a single identity. Finally, she recommends avoiding social sign-ins so that people's actions aren't tied back to a persistent identity that's permanently stored and shared forever.
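The "collect and keep as little as possible" advice can be sketched in code. In this hypothetical example, instead of uploading raw per-frame head poses, an app reduces a session to one deliberately coarse aggregate and discards the raw samples. The function and field names are invented for illustration, not drawn from any real VR SDK.

```python
import statistics

def summarize_and_discard(raw_head_heights, bucket=0.1):
    """Return a coarse session summary; the raw samples are never stored.

    Rounding into 10 cm buckets keeps the value useful for ergonomics
    tuning while making it too coarse to serve as a kinematic fingerprint.
    """
    mean = statistics.mean(raw_head_heights)
    coarse = round(mean / bucket) * bucket
    return {"avg_head_height_bucket": round(coarse, 1),
            "frame_count": len(raw_head_heights)}

raw = [1.621, 1.618, 1.625, 1.619]   # per-frame head heights (meters)
summary = summarize_and_discard(raw)
raw = None                           # raw data is not retained
print(summary)  # {'avg_head_height_bucket': 1.6, 'frame_count': 4}
```

The design choice here is that minimization happens on-device before anything leaves the headset, so there is no intimate data to subpoena, breach, or re-identify later.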
Virtual reality technologies are going to face increased scrutiny from policymakers in 2017, and there has already been a Senate Commerce Committee hearing on augmented reality in November 2016.
Some of the open questions that should be asked of virtual reality hardware and software developers are:
• What information is being tracked, recorded, and permanently stored from VR technologies?
• Is this information being stored with the legal protections of personally identifiable information?
• What is the potential for some of this anonymized physical data to end up being personally identifiable via machine learning?
• Why haven’t Privacy Policies been updated to reflect what VR data is being tracked and stored? If nothing is being tracked, then are they willing to make explicit statements saying that certain information will not be tracked and stored?
• What controls will be made available for users to opt-out of being tracked?
• What will be the safeguards in place to prevent the use of eye tracking cameras to personally identify people with biometric retina or iris scans?
• Are any of our voice conversations being recorded for social VR interactions?
• Can VR companies ensure that there are private contexts in virtual reality where we are not being tracked and recorded? Or is recording everything the default?
• What kind of safeguards can be imposed to limit the tying of our virtual actions to our actual identities in order to preserve our Fourth Amendment rights?
• How are VR application developers going to be educated about, and held accountable for, their responsibilities regarding the types of sensitive personally identifiable information that could be recorded and stored within their experiences?
The technological trend over the last ten to twenty years has been that our behaviors with technology have been weakening our Fourth Amendment protection of a reasonable expectation of privacy. As VR and AR companies record and store more and more of the intimate data we provide, are we yielding more of our right to a reasonable expectation of privacy? If we completely erode our right to privacy, it will have serious implications for our First Amendment right to free speech.
As virtual reality consumers, we should be demanding that VR companies do not record and store this information, in order to protect us from overreaching governments or hostile state actors who could capture this information and use it against us.
In order to have freedom of expression in an authentic way, we need to have a container of privacy. Otherwise, we'll be moving toward the dystopian futures envisioned by Black Mirror, where our digital footprints bleed over into our real lives and constrain all of our social and economic interactions.
Is VR going to be the most powerful surveillance technology ever created, or the last bastion of privacy? It's up to us to decide. We need to raise these privacy challenges with VR companies now, before pervasive tracking becomes ingrained in our expectations.
Via: Road To VR