
Privacy in the Metaverse | Wilson Sonsini


Coined in Neal Stephenson’s 1992 bestselling novel Snow Crash, the term “metaverse” has recently reentered the public lexicon to describe a technology hailed by some as the successor to the mobile internet and the next step in humanity’s technological evolution. Although there is no consensus on the precise contours of the definition, the metaverse has generally been described as an embodied internet where, instead of passively viewing content in two-dimensional space, users are in the content and experience it with others.

The recent hype and commercial promise of this more immersive digital experience has led businesses at all stages to consider a metaverse strategy, from early-stage startups offering metaverse fashion items to mature financial institutions buying virtual land to open metaverse-based bank branches. And yet, in any vision of the metaverse, real-world privacy concerns are magnified, as input/output (I/O) devices can capture qualitatively new and more intimate data. For example, virtual reality headsets could take advantage of built-in sensors to capture facial movements, enabling deeply personal inferences about individuals, such as their medical conditions or their emotions. Here are some key privacy considerations for companies considering venturing into the metaverse:

  • Design your offering with user privacy in mind. Although there is no comprehensive federal privacy law with clear rules on the collection, use, and sharing of personal data, several states have enacted privacy laws that apply to personal information reasonably linked to a consumer or a device. But regardless of the legal framework, companies venturing into the metaverse have another reason to consider privacy: it is an important part of earning consumer trust. Studies have shown that consumers who are confident a company will use their data in ways that benefit them are more willing to share that data. To that end, companies should build data privacy and security into their products and services from the start. This means understanding what personal data they need, collecting that data only when they have a business need for it, disposing of it when that need no longer exists, and securing the personal data in their possession. Some state laws codify these requirements.
  • Have a compliance strategy to implement consumer data rights. State data privacy laws such as the California Consumer Privacy Act, the Virginia Consumer Data Protection Act, and the Colorado Privacy Act provide consumers, under certain circumstances, with the right to access, correct, or delete their personal data. Many metaverse evangelists predict that blockchain technologies will play a significant role in the future of the technology; the immutability of the blockchain can in some cases complicate compliance with a consumer’s right to deletion. (See a related Wilson Sonsini article addressing the potential application of consumer data rights to NFTs.) Companies operating in the metaverse must have processes in place to comply with such requests, as the large amount of consumer data available in the metaverse may increase consumers’ interest in exercising these rights.
  • Comply with biometric privacy laws. The new I/O devices that allow users to enter the metaverse are capable of collecting biometric data, from conscious physical movements to eye movements to emotional responses. Several states have passed laws to protect this data. For example, the Illinois Biometric Information Privacy Act (BIPA) requires private entities using biometric information to have a written, public policy setting out a retention schedule and guidelines for the permanent destruction of that information. BIPA also imposes other obligations on private entities that collect biometric information, such as requiring prior notice and consent before collection. BIPA provides a private right of action, and the penalties for violating its provisions are severe: in 2021, for example, Facebook settled a multi-year BIPA lawsuit over its photo-tagging feature for $650 million.
  • Be especially careful if your offering appeals to children. Policymakers have stepped up efforts to protect children from the perceived harms of technology. President Biden specifically pointed to protecting children from online advertising and the pernicious effects of social media in his State of the Union address, and Senator Edward Markey (D-MA), along with Congresswomen Kathy Castor (D-FL) and Lori Trahan (D-MA), recently sent a letter to Federal Trade Commission (FTC) Chair Lina Khan encouraging the FTC to monitor the growing use of virtual reality by children and to exercise its authority under the Children’s Online Privacy Protection Act (COPPA) and the FTC Act “to protect children in the metaverse.” As that letter noted, two-thirds of parents with virtual reality (VR) devices report that their children asked them to buy the device, and about three-quarters of children aged 8 to 15 who responded to a 2017 survey expressed an interest in virtual reality. Companies whose metaverse offerings appeal to minors, including children under 13, will in many circumstances be required to offer COPPA-compliant experiences, which may include obtaining parental consent before collecting, using, or disclosing a child’s personal information, or limiting the types of personal information collected and how that information is used. Failure to do so could result in regulatory action and substantial fines.

As you develop your metaverse offerings, if you have any questions about privacy, please contact Wilson Sonsini attorneys Dan Chase, Maneesha Mithal, Tracy Shapiro, or Libby Weingarten, or another member of the firm’s privacy and cybersecurity practice.