Review Privacy and Purchase Policies to Avoid Epic Failure

David Hoppe

Epic Games’ half-billion-dollar lesson in children’s online privacy protection and the perils of retail trickery should serve as an early-warning system for other platforms considering ways to boost microtransactions and leverage user data.

With Chairwoman Lina Khan’s Federal Trade Commission taking an increasingly activist approach to consumer protection, video game developers, metaverse architects, extended reality producers, and other digital media stakeholders would do well to ensure their media platforms comply with the strictest global regulations: the Children’s Online Privacy Protection Act (COPPA), the California Consumer Privacy Act (CCPA), Europe’s General Data Protection Regulation (GDPR), and more.

Epic will pay $520 million, the largest penalty the FTC has ever recovered for a rule violation, to settle two complaints about its popular Fortnite franchise. The FTC found that Fortnite violated the privacy of its under-13 players, alleging that Epic collected young players’ personal information without obtaining parental consent. COPPA rules require companies to give parents clear notice of the data they collect and how they use it, and to obtain verifiable parental consent before collecting it. The FTC also found that certain of the game’s text and voice communication features, which were enabled by default, exposed juveniles to online bullying, harassment, and conversations with adult players about self-harm and sexual activity.

The second part of the fine stems from the FTC’s finding that Fortnite includes built-in “dark patterns” designed to trick players into making unintended purchases. These dark patterns, comprising “counterintuitive, inconsistent, and confusing button configurations,” says the FTC, induced players to place orders either by accident or without fully understanding the charges that would apply. In some instances, orders were placed when players simply pressed random buttons while trying to wake the game from sleep mode or to preview a product. Dark patterns are common in many retail settings, where they may lead confused consumers to sign up for monthly subscription services, buy insurance for a newly purchased laptop, download unwanted software, or purchase skins, weapons, resources, and other gaming assets. Fortnite is also accused of “stealth advertising,” presenting promotional material as news content to encourage purchases.

Websites, games, and brands that count on minors for part of their revenue must understand that marketing to children comes with additional responsibilities. This case drives home that point in several ways:

  1. It gives fuel to the FTC and other US and European regulators to justify their more aggressive regulation of games and online content, especially products and services that could take advantage of vulnerable populations such as children, compulsive gamblers, and the elderly. Lina Khan’s FTC has taken on the biggest names in the industry recently, notably opposing Microsoft’s acquisition of Activision Blizzard. The UK’s Online Safety Bill imposes requirements on sites and search engines to rigorously evaluate user-generated content and delete anything that is illegal or injurious.

  2. Similarly, the case serves notice to these services that data security and privacy safeguards must be robust. Here, Epic’s text and voice communication system within the game made it easy for children to be bullied, harassed, and exposed to inappropriate discussions about sexuality, suicide, drug abuse, and other harmful behaviors. The voice and chat feature was turned on by default, and there was no filter to prevent adults from playing alongside and communicating with underage players. A recently passed California law, the California Age-Appropriate Design Code Act, addresses this problem by requiring companies to set children’s accounts to the highest privacy settings by default starting in 2024.

  3. It illustrates that online games and host sites must be more diligent in investigating consumer complaints. In this case, Epic received more than one million complaints from users, as well as concerns from its own employees, that accounts were being wrongfully charged. Instead of issuing quick refunds, the company seems to have purposely made it more difficult to apply for refunds, locked the accounts of people who complained, and threatened to ban players who disputed future charges.

Mobile apps, gaming platforms, social media sites, and media companies should re-evaluate their revenue strategies in light of the Epic case to ensure not only that their own policies are audience- and age-appropriate but also that their partners, users, and hosts collect and use data responsibly. It’s a big ask, Jamie Gutfreund, Chief Marketing Officer of the influencer marketing agency Whalar, told Campaign US.

“Marketing to children requires specialized expertise, which is unlikely to exist as an internal capability. Brands must partner with dedicated experts…to manage the requirements,” she said.

Epic has agreed to pay the fine in an effort to position itself “at the forefront of consumer protection and provide the best experience for our players. The video game industry is a place of fast-moving innovation, where player expectations are high and new ideas are paramount. Statutes written decades ago don’t specify how gaming ecosystems should operate. The laws have not changed, but their application has evolved, and long-standing industry practices are no longer enough.”

This evolution, together with the inapplicability of physical-world regulations to the digital space, underscores the need for emerging technology companies to engage an experienced video game and digital media lawyer to review their online privacy policies, user agreements, contracts, and data security measures. Because the internet is international, one tactic is to comply with the strictest requirements and adopt marketing plans that do not require mass customization based on individual consumers’ data-derived characteristics. As Epic noted in its announcement of its FTC settlement, “While game developers may be familiar with COPPA, they may not be aware of its global application. COPPA is just one of the many regulations addressing children’s privacy around the world, which are expanding to include teens. This means game developers should expand youth privacy protections to include players under 18.”

An experienced law firm can assist businesses in any sector that collects, stores, processes, sells, or uses customer and employee data so as to minimize legal risk and potential damages. Video game developers, metaverse platforms, extended reality producers, and other entertainment and communications companies are among the most likely to leverage this data. And, as the FTC and regulators around the world confirm every day, they are often under the microscope when it comes to microtransactions, loot boxes, data security, and monetization.

Gamma Law is a San Francisco-based firm supporting select clients in cutting-edge business sectors. We provide our clients with the support required to succeed in complex and dynamic business environments, to push the boundaries of innovation, and to achieve their business objectives, both in the U.S. and internationally. Contact us today to discuss your business needs. 
