Online Gaming and Online Safety – Where Is the Law Heading?

Jul 30, 2021

By Cris Pikes, CEO and Co-founder of Image Analyzer

Introduction

In the May issue of Esports and the Law, Jeffrey Neuburger and Frances P. Kelley of the law firm Proskauer Rose reported on a case brought before a California district court (Coffee v. Google LLC, No. 20-03901 (N.D. Cal. Feb. 10, 2021)), in which Google LLC successfully argued that, under Section 230 of the Communications Decency Act, it could not be held liable for mobile game players’ in-app purchases.

The court ruled that, even though Google takes a 30% commission on in-app purchases made within apps downloaded from the Google Play App Store, it was simply providing a platform. The court also stated that, under Section 230 of the Communications Decency Act, “even if a service provider knows that third parties are using such tools to create illegal content, the service provider’s failure to intervene is immunized.”

Online games companies with significant user bases in Europe and the United Kingdom face a very different regulatory environment.

Impending legislation in the form of the European Union’s Digital Services Act and the UK Online Safety Bill will place increased legal obligations on online platform operators to swiftly remove illegal content. The draft UK Bill, published on May 12, 2021, goes further, requiring online service providers to remove both illegal and ‘harmful’ online content, particularly where it poses a risk to children.

The EU Digital Services Act

The EU Digital Services Act aims to create a safer online environment for EU citizens by legally requiring online service providers to swiftly remove illegal content from their platforms.

Germany’s Network Enforcement Act (NetzDG) already requires social network providers with more than 2 million registered users in Germany to remove ‘manifestly unlawful’ content within 24 hours of receiving a complaint.

The EU Digital Services Act proposals, published on December 15, 2020, will bring many more organizations into scope to remove illegal content uploaded to their digital platforms. For example, LEGO recently announced its LEGO VIDIYO™ service, a partnership with Universal Music that enables children to make and upload their own music videos. Under the proposed legislation, LEGO will be legally responsible for moderating this user-generated content to ensure that adequate protections are provided to service users who reside in Europe.

As a provider of content moderation technology to multiple global organizations that may find themselves in scope for the new legislation, Image Analyzer commissioned lawyers at Bird & Bird who specialize in technology law to write two whitepapers that provide early insights into the legal obligations proposed in the EU DSA and the draft UK Online Safety Bill. The lawyers at Bird & Bird have scrutinized the proposals and present a digest of the requirements and penalties, to help organizations gain clarity on their future obligations and potential legal exposure under the new legal regimes in the EU and UK.

Territorial Scope

An important point to note is that the proposed legislation will apply to all online service providers with significant numbers of users in Europe, regardless of the providers’ place of establishment or residence. As Bird & Bird writes in the whitepaper, “The DSA has a broad extra-territorial scope irrespective of intermediaries’ places of establishment or residence. Based on a ‘substantial connection’ test, the law will apply to intermediaries based outside of Europe that provide their services within the EU’s internal market and have service users whose place of establishment or residence is within the EU.”

A “substantial connection” will be deemed to exist where a service provider has a significant number of users in one or more of the European Member States; where it targets services towards residents in one or more of the Member States, for example by using their language or currency; or where it has an establishment within the EU.
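
To make the disjunctive nature of the test concrete, the three criteria can be read as an ‘any one suffices’ rule. The sketch below is purely illustrative: the attribute names and the numeric threshold for a ‘significant number’ of users are assumptions, as the proposal does not define them in these terms.

```python
# Illustrative sketch of the DSA "substantial connection" test described above.
# Attribute names and the user threshold are assumptions, not DSA-defined terms.

from dataclasses import dataclass

@dataclass
class OnlineService:
    eu_user_count: int          # users established or resident in the EU
    targets_eu_residents: bool  # e.g. offers a Member State's language or currency
    has_eu_establishment: bool  # maintains an establishment within the EU

def has_substantial_connection(service: OnlineService,
                               significant_users: int = 1_000_000) -> bool:
    """Any one of the three criteria is sufficient to bring a provider into scope."""
    return (service.eu_user_count >= significant_users
            or service.targets_eu_residents
            or service.has_eu_establishment)

# Example: a non-EU game platform that prices in euros would be in scope,
# regardless of its user numbers.
print(has_substantial_connection(OnlineService(
    eu_user_count=50_000,
    targets_eu_residents=True,
    has_eu_establishment=False)))  # True
```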

Legal Representation in Europe

Under the proposals, all service providers must provide a publicly listed single point of contact for direct communication with the European Commission and authorities in the Member States.

Online service providers that do not have a physical establishment in Europe must appoint a legal representative in a Member State where their services are used and provide that representative’s name, address, and contact details to that Member State’s Digital Services Coordinator.

Bird & Bird writes in the whitepaper: “This legal representative can also function as a point of contact. It is important to note that this legal representative can be held liable for non-compliance with obligations under the DSA.”

Tiered Obligations

The EU DSA proposes a four-tiered approach, with organizations meeting incremental legal obligations according to the type and breadth of services they provide. Regardless of their tier, all organizations will have to comply promptly with valid takedown orders relating to illegal content on their platforms that are issued by a European Member State’s courts.

The four tiers comprise:

Very Large Platform Operators, whose services are actively used by at least 10% of the European population (around 45 million people), will have the highest level of legal obligation and will risk the maximum fines and penalties for non-compliance. The major social media platforms will be included in this tier.

Online Platforms, which provide hosting services that store and disseminate content to the public at users’ request. Bird & Bird cites online marketplaces, app stores, and content-sharing platforms as examples.

Hosting Services providers, such as cloud or web hosting services, will be required to enable third parties to notify them of alleged illegal content on their systems. Where a hosting service provider removes access to information provided by a service recipient, it will be legally obliged to provide the recipient with a statement of reasons (Article 15).

Intermediary Services, which act as a conduit for online information or provide a limited hosting or caching service for the transmission of information, will face the lowest level of obligations under the EU Digital Services Act. Bird & Bird cites internet service providers, network infrastructure providers, and domain name registrars as examples of organizations that will fall into this tier. The obligations follow the existing intermediary liability regime: the Intermediary Services obligations described in Chapter II of the draft Digital Services Act restate the core principles of the e-Commerce Directive, which include liability exemptions and a prohibition on Member States obliging intermediaries to introduce “general monitoring” of the information they store or transmit. However, Article 6 of the proposals provides a “safe harbor” that removes legal liability for Intermediary Services that take the initiative to remove illegal content.

Content Moderation

The EU DSA will require all service providers to include in their user Terms and Conditions clear information on how illegal content will be moderated. Terms and Conditions must set out policies and procedures and disclose the use of any tools, including algorithmic decision-making and automated and human content moderation. Online platforms will also be required to report on the accuracy of automated content moderation and the safeguards applied to the use of the technology.

All service providers must produce annual EU DSA reports showing the actions they have taken in response to Member State authorities, the number of orders received, and the categories of illegal content removed from their platforms, as well as details of any content moderation undertaken of their own volition.
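
As a rough illustration of the record-keeping such annual reports imply, the sketch below tallies moderation actions by source and category. The class, method, and category names are hypothetical assumptions for illustration, not terms defined in the DSA.

```python
# Illustrative tally for an annual DSA-style transparency report.
# Class, method, and category names are hypothetical assumptions.

from collections import Counter

class ModerationLedger:
    def __init__(self) -> None:
        self.orders_received = 0               # orders from Member State authorities
        self.removals_by_category = Counter()  # e.g. "terrorism", "CSEA"
        self.own_initiative_removals = 0       # actions taken of the platform's own volition

    def record_authority_order(self, category: str) -> None:
        """Log a removal carried out in response to a Member State order."""
        self.orders_received += 1
        self.removals_by_category[category] += 1

    def record_own_initiative(self, category: str) -> None:
        """Log a removal the platform undertook of its own volition."""
        self.own_initiative_removals += 1
        self.removals_by_category[category] += 1

    def annual_report(self) -> dict:
        """Summarize the year's figures in the shape the reporting duty implies."""
        return {
            "orders_received": self.orders_received,
            "removals_by_category": dict(self.removals_by_category),
            "own_initiative_removals": self.own_initiative_removals,
        }
```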

Trusted Flaggers

The EU DSA proposals include a facility for law enforcement agencies, non-governmental agencies, and other entities to become “Trusted Flaggers” that can alert online platforms to illegal content hosted on their services. Bird & Bird writes, “To become a trusted flagger, an entity will need to demonstrate: (a) it has the expertise and competence required to detect, identify and notify of illegal content; (b) it represents collective interests and is independent of any Online Platform; and (c) it carries out its activities for the purpose of submitting notices in a timely, diligent and objective manner. Online Platforms will be required to ensure that reports submitted by trusted flaggers are treated with priority and are decided upon without delay.”

Penalties and Sanctions

Under the proposals, the European Commission will be empowered to impose penalties for non-compliance of up to 6% of an organization’s global turnover in the preceding financial year; for example, a provider with €1 billion in annual global turnover could face a fine of up to €60 million. As a last resort, services can be suspended in affected European Member States.

Bird & Bird writes, “In cases where there is an urgency due to the risk of serious damage for the recipients of a service, the Commission may order interim measures. Digital Service Coordinators can also order the cessation of an infringement and impose proportionate remedies to bring the infringement to an effective end.”

When Will the EU DSA Become Law?

Bird & Bird writes in the whitepaper that a final consensus must be reached between the European Council representatives and the European Parliament on the proposed Digital Services Act before it is formally adopted by EU Member States. As a result, it is unlikely to pass into law before 2023, which gives organizations time to make the necessary preparations to comply.

UK Online Safety Bill

On May 12, 2021, the UK government published its draft Online Safety Bill, which will place a legal obligation on online platform operators and internet search services to swiftly remove illegal content, defined as “(i) terrorism content, (ii) CSEA content, (iii) priority illegal content, and (iv) other illegal content, taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service” [draft Online Safety Bill, Section 7, p. 16].

In contrast to the EU DSA proposals, the UK Online Safety Bill will also legally oblige online service providers to remove content that is not illegal but which poses a risk of harm to service users and the wider public, particularly children.

This latter requirement has drawn criticism that the ‘harms’ in question are insufficiently defined in the draft Bill.

The UK Parliament Commons Library briefing paper, Regulating Online Harms, offers some clarification, stating that content will be deemed harmful if a service provider has “reasonable grounds to believe that there is a material risk” of the dissemination of the content “having a significant adverse physical or psychological impact” on a child or adult of “ordinary sensibilities” [page 26].

In the context of Coffee v. Google LLC, this is important because Google argued that it was neither the creator of the video games nor of the Loot Boxes purchased within the mobile apps hosted on the Google Play App Store; it was merely acting as a passive host of the app platform and played no role in what users decided to buy within the apps.

Under the Online Safety Bill, Loot Boxes could be deemed to pose a risk of online harm. Indeed, in its guidance on the risks within online games, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) states that the risks to minors using online games include “grooming, cyber bullying, trolling, griefing, scams and in-game purchases.”

Territorial Scope

As with the EU DSA proposals, an online service provider will be in scope of the UK Online Safety Bill if the service is targeted towards and used by a significant number of people in the UK, or if the content presented on the service carries a “material risk of significant harm” to UK citizens.

Bird & Bird writes in the whitepaper, “Examples of services likely to be in scope include, but are not limited to social media services, consumer cloud storage sites, video sharing platforms, online forums, dating services, online instant messaging services, peer-to-peer services, video games which enable interaction with other users online, and online marketplaces.”

Tiered Obligations

The aim of the Bill is to legally oblige online service providers to help combat online harms by putting systems in place to risk assess, identify, and remove illegal and harmful content. The obligations placed on providers will be risk-based and proportionate to the reach of the service.

As with the EU DSA, the UK government’s draft Online Safety Bill includes proposals to categorize organizations based on the number of people using their online platforms.

‘Category 1’ organizations will include the largest social media sites, which will have greater obligations to moderate content uploaded by users.

‘Category 2’ companies are likely to include any interactive community platform that promotes the sharing of user-generated content, such as online gaming, dating, online travel agencies, private messaging, and retail sites.

With regard to the content moderation obligations faced by smaller online organizations that serve UK citizens, Bird & Bird writes, “The government has already called out terrorist content and child sexual abuse and exploitation content as illegal, and in-scope services are expected to take particularly stringent measures to address these, and further “priority illegal content” will be identified by the government in secondary legislation. For smaller companies with limited resources, these categories of harm should be tackled as an absolute bare minimum.”

Penalties and Sanctions

The draft Bill published in May confirms that the UK Office of Communications (Ofcom) will be appointed as the authority with the power to enforce the new law.

Failure to adequately moderate content could result in companies being fined up to 10% of their global turnover or having access to their services blocked in the UK. In a move that takes penalties further than the EU DSA proposals, the draft Bill also suggests that company executives could be held personally liable for online harms created by their services.

Conclusion

While the proposed EU and UK legislation may appear draconian, there are many business benefits to be gained from creating a safer online environment for service users, particularly where an online game is designed to be used by children. Failure to tackle harmful content risks making an online community toxic and driving users off the platform. Responsible content moderation, in compliance with the proposed EU DSA and UK Online Safety Bill, will help organizations maintain a positive online user experience and reduce legal exposure while protecting brand integrity and revenues. The EU DSA proposals lead with this aim, stating that the legal obligation to remove illegal content will provide digital service providers with “legal certainty, harmonisation of rules,” make it “easier to start-up and scale-up in Europe,” and give business users “a level playing field against providers of illegal content.”

References

Esports and the Law, May 12, 2021, https://esportsandthelaw.com/2021/05/12/mobile-app-platform-entitled-to-cda-immunity-over-state-law-claims-related-to-in-app-purchases-of-loot-boxes/

Jeffrey Neuburger, partner at Proskauer, https://newmedialaw.proskauer.com/author/jneuburger/

European Commission, The Digital Services Act: ensuring a safe and accountable online environment https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en

Draft UK Online Safety Bill https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/985033/Draft_Online_Safety_Bill_Bookmarked.pdf

UK Parliament, Commons Library, briefing paper, ‘Regulating Online Harms’, page 26, https://researchbriefings.files.parliament.uk/documents/CBP-8743/CBP-8743.pdf

UK Parliament, Commons Library, ‘Regulating Online Harms’, May 28, 2021, https://commonslibrary.parliament.uk/research-briefings/cbp-8743/

NSPCC, ‘Keeping children safe, online safety, online games’ https://www.nspcc.org.uk/keeping-children-safe/online-safety/online-games/

Childline, ‘Information and advice on online mobile safety and online gaming’ https://www.childline.org.uk/info-advice/bullying-abuse-safety/online-mobile-safety/online-gaming/

Bird & Bird whitepaper – EU Proposal for a Digital Services Act – Are You in Scope?

Bird & Bird whitepaper – The UK’s proposed Online Safety Bill – Are You in Scope?

Both whitepapers are available for download at www.image-analyzer.com.
