The video game industry is expected to generate over 200 billion USD in 2023. While this figure includes the sale of home consoles and physical video games, the largest growth is in mobile gaming. There are an estimated 2.7 billion gamers worldwide, of whom 2.6 billion play on mobile devices such as smartphones and tablets.[1] Mobile gaming platforms allow individuals to play without a gaming console or personal computer, and mobile devices are the most popular gaming platforms among children.[2] The availability of affordable mobile devices, together with expanded internet capabilities, has fueled the rise of the online gaming industry. The COVID-19 pandemic has further contributed to this proliferation in the global digital gaming market.

As outlined in a report published by the United Nations Children’s Fund (UNICEF) concerning children’s rights and online gaming, this increased accessibility has “revolutionized the industry and opened doors to a new generation of gamers – changing the way they communicate and interact with other gamers and as spectators, how they buy and play games, and how the games they play interact with other digital services.”[3] The report defines online gaming as “playing any type of single- or multiplayer commercial digital game via any Internet-connected device, including dedicated consoles, desktop computers, laptops, tablets and mobile phones.”[4] Children, a target demographic for online gaming, are increasingly engaging with digital devices, which have become a part of everyday life.

Online gaming not only provides entertainment, but also allows children to engage in a shared activity and can foster collaboration and the development of learning skills such as strategizing and problem-solving. These potential positive effects must be balanced against the risks inherent in the participation of children in online gaming. Companies that develop and market games to children must understand the potential impact on the rights of children and establish policies and procedures to best support and respect those rights.[5] In this regard, a key area for consideration is consent and the collection of the personal data of children.

As set out in Article 1 of the United Nations Convention on the Rights of the Child, a child is any person under the age of 18 unless otherwise stated under the law applicable to the child.[6] Regardless of regulatory regimes, as part of its guidelines concerning the online gaming industry, UNICEF recommends that companies provide special consideration and protection to all persons under the age of 18 in line with international norms and standards. When it comes to the collection, sharing, or reselling of the personal information of children, many regulatory regimes specify that these activities must not be undertaken unless specific and valid consent has been obtained.[7] Additionally, as current age verification processes can be ineffective, companies must also consider that children may be playing games targeted to adults.[8] As such, companies that produce and market games intended for adult-only audiences must consider the implications for the rights of the children who may be playing those games.

In terms of the use of player data that has been collected, UNICEF recommends that gaming companies consider the nature of the data and its intended use. Data collected to improve the gaming experience is generally less harmful than data collected for targeted advertising or onward sale. Companies should proactively inform users about the types of personal data they collect and how they will be used, so that users understand the implications and fully informed consent can be provided.[9] To that end, companies should provide granular options for consent and should not bundle consent for essential and non-essential data collection together.[10] Obtaining meaningful consent is key to ensuring that the rights of children are protected in respect of data collection.
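The principle of unbundled consent can be illustrated with a short sketch. The purpose labels and Python structure below are hypothetical, not drawn from any cited guidance; the point is simply that each processing purpose is recorded and checked individually, so agreeing to essential gameplay data never implies agreeing to advertising or onward sharing.

```python
from dataclasses import dataclass, field

# Hypothetical purpose labels for the example; real categories would
# come from a company's own data inventory.
ESSENTIAL = {"gameplay"}
OPTIONAL = {"analytics", "targeted_advertising", "data_sharing"}

@dataclass
class ConsentRecord:
    """Records each consented purpose separately, so consent is never bundled."""
    granted: set = field(default_factory=set)

    def grant(self, purpose: str) -> None:
        if purpose not in ESSENTIAL | OPTIONAL:
            raise ValueError(f"Unknown purpose: {purpose}")
        self.granted.add(purpose)

    def may_process(self, purpose: str) -> bool:
        # Every purpose, essential or not, requires its own recorded opt-in.
        return purpose in self.granted

consent = ConsentRecord()
consent.grant("gameplay")
assert consent.may_process("gameplay")
assert not consent.may_process("targeted_advertising")
```

The design choice here is that consenting to one purpose grants nothing else; a bundled model, by contrast, would treat a single checkbox as consent to every purpose at once.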

Under the European Union’s (EU) General Data Protection Regulation (GDPR), children merit special protection in relation to the collection and use of their personal data.[11] This additional protection is provided because children can be expected to be less aware of their rights and of the risks inherent in sharing personal data online. The European Commission recommends that “any information addressed specifically to a child should be adapted to be easily accessible, using clear and plain language.”[12] The consent of a parent or guardian is required in order to process the personal data of a child below the age of digital consent. This age threshold is set by each EU member state and ranges between 13 and 16 years.[13] A company must make a reasonable effort to ensure that consent is provided in line with the law, and this includes the implementation of age verification measures.
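A minimal sketch of how a service might apply these varying thresholds follows. It assumes the GDPR default age of digital consent of 16 where no national rule is listed; the country entries are a partial, illustrative sample and would need to be verified against current national law before any real use.

```python
# Illustrative sketch only. Under the GDPR, the default age of digital
# consent is 16, and member states may set a lower age, but not below 13.
# The entries below are a partial sample for the example, not a complete
# or current list of national thresholds.
GDPR_DEFAULT_AGE = 16
MEMBER_STATE_AGE = {
    "FR": 15,  # France
    "DE": 16,  # Germany
    "SE": 13,  # Sweden
}

def parental_consent_required(age: int, country_code: str) -> bool:
    """Return True if processing this child's data needs parental consent."""
    threshold = MEMBER_STATE_AGE.get(country_code, GDPR_DEFAULT_AGE)
    return age < threshold
```

For example, a 14-year-old in France (threshold 15) would require parental consent, while a 13-year-old in Sweden (threshold 13) would not.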

In Canada, the federal Office of the Privacy Commissioner (OPC) has released guidance specifically concerning gaming and the collection of personal information. Gaming companies must obtain meaningful consent from players if personal data is collected, used, or disclosed.[14] However, this can be challenging when it comes to the monitoring of children who are participating in online gaming. As reflected in other guidance concerning the privacy of children online, children cannot be expected to appreciate fully how their personal data may be collected and used online, and their personal data must therefore be given special protection. The OPC guidance provides that gaming services must request parental consent for children under the age of 13 years. Additionally, parents and guardians should be able to control their child’s access to content, their ability to chat with other account holders, and how their personal data will be shared.[15] Companies must also apply an appropriate retention schedule for inactive accounts, as this data should be destroyed after a defined period of inactivity.[16] Finally, companies must provide a “true deletion” option which allows users to request that their accounts be entirely deleted;[17] otherwise, a company is at risk of retaining personal data for longer than is strictly necessary.
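A retention schedule of the kind described can be sketched as a simple check over an account's last activity date. The 18-month window below is an assumed figure for illustration; the OPC guidance calls for a defined retention period but does not prescribe one.

```python
from datetime import datetime, timedelta, timezone

# Assumed retention period for the example; a real schedule would be set
# by policy, not hard-coded.
INACTIVITY_LIMIT = timedelta(days=548)  # roughly 18 months

def is_due_for_destruction(last_active: datetime, now: datetime) -> bool:
    """Flag an account whose inactivity has exceeded the retention period,
    meaning its personal data should be destroyed."""
    return now - last_active > INACTIVITY_LIMIT

now = datetime(2024, 1, 1, tzinfo=timezone.utc)
assert is_due_for_destruction(datetime(2022, 1, 1, tzinfo=timezone.utc), now)
assert not is_due_for_destruction(datetime(2023, 9, 1, tzinfo=timezone.utc), now)
```

In practice such a check would run on a schedule, and flagged accounts would feed into the same destruction process used for "true deletion" requests.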

In the United States, the Children’s Online Privacy Protection Rule (COPPA) is a federal law that requires the operators of websites or online services to obtain parental consent before collecting the personal data of children under the age of 13 years.[18] This includes mobile applications that connect to the Internet and Internet-enabled gaming platforms. As set out by the Federal Trade Commission (FTC), regardless of country of origin, COPPA applies to any online service that is directed to users in the United States or collects information from children in the United States. The FTC can investigate and fine companies that do not follow COPPA.

Companies covered by COPPA must make available a clear and comprehensive online privacy policy that sets out their information management practices for personal data collected online from children. They must provide direct notice to parents and obtain verifiable consent before collecting any personal data. They must also give parents the option of consenting to the operator’s collection and internal use of a child’s personal data while prohibiting its disclosure to third parties, and any disclosure must be made clear to parents. Parents must have access to the personal data of their child in order to verify its accuracy or have it deleted, as well as the option to prevent the further use or online collection of their child’s personal data. Personal data collected online from a child must be retained only for as long as is necessary to fulfil the purposes for which it was collected, and must be deleted in a manner that protects against its unauthorized access or use. Additionally, an operator must not condition a child’s participation in an online activity on the provision of more personal data than is reasonably necessary.[19] It is clear that in order to ensure compliance with the requirements set out under COPPA, covered companies must employ thorough data security, retention, and destruction practices.
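The data-minimization requirement, that participation not be conditioned on excess data, can be expressed as a simple gate. The field names below are hypothetical; the check merely verifies that a collection request stays within the set of fields deemed reasonably necessary for the activity.

```python
# Hypothetical minimum field set for a child's participation in the
# example activity; a real list would come from a documented data
# protection assessment, not from code.
NECESSARY_FIELDS = {"username", "parent_email"}

def request_allowed(requested_fields: set) -> bool:
    """Permit a collection request only if it asks for no more than
    the fields reasonably necessary for participation."""
    return requested_fields <= NECESSARY_FIELDS

assert request_allowed({"username"})
assert not request_allowed({"username", "home_address"})
```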

The protection of children’s privacy is one of many areas of concern when it comes to the rights of children that are relevant to companies that create and market online games. As this industry continues to boom, children will remain a key, but vulnerable, consumer group for online gaming. The collection, sharing, and monetization of the personal data of children must not be undertaken unless explicit and valid consent has been obtained from a parent or guardian. In order to meet the requirements of privacy laws, companies must proactively establish policies and procedures to support and respect the rights of children, before exploiting their personal data for business purposes.

[1] Oehlenschlager, M. (2021, June) Online Games Gamble with Children’s Data. Danish Society of Engineers’ Working Group on Ethics and Technology.

[2] Russell, C., Reidenberg, J., and Moon, S. (2018) Privacy in Gaming. Fordham Law Legal Studies Research Paper.

[3] United Nations Children’s Fund (UNICEF) (2019, August) Child Rights and Online Gaming: Opportunities & Challenges for Children and the Industry. Discussion Paper Series: Children’s Rights and Business in a Digital World.

[4] Ibid.

[5] Ibid.

[6] United Nations. (1989). Convention on the Rights of the Child. Treaty Series, 1577, 3.

[7] United Nations Children’s Fund (UNICEF) (2020, April) Online Gaming and Children’s Rights: Recommendations for the Online Gaming Industry on Assessing Impact on Children.

[8] Ibid.

[9] Ibid.

[10] Ibid.

[11] General Data Protection Regulation (GDPR), Recital 38: Special protection of children’s personal data.

[12] European Commission. Can personal data about children be collected?

[13] Ibid.

[14] Office of the Privacy Commissioner of Canada. (2019, May) Gaming and personal information: playing with privacy.

[15] Ibid.

[16] Office of the Privacy Commissioner of Canada. (2015, December) Collecting from kids? Ten tips for services aimed at children and youth.

[17] Ibid.

[18] Federal Trade Commission (FTC). Children’s Online Privacy Protection Rule (“COPPA”).

[19] Federal Trade Commission (FTC) (2020, July) Complying with COPPA: Frequently Asked Questions. Business Guidance Resources.


Lisa Douglas is a member of Baker McKenzie’s Technology Practice. She currently focuses on information governance, drawing on a rich background in knowledge management, legal research, and library science to provide compliance advice on the enterprise information lifecycle.


Sarah Nagy is an Information Governance Specialist with the global Information Governance group within Baker McKenzie’s Information Technology & Communications Practice in Canada. She has a background in records and information management, corporate archives, and knowledge management. She supports clients in the management of their information governance programs by advising on records and data retention matters.