
Review of Rwandan Ministerial Instruction on Child Online Protection

Authors: Chisom Mbamalu, Dorcas Tsebee and Ridwan Oloyede

Introduction

In response to the growing use of technology and the associated risks to children's safety online, the Ministry of Information, Communication, Technology, and Innovation of Rwanda issued an instruction on child online protection. These guidelines, published in the Official Gazette on January 23, 2024, seek to bolster online safety and protection measures for children. They delineate the responsibilities of various stakeholders, including parents, guardians, internet service providers (ISPs), digital content creators, entities providing internet access (such as cybercafés), broadcasters, and social media users. The instructions apply to any person or organisation that broadcasts or provides content online or provides access to online content. This review highlights key provisions of the ministry's instructions, organised around specific issues relating to the protection of children on the Internet.

Key Provisions

  1. Data protection and privacy

The primary objective of these instructions is to enhance awareness and promote online safety for children. While they do not explicitly focus on data protection, they implicitly aim to safeguard children's privacy in the digital realm. For instance, Article 4(d) requires ISPs to incorporate parental control tools, upholding children's rights to privacy and online safety. This can be achieved through privacy-focused design and protective default settings in the early stages of service development. For this purpose, Article 9 of the Privacy and Data Protection Law defines a child as a person below sixteen (16) years of age.
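As an illustration of what privacy-focused design with protective defaults might look like in practice, the Python sketch below applies restrictive default settings to accounts belonging to users under the sixteen-year threshold. The setting names and the AccountPrivacySettings structure are hypothetical assumptions for illustration, not requirements drawn from the instructions themselves.

```python
from dataclasses import dataclass

# Hypothetical privacy-by-default profile for a child's account.
# Field names and defaults are illustrative, not prescribed by the instructions.
@dataclass
class AccountPrivacySettings:
    profile_public: bool = False        # profile hidden from non-contacts
    location_sharing: bool = False      # no geolocation attached to posts
    targeted_advertising: bool = False  # behavioural advertising disabled
    contact_by_strangers: bool = False  # only approved contacts can message
    parental_controls: bool = True      # parental control tools switched on

def default_settings_for(age: int) -> AccountPrivacySettings:
    """Return restrictive defaults for users below the 16-year threshold
    in Article 9 of the Privacy and Data Protection Law."""
    if age < 16:
        return AccountPrivacySettings()
    # Adult accounts start from the same safe baseline but without forced
    # parental controls, and may relax the other defaults themselves.
    return AccountPrivacySettings(parental_controls=False)

if __name__ == "__main__":
    print(default_settings_for(12))   # all protective settings applied
```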

Moreover, Article 8(b) underscores the necessity of parental consent for publishing children's images online. This aligns with the requirement under Article 9 of the Privacy and Data Protection Law, under which the consent of the person holding parental responsibility is required to process a child's data. Breaches of this requirement have been penalised by data protection authorities in other countries; for instance, Kenya's data protection authority penalised organisations for sharing children's photos without consent. This rule obliges social media users to self-regulate, ensuring that children's images or videos are not posted for commercial or entertainment purposes without parental or guardian consent. Additionally, Article 9(e) places a responsibility on parents and guardians to educate children about the risks of sharing personal information online without proper guidance.

  2. Labelling and age verification

The instructions require digital content providers to implement appropriate content filtering tools to curb harmful online content and to display clear labels on their platforms indicating whether content is suitable for children. In practice, this could be achieved by labelling content at the point of display. Article 4(e) further requires them to implement mechanisms that prevent children from accessing age-inappropriate content, sites, products or interactive services, which indirectly obliges providers to operate age verification mechanisms on their platforms. Since Article 9 of the Privacy and Data Protection Law defines a child as any individual below sixteen (16) years of age, this age limit will serve as the reference point for verifying a child's age. Moreover, broadcasters and TV service providers operating online are required to put in place mechanisms indicating the suitability or harmfulness of content for various age categories of children. They must also ensure that children are not exposed to harmful content by implementing technical barriers such as parental controls, age verification tools and personal identification numbers, which is an explicit provision on age verification.
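As a rough illustration of how labelling and age verification could work together, the sketch below verifies a user's age against the sixteen-year definition of a child and gates content behind assumed rating labels. The rating categories, function names and thresholds other than the sixteen-year limit are assumptions for illustration, not requirements taken from the instructions.

```python
from datetime import date

CHILD_AGE_THRESHOLD = 16  # Article 9, Privacy and Data Protection Law

# Illustrative rating labels and minimum ages; the instructions require
# labelling but do not prescribe a specific rating scheme.
CONTENT_MINIMUM_AGE = {
    "general": 0,
    "parental_guidance": 13,
    "adult": 18,
}

def age_on(reference_date: date, date_of_birth: date) -> int:
    """Compute a user's age in whole years as of reference_date."""
    years = reference_date.year - date_of_birth.year
    if (reference_date.month, reference_date.day) < (date_of_birth.month, date_of_birth.day):
        years -= 1
    return years

def is_child(date_of_birth: date, reference_date: date) -> bool:
    """True if the user falls under the 16-year threshold."""
    return age_on(reference_date, date_of_birth) < CHILD_AGE_THRESHOLD

def may_access(label: str, date_of_birth: date, reference_date: date) -> bool:
    """True if a user of this age may view content carrying the given label."""
    return age_on(reference_date, date_of_birth) >= CONTENT_MINIMUM_AGE[label]

if __name__ == "__main__":
    today = date(2025, 6, 1)
    dob = date(2012, 5, 1)  # a 13-year-old on the reference date
    print(is_child(dob, today))               # True
    print(may_access("general", dob, today))  # True
    print(may_access("adult", dob, today))    # False: gated for children
```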

  3. Content moderation and age appropriateness

The instructions mandate that cybercafés and public Wi-Fi providers implement measures to protect children's online safety. This includes blocking access to websites known for hosting content harmful to children and installing software and procedures to monitor and restrict children's internet access. Such measures are crucial, as public Wi-Fi networks often lack robust security, posing risks to children's safety and data security. By requiring content filters on these networks, the instructions aim to create a safer online environment for children.
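By way of illustration, a cybercafé or public Wi-Fi operator could implement the required blocking with a simple domain blocklist checked before a request is allowed through. The sketch below is a minimal example; the blocklist entries are placeholders, and a real deployment would rely on a maintained list of domains known to host content harmful to children, typically enforced at the DNS or gateway level.

```python
from urllib.parse import urlparse

# Placeholder blocklist; a real operator would subscribe to a maintained
# list of domains known to host content harmful to children.
BLOCKED_DOMAINS = {
    "example-harmful-site.com",
    "another-blocked-domain.net",
}

def is_blocked(url: str) -> bool:
    """Return True if the requested URL's host, or any parent domain, is on the blocklist."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    # Check the host and every parent domain, so subdomains are also caught.
    candidates = {".".join(parts[i:]) for i in range(len(parts))}
    return bool(candidates & BLOCKED_DOMAINS)

if __name__ == "__main__":
    print(is_blocked("http://videos.example-harmful-site.com/clip"))  # True
    print(is_blocked("https://example.org/"))                         # False
```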

Furthermore, Article 8 prohibits social media users from creating and sharing videos that feature child actors in roles unsuitable for their age. This includes content involving sexual activity, hate speech, obscenity, horror, violence, drug use, or offensive or disrespectful language. Additionally, it places a responsibility on parents to employ parental controls or other methods to block, filter, or monitor their children's online activities, further ensuring their safety in the digital realm.

Conclusion

The instructions from the Ministry advocate for robust safety measures to protect children online, including specific provisions to respect children's privacy. Businesses within the ambit of these instructions must enhance their compliance strategies to avoid regulatory penalties. The guidelines apply to any individual or organisation involved in broadcasting, providing online content, or providing access to online content.
