by Gabriel Eze, Associate
The National Information Technology Development Agency of Nigeria (“NITDA”), on 13 June 2022, announced via its Twitter handle the release of its draft Code of Practice for Interactive Computer Service Platforms and Digital Intermediaries (the “draft Code of Practice”). NITDA is empowered by the NITDA Act 2007 to, amongst other things, create a framework to coordinate, monitor, and regulate information technology practices, activities, and systems in Nigeria and all related matters. In effect, the Code of Practice seeks to regulate all digital platforms and online businesses operating in Nigeria. NITDA recognizes that information technology systems have become critical infrastructure that must be safeguarded, regulated, and protected against online harm. It takes cognizance of the importance of uninterrupted access to authentic, alternative sources of information and ideas, of the dissemination of those ideas, and of the impact of the internet, as a shared resource, in shaping societal values, especially in Nigeria. Conscious of this fact, NITDA sees the need to develop practices that will preserve and promote societal values within Nigeria’s digital ecosystem. It welcomes the general public to send in their reviews, comments, and responses to the draft Code of Practice.
Objectives of the draft Code of Practice
The draft Code of Practice comes with four objectives as follows:
(a) Set out best practices required of Interactive Computer Service Platforms/Internet Intermediaries;
(b) Set out best practices that will make the digital ecosystem safer for Nigerians and non-Nigerians in Nigeria;
(c) Set out measures to combat online harms such as disinformation and misinformation; and
(d) Adopt and apply a co-regulatory approach towards implementation and compliance.
What Digital Platforms and Online Businesses Operating in Nigeria Should Know
The draft Code of Practice consists of six parts, setting out responsibilities, prohibitions, and other obligations with which digital platforms and intermediaries must comply.
Parts One and Two
Parts One and Two cover responsibilities, including the following:
- Digital platforms must publish, on their websites or applications, rules of engagement written in simple English;
- Through their terms of service, inform users not to create, publish, promote, modify, transmit, store, or share any content or information that is generally harmful to a child; that could cause physical or psychological harm to another user, directly or indirectly; or that is defamatory, libelous, pornographic, revenge porn, bullying, harassing, or obscene, or that encourages money laundering, child exploitation, fraud, or violence, or is inconsistent with Nigeria’s laws and public order, etc.;
- Platforms must carry out a risk assessment, upon receiving a notice, to determine whether content is harmful. Accordingly, a platform shall take steps to mitigate and manage the impact of such content and ensure that its community rules or guidelines specify how children and adults will be protected from harmful content they may encounter;
- Preserve disabled or removed content, and any related records, as required by law;
- Display a label on disabled or removed content, stating the grounds for such action;
- Preserve any information concerning a person who is no longer a user of a platform due to withdrawal or termination of registration, or for any other reason, as required by law;
- Regularly inform users that access to and usage of the platform is subject to compliance with rules and regulations. Where a user fails to comply, the platform reserves the right to terminate the user’s access to the platform;
- Inform users whenever there is a change or update to their rules;
- File an annual compliance report with NITDA that indicates the:
(a) number of registered users on its platform in Nigeria;
(b) number of active registered users on its platform in Nigeria;
(c) number of closed and deactivated accounts in Nigeria;
(d) number of content items removed with and without notice or court order;
(e) number of content items restored with or without notice;
(f) number of content items removed and re-uploaded;
(g) information on how children and adults are protected from harmful content which they may encounter;
(h) information on the number of complaints registered with a platform;
(i) number of resolved and unresolved complaints;
(j) independent awareness report on disinformation and misinformation;
(k) number of content items taken down due to disinformation and misinformation;
(l) any other relevant information.
Part Three
Part Three focuses on large platforms. According to the draft Code of Practice, all platforms with more than one hundred thousand (100,000) users are Large Service Platforms. In addition to the responsibilities stated earlier, Large Service Platforms must do the following:
- Be incorporated in Nigeria;
- Have a physical contact address in Nigeria, details of which shall be available on their website or platform;
- Appoint a Liaison Officer who shall serve as a communication channel between the government and the platform;
- Provide the necessary human supervision to review and improve the use of automated tools, strengthen accuracy and fairness, and check bias and discrimination, so as to ensure freedom of expression and the privacy of users;
- On demand, furnish a user or authorized government agency with information on:
(a) the reasons behind the popularity of particular online content and the factors or figures driving its influence;
(b) why users see specific information on their timelines.
Part Four
Part Four sets out prohibitions that digital platforms must adhere to. The draft Code of Practice prohibits a platform from keeping ‘prohibited materials’ or making them available for access once it has been informed of such materials. Prohibited material includes material that is objectionable on the grounds of public interest, morality, order, security, or peace, or that is otherwise prohibited by applicable Nigerian laws.
When a digital platform has been informed of the existence of ‘prohibited material’, the digital platform must, in all instances, remove the content within 24 hours.
In considering what a ‘prohibited material’ is, the draft Code of Practice requires digital platforms to consider the laws of Nigeria, including but not limited to the following:
- Nigerian Communications Act;
- National Broadcasting Commission Act;
- Nigeria Broadcasting Code;
- Cybercrimes (Prohibition, Prevention, etc.) Act 2015;
- Advance Fee Fraud and other Fraud Related Offences Act 2006;
- Nigeria Data Protection Regulation 2019;
- Advertising Practitioners Act 2004;
- Nigerian Code of Advertising Practice, Sales Promotion and Other Rights/Restrictions on Practice 2004;
- Terrorism Prevention Amendment Act 2022;
- NCC Consumer Code of Practice Regulations 2017; and
- Federal Competition and Consumer Protection Act (FCCPA) 2018.
Part Five
Part Five covers the measures digital platforms must take to contain disinformation and misinformation. According to the definition section of the draft Code of Practice, “disinformation” means verifiably false or misleading information that, cumulatively, is created, presented, and disseminated for economic gain or to deceive the public intentionally, and that may cause public harm. “Misinformation”, on the other hand, means the unintentional dissemination of false information.
Among its other obligations, a digital platform shall caution the publisher of any false information that is likely to cause violence, public disorder, or the exploitation of a child to remove the content as soon as reasonably practicable. The digital platform must not only provide users with easily accessible tools to report disinformation or misinformation but also ensure that it improves access to “different authentic sources with alternative perspectives”. Digital platforms must prioritize authentic information in search, feeds, and other distribution channels. They must also trace, expose, penalize, and close accounts and sources that amplify disinformation and misinformation.
The draft Code of Practice, however, makes an exception for a user who, without intent, merely redistributes through intermediaries content of which the user is not the author and which the user has not modified.
Part Six
Part Six, the miscellaneous section of the draft Code of Practice, allows NITDA to review and amend the draft Code of Practice from time to time. It provides that noncompliance with the Regulation (when it comes into force) shall be construed as a breach of the provisions of the NITDA Act 2007. Any platform and/or internet intermediary that violates the Regulation may be liable to disciplinary measures under civil service rules, or to prosecution and conviction for violation of the NITDA Act 2007.
A Reincarnation of Social Media Bill and Hate Speech Bill?
First, however well-intended, it is important that we do not end up with a Code of Practice that becomes an instrument for suppressing free speech or violating the fundamental rights of individuals in our society. Section 39 of the Constitution of the Federal Republic of Nigeria (as amended) (the “Constitution”) entitles every Nigerian to freedom of expression, including the freedom to hold opinions and to receive and impart information without interference. The same Constitution already contemplates that some of these fundamental rights will not be absolute, subjecting them to certain grounds upon which they may be justifiably derogated from in order to ensure defense, public safety, public order, public morality, and public health. The draft Code of Practice must be consistent with the provisions of the Constitution, and take nothing away from it.
Second, that noncompliance with the Regulation (when it comes into force) would be construed, according to Part 6 of the draft Code of Practice, as a breach of the provisions of the NITDA Act 2007 should, with due respect, not be the case. A code of practice should be just that: a code of practice or conduct to encourage best practices and standards amongst digital platforms and intermediaries, not a civil or criminal code that creates offenses construed as breaches of the NITDA Act 2007. Moreover, most of the provisions in the draft Code of Practice are already captured in various existing legislation in Nigeria; indeed, the draft Code of Practice itself lists some of the applicable and relevant laws under Part 4. Therefore, NITDA should treat and promote the Code of Practice as a code of conduct and best practices only.
Third, some provisions in the draft Code of Practice fail to take cognizance of the fact that different digital platforms necessarily operate differently in how they manage complaints, takedown requests/orders, data protection and privacy relating to user identity, etc. NITDA must therefore avoid overly specific provisions that would box digital platforms into an unreasonable corner and, consequently, make compliance difficult or problematic.
Besides, coming after the infamous ban of Twitter in Nigeria, the timing of the proposed Code of Practice may not be ideal, although the coming general elections may have been a major factor. Many Nigerians kicked against the ‘Protection from Internet Falsehood and Manipulation and for Other Related Matters Bill’, popularly known as the Social Media Bill, and the ‘Hate Speech Bill’ in 2019. The ‘Social Media Bill’ sought to give the Nigerian government regulatory powers over conversations on social media platforms, while the ‘Hate Speech Bill’ contemplated a death sentence for anyone found guilty of hate speech. Since then, Nigerians have never been more conscious of their fundamental rights. In the wake of NITDA’s new draft Code of Practice, some civil-rights organizations and concerned Nigerians have expressed fears over what they believe is a reincarnation of the ‘Social Media Bill’ and ‘Hate Speech Bill’, viewed rightly or wrongly as attempts by the Federal Government of Nigeria to suppress citizens’ fundamental right of expression. NITDA therefore needs to consider this concern and be more intentional than ever before about engaging industry stakeholders as well as members of the public on the proposed Code of Practice. Such multi-stakeholder and public engagement will enhance compliance.
The draft Code of Practice, targeted at digital platforms and intermediaries, is a welcome initiative. In today’s digital economy, digital platforms and intermediaries have become a vital part of our lives, whether as individuals, businesses, institutions, or governments. Online harms, including bullying, child exploitation, fraud, money laundering, and violence, are some of the realities users face on digital platforms daily, particularly on social media. Besides, the harm disinformation and misinformation do to society can be quite dangerous, leading to a breakdown of law and order. It is therefore critical that these digital platforms and intermediaries are properly regulated in order to prevent, or at least minimize, abuses that may affect law and order in the country.
Since the draft Code of Practice seeks to safeguard the security and interests of Nigerians and non-Nigerians regarding activities conducted on digital platforms and intermediaries, NITDA should be cautious not to negate, through the Code of Practice, the constitutional rights of Nigerians. Also, the livelihood of Nigerians operating online businesses or hosting their businesses online through digital intermediaries should not be impaired or crippled under any guise. NITDA should carry all stakeholders and members of the public along to secure the buy-in it needs for effective compliance with the proposed Code of Practice. If NITDA’s relatively developmental approach to regulating Nigeria’s information technology space is anything to go by, achieving the level of buy-in needed should not prove herculean.