[Pakistan] AIC Submits comments on the Amendment – Removal and Blocking of Unlawful Online Content (Procedure, Oversight and Safeguards) Rules (June 2021)

Download: [Pakistan] AIC Submits comments on the Amendment – Removal and Blocking of Unlawful Online Content (Procedure, Oversight and Safeguards) Rules (June 2021)

The Asia Internet Coalition (AIC) has submitted comments to the Ministry of Information Technology and Telecommunication (MoITT) on the latest draft of the Removal and Blocking of Unlawful Online Content (Procedure, Oversight and Safeguards) Rules.

While some of the less significant concerns outlined in our earlier submissions have been partly addressed, the most problematic provisions remain unchanged in the latest draft of the Rules and have, in some respects, regressed compared with previous versions. It is particularly worrying that large portions of the Rules go beyond the scope of the parent act (PECA 2016). In particular, the mandatory local incorporation requirements, rather than being removed for this reason, appear to have been expanded with a requirement to have a dedicated grievance officer based in-country.

AIC and its member companies continue to have concerns about various aspects of the Rules, including the decryption of data, fixed turnaround times for blocking content, local presence requirements including data localization, and the ability of government agencies to make confidential content removal requests, among others.

In our letter of 4 February 2021, we sought the government’s assistance in ensuring that the basic principles of meaningful consultation were upheld in the drafting of the Rules. The latest draft, which replicates the previous draft with only minor changes, shows that the consultation process was not undertaken with a view towards substantive change. We remain committed to working with the government on meaningful consultation towards balanced regulation. The adoption of balanced rules that incorporate industry feedback is an opportunity for Pakistan to set itself apart from the rest of the South Asia region and position itself as a world leader in digital transformation and regulation.

An overview of our key concerns and recommendations regarding the latest draft is provided below:

  1. Fixed turnaround times for blocking content – Rule 6(2): While the turnaround times have doubled, from 24 to 48 hours and from 6 to 12 hours, respectively, we continue to maintain that the exact time frame for complying with a notice is not something that should be stipulated in the Rules. The time needed to review requests will vary from case to case, depending on the complexity and volume of the content under consideration. As an alternative, we would propose that social media companies be required to acknowledge the Authority’s requests within 24 hours and process each request within a reasonable timeframe.
  2. Definition of “emergency” – Rule 3(1)(vi): We appreciate that the latest draft includes a definition of the term “emergency,” under which content must be removed within 12 hours. However, we are concerned about the broadness of the definition, which includes vague and unclear terms like “security or integrity of Pakistan.”
  3. Requirement to establish a permanent office and local registration – Rule 8(6)(a) & (b): As stated in our prior correspondence and submissions, the effectiveness with which social media companies moderate online content does not depend on having a local presence or local registration, but rather on having well-established processes and product-specific policies, clear local laws to guide the process, and properly informed and valid requests for takedowns. Importantly, most AIC members are entities registered under US laws, and any forced requirement for such entities to establish permanent offices would entail several unintended consequences, including conflicts of law and taxation issues, in addition to a high degree of business uncertainty. As the requirement for a physical office and local registration is fundamentally unrelated to the issue of content moderation, this provision falls outside the scope of the parent legislation, namely section 37 of PECA, which tasks the PTA with developing rules on safeguards, transparent processes, and effective oversight mechanisms for the exercise of its powers to block certain types of content.
  4. Requirement for a locally based authorized compliance officer – Rule 8(6)(c): Similar to the physical office requirement, the appointment of a compliance officer based in Pakistan would not facilitate content removal. To the extent there is a requirement for improved coordination between a social media company and the Authority, a dedicated point person can be appointed without a requirement for the person to be locally based.
  5. Requirement for a “dedicated grievance officer” – Rule 8(6)(d): The scale of content uploaded to Social Network Services is tremendous, and appointing a single grievance officer is neither a scalable nor a practical solution. Furthermore, publishing the name and contact information of the grievance officer would likely subject this person to an immense deluge of extraneous communications, and even harassment. Instead, social media companies should establish clear and transparent frameworks and processes for the review and removal of content. Lastly, the requirement for the officer to be locally based is misguided, since the support teams of global platform providers are spread across different offices and any forced local presence of such officers is unnecessary.
  6. Requirement to provide user data in decrypted format – Rule 8(4): The rules empower Pakistani law enforcement authorities to exercise their powers extraterritorially, contrary to established procedures of international law, including treaty-based and other diplomatic procedures, to seek disclosure of user data held by the global social media companies. Requests for user data made to foreign-based service providers outside the proper, legitimate international channels may create conflicts with foreign law. Pakistan should follow international norms, standards, and treaties when seeking user data. Since the regime for user data production is already outlined extensively under PECA, this provision is superfluous and should be removed from the Rules.
  7. Ability for state agencies to make confidential content removal requests – Rule 5(1)(ii) & 5(5): The rules allow a broad range of state agencies to make confidential requests for content removal through the Authority, without any visibility on the source of the complaint. This appears to be entirely antithetical to values of transparency, the need for which has also been emphasized globally, as well as by local courts.
  8. Scope of powers conferred on the Authority: Under the Rules, the Authority can take cognizance of any online content, make legally binding determinations on its lawfulness, and issue removal directions to social media platforms. The Authority has also been empowered to hear reviews against its own decisions. The absence of any transparency with respect to the Authority’s actions, or of any form of accountability, is an important area of concern for us.
  9. Definition of Social Media or Social Network Service: We are additionally concerned that the definition of “Social Media or Social Network Service” extends beyond those products and services which are generally considered to be user-generated content platforms (i.e., “social media”) and could include any website, product, or service where users are able to share content, including those which are designed for engaging in commerce and assisting productivity.