SafeWebLK: New initiative with global tech companies on a Code of Practice for Online Safety

March 15, 2022 at 3:05 PM

Sri Lanka’s web users are joining hands to prepare an industry Code of Practice for Online Safety to enhance how global internet companies self-regulate online content originating from Sri Lanka.

The citizen initiative, branded as SafeWebLK, will collaborate with global internet companies to come up with a set of commitments on how they respond to disinformation, hate speech, cyber bullying and online harassment.

The SafeWebLK initiative – the first of its kind in Asia – promotes a safer web experience for all users in Sri Lanka while fully safeguarding citizens’ right to freedom of expression online.

SafeWebLK will be inaugurated at a public event at BMICH in Colombo on 16 March 2022. The Minister of Justice, M U M Ali Sabry, and the Minister of Mass Media, Dullas Alahapperuma, will participate and speak on this occasion.

Senior representatives from relevant public institutions, civil society groups, the ICT industry and research organisations are expected to attend the inauguration. In the coming weeks, more stakeholders will be engaged through a public consultation process.

SafeWebLK is a public interest effort by Factum, an independent think tank focusing on international relations. Collaboration with global and Asian tech companies is being facilitated through their regional industry alliance, the Asia Internet Coalition (AIC).

“Our initiative is a first in Asia, and we believe anywhere in the developing world too,” says Dr Ranga Kalansooriya, a senior consultant to SafeWebLK. “It will be a learning experience for everyone involved. We are very glad that the government and internet companies are collaborating with us.”

Jeff Paine, Managing Director of AIC, says: “The Asia Internet Coalition (AIC) is committed to work with the industry partners towards building a self-regulatory framework that will pave the way forward in developing Sri Lanka’s Code of Practice for Online Safety and Harms.”

During the past five years, the European Union (EU) and Australia have signed such industry Codes of Practice covering certain aspects of online content and conduct. New Zealand also drafted a code last year, which is due to be finalised in 2022. Such codes provide platform-level guidance and do not apply to individual users.

“Industry Codes of Practice represent the next level of self-regulation, where multiple tech companies publicly commit to the same framework to improve content monitoring in one country or region,” explains Nalaka Gunawardene, digital media analyst who heads the core group assembled by Factum to steer the process.

The inauguration on March 16 starts a process that would include a public call for written inputs, as well as consultations with various stakeholder groups – most of which will happen online due to pandemic considerations. The Code of Practice, once drafted, will be shared on the Factum website for public review and comment.

Factum, a for-profit company registered in Sri Lanka, is an independent think tank that offers key insights, critical analysis and unbiased perspectives on global politics that directly or indirectly impacts Sri Lanka and Asia. https://factum.lk

Background

By the end of 2021, a little over half of Sri Lanka’s population was using the internet, and most of these users (an estimated 8.2 million) were on one or more social media and instant messaging services.

Social media platforms like Facebook, Instagram, YouTube and TikTok enable easy self-expression, but these services are sometimes misused to spread hatred, disinformation or sexist content. Abusive behaviours such as cyber bullying, privacy violations and digital identity theft are also rising as more people go online. One key challenge today is to maximise the benefits of web use while minimising its risks.

Most tech platforms have their own rules and repeat violators face suspension or account termination. These self-regulatory arrangements – such as Facebook’s Community Standards and YouTube’s Community Guidelines – deal with problematic content and behaviour on a vast scale through a combination of automated software and human reviewers.

Yet, there are growing calls for tech companies to do more at a systemic level to proactively detect and remove harmful content more quickly, and to tackle abusive behaviour more resolutely.

Contact for further information:

Omar Rajaratnam, Factum.lk

Phone: +94 777 173 452

Email: omar@factum.lk