Social Media vs. Public Deliberation

The case of Facebook’s Free Basics in Myanmar

Heidi Hanhijärvi



Social media platforms have had multiple beneficial impacts on public deliberation processes. Social media has reduced the threshold for participation, created new arenas for deliberation and, most importantly, increased the number of people able to take part in public discussions. The most prominent example of social media’s empowering potential is the Arab Spring. Despite social media’s potential to foster deliberation online, however, social media platforms can distort and derail public discussion, intentionally or unintentionally, due to, for instance, unpolished algorithms.

Jürgen Habermas was one of the first theorists to present a vision of an ideal public sphere in which all participants could find solutions to societal issues through deliberation (Habermas 1962). Habermas’s theory of deliberative democracy forms the basis of this article, as social media encompasses many of the elements he describes as relevant to good deliberation. However, as the case example of this paper highlights, social media can also work against deliberative processes online and strengthen societal divides.

The case of Facebook’s Free Basics in Myanmar is exceptional insofar as the spread of disinformation, filter bubbles and hate-speech contributed to violence in the region (Stevenson 2018). Facebook’s role in Myanmar nonetheless showcases that social media platforms do not always function as arenas of good deliberation. Instead, these platforms can have extremely harmful effects on public discussion and regional stability.

Theoretical framework: deliberative democracy

Habermas first presented his theory of the public sphere in 1962 when he described bourgeois coffeehouses as ideal public spheres: in coffeehouses people could freely come together to discuss common issues (Habermas 1962, 36). For Habermas the most important elements of good deliberation within the public sphere are a shared language and context as well as mutual recognition and respect between the parties to a conversation (Habermas 1962, 36; Habermas 1984, 95). For deliberative processes to be successful there must also be an absence of any power relations which could distort the deliberation process (ibid.).

The key idea in deliberation is that the parties to the conversation can develop their opinions, values and preferences through discussion and, thus, deliberation is essentially a learning process (Bächtiger et al. 2018, 2-3). In deliberative democracy the deliberation process has a central role, and the result of deliberation is ideally a mutually satisfactory concept of the public good (ibid.). Deliberation can, moreover, be defined as a process of legitimation where transparency, inclusion, equality and rational arguments play a central role (Habermas 2006, 413). The concept of ‘good deliberation’ is of course an ideal, as no public sphere can be completely free of power relations, biases or prejudice (Bächtiger et al. 2018, 3). Thus, although social media can be viewed as a coffeehouse where people from different areas of life can come together, social media platforms can never completely reflect the ideals of deliberation.

Habermas later developed his theory of deliberative democracy to better suit the modern age. More specifically, he makes the case for deliberative democracy in the context of mass communication and outlines two conditions for deliberative democracy in complex societies (Habermas 2006, 411-421). The first condition states that the media must be independent of its societal settings, and the second that there must be a working feedback loop between citizens and societal elites for deliberative democracy to succeed in modern societies (ibid.).

Habermas recognizes that these two conditions are not fulfilled in the current mediated world. Social exclusion, unequal opportunities for participation, the overarching role of economic interests and the strategic rationality of the media all hinder the ideals of deliberative democracy from being realized (ibid., 420-422). Electronic communication is also mentioned as a threat to the two conditions, as online communication can erode trust between citizens and political actors (ibid., 422).

Facebook’s Free Basics in Myanmar

Mediated communication in the form of social media can either further deliberative processes or hinder them. Facebook’s spread to Myanmar is an example of the latter. In this article social media platforms are not viewed as inherently harmful to public discussion in the context of nation states. Instead, this paper focuses on the potential of social media platforms to distort, disinform and, essentially, manipulate public discussion as a byproduct of their algorithms.

During the past ten years Facebook has grown rapidly, and part of that growth has been its spread to developing countries such as Myanmar. The year 2014 was groundbreaking for Myanmar’s economy, as that year the country’s government opened the telecom industry to foreign investments (Paladino 2018, 5). This introduced a vast number of cheap mobile devices and SIM cards to Myanmar’s markets and allowed social media platforms such as Facebook to expand their operations in the country (ibid., 5-6).

Facebook had previously created a simplified mobile version of its application called Free Basics. This version has only the core functions of Facebook, such as text and news headlines, offered in a multitude of languages but without, for instance, video content (ibid., 6). Free Basics was part of a larger initiative which aimed to provide low-cost internet connections to citizens in developing countries (Facebook 2013). The spread of Facebook in Myanmar was also accelerated by mobile phone operators who offered free Facebook access to their customers (Stecklow 2018). As part of this larger initiative, Free Basics quickly spread across Myanmar, and for most of the country’s population the application offered a cost-free internet connection for the first time (Paladino 2018, 6).

Facebook’s impact on deliberative processes 

Facebook’s mobile application Free Basics quickly became synonymous with the internet in Myanmar (Mozur 2018). In a country like Myanmar, where public discussion had long been censored, Free Basics managed to connect citizens to the same online platform to discuss common concerns, share news and create content. However, after its introduction to Myanmar, Facebook failed to function as an equal-opportunity public sphere that would have fostered mutual recognition.

Facebook’s Free Basics has been severely criticized for its Western bias, and it has been banned in countries such as India (Paladino 2018, 7-8). Critics argue, in short, that Free Basics provides users with only restricted access to the internet free of charge, while connection quality is enhanced according to the amount one can pay (ibid.). This imbalance in access to information created fertile ground for large-scale social media operations which aimed to manipulate public opinion. During 2017 and 2018, Myanmar’s military actors were accused of spreading propaganda and disinformation via a multitude of fake accounts on Facebook (Mozur 2018).

The military, in effect, spread faked photographs, news articles and blog posts mostly aimed at the country’s Rohingya Muslims (ibid.). Military actors have also been accused of manipulating discussion on Facebook by, for instance, silencing specific users and encouraging hate-speech (ibid.).

As hate-speech spread on Facebook in the years following its introduction to Myanmar, online discussions were quickly connected to violence in the region. By 2018 over 700 000 Rohingya Muslims had fled from violent attacks led by the military in Myanmar (Stecklow 2018). Facebook’s algorithms were a central reason for the vast spread of hate-speech on the platform. The algorithms Facebook has used during the past ten years have made it easier for content that arouses strong positive or negative emotions to spread on the platform (Berger and Milkman 2012). In addition, social media platforms’ tendency to filter visible content according to users’ previous behavior has created filter bubbles and helped fake news spread within specific groups (Paladino 2018, 9). Moreover, when algorithms which encourage the spread of emotional content, hate-speech and filter bubbles are integrated into societies where ethnic divides and polarization are strong, those divides can become even deeper.
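The self-reinforcing dynamic described above can be illustrated with a deliberately simplified sketch. The weights, post data and function names below are hypothetical and are not drawn from Facebook’s actual systems; the point is only that a feed ranked purely by engagement rewards content that provokes strong reactions, regardless of its accuracy.

```python
# Hypothetical sketch of engagement-based feed ranking (not Facebook's
# actual algorithm): posts that provoke strong reactions accumulate
# more engagement signals and are therefore shown to more users.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> int:
    # Shares and comments are weighted more heavily than likes, since
    # they push the post to new audiences; the weights are illustrative.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed simply orders posts by engagement, so emotionally charged
    # content that attracts reactions rises to the top.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("neutral news summary", likes=40, shares=1, comments=2),
    Post("inflammatory rumour", likes=25, shares=20, comments=30),
])
print(feed[0].text)  # the inflammatory post ranks first
```

In this toy model the inflammatory post wins despite having fewer likes, because its shares and comments dominate the score; a filter bubble is the same logic applied per user, with scores boosted by the user’s own past reactions.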

Facebook cannot be blamed for creating the manipulated content found on its platform. The company’s policy explicitly forbids inflammatory content that encourages violence against specific groups of people (Stecklow 2018). However, Facebook’s unpolished algorithms, together with the company’s slow reaction in controlling the spread of hate-speech in Myanmar, had concrete consequences. Since 2012 Facebook had received multiple reports from journalists, researchers and ordinary users about the increasing amount of hate-speech on the platform, but the company did not make clear changes to its policies during the following years (ibid.).

One of the biggest issues for Facebook was the Burmese language commonly used in Myanmar. Facebook simply did not have enough employees fluent in Burmese to read through and moderate the manipulated content and hate-speech being spread on the platform (ibid.). Facebook has admitted that its reaction to events in Myanmar came too late, and the company has removed several accounts that could be traced back to military actors in Myanmar (Facebook 2018). However, by 2018 Facebook still had very few employees able to translate Burmese content (Stecklow 2018).

The spread of hate-speech and manipulated content is not uniquely Facebook’s problem, as several other social media platforms currently struggle with similar issues. The case of Facebook’s Free Basics in Myanmar is an example of how social media platforms can be turned against public deliberation online. In a country like Myanmar, where public discussion has long been censored and where democratic processes are struggling to gain ground, Facebook was warmly welcomed.

Instead of empowering public discussion in the form of good deliberation, however, Facebook helped deepen social divides and encourage violence, both through social media operations and as a byproduct of its algorithms. The case underlines the clear need for social media companies to take greater responsibility for actively regulating and controlling the content spread on their platforms, as well as the fact that social media platforms alone cannot foster democratic deliberation processes without concrete political and institutional support.


Social media reflects many of the ideals of deliberative democracy as described by Jürgen Habermas: social media platforms have created arenas where people can discuss societal issues freely and from an equal position. Social media has also widened the public sphere by providing easy access to public discussion online to all who have an internet connection. Social media companies, however, have helped increase violence and hate-speech in several countries despite their idealistic goal of connecting people all around the world.

The specific case of Facebook’s spread in Myanmar exemplifies the severe effects social media platforms can have when they are introduced to countries with deep societal divides and little institutional support for public deliberation. More precisely, Myanmar’s case shows that social media can work against public deliberation processes due to the unpreparedness and functional logic of social media companies.

Heidi Hanhijärvi is a world politics student at the University of Helsinki. Through her studies and volunteering experience Heidi has developed an interest towards international cooperation on global security issues. Next spring Heidi will continue her Masters level studies at the University of Ottawa with a focus on peace and conflict studies.


Bächtiger, André, John S. Dryzek, Jane Mansbridge and Mark E. Warren (eds.). 2018. “The Oxford Handbook of Deliberative Democracy.” Oxford: Oxford University Press.

Berger, Jonah, and Katherine L. Milkman. 2012. “What Makes Online Content Viral?” Journal of Marketing Research 49, No. 2: 192-205.

Facebook. 2013. “Technology Leaders Launch Partnership to Make Internet Access Available to All.” Facebook. Retrieved 31.10.2019.

Facebook. 2018. “Removing Myanmar’s Military Officials from Facebook.” Facebook. Retrieved 1.11.2019.

Habermas, Jürgen. 1962. “Structural Transformation of the Public Sphere.” MIT Press.

Habermas, Jürgen. 1984. “The theory of communicative action.” Beacon Press.

Habermas, Jürgen. 2006. “Political Communication in Media Society: Does Democracy Still Enjoy an Epistemic Dimension? The Impact of Normative Theory on Empirical Research.” Communication Theory 16, No. 4: 411-26.

Paladino, Brandon. 2018. “Democracy Disconnected: Social Media’s Caustic Influence on Southeast Asia’s Fragile Republics.” Brookings India. Retrieved 1.11.2019.

Stecklow, Steve. 2018. “Why Facebook is Losing the War on Hate Speech in Myanmar.” Reuters. Retrieved 1.11.2019.

Stevenson, Alexandra. 2018. “Facebook Admits It Was Used to Incite Violence in Myanmar.” The New York Times. Retrieved 2.11.2019.