<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Twitter Archives - iWatch Africa</title>
	<atom:link href="https://iwatchafrica.org/tag/twitter/feed/" rel="self" type="application/rss+xml" />
	<link>https://iwatchafrica.org/tag/twitter/</link>
	<description>...africa values</description>
	<lastBuildDate>Mon, 12 Apr 2021 15:14:59 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://iwatchafrica.org/wp-content/uploads/2017/08/cropped-iwatchweblogo-150x150.png</url>
	<title>Twitter Archives - iWatch Africa</title>
	<link>https://iwatchafrica.org/tag/twitter/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>How Big Tech’s Content Moderation Policies Could Jeopardize Users in Authoritarian Regimes</title>
		<link>https://iwatchafrica.org/2021/02/how-big-techs-content-moderation-policies-could-jeopardize-users-in-authoritarian-regimes/</link>
		
		<dc:creator><![CDATA[Gideon Sarpong]]></dc:creator>
		<pubDate>Thu, 25 Feb 2021 06:57:22 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Digital Rights]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Censorship]]></category>
		<category><![CDATA[Content moderation]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[Twitter]]></category>
		<guid isPermaLink="false">https://iwatchafrica.org/?p=3204</guid>

					<description><![CDATA[<p>Social media advocates have historically lauded its ability to facilitate democratic progress by connecting people over space and time, enabling faster and wider mobilization than ever before. However, in recent &#8230;</p>
<p>The post <a href="https://iwatchafrica.org/2021/02/how-big-techs-content-moderation-policies-could-jeopardize-users-in-authoritarian-regimes/">How Big Tech’s Content Moderation Policies Could Jeopardize Users in Authoritarian Regimes</a> appeared first on <a href="https://iwatchafrica.org">iWatch Africa</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p style="text-align: justify;">Social media advocates have historically lauded its ability to facilitate democratic progress by connecting people over space and time, enabling faster and wider mobilization than ever before. However, in recent years, this optimism has faded, and platforms have also become effective tools for dictators looking to spread disinformation and propaganda. Today, the social media ecosystem is a hotly contested sphere of political influence rife with disinformation, hate speech, and even violence.</p>
<p style="text-align: justify;">Despite being owned and operated by private companies based largely in the United States, these platforms play an important role in public life worldwide. The battles for influence between democratic and authoritarian actors on social media can make or break attempts at democratic transition. As such, the decisions made by these big tech companies can have significant consequences in countries around the world, especially in authoritarian countries with burgeoning democratic movements.</p>
<p style="text-align: justify;"><strong>Defending Freedom of Expression: The Need for Transparency and Consistency</strong></p>
<p style="text-align: justify;">Global audiences are becoming increasingly aware of the threat of false information on social media. A <a href="https://reutersinstitute.politics.ox.ac.uk/digital-news-report-2020">2020 survey</a> by the Reuters Institute for the Study of Journalism showed that 40 percent of respondents are more concerned about encountering false information on social media than other online sources of information.  While this may justify efforts by social media companies to aggressively moderate online content, the reality of the issue is more complex.</p>
<p style="text-align: justify;">These social media companies have <a href="https://www.eff.org/deeplinks/2019/04/content-moderation-broken-let-us-count-ways">proven inconsistent</a> in how they apply content moderation systems, and their decisions about what content is displayed as well as their overall lack of transparency should be of grave concern to users everywhere. Erratic moderation policies have sometimes resulted in <a href="https://gfmd.info/gfmd-content/uploads/2020/11/DC-Sustainability-Annual-Report-2020-FINAL.pdf">the censorship of legitimate investigative news content</a> and important health information by social media algorithms.</p>
<p style="text-align: justify;">Katherine Chen, a Facebook Oversight Board member, is critical of the content moderation policy that guides the company. In an interview with Reuters, she states that the Board “can see that there are some policy problems at Facebook.” She argues for the establishment of policies, particularly those involved in human rights and freedom of speech, that are “precise, accessible, clearly defined.”</p>
<p style="text-align: justify;"><strong>Uganda: How Facebook “Caused” an Internet Blackout</strong></p>
<p style="text-align: justify;">Recent events in Uganda are an example of the dangers of these unilateral decision-making processes. Days ahead of the elections in Uganda in January, President Museveni announced a ban on Facebook and other social media platforms. <a href="https://www.youtube.com/watch?v=dsFsAJtn0Io">Addressing the nation</a> in Kampala, Museveni accused Facebook of political bias against the ruling party, the National Resistance Movement (NRM), and “arrogance.”</p>
<p style="text-align: justify;">The move was prompted by Facebook’s decision to take down a <a href="https://www.reuters.com/article/uk-uganda-election-facebook/facebook-takes-down-ugandan-pro-museveni-accounts-ahead-of-election-idUSKBN29G1H9">network of pro-Museveni accounts</a> in the run-up to the presidential election, claiming they were fake accounts linked to the ministry of information. “I told my people to warn [Facebook] . . . If it is to operate in Uganda, it should be used equitably,” Museveni said. “If you want to take sides against the NRM, then that group would not operate in Uganda. Uganda is ours.” Facebook defended its decision to ban the accounts, insisting that an <a href="https://medium.com/dfrlab/social-media-disinformation-campaign-targets-ugandan-presidential-election-b259dbbb1aa8">investigation</a> had revealed their involvement in a coordinated effort to undermine political debate in Uganda.</p>
<p style="text-align: justify;">Regardless of the merits of the investigation, many open internet advocates argue private corporations do not have unilateral power over the ‘public realm’ and must consider local circumstances and political nuances in their moderation decisions.</p>
<p style="text-align: justify;">As Odanga Madung, an internet policy researcher based in Kenya, <a href="https://www.bbc.com/news/world-africa-55618994">told the BBC</a>: “Any casual observer of Ugandan politics expected the government to impose internet restrictions ahead of the elections, so Facebook&#8217;s decision—especially the absence of tact when punishing infringements of its terms of service—offered Museveni a timely ruse to clothe the inevitable shutdown as a retaliation.”</p>
<p style="text-align: justify;">While controversial content moderation decisions from Facebook and Twitter might garner criticism in thriving democracies, they can be met with much harsher consequences in more authoritarian countries where decision-making is concentrated in few hands. In these contexts, companies are often left with no choice but to do the government’s bidding, lest they end up restricted or outright blocked. In either case, users are left to deal with potentially significant repercussions, such as restricted access to the internet, which can impact their businesses and livelihoods.</p>
<p style="text-align: justify;">In the Ugandan case, citizens bore the brunt of Facebook’s decision. Internet freedom monitor <a href="https://netblocks.org/">NetBlocks</a> <a href="https://news.trust.org/item/20210120134502-2jnhz/">found</a> that the five-day internet shutdown in Uganda cost the economy around $9 million (approximately 33 billion Ugandan Shillings), disproportionally affecting poor Ugandans who often rely on internet-based mobile applications to run their businesses.</p>
<p style="text-align: justify;"><strong>The Global Implications of Content Moderation Decisions</strong></p>
<p style="text-align: justify;">Content moderation decisions by social media giants have also sparked an intense debate about free speech globally. The seemingly inconsistent decisions made by social media platforms have been used by authoritarian leaders to cloak their attempts to silence opponents—whether by blocking social media access or shutting down the internet altogether—in a thin veneer of legitimacy.</p>
<p style="text-align: justify;">In response to Twitter’s ban of then-President Trump, jailed Russian opposition leader Alexey Navalny <a href="https://twitter.com/navalny/status/1347969772177264644">criticized</a> the platform’s seemingly arbitrary decision and warned that it could potentially help authoritarians stifle dissent.</p>
<figure id="attachment_3205" aria-describedby="caption-attachment-3205" style="width: 916px" class="wp-caption alignnone"><img fetchpriority="high" decoding="async" class="size-full wp-image-3205" src="https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-5.png" alt="Screenshot of Tweet by Russian opposition leader Alexey Navalny " width="916" height="507" srcset="https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-5.png 916w, https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-5-300x166.png 300w, https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-5-768x425.png 768w" sizes="(max-width: 916px) 100vw, 916px" /><figcaption id="caption-attachment-3205" class="wp-caption-text">Screenshot of Tweet by Russian opposition leader Alexey Navalny</figcaption></figure>
<p style="text-align: justify;">And it is clear that silencing online dissent is a priority for authoritarian leaders around the world. In the past, radio and television stations were often among the first sources of public information to be targeted during coups and revolutions. Today, it is the internet. Citing misinformation and fake news to claim an outsized gatekeeper role over the internet provides dictators a powerful political weapon. And there is certainly no shortage of dictatorial leaders willing to use internet shutdowns as a political cudgel. For instance, a <a href="https://cipesa.org/2019/03/despots-and-disruptions-five-dimensions-of-internet-shutdowns-in-africa/">2019 report</a>, by the Collaboration on International ICT Policy in East and Southern Africa (CIPESA) showed that since 2015 up to 22 African governments had ordered network disruptions.</p>
<p style="text-align: justify;">The more recent actions of military leaders in Burma in the immediate aftermath of their coup demonstrate how important control over online spaces is to broader authoritarian projects. The morning after the military takeover, citizens reported <a href="https://www.bbc.com/news/world-asia-55889565">decreased internet connectivity</a>, and days later access to Facebook (the primary means of internet access for many Burmese citizens), as well as other Facebook-owned services like WhatsApp and Instagram, was <a href="https://www.forbes.com/sites/roberthart/2021/02/04/myanmars-military-blocks-access-to-facebook-after-overthrowing-government/?sh=6276669c5032">blocked</a>. In justifying the ban, officials from the new regime claimed Facebook was being used to spread “fake news and misinformation . . . [that is] causing misunderstanding among people,” which could lead to further unrest.</p>
<p style="text-align: justify;">The risk of giving authoritarian leaders an excuse to shut down the internet is not the only potential threat posed by the lack of transparency in content moderation decision-making processes. Over the last decade, social media companies have emerged at the center of a political firestorm in many countries. The danger that these private corporate firms could come under undue influence from authoritarian regimes to stifle dissenting voices and opposition groups demonstrates an urgent need for an independent, fair, and transparent content moderation system.</p>
<figure id="attachment_3206" aria-describedby="caption-attachment-3206" style="width: 925px" class="wp-caption alignnone"><img decoding="async" class="size-full wp-image-3206" src="https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-4.png" alt="Screenshot of Tweet by Russian opposition leader Alexey Navalny " width="925" height="497" srcset="https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-4.png 925w, https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-4-300x161.png 300w, https://iwatchafrica.org/wp-content/uploads/2021/02/Screenshot-4-768x413.png 768w" sizes="(max-width: 925px) 100vw, 925px" /><figcaption id="caption-attachment-3206" class="wp-caption-text">Screenshot of Tweet by Russian opposition leader Alexey Navalny</figcaption></figure>
<p style="text-align: justify;">Some of these platforms have already demonstrated their willingness to comply with authoritarian governments’ efforts to censor critics. Instagram recently <a href="https://www.washingtonpost.com/world/europe/russian-opposition-leader-slams-instagram-for-caving-in-to-the-government/2018/02/15/fddd06da-1247-11e8-9570-29c9830535e5_story.html?utm_term=.03b81fce7d21&amp;itid=lk_inline_manual_13">caved</a> to the Russian telecom regulator’s demands that it remove content related to opposition activist Alexei Navalny’s anti-corruption investigation. Facebook has also previously complied with arcane censorship laws and blocked anti-government content in several countries, including <a href="https://www.reuters.com/article/us-vietnam-facebook-exclusive-idUSKCN2232JX">Vietnam</a>, <a href="https://www.propublica.org/article/facebook-hate-speech-censorship-internal-documents-algorithms">Morocco, and India.</a></p>
<p style="text-align: justify;"><strong>The Path Forward</strong>—<strong>a Rights-respecting Content Moderation Regime</strong></p>
<p style="text-align: justify;">Currently, Twitter is piloting its “<a href="https://blog.twitter.com/en_us/topics/product/2021/introducing-birdwatch-a-community-based-approach-to-misinformation.html">BirdWatch</a>,” a community-driven approach to addressing misleading information on its platform. Facebook has introduced an independent <a href="https://oversightboard.com/">Oversight Board</a>, which will review Facebook’s content moderation decisions and offer binding recommendations on whether or not to uphold them.</p>
<p style="text-align: justify;">Yet, social media companies will need to take more decisive action to improve public confidence in the wake of their <a href="https://www.npr.org/2020/09/22/915555286/what-can-social-media-do-to-slowdown-the-spread-of-misinformation">sluggish attempts</a> to halt the spread of misinformation and their acquiescence to censorship demands from authoritarian governments. To that end, they must set up regional rapid response units with a commitment to understanding the local political and social context to avoid decisions that play into the hands of authoritarian regimes.</p>
<p style="text-align: justify;">These independent, localized units will also be important in addressing the <a href="https://www.technologyreview.com/2020/06/08/1002894/facebook-needs-30000-of-its-own-content-moderators-says-a-new-report/">overwhelming</a> and varied forms of complaints received by big tech companies, ranging from appeals decisions to take-down requests from governments and users.</p>
<p style="text-align: justify;">As big social media companies have taken on an increasingly central role in the public sphere, the intertwinement between private companies and international politics has become increasingly complicated. Laws enacted in the United States and Europe to regulate social media platforms may not apply elsewhere, particularly the Global South. A content moderation regime that is not inclusive, has a slow response rate, and fails to factor in local, regional, and political contexts is ultimately failing to overcome the challenges it aims to address.</p>
<p style="text-align: justify;">Access to information and freedom of expression, including the public conversation on social media, are a vital part of strong democratic processes. Social media companies have an enormous responsibility to respond to demands for greater accountability.</p>
<p style="text-align: justify;"><em>Gideon Sarpong is a 2020/21 Open Internet for Democracy Leader. He is a co-founder of <a href="http://www.iwatchafrica.org">iWatch Africa</a> and a Policy Leader Fellow at the European University Institute, School of Transnational Governance in Florence, Italy.</em></p>
<p>The post <a href="https://iwatchafrica.org/2021/02/how-big-techs-content-moderation-policies-could-jeopardize-users-in-authoritarian-regimes/">How Big Tech’s Content Moderation Policies Could Jeopardize Users in Authoritarian Regimes</a> appeared first on <a href="https://iwatchafrica.org">iWatch Africa</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>#FIFAfrica20: iWatch Africa calls on tech multinationals to do more to protect journalists &#038; rights activists in Africa</title>
		<link>https://iwatchafrica.org/2020/10/fifafrica20-iwatch-africa-calls-on-tech-multinationals-to-do-more-to-protect-journalists-rights-activists-in-africa/</link>
		
		<dc:creator><![CDATA[iWatch Africa]]></dc:creator>
		<pubDate>Thu, 01 Oct 2020 09:42:01 +0000</pubDate>
				<category><![CDATA[Articles]]></category>
		<category><![CDATA[Digital Rights]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[Google]]></category>
		<category><![CDATA[Twitter]]></category>
		<guid isPermaLink="false">https://iwatchafrica.org/?p=3110</guid>

					<description><![CDATA[<p>Policy Director of iWatch Africa, Gideon Sarpong, has called on tech multinationals such as Facebook, Google and Twitter to do more to combat the increasing spate of attacks meted out to &#8230;</p>
<p>The post <a href="https://iwatchafrica.org/2020/10/fifafrica20-iwatch-africa-calls-on-tech-multinationals-to-do-more-to-protect-journalists-rights-activists-in-africa/">#FIFAfrica20: iWatch Africa calls on tech multinationals to do more to protect journalists &#038; rights activists in Africa</a> appeared first on <a href="https://iwatchafrica.org">iWatch Africa</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p style="text-align: justify;">Policy Director of iWatch Africa, Gideon Sarpong, has called on tech multinationals such as Facebook, Google and Twitter to do more to combat the increasing spate of attacks meted out to journalists and rights activists within Africa’s digital ecosystem.</p>
<p style="text-align: justify;">Speaking at the recently concluded Forum on Internet Freedom in Africa 2020 (#FIFAfrica20), Sarpong described the current structures put in place by big tech companies as “unsustainable” for dealing with the evolving threats against journalists and rights activists in Africa.</p>
<p style="text-align: justify;">“Tech multinationals must set up regional offices to deal with their growing user bases and abusive content on their platforms. The current strategy of using algorithms to detect and take down abusive content, combined with a lack of physical presence in almost all African countries, is unsustainable. The likes of Facebook and Twitter must go beyond using AI systems to deal with vitriol on their platforms to engaging journalists and rights activists around the continent to provide sustainable solutions,” he stated.</p>
<p style="text-align: justify;">Between January and August 2020, iWatch Africa recorded over 4,000 instances of online abuse directed at journalists and rights activists in Ghana. These abuses include threats of violence and harm, which have been duly reported to law enforcement bodies in Ghana.</p>
<p style="text-align: justify;">A recent study by the Reuters Institute in Oxford also found that seven in ten journalists (71%) in the Global South have experienced online harassment, with more than half saying it has increased in the past year.</p>
<p style="text-align: justify;">Digital rights issues such as data governance, safety, and security are becoming a major concern for users around the continent as some governments, nefarious groups, and individuals exploit these platforms to stifle, abuse, and threaten freedom of expression.</p>
<p style="text-align: justify;">“People all around the continent rely on platforms like Google, Twitter, and Facebook to express themselves freely. Just like the European Union, the African Union (AU), in coordination with member states, must play a prominent role championing issues of privacy, safety, and security by engaging tech multinationals,” Sarpong stated.</p>
<p style="text-align: justify;">“Many today consider social media platforms to be public utilities and argue for some form of regulation. Self-regulation has so far failed to protect users. We need a new approach,” he added.</p>
<p style="text-align: justify;">Credit: iWatch Africa</p>
<p>The post <a href="https://iwatchafrica.org/2020/10/fifafrica20-iwatch-africa-calls-on-tech-multinationals-to-do-more-to-protect-journalists-rights-activists-in-africa/">#FIFAfrica20: iWatch Africa calls on tech multinationals to do more to protect journalists &#038; rights activists in Africa</a> appeared first on <a href="https://iwatchafrica.org">iWatch Africa</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
