Policing the platforms
By: Jon Guinness, Portfolio manager, Equities and Sumant Wahi, Portfolio manager, Equities
Questions about free speech on social media, internet companies’ policies on user expression, and government intervention in the tech industry are becoming impossible for investors to ignore. In the aftermath of a particularly bad-tempered US presidential election campaign, we believe it is important for social media companies to operate within externally set boundaries on speech. Until such boundaries are established, social media firms must set transparent, non-partisan rules for speech on their platforms, with independent oversight boards to ensure fairness and consistency - for the good of democratic discourse and their own long-term business viability.
Former US President Donald Trump’s suspension and banning from various social media sites is a high-profile example of a debate between free and responsible speech online that has been rumbling on for some time. The problem for social media companies is where to draw the line between acceptable and unacceptable speech - and whether they should even be the ones to judge that distinction.
Before addressing these difficult questions, it is worth setting out the position from which we approach this topic. As technology fund managers, it is our responsibility to take a keen interest in these debates because they directly affect the sustainability and long-term returns of companies in our investment universe. More broadly, we subscribe to principles of digital ethics.
On free speech, we believe that it is ethically incumbent upon the dominant social media platforms to facilitate a range of user views, with only specific limitations where necessary against toxic content. Independent oversight boards set up by social media networks can be an important tool to achieve this while more societally driven frameworks are established. We think this approach is consistent with the long-term business interests of social media companies.
Who draws the line between free speech and incitement?
At the moment, social media companies enjoy the best of both worlds: in the US, Section 230 of the Communications Decency Act grants them broad immunity from liability for content posted by users on their platforms, while they simultaneously reserve the right to ban whoever they want for any reason. This gives social media companies immense power, which they sometimes wield unevenly.
In 2017, web infrastructure company Cloudflare removed a neo-Nazi website from the internet after a violent far-right rally in Charlottesville, Virginia. While there are undeniably strong justifications for such a move, it was the conflicting stances of the company that caught our attention. Cloudflare executives had previously defended hosting a number of unsavoury forums on grounds of free speech, but following the Charlottesville incident, the CEO commented, “Literally, I woke up in a bad mood and decided someone shouldn’t be allowed on the Internet.” He may have been flippant, but what this statement and other actions by internet companies show is the lack of clearly defined frameworks for moderating online speech.
It should not be the job of internet businesses alone to wrestle with philosophical issues around free speech that affect our whole society. In democratic countries, governments and independent regulators should play a central role in shaping the parameters of speech for internet publishing, with a strong sympathy towards promoting open debate.
Facebook has been vocal about this need. CEO Mark Zuckerberg recently commented: “it would be very helpful to us and the Internet sector overall for there to be clear rules and expectations on some of these social issues around how content should be handled, around how elections should be handled, around what privacy norms governments want to see in place, because these questions all have trade-offs.”
“Bad ideas die in the sunlight and thrive in the shadows”
YouTube recently banned UK TalkRadio on the grounds that it had “posted material that contradicted expert advice about the coronavirus pandemic”, only to quickly rescind the decision following a public outcry. The problem is that expert advice on the pandemic has changed in some fundamental respects. Last spring, for example, health authorities in the US, UK and elsewhere argued that face masks did not reduce coronavirus transmission and were unnecessary. Today, masks are obligatory in various public places. We are not condoning breaking government guidance, but we think it is fair and healthy to debate ideas freely, without fear of reprisal, with a view to reaching consensus in society. It is precisely this process that leads to robust decisions that benefit everyone.
There is also the role that social media plays in political outcomes. During the 2020 US election campaign, the New York Post published a report making allegations about the business dealings of Joe Biden’s son, Hunter. Social media networks suppressed the story. The quality of the research behind the Post’s article was debatable at the time, although after the election the US Justice Department confirmed an investigation into Hunter Biden’s tax affairs. The broader question concerns the internal processes that govern these decisions at social media firms, and how fairly and consistently they are applied to speech across the political spectrum. It is impossible for us to know what impact this episode - or others, such as the well-publicised second FBI probe into candidate Hillary Clinton’s use of a private email server for work, announced shortly before the 2016 election - had on election results. But it does demonstrate the reverberating effects of coverage decisions by social media platforms.
Another dimension to the online speech debate is the potential for monopoly or cartel-like behaviour. Parler, a conservative/right-wing social media forum, had its app removed from Apple’s and Google’s app stores in the wake of violence in Washington during the congressional session to certify the election result. Parler was removed on the grounds that it promoted violence and was used, in part, to co-ordinate the protest. Amazon Web Services subsequently refused to provide cloud hosting to Parler, effectively cutting off users’ access. Whether or not the tech giants were justified in removing Parler, the incident shows the vulnerability of third-party apps in reaching an audience and the power of big tech to control the agenda.
Tech giants dominate cloud computing
Notes: 12 months to 30 June 2020. Source: Statista, August 2020.
Oversight boards can form part of the solution, at least while more permanent frameworks are developed. These boards could be comprised of lawyers, academics, journalists and political experts, and function in a quasi-judicial role to review cases, monitor content and contribute to policies. Social media companies can set up these panels themselves, as Facebook recently did, but it’s important that they are seen as independent and balanced so they have legitimacy with the public.
If the social media market were more fragmented, it could be argued that platforms should be free to promote particular political and speech agendas, since competition would ensure a variety of voices were heard - this is generally the case in newspaper markets. But the current social media landscape is dominated by just two names, Facebook and Twitter. This market structure makes it ethically necessary for the leading firms to prioritise impartiality, truthfulness and a commitment to free speech.
If social media companies fail to do this, they could invite harsh controls on their publishing rights and other business practices from politicians and voters increasingly resentful of their perceived biases. In this scenario, the leading companies could become utility-like entities where they would preserve their monopolies, but at the cost of heavy regulation and supervision restraining their ability to innovate and grow.
Regulation of big tech is likely
Regulation on economic grounds is another flashpoint. There is not yet a serious alternative to Facebook and Twitter in social media. Facebook and its family of apps reach nearly half the world’s population each month, three times more than equivalent platforms such as WeChat. As a result, fair competition in the tech sector has become an increasingly prominent issue over the past 12 months.
Facebook reaches 3.2 billion people each month
Note: Data using last reported figures: Facebook (3Q20), WhatsApp (1Q20), Messenger (1Q17), Instagram (2Q18), Any (3Q20). Source: Statista, December 2020.
The US Federal Trade Commission and attorneys general from 46 states launched antitrust proceedings against Facebook in December 2020 over its acquisitions of Instagram and WhatsApp. In Australia, the government is attempting to legislate a news media bargaining code for internet firms, which would force the likes of Facebook and Google to negotiate payments to third-party media companies for the content they use. European regulators are preparing new laws that would make it easier to launch investigations, curb tech firms’ expansion into new product areas and bar them from giving their own products preferential treatment in their digital stores.
The stakes are high for big tech and society
Social media platforms also need to maintain the loyalty of active users. As investors, we want to see commitment from these companies to free speech, political impartiality, and robust and transparent moderation policies. Establishing independent oversight boards is a solid step towards this goal. This would help restore trust in their output, appeal to users from a broad range of political persuasions and support their businesses in the long run.
There remain open questions about how to draw up acceptable parameters for free speech online. Governments and regulators need to play a role in setting clear guidelines for social media platforms, so that these companies understand their responsibilities. We take heart that some companies, such as Facebook, recognise this and are taking steps to reduce political polarisation and improve the quality of content oversight and review.
Social media businesses hold enormous influence and, as a society, we should consciously decide what the limits to that power should be. If we don’t, we could reach a point where the boundaries become so blurred, views so polarised and the dominance of internet companies so embedded that it becomes too difficult to unwind.
This document is issued by FIL Responsible Entity (Australia) Limited ABN 33 148 059 009, AFSL No. 409340 (“Fidelity Australia”). Fidelity Australia is a member of the FIL Limited group of companies commonly known as Fidelity International.
Investments in overseas markets can be affected by currency exchange and this may affect the value of your investment. Investments in small and emerging markets can be more volatile than investments in developed markets.
This document is intended for use by advisers and wholesale investors. Retail investors should not rely on any information in this document without first seeking advice from their financial adviser. This document has been prepared without taking into account your objectives, financial situation or needs. You should consider these matters before acting on the information. You should also consider the relevant Product Disclosure Statements (“PDS”) for any Fidelity Australia product mentioned in this document before making any decision about whether to acquire the product. The PDS can be obtained by contacting Fidelity Australia on 1800 119 270 or by downloading it from our website at www.fidelity.com.au. This document may include general commentary on market activity, sector trends or other broad-based economic or political conditions that should not be taken as investment advice. Information stated herein about specific securities is subject to change. Any reference to specific securities should not be taken as a recommendation to buy, sell or hold these securities. While the information contained in this document has been prepared with reasonable care, no responsibility or liability is accepted for any errors or omissions or misstatements however caused. This document is intended as general information only. The document may not be reproduced or transmitted without prior written permission of Fidelity Australia. The issuer of Fidelity’s managed investment schemes is FIL Responsible Entity (Australia) Limited ABN 33 148 059 009. Reference to ($) are in Australian dollars unless stated otherwise.
© 2021 FIL Responsible Entity (Australia) Limited. Fidelity, Fidelity International and the Fidelity International logo and F symbol are trademarks of FIL Limited.