Second Reading Speech by Senior Minister of State for Law, Mr Edwin Tong, on the Protection from Online Falsehoods and Manipulation Bill
7 May 2019
Mr Speaker, Minister has sketched out some of the forces that explain why online falsehoods are real and serious problems, not just for Singapore but for democracies around the world. I will take Members through the key provisions in the Bill. Members who have read the Select Committee report will notice that the Bill tracks the Select Committee’s findings and recommendations closely. The measures in the Bill are designed to address, firstly, the impact of falsehoods; and secondly, the underlying reasons why falsehoods have had such a severe impact. On the impact of falsehoods, the Select Committee made the following findings.
Falsehoods can have a one-off, dramatic impact. But low-level falsehoods that have no immediate visible impact can be just as dangerous. This is found in paragraphs 105 to 107 of the Select Committee report.
Minister has explained in detail how falsehoods harm democratic institutions and free speech. The Select Committee goes into this in some detail at paragraphs 121 to 138 of its report.
Falsehoods can also cause serious and sometimes fatal consequences for individuals and for businesses. This is covered in paragraphs 139 to 151 of the Select Committee report.
To understand why falsehoods have had such a serious impact, the Select Committee heard extensive expert evidence on the psychology and mechanisms of how falsehoods operate. This can be found at paragraphs 152 to 177 of the Select Committee report.
But let me summarise. The core trick of falsehoods lies in arousing anger, fear and other negative emotions. This more effectively exploits cognitive biases. There is also a stark power imbalance between facts and falsehoods. It is very difficult for facts to overcome falsehoods organically. Falsehoods move and take effect quickly, long before corrections can be put in motion. Hence the importance of putting out corrections swiftly and circulating them vigorously. The social transformations caused by the digital revolution have given falsehoods renewed power. This was something that the Select Committee also addressed, at paragraphs 178 to 185. It is useful to see what this means in the Singapore context. On that score, the Select Committee evaluated the evidence and made its findings at paragraphs 208 to 237.
A few of its key findings include: first, that there are increasing signs of the phenomenon in Singapore and evidence of foreign disinformation; second, that there is a real risk of slow-drip falsehoods exploiting Singapore’s diversity to damage society in the long term; and third, that Singapore is also vulnerable due to its regional circumstances. The Select Committee ultimately concluded that deliberate online falsehoods are a problem that Singapore has to take action against.
How should we respond to this? The Select Committee gave five broad areas for action: first, nurture an informed public; second, reinforce social cohesion and trust; third, promote fact checking; fourth, disrupt online falsehoods; and fifth, deal with threats to national security and sovereignty.
The Select Committee also underscored the importance of preserving public trust, something that Minister spoke about at some length. It noted at paragraph 311 of the report that a loss of faith in public institutions significantly increased the success of disinformation operations.
There was very detailed discussion as well on legislation. Three broad positions emerged from the representations received. First, that legislation should play a role; a considerable number of experts recommended this. Second, that we should rely on voluntary regulation by technology companies. Third, that we should take a hands-off approach altogether and leave matters to be dealt with organically by the marketplace of ideas.
The Committee considered these positions in some detail. Its findings explained why the theory of the unregulated marketplace of ideas was based on assumptions that are flawed in the digital age, and why the conduct of tech companies – their repeated failures and inadequate responses – pointed to a fundamental conflict of interest between their business goals and acting in the public interest. Minister has also covered this point in some detail in his speech.
Essentially, the Committee accepted a robust and credible legal analysis showing that existing laws in Singapore were inadequate. It concluded that new laws are needed. Importantly, the Select Committee also found that the concerns over free speech could be addressed using a calibrated approach in legislation. This Bill reflects that fundamental principle of calibration.
The Government examined the Select Committee’s various findings, some of which I have referred to earlier, and found that this approach would, in the long run, be more supportive of good-quality public discourse and of increasing public trust. We thus decided that a new approach is needed. This will be a shift away from the blunt tools that we already have and which many other jurisdictions are considering.
In considering how powers should be exercised, four decision-making models were discussed and considered by the Select Committee. This is set out at paragraph 364 of the report. The four models are: first, the Courts, including through an expedited process; second, the Executive, with recourse thereafter to the Courts; third, an independent body; and fourth, the online platforms.
The Select Committee made the following findings: one, that the Broadcasting Act already relies on Executive action; two, that the judicial process is not fast enough; three, that in situations involving public order, national security or public institutions, only the Executive would hold the facts, and the facts should be backed by the Executive’s authority; and four, that judicial oversight could assuage concerns over the abuse of Executive power.
The unanimous recommendation of the Select Committee was that the Government should have powers to swiftly disrupt online falsehoods. Let me now take Members through the key provisions of this Bill.
Consistent with the Select Committee’s recommendations, this Bill provides a tool-box of Government powers to address the impact of specific individual falsehoods and sources of falsehoods. This implements recommendations 15 and 16 of the Select Committee’s report. It also provides regulatory oversight of Internet intermediaries, to ensure that they take effective measures to prevent and combat these problems. This implements recommendations 17 to 20 of the Select Committee’s report.
Let me go into some detail and take Members through the various levers in this Bill that deal with the impact of falsehoods. I must emphasise that these levers are primarily designed to remedy the impact of falsehoods, not to punish wrongdoers. In other words, receiving a Direction does not mean that one has done something illegal.
I mentioned earlier how the digital revolution has given falsehoods a new power. In particular, the proliferation of social media services, content aggregators, blogs, search engines and other intermediary services, has profoundly changed the way we consume information.
The large majority of the toolkit is therefore designed for platforms, not individual publishers. The Bill also allows for the issuing of Codes of Practice binding the platforms. Minister Iswaran will speak later on this.
The tools fall into the following categories, which closely reflect the Select Committee’s recommendations: one, providing access to and increasing the visibility of corrections, which implements recommendation 12 of the Select Committee report; two, disrupting fake accounts that amplify falsehoods, which also implements recommendation 12; three, discrediting online sources of falsehoods, which likewise implements recommendation 12; and finally, levers which cut off the financial incentives of online sources of falsehoods, which implement recommendation 15.
Let me elaborate on the powers that target falsehoods.
The Minister will be empowered to issue Directions against falsehoods, where it is in the public interest to do so. The Courts will have the final say over what is false.
The provisions are divided into two fairly self-contained Parts: Part 3 for individual publishers, and Part 4 for the platforms – the “persons” covered under that Part of the Bill – namely, Internet intermediaries and mass media service providers such as newspapers, broadcasters and telecommunications service providers.
Let me outline the corrections regime.
In line with the new approach I mentioned earlier, the primary tool that we intend to use is the power to give people direct access to corrections. In other words, the falsehood stays up. People will then have access to both the falsehood and the corrections, and they can decide for themselves. In such a case, the Directions add to, rather than remove from, the discourse.
In general terms, the corrections powers will require a person to “tag” a falsehood with a correction, or to amplify a correction generally. These powers are needed because of the difficulty of getting corrections to overcome the reach of falsehoods. The Select Committee’s report covers the findings on this issue in great detail, at paragraphs 171 to 177.
Let me just highlight a few.
First, by way of example, the Select Committee cited a 2018 study by the Massachusetts Institute of Technology (MIT) which found that false news was 70% more likely to be re-tweeted than true news. The Select Committee also cited a very interesting study by a tech start-up examining a rumour in 2017 about then-French Presidential candidate Emmanuel Macron. It found that, on Twitter, there was almost no overlap between the audience of the rumour, which was false, and the audience of the correction of that same rumour. So, no overlap between the two groups.
Research shows that corrections tend to be effective when they provide an explanation of the facts and give prior warning about the falsehood to come. This is also mentioned in paragraph 361 of the Select Committee’s report. As such, the Corrections Directions are designed with this in mind. Corrections will take the form of a notice warning people about the falsehood and the notice can set out the facts, or provide a link to the facts.
The powers relating to Corrections are set out in clauses 11, 21 and 23 of the Bill.
There are two main types of Corrections possible, both of which are designed in accordance with the recommendations in paragraph 361 of the Select Committee’s report. The first is set out in clauses 11 and 21. For convenience, I will call this the Targeted Correction. A Targeted Correction must be made accessible to viewers of the falsehood. It acts as a warning tag on the falsehood.
The second type of correction is set out in clause 23 and this correction must be generally amplified on certain platforms, such as news outlets and Internet intermediaries, even if these platforms are not carrying the falsehood. For convenience, I will call this the General Correction.
A General Correction is important to inoculate the public before a falsehood reaches them. Psychological research has shown that corrections which work in the same manner as vaccines – forewarning people before the falsehood arrives – can be very effective. This is especially appropriate when a campaign to put out falsehoods is ongoing, or when a broad false narrative based on various lies could be developing and gaining traction. A General Correction can also help when a falsehood is serious and persistent, or is moving underground, into less visible spaces on closed platforms.
Let me now describe the take-downs. Besides the corrections powers, the Bill also provides for the disabling of access to falsehoods, where it is in the public interest to do so. These powers are set out at clauses 12 and 22. There can be a Direction to cease communication of the falsehood to viewers in Singapore. There can be a further order requiring that a correction be communicated to those who had previously viewed the falsehood. The Bill requires these Directions to be published in the Government Gazette.
Who may receive these Directions? There are several groups.
Falsehoods, when spread online, may pass through many hands and cascade across different platforms. To curb dissemination, it will be most effective to issue Directions to the key nodes of dissemination. These will mainly be the Internet intermediaries, which almost always play a crucial role in the spread of online falsehoods. Directions could also, in some situations, be issued to those with large followings. It would often not make sense to issue Directions to every single person who shares a falsehood. Corrections must be published to users in a clear and conspicuous manner.
Two conditions must be satisfied before the Directions can be issued. Minister Shanmugam touched on this. First, a false statement of fact must have been communicated in Singapore. Second, it must be in the public interest to issue a Direction.
The phrase “false statement of fact” is, as already explained, a legal term drawn from existing law. It covers statements that a reasonable person would consider to be a representation of fact. Opinions, comments and criticisms are not covered by the Bill.
A statement is false, if it is false or misleading, whether wholly or in part, and whether on its own or in the context in which it appears. This definition addresses the various ways in which reality might be distorted.
Real words and real actions can also be edited and presented in a way that completely transforms their meaning.
In a live interview, Hillary Clinton’s adviser referred to a news article that had blamed Clinton for the death of US diplomats in Benghazi, Libya. A clip of the interview was shown out of context to suggest that Clinton’s own adviser had blamed Clinton for the deaths. That is wrong.
In Sweden, a real police report listed “vulnerable areas” where the police needed to respond regularly to volatile situations. A Swedish newspaper columnist exaggerated the report, and claimed that there were 50 “no-go” zones in Sweden – areas filled with illegal immigrants that were too dangerous for even the police to enter. This reportedly remains one of the most persistent myths in Sweden, despite repeated attempts to debunk it.
Similarly, news reports that omit material facts can be falsehoods. This, as experts told the Select Committee, is a common disinformation tactic. For example, in Germany, online news websites spread a girl’s claims that she had been raped by refugees, and that the police were covering it up. They showed real interviews, but omitted the police’s debunking of the claims. This led to thousands of people protesting on the streets against the alleged cover-up.

Directions can also be issued against false “statements” communicated over the Internet, regardless of the platform. This means that Directions can be used against both open and closed platforms, and remain flexible enough to deal with falsehoods spread on platforms that are developed in the future. Platform neutrality was in fact an important design principle, based on the Select Committee’s own findings at paragraph 362. In particular, evidence was given to the Select Committee of the serious concerns with falsehoods in closed spaces. As falsehoods there can be hidden from view, closed platforms are ideal for the deliberate spread of falsehoods.
Researchers believe that in closed spaces, people are more susceptible to emotive falsehoods, because these spaces are inhabited by the familiar and trusted – those they know. The Bill therefore recognises that closed platforms are not necessarily private. They can be used not only for personal and private communications, but also to communicate with hundreds or thousands of strangers at a time. Closed platforms – chat groups, social media groups – can serve as a public megaphone as much as open platforms. The legislation therefore also covers closed platforms, even those with end-to-end encryption. We will also find additional ways of dealing with the harm that can materialise from falsehoods spreading on encrypted closed platforms. For example, in such a case, a General Correction may be used instead.
It is not enough, as Minister mentioned earlier, for there to be a false statement of fact. It must also be in the public interest for the Direction to be issued. The Government powers recommended by the Select Committee were intended to prevent the public interest from being harmed. Clause 4 therefore sets out a non-exhaustive list of examples of the public interest. This list reflects the dangers of deliberate online falsehoods identified by the Select Committee, based on real events around the world. Minister has already explained the thinking behind the definition.
Finally, I will now deal with the safeguards.
The Select Committee stated, at recommendation 12, “There should be adequate safeguards in place to ensure due process and the proper exercise of power, and give assurance to the public of the integrity of the decision-making process… Measures… should include judicial oversight where appropriate.” At the same time, the Committee also stated that “The measures will need to achieve the objective of breaking virality by being effective in a matter of hours.” That is a quote from the Select Committee’s report.
The Bill thus incorporates both speed and due process. The Minister will first issue the Direction. An appeal can then be brought to the High Court to set aside the Direction. The Minister has outlined the procedure in the House earlier.
This, in our view, is the best way to be effective, while ensuring that there is always adequate judicial oversight. The Executive weighs the competing interests and acts decisively to protect society, and the Courts will then have the final say over whether the content in question is false.
During the Select Committee process, some representors preferred a Court order to an Executive Direction. But even an expedited court process may not be fast enough to deal with virality, as some of the examples outlined in the Minister’s speech have shown.
Falsehoods can reach many with great speed, and lead to serious consequences just as quickly.
We saw earlier the false video of Muslims celebrating a terrorist attack. That video generated 500,000 views within hours of being posted on one Facebook page.
In Indonesia, a falsehood was spread that a pro-communist rally was being held. There was a real event that took place, but it was not a pro-communist rally. It took less than 24 hours for the falsehood to mobilise thousands to turn up at the event, to disrupt it and to protest.
In France, false posts were put out claiming that leaked campaign documents showed that Macron was engaging in illegal activity. The claims were posted just hours before an election reporting “black-out” – similar to our own “cooling-off” period. The posts were then amplified by bots, trolls and fake accounts. Within four hours, there were 47,000 tweets, and the topic had hit Twitter’s “trending” list of most popular topics.
One final example. In 2017, a falsehood was put out that the founder of a cryptocurrency called “Ethereum” had died in a car crash. The hoax wiped out $4 billion in market value within five hours.
So, the essence of the remedies is that they have to be able to address and counter the quick, wide and deep spread of falsehoods.
The Bill has benefited from the Select Committee’s analysis of the relevant issues. Many representors came forward before the Committee, many of whom had experience in this field and some of whom are experts. The provisions in this Bill are based closely on the recommendations in the Select Committee report, and the Bill seeks to achieve these outcomes.