Can a president of a nation be deleted from Twitter or Facebook? These are private services with no obligation of universality, so the short answer is yes: there is no a priori right to participate in a social network. In some cases, however, courts have ordered the companies to reinstate an excluded user because the expulsion was not legally valid and would have caused damage.
Can being on Twitter, Facebook, Instagram or TikTok be an a priori right? If we asked a teenager, the answer would obviously be yes: a very important part of their relational life takes place in that environment. Perhaps the role of these platforms as a public service should be recognized.
But who is responsible for what is said on social media? In the U.S., under Section 230 of the Communications Decency Act, the operator is not responsible for the content it hosts. It becomes responsible if, once notified, it fails to remove unlawful content. In Europe, for violent or defamatory content, the eCommerce Directive applies, with provisions very similar to Section 230. Assessing the lawfulness of content is a task for a court. Entrusting the preventive removal of content to a private party, as the Copyright Directive establishes for copyrighted works in Europe, is equivalent to entrusting these companies with a first level of judgment.
Are Trump’s tweets illegal? It is practically impossible to say: a court would have to determine that. They are certainly full of falsehoods, but lying is generally not illegal. It becomes illegal when the lie produces harmful effects for someone (e.g. defamation), cases that must be evaluated by a court.
There is, however, one case in which lying is illegal in itself: stock manipulation. There, it is not even necessary for the share price to be altered as a result of the lie; the lie itself, aimed at altering a stock market price, is a crime.
There are now questions about whether the removal of Trump’s accounts is a legitimate and appropriate act or whether it calls for regulatory changes. Trump himself has intervened several times on Section 230, and in Europe the Commission presented a new regulatory proposal for digital services last December.
Intervening to regulate social networks is a very delicate issue: fundamental rights are at stake.
On the merits, one can detest the messages inciting violence that circulate in parts of the Parler app, but it should not be Amazon that shuts down its servers, nor Apple that makes the app impossible to install on iPhones by removing it from the App Store. Internet multinationals cannot be the guardians of people’s rights. People must be free to express themselves; any wrongdoing must be ascertained only by courts.
Before social networks, people could express themselves freely to small groups: think of Speakers’ Corner in London, of film forums or bars. Those who could address large audiences were the publishers, for whom there were (and are) different rules, providing for a responsible editor, an obligation to publish corrections, and so on.
With social media, an individual’s potential audience has expanded from that previously limited circle to one characteristic of large publishers, but without specific regulation.
On the basis of the considerations that “free speech is not free reach” and that “with large audiences comes great responsibility”, one could envisage a “regulatory ladder” that imposes increasing constraints as the audience grows.
For example: maintain the present regime for users with fewer than 20 thousand followers; require the possibility of indirect identification (for example with a mechanism similar to the one used for airport Wi-Fi) for users with between 20 and 100 thousand followers; require prior identification for users with more than 100 thousand followers; and, for people with even larger audiences, apply measures similar to those that regulate publishers, such as the right of rectification.
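The tiers above can be summarized in a minimal sketch. The 20-thousand and 100-thousand thresholds come from the proposal itself; the one-million threshold for "even larger audiences" and the obligation labels are illustrative assumptions, not legal terms.

```python
def obligations(followers: int) -> list[str]:
    """Illustrative set of obligations for a given audience size.

    Thresholds of 20k and 100k follow the article's example;
    the 1M cutoff for publisher-like duties is an assumption.
    """
    if followers < 20_000:
        return ["current regime (no extra constraints)"]
    tiers = ["possibility of indirect identification (airport-Wi-Fi-style)"]
    if followers >= 100_000:
        tiers.append("prior identification required")
    if followers >= 1_000_000:  # assumed threshold for "larger audiences"
        tiers.append("publisher-like duties (e.g. right of rectification)")
    return tiers
```

The point of the ladder is that each step adds to, rather than replaces, the previous obligations, so the list grows monotonically with audience size.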
As we learned with COVID, the speed of spread is a crucial element in outbreaks. Experiments have shown that the virality of some content can be limited by introducing friction into its spread. Twitter used this technique in a radical way, preventing the retweeting of some of Trump’s tweets.
A similar measure could be administered more gradually: some posts could be exposed incrementally, for example reaching an additional 1% of the user’s followers per hour, allowing time for a court to intervene.
This “rate limiting” measure, which is common practice in some IT activities, could be envisioned only for large operators and triggered only for those tweets that an artificial intelligence judges as potentially illicit.
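A toy sketch of the gradual-exposure idea: a post flagged as potentially illicit reaches an additional 1% of the author's followers each hour, so full spread takes about four days, leaving time for review. The 1%-per-hour figure is the article's example; the flag is assumed to come from an upstream classifier, which is not modeled here.

```python
def audience_fraction(hours_since_post: float, flagged: bool,
                      rate_per_hour: float = 0.01) -> float:
    """Fraction of followers who can see a post after a given time.

    Unflagged posts propagate immediately; flagged posts are
    released linearly at `rate_per_hour` (1%/hour in the article).
    """
    if not flagged:
        return 1.0
    return min(1.0, hours_since_post * rate_per_hour)
```

At 1% per hour a flagged post only reaches its full audience after 100 hours, which is the window in which a court (or the author, by correcting the post) could intervene.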
Finally, for some categories of particular relevance to society, a tort of malicious information manipulation could be envisaged, applying only to persons with very large audiences and prosecutable in court through a lawsuit by those directly or indirectly harmed. Fines could be scaled to the number of followers, say a maximum of 5 cents per follower.
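The proposed cap scales linearly with audience, which keeps the sanction proportionate to reach. A minimal sketch of the arithmetic, using the article's figure of 5 cents per follower (the currency is left unspecified there):

```python
def max_fine(followers: int, cents_per_follower: int = 5) -> float:
    """Maximum fine, in currency units, at a given cents-per-follower cap."""
    return followers * cents_per_follower / 100
```

Under this cap, an account with one million followers would face at most 50,000 in fines, while a 20-thousand-follower account, at the bottom of the ladder, would face at most 1,000.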
Social networks are here to stay and their relevance is likely to increase. For those who do not engage in content curation, imposing responsibilities similar to publishers’ would be counterproductive, putting their livelihood at risk and especially disadvantaging new market entrants. Imposing obligations and constraints on all users is equally disproportionate and a recipe for compressing rights.
On the other hand, pressure will certainly continue to grow for accountability in the online publication of content that can have dramatic effects on societies. We must be very careful not to privatize justice, reduce guarantees for individuals, or limit acquired rights.
Regulatory extremes, whether no standards at all or uniform black-and-white measures for everyone, cannot work. If we want to make progress, we need to think about regulating the behavior of users and not only of the intermediaries, with approaches graduated by audience level and gradual sanctions.