

EU leading fight against misinformation – Little
Pic: RollingNews.ie

20 May 2022 | Human rights

Former journalist Mark Little (pictured) has said that the need for moderation of digital content to protect human rights has never been greater.

He also warned, however, that the problem of disinformation could not be solved if the systems underlying how social-media platforms operate remained unregulated.

In a lecture on digital misinformation (18 May) organised by the Law Society’s Human Rights and Equality Committee, Little said that Europe was “providing the lead” in such regulation, with measures such as the Digital Services Act (DSA).

Little founded social-media news agency Storyful in 2010, and is co-founder of Kinzen, which provides services aimed at identifying harmful content.

‘Safety by design’

He said that the European approach sought to examine the tech giants’ underlying systems and to target harmful practices built into social-media business models.

The EU was also aiming to ensure that all platforms had a commonly agreed way of approaching illegal content, and to make them accountable for the content decisions made by their algorithms.

There was a lack of oversight of how algorithms worked, he argued.

Little described the EU approach – which he called ‘safety by design’ – as “powerful”, since it held that making underlying systems more transparent and accountable was now the most effective form of regulation.

He added, however, that the DSA was not perfect, expressing concerns about the ability to declare “emergency conditions” where limitations could be imposed on platforms.

‘Lawful but awful’

Little argued, however, that the EU measures were better than much of the online-harm regulation emerging in places such as Australia, Canada, and the UK.

The former journalist also pointed out that the EU legislation dealt only with illegal content, and not with equally damaging content that was “lawful but awful”.

He said that the main problem with such content was trying to define it, while there were also risks of “disproportionate remedies”, such as jailing platform executives.

Some online-harm legislation was trying to do away with anonymity, which was essential to some voices of dissent in places such as Russia, he warned.

“Safety by decree has many flaws”, said Little, adding that unintended consequences needed to be thought through.

He also told the audience that some countries were using terms such as ‘fake news’ or ‘disinformation’ as a reason to pass “very draconian” laws.

Weapon

The Storyful founder described his initial excitement, as a “journalist in the old school”, at the emergence of new forms of communication, citing their positive impact in events such as the Arab Spring, and in helping protesters in countries such as Iran.

He believed, however, that the internet had then been turned into a weapon by enemies of democracy, with the spreading of false information.

Little said that Storyful had not taken account of the role of the tech giants’ underlying systems in helping to drive such misinformation.

“The underlying business model of social media relies on advertising; it is designed to accelerate the spread of emotion, outrage, and happiness. This is not just a bug in the system; that’s the way it was originally designed,” he stated.

Codes of practice

Little referred to the current “sheer over-abundance of information”, which he said was leaving people unsure of what was true and what was not.

“They don’t have to convince us that this is true; they just have to convince us that everyone is lying”, he said of attempts by governments or other actors to spread disinformation.

The Kinzen co-founder said that platforms had to think about how to fight back without resorting to heavy-handed censorship.

He praised some new codes of practice on content moderation being developed by platforms working with civil society, such as the Santa Clara Principles, which put human rights at the centre of what he called “technological due process”.

Little said that such codes must also ask for transparency on the technology that underlies any moderation, citing fears that platforms would resort to “blunt-force instruments”, such as automated algorithmic filters, that could end up suppressing free speech.

Public media

Little told the audience that any machine-learning systems involved in content moderation must have a human being involved at every stage of the process.

He concluded by calling for a rethink on how we wanted to design a ‘public square’ for information and debate.

“Part of the problem is that we don’t have enough public media,” Little argued, calling for governments to invest not just in national organisations such as RTÉ or the BBC, but also in helping to fund people to report on their communities.

We should put effort into creating a form of public information that is regarded as a utility, and create within it space for opposing points of view, he urged.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland
