AI, regulation, legislation & design of platforms
Academic studies about the regulation, legislation & design of platforms in relation to disinformation & information disorder.
OSCE Representative on Freedom of the Media, Organization for Security and Co-operation in Europe - COURTNEY RADSCH
The full article is available here.
State-sponsored disinformation campaigns are increasingly common in political systems of all types. These campaigns leverage the design of social media platforms and the AI systems that power them to pursue a strategy of undermining, drowning out, and delegitimizing real news through coordinated efforts to silence critics and manipulate public opinion...
This paper analyses the dynamics of state-aligned disinformation campaigns and the role that AI plays in this context. It specifically examines coordinated campaigns deployed against journalists and media outlets, their gendered dimension, and how they leverage and manipulate AI systems to contort the public sphere.
A full summary of the case is available here.
Case Summary:
The case of Biancardi v. Italy arose from application no. 77419/16 against Italy, lodged under Article 34 of the Convention for the Protection of Human Rights and Fundamental Freedoms. The application was submitted by Alessandro Biancardi, editor-in-chief of an online newspaper which in 2008 had published an article about a fight and stabbing in a restaurant, mentioning the names of those involved and the motive for the fight. One of the participants in the fight sent a formal notice asking for the article to be removed from the internet, with which Biancardi did not comply. The case went to the District Court of Chieti, which concluded that there had been a breach of the claimants' reputation and of their right to respect for their private life, and awarded each claimant €5,000 in compensation. Biancardi's appeal was dismissed by the Supreme Court. In 2016 Biancardi took the case to the European Court of Human Rights (ECtHR) under Article 10. The Court determined that the grounds on which Biancardi had been found liable concerned his failure to de-index the article from Google rather than its original publication, and found no violation of his Article 10 rights.
Significance
During this case, interventions were made by the Reporters Committee for Freedom of the Press and the UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, among other stakeholder organisations. The determination that the case involved only the de-indexing of the article from Google, rather than its full removal, set a precedent: it was the first time the ECtHR had considered the de-indexing of online newspaper material. "The 'right to reputation' of a person accused of a criminal offence outweighed the right of the newspaper to continue to make available a story about the incident which had led to the arrests and charge." Thus, the ECtHR determined "that the Article 8 reputational rights" of the claimant "took precedence over the Article 10 rights of the online publisher."
"Media fragmentation and polarization, which varies between countries, could affect the extent to which people claim to use alternative news."
“…current assessments of the effects of news personalization are predominantly based on observations from Western democracies. This Western-centric approach raises concerns about these assessments’ applicability to other contexts, in particular non-democratic ones.” To address this gap, this article scrutinizes “discussions of the promises and threats of news personalization in countries characterized by limited press freedom: Belarus, Russia and Ukraine.”
“This study explored Facebook users’ hostile perceptions of shared news content and its relationship with their political participation.” The author conducted this exploration in the context of the abortion issue in South Korea.
“This article explores how Twitter’s algorithmic timeline influences exposure to different types of external media.” The authors found that the algorithmic timeline increases the share of “junk news” websites among the external links users are exposed to, although they characterize the algorithm’s influence as minor compared with the factors that shape it, such as human behavior and platform incentives.
[...] examines changes made to laws and regulations related to ‘false information’ in eleven countries across Sub-Saharan Africa between 2016 and 2020, from Ethiopia to South Africa. By examining the terms of such laws against what is known of misinformation types, drivers and effects, it assesses the likely effects of punitive policies and of more positive approaches that provide accountability in political debate by promoting access to accurate information and corrective speech. In contrast to the effects described for most recent regulations relating to misinformation, the report identifies ways in which legal and regulatory frameworks can be used to promote a healthier information environment.