"You are naive!" we were told. Others called us unworldly, saying we wouldn't stand a chance against Facebook, Twitter & Co. anyway. These were the initial reactions when we announced that HateAid wanted to take on the big tech platforms in Brussels. What's true: lobbying in the EU was indeed new territory for us. We had no experience, no contacts and hardly any money – in contrast to the platforms, which have been lobbying in Brussels for years. But we were motivated and had one goal: to give a voice, in the negotiations on the new fundamental law of the internet, the Digital Services Act (DSA), to all the women and girls who are attacked and degraded on social media every day.
Talks about sweeping changes to the law
And now? These days there is no way around us in Brussels. With a petition containing over 30,000 signatures, we are demanding the rights of tens of thousands of Europeans. Our Head of Legal Josephine has brought the perspective of affected people into the EU Parliament several times. EU politicians invite us for personal talks – and finally listen to us.
In the meantime, the European Parliament (EP) has negotiated far-reaching legislative changes that could change the lives of thousands of victims of digital violence. It is crucial that these changes are also supported by the other EU institutions in the upcoming trilogue negotiations.
We take stock of the European Parliament's current position – a mixed record – and explain which successes we are celebrating and where urgent improvements are still needed in the upcoming negotiations.
Here is what we have achieved with your help so far
1. A milestone for victims: regulate porn platforms
At last, porn platforms could be obliged to take serious action against the abuse of nude images on their channels. The EU politicians' provisions contain three key points:
- Every person who uploads content must provide information about their identity,
- Trained moderators are supposed to keep an eye on all content and intervene in critical cases,
- Separate reporting channels for abusive material should make reporting and removal easier and faster.
People affected by so-called revenge porn, deepfakes and other forms of image-based violence in particular could now gain more options to defend themselves. Women especially are affected by this kind of image-based sexual abuse. Together with other civil society organizations, we campaigned for the issue to be recognized by MEPs and addressed in the DSA.
Anna-Lena von Hodenberg on the developments around the DSA in Brussels. Photo: Andrea Heinsohn
"The scale and consequences of image-based sexual abuse are devastating, and I am pleased that the European Parliament has taken up this cause. This is a milestone that should be celebrated by everyone who has worked with HateAid, from victims to NGOs to academics and MEPs."
Anna-Lena von Hodenberg, executive director of HateAid
2. Power to the users: affected people can turn to authorities to obtain orders
It is also positive that the European Parliament has partially incorporated our demand for more rights for users into its provisions. This especially concerns those who are personally affected by illegal online content. Affected people should be able to turn to the authorities and actively obtain orders against online platforms in order to get illegal content removed. Example: if a user is attacked online with death threats or racist incitement and a platform ignores her report and deletion request, she can contact the authorities and request removal through this channel.
3. Opening the black box: more transparency about risks & impact
Online platforms currently shield all information about content moderation and the algorithms their systems are based on. Together with PeopleVsBigTech, an international alliance of civil society organizations, we argued that it is not justified to treat this as a trade secret. Because the way content of all kinds is handled can have a significant impact on the well-being of users around the world, it is important to gain insight into the mechanisms of the platforms and to regulate them where necessary. This is all the more explosive in view of Frances Haugen's revelations in the Facebook Files. As a former Meta employee, she revealed that the group's algorithms preferentially disseminate hateful content in order to generate profits.
For the European Parliament, transparency is therefore the key word. Very large online platforms must critically evaluate the risks they pose to public safety, health, fundamental rights and a range of other areas. Much might have been prevented if such measures had been introduced earlier – perhaps there would be far fewer teenagers suffering from mental health problems due to the negative effects of Instagram.
The revelations of whistleblower Frances Haugen show what damage the Meta Group can do with its business model. Photo: HateAid
4. Access to platform data also for NGOs
In the past, AlgorithmWatch, which had to shut down its Instagram monitoring project to avoid risking a lawsuit from Facebook, was not the only organization to suffer from Big Tech's secretiveness. Other organizations also shied away from accessing platform data for fear of possible legal action by the large platforms. The DSA now gives hope that this might change soon. The EP's provisions stipulate that not only researchers but also certain civil society organizations should be given access to platform data.
But: crucial points are still missing
Affected people still have no right to complain to a platform if hateful content targeting them has not been removed. So if a platform refuses to delete hate speech, a defamation or even a death threat, affected people have no right to file a complaint directly with the platform. Nor are they granted any possibility of out-of-court dispute resolution. This right is only given to users who want to complain about content that may have been wrongfully deleted by platforms.
Digital violence on the net must end – with the DSA there could be an improvement for millions of those affected. Photo: Ali Selvi
This is not only completely incomprehensible, but also unjust! This detail creates a two-tier system among users. In addition, millions of people affected by hate comments and fake news are being let down. Hence our clear appeal to the European Union:
Affected people need protection and the opportunity to fight back! This position must now be upheld in the trilogue negotiations between the Commission, the European Parliament and the Council of the EU.