Today, August 25, 2023, the Digital Services Act (DSA) comes into force in the EU, but what is it anyway? The Digital Services Act is a far-reaching EU law that protects the rights of consumers, combats illegal content and disinformation on the Internet, promotes fair competition between service providers of all sizes, and increases the transparency and accountability of online services.
The DSA applies to many different types of companies operating in the EU. It distinguishes four classifications of services:
1. Intermediary services – services offered by network infrastructures (e.g. ISPs, domain registrars)
2. Hosting services – cloud and web hosting services
3. Online platforms – services that store and display information to their users (e.g. social networks, marketplaces, travel sites, collaboration platforms, etc.)
4. Very large online platforms – online platforms with more than 45 million monthly active users in the EU (here is the published list)
Each successive category is considered a subset of the previous one and is subject to additional obligations under the DSA. There are 19 obligations in total, mandating required behavior in areas such as content moderation and reporting, transparency on advertising and recommendation algorithms, handling of complaints, and risk management.
The majority of these requirements take effect from February 17, 2024, but the DSA stipulates that very large platforms such as Google, Apple, Alibaba, X (Twitter), TikTok and the like must implement them by August 25, 2023. In addition, these platforms must comply with more stringent obligations that are proportionate to the significant societal risks they pose when they disseminate illegal and "harmful" content, including disinformation.
Furthermore, very large online platforms must assess and mitigate systemic risks and undergo independent audits each year. In addition, large platforms that use so-called "recommendation systems" (algorithms that determine what users get to see) must offer at least one option that is not based on profiling. And when a crisis occurs, e.g. a threat to public safety or health, the EU Commission can require very large platforms to limit urgent threats on their platforms; these special measures are limited to three months.
Ergo: censorship through the back door, which massively attacks freedom of expression and the rule of law. Is it therefore surprising that the DSA is also "advertised" by the WEF, where we can read:
The Digital Services Act harmonises the process by which platforms must be notified of illegal content and take action. In concrete terms, this means that platforms must remove illegal content "expeditiously" after it has been reported by trusted flaggers. The DSA also provides that users can be notified of and contest the removal of content by platforms, with access to dispute resolution mechanisms in their own country. While the Digital Services Act does not provide specific timelines for content removal, businesses must be prepared for rapid removal and have the proper procedures and capacity in place to respond to notifications from trusted flaggers. If platforms currently do not provide their users with explanations of their deletion decisions, this procedure must be introduced across the board.
Ultimately, the DSA is a tightening of the familiar Network Enforcement Act (NetzDG) and will, in the medium term, pave the way for the kind of censorship that the WHO also envisages in its amendments to the International Health Regulations (IHR), which I have already reported on several times. It should therefore be remembered at this point that, in my opinion, even small websites will have massive problems distributing their content in the medium term. Anyone who wants to escape this paternalism can host their pages with a non-EU provider; the other question is how long it will take until similar regulations are enforced there as well, because this is obviously planned worldwide...