For the complete Q&A read this: https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348
New rights for users: At the same time, citizens will be able to flag illegal content, including illegal products, that they encounter, and to contest decisions made by online platforms when their content is removed: platforms are obliged to notify users of any decision taken and of the reasons for it, and to provide a mechanism to contest the decision.
More transparency on advertising: Users will also receive more information about the ads they see on online platforms – for example, whether and why an ad targets them specifically. Platforms will no longer be allowed to present behaviourally targeted ads to minors, nor to present ads to their users based on profiling that rests on special categories of personal data, such as ethnicity, political views or sexual orientation.
Clearer consequences: Users will be able to seek compensation from providers of intermediary services for any damage or loss suffered due to an infringement of the DSA by such provider.
How will you keep a fair balance with fundamental rights such as the freedom of expression?
The DSA puts protection of freedom of expression at its very core. This includes protection from government interference in people's freedom of expression and information. The horizontal rules against illegal content are carefully calibrated and accompanied by robust safeguards for freedom of expression and an effective right of redress – to avoid both under-removal and over-removal of content on grounds of illegality.
The DSA gives users the possibility to contest the decisions taken by online platforms to remove their content, including when these decisions are based on the platforms' terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body or seek redress before the courts.
The Digital Services Act sets out rules on the transparency of content moderation decisions. Users and consumers will gain a better understanding of how very large platforms impact our societies, and those platforms will be obliged to mitigate the risks they pose, including as regards freedom of expression. They will be held accountable through independent auditing reports and through specialised and public scrutiny.
All the obligations in the DSA, including the crisis response mechanism, are carefully calibrated to promote the respect of fundamental rights, such as freedom of expression.
How does the Digital Services Act address dark patterns?
Under the new rules, dark patterns are prohibited. Providers of online platforms must not design, organise or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of users of their services to make free and informed decisions.
The ban complements, but does not override, the prohibitions already established under EU consumer protection and data protection rules, under which a large number of dark patterns that mislead consumers are already banned.