EU Releases Digital Services Act Report on Online Risks
(MENAFN) The European Commission, together with the Board of Digital Services Coordinators, released on Wednesday the inaugural Digital Services Act (DSA) systemic risk report.
The publication highlights the primary threats arising on major online platforms and search engines throughout the EU.
The report pinpointed a wide array of systemic risks, encompassing the dissemination of illegal content, dangers to fundamental rights, and mounting worries related to mental health and the protection of minors.
It also assessed the preliminary mitigation strategies implemented by very large online platforms (VLOPs) and search engines (VLOSEs) in line with the DSA's "transparency rules."
According to the analysis, platforms are repeatedly exposed to hazards linked to public health misinformation, the sale of harmful or unlawful goods online, and extensive, coordinated disinformation campaigns.
Regulatory authorities additionally highlighted the improper use of generative AI, noting its involvement in creating manipulated media and "child sexual abuse material."
A major section of the report concentrated on threats to minors, including exposure to "child sexual abuse material (CSAM)," grooming, sextortion, cyberbullying, and dangerous social media challenges.
Civil society organizations also voiced apprehensions about the commercial exploitation of child influencers and the absence of "accessible reporting tools" for young users.
Legal Disclaimer:
MENAFN provides the information "as is" without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the provider above.
