The European Commission has sent a request for information on the design and operation of recommender systems to YouTube, Snapchat, and TikTok. Recommender systems are machine learning algorithms that use data to suggest items that a user might be interested in.
Under the Digital Services Act (DSA), the EU’s landmark content moderation regulation, the Commission said these systems sit at the “heart” of the systemic risks that platforms present to users.
In a statement released on Wednesday (Oct. 2), the EU Commission said: “Under the DSA, platforms have to assess and adequately mitigate risks stemming from their recommender systems, including risks for the mental health of users and the dissemination of harmful content arising from the engagement-based design of these algorithms.”
The requests “also concern the platforms’ measures to mitigate the potential influence of their recommender systems on the spread of illegal content, such as promoting illegal drugs and hate speech.”
The Commission stated that it had asked TikTok for more information on the measures the company has implemented to prevent bad actors from manipulating the platform and to mitigate risks related to elections and civic discourse.
YouTube and Snapchat will need to address questions regarding the parameters of their recommendation algorithms and their potential role in amplifying systemic risks, including those related to the protection of minors.
The three social media firms have been given until November 15 to provide the requested information. The EU said their responses will inform any next steps, which could include opening formal investigations and, potentially, imposing fines.
EU opens formal proceedings against Meta over its recommender system
The EU has previously initiated non-compliance proceedings under the DSA, which requires tech companies to take stronger action against illegal and harmful content on their platforms. Those proceedings relate to the recommender systems of Meta’s Facebook and Instagram, AliExpress, and TikTok.
The Commission is not sending a new request to Meta, the parent company of Facebook and Instagram, because the same questions were covered in the formal proceedings opened against the company in May.
ReadWrite previously reported that the Commission was concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children and create so-called “rabbit-hole effects.”
Featured image: Midjourney