The EU announced investigations on Thursday into YouTube and TikTok to find out what action the US- and Chinese-owned platforms are taking to ensure the safety of minors.
The European Commission said it had sent formal requests for information to TikTok and YouTube, the first step in procedures launched under the EU’s new law on digital content.
The European Union’s executive arm said it wanted to know what measures the video-sharing platforms have taken to comply with the Digital Services Act (DSA), especially regarding the risks posed to children’s mental and physical health.
The DSA is part of the EU’s powerful armoury to bring big tech to heel, and demands that digital giants do more to counter the spread of illegal and harmful content as well as disinformation.
Platforms face fines of up to 6 percent of global turnover for violations.
TikTok, owned by Chinese company ByteDance, is particularly popular with younger users, while YouTube is part of the Alphabet digital empire that includes Google.
Both companies must respond by 30 November.
The EU’s top tech enforcer, Thierry Breton, said in August that child protection would be an enforcement priority for the DSA.
The law has also banned targeted advertising to minors aged 17 years and under.
“We have sent a request for information to TikTok and YouTube, requesting the companies to provide more information on their obligations related to protection of minors under the Digital Services Act.
More info ↓ https://t.co/rA26Km5CIx #DSA”
— European Commission (@EU_Commission) November 9, 2023
The EU already launched probes into TikTok, Twitter (rebranded as X) and Facebook-parent Meta over disinformation following the 7 October Hamas attack in Israel.
Suicidal content
Earlier this week, the NGO Amnesty International also sounded the alarm, publishing two reports including one on TikTok’s “For You” feed, which it said risked “pushing children and young people towards harmful mental health content”.
“Technical research in partnership with the Algorithmic Transparency Institute and AI Forensics using automated accounts showed that after five to six hours on the platform, almost one in two videos were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health,” the report said.
“TikTok’s content recommender system and its invasive data collection practices pose a danger to young users of the platform by amplifying depressive and suicidal content that risks worsening existing mental health challenges.”
🚨 We’ve found that TikTok abuses your privacy, is addictive by design, and younger users signaling an interest in mental health risk falling into rabbit holes of harmful content.
Join our global campaign to make TikTok safer for younger users! #FixTikTok
TW: self harm, suicide pic.twitter.com/X9cL2OENgr
— Amnesty Tech (@AmnestyTech) November 7, 2023
Lisa Dittmer, a researcher with Amnesty, said the findings exposed TikTok’s “manipulative and addictive design practices”, which she said were designed to keep users engaged for as long as possible.
“They also show that the platform’s algorithmic content recommender system, credited with enabling the rapid global rise of the platform, exposes children and young adults with pre-existing mental health challenges to serious risks of harm,” Dittmer said.
Stricter curbs
The move by the EU’s Breton comes three days after he told TikTok chief executive Shou Zi Chew to spare no effort to counter disinformation on its platform.
The DSA also demands that tech companies do more to counter the spread of illegal goods.
The European Commission has also launched an investigation into China’s AliExpress over what action it is taking to protect consumers online from illegal products, including fake medicines.
TikTok and YouTube are also among 22 services listed by the EU in September that face stricter curbs on how they do business under the DSA’s sister law, the Digital Markets Act (DMA).
Companies must fully comply with the DMA by March 2024.
(with AFP and Reuters)