TikTok has until Friday to respond to Italy’s order to block users it can’t age-verify after girl’s death

TikTok has until Friday to respond to an order by Italy’s data protection agency to block users whose age it cannot verify, TechCrunch has learned.


The GPDP (Garante per la protezione dei dati personali) made an “immediate” order in response to the death of a 10-year-old girl from Palermo, who asphyxiated after participating in a “blackout challenge” on the social network, according to reports in local media.


The agency said the ban would remain in place until February 15 — suggesting it would make another assessment about any additional action at that point.


At the time of writing it does not appear that TikTok has taken action to comply with the GPDP’s order.


A spokeswoman told us it is reviewing the notification. “We have received and are currently reviewing the notification from Garante,” she said. “Privacy and safety are top priorities for TikTok and we are constantly strengthening our policies, processes and technologies to protect all users, and our younger users in particular.”


The GPDP had already raised concerns about children’s privacy on TikTok — warning that its age verification checks are easily circumvented and objecting to default settings that make users’ content public. On December 22 it also announced it had opened a formal procedure, giving TikTok 30 days to respond.


The order to block users whose age it cannot verify is in addition to that action. If TikTok does not comply with the GPDP’s administrative order, it could face enforcement from the Italian agency, drawing on penalty powers set out in the GDPR.


TikTok’s spokeswoman declined to answer additional questions about the order — which prohibits it from further processing the data of users “for whom there is no absolute certainty of age”, per the GPDP.







The company also did not respond when we asked if it had submitted a response to the agency’s formal procedure.


In a statement last week following the girl’s death the company said: “Our deepest sympathies are with the girl’s family and friends. At TikTok, the safety of our community — in particular our younger users — is our priority, and we do not allow content that encourages, promotes, or glorifies dangerous behaviour that might lead to injury. We offer robust safety controls and resources for teens and families on our platform, and we regularly evolve our policies and protections in our ongoing commitment to our community.”


TikTok has said it has found no evidence of any challenge involving asphyxiation on its platform.


There have, however, been a number of reports in recent years of underage users hanging themselves (or attempting to) after trying to copy things they saw on the platform.


Users frequently create and respond to content challenges as part of TikTok’s viral appeal — such as a recent trend for singing sea shanties.


At the time of writing, a search on the platform for ‘#blackoutchallenge’ returns no user content but displays a warning that the phrase “may be associated with behavior or content that violates our guidelines”.



Screengrab of the warning users see if they search for ‘blackout challenge’ (Image credit: TechCrunch)


There have been TikTok challenges related to “hanging” (as in people hanging from or off objects by parts of their body other than their neck) — and a search for that term does still return results (including some users discussing the death of the 10-year-old girl).


Last year a number of users also participated in an event on the platform in which they posted images of black squares — using the hashtag #BlackOutTuesday — in connection with Black Lives Matter protests.


So the term “blackout” has also been used on TikTok in relation to encouraging others to post content — though not, in that case, in relation to asphyxiation.







Ireland’s Data Protection Commission, which has been lined up as TikTok’s lead data supervisor in Europe — following the company’s announcement last year that its Irish entity would take over legal responsibility for processing European users’ data — does not have an open inquiry into the platform “at present”, per a spokesman.


But TikTok is already facing a number of other investigations and legal challenges in Europe, including an investigation by France’s data protection watchdog, the CNIL, into how the app handles users’ data.


In recent years, France’s CNIL has been responsible for handing out some of the largest penalties imposed on tech giants for infringing EU data protection laws.


It has also emerged that a 12-year-old girl in the U.K. is bringing a legal challenge against TikTok, claiming it uses children’s data unlawfully. A court has ruled she can remain anonymous if the case goes ahead.


Last month Ireland’s data protection regulator put out draft guidance on what it couched as “the Fundamentals for a Child-Oriented Approach to Data Processing” — with the stated aim of driving improvements in standards of data processing related to minors.


While the GDPR typically requires data protection complaints to be funnelled through a lead agency under the one-stop-shop mechanism, the GPDP’s order to TikTok to cease processing is possible under powers set out in the regulation (Article 66) that allow national watchdogs to undertake “urgency procedures” in instances of imperative risk. Any such provisional measures can only last for three months, however, and only apply in the country where the DPA has jurisdiction (Italy in this case).


Ireland’s DPC would be the EU agency responsible for leading any resulting investigation.


“If TikTok does not comply with our order, the sanctions provided for in the GDPR are applicable,” a spokesperson for the Italian agency confirmed.









This report was updated with comment from the GPDP.