By Sheila Mckenzie
A mother who is among British parents suing TikTok after the deaths of five children has described a hearing in the United States as “deeply painful”.
Ellen Roome, 49, from Gloucestershire, has been campaigning since her 14-year-old son Jools Sweeney died at home in Cheltenham in 2022.
She travelled to Delaware to attend the hearing in the case filed by the Social Media Victims Law Centre against TikTok and its parent company ByteDance.
The case alleges Jools, Isaac Kenevan, 13, Archie Battersbee, 12, Noah Gibson, 11, and Maia Walsh, 13, died while attempting an online challenge.
The blackout challenge, sometimes described in media coverage as one of a wave of dangerous “social media challenges”, is based on a much older risky behaviour known as the choking game. Participants deliberately restrict oxygen to the brain, often by self-strangulation or by holding their breath, to achieve a brief euphoric sensation. This can cause seizures, brain damage or death, even on a first attempt.
On platforms like TikTok, the blackout challenge appeared in clips and trends that reached young users, particularly from 2021 onwards. Algorithmic recommendations made it more visible to children and teenagers, even those who never searched for it.
Because the challenge involves cutting off oxygen, it can cause unconsciousness within seconds, and even brief attempts can result in permanent brain injury or death. For this reason it is broadly condemned by health experts and banned on mainstream social platforms, yet clips discussing, depicting or gamifying the behaviour have still circulated.
Ms Roome is one of five British parents suing TikTok and its parent company ByteDance in a US court, alleging that their children died after attempting the blackout challenge and arguing that TikTok’s recommendation system exposed them to harmful content.
The parents believe that the children saw the dangerous challenge on TikTok before their deaths. The platform says the blackout challenge has been blocked on the service since 2020 and that it removes dangerous content proactively, but the parents do not have direct evidence of what their children were shown because relevant viewing data may have been deleted under data protection rules.
The hearing the mother spoke of was not about the substance of whether TikTok promoted specific videos. It was an early procedural stage in the Delaware lawsuit, determining whether the case can move forward and what evidence might be obtained.
Sitting through legal arguments about technical motions while their children’s deaths hung over the proceedings was “deeply painful”, the families said, and contrasted sharply with the personal nature of their grief.
If the judge allows the case to continue, the next phase would be discovery, where the families hope to obtain TikTok data that could shed light on what their children saw or were shown before their deaths.
Ms Roome said: “We now have to wait for the judge to decide whether the case is dismissed or whether we are allowed to proceed to the discovery stage.
“Sitting through the hearing was incredibly hard. The language was cold, technical and legal. For the court, this is about motions and procedures. For us, it is about our children. Our dead children.”
She continued: “Listening to lawyers argue abstract points while the reality of our loss sat silently behind every word was deeply painful. This is our lived experience, our grief, and our determination to find the truth and protect other children.
“Whatever the outcome, we showed up. We spoke for our children. And we will keep going. Thank you to everyone who continues to support us.”
Ms Roome sold the financial business she had run for 18 years to campaign for Jools’ Law, a right for parents to access their deceased child’s data without a court order.
She is also pushing for wider changes to social media to improve the safety of children online.
Since her son’s death, Ms Roome has been trying to obtain data from TikTok and ByteDance which she believes could provide an explanation as to what happened.
Ms Roome had spoken before the hearing about the importance of social media companies being held to account for children viewing harmful content on their platforms.
“This is about accountability,” she said previously.
“We are in Delaware to make sure social media companies are held responsible for the harm caused on their platforms.
“What happens online does not stay online. The impact is real, and for too many families, devastating.
“This is not about banning the internet. It is about stopping platforms being addictive by design, exposing children to harm, and avoiding responsibility when the worst happens.
“Children deserve protection. Parents deserve answers. And tech companies must be held to account.
“We are here to make sure other children are safer.”
Matthew Bergman, founder of the Social Media Victims Law Centre, who is acting for the parents, told the court:
“TikTok is a singular global product that operates in the same manner irrespective of location.”
“They design products to be addictive … [the For You Page] shows kids not what they want to see, but what they can’t look away from.”
Speaking to reporters outside court, he added:
“TikTok has a ‘For You Page’ that deluges young people, young kids, with dangerous material, in this case, dangerous choking challenges.”
“They take advantage of the underdeveloped frontal cortex of an adolescent. They take advantage of an adolescent’s desire for social acclimation, to engage in dangerous challenges … and they take advantage of the immaturity and less acute judgment that young people have, not because they’re bad people, not because they’re bad kids, because they’re kids.”
TikTok has applied to dismiss the case, arguing that the UK residents are suing US entities which do not operate or provide the social media firm’s services in the UK.
It says established US law, such as the First Amendment, bars liability for third-party content on TikTok.
A spokesperson for TikTok said: “Our deepest sympathies remain with these families. We strictly prohibit content that promotes or encourages dangerous behaviour.
“Using robust detection systems and dedicated enforcement teams to proactively identify and remove this content, we remove 99% that’s found to break these rules before it is reported to us.
“As a company, we comply with the UK’s strict data protection laws.”
It is understood the online challenge which the parents say is responsible for their children’s deaths has been blocked on TikTok since 2020.
Since her son’s death, Ellen has been unable to access his online data – leading to her campaign for Jools’ Law in the UK, which would allow for the automatic preservation of a child’s online data immediately after their death.
“If they had nothing to hide, why would they not show me his data?” asks Ellen. “I’ve sat in Parliament with members of TikTok and begged for my son’s data with other bereaved parents, and they just won’t release it.”