Broadcast United

10-year-old Nylah Anderson dies in viral ‘blackout challenge’. Now TikTok could be held responsible

Broadcast United News Desk



Louise Thomas

TikTok “challenges” have been blamed for severe spinal injuries and horrific burns and, not long ago, for a global shortage of Ozempic.

Today, the world’s second most popular social media app will be forced to explain itself as it faces allegations that it knowingly served potentially dangerous videos to young people, ignoring warnings of death and destruction in an attempt to maximize profits. A ruling by a U.S. appeals court on Tuesday breathed new life into a major lawsuit brought by a grieving mother who holds TikTok responsible for her daughter’s death.

U.S. Circuit Judge Paul Matey wrote in his partial concurrence that federal law “provides TikTok with immunity from lawsuits for hosting videos produced and uploaded by third parties.” But the law does not protect TikTok from liability for allegedly distributing and targeting videos while knowing that they could be harmful.

“Big Tech just lost its ‘get out of jail free’ card,” Jeffrey Goodman, an attorney representing the Anderson family in the appeal, said in a statement provided on Wednesday to The Independent. “This ruling ensures that powerful social media companies must play by the same rules as all other companies and will face court when they recklessly cause harm to children.”

TikTok has said that user safety is its top priority.

Nylah Anderson’s mother attended a press conference following her 2022 lawsuit against TikTok, in which she seeks to stop the company from serving dangerous content to impressionable kids (Screenshot: Saltz Mongeluzzi & Bendesky)

It was the infamous TikTok “blackout challenge” that killed 10-year-old Nylah Anderson. Philadelphia-area resident Tawainna Anderson found her daughter’s body on her bedroom floor in December 2021 after the “active, happy, healthy and very smart” girl attempted a challenge on the world’s second most popular non-gaming app in which users strangle themselves with belts, towels or other objects until they pass out, according to her mother. Nylah was rushed to a hospital and died five days later.

Between 2021 and 2022, the blackout challenge allegedly killed 20 children in 18 months; 15 of them were under the age of 12. And it is by no means the only dangerous “challenge” to go viral. The “skull-breaker challenge,” in which participants kick a person’s legs out from under them as they jump into the air, nearly paralyzed a 13-year-old Pennsylvania girl and led to criminal charges. The so-called “angel of death challenge,” a “game” in which participants jump in front of moving vehicles to see if drivers can stop in time, has allegedly caused multiple deaths. And when a 12-year-old Arizona boy tried TikTok’s “fire challenge,” in which young people record themselves lighting fires in their homes, he ended up in the intensive care unit and has since undergone multiple surgeries.

In May 2022, Anderson sued TikTok and its Chinese parent company ByteDance, claiming they knew about the “blackout challenge” and its dangers but still deliberately served videos to young, impressionable children. The lawsuit said the “blackout challenge” appeared on Nylah’s “Recommended for You” page.

“I can’t stop replaying that day in my mind,” Anderson said at a news conference shortly after the lawsuit was filed. “It’s time to end these dangerous challenges so other families don’t have to go through the pain we go through every day.”

Anderson said in her initial complaint that TikTok executives “had no doubt that the deadly Blackout Challenge was spreading on its app and that its algorithm specifically served the Blackout Challenge to children, including those who had died.” TikTok told The Independent it was developing controls to protect minors from harmful content and said it had “no evidence” of the “blackout challenge” on its platform.

TikTok is the second most popular app in the world, behind only Instagram (Getty Images)

However, in October 2022, a federal judge dismissed the case, ruling that TikTok was protected by an obscure but hotly debated provision of a 1996 law, one that former President Donald Trump attempted to repeal after his tweets started being labeled as misinformation.

“Despite the tragic circumstances of this case, I am constrained to rule that defendants are not liable under Section 230 of the Communications Decency Act because plaintiffs seek to hold defendants liable as ‘publishers’ of third-party content,” U.S. District Judge Paul Diamond wrote in his ruling dismissing the case.

In the appeal, Anderson’s attorneys argued that while TikTok did not produce the videos in question, the company “took no steps or took wholly inadequate steps to eradicate and prevent the spread of the Blackout Challenge, and in particular to prevent the Blackout Challenge from being shown to children… Instead, TikTok continued to recommend these videos to children like Nylah.”

Tuesday’s ruling by the 3rd U.S. Circuit Court of Appeals allows Anderson’s lawsuit to proceed again, with TikTok and ByteDance named as defendants.

In his opinion, Matey, a former white-collar criminal defense attorney who was appointed to the bench by Trump in 2019, noted that Nylah, “still in her first year of adolescence, (…) probably had no idea what she was doing or that following the images on the screen would kill her.”

“For decades, big tech companies like TikTok have used Section 230 to shield themselves from liability for their egregious and predatory behavior,” Goodman’s co-counsel Samuel Dordick said in his own statement. “This powerful ruling makes clear that Section 230 does not extend that far.”

Anderson, through Goodman and Dordick, issued a statement saying, “Nothing can bring our beautiful daughter back. But we take comfort in the fact that by holding TikTok accountable, our tragedy may help other families avoid unimaginable pain in the future.”


