RoboNewsWire – Latest Insights on AI, Robotics, Crypto and Tech Innovations

Widespread availability of graphic Charlie Kirk shooting video shows content moderation challenges

By GT | September 12, 2025 | Technology | 6 Mins Read


Immediately after Charlie Kirk was shot during a college event in Utah, graphic video of what happened was available almost instantly online, from several angles, in slow-motion and real-time speed. Millions of people watched — sometimes whether they wanted to or not — as the videos autoplayed on social media platforms.

Video was easy to find on X, on Facebook, on TikTok, on Instagram, on YouTube — even on President Donald Trump’s Truth Social. The platforms, generally, said they were removing at least some of the videos if they violated their policies, for instance if the person was glorifying the killing in any way. In other cases, warning screens were applied to caution people they were about to see graphic content.

Two days after Kirk’s death, videos were still easily found on social media, despite calls to remove them.

“It was not immediately obvious whether Instagram, for example, was just failing to remove some of the graphic videos of Charlie Kirk being shot or whether they had made a conscious choice to leave them up. And the reason that it was so hard to tell is that, obviously, those videos were circulating really widely,” said Laura Edelson, an assistant professor of computer science at Northeastern University.

The events illustrate the content moderation challenges platforms face in handling fast-moving real-time events, complicated by the death of a polarizing conservative activist who was shot in front of a crowd armed with smartphones recording the moment.

Ambiguous policies

It’s an issue social media companies have dealt with before. Facebook was forced to contend with livestreamed violence after a mass shooting in New Zealand in 2019. People have also livestreamed fights, suicides and murders.

Similar to other platforms, Meta’s rules don’t automatically prohibit posting videos like Kirk’s shooting, but warning labels are applied and they are not shown to users who say they are under 18. The parent company of Instagram, Facebook and Threads referred a reporter to the company’s policies on violent and graphic content, which they indicated would apply in this case, but had no further comment.

YouTube said it was removing “some graphic content” related to the event if it doesn’t provide sufficient context, and restricting videos so they could not be seen by users under age 18 or those who are not signed in.

“We are closely monitoring our platform and prominently elevating news content on the homepage, in search and in recommendations to help people stay informed,” YouTube said.

In a statement, TikTok said it is “committed to proactively enforcing our Community Guidelines and have implemented additional safeguards to prevent people from unexpectedly viewing footage that violates our rules.”

TikTok also moved to restrict the footage from its “For You” feed so people have to seek it out if they want to see it, added content warning screens, and worked to remove videos that showed graphic, close-up footage.

Rewarding engagement

Social media platforms’ algorithms reward engagement. If a video gets a lot of reactions, it moves to the top of people’s feeds, where more people see it and engage with it, continuing the cycle.
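The feedback loop described above can be sketched as a toy ranking function. The weights, field names and scores here are entirely hypothetical, invented for illustration; they are not any platform's actual formula.

```python
# Toy illustration of an engagement-driven feed ranking loop.
# Weights and post fields are hypothetical, not any platform's real formula.

def engagement_score(post):
    """More reactions -> higher score -> higher feed placement."""
    return (post["likes"] * 1.0
            + post["comments"] * 2.0
            + post["shares"] * 3.0)

def rank_feed(posts):
    # Posts with the most engagement float to the top, where they collect
    # still more engagement -- the self-reinforcing cycle described above.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    {"id": "a", "likes": 10, "comments": 2, "shares": 1},
    {"id": "b", "likes": 500, "comments": 300, "shares": 120},  # viral clip
    {"id": "c", "likes": 40, "comments": 10, "shares": 5},
]
print([p["id"] for p in rank_feed(feed)])  # the viral clip ranks first
```

Nothing in the scoring asks whether the content is graphic; a purely engagement-weighted sort surfaces whatever draws the strongest reaction.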

“I mean, this is the world that we have all made. This is the deal we all made. The person who gets to decide what’s newsworthy on Instagram is Mark Zuckerberg. The person who gets to decide what stays up on X is Elon Musk. They own those platforms, and they get to decide what is on them. If we want another world, well, then someone else needs to make it,” Edelson said. “The fact is that we live in a world where the most important channels for what information circulates are controlled by single individuals.”

And it is these individuals who decide what to prioritize. Meta, X and other social platforms have cut back on human content moderation in recent years, relying instead on artificial intelligence that can both over- and under-moderate.

Regulations vary by region

The U.S. has no blanket regulation prohibiting violent content from being shown on the internet, although platforms generally attempt to restrict minors from seeing it. Of course, this doesn’t always work, since users’ ages are not always verified and kids often lie about their ages when signing up for social platforms.

Authorities in other places have drawn up laws that require social media companies to do more to protect their users from online harm. Britain and the European Union both have wide-ranging laws that make tech platforms responsible for “online safety.”

Britain’s Online Safety Act requires platforms, even those not based in the United Kingdom, to protect users from more than a dozen types of content, from child sexual abuse to extreme pornography.

Content that depicts a criminal offense such as a violent attack on someone isn’t necessarily illegal content, but platforms would still have to assess whether it falls foul of other banned material such as encouraging terrorism.

The British government says the rules are especially designed to protect children from “harmful and age inappropriate content” and give parents and children “clear and accessible ways” to report problems online.

That includes content material that “depicts or encourages serious violence or injury,” which online services are required to prevent children from seeing.

Violations of the U.K. rules can be punished with fines of up to 18 million pounds ($24.4 million) or 10% of a company’s annual revenue, whichever is greater, and senior managers can also be held criminally liable for not complying.
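As quick arithmetic, the two-part cap means the 18-million-pound floor binds for smaller companies, while the revenue-based figure dominates for large ones. The revenue numbers below are invented for illustration:

```python
# U.K. Online Safety Act penalty cap: up to 18 million pounds or 10% of
# annual revenue, whichever is greater. Revenue figures are made up.

FLAT_CAP_GBP = 18_000_000
REVENUE_SHARE = 0.10

def max_fine_gbp(annual_revenue_gbp):
    return max(FLAT_CAP_GBP, REVENUE_SHARE * annual_revenue_gbp)

# Small platform: 10% of revenue is only 5 million, so the flat cap applies.
print(max_fine_gbp(50_000_000))     # flat cap (18 million)
# Large platform: 10% of revenue far exceeds the flat cap.
print(max_fine_gbp(5_000_000_000))  # revenue-based cap (500 million)
```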

The U.K. law is still fairly new, and only started taking effect in March as it gets rolled out in stages.

The rest of Europe has a similar rule book that took effect in 2023.

Under the European Union’s Digital Services Act, tech companies are required to take more responsibility for material on their sites, under threat of hefty fines. The biggest online platforms and search engines, including Google, Facebook, Instagram and TikTok, face extra scrutiny.

Platforms should give users “easy-to-use” mechanisms to flag content deemed illegal, such as terrorism and child sexual abuse material, Brussels says, adding that platforms have to then act on reports in a “timely manner.”

But it doesn’t require platforms to proactively police for, and take down, illegal material.


AP Media Writer David Bauder contributed to this story.
