YouTube has removed less than 1% of flagged hate videos

YouTube removed less than 1% of the roughly 15 million videos flagged on its platform as hateful, according to figures provided to MPs.

Figures requested by the Home Affairs Select Committee show that, between July and December of last year, the video giant deleted 25,145 of the 14,994,703 clips flagged on its site as hateful or abusive, a removal rate of just 0.17%. The committee requested the numbers from YouTube as part of its inquiry into hate crime.

Committee chair Yvette Cooper said: “We have raised with YouTube time and time again the issue of hateful and extremist content, yet they have repeatedly failed to act. Even worse than simply hosting these channels, YouTube’s money-making algorithms are actually promoting them, pushing people towards more and more extremist content with every click.

“We know what can happen when hateful content proliferates online, and yet YouTube and other businesses continue to benefit from pushing this poison.

“It’s simply not good enough. Other social media companies are at least trying to get to grips with the problem, but neither YouTube nor Google takes any of this seriously. They should be held responsible for the damage they do and the hatred and extremism they help spread.”

According to YouTube, most of the videos were flagged by artificial intelligence rather than by humans. It said, however, that these automated systems struggled with the “complex” area of hate speech, and that the flags were often found to be inaccurate when human moderators reviewed them.

The select committee asked for the figures after last month’s appearance before MPs by Marco Pancini, YouTube’s director of public policy for Europe, the Middle East and Africa.

Despite the release of the figures, the committee is still waiting for Google’s answers to a number of other questions raised during the hearing.

Among them is why videos of the Christchurch shooting still appeared on YouTube more than a month after the massacre. The company came under heavy criticism after copies of the live stream, originally posted by the shooter on Facebook, were uploaded to YouTube thousands of times on the day of the terror attack.

A letter from the committee said that one of the Christchurch videos had received more than 720,000 views. MPs also want Google executives to explain why YouTube recommends videos of far-right figures such as Stephen Yaxley-Lennon (also known as Tommy Robinson), including to viewers who have never watched such content before.
