YouTube’s AI removed twice as many videos between April and June

Muscat – News agencies:

YouTube has taken down more videos in recent months than ever before, a surge the company attributes to machine learning.

With work-from-home orders in place, the video-sharing platform relied heavily on automated systems from April to June; those systems removed more than 11.4 million videos flagged as misleading or abusive.

Before the shift toward automation, YouTube’s human moderators had identified only some five million policy-violating videos from January to March.


YouTube employed the technology when parts of the US went into coronavirus lockdown, as allowing staff to review content outside the office could lead to sensitive data being exposed.

‘We normally rely on a combination of people and technology to enforce our policies,’ YouTube said in a blog post.

‘Machine learning helps detect potentially harmful content, and then sends it to human reviewers for assessment.’

‘Human review is not only necessary to train our machine-learning systems, it also serves as a check, providing feedback that improves the accuracy of our systems over time.’
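
That detect-then-review loop can be pictured as a simple pipeline: an automated classifier scores each video, anything above a flagging threshold goes into a human review queue, and the reviewers’ verdicts are fed back as labelled training data. The Python sketch below only illustrates that general workflow; the classifier, threshold, and helper names are hypothetical and are not YouTube’s actual systems.

    # Illustrative sketch of a detect-then-review moderation loop.
    # Every name and number here is hypothetical, not YouTube's actual code.
    import random
    from dataclasses import dataclass, field

    FLAG_THRESHOLD = 0.7  # assumed confidence above which a video is queued for human review

    @dataclass
    class Video:
        video_id: str

    @dataclass
    class ModerationState:
        review_queue: list = field(default_factory=list)
        removed: list = field(default_factory=list)
        training_data: list = field(default_factory=list)  # feedback used to retrain the model

    def score_video(video: Video) -> float:
        """Stand-in for an ML model estimating the probability a video violates policy."""
        return random.random()

    def human_review(video: Video) -> bool:
        """Stand-in for a human reviewer's verdict (True means the video violates policy)."""
        return random.random() > 0.5

    def moderate(videos: list, state: ModerationState) -> None:
        # Machine learning detects potentially harmful content...
        for video in videos:
            if score_video(video) >= FLAG_THRESHOLD:
                state.review_queue.append(video)
        # ...then sends it to human reviewers for assessment.
        while state.review_queue:
            video = state.review_queue.pop()
            violates = human_review(video)
            if violates:
                state.removed.append(video)
            # Either way, the human decision becomes labelled feedback for the model.
            state.training_data.append((video.video_id, violates))

    moderate([Video(f"v{i}") for i in range(10)], ModerationState())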

‘When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement,’ YouTube said.

It chose to ‘cast a wider net’ so potentially harmful content would be removed quickly.
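
‘Casting a wider net’ amounts to lowering the confidence bar at which automated action is taken: a lower threshold misses fewer genuinely harmful videos (less under-enforcement) but removes more harmless ones (more over-enforcement). The toy scores and thresholds below are invented purely to make that trade-off concrete.

    # Toy illustration of the under- vs over-enforcement trade-off.
    # The scores and thresholds are invented for illustration only.
    harmful_scores = [0.95, 0.80, 0.65, 0.55]    # model scores for truly harmful videos
    harmless_scores = [0.60, 0.40, 0.30, 0.10]   # model scores for harmless videos

    def enforcement_stats(threshold: float) -> tuple:
        missed_harmful = sum(s < threshold for s in harmful_scores)       # under-enforcement
        removed_harmless = sum(s >= threshold for s in harmless_scores)   # over-enforcement
        return missed_harmful, removed_harmless

    print(enforcement_stats(0.9))  # strict threshold: (3, 0) - harmful videos slip through
    print(enforcement_stats(0.5))  # "wider net":      (0, 1) - all harmful caught, one wrongful removal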

According to YouTube’s latest Community Guidelines Enforcement report, issued Tuesday, more than 11.4 million videos were removed in the second quarter of 2020.

In the second quarter of 2019, YouTube removed just under 9 million videos.

Particular focus was placed on videos potentially harmful to children, which were removed at more than three times the usual rate.

Flagged clips included ‘dares, challenges, or other innocently posted content that might endanger minors.’

However, content creators whose videos were removed without human review were not issued strikes against their accounts, except in extreme circumstances.

YouTube also ramped up resources to handle the expected increase in appeals, which doubled during the second quarter.

Fully half of those appeals resulted in videos being reinstated, up from 25 percent in the first quarter.

‘The impact of COVID-19 has been felt in every part of the world, and in every corner of our business,’ the company stated. ‘Through these challenging times, our commitment to responsibility remains steadfast.’

YouTube is hardly the only social media platform faced with a content-moderation crisis during the pandemic.

In April, Facebook came under fire when posts about making DIY face masks were blocked by an algorithm designed to weed out coronavirus scams and misinformation.

‘We apologize for this error and are working to update our systems to avoid mistakes like this going forward,’ the company said in a statement to The New York Times. ‘We don’t want to put obstacles in the way of people doing a good thing.’

At the same time, Facebook blamed the pandemic for hampering efforts to remove posts about suicide and self-harm.

The company revealed that between April and June it took action on fewer posts containing such content because fewer human reviewers were working due to COVID-19.

Facebook sent its moderators home in March, and CEO Mark Zuckerberg warned that enforcement requiring human intervention could be affected.
