YouTube asks users if they want to ‘reflect’ before posting offensive comments in new update

(Getty Images)

YouTube will now ask people whether offensive comments are “something [they] really want to share” in a new update to make the platform more inclusive.

The pop-up will come to Android users first, and will give the “commenter the option to reflect before posting”, the company said in an announcement.

It will also come with a link for users to review YouTube’s community guidelines, and an edit button, as well as an option for the user to post their original comment.

These prompts will not appear on every offensive comment, nor will every comment they appear on be offensive.

YouTube says that its system is “continuously learning” and the pop-up might appear on comments that do not violate its community guidelines. Comments can also be removed for violating the guidelines even if they do not trigger the pop-up.

“Our system learns from content that has been repeatedly reported by users. We hope to learn more about what comments may be considered offensive as we continue to develop it”, the company said.

YouTube took a similar approach at the start of the pandemic with regard to flagging videos that might violate its policies.


During the pandemic, YouTube “cast a wider net” by allowing its algorithms to remove more videos than was strictly necessary, saying it “accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible”.

YouTube also announced a better content filtering system for creators, which will filter out potentially inappropriate and hurtful comments that have been automatically held for review, so that creators have the option of removing them without reading them.

In addition, it says it is expanding the technology it uses to tackle hateful comments. “Since early 2019, we've increased the number of daily hate speech comment removals by 46x. And in the last quarter, of the more than 1.8 million channels we terminated for violating our policies, more than 54,000 terminations were for hate speech,” YouTube claims.

Finally, YouTube is attempting to better understand specific communities on its platform by asking creators to volunteer their gender, sexual orientation, race and ethnicity information – which YouTube claims will not be used for advertising purposes.

“We’ll then look closely at how content from different communities is treated in our search and discovery and monetization systems. We’ll also be looking for possible patterns of hate, harassment, and discrimination that may affect some communities more than others”, the company says.

The approach could make YouTube a more welcoming place for marginalised communities, something the company has consistently struggled with.

In 2017, the company admitted that it hid LGBTQ videos on its site under a "restricted mode" that aims to "screen out potentially objectionable content”. Two years later, a group of creators came forward to claim that YouTube systematically penalised queer content through its demonetisation system.

YouTube did not give a specific timeframe for its most recent changes, but said that its survey on gender and racial identity would launch initially in the US in early 2021. “If we find any issues in our systems that impact specific communities, we’re committed to working to fix them”, the company said.

