YouTube CEO Susan Wojcicki is okay with taking content down, but she doesn’t think it’s a good idea to review it before it goes up on the massive video-sharing platform.

That’s one big takeaway from her interview with Recode senior correspondent Peter Kafka at this year’s Code Conference.

“I think we would lose a lot of voices,” Wojcicki said. “I don’t think that’s the right answer.”

She also warned that it could be difficult to come up with criteria as to what could be uploaded in the first place: “What are the factors that you’re [using to] determine that? How are you deciding who is getting to be on the platform and have speech and who’s not?”

When Kafka pointed out that the company already makes such decisions, just after content is live on the platform, Wojcicki emphasized the importance of reviewing content after it publishes on the site. “We see all these benefits of openness, but we also see that that needs to be married with responsibility,” she said.

The YouTube CEO admitted that there will likely always be content on YouTube that violates its policies.

“At the scale that we’re at, there are always gonna be people who want to write stories,” she said, suggesting that journalists will always choose to focus on the negative aspects of YouTube in their reporting.

“We have lots of content that’s uploaded and lots of users and lots of really good content. When we look at it, what all the news and the concerns and stories have been about is this fractional 1 percent,” Wojcicki said. “If you talk about what the other 99 point-whatever-that-number is — that’s all really valuable content.”

“Yes, while there may be something that slips through or some issue, we’re really working hard to address this,” she said.

Last week, YouTube updated its hate speech policy and said it will take down “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” The policy directly mentioned removing videos that promote neo-Nazi content or videos that deny commonly accepted violent events, like the Holocaust or the Sandy Hook school shooting — but lots of other conspiracy theories and “borderline content” are still allowed on the platform.

Instead of approving videos ahead of time, Wojcicki suggested using tiers in which creators get certain privileges over time, like more distribution and monetization of their content.

“I think this idea of like not everything is automatically given to you on day one, that it’s more of a — we have trusted tiers,” she said.

In recent weeks, the company has confronted numerous issues. Last week, the video platform decided that YouTube creator Steven Crowder wasn’t violating its rules when he kept posting videos with homophobic slurs directed at Vox journalist Carlos Maza, though the company eventually demonetized Crowder’s channel.

YouTube has said that by limiting recommendations, comments, and sharing, it has reduced views of white supremacist videos by 80 percent since 2017; only now has it banned that content altogether. The company is one of several prominent tech companies trying to figure out how to deal with hateful content proliferating on their platforms. Facebook banned white supremacist content on Facebook and Instagram in March. Twitter says it is looking into it. But even when these companies do make rules prohibiting harmful content, the sheer volume of uploads and posts on their platforms makes it difficult to exclude content that breaks those rules. YouTube’s army of content creators uploads 500 hours of video to the site every minute.

Wojcicki instead wanted to focus on the improvements the video company has made in the past few years.

“Two years ago there were a lot of articles, a lot of concerns about how we handle violent extremism. If you talk to people who are experts in this field, you can see that we’ve made tremendous progress.”

“We have a lot of tools, we work hard to understand what is happening on it and really work hard to enforce the work that we’re doing. I think if you look across the work you can see we’ve made tremendous progress in a number of these areas,” Wojcicki said. “If you were to fast-forward a couple years and say, well, what that would look like in 12 months and then in another 12 months, what are all the different tools that have been built, I think you’ll see there will be a lot of progress.”



Author: Rani Molla