YouTube: 'We don't take you down the rabbit hole'

Media caption: WATCH: “It starts to look like censorship”

YouTube has defended its video recommendation algorithms, amid suggestions that the technology serves up increasingly extreme videos.

On Thursday, a BBC report explored how YouTube had helped the Flat Earth conspiracy theory spread.

But the company’s new managing director for the UK, Ben McOwen Wilson, said YouTube “does the opposite of taking you down the rabbit hole”.

He told the BBC that YouTube worked to dispel misinformation and conspiracies.

But he warned that some types of government regulation could start to look like censorship.

YouTube, like other internet giants such as Facebook and Twitter, has some big decisions to make. All must decide where to draw the line between freedom of expression, hateful content and misinformation.

And the government is watching. It has published a White Paper laying out its plans to regulate online platforms.

In his first interview since starting his new role, Mr McOwen Wilson spoke about the company’s algorithms, its approach to hate speech and what it expects from the UK government’s “online harms” legislation.

Algorithms

YouTube uses algorithms to recommend more videos for you to watch. These video suggestions appear in the app, down the side of the website and also show up when you get to the end of a video.

But YouTube has never explained exactly how its algorithms work. Critics say the platform offers up increasingly sensationalist and conspiratorial videos.

Mr McOwen Wilson disagrees.

“It’s what’s great about YouTube. It is what brings you from one small area and actually expands your horizon and does the opposite of taking you down the rabbit hole,” he says.

Media caption: WATCH: Has YouTube helped conspiracy theorists thrive?

“Very often it doesn’t take you to content that’s exactly like the one you’ve watched before.”

Even so, Mr McOwen Wilson says YouTube has started adding a sort of “warning” label to certain conspiracy topics.

“If it’s misinformation, we provide correct information around that. We work with Encyclopaedia Britannica and Wikipedia to provide knowledge panels that come up on the side of the screen. So if you’re watching a flat Earth video… we will present to you a link to the facts about that.”

Facebook used to do something similar with fake news. It would label false stories as “disputed” with a red warning label, and offer up other sources of information. But the social network later said this had often entrenched people’s pre-existing views and made the problem worse.

“We haven’t found that,” says Mr McOwen Wilson. He says the platform reduces the spread of content designed to mislead people, and raises up “authoritative voices”.

He names BBC News, the Guardian, the Telegraph and the Sun as examples of authoritative sources.

Some conspiracy theories – such as Holocaust denial – have been banned on the platform completely.

LGBT row

In June, a row erupted between two YouTube video-makers.

Vox reporter Carlos Maza posted a video showing all the times that comedian Steven Crowder had mocked him for being gay, or used insulting language attacking his sexual orientation and ethnicity. Mr Crowder said the videos were “friendly ribbing”.

After a series of muddled statements on Twitter, YouTube eventually confirmed that Mr Crowder had not broken its hate speech rules.

“Was the language used hate speech? Was there incitement against Carlos Maza from the other creator? In that instance, we found that there was not,” says Mr McOwen Wilson.

“I think that remains the right policy decision to have made.”

That decision disappointed Mr Maza’s supporters – and many of YouTube’s own staff. More than 100 signed a petition asking for Google to be kicked out of the San Francisco Pride parade.

Image caption: Carlos Maza presents videos for Vox (image copyright: Vox Media)

The language may not have been “hate speech”, but critics argue that mocking somebody for being gay crosses a line into bullying.

“It doesn’t currently breach our harassment policies,” says Mr McOwen Wilson.

But he adds: “We are inarguably pro-LGBT. I wouldn’t want anyone to judge us only on that. I don’t think it invalidates everything else that we’ve done.”

He points out that YouTube has provided a platform for people to express their sexuality in a largely “supportive environment”.

“I don’t think any of that should be invalidated because of where we have drawn this line on the Maza-Crowder issue.”

Time well spent

YouTube tells its video-makers that one key to success on the platform is “watch time”: making sure viewers stick around for longer.

Facebook, on the other hand, has been talking more about “time well spent” on the platform. It says it is more important that people have a good time on Facebook.

How do the two approaches compare?

“One of the biggest and most positive steps that was taken on the platform, that drove down a huge amount of trashy content, was the shift from ‘views’ to watch time”, says Mr McOwen Wilson.

“The best way for an audience to tell us whether they like what they’re being served isn’t whether they click on it in the first place, but whether they spent any of their time with it.”

But does the system encourage video-makers to make longer videos, and draw out simple how-to clips into a 20-minute extravaganza?

Mr McOwen Wilson says the videos which are most viewed are those that people watch in their entirety.

And he adds: “Clearly a longer one that is viewed the whole way through by the majority of its audience is more likely to come up.”

Regulation

The UK government is currently weighing up how online platforms such as YouTube could be regulated. In April, Culture Secretary Jeremy Wright said the “era of self-regulation for online companies is over”.

Is YouTube worried?

“The moment you put somebody in charge… there is somebody who is filtering what content goes out,” warns Mr McOwen Wilson.

“If they’re government-appointed, that begins to look very much like censorship, and we don’t launch in markets where that is a risk.

“I don’t think it would be the right answer to have anybody at YouTube – or indeed anywhere else – editorialising all of the content that comes up on to our platform.”

Image caption: Culture Secretary Jeremy Wright says the era of self-regulation is over (image copyright: PA Media)

And either way, it would be impossible. About 500 hours of video are uploaded to YouTube every minute.

The summer holidays have started – or are about to start – across the UK for thousands of children.

“By the time most of them go back to school, there will be more content uploaded to YouTube than has ever been created in the history of television or film globally,” says Mr McOwen Wilson.

He suggests a regulator could determine areas where online platforms should have policies, but the platforms themselves should create the policies.

“The world will be watching where the UK lands on this,” he says.

“There are regimes out there who will mirror – in their own ways – the position that they view the UK has taken.

“There is a risk – and actually a huge opportunity – for the UK to show leadership on what balanced regulation could look like in an open environment.”