Google: YouTube’s Ad Problem is “Very, Very Small”

Google has found itself in the middle of a PR crisis. Following a report by The Washington Post that Google showed advertisements on extremist or otherwise racist and offensive videos, many companies pulled their ads from the Google advertising network. Now, Google is attempting to salvage its multibillion-dollar advertising business.

In an interview with Recode, Google’s chief business officer, Philip Schindler, discussed the seriousness of the Washington Post report and the advertisement pull-back by major companies.

“It has always been a small problem,” with “very, very, very small numbers” of ads running against videos that aren’t “brand-safe,” Google’s chief business officer told Recode.

“And over the last few weeks, someone has decided to put a bit more of a spotlight on the problem.”

Schindler explained that there were “error rates” that caused ads to appear on a small number of offensive videos. A Google spokesperson later told Recode that these advertisements appeared on less than “1/1000th of a percent of the advertisers’ total impressions.”

But even if those ads appeared on controversial and racist videos only a small percentage of the time, it has caused a massive problem for Google.

Several news outlets, including the Wall Street Journal and The Times of London, have published reports and evidence of ads appearing on anti-Semitic and racist videos over the past few weeks. These reports led major companies such as AT&T and Walmart to remove their advertising campaigns from Google’s network.

Following the backlash, Google responded by changing its policies to give advertisers more control over where their ads appear and by promising to better train its artificial intelligence to detect extremist, racist, or otherwise controversial videos.

This news comes in the wake of the 2016 presidential election, during which fake news dominated Facebook and other social media. In response to the false news reports, many companies, including Google, stepped up efforts to address the spread of misinformation. Ultimately, businesses that allow user-submitted content are in a precarious position where moderation has to meet monetization.