Ads for major corporations that are household names are routinely shown on YouTube videos identified as “extremist content,” according to a CNN investigation.
The TV network said it counted at least 300 businesses and organizations whose ads have run on videos about white nationalism, pedophilia, and even North Korean propaganda.
When an advertiser places a buy on YouTube, its message is inserted into the site’s content, sometimes playing before the start of a video. YouTube, which is owned by Google, often tries to match the advertiser’s message with the tone of the video, but sometimes the ads are placed at random.
The CNN report found that five U.S. government agencies had paid for ads that were paired with these out-of-the-mainstream videos, meaning U.S. tax dollars went to the videos’ producers.
In addition, the analysis found ads for Hershey, Facebook, Nordstrom, Amazon, Hilton, Netflix, Adidas, and Under Armour on these videos. When contacted by CNN, the companies said they were not aware they were sponsoring extremist content.
Under Armour pauses ads
A spokesman for Under Armour told CNN the company is suspending its advertising on the video platform until it can investigate how and where its ads are being placed.
YouTube gives advertisers tools to target their messages by demographics and user behavior. Advertisers can also block specific topics and apply a filter that keeps their ads away from videos on sensitive subjects.
A spokeswoman for YouTube told CNN that the company has worked with advertisers to implement better controls, stricter policies, and greater transparency when it comes to ad placement.
“When we find that ads mistakenly ran against content that doesn’t comply with our policies, we immediately remove those ads,” she said.
A year ago, YouTube changed the way video producers can monetize their videos on the platform. A channel now needs at least 10,000 lifetime views before its creator has the option to run ads on its videos.
YouTube made the change not to weed out extremist content but to crack down on copycat creators, those who copy videos from other sources and repost them on their own channels.