TSC IntelBrief: Monetizing and Monitoring Social Media
November 2, 2017

Bottom Line Up Front

• On November 1, attorneys from Facebook, Google, and Twitter testified before the Senate Intelligence Committee on Russian online information operations designed to influence U.S. public opinion.

• Social media firms are simultaneously telling ad buyers that their platforms are the best place to influence consumers while insisting that political ads have little influence on voters.

• The past several years have seen both the so-called Islamic State and Moscow turn social media platforms into ministries of propaganda.

• To ensure brand safety, ad buyers should exert commercial pressure on social media firms to keep their advertisements from being associated with violent extremism and political controversy.


Since the start of official investigations into the breadth and effectiveness of Moscow’s massive disinformation campaign to influence the 2016 U.S. presidential election, Google and Facebook have been in the awkward position of telling firms that commercial ads on their platforms are a great way to influence consumers — while insisting that political ads on those same platforms have little influence on voters or social issues. Only in September did Facebook CEO Mark Zuckerberg acknowledge that claims his company was being used by Moscow to exacerbate U.S. social divisions weren’t ‘crazy.’ As the scale and reach of Russia’s manipulation of ad content and placement on YouTube and Google become better known, some in Washington, as well as some of the firms whose ads have made the platforms commercially viable, have begun demanding changes.

On November 1, at a hearing of the Senate Intelligence Committee, Senator Dianne Feinstein warned the legal counsels of Google, Facebook, and Twitter that regulatory change was coming to the essentially unregulated social media environment. According to Senator Feinstein, the firms — particularly Google and Facebook — could no longer ignore the reality that they were media entities and, as such, bore some level of corporate responsibility to disclose and regulate political advertising on their platforms. Senator Feinstein told the attorneys: ‘You bear this responsibility. You’ve created these platforms. And now they are being misused. And you have to be the ones to do something about it. Or we will.’

Google — which owns YouTube — and Facebook are by far the two dominant powers in social media advertising. The scope of their reach is enormous, as are their ad-driven profits. On the same day as the hearing, Facebook posted quarterly revenue of more than $10 billion, its best performance to date, nearly all of it driven by ad purchases. To satisfy advertisers’ needs, Facebook can offer 2.07 billion monthly and 1.37 billion daily users. Confronted with Russia’s success in planting divisive ads on its platform, Facebook has announced plans for thousands of new hires dedicated to cracking down on violent extremist content and manipulation by foreign governments. While Zuckerberg has claimed those efforts will cut the firm’s profits, he has also admitted — following demands that the company make those investments — that Facebook has both a social and a corporate responsibility to increase security for advertisers and users.

Meanwhile, YouTube faces growing pressure from advertisers, both to show them exactly what they are getting in return when they buy ads on the platform and to make sure those ads don’t run alongside violent extremist videos and other controversial content. Companies like Verizon and Procter & Gamble, which have spent billions of dollars on social media advertising, have pulled their ads from the platform until YouTube can assure the firms a measure of ‘brand safety.’ YouTube’s revenue loss is in the billions, while Google struggles to apply human value judgments to a business model that prides itself on removing the human element from its core deliverables.

A combination of public regulation and private commercial pressure may be what forces these enormous hybrid media entities to change. Until recently, many firms were hesitant to publicly challenge Google and Facebook, given the two companies’ immense clout. The last several years have seen the Islamic State turn these platforms into ministries of extremist propaganda, while Moscow employs them as agents of access and influence with unprecedented scope and effectiveness. Companies have started demanding that Google and Facebook ensure their ads aren’t placed on sites that feature anti-Semitic content or conspiratorial ‘fake news’ — something both firms have struggled to do, even while touting the precision of their algorithmic targeting and placement. With the U.S. government showing more willingness to step in and regulate online platforms, the prospects for change are increasing.

Moving forward, however, will require more than government action. Other forms of mass media follow rules and regulations, some legal, others ethical. Brand safety is important, and the push for it will have to come from ad buyers and agencies, not the government. New rules and regulations need to be developed and enforced internally, within the industry. Transparency is essential to slowing the spread and effectiveness of efforts to divide Americans: users need to know if the ad that inflamed their opinion on a key issue came from a Russian source hoping to manipulate the public. Social media platforms that monetize ads must also monitor those ads if they are to be responsible to the public they claim to serve.

For tailored research and analysis, please contact: info@thesoufancenter.org