How and Why Social Media Should be Regulated

by Jim Ramo | December 14, 2020

Facebook is back in the sights of Washington’s regulators. Last week, the FTC announced a major lawsuit against the social network for a ‘years-long course of anticompetitive conduct,’ seeking to force a divestiture of WhatsApp and Instagram.

With the US election over, we might have expected the dust to settle on the impact of social networks on our politics. But the arrival of a new administration and new Congress presents an opportunity to address the regulation of companies such as Twitter, Facebook, and Google. Not only has President Trump fanned the flames by proposing to “eliminate the immunity of internet platforms” created by Section 230 of the Communications Decency Act of 1996, but Democrats, who focus on the distribution of misinformation, and Republicans, who assert “left-wing” bias, are as vocal as ever. And with books such as “Zucked” by Roger McNamee and the Netflix documentary “The Social Dilemma”, the general public has become more aware of the issues and debate surrounding the politics of social media.

While it is tempting to bring out the legal sledgehammer to try to beat down the power of these social media platform providers, to do so would ignore the wide-ranging benefits they provide to a global community. Heavy-handed approaches could also curb freedom of expression and even precipitate a losing battle against modern communications technology that is loved by users for its sharing, socializing and entertainment features.

It is time to recognize that the issues involved in managing these platforms are nuanced.  To use one tool to “fix” all the problems or abuses of social media, such as eliminating Section 230, would satisfy no one and would surely cause unintended consequences that produce even worse outcomes than the status quo.

There are three issues that must be considered to properly address the challenges posed by social media platforms: (i) content and its curation; (ii) a business model which amplifies distribution, risking privacy and addictive engagement; and (iii) the market power of the giants, such as Facebook, Twitter and Google, and its potential for abuse. 

In the US, published content is protected by the 1st Amendment, which makes it difficult to regulate. Moreover, because social media platforms claim that they are not publishers themselves, but merely facilitators, they are protected by Section 230 from being responsible for the content on their services. Congress may still regulate activity, but the appropriate target must be the publisher, not the platform. Still, intentionally promulgating misinformation should be as illegal as slander or libel – and existing slander and libel laws can be enhanced to encompass misinformation, a phenomenon that has exploded onto the scene and has been facilitated by social media.

Put succinctly, while the 1st Amendment protects discourse and opinion, it does not protect those who would “shout fire in a crowded theater”.  Congress should provide guidelines to the platforms for immediate take-down and/or labeling of potentially illegal misinformation content. Moreover, recognizing the ferocious speed with which information travels over the internet, an expedited method of adjudication (perhaps via an Administrative Court), with an objector stating the case and a publisher having the burden to prove its legitimacy, could expedite the removal of unlawful and harmful information.  Such content, if not removed, would expose the publisher to potential legal liability or punishment. 

The business model of the platforms is itself a recipe for abuse. It is based on advertising, which, together with artificial intelligence and tracking technology, targets and follows users to provide enhanced visibility – and hence a steroidal boost to advertising revenues. It has become, as Shoshana Zuboff termed it, surveillance capitalism.

Original content can be amplified significantly by ‘likes’ or ‘re-tweets’. The commercial aim is to increase advertising revenues for the platform provider, but it comes with potential costs, including invasion of privacy and distribution beyond the knowledge or intent of the publisher. In such cases, “republishing” should subject the platform to the same regulation as that of a publisher (without Section 230 protection), given that the social media company has broadened the audience beyond what was originally intended. If social media companies wish to avoid such regulatory oversight, they should not amplify.

Lastly, are these companies too big? Do they need to be broken up? 

It is vital to note that the scale of these companies creates consumer benefits. Facebook, with billions of users, allows someone to easily and inexpensively publish across borders, generations, social strata, etc. Scale is the essence of their value to users and advertisers. Breaking up these companies is not the answer, even if it were feasible.

Rather, social media giants must ensure that they comply with antitrust law, which prevents their abuse of scale. Compliance monitoring and oversight is a job for the FTC, the Justice Department, or state attorneys general, not for Congress. All companies, including social media platforms, should be prevented from restraining competition or using monopoly or monopsony power to control advertising pricing. For example, in the EU, Google cannot favor its own products over a competitor’s offering in online search.

Proper regulation of social networks requires coordinated approaches.  Regulators, elected officials, and the general public must understand social network business models in order to judge the benefits and costs that they generate.  Platforms have become successful because they provide services that the public wants. That should not be forgotten in a hasty and misguided attempt to eliminate the abuses that also exist.

If regulation is done piecemeal and does not address all three issues – content, distribution, and abuse of power – it will fail. Even worse, it could spawn unintended consequences worse than the challenges we now confront.

About the Author

Jim Ramo spent his entire career in the media business. He was the CEO of Movielink, a joint venture of five of the major movie studios that launched the delivery of movies over the internet. Prior to Movielink, he was part of the founding team of Directv as Executive Vice President in charge of programming, sales and marketing, and customer service.
