It's Not Too Late for Social Media to Regulate Itself


Silicon Valley’s search and social media giants determine who sees what information, and how. Never before has such a small number of companies had the power to connect billions of people instantly—and with it, the ability to shape and alter the information ecosystems of entire societies.

WIRED OPINION


Rob Reich is a professor of political science and the co-director of the Center on Philanthropy and Civil Society at Stanford University. David Siegel is the co-chairman of Two Sigma, a financial services company that uses technology and data science to optimize economic outcomes in investment management, insurance, and related fields.

At the 2019 World Economic Forum in Davos last month, numerous world leaders publicly called for greater international regulation of how data is collected and used. The tech industry has struggled to respond to this debate with a coordinated, constructive plan of action. If it doesn’t do so soon, the result may be overly blunt, rigid, and potentially counterproductive regulation. It’s not too late for the tech industry to help formulate rules that make sense for everyone, but time is running short.

The crux of the problem is the opaque process that determines how algorithms curate information for billions of users. Every time someone uses search or social media services, they’re relying on a secret and proprietary algorithm tuned to maximize something—usually user engagement with the service. Transparency and accountability are largely absent.

Experimentation and risk-taking are cherished hallmarks of Silicon Valley, but the norms around algorithmic governance have become a free-for-all. History teaches us that unregulated marketplaces can produce a race to the bottom, in which harms are externalized, costs are socialized, and financial gains are privatized. The financial crises of the 20th and 21st centuries demonstrated that unregulated markets cannot safeguard all of society’s interests. Now Silicon Valley’s search and social media giants, long resistant to oversight, face growing scrutiny. Too often, company-level efforts amount to a “trust us, the engineers are working on it” approach. These tactics have fallen short.

To protect the public interest and their own businesses, these companies should set up a robust self-regulatory organization along the lines of the Financial Industry Regulatory Authority (FINRA), an SRO that derives its authority from the Securities and Exchange Commission. Thanks to its independence from bureaucratic government agencies, FINRA is effective—and relatively nimble—at policing securities firms with sensible rules.

Given the extraordinarily rapid pace of technological change, it is unrealistic to expect governments to devise, update, and enforce effective rules by themselves. Such an approach can hinder innovation and produce marketplace advantages for the largest companies. And in the tech world, everything from consumer behavior to hardware and software capabilities evolves too quickly for static statutes to remain meaningful for long.

Twenty years ago, regulators faced similar challenges in the financial industry. Rules were often arbitrarily enforced and created an uneven playing field between larger incumbents and smaller players. Ultimately, through a partnership between industry and government, FINRA formed as a more agile and effective way to help protect the public interest. Industry’s involvement helped ensure that in-house technical expertise accompanied strong rule-writing and enforcement powers, reducing regulators’ reliance on blunt and infrequently updated laws.

The key advantage of strong self-regulatory organizations like FINRA is their ability to bridge the gap between appropriately slow-moving governments and complex, fast-changing industries. Since FINRA is technically not a government body, it is better able to provide close, active oversight while keeping pace with constant shifts in the financial industry. At the same time, the government sanction FINRA enjoys is essential to avoid the appearance of creating a cartel, a concern that plagued its precursor, the NASD.

FINRA’s mission is “to provide investor protection and promote market integrity” in order to maintain investors’ trust in financial markets. What would the objective of a similar self-regulatory organization for the tech sector be? To protect citizens by promoting the integrity and user-controlled privacy of information on search and social media platforms.

Promoting public trust in the integrity of information on search and social platforms is more crucial than ever. Search personalization and similar algorithms work well—they keep users engaged by delivering personally relevant content—but have a dark side: the way search results are presented and the order in which social media posts appear in a feed can manipulate public opinion and behavior. In effect, whether they mean to or not, these companies are inching toward the creation of a custom echo chamber for everyone on the internet—but there’s no governance or transparency around the process.

FINRA provides a valuable blueprint for how a self-regulatory organization for the search and social media industry might work. The organization would be funded on a sliding scale by industry members (to ensure both large and small companies’ interests are represented fairly), independent of government but ultimately accountable to the Federal Trade Commission or another agency, and staffed by highly technically competent individuals paid at industry rates. The organization would be tasked with writing and enforcing rules to protect the basic integrity of the online public sphere.

For example, it could help ensure that companies’ use of proprietary algorithms supports society’s fundamental interest in a high-quality information ecosystem, just as FINRA examines trading data to detect fraud. Crucially, it would do this without compromising companies’ valuable intellectual property or removing incentives to innovate. It could create clear rules about an independent appeal process when companies ban or delete information, and it could set forth requirements on algorithmic accountability.

It’s true that some companies have instituted their own policies on these issues—Facebook recently announced an effort to create an independent appeals process for its content moderation policies. But no framework applies to the industry as a whole. Self-regulatory organizations' ability to balance the public interest with commercial imperatives should make a broad framework attractive to all stakeholders involved.

The passage of the EU’s General Data Protection Regulation and California’s Consumer Privacy Act signals that it's time to change the way we approach governance of the online public sphere. We must seek solutions that avoid the pitfalls of clumsy legislation and signal the maturation of the tech industry as it comes to grips with its power.

If search and social media companies can’t figure out how to supervise themselves constructively, lawmakers are bound to step into the void more aggressively. Time is running out for industry leaders to take the initiative and build an effective oversight model themselves.

WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here. Submit an op-ed at [email protected]
