The Promise and Perils of Interoperability

Majorities of both Republicans and Democrats now support increased regulation of the tech industry. Responding to this pressure, policymakers on both sides of the aisle have become increasingly critical of big tech, and are looking to advance new legislation. However, because their underlying grievances are different—with the right focusing on anti-conservative bias, and the left focusing on harmful content and economic power—there hasn’t been a clear path to bipartisan reform. 

An emerging area of common ground is the idea of promoting interoperability and data portability, which was a major theme of last week’s House Judiciary Committee hearing. As Vice Chair Joe Neguse (D-CO) put it: 

The witnesses that are gathered before our committee all agreeing that we need to do something about interoperability to me suggests that this is an area ripe for the committee to immediately move on in a bipartisan fashion.

What exactly is interoperability? IEEE defines it as: “the ability of two or more systems or components to exchange information and to use the information that has been exchanged.” In other words, interoperability means different apps and services can interconnect and work together, even if they were developed by different companies.

Proponents of interoperability argue it would be a boon for competition: lowering switching costs between platforms, reducing the barriers to entry created by network effects, and empowering users with greater choice. Yet critics point to the dangers of technological lock-in, privacy risks, implementation challenges, and other potential downsides. Industry groups have suggested consumers may not even want it anyway.

But while the idea of promoting interoperability has drawn support from both Republicans and Democrats, there's still confusion as to how it would work and what trade-offs might come with it. This is to be expected, as interoperability comes in many forms and raises complex legal, technical, and economic issues.

In a broad sense, interoperability is something we routinely take for granted in our daily lives. This is as basic as the language we use to write a letter, the symbols on highway signs that prevent collisions, or the standardized threads that make it easy to screw in a lightbulb. It’s also what makes it possible for you to call someone on the AT&T network from a Verizon phone, or to move your number if you decide to switch (something that wasn’t always the case). 

In the digital world, we still enjoy a high degree of interoperability. But this also wasn’t always the case. Early computers, including those from the same manufacturer, often didn’t support the same software or have the ability to talk to one another. The emergence of the Internet spurred an increased demand for interconnection and standardization, leading to an explosion of open protocols and standards like TCP/IP, HTTP, SMTP, and others. 
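The practical payoff of those open standards is that a client and a server need share nothing but the protocol itself. A minimal sketch of that idea in Python (using only the standard library, with a toy server standing in for any vendor's implementation):

```python
# Interoperability in miniature: a client and a server that share no code,
# only the HTTP standard. Either side could be replaced by an implementation
# in another language, from another company, and this would still work.
import http.client
import http.server
import threading

class HelloHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to an ephemeral port and serve in the background.
server = http.server.HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any HTTP-speaking client can now talk to it.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
body = resp.read().decode()
print(resp.status, body)  # 200 hello
server.shutdown()
```

The point isn't the code itself but the coordination it reflects: because both sides agreed on a published standard, neither needed the other's permission or source code to interoperate.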

This decentralized approach wasn’t an accident. It was fueled by fears of computers becoming Orwellian tools of bureaucratic control, and the utopian thinking of the 1960s counterculture. As Stewart Brand observed in a 1995 article for Time Magazine:

The counterculture’s scorn for centralized authority provided the philosophical foundations of not only the leaderless Internet but also the entire personal computer revolution.

Today the pendulum seems to be rapidly swinging away from the philosophy of the Internet pioneers. We’re witnessing the rise of a digital world increasingly dominated by walled gardens and closed platforms. 

As these centralized platforms have grown in scale, so have the challenges in managing them. Platforms are expected to moderate users generating billions of posts per day, and operate in dozens of countries with very different legal systems and cultural norms. Platforms are also increasingly held responsible for harmful content and behavior from their users, including disinformation, violent extremism, drug abuse, and harassment. This has led to complaints from all sides, putting pressure on companies to take down more speech under the threat of regulation or even structural separation. 

Thus, the architectural shift to centralized platforms has created attractive choke points for states to exert control, as well as pressure points for narrowing the scope of acceptable speech, behavior, and economic activity. In the US, this manifests as services deplatforming Parler or President Trump. In countries like Russia, China, or Thailand, you can get deplatformed (or imprisoned) for criticizing the government.

Instead of one-size-fits-all policies from central authorities, what if we moved the digital world back towards being more open and decentralized? That's the vision of the future motivating many champions of interoperability.

First, it’s important to recognize that integrated and closed systems aren’t without their benefits compared with decentralized or open systems. For instance, Apple’s success can be at least partly attributed to its end-to-end control of hardware and software (and its benefits for reliability, security, and privacy). Similarly, economies of scale make possible consumer benefits like Amazon’s one-day shipping, and can aid in detecting and combating cyber attacks or other malicious activity. Taking it to the extreme, a requirement for perfect compatibility might eliminate differentiation between products, and bring innovation to a halt.

In their 2012 book, Interop: The Promise and Perils of Highly Interconnected Systems, Urs Gasser and John Palfrey provide a helpful framework for navigating this challenge, defining two schools of thought at different ends of the spectrum:

At one extreme, people argue that the state should have no involvement whatsoever in achieving interop. This theory is part and parcel of the dominant strain of cyberlibertarianism, which views any state involvement as anathema to innovation and to the positive development of information and technology systems. At the other end of the spectrum lies the notion that the state ought to drive the development of important complex systems by mandating certain approaches to interop. In this interventionist vision, the leadership of the state is necessary to accomplish high levels of interop; without it, the results will be too uneven and inconsistent to serve the public well.

The authors advocate for a mix of these approaches, observing that state intervention is necessary in some cases (e.g. emergency alerts), and undesirable in others. Heavy-handed government mandates bring risks such as technological lock-in, introducing cyber vulnerabilities, weakening privacy, or even favoring big incumbents. On the other hand, many Internet protocols trace back to ARPANET and the DARPA community of researchers, where the government played a key role in addressing coordination problems and solving technical challenges. Similarly, agencies like the National Institute of Standards and Technology have long had a role in working with industry to establish standards and protocols. It’s also important to be cognizant of how existing legal frameworks—like Section 230, the Computer Fraud and Abuse Act, and the Digital Millennium Copyright Act—shape and sometimes distort the business, speech, and competition landscapes.

In short, given that interoperability entails both costs and benefits, the key challenge is identifying the optimal level of interconnectedness in a particular circumstance, and how it might be accomplished or incentivized. 

Interoperability can be promoted in a number of different ways, across different layers. This can entail interoperability as an antitrust remedy (e.g. AOL Instant Messenger), as a regulatory requirement in a certain sector (e.g. phone number portability), or regulatory changes that make permissionless compatibility easier (see: adversarial interoperability). 

Interoperability can also be voluntary and industry-led. The Data Transfer Project—a collaboration between Apple, Google, Facebook, Microsoft, and Twitter—is working to build an open-source framework for users to move their data between different services. Twitter's Bluesky project is also explicitly exploring ways to make the platform decentralized. Additionally, there are numerous standards and protocols developed by specialized industry groups and professional associations like IEEE, IETF, and others.
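To make data portability concrete, consider what a portable export might look like. The JSON schema below is invented purely for illustration (real efforts like the Data Transfer Project define richer, standardized data models), but it captures the core idea: once services agree on a common format, data written by one can be read by any other.

```python
import json

# A toy "portable profile" export. This schema is hypothetical, chosen
# only to illustrate a common interchange format for user data.
export = {
    "version": 1,
    "profile": {"handle": "alice", "display_name": "Alice"},
    "posts": [
        {"created": "2020-06-01T12:00:00Z", "text": "Hello, world"},
    ],
    "connections": ["bob", "carol"],
}

# Service A serializes the user's data on the way out...
blob = json.dumps(export)

# ...and any service that understands the same schema can import it,
# regardless of who built it or what stack it runs on.
imported = json.loads(blob)
print(imported["profile"]["handle"], len(imported["posts"]))  # alice 1
```

The hard part in practice is not the serialization but agreeing on the schema: which fields are mandatory, how identity and connections map across services, and how privacy-sensitive data is handled in transit.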

Private sector solutions are often preferable, but not possible in every case. Sometimes, government action will be necessary to address coordination problems, correct misaligned incentives, or remove existing regulatory barriers.

As Congress looks towards policy approaches, it’s important they don’t settle for “nerd harder” as the answer to thorny questions. Policymakers and their staff, who are overwhelmingly non-technical, must resist the temptation to assume technical people can just figure out the messy details, or that there’s no risk of unintended consequences. 

As a first step, they should look to low-hanging fruit. Even basic forms of interoperability and data portability could have a big impact on the landscape, promoting competition without seriously undermining innovation (particularly compared with other regulatory approaches). This might include making it easy to bring your posts and connections with you to a rival platform, opening APIs for messaging between platforms, and facilitating adversarial interoperability. While challenges exist, they're not insurmountable.

Beyond this, more aggressive forms of interoperability regulation should require increasingly substantial levels of due diligence, engaging with technical experts at entities like the National Academies of Sciences, Engineering, and Medicine; the National Institute of Standards and Technology; and the Government Accountability Office’s Science, Technology Assessment, and Analytics Team; with careful consideration of the tradeoffs.

Despite its challenges, we're optimistic about interoperability's potential to advance the current debate over digital free expression and competition, and to return the Internet to its roots of empowering rather than undermining individual freedom. We hope you'll follow our continuing work on this issue, and invite you to reach out if you're interested in discussing these issues more.
