A case against net neutrality

December 4th, 2017

Ajit Pai's Business Data Services dissent, 2016

The Internet is debating net neutrality, and few issues are as contentious as the 2015 policy regulating the transfer of the very bytes it carries. Echoing Commissioner Pai's sentiment in his 2016 Business Data Services dissent, I can only imagine things are now twice as hard for poor Alice. Finding her way back to Wonderland from the net neutrality world is no small feat, because practically nothing in it makes any sense.

Once one’s brain starts looping around this airport called net neutrality, there’s a risk it’ll notice apocalyptic terminals everywhere. For the less eschatological among us, the gist of the debate is as follows: one side of the net neutrality argument prioritizes consumer access to, and investment in, the backbone infrastructure, while the other prioritizes only the consumer (and, in some cases, partially the startup) 1 market segment, effectively rendering competition, investment, infrastructure, and innovation wholly inconsequential. For the E&C nerds among the distinguished readership, the Federal Communications Commission’s 2016 Business Data Services guidance is a relevant proxy debate.

Contrary to popular belief, content providers favor net neutrality not because they’re idealistic or actively discriminated against, as they claim (or would be). Rather, companies like Netflix, Google, Facebook, Amazon, and Reddit beat the drum of net neutrality policies because such policies remove their incentive to reduce their digital footprint: they keep paying the same for ever-increasing bandwidth usage. Their services require sizable infrastructure investment, of which they don’t share the financial burden at all.

Assume, for instance, that Netflix started with a plain and honest 10 Mbps connection. After a while the service exploded in popularity, resulting in much bigger bandwidth demands. So off went Netflix to Comcast, asking if they’d be open to peering and letting Netflix use part of their backbone. Usage-based peering agreements between tiered ISPs, as well as between ISPs and content providers, are a very common industry tool. 2 However, there are two caveats to this story: Netflix didn’t want to pay more for their increased consumption, while Comcast demanded that they do. In that regard, Netflix’s statement that Comcast demanded they pay more for higher speeds is not technically incorrect, but it is a concealed half-truth. For it leaves out the part where Comcast actually treats every content provider equally and foots the majority of the bill for them, while Netflix gets special treatment for free from the other ISPs.

Regardless of the topicality of Netflix’s case, when honestly evaluating a regulatory proposal it is helpful to ask the following three questions to determine whether the proposal is appropriate, useful, and good.

  1. Is there a problem at all?
  2. Can this problem be solved by regulation?
  3. Is regulation the best or the only solution for this problem?

Not all regulation achieves its goals, and not all of it is effective or productive. Scrutiny of a proposal should therefore stem from the questions above.

By all accounts, better infrastructure yields better, faster, and cheaper service across all market segments (both in terms of purchasing power and geographic coverage). A handy example is the telecom market, where behemoths like AT&T and Verizon are now openly challenged by T-Mobile and others across the nation. Backbone infrastructure in the US is lagging because of disincentivized competition between ISPs as well as federal programs whose net effect runs against their initial (and noble) goals. The Lifeline example referenced by the WSJ in its Pai profile published last May is pertinent: the United States Government essentially pays for and subsidizes basic Internet access rather than incentivizing ISPs to expand, build, improve, and develop their networks. The product the USG purchases is hardly ever improved, thereby stagnating a market that needs (and deserves) Internet access as good as the rest of us enjoy.

Another key and oft-neglected issue is rural access. A multitude of rural communities still rely on dial-up to connect. They suffer from minimal infrastructure investment and lagging network expansion. As a result, folks are hurt by higher prices, slower speeds, limited access, and an overall worse experience.

Consider the following: suppose a regional ISP has 100 Gbps of bandwidth and faces 200 Gbps of demand. They can:

  1. Prioritize some packets over others (such as live streaming) and try to mitigate congestion;
  2. Slow down everything (because bandwidth is finite), as compelled by the 2015 FCC rules, and make the streams unwatchable;
  3. Invest in bigger, more expensive infrastructure and try to split the cost between existing funds and prices passed down to consumers.

Due to insufficient competition, weak incentives to expand, and excessive compliance costs, option #3 is not feasible. Because of the rules (option #2), neither is option #1. Thus, by heavily regulating the Internet, the Commission has effortlessly created a worse, i.e., slower, product with an eventually higher cost to the consumer, since it prevents competition between ISPs and stops smaller companies from being able to innovate. The FCC’s new Title II interpretation essentially makes it difficult for ISPs to charge bandwidth hogs such as content providers more for using more of their bandwidth. 3
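The congestion scenario above can be sketched numerically. This is a toy model: the flow names, their demands, and the latency-sensitivity flags are hypothetical illustrations, not real traffic data; only the 100 Gbps capacity and 200 Gbps aggregate demand come from the text.

```python
CAPACITY = 100  # Gbps of available bandwidth

# (name, demand in Gbps, latency-sensitive?) -- hypothetical mix summing to 200 Gbps
flows = [
    ("live streams", 60, True),
    ("video on demand", 90, False),
    ("bulk downloads", 50, False),
]

def uniform_throttle(flows, capacity):
    """Option 2: slow everything down by the same factor."""
    demand = sum(d for _, d, _ in flows)
    scale = min(1.0, capacity / demand)
    return {name: d * scale for name, d, _ in flows}

def prioritize(flows, capacity):
    """Option 1: serve latency-sensitive traffic first;
    everything else shares whatever capacity remains."""
    alloc = {}
    remaining = capacity
    for name, d, sensitive in flows:
        if sensitive:
            alloc[name] = min(d, remaining)
            remaining -= alloc[name]
    rest = [(name, d) for name, d, sensitive in flows if not sensitive]
    rest_demand = sum(d for _, d in rest)
    scale = remaining / rest_demand if rest_demand else 0.0
    for name, d in rest:
        alloc[name] = d * scale
    return alloc

print(uniform_throttle(flows, CAPACITY))  # every flow cut to 50% of its demand
print(prioritize(flows, CAPACITY))        # live streams run at full rate
```

Under uniform throttling, live streams get 30 of their 60 Gbps and become unwatchable; under prioritization they run at their full 60 Gbps while best-effort traffic absorbs the congestion. Both allocations use exactly the same 100 Gbps of capacity, which is the point of the argument: the rules dictate how scarcity is distributed, not whether it exists.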

By the same token, I’d be remiss not to mention another key, controversial, and yet less publicized aspect of the issue at hand: relying on regulatory guidance at the expense of Congress versus enacting new or amending existing law. The Communications Act, of which Title II is a provision, was enacted in 1934, and its last major overhaul was completed in 1996. The problem with the 2015 FCC decision is twofold: first, it relied solely on arbitrary regulatory interpretation, and second, it interpreted an old statute, at least from a technological perspective: the 1996 overhaul. That isn’t to say the 1996 overhaul was bad; 4 rather, it is a huge leap of logic to suddenly do a policy 180 and drastically change the interpretation of a law ex post facto, without amending said law, after that many years. For instance, virtual MVPDs hardly existed in 1996. There should be a proper relationship between statute and regulatory guidance and interpretation, and in the case of the 2015 rules there isn’t. In other words: if one truly cared about net neutrality, one would amend the law in a consensus-building, bipartisan manner, and wouldn’t relinquish that policy goal by issuing guidance.

If one were to further engage with the FCC’s reasoning for issuing the 2015 rules, they’d find that anti-competitive problems of the sort the Commission highlights are not abundant. Moreover, regardless of the scale of the problem the Commission cites, antitrust and anticompetitive issues, the FCC’s pivotal concerns, can be better addressed by antitrust enforcers such as the Federal Trade Commission without harming currently lawful business models, practices, and products. The FTC has long had the authority to pursue wrongdoers and take targeted action against them.

Ultimately, the 2015 FCC rules make it harder for smaller (usually local and regional) ISPs to build new or update existing infrastructure, and for new entrants to even try to enter the regional or national market, due to additional and unwarranted compliance costs which bigger ISPs can of course afford. Consequently, by raising barriers to entry for new ISPs, the FCC not only reduces competition and hurts the consumer (increased prices, a worse product) but at the same time strengthens incumbent monopolies, which stand to benefit the most from net neutrality policies as they can better retain their market share. Thus, the notion that the Internet should generally remain a level playing field for all stakeholders and businesses, something I think (and hope) the most reasonable among us would agree on, is rendered false.

The following excerpt from "Net neutrality and consumer welfare" shows how the 1996 Title II framework was performing a few years after its enactment: quite well, actually.

"Between mid-2002 and mid-2008, the number of high-speed broadband access lines in the United States grew from 16M to nearly 133M, and the number of residential broadband lines grew from 14M to nearly 80M. Internet traffic roughly tripled between 2007 and 2009. At the same time, prices for broadband Internet access services have fallen sharply."

In 2002, $40 bought a household 768 Kbps; in 2010, the same $40 bought 10 Mbps. Clearly, the 1996 Title II framework was a step in the right direction. That said, "Net neutrality and consumer welfare" is a remarkably interesting paper and definitely worth a read or two.
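The price comparison works out as a quick back-of-the-envelope calculation; both figures come from the text above.

```python
# $40/month bought 768 Kbps in 2002 and 10 Mbps in 2010.
price = 40.0  # dollars per month in both years

per_mbps_2002 = price / 0.768  # 768 Kbps = 0.768 Mbps
per_mbps_2010 = price / 10.0

improvement = per_mbps_2002 / per_mbps_2010
print(f"2002: ${per_mbps_2002:.2f}/Mbps, 2010: ${per_mbps_2010:.2f}/Mbps")
print(f"Roughly a {improvement:.0f}x drop in price per megabit")
```

In other words, the nominal price per megabit fell from about $52 to $4, roughly a thirteenfold improvement, before even accounting for inflation.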

When all is said and done and all facts are properly considered, the curious reader should ask: what remedies are available against the FCC’s 2015 reading of Title II? What is a prudent defense against the Commission’s regulatory overreach?

The short and high-level answer, much like the erroneous FCC interpretation, is twofold. Generally, it should be acknowledged that unjust and preemptive government intervention in the ISP market is anti-competitive: it hinders innovation, assists incumbent monopolies, distributes costs unfairly among stakeholders, diminishes the quality of the product consumers (and federal agencies) purchase, and reduces Internet access and connectivity. To that extent, in order to guarantee market fairness, consumer welfare, and a free and liberal Internet, the government should stay as neutral toward, and as far away from, the Internet as possible.

First, Congress should ideally amend the current Title II statute where applicable and better codify business practices and the regulatory environment. Lawmakers should update what needs updating. However, Congress attempted such an amendment in 2014, without success. The House discussed a bill that would create a new section, Title X, preserving elements of net neutrality principles in exchange for restricting the FCC from regulating broadband providers and the Internet via Title II. The bill didn’t amass enough bipartisan support and fell off the radar, significantly diminishing the chances of another attempt in the future.

Second, the public, pro-net neutrality advocates, and other stakeholders should acknowledge and consider existing market-based alternatives at the local level. Throughout Europe and across parts of the US, folks enjoy high-speed, unrestricted, low-cost Internet access via municipally or cooperatively owned carrier-neutral fiber. Under the last-mile principle, residents fund the fiber and contract an ISP to install and run it. Physical space in central offices (where home and business subscriber lines are connected to the backbone) has to be shared among providers, a remnant of the 1934 Communications Act. Thus, if the fiber can be funded, it can be opened up to any and all ISPs, removing on the one hand the impetus for unmerited and immoderate government regulation and intervention, and on the other creating market competition incentivized to deliver what consumers truly want: fast, reliable, cheap Internet access.

In lieu of a conclusion, I want to expand on two more points, because net neutrality goes beyond a bland policy debate about infrastructure and market competition. To wit: innovation, and a potential, yet explicit, government control of the Internet. A simple example of how net neutrality stifles innovation is Facebook’s Free Basics initiative, aimed at poor rural parts of India. Facebook wanted to offer basic Internet access for free, which, despite including useful websites such as news, weather information, Wikipedia, and Facebook, was blocked over net neutrality concerns. Yet, considering the alternative of no Internet access whatsoever, even basic access would have been preferable and perhaps a lifesaver.

With regard to the latter point, touching on the government’s prerogative: isn’t it somewhat ironic that net neutrality proponents, who care deeply about one of the most disruptive inventions since time immemorial, one that allows for the free and liberal exchange of information without government interference (and even finds ways to circumvent it where it occurs, such as in totalitarian regimes), openly invite more government control of this very invention? Folks fulminate online about not wanting the government to take our Internet from us, when the entire point of net neutrality policies is a government-regulated Internet.

Inevitably, when a legal framework exists for a regulatory body to compel an ISP, or any private or public entity for that matter, to provide equal access to all legitimate content, 5 it is not too much of a stretch to entertain, if only briefly, the thought that a natural extension could be a future FCC compelling an ISP to block all illegitimate content. Some might recall that during the ’80s and ’90s the FCC censored TV and radio, as they were classified as public utilities. Accordingly, if the FCC truly considers the Internet a public utility, it will be granted the same powers it holds over broadcast television.

Who among us would honestly want an Internet like that?


  1. Generally between traditional and virtual MVPDs, that is, big established incumbents (infrastructure owners) against up-and-coming challengers (content providers). Usually, challengers initially argue for net neutrality until they reach a maturity point, when they magically pivot.

  2. Peering and caching save both parties a lot of bandwidth. By offloading caches of content onto big regional data centers and exchanges, traffic doesn’t have to traverse the entire backbone. See, for example, MIT’s NOX.

  3. See Netflix et al. above.

  4. It wasn’t; even Pai himself agrees Clinton and the then FCC did a good job. After all, the ecosystem did fine for many years after the amendment.

  5. Who and how defines legitimate content? What about the First Amendment?