Internet openness pits collaborative history against competitive future

The debate about how open the internet should be to free expression – and how much companies should be able to restrict, or charge for, communication speeds – boils down to a conflict between the internet’s collaborative beginnings and its present commercialized form.

The internet originated in the late 1960s in the U.S. Department of Defense’s ARPANET project, whose goal was to enable government researchers around the country to communicate and coordinate with each other. When the general public was allowed online in the early 1990s, intellectuals saw an opportunity to include all of humanity in the collaborative online community that had developed. As internet rights pioneer John Perry Barlow wrote, “We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth. We are creating a world where anyone, anywhere may express his or her beliefs.”

Even today, many of the people who contribute to the technical evolution of the network continue to view the internet as a place to share human knowledge for self-improvement and the betterment of society. As a result, many are troubled when internet companies try to charge extra for faster delivery of digital content like streaming video.

As a researcher in computer networks and security, I note that the problems are not just philosophical: The internet is based on technologies that complicate the task of commercializing the online world.

The ‘true’ internet

In practice, the designers of the technology at the foundation of the internet were not really attempting to enforce any particular philosophy. One of them, David Clark, wrote in a 1988 paper that early internet architects did consider commercial features, such as accounting. Being able to keep track of how much data – and which data – each user is sending is very useful if those users are to be charged for connectivity. However, most of those commercial features were left out because they weren’t needed on a government and military network.
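To get a feel for what that kind of accounting involves, consider the sketch below. It is a hypothetical illustration, not drawn from any real router or billing system, and the class and field names are invented for this example. It simply tallies bytes per user as packets pass through – the bookkeeping the early architecture never built in.

```python
# Minimal per-user accounting sketch: tally how much data each user
# sends, the kind of bookkeeping a billing system would need.
# Hypothetical illustration; names are invented for this example.

from collections import defaultdict

class UsageMeter:
    """Accumulates bytes sent per user ID as packets are forwarded."""

    def __init__(self):
        self.bytes_sent = defaultdict(int)

    def record(self, user_id: str, packet_size: int) -> None:
        """Called once per forwarded packet to update the user's tally."""
        self.bytes_sent[user_id] += packet_size

    def bill(self, user_id: str, cents_per_megabyte: float) -> float:
        """Compute a usage charge in cents for one user."""
        megabytes = self.bytes_sent[user_id] / 1_000_000
        return megabytes * cents_per_megabyte

meter = UsageMeter()
meter.record("alice", 1500)  # a typical full-size packet
meter.record("alice", 400)
print(meter.bill("alice", cents_per_megabyte=2.0))
```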

Those decades-old decisions echo through the years: There is no effective, universal way to distinguish between different types of internet traffic – for example, to give some types priority or charge extra for others. And if whoever produces the traffic actively tries to evade restrictions, separating content becomes even more difficult.
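To see why, consider the most basic approach: guessing an application from the port number in a packet’s header. The sketch below is a toy illustration of that idea, not any provider’s actual system, and it shows where the approach breaks down once traffic is encrypted or deliberately disguised.

```python
# Toy traffic classifier: guess an application from a packet's
# destination port. Hypothetical illustration only.

WELL_KNOWN_PORTS = {
    25: "email (SMTP)",
    53: "DNS lookup",
    80: "web (HTTP)",
    443: "encrypted web (HTTPS)",
}

def classify_by_port(dst_port: int) -> str:
    """Return a best guess at the traffic type for a destination port."""
    return WELL_KNOWN_PORTS.get(dst_port, "unknown")

# The weakness: video, voice calls, downloads and ordinary web pages
# can all travel over port 443 inside encrypted TLS sessions, so the
# port alone reveals almost nothing about what the traffic really is.
print(classify_by_port(443))   # "encrypted web (HTTPS)" -- but is it video?
print(classify_by_port(6881))  # "unknown" -- a sender can pick any port
```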

Using old tools in new ways

One of the few sources of information about how internet companies handle this challenge is recent research from Northeastern University. It suggests that they may be using a technique called “deep packet inspection” to identify, for example, video traffic from a particular streaming service. Internet providers can then decide at what speed to deliver that traffic: whether to throttle it or give it priority.
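The sketch below gives a rough sense of how such inspection can work. It is a deliberately simplified, hypothetical example: real DPI appliances parse protocol structures, such as the Server Name Indication field a client sends in cleartext during a TLS handshake, rather than scanning raw bytes, and the two video-delivery domains are used here only as example signatures.

```python
# Crude deep packet inspection sketch: scan a packet's payload for
# byte patterns that identify a known service. Hypothetical example;
# real DPI systems parse protocol fields (e.g., the TLS Server Name
# Indication) instead of searching raw bytes.

# Example signature table mapping byte patterns to services.
SIGNATURES = {
    b"nflxvideo.net": "Netflix video",
    b"googlevideo.com": "YouTube video",
}

def inspect_payload(payload: bytes) -> str:
    """Return the first matching service name, or 'unclassified'."""
    for pattern, service in SIGNATURES.items():
        if pattern in payload:
            return service
    return "unclassified"

# During a TLS handshake the destination hostname travels unencrypted,
# so even encrypted video sessions can be identified this way -- and
# then throttled or prioritized. (Payload bytes below are made up.)
sample_handshake = b"\x16\x03\x01...occa-nflxvideo.net..."
print(inspect_payload(sample_handshake))  # "Netflix video"
```

Once traffic is labeled this way, the speed decision is just policy: the provider maps each label to a rate limit or a priority queue.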