Catching the Web in a Net of Neutrality

Imagine a world in which millions of senior citizens and disabled Americans, among others, can choose to have their medical conditions monitored continuously by devices that communicate over high-speed broadband networks and can automatically alert them if they require immediate medical attention. Such “remote disease management” systems not only would be highly convenient for patients but, based on evidence from the Veterans Administration’s use of systems that do not yet make extensive use of broadband, could lead to huge savings in health care costs. I have calculated in a recent report that the health care cost savings and the reduced need for institutionalizing seniors and the disabled could top $1 trillion over the next 25 years.

But there is a hitch. Remote disease monitoring — and telemedicine more broadly — cannot use broadband networks unless they are reliable. Even more important than keeping your streamed movie from being interrupted by heavy traffic from other Internet users is ensuring that your vital signs are transmitted without interruption to the individual or computer that is remotely monitoring your health.

Yet perhaps without realizing it, those who are now advocating “net neutrality” — the notion that those who shell out the big bucks to build new much higher speed networks can’t ask the websites that will use the networks intensively to help pay for them — could keep this new world from becoming a reality. Further, they could deprive the websites themselves of the benefits of being able to use the networks to deliver their data-heavy content.

Admittedly, this is not readily apparent from the broadsides that net neutrality supporters are lobbing against the telecommunications companies. If they are to be believed, the firms that want to build a premium network could engage in price gouging or unfair discrimination, perhaps even destroying the Net itself. Nothing could be further from the truth.

Economists are fond of saying that there are no “free lunches,” which is to say that new products and services don’t magically appear. Those who benefit from them pay for them. A corollary of this simple principle is that markets will not work efficiently — that is, they will not generate the maximum output at the least cost — unless prices fully reflect all of the costs of products sold or services delivered. “All” of the costs include not only those that the producers bear, but also the “externalities” that they or users may impose on others. Pollution, for example, is a well known externality. We have a whole raft of environmental laws to “make polluters pay” because otherwise the prices of producing steel, cars and other pollution-generating industrial processes and products would be too low: society would purchase more of these items than is socially desirable.

The Internet is no different. There are well known externalities associated with the Internet. One positive externality is that the more users there are, the more beneficial it is to be plugged in, and the more profitable it is to write software for Net applications. But increasingly, as content like movies, real-time games, and other data-heavy services like remote disease monitoring are made available, some data imposes negative externalities — traffic congestion, if you will — that adversely affect the ability of others to use the Net reliably.

Until recently, traffic congestion on the Net was not a problem. There was so much excess capacity in the fiber optic cables and other parts of the complex telecommunications network that additional data-heavy traffic delivered from one site did not threaten the reliability of traffic delivered from other sites and routed through the Net. But that blissful world is gone now. The existing networks are rapidly running out of excess capacity. We need new cyber-highways if the brave new world of movies, fast Google searches, and telemedicine — to take a few examples — is to become at all viable.

The question, then, is: who should pay for these much higher speed networks? Asking all users to pay the same amount, regardless of how much data they download, hardly seems fair. It would be like asking double-wide trailer trucks to pay the same taxes for using our real-world highways as you and I who drive our much smaller cars. In fact, state governments tend not to do this: they require trucks to pay taxes, based on weight per axle, that you and I with our cars (or SUVs) don’t pay.

Why should telecoms companies that want to build the next-generation cyber-highways be treated any differently? Shouldn’t they at least be allowed to charge data-heavy sites more than others so that the many of us who don’t download lots of data don’t get socked?

It’s that simple, but the implications are profound. If, instead, telecoms companies are required by law to charge everyone the same amount for the next upgrade, there is a real risk that the charge will be so high that only a few data-heavy sites will be able to pay. But because those sites, in effect, will have been subsidized by everyone else, there will not be enough revenue collected to pay for the new networks. And they won’t get built. And the brave new world of telemedicine and the wondrous economic and personal benefits it could bring — let alone the benefits of all other kinds of uses for higher speed broadband networks — will be stillborn.

We all want our broadband and the benefits it can bring. Let’s hope our policy-makers in Washington can resist the siren song of “net neutrality” and keep government out of Internet regulation so that the future that beckons becomes a reality.