
May 15, 2008


barry payne


At the Stanford hearing before the FCC on April 17th, George Ford of the Phoenix Center asserted, in effect, that network congestion is a negative externality caused directly by customers crowding the network and overusing bandwidth underpriced by flat-rate billing, and implied that alleviating it is essentially an act of altruism by network providers, since only customers benefit from its mitigation.

In Comcast's case, congestion is a direct consequence of its strategy of maximizing low-use subscribers at the expense of high-use ones, and does not reflect the typical congestion that arises directly from phenomena like drivers in traffic jams or visitors to overcrowded free public parks. Nor does it reflect congestion managed by club goods subject to competition.

Instead, congestion is a consequence of Comcast's incentives, in the absence of effective competition, to oversell and underinvest in bandwidth capacity while manufacturing shortages and degrading service through deceptive marketing practices (aside from additional anti-competitive incentives). In short, the results manifest as classic monopoly undersupply and overpricing.

When Comcast markets bandwidth, it intentionally underprices it by way of degraded quality: it overstates availability at the maximum rate sold, with no minimum provided, and then places all risk of service interruption on the customer.

"Large users" of bandwidth and GBs are selectively forced off the network through a combination of threats, delays, interruptions and cancellations administered arbitrarily on a case-by-case basis - tactics specifically designed for the absence of uniform, neutral, transparent limits on bandwidth or GBs (such as the 250 GB cap recently floated by Comcast).

The service expectations of "low users" not subject to these tactics can be met by loading up the network with such subscribers as close to the point of congestion as possible without going past it. At the margin, a low-use subscriber brings in the same revenue as a high-use one under the flat rate for maximum bandwidth, so this strategy generates more total revenue across all customers for a given network bandwidth capacity.
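The revenue logic above can be sketched with a toy calculation. All figures here (flat rate, capacity, per-subscriber peak demand) are hypothetical illustrations, not Comcast's actual numbers:

```python
# Toy model: under a flat monthly rate, every subscriber pays the same
# regardless of use, so filling a fixed-capacity network with low-use
# subscribers yields more total revenue than serving fewer high-use ones.
# All numbers are assumed for illustration only.

FLAT_RATE = 45.0    # $/month, hypothetical
CAPACITY = 1000.0   # network peak capacity in Mbps, hypothetical

def max_subscribers(peak_demand_mbps):
    """Most subscribers the network can carry without peak congestion."""
    return int(CAPACITY // peak_demand_mbps)

def revenue(peak_demand_mbps):
    """Total monthly revenue when loaded to the edge of congestion."""
    return FLAT_RATE * max_subscribers(peak_demand_mbps)

high_use = revenue(5.0)   # high-use subscribers: 5 Mbps each at peak
low_use = revenue(0.5)    # low-use subscribers: 0.5 Mbps each at peak

print(high_use)  # 200 subscribers -> $9,000
print(low_use)   # 2,000 subscribers -> $90,000
```

The same pipe earns ten times more when filled with subscribers whose peak demand is a tenth as large, which is the incentive to cultivate low users and push high users off.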

However, congestion is still allowed to occur under this policy, and when it does, bandwidth is effectively overpriced for everyone in the peak period, since it is degraded to a lower, unexpected level of service quality.

But Ford would insist on the opposite - that bandwidth is underpriced at zero for the marginal GB consumed at peak, which is poised to drive aggregate consumption past the point of congestion.
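Ford's zero-marginal-price point can also be put in toy form. Assuming a simple hypothetical linear demand curve for each subscriber (none of these numbers come from the hearing), each user consumes until the marginal value of another GB falls to the price charged:

```python
# Toy sketch of the zero-marginal-price argument: when the marginal GB
# costs nothing, each subscriber consumes until their marginal value of
# another GB falls to zero, so aggregate peak demand can exceed capacity
# even though no individual exceeds their assigned maximum rate.
# Demand curve and capacity are hypothetical.

def individual_demand(price_per_gb, max_value=10.0, slope=0.1):
    """GB demanded where marginal value (max_value - slope*q) equals price."""
    return max((max_value - price_per_gb) / slope, 0.0)

SUBSCRIBERS = 100
CAPACITY_GB = 8000.0  # assumed peak-period capacity

demand_at_zero = SUBSCRIBERS * individual_demand(0.0)  # 100 GB each
demand_priced = SUBSCRIBERS * individual_demand(2.0)   # 80 GB each

print(demand_at_zero > CAPACITY_GB)   # True: congestion at a zero price
print(demand_priced <= CAPACITY_GB)   # True: a positive peak price clears it
```

The second line is also the counterpoint the following paragraphs make: a transparent positive peak price (or a stated cap) would ration use openly, rather than rationing it covertly through degraded service.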

This is a Catch-22 pricing model: service X is sold at price Y for expected use Z, but when GB use exceeds maximums intentionally withheld from users, individual use is restrained selectively over the longer run, even as aggregate congestion is allowed to occur in peak periods in the short run.

The effect is to sell bandwidth on an "as is" spot-market basis over the billing cycle while blaming customers for congestion due to "shared bandwidth, exafloods of GB use and bandwidth hogs," when no individual connection ever uses more than the maximum bandwidth assigned and made available by Comcast.

Clearly, if subject to effective competition, even providers who refuse to set limits on bandwidth use beyond the maximum burst rate would have incentives to invest in sufficient bandwidth to avoid congestion.

Otherwise, under competition, differentiated prices - such as for on- and off-peak periods - would be expected to supplant the widespread arbitrary discrimination enacted under the guise of "network management."

