INTERNET: Building a Crash-Proof Internet


David P. Dillard


Building a Crash-Proof Internet
by Bennett Daviss
New Scientist, issue 2714, 29 June 2009

ON 18 July 2001, a freight train derailed in the Howard Street tunnel
running beneath downtown Baltimore, spilling 20,000 litres of hydrochloric
acid. The resulting chemical fire destroyed fibre-optic cables owned by
eight major US internet carriers. Moments later, Verizon Communications,
which operates key portions of the internet's physical infrastructure in
the US, lost links to two operations buildings and several other carriers'
networks. For many hours, internet traffic slowed to a crawl across the
entire country. "That tunnel is basically the I-95 [the main US East Coast
highway] for fibre," one repair contractor told reporters. "It was a
once-in-a-lifetime place for vulnerability."

Eight years on, and events have proved otherwise. A series of catastrophic
failures seems to suggest that the internet is rather more vulnerable to
accidents, earthquakes or misplaced ships' anchors than people thought. At
tens, perhaps hundreds, of places around the world, the net seems to be
hanging by a thread.

These days a major failure has the potential to cause far greater
disruption than in 2001. Yet much of the internet's physical
infrastructure is decades old. It badly needs upgrading, but clearly we
can't just tear up sections of the network and rebuild them from scratch.
Nor is it likely that governments and telecoms companies will bear the
enormous costs of laying extra connections simply to insure against
temporary problems. So how can we make the net more resilient?

Nick McKeown, a computer scientist at Stanford University in California,
thinks he has the answer. He believes the key to a better net lies with a
prosaic black box called a router.

Routers are the internet's traffic controllers. There are millions in
service, linking up the thousands of networks that make up the internet.
They can direct huge flows of traffic for internet service providers, or
just provide connectivity between a handful of computers. They check the
addresses on data packets, direct them to the right destination and
dictate which physical path they take to get there. When a connection
breaks, they play a crucial role in helping divert data around it.
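The core of that address-checking job is a longest-prefix-match lookup: of all the routes whose destination prefix contains a packet's address, the router forwards along the most specific one. A minimal sketch of that idea, using only Python's standard library (the prefixes and link names here are illustrative, not from the article):

```python
import ipaddress

# Toy routing table: destination prefix -> outgoing link.
# All prefixes and link names are hypothetical examples.
ROUTES = {
    ipaddress.ip_network("10.0.0.0/8"): "link-A",
    ipaddress.ip_network("10.1.0.0/16"): "link-B",
    ipaddress.ip_network("0.0.0.0/0"): "default-link",  # catch-all route
}

def next_hop(dest: str) -> str:
    """Forward to the most specific (longest) prefix containing dest."""
    addr = ipaddress.ip_address(dest)
    matches = [net for net in ROUTES if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return ROUTES[best]
```

Here `next_hop("10.1.2.3")` picks "link-B" because the /16 route is more specific than the /8, while an address outside both falls through to the default route; real routers perform this lookup in specialised hardware for every packet.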

At the moment, though, routers are part of the problem, not the solution.
For one thing, they can be very slow to find a way around a blockage, and
in the many minutes it often takes, traffic backs up into jams so huge
that much of the data is simply discarded.
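The failover the article describes can be sketched as route withdrawal: when a link dies, every path using it must be discarded and traffic shifted to the next-cheapest survivor, and until that recomputation finishes across the network, packets pile up or are dropped. A toy illustration under assumed names (the link names and costs are invented for this sketch):

```python
# Hypothetical table: destination -> list of (link, cost) alternatives.
routes = {
    "dest-net": [("tunnel-link", 1), ("backup-link", 5)],
}
failed = {"tunnel-link"}  # e.g. the fibre lost in the tunnel fire

def best_link(dest: str) -> str:
    """Cheapest surviving path; with no survivor, packets are dropped."""
    alive = [(link, cost) for link, cost in routes[dest] if link not in failed]
    if not alive:
        raise RuntimeError("destination unreachable: traffic discarded")
    return min(alive, key=lambda lc: lc[1])[0]
```

With "tunnel-link" marked failed, `best_link("dest-net")` falls back to "backup-link"; the article's point is that propagating this kind of recomputation across millions of real routers can take many minutes.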

Though numerous potential solutions to these problems exist, the other big
sticking point is that there is nowhere to test them. Any update of router
software ought first to be thoroughly tested on a large network - one that
has all the complexity of the internet but which is physically isolated
from it. Yet nothing like that exists.


The complete article may be read at the URL above.

David Dillard
Temple University
(215) 204 - 4584
[hidden email]