Web Gateways

Designed for Distributed. Built for the Cloud.
Delivered as a Service.


The Disappearance of the Network Edge and Why Cloud Will Prevail

Modern content delivery technology minimizes the significance and impact of the network edge, thanks in large part to the benefits and capabilities of cloud computing.

Greek philosopher Empedocles defined God as “a circle whose center is everywhere and its circumference nowhere.” As modern definitions of the cloud go, theology entirely to the side, that same language is equally apt. The emerging truth of modern networking, however, is that the network edge is nowhere near as useful or necessary a concept as it was in an earlier world with a crisper sense of what’s local, what’s core, and where the boundary between them lies.

The Cloud Is Never More Than Two Hops Away
These days, with peering relationships multiplying at a furious rate, and with the expansion of content delivery infrastructure from providers such as Akamai, Incapsula, BitGravity, Cloudflare, EdgeCast Networks, CDNetworks, and others, there’s not much latency to overcome in accessing even large volumes of content or data. Chances are better than even that a typical request (or much of its pieces and parts) will be answered by servers strategically positioned no more than two hops away from any ISP’s closest Internet point of presence (POP). That brings the core (or rather, the connections that the core delivers) remarkably close to most Internet users, and helps make the network edge largely irrelevant.

It’s still as true as ever that bandwidth and responsiveness drop dramatically over the proverbial last mile of any WAN link, including links to the Internet. But the cost of waiting for responses to requests for information (HTTP GET requests, by and large, given the typical composition of modern Internet traffic) has been mostly forestalled by the content servers stationed at the edge. Much of what is requested has been requested before, and sits on those servers ready to be delivered.
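The idea that repeated requests never need to travel back to the origin can be sketched as a minimal in-memory edge cache. This is an illustrative sketch only, not any particular CDN’s implementation; the class, TTL policy, and fetch stand-in are all hypothetical:

```python
import time

class EdgeCache:
    """Minimal in-memory cache, as an edge content server might use to
    answer repeated HTTP GET requests without a round trip to the origin.
    Illustrative sketch; names and TTL policy are hypothetical."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}          # url -> (content, expiry timestamp)
        self.origin_fetches = 0  # counts trips back to the origin server

    def fetch_from_origin(self, url):
        # Stand-in for an actual HTTP request to the origin.
        self.origin_fetches += 1
        return f"content for {url}"

    def get(self, url):
        entry = self.store.get(url)
        if entry and entry[1] > time.time():
            return entry[0]  # cache hit: served locally, no origin trip
        content = self.fetch_from_origin(url)  # cache miss: go upstream once
        self.store[url] = (content, time.time() + self.ttl)
        return content

cache = EdgeCache()
cache.get("/video/intro.mp4")  # miss: fetched from origin
cache.get("/video/intro.mp4")  # hit: served from the edge
print(cache.origin_fetches)    # -> 1
```

Two identical requests cost only one origin fetch; every subsequent request within the TTL window is answered from the edge, which is exactly why the last mile’s latency matters so much less than it once did.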

The Cloud Makes the Connections, and Keeps the Content
Behind the scenes, what enables these localized content servers to do their jobs is a seamless and direct connection into the cloud. Content creators and controllers push their content into the cloud first and foremost, and content aggregators and distributors grab that content from the cloud to make it as local for consumers as geography and network topology will allow. The magic lies in the ability of producers to move and store information in the cloud as needed, and of consumers to later request and obtain that information from the cloud in similar fashion.

Thus, the cloud’s ability to short-circuit connections on both the sending and receiving sides cuts most of the network latency out of the communications equation. Cloud providers do everything they can to minimize the latency of the links on each side of that connection, and they also cut out the need for producers and consumers to interact directly, as far as intelligent pre-fetch, caching, and deduplication technologies will allow. This takes the impact of the slower communications tier, from the edge to the endpoint, out of the equation, and bolsters the proposition that the network edge is disappearing into irrelevance as the cloud prevails.
