Cloud computing’s advantages over legacy, on-premises deployments are well understood by most modern enterprises. These advantages include the low overhead of hardware maintained by a third party, massive horizontal scalability thanks to effectively unlimited hardware from the customer’s point of view, and a billing model in which customers pay only for what they use, when they use it.
These advantages and others have driven an increasing share of enterprise workloads to the cloud. As of the end of 2020, enterprise spending on cloud services and infrastructure was 56% higher than spending on on-premises systems. Given its rate of growth, cloud spending will continue to take a larger share of the enterprise IT market in the coming years.¹
This trend brings more and more enterprise software workloads to geographically centralized locations, which are simply wherever cloud data centers have been built.
At the same time, a counter-force works to make computing more evenly distributed around the globe. This paradigm is edge computing.
Edge computing is a distributed computing paradigm that brings computation and data storage closer to the location where it is needed to improve response times and save bandwidth.²
When anyone uses an internet-connected device, a server computer somewhere must execute instructions and relay content to that device. Edge computing seeks to bring that logic and content closer to the end user, saving both time (usually measured in milliseconds) and money.
I recently read a book called How to Hide an Empire, part of which explains the United States’ post-WWII military strategy to build hundreds of military bases in far-reaching locations all over the world. This strategy of building a distributed network of many smaller bases has aided the US in responding to events more quickly and leveraging agile teams to fight more effectively.
I find interesting parallels in how leading providers of edge computing services today have built networks with a higher number of smaller, more agile data centers all over the world, increasing their literal footprint and flexibility. I’ll focus on how Cloudflare, a leading provider in the space, has taken advantage of point of presence (PoP) servers and software-defined networking (SDN) to build a strategic position.
A Primer on Cloudflare
Cloudflare builds Infrastructure-as-a-Service solutions that help companies deliver digital content quickly and securely to end users. One capability Cloudflare has developed to make their solution best in class is an extensive network of PoP servers, as mentioned above.
The physical proximity of the server computer that delivers content (a YouTube video, a Wikipedia article and its images, a Twitter timeline, etc.) makes a real difference in the latency an end user experiences. The bulk of the latency reduction comes from reduced network complexity when digital content doesn’t have to travel as far. Traveling across the world by airplane is similar in one sense: we generally avoid routes with numerous layovers not because the time spent flying is significantly different, but because the time spent getting on and off planes, moving bags around, and walking between terminals adds substantially to the total travel time.
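To put rough numbers on the distance effect alone, here is a back-of-the-envelope sketch. It assumes light travels through fiber at roughly 200,000 km/s (about two-thirds the speed of light in vacuum) and ignores routing hops, queueing, and processing time, which in practice add even more:

```javascript
// Rough round-trip propagation delay over fiber, ignoring routing,
// queueing, and processing time. Light in fiber covers roughly
// 200 km per millisecond (~two-thirds of c).
const FIBER_KM_PER_MS = 200;

function roundTripMs(distanceKm) {
  return (2 * distanceKm) / FIBER_KM_PER_MS;
}

// A request served from a data center ~8,000 km away vs. a PoP ~100 km away:
console.log(roundTripMs(8000).toFixed(1)); // "80.0" — noticeable on every request
console.log(roundTripMs(100).toFixed(1));  // "1.0"
```

Even this idealized model shows why a nearby PoP matters: tens of milliseconds per round trip, multiplied across the many round trips a typical page load requires.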
While Cloudflare’s network with many edge server locations across the globe is crucial to its success, it is not the most innovative aspect of its architecture. Similar networks have existed since the early days of the internet in the 1990s, as millions of people came online for the first time and required a similarly distributed network of server computers. However, Cloudflare has also taken the concept of software-defined networking to a new level, giving it an edge in flexibility and speed.
In the earliest days of the internet, networking routes were defined by the literal wiring patterns of servers and switches.
In this networking paradigm, the mechanisms that govern where requests are routed and where data flows live on the same hardware through which the data itself flows, tightly coupling the two. This limits the control and flexibility network administrators have, since it is difficult to change the networking logic without touching the hardware, and vice versa.
Software-defined networking, on the other hand, separates networking logic from the hardware network itself, providing significantly more flexibility.
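As a loose illustration of the idea (the names and structure here are hypothetical, not any real SDN API): the routing logic becomes plain data owned by a controller, and the "hardware" merely forwards according to it, so routes can change at runtime without rewiring anything:

```javascript
// Hypothetical sketch of control-plane/data-plane separation.
// The routing table is data owned by a controller; the forwarding
// layer only looks up next hops and never hardcodes topology.
class Controller {
  constructor() {
    this.routes = new Map(); // destination prefix -> next hop
  }
  setRoute(prefix, nextHop) {
    this.routes.set(prefix, nextHop); // reconfigure in software, no rewiring
  }
  lookup(destination) {
    for (const [prefix, nextHop] of this.routes) {
      if (destination.startsWith(prefix)) return nextHop;
    }
    return null;
  }
}

const controller = new Controller();
controller.setRoute("10.0.", "edge-pop-fra"); // hypothetical Frankfurt PoP
controller.setRoute("10.1.", "edge-pop-sin"); // hypothetical Singapore PoP

console.log(controller.lookup("10.1.42.7")); // "edge-pop-sin"

// Traffic can be re-steered purely in software, e.g. for failover:
controller.setRoute("10.1.", "edge-pop-nrt");
console.log(controller.lookup("10.1.42.7")); // "edge-pop-nrt"
```

The point of the sketch is the separation itself: the forwarding decision and the policy that drives it no longer live in the same box.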
The concept of software-defined networking was largely popularized by the OpenFlow network communications protocol, introduced in 2008 by a group of researchers at Stanford. Cloudflare, founded in 2009, rode the wave as SDN became increasingly popular and brought its benefits to customers at enterprise scale. With the flexibility of software-defined networking and the strategic positioning of PoP servers around the globe, Cloudflare positioned itself well to deliver innovation after innovation on top of these capabilities.
CDN to Serverless to SASE
Cloudflare’s architecture in its earliest days allowed the company to build a best-in-class content-delivery network, or CDN. CDNs ensure that static content can quickly be delivered to end users by caching content between them and the origin application server. I alluded to this above with the example of loading a YouTube video or Twitter timeline.
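At its core, the caching behavior a CDN provides can be sketched as a TTL cache sitting in front of an origin fetch. This is a toy model, not Cloudflare’s implementation; the function names are invented for illustration:

```javascript
// Toy edge cache: serve from cache while fresh, otherwise go to the origin.
function makeEdgeCache(fetchFromOrigin, ttlMs) {
  const cache = new Map(); // url -> { body, expiresAt }
  return function get(url, now = Date.now()) {
    const entry = cache.get(url);
    if (entry && entry.expiresAt > now) {
      return { body: entry.body, hit: true }; // fast: served at the edge
    }
    const body = fetchFromOrigin(url); // slow: full trip to the origin server
    cache.set(url, { body, expiresAt: now + ttlMs });
    return { body, hit: false };
  };
}

let originCalls = 0;
const get = makeEdgeCache((url) => { originCalls++; return `content of ${url}`; }, 60_000);

console.log(get("/video.mp4").hit); // false — first request goes to the origin
console.log(get("/video.mp4").hit); // true  — repeat requests served at the edge
console.log(originCalls);           // 1
```

One origin trip serves many nearby users for the lifetime of the cache entry, which is where both the latency and the bandwidth savings come from.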
Similarly to how cloud data centers provide flexible resource allocation to customers on-demand, Cloudflare’s flexible networking model helped the company to deliver both better performance and a lower price point with its CDN, even offering a free version of the product, which was unusual at the time of its launch.
Having leveraged its PoP network and networking model to help customers more easily deliver static content to end users, Cloudflare moved adjacently into serverless compute with the introduction of Cloudflare Workers in 2017.
From Cloudflare’s website:
Cloudflare Workers provides a serverless execution environment that allows you to create entirely new applications or augment existing ones without configuring or maintaining infrastructure.
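A minimal Workers-style handler has roughly the following shape. This is a sketch of the module-syntax API with an illustrative route; it is written as a plain object here so it also runs outside the Workers runtime (in an actual Worker, the object would be the module’s default export):

```javascript
// Sketch of a Workers-style fetch handler. Request/Response/URL are
// standard web APIs, so this also runs on Node 18+ for demonstration.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    if (url.pathname === "/hello") {
      return new Response("Hello from the edge!", {
        headers: { "content-type": "text/plain" },
      });
    }
    return new Response("Not found", { status: 404 });
  },
};

// Exercising the handler directly:
worker.fetch(new Request("https://example.com/hello"))
  .then((res) => res.text())
  .then((body) => console.log(body)); // "Hello from the edge!"
```

The appeal is that this code runs at whichever PoP is closest to the user, with no servers for the developer to provision or maintain.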
Most recently, Cloudflare has further expanded their security capabilities with the development of Cloudflare One, a set of network security products formalized into a single solution in an October 2020 announcement.³
Enterprise IT used to take a sort of castle-and-moat approach to information security. The bulk of sensitive company data was stored in the corporate data center, and access was restricted to devices within the local corporate network. The company data center was the castle in which value was stored, and employees had to travel to the office, inside its moat, to have any access to it.
Enforced centralization of enterprise information security via castle and moat has become unwieldy for two key reasons: 1) SaaS applications have brought corporate data into the cloud and made its access more distributed in nature and 2) distributed workforces increase latency when all corporate network access requires connecting with a central location.⁴
Enter SASE, which stands for Secure Access Service Edge. SASE employs distributed security enforcement, requiring requests to be authenticated at the edge of the network rather than within a single hub. SASE is commonly associated with the Zero Trust security model, which dictates that requests must be verified always and everywhere, even when they originate from within the local corporate network.
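The shift from castle-and-moat to Zero Trust can be sketched as follows. This is a deliberately hypothetical model: real deployments use signed tokens (e.g. JWTs) and an identity provider, not an in-memory set, and the request shape here is invented for illustration:

```javascript
// Hypothetical Zero Trust check performed at whichever edge PoP a
// request arrives at. The source network confers no trust at all.
const validTokens = new Set(["token-alice", "token-bob"]); // stand-in for an identity provider

function authorizeAtEdge(request) {
  // Note: request.sourceIp is deliberately ignored — being on an
  // "internal" network is not evidence of identity.
  const token = request.headers["authorization"];
  return validTokens.has(token);
}

console.log(authorizeAtEdge({
  headers: { authorization: "token-alice" },
  sourceIp: "203.0.113.9", // remote worker on the public internet
})); // true

console.log(authorizeAtEdge({
  headers: { authorization: "" },
  sourceIp: "10.0.0.5", // corporate LAN gets no free pass
})); // false
```

Because the check runs at the edge rather than at a central hub, a remote employee’s traffic is verified at the nearest PoP instead of being backhauled across the world first.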
Yet again, Cloudflare’s two key capabilities have been essential in developing this new solution. A distributed information security architecture requires (a) a well-distributed network of point-of-presence servers and (b) a flexible, software-defined network routing model, both to keep latency low no matter where a request originates and to handle the complexities that inevitably arise in managing a network with so many endpoints.
The Flexible Edge of the Future
In their 2017 research paper on edge computing⁵, Jianli Pan and James McElhannon estimate that 50 billion internet-connected devices will be in circulation by 2025. This explosion of devices will generate more data and demand lower latency than traditional, centralized data centers can fully service. The authors also argue that the high fixed cost of such massive data centers is prohibitive, and that edge computing networks will benefit from more open-source development, making edge-native products increasingly viable for enterprises.
With such powerful tailwinds at their backs, companies with strategic investments at the edge and with the software-enabled flexibility to develop new products will continue to build best-in-class solutions and win market share, as we’ve seen Cloudflare do over the last ten years.
⁵ Future Edge Cloud and Edge Computing for Internet of Things Applications