Clear up some confusion

Edge or cloud: Which horse to back in the content delivery race?

Arianna Aondio

Edge computing has been referred to as “nebulous”. Yet, there appears to be much hype about the term on the web. In this article, Arianna Aondio gives a clearer image of what edge computing is and discusses whether and when to choose edge over cloud.

Two competing approaches – cloud and edge – are in a race to become the winning content delivery architecture. If you browse the web, you get the impression that edge computing is the clear favorite.

Before we discuss whether and when to choose edge over cloud (or combine the two), let’s dig deeper into “edge computing”. What is edge computing, and why is there so much hype about a term that RedMonk analyst James Governor described in a recent webinar as “nebulous”?

Edge computing is all about pushing application logic to the extremes of a network. The need for it arose with the Internet of Things (IoT). As IoT relies on many distributed devices and sensors talking to each other, data collection and processing must be handled to suit this new paradigm. Instead of transmitting lots of data from the device to the backend, the idea arose to push the logic onto the device itself, reducing latency and processing times, improving performance and allowing the device to become “smart”. The broader definition of edge computing includes any strategy that moves logic closer to the end user – within the device or on a server living on an architecture layer close to the client browser.

SEE ALSO: The state of serverless computing: Current trends and future prospects

This broader definition includes the strategy of combining content delivery with edge computing and edge caching. You are likely familiar with caching: creating a temporary storage area that mirrors a site or application’s content. The cache serves a visitor’s request to a site or application, eliminating the need for the server to fetch the requested content from the backend. Consequently, server overload, the most common cause of slow content delivery, is removed.
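The mechanism can be sketched as a minimal in-memory cache with a time-to-live per entry. This is only an illustration of the idea, not how any particular cache software works; `fetch_from_backend` is a hypothetical stand-in for the origin request.

```python
import time

class SimpleCache:
    """Minimal in-memory cache with a per-entry TTL (time to live)."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (content, timestamp)

    def get(self, url, fetch_from_backend):
        entry = self.store.get(url)
        if entry is not None:
            content, stored_at = entry
            if time.time() - stored_at < self.ttl:
                return content  # cache hit: the backend is not contacted
        # Cache miss (or stale entry): fetch once, then serve from cache.
        content = fetch_from_backend(url)
        self.store[url] = (content, time.time())
        return content

# Usage: repeated requests for the same URL hit the cache,
# so the backend is only contacted once.
calls = []
def fetch_from_backend(url):
    calls.append(url)
    return f"<html>content of {url}</html>"

cache = SimpleCache(ttl_seconds=60)
cache.get("/home", fetch_from_backend)
cache.get("/home", fetch_from_backend)
```

After the two requests above, `calls` contains a single entry: the second request never reached the backend, which is exactly the offloading effect described.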

Let’s have a look at a simplified content delivery architecture: apart from the origin servers producing the content, you have a cloud layer responsible for the “heavy lifting”, where this content is usually cached and distributed. You might also have a fog layer to relieve network and bandwidth constraints. Finally, the edge layer is where data is collected, processed and filtered.

Depending on how flexible your cache software is, you can add caching to any of those layers. Adding caching to the edge layer reduces response time, as the request doesn’t have to travel all the way back to the cloud. This not only offloads your cloud layer but also improves the performance of the content delivery (lower response latencies), improves security (authentication and authorization) and provides a better user/customer experience. Also, having more granular control at the edge increases the flexibility of content delivery, as you stay in control of the HTTP behavior.
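The way an edge cache short-circuits the round trip can be illustrated with a toy model of the layered lookup path. The latency figures below are made up for illustration, not measurements of any real deployment.

```python
# Toy model of a layered content delivery path: edge -> cloud -> origin.
# The per-hop latency figures (milliseconds) are illustrative only.
LATENCY_MS = {"edge": 10, "cloud": 80, "origin": 200}

def serve(url, edge_cache, cloud_cache, origin):
    """Return (content, accumulated latency) for a request walking the layers."""
    latency = LATENCY_MS["edge"]
    if url in edge_cache:
        return edge_cache[url], latency            # served directly at the edge
    latency += LATENCY_MS["cloud"]
    if url in cloud_cache:
        edge_cache[url] = cloud_cache[url]         # populate the edge on the way back
        return cloud_cache[url], latency
    latency += LATENCY_MS["origin"]
    content = origin[url]                          # full round trip to the origin
    cloud_cache[url] = content
    edge_cache[url] = content
    return content, latency

origin = {"/product/42": "product page"}
edge, cloud = {}, {}
first = serve("/product/42", edge, cloud, origin)   # cold caches: 10 + 80 + 200 ms
second = serve("/product/42", edge, cloud, origin)  # edge hit: 10 ms
```

With cold caches the first request pays the full trip; once the edge is populated, subsequent requests are answered in a fraction of that time, which is the offloading and latency benefit described above.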

However, introducing edge caching also means more maintenance plus the need for scheduling and provisioning to keep the servers up and running. Therefore, before you place all your bets on the single horse called edge computing, you should evaluate if your cloud layer is actually sufficient to fulfill your content delivery needs.

SEE ALSO: Monitoring serverless computing for modern business applications

The following questions help you evaluate your needs and find the right tradeoffs that are in line with your business requirements:

  • Do I really need to process logic on the edge or can I afford to wait until the user requests get to the cloud layer?
    Background: Shuffling data back and forth is quite expensive, as bandwidth limitations are still a big issue in the content delivery world. If you already know that your network gets easily congested, it makes sense to run logic on an edge layer.
  • How much user personalization does my website have?
    Background: For an e-commerce site, user personalization is mandatory. The better it is, the more the site will likely sell. However, user-personalized content requires more round trips and more bytes to be exchanged between the client browser and the servers with the content. Therefore, for personalized e-commerce sites, using at least some degree of edge logic or edge caching makes a lot of sense and is a good investment to support the personalization initiatives.
  • Do I use a lot of streaming to deliver the content?
    Background: For streaming, you don’t really need much edge logic; instead, you would rather purchase extra cloud instances very close to the audience you are streaming to.
  • Where is my audience located?
    Background: If your users are distributed across several geographic locations, it might make sense to have many distributed edge caches to provide a low-latency experience even to the most remote consumers.

Arianna Aondio

Arianna Aondio is Tech Evangelist at Varnish Software, the company behind the open source HTTP engine Varnish Cache. Follow Arianna on Twitter @ariannaaondio.
