Edge Computing Platform

Optimize and deliver high-speed user experiences


Edge Computing ensures ultra-low latency and high bandwidth/performance computing.

CDNetworks Edge Computing Platform (ECP) enables customers to meet growing business demands by effortlessly deploying and scaling up container-based applications. ECP places high-performance compute, storage and network resources as close as possible to end users. Doing so lowers the cost of data transport, decreases latency, and increases locality. ECP is a container orchestration system built on Kubernetes and Docker for customers to write container-based applications once and deploy them everywhere.

Resources

ECP Free Tier Program

Sign Up and get $500 credit

Product Highlights

1500+ PoPs Global Presence

CDNetworks provides unmatched scale for rapidly expanding your container-based applications

50+ Tbps High Bandwidth

Aggregated bandwidth ensures high performance and availability, even with peak traffic

< 50 ms Ultra Low Latency

Fast application processing and communication between edge and end points

Distributed PoP coverage to ensure ultra-low latency

Compatible with the TCP protocol

Automated deployment, self-healing, auto scaling, application monitoring & reporting

Comprehensive technical support

Edge Computing Platform Solution

ECP is an Infrastructure as a Service (IaaS) offering that provides Compute, Network and Storage resources for container instances, along with Kubernetes (K8s) container management at the edge.

Compute

CPU
Memory

Network

Public IPv4 and IPv6 network interface
Static IPs
Load Balancing

Storage

High performance local SSD persistent storage
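
For illustration, the sketch below shows how an application might claim persistent SSD-backed storage through the standard Kubernetes PersistentVolumeClaim API, using the official Kubernetes Python client. The storage class name "local-ssd", namespace and size are assumptions for this example, not documented ECP values.

```python
# Sketch: requesting persistent SSD-backed storage via a standard
# Kubernetes PersistentVolumeClaim, using the official Python client.
# "local-ssd" is a placeholder storage class name; the class exposed by
# the platform may differ.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

pvc_manifest = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "app-data"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],
        "storageClassName": "local-ssd",                 # hypothetical class name
        "resources": {"requests": {"storage": "10Gi"}},  # illustrative size
    },
}

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc_manifest
)
```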

Features

Automated Application Deployment

When developers specify a Pod, they can optionally specify the resources each container needs. Kubernetes runs a scheduler that automatically decides which nodes to place Pods on, based on those requests as well as predefined scheduling policies and preferences. Manual placement of applications is not required.
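
As a rough illustration, here is a minimal sketch of such a resource declaration using the official Kubernetes Python client; the image name, namespace and resource sizes are illustrative placeholders, not ECP-specific values.

```python
# Sketch: a Pod that declares CPU/memory requests and limits so the
# Kubernetes scheduler can pick a suitable node automatically.
from kubernetes import client, config

config.load_kube_config()

pod_manifest = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "web", "labels": {"app": "web"}},
    "spec": {
        "containers": [{
            "name": "web",
            "image": "nginx:1.25",  # placeholder image
            "resources": {
                "requests": {"cpu": "250m", "memory": "128Mi"},
                "limits":   {"cpu": "500m", "memory": "256Mi"},
            },
        }],
    },
}

# The scheduler places the Pod on a node with enough free resources.
client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod_manifest)
```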

Self-healing

Kubernetes restarts containers that fail, replaces and reschedules containers when nodes die, and kills containers that don’t respond to health checks.
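
As an example of how such a health check can be declared, the container-spec fragment below adds an HTTP liveness probe; the image, path and port are placeholders, and the fragment could be dropped into a Pod manifest like the one shown earlier.

```python
# Sketch: a container with a liveness probe. If the probe fails repeatedly,
# Kubernetes kills the container and, with the default restartPolicy of
# "Always" on the Pod, starts a fresh one.
liveness_container = {
    "name": "api",
    "image": "example/api:1.0",          # placeholder image
    "ports": [{"containerPort": 8080}],
    "livenessProbe": {
        "httpGet": {"path": "/healthz", "port": 8080},  # placeholder endpoint
        "initialDelaySeconds": 5,
        "periodSeconds": 10,
        "failureThreshold": 3,
    },
}
```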

Automatic Rolling Updates

The Deployment controller allows developers to perform application rollouts and rollbacks with ease.
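
A minimal sketch of a rolling update, assuming a Deployment named "web" already exists in the default namespace (both names are placeholders): patching the Pod template's image makes the Deployment controller replace Pods gradually while winding down the old ones.

```python
# Sketch: trigger a rolling update by patching a Deployment's container image.
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

apps.patch_namespaced_deployment(
    name="web",                # placeholder Deployment name
    namespace="default",
    body={"spec": {"template": {"spec": {"containers": [
        {"name": "web", "image": "nginx:1.26"}   # new image version
    ]}}}},
)

# A rollback to the previous revision can be done with:
#   kubectl rollout undo deployment/web
```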

Horizontal Pod Autoscaling (HPA)

Scale applications up and down automatically based on resource usage such as CPU and memory.
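
For example, the sketch below creates an autoscaling/v1 HorizontalPodAutoscaler for a hypothetical "web" Deployment, keeping average CPU utilization around 70% with between 2 and 10 replicas.

```python
# Sketch: an autoscaling/v1 HorizontalPodAutoscaler targeting a Deployment.
from kubernetes import client, config

config.load_kube_config()

hpa_manifest = {
    "apiVersion": "autoscaling/v1",
    "kind": "HorizontalPodAutoscaler",
    "metadata": {"name": "web-hpa"},
    "spec": {
        "scaleTargetRef": {
            "apiVersion": "apps/v1", "kind": "Deployment", "name": "web"
        },
        "minReplicas": 2,
        "maxReplicas": 10,
        "targetCPUUtilizationPercentage": 70,
    },
}

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa_manifest
)
```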

What is Edge Computing?

Edge computing is a network philosophy that aims to bring computing power, memory and storage as close to the end users as possible. The “edge” refers to the edge of the network, the location where the network’s servers can deliver computing functionalities to customers most expediently.

Instead of relying on a server at a centralized location like a data center, edge computing moves processing physically closer to the end user. The computation is done locally, like on a user’s computer, an IoT device or an edge server.

Edge computing minimizes the amount of long-distance communication that has to happen between a client and a centralized cloud or server. This results in lower latency, faster response times and reduced bandwidth usage.
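
A back-of-the-envelope calculation shows why distance matters; the distances below are illustrative, and only propagation delay over fiber is counted.

```python
# Sketch: round-trip propagation delay over fiber (~200,000 km/s, i.e. about
# 200 km per millisecond) for a distant cloud vs. a nearby edge PoP.
# Processing and queuing delays are ignored.
FIBER_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Propagation delay alone for one request/response round trip."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

print(f"Cloud 4,000 km away:  ~{round_trip_ms(4000):.0f} ms per round trip")
print(f"Edge PoP 100 km away: ~{round_trip_ms(100):.0f} ms per round trip")
```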


How Edge Computing Works

Edge computing works by allowing data from local devices to be analyzed at the edge of the network they are in, before being sent to a centralized cloud or edge cloud ecosystem. A network of data centers, servers, routers and network switches distributed across the globe processes and stores data locally, and each location can replicate its data to other locations. These individual locations are called Points of Presence (PoPs). Edge PoPs are physically closer to the device, unlike cloud servers which could be far away.

Traditionally, organizations ran multiple applications on physical servers. There was no easy way to allocate resources to all applications to ensure they all performed equally well. Then came virtual machines (VMs), which allowed applications to be isolated for better utilization of a server’s resources on the same hardware infrastructure.

Containers are similar to VMs, except that they can share the operating system (OS) among the applications. This makes containers portable across clouds and OS distributions. Developers can bundle and run applications effectively and in an agile manner, with no downtime.

In fact, the open-source platform Kubernetes helps developers automate much of the management of container applications. For example, it allows you to distribute network traffic in case one container is receiving high traffic, automate rollouts and rollbacks, restart containers that fail, check on their health and more.

Developers can deploy applications on the edge by building pods – small units of computing that group together one or more containers with shared storage and network resources. Kubernetes, or K8s as it is commonly called, can be deployed on every edge PoP to allow developers to build these pods on the edge themselves.
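
As a sketch of what such a pod looks like, the manifest below groups two placeholder containers that share the pod's network namespace and an emptyDir volume; image names and paths are illustrative only.

```python
# Sketch: a Pod whose containers share storage (an emptyDir volume) and
# networking (they can reach each other on localhost).
pod_manifest = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "edge-app"},
    "spec": {
        "volumes": [{"name": "shared-data", "emptyDir": {}}],
        "containers": [
            {
                "name": "app",
                "image": "example/app:1.0",           # placeholder image
                "volumeMounts": [{"name": "shared-data", "mountPath": "/data"}],
            },
            {
                "name": "log-shipper",
                "image": "example/log-shipper:1.0",   # placeholder image
                "volumeMounts": [{"name": "shared-data", "mountPath": "/data"}],
            },
        ],
    },
}
# Both containers see the same /data contents and share the pod's IP address.
```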

Consider a cloud gaming company with users from across the world streaming graphics-intensive content to their devices from a centralized cloud. The game has to respond to users’ keystrokes and mouse actions, and the data must travel to and from the cloud in milliseconds or less. This continual interactivity requires the company’s servers to store, fetch and process immense amounts of data. Additionally, modern cloud gaming often relies on 5G networks because of the stable ultra-low latency they promise.

The greater the distance to the servers, the more the data has to travel and the higher the chances of latency and jitter. This could lead to delays and a poor gaming experience for users.

By moving the computing closer to the edge and the users, data travels the minimum possible distance and players get a near latency-free experience regardless of the device they play on, whether a console or a personal computer. Running the data workloads at the edge makes it possible to render graphically intensive video, creates a better gaming experience overall, and helps the company do away with the costs of running a centralized infrastructure.

Why is Edge Computing Important for Privacy & Security?

Edge computing does come with some security concerns. Since the edge nodes are closer to the end users, edge computing often deals with large volumes of highly sensitive data. If this data leaks, there can be serious concerns about privacy violations.

As more IoT and connected devices join the edge network, the potential attack surface also expands. The devices and users in the edge computing environment could also be moving. This makes it difficult to design security rules to thwart attacks.

One approach to ensure security with edge computing is to minimize the processing done on the devices themselves. The data can be collected from the device, packaged and routed to an edge node for processing. This may not always be possible though, such as when sensors on self-driving cars or building-automation systems need to process data and make decisions in real-time.
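
A minimal sketch of that device-side pattern, with a hypothetical edge-node endpoint and sensor reading:

```python
# Sketch of the device side: collect a reading, package it, and forward it to a
# nearby edge node for processing. The endpoint URL and device ID are made up.
import time
import requests

EDGE_NODE_URL = "https://edge-node.example.com/ingest"  # hypothetical endpoint

def read_sensor() -> float:
    """Placeholder for an actual device reading."""
    return 21.7

payload = {
    "device_id": "sensor-0042",
    "timestamp": time.time(),
    "temperature_c": read_sensor(),
}

# The edge node, not the constrained device, performs the heavy processing.
requests.post(EDGE_NODE_URL, json=payload, timeout=2)
```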

Encryption of data at rest and in transit can help address some of the security concerns with edge computing. This way, even if data from the devices is intercepted or leaked, attackers will not be able to decipher any personal information.
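
For instance, a device could encrypt its payload with the widely used cryptography package before sending it; how the edge node obtains the key is out of scope here and assumed to be handled by a secure provisioning process.

```python
# Sketch: symmetric encryption of a device payload before it leaves the device,
# using the cryptography package's Fernet recipe.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, provisioned securely, not generated ad hoc
cipher = Fernet(key)

token = cipher.encrypt(b'{"device_id": "sensor-0042", "temperature_c": 21.7}')
# Even if `token` is intercepted in transit or leaked at rest, it is unreadable
# without the key.
original = cipher.decrypt(token)  # performed only by the authorized edge node
```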

The edge devices can also differ in their requirements for power, electricity and network connectivity. This raises concerns about their availability and what happens when one of the nodes goes down. Edge computing addresses this using Global Server Load Balancing (GSLB), a technology that distributes traffic among several different edge nodes. So when one node is overwhelmed and about to go down, others can take over and continue to fulfil user requests.
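
A toy sketch of that failover logic (PoP names, distances and health states are made up): requests go to the nearest healthy node, and traffic moves elsewhere when that node stops passing health checks.

```python
# Toy sketch of the GSLB idea: route each request to the closest healthy PoP
# and fail over automatically when a node goes down.
pops = [
    {"name": "frankfurt", "distance_km": 120,   "healthy": True},
    {"name": "london",    "distance_km": 640,   "healthy": True},
    {"name": "singapore", "distance_km": 10500, "healthy": True},
]

def pick_pop(pops: list[dict]) -> dict:
    """Choose the closest PoP that is currently passing health checks."""
    healthy = [p for p in pops if p["healthy"]]
    if not healthy:
        raise RuntimeError("no healthy PoP available")
    return min(healthy, key=lambda p: p["distance_km"])

print(pick_pop(pops)["name"])   # -> frankfurt

pops[0]["healthy"] = False      # frankfurt goes down...
print(pick_pop(pops)["name"])   # -> london: traffic fails over, service continues
```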

How Does Edge Computing Differ From Cloud Computing?

Cloud computing is a technology that allows for the delivery of storage, applications and processing power on an on-demand service basis over the internet. In the early days of computing, businesses had to set up data centers, hardware and other computing infrastructure to run their applications. This meant upfront costs, managing complexity and spending manpower on maintaining the infrastructure, all of which multiplied with scale.

Cloud computing essentially lets businesses “rent” access to data storage and applications from cloud service providers. The providers are responsible for owning and managing the centralized applications in their data centers, while businesses only pay based on their usage of these resources. Edge computing is different in that the applications and computation are moved closer to users.


Stateless vs. Stateful

Another crucial difference between cloud computing and edge computing lies in how they handle stateful and stateless applications.

Stateful applications are those that store information about previous transactions. Online banking or email are examples, where new transactions are performed in the context of what has happened before. Since these applications need to store more data about their state, they are better suited to the conventional cloud.

Stateless applications are those that don’t store any information in reference to past transactions. For example, entering a query in a search engine is a stateless transaction. If the search is interrupted or closed, you will start a new one from scratch. The applications which run on the edge are often stateless as they need to be moved around and require less storage and computation.
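
A minimal sketch of the distinction: the stateless handler below depends only on its input, while the stateful one reads and updates stored history. Both functions are purely illustrative.

```python
# Illustrative only: a stateless handler vs. a stateful one.
session_history: dict[str, list[str]] = {}   # server-side state (stateful case)

def stateless_search(query: str) -> str:
    # Any replica at any PoP can answer this; nothing is remembered afterwards.
    return f"results for '{query}'"

def stateful_transfer(account_id: str, action: str) -> str:
    # Depends on, and updates, previously stored state, so the replica handling
    # it must have access to that state (typically kept in a central store).
    session_history.setdefault(account_id, []).append(action)
    return f"{len(session_history[account_id])} actions recorded for {account_id}"
```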

Bandwidth requirements

Cloud computing and edge computing also differ in the bandwidth requirements of the applications they handle. Bandwidth refers to the amount of data that can travel between the user and the servers across the internet per unit of time. The more bandwidth an application needs, the greater the impact on its performance and the resulting costs.

Since data has to travel much farther to reach a centralized cloud, cloud computing is better suited to applications with modest bandwidth needs. When you have applications that require high bandwidth for their performance, edge computing is the way to go.

While edge computing and cloud computing may differ in many aspects, utilizing one does not preclude the use of the other. For example, to address the latency issues in a public cloud model, you can move processing for mission-critical applications closer to the source of the data.

Latency

One of the main differences between cloud computing and edge computing pertains to latency. Cloud computing can introduce latency because of the distance between users and the cloud. The edge infrastructure moves computing power closer to end users to minimize the distance that data has to travel, while still retaining the centralized nature of cloud computing.  Thus edge computing is better for latency-sensitive applications while cloud computing works for those for which latency is not a major concern.

Benefits of Edge Computing

Edge computing helps businesses provide seamless experiences to their users. Developers can use edge computing platforms to specify the right resources needed for applications and to deploy and scale them as necessary. Here are four ways in which edge computing benefits businesses.

1.   It helps save costs by optimizing bandwidth and cloud resources

As more offices get equipped with smart cameras, printers and other IoT devices, the bandwidth and cloud resources required also increase, driving up costs. Statista predicts that by 2025 there will be over 75 billion Internet of Things devices installed worldwide. To support all those devices, significant amounts of computation will have to be moved to the edge.

2.   It improves performance by reducing latency

When web applications run processes that communicate with an external server, users will encounter delays. The duration of these delays can vary based on the available bandwidth and server location. But if more processes are moved to the network edge through edge computing, such delays can be minimized or avoided altogether. In some cases, the deployment of low latency applications can also be automated based on the resources available.

3.   It helps businesses offer new functionalities

Since edge computing moves computation closer to the source, it allows businesses to deliver new functionalities. Think of a business deploying a graphics-heavy web page with augmented reality elements, or an autonomous vehicle manufacturer exploring applications that depend on heavy artificial intelligence and machine learning workloads. Relying on sending the data to a centralized source far away is not practical, while edge computing allows the business to run these processes in real time.

4.   It ensures high availability for applications

Businesses that provide services over the internet need to ensure continuous availability for their applications. Disruptions can affect customer experience and satisfaction, such as in the case of e-commerce stores. In more critical scenarios, such as a refinery gas leakage detection system, the impact of disruptions could be the difference between life and death. Edge computing ensures that any disruptions are localized to specific nodes rather than the entire network, so applications remain available and running.

High availability also means that when one PoP goes down, GSLB redirects traffic to another node, so the service does not stop.

Edge Computing Use Cases

Unified Streaming, an Amsterdam-based streaming services technology provider, was looking for a way to deal with the rising costs of large-scale content distribution. They were seeing a rise in CDN cache and cloud storage costs as the number of video formats, protocols and encryption schemes grew.

Using CDNetworks Edge Computing Platform, they were able to generate different alternative formats and encodings for streaming in real time. The result was a 50 percent reduction in cloud egress and CDN caching footprint. Rufael Mekuria, Head of Research and Standardization at Unified Streaming, says of the CDNetworks experience, “We were impressed with the ease of use of the ECP and the consistent performance results.”
