Cloud Talk
Cloud perspectives by cloud people at Ridge
Jonathan Seelig
Ridge

What Is Edge Computing? — The Complete Guide


Introduction to Edge Computing

Edge computing originated with Content Delivery Networks (CDNs), a system of servers deployed close to users to deliver web, video, and other content. In the early 2000s, CDNs evolved to include applications and application hosting for real-time data aggregation and analysis. Today, the definition of edge computing has expanded to include virtualization technology and containers; modern edge computing platforms can deploy and manage a massive range of applications to further enhance the power of the network edge. 

How does edge computing work?

Unlike the large, bulky server farms of the past, edge computing utilizes infrastructure near end user locations to deliver content seamlessly with minimal latency. 

Edge computing platforms utilize localized infrastructure, whether that’s servers or user devices, to process and store data depending on where and how it is needed. By decentralizing the data and the infrastructure, edge computing solutions reduce the time it takes to access and share data, and in instances of infrastructure failure, can provide immediate disaster recovery and on-demand scaling to avoid productivity losses. 

When edge computing applications process and share data with the edge computing network in real-time, customer-facing applications and other business-critical applications can leverage instantaneous feedback to maximize business performance. 


A number of major industries, like gaming, cloud computing, security, production lines, virtual reality, and industrial equipment, use this distributed network to process data faster and provide a low-latency experience for users. The exponential growth of Internet of Things (IoT) devices that demand an uninterrupted internet connection means that businesses need to develop distributed networks to handle data analysis and delivery.




Benefits of edge computing


The core benefit of edge computing is that it addresses a number of critical failure points inherent in centralized networks.

Bandwidth

The proliferation of devices that generate huge streams of data at the edge of the network creates significant network challenges. Edge IoT devices, security cameras, video games, and autonomous devices can’t possibly send all of their data to centralized facilities. Pre-processing, compressing, and analyzing data at edge nodes deployed much closer to the source is therefore becoming a necessity.
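As an illustration of the kind of pre-processing an edge node might perform, the sketch below aggregates a window of raw sensor readings into a compact summary and forwards only anomalous samples in full. The threshold and payload shape are hypothetical, not a real IoT protocol.

```python
from statistics import mean

def preprocess_window(readings, anomaly_threshold=100.0):
    """Summarize a window of raw sensor readings at the edge.

    Instead of uploading every sample to the central cloud, send a
    compact summary plus any anomalous values that exceed the threshold.
    """
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
        "anomalies": [r for r in readings if r > anomaly_threshold],
    }

# A window of 1,000 samples shrinks to one small summary payload.
window = [20.0 + (i % 10) for i in range(1000)]
window[500] = 250.0  # an anomalous spike worth forwarding in full
payload = preprocess_window(window)
print(payload["count"], payload["anomalies"])
```

Only the summary and the single spike cross the network; the other 999 routine samples never leave the edge node.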

Security

Data security is a major problem for modern operations, and the world’s increased emphasis on protecting consumer data rights necessitates strong security for edge computing devices. Generally, this is handled at the OS or application layer on the edge devices themselves (phones, laptops, etc.). Security concerns are offloaded from the central cloud to the user’s device, which minimizes strain on the network and reduces the amount of personal data being sent back to the central cloud.


Latency

Latency is, very simply, the time it takes data to travel between the compute node and the client, and it grows with network distance. As more and more businesses develop and rely on real-time or hyper-responsive applications, access to low-latency edge cloud platforms will become critical to their success.
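To see why distance matters, the back-of-the-envelope sketch below computes the minimum round-trip time that fiber-optic propagation alone imposes, assuming signals travel at roughly 200 km/ms (about two-thirds of the speed of light). Real-world latency adds routing, queuing, and processing on top of this floor.

```python
def min_rtt_ms(distance_km, fiber_speed_km_per_ms=200.0):
    """Lower bound on round-trip time imposed by distance alone.

    Light travels through fiber at roughly 200 km/ms, so even a
    perfect network cannot respond faster than this figure.
    """
    return 2 * distance_km / fiber_speed_km_per_ms

# A user 4,000 km from a centralized cloud region vs. 50 km from an edge node.
print(min_rtt_ms(4000))  # 40.0 ms before any processing happens
print(min_rtt_ms(50))    # 0.5 ms
```

No amount of server optimization recovers those 40 ms; only moving the compute node closer to the client does.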

Data Analysis

A number of industries, like gaming, security, and self-driving cars, rely on edge Artificial Intelligence to process data in real-time and make split-second updates to applications. When the distance from the network edge node to the centralized cloud region is too great, these applications will fail. Edge clouds can handle data processing on local devices to reduce latency and send back the data critical to application performance. 

Video & Media

The world relies on visual mediums to communicate, especially with the massive growth of streaming services. On-demand streaming platforms, like Netflix, Google Stadia, and Twitch, send an incredible amount of data per second over user networks. By using a distributed edge network to handle data processing, the burden of performance is localized, thereby ensuring a smooth experience for users anywhere in the world. 



Edge computing examples


A number of industries either already rely on or are investigating edge computing architectures: 

Drones

Modern drone systems need access to city-level server networks for lower-latency operational commands. By utilizing an edge computing platform, drones can offload large image and video files to local servers, while in-region computational resources eliminate bandwidth bottlenecks and speed up drone performance.

Gaming

Gaming is a worldwide phenomenon. Players are located across the world on different platforms and devices, each with their own unique quirks. Delivering a flawless experience, especially in online multiplayer games, is an incredible challenge for centralized infrastructure. Edge computing provides the on-demand scaling and a distributed network that developers need in order to deploy their software closer to users to deliver that crisp, responsive experience players demand. 

Security Cameras

A network of security cameras generates a massive amount of data. Parsing and analyzing that data is a critical step to ensure that the raw, unedited footage doesn’t need to be streamed over the network to a central storage location. With edge computing, that footage can be processed and analyzed in real-time to remove unneeded footage, thereby ensuring only critical information is uploaded, instead of hundreds of hours of empty frames. 
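One simple way an edge node might discard empty footage is to compare consecutive frames and keep only those that change meaningfully. The sketch below models frames as flat lists of grayscale pixel values with a hypothetical difference threshold; a production system would use real motion-detection or object-detection models.

```python
def frame_delta(prev, curr):
    """Mean absolute pixel difference between two grayscale frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def filter_frames(frames, threshold=10.0):
    """Keep only frames that differ meaningfully from their predecessor."""
    kept = [frames[0]]  # always keep the first frame as a baseline
    for prev, curr in zip(frames, frames[1:]):
        if frame_delta(prev, curr) >= threshold:
            kept.append(curr)
    return kept

static = [0] * 16              # an "empty" frame, nothing moving
motion = [0] * 8 + [200] * 8   # something entered the scene
footage = [static, static, static, motion, static]
kept = filter_frames(footage)
print(len(kept))  # 3 of 5 frames survive; two static repeats are dropped
```

Only the baseline, the motion event, and the return to stillness would be uploaded; hours of identical empty frames never leave the camera's local node.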

Self-driving Cars

As we begin to see an increase in AI-driven machinery, it is important to remember how much data needs to be processed for these machines to function safely. Independent operation will require real-time data analysis of visual data. Captured performance data will also need to be processed and sent back to the manufacturer to pinpoint issues and performance. By processing this data at the edge, we can offload the amount of information that needs to be sent back to the cloud. 



Edge Computing Enabling Technologies


There are a number of enabling technologies that are making it possible for application developers to deploy their code at the network’s edge. 


Containers

Containers are lightweight, easy-to-manage application deployments, and Kubernetes is the tool most commonly associated with orchestrating them at the edge. Since containers can be scaled on demand with granular control, they are a perfect tool for addressing performance challenges at the edge related to latency or user demand. Ridge offers a container service, the Ridge Kubernetes Service, which fully integrates with the Ridge Cloud.


Caching

Local caching is a common tool for reducing network load in remote locations. Rather than fetching a file from the central cloud every time it is needed, frequently accessed data is stored locally to reduce bandwidth usage and file load times. Only recent files are stored locally, rather than archive data. Since the archive data remains in the central cloud, there is little risk of compliance issues or problems with file versions and synchronization.
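A minimal sketch of this pattern, assuming a simple time-to-live (TTL) policy: a small cache class that serves recent data locally and falls back to a hypothetical origin fetch only when an entry is missing or stale. Real edge caches add eviction limits and invalidation, which are omitted here.

```python
import time

class LocalCache:
    """Tiny TTL cache: recent data stays local, stale entries refetch."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, stored_at)

    def get(self, key, fetch):
        """Return a cached value, calling `fetch` only on a miss or expiry."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and now - entry[1] < self.ttl:
            return entry[0]           # served from local storage
        value = fetch(key)            # expensive trip to the central cloud
        self._store[key] = (value, now)
        return value

calls = []
def fetch_from_cloud(key):
    calls.append(key)  # stands in for an expensive origin request
    return f"contents of {key}"

cache = LocalCache(ttl_seconds=60.0)
cache.get("report.pdf", fetch_from_cloud)
cache.get("report.pdf", fetch_from_cloud)  # second read is served locally
print(len(calls))  # the origin was contacted only once
```

The second request never touches the network, which is exactly the bandwidth and load-time saving the paragraph above describes.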



The Future of Edge Computing

A number of exciting new technologies are changing the nature of edge computing. 


Mobile Edge Computing

Multi-access edge computing, or mobile edge computing, involves placing computational and storage resources at the edge of the Radio Access Network. This architecture will provide the lowest possible latency to mobile users.


Fog Computing

Edge computing can be regarded as a subset of fog computing. Fog computing processes data along its path from where it is created to where it will be stored, utilizing the distributed network between the cloud and the edge. It acts as the missing link between the data that needs to be pushed to the cloud and the data that should be processed at the edge.


Cloudlets

These small-scale cloud data centers act as the middle tier between the cloud and mobile devices. They’re located at the edge of a network and reduce the strain of resource-intensive applications on the network by providing more computing resources and reducing latency. 



Edge Computing with Ridge

Ridge gives businesses a unique advantage. While edge computing solves many use cases around local performance and latency, it requires enterprises to invest in local infrastructure close to users, which, depending on the organization’s size, could cost billions. 


With Ridge, there’s an easier solution. Ridge enables organizations to connect to existing data centers around the world as part of the Ridge Cloud. Your users are always connected to the nearest location based on the factors you set, such as compliance, location, and latency. Enterprises can leverage the power of the edge without the upfront investment.


As cloud-native solutions continue to grow, organizations need to operate closer to the edge. Ridge provides a single, easy-to-use API to manage your entire deployment. Accelerate customer deployments and scale automatically to address any challenge with “out-of-the-box” cloud elasticity. 


Frequently Asked Questions


How is edge computing implemented?

Implementation of an edge computing solution is based on the needs of the business and the type of data being produced. An intelligent edge network is built around providing end-to-end security for devices and for the data created on the network. Some businesses might rely on a cloud solution, like the Ridge Edge Cloud, to address compliance issues, while others might utilize fog computing for IoT edge analytics.



What is an edge data center? 

Edge data centers are smaller server farms, distributed across an enterprise's network or, more often, maintained independently by a third party, whose purpose is to process data closer to where it is produced at the edge.


What are edge locations?

An intelligent edge network processes and analyzes data based on where it is produced, namely at locations at the edge of a business's network. These locations are offices or infrastructure that produce data, and typically most of a business's network will be considered an edge location.


What is edge vs. cloud?

The edge and the cloud work in tandem, but each solves a different problem. The cloud provides a centralized data storage platform and simplifies application access for users, while the edge relies on distributed resources to deliver comparable performance to users wherever they are located.


How fast is an edge compute network?

An edge computing network will process data significantly faster than a centralized network due to the location of the infrastructure. However, the speed is relative to the data being processed and the infrastructure in place. 


What is the difference between fog computing and edge computing?

Fog computing utilizes the infrastructure between the edge and the cloud, and should be regarded as a layer above the edge. By combining fog and edge computing, businesses can process data from anywhere at any time, even as that data is sent back to the central cloud.

