“If you are not on the Edge, you are wasting space” – a little twist on the original quote by famed Everest climber Jim Whittaker, and rather apt now that distributed computing has reached the Edge.
Today’s cloud computing strategy spans beyond hybrid clouds; it is even moving past hypervisors in favour of bare-metal deployments, with reusable resources delivered via serverless functions or container-led microservices running on the Edge cloud.
Gartner predicts that by 2025, three-quarters of enterprise-generated data will be created and processed at the edge – outside a traditional centralized data center (read, the core) or the cloud – in effect saying that the Edge completes the cloud. Edge Computing and the cloud have a symbiotic relationship: they address different problem sets and workload characteristics, yet complement each other.
The Evolution of Edge Computing
The basic concept of Edge Computing can be traced back to the 90s, when Akamai launched its content delivery network (CDN). The idea back then was to introduce nodes at locations geographically closer to the end-user for the delivery of cached content such as images and videos. The Edge Computing concept is thus not new; it has simply become better aligned with the principles of decentralized computing, with much more focus on the workloads, the data flows between components, and manageability from a single pane of glass for effective control.
Although use cases are evolving with technological advancements, Edge Computing is often associated with IoT, AI/ML, and data ingestion, and with paradigms such as sensors, machine-to-machine communication, and distributed application deployments – containers, serverless functions, and network function virtualization, to name a few.
What gives an edge to Edge Computing?
The tenets behind Edge Computing’s advantages are:
- Faster Response Time – The closer the workload is to the consumer, the better the experience and response time.
- Improved Availability – The Edge ecosystem is resilient by design; it auto-heals failures by respawning microservice-based workloads, which reduces complexity in operations and manageability. Once set up, Edge workloads continue to work.
- Interoperability & Scalability – Edge workloads are designed and developed on distributed computing principles, and each application module can scale and run independently across multiple locations, so scalability and interoperability between Edge and Core are well taken care of.
- Security – Risk is significantly reduced by the ability to patch vulnerabilities across all edge locations: constant iteration beyond the initial setup deals with patching and emerging security issues via automated rollout of updated container images from a central repository, while network access is whitelisted via secured VPN tunnels.
- Lower Cost of Ownership – The entire Edge stack is built around reusability of resources and optimization of the underlying infrastructure and services. Compute, memory, storage, and network resources are all provisioned on the fly and metered only upon utilization, so the pool is universal yet available on demand to all subscribers hosted on the Edge.
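The auto-healing tenet above can be sketched as a simple supervision loop that respawns failed workloads, much as a container orchestrator does at each edge location. This is an illustrative Python sketch under assumed names (the `Workload` class and the simulated crash are hypothetical), not any particular orchestrator's implementation.

```python
# Illustrative sketch of the self-healing idea behind Edge availability:
# a supervisor watches microservice-style workloads and respawns any that
# have failed. All names here are hypothetical stand-ins.

class Workload:
    """A stand-in for a microservice instance at an edge location."""
    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.restarts = 0

    def respawn(self):
        # A real edge stack would pull the container image and restart it.
        self.healthy = True
        self.restarts += 1


def heal(workloads):
    """One supervision pass: respawn every unhealthy workload."""
    respawned = []
    for w in workloads:
        if not w.healthy:
            w.respawn()
            respawned.append(w.name)
    return respawned


services = [Workload("cdn-cache"), Workload("iot-ingest"), Workload("ml-infer")]
services[1].healthy = False              # simulate a crash at one edge site
print(heal(services))                    # ['iot-ingest']
print(all(w.healthy for w in services))  # True
```

In practice this loop is driven by health probes rather than a flag, but the principle is the same: the desired state is declared once, and the platform converges back to it after every failure.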
Edge Computing Value Addition
Edge Computing is evolving and making breakthroughs on the back of emerging technologies; the surge in data volume from the massive number of devices enabled by radio and wireless networks has made it more important than ever before. Besides its ability to reduce network latency and improve real-time user experience, edge computing now plays a critical role in enabling use cases for ultra-reliable low-latency communication in industrial manufacturing, telecom, and a variety of other sectors.
A key benefit of Edge Computing remains the ability to move workloads off devices and into the cloud, where resources are less expensive and large volumes of data are easier to process and store. At the same time, it optimizes latency and reliability between applications and users: relocating certain application components to the Edge, close to the user devices, yields significant savings in network resources.
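The latency saving from relocating components closer to users can be illustrated with a back-of-the-envelope model. The distances, hop counts, and per-hop delays below are assumed figures chosen for illustration, not measurements of any real deployment.

```python
# Back-of-the-envelope model of round-trip network latency, illustrating
# why relocating a component to the Edge cuts response time. Distances
# and per-hop delays are assumed figures, not measurements.

def round_trip_ms(distance_km, per_hop_ms=0.5, hops=1):
    """Propagation delay (light in fibre, ~200,000 km/s) plus router hop delay."""
    propagation = 2 * distance_km / 200_000 * 1000  # there and back, in ms
    return propagation + hops * per_hop_ms

core_rtt = round_trip_ms(distance_km=2000, hops=10)  # distant core data center
edge_rtt = round_trip_ms(distance_km=50, hops=2)     # nearby edge location

print(f"core: {core_rtt:.1f} ms, edge: {edge_rtt:.1f} ms")
# → core: 25.0 ms, edge: 1.5 ms
```

Even this crude model shows an order-of-magnitude gap, before accounting for congestion and queuing on longer paths, which is why latency-sensitive components are the first candidates for relocation to the Edge.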
Looking forward to the Smart Edge
Edge Computing is becoming a more powerful and promising opportunity to turn AI data into real-time value across almost every industry. The intelligent Edge is the next stage in the evolution and success of AI technology. Adoption is easier for newer entrants in the application space, while legacy applications require re-engineering to align their code with distributed computing principles and the related technology-stack upgrades; even so, the responses are encouraging, and many are testing the waters of this paradigm shift.
–By Rajesh Dangi, Chief Digital Officer, NxtGen Datacenter and Cloud Technologies