Successful 5G deployments with Kubernetes at the edge

Image: Kubernetes 5G containers (Sergey Novikov | Bigstockphoto)

One-third of the world’s population is expected to be covered by 5G networks by 2025. As networks adapt to meet ever-rising user expectations, edge services hold the key to enhancing the Quality of Experience (QoE), whilst delivering significant growth and return on investment for operators globally.

Multi-access Edge Computing (MEC) brings cloud-computing capabilities to the edge of the network, driving innovation and amplifying the opportunities made possible through efficient service delivery. Edge solutions are closely linked with the 5G Radio Access Network (RAN) and are often co-located in the Mobile Network Operator’s (MNO) edge data centers. 5G is expected to support 10,000 times more network traffic than its predecessor, and MEC continues to pave the way for a connected, digitalized world by offering high-bandwidth connections with vastly reduced latency.

Connecting profitable 5G services from ‘the edge’

5G connectivity is already beginning to revolutionize how we live our lives, offering real-time connections on an unprecedented scale. Internet of Things (IoT) devices are now broadly embedded across a wide variety of industries – from energy to agriculture – as companies look to harness the abundant potential of connected operations.

The cloud-computing capabilities offered at the edge of the network allow operators to deliver services that require real-time functionality, hosting virtual environments close to the devices that need them. Applications that depend on a real-time connection, such as Autonomous X, Virtual Reality (VR), Augmented Reality (AR), Industry 4.0 and Ultra-High-Definition (UHD) video, will be able to thrive on the connectivity offered through MEC. Capability moves closer to the user, producing a low-latency, high-bandwidth environment. Instead of backhauling all data to a central site to be analyzed and processed, operators can run the service locally, achieving high throughput alongside minimal latency.

Completing lifecycle tasks in seconds with Kubernetes

Developments in Operating System (OS)-level virtualization are leading users to turn away from Virtual Machines (VMs) in favor of containers, with more than 75% of global organizations predicted to be running containerized applications by the end of the year. Traditionally, to scale just one part of a VM-based application, operators would need to instantiate an entire additional VM, including the compute, storage, and network resources and the guest operating system associated with it.

With the aid of Kubernetes, containers reduce this process from minutes to seconds, allowing isolated workloads to run on a single OS. Applications are streamlined, broken down into constituent parts and functions called microservices. This allows operators to scale out only the container responsible for a specific function or task, improving efficiency. Containers also bring greater reliability: rolling updates let operators change software without any downtime, tools can auto-scale microservices against any number of KPIs, and auto-healing capabilities contribute further to the goal of shortening lifecycle tasks.
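As an illustration, the minimal sketch below uses the official Kubernetes Python client to perform the two operations just described: a zero-downtime rolling image update and a CPU-based auto-scaling policy. The deployment name, namespace, and image tag are hypothetical placeholders, and plain CPU utilization stands in for whatever KPI an operator might actually expose through a metrics adapter.

```python
# A minimal sketch using the official Kubernetes Python client (pip install kubernetes).
# The deployment "upf-microservice", namespace "edge", and image tag below are
# hypothetical placeholders, not references to any real workload.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in-cluster

apps = client.AppsV1Api()
autoscaling = client.AutoscalingV1Api()

NAMESPACE = "edge"
DEPLOYMENT = "upf-microservice"

# 1. Rolling update: patch only the container image; Kubernetes replaces pods
#    incrementally, so the service stays available throughout the rollout.
apps.patch_namespaced_deployment(
    name=DEPLOYMENT,
    namespace=NAMESPACE,
    body={
        "spec": {
            "template": {
                "spec": {
                    "containers": [
                        {"name": "upf", "image": "registry.example.com/upf:1.4.2"}
                    ]
                }
            }
        }
    },
)

# 2. Auto-scaling: a HorizontalPodAutoscaler keyed to CPU utilization. In practice
#    the trigger could be any KPI exposed via a custom or external metrics adapter.
hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name=f"{DEPLOYMENT}-hpa", namespace=NAMESPACE),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name=DEPLOYMENT
        ),
        min_replicas=2,
        max_replicas=20,
        target_cpu_utilization_percentage=70,
    ),
)
autoscaling.create_namespaced_horizontal_pod_autoscaler(namespace=NAMESPACE, body=hpa)
```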

Choosing a cloud-native Kubernetes platform that enhances operations

Industries across the globe continue to recognize the optimization Kubernetes provides, with the container management market forecast to grow to around US$944 million by 2024. More professionals are moving their big data applications to containers with Kubernetes as demand for IoT technology and Machine Learning (ML) applications rises in tandem. As more vendors turn to these platforms to harness the benefits of cloud automation, the assumption may be that a simple cure-all for any repetitive or scale-out task is now at hand. While Kubernetes is supporting the mass, rapid move to the cloud, variations between platforms and orchestration solutions mean there can be large disparities in time to outcome, resource utilization, solution costs, and opportunities.

How you automate is just as important as what you automate, and the ease of use of a system will influence an operator’s success throughout the lifecycle of its service. Even when deployed across multiple locations and across both VM and container environments, lifecycle automation, workflows, and the overall operations stack need to be unified. With the right platform, operators have the potential to cut OpEx by 40% and CapEx by 50%, while transforming scale-out tasks from weeks to minutes. That is the power of a cloud-native platform.

Another key factor, increasingly relevant to Kubernetes deployments, is how you handle stateful workloads such as subscriber information and edge applications. When handled well, agility and efficiency improve; but because Kubernetes microservices add a level of complexity, simply snapshotting and cloning storage volumes is no longer enough. For zero-touch automation, one also needs to snapshot the other constructs, such as application metadata, configuration, and SLA policies. This enables teams to roll back an entire application to a previous state very quickly, or to clone it so that a fully functional, running database is available from a previously taken snapshot. No hunting, no hardcoding, and no restarting from scratch should be required of the user. A storage-only approach runs counter to the agility and efficiency expected of a platform like Kubernetes and will hamstring your overall solution’s capabilities.
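To make the idea concrete, the sketch below shows one purely illustrative way to capture an application-level snapshot with the Kubernetes Python client: deployment specs and ConfigMaps are saved alongside CSI VolumeSnapshots of the application’s volumes, so spec, configuration, and data can be rolled back or cloned together. This is not any vendor’s API; it assumes the CSI external-snapshotter CRDs (snapshot.storage.k8s.io/v1) are installed, and the namespace, label selector, and VolumeSnapshotClass names are hypothetical.

```python
# Illustrative application-level snapshot, not any vendor's API. Assumes the CSI
# external-snapshotter CRDs (snapshot.storage.k8s.io/v1) and a VolumeSnapshotClass
# named "csi-snapclass" exist; namespace "edge" and label "app=subscriber-db" are
# hypothetical placeholders.
import json
from datetime import datetime, timezone

from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()
apps = client.AppsV1Api()
crds = client.CustomObjectsApi()
serialize = client.ApiClient().sanitize_for_serialization

NAMESPACE, SELECTOR = "edge", "app=subscriber-db"
stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
bundle = {"taken_at": stamp, "deployments": [], "configmaps": [], "volume_snapshots": []}

# 1. Capture application metadata and configuration alongside the data.
for dep in apps.list_namespaced_deployment(NAMESPACE, label_selector=SELECTOR).items:
    bundle["deployments"].append(serialize(dep))
for cm in core.list_namespaced_config_map(NAMESPACE, label_selector=SELECTOR).items:
    bundle["configmaps"].append(serialize(cm))

# 2. Take a CSI VolumeSnapshot of every PVC belonging to the application.
for pvc in core.list_namespaced_persistent_volume_claim(
    NAMESPACE, label_selector=SELECTOR
).items:
    snap_name = f"{pvc.metadata.name}-{stamp}"
    crds.create_namespaced_custom_object(
        group="snapshot.storage.k8s.io",
        version="v1",
        namespace=NAMESPACE,
        plural="volumesnapshots",
        body={
            "apiVersion": "snapshot.storage.k8s.io/v1",
            "kind": "VolumeSnapshot",
            "metadata": {"name": snap_name},
            "spec": {
                "volumeSnapshotClassName": "csi-snapclass",
                "source": {"persistentVolumeClaimName": pvc.metadata.name},
            },
        },
    )
    bundle["volume_snapshots"].append(snap_name)

# 3. Persist the bundle so a rollback or clone can restore spec, config, and data together.
with open(f"app-snapshot-{stamp}.json", "w") as f:
    json.dump(bundle, f, indent=2, default=str)
```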

Powering future 5G requirements with Kubernetes

As the industry continues to evolve, cloud-native Kubernetes platforms empower MEC and RAN environments to become more automated, cost-effective and streamlined, allowing operators to reap the rewards of optimized services. Kubernetes has been dubbed the secret weapon for unlocking cloud-native potential, leveraged by leaders in the space like Rakuten, which has become the first telecom operator to deliver a 100% cloud-native architecture.

The race to deploy 5G services quickly and at scale is on. The operators that prosper will be the ones that automate deployment while remaining agile, eliminating time and resource silos. Kubernetes at the edge is the key to delivering these anticipated 5G services now and as new developments occur.

Article contributed by Brooke Frischemeier, Sr Director of Product Management at Robin.io
