For the past five years, the primary impact of shifting IT platforms has been to drive greater data center concentration and the creation of a core IT portfolio. This included the consolidation of smaller corporate data centers (often inherited through acquisitions) into larger facilities that leverage technologies such as SSD storage, converged systems, and software-defined infrastructure to boost agility and operational efficiency. It also included greater use of mega data centers owned by major colocation, managed services, and SaaS/IaaS cloud data center operators to enable faster creation and scale-up of new digital services.
This concentration at the core enables faster, more capable modernization of critical business systems of record needed to support and provide a layer of “trust” for new business initiatives.
Meanwhile, extending the core to the edge enables rapid, lower-risk access to the critical compute, data, and network resources needed to develop fast-evolving, highly scalable mobile engagement and analytical services.
As workloads collapse to the core and demand grows for applications closer to the machines and people that consume them, the question remains: what exactly is the edge?
A series of industry and vendor bodies have proposed definitions. However, recent IDC research shows that the majority of enterprises remain uncertain as to what the edge might be.
The Edge Computing Consortium (ECC) proposes that edge computing is performed on an open platform at the network edge near things or data sources, integrating network, computing, storage, and application core capabilities and providing edge intelligent services.
The Industrial Internet Consortium proposes that edge computing comprises all computation, storage, communications, and processing associated with collecting, transforming and acting upon information captured from the edge, or transmitted to the edge.
Lastly, EdgeX Foundry is a vendor-neutral open source project hosted by The Linux Foundation that is building a common open framework for IoT edge computing. EdgeX proposes that the edge is an IoT architecture allowing customers to deploy a mix of plug-and-play microservices on compute nodes at the edge.
Each of these defines edge computing by reference to the edge – a circular argument at best. In combination, they describe the attributes of the edge: the need for compute, storage, and applications; the role of intelligence and analytics; and the push toward a microservice-oriented, container-deployed, open source (and open interface) model. None, however, defines the edge as anything more than proximity to consumers of compute and providers of data.
Here, the proposed answer to "Where is the edge?" is one which acknowledges all the component-driven definitions above. It goes one step further, though, by suggesting that the edge itself is the domain between an IoT endpoint and its nearest connected compute resource where the network transit time between those two points is less than 2 ms. Such a proposal offers a practical way to decide where to place compute nodes to maximize value to the endpoints they support.
Two milliseconds of network latency offers a good compromise between distance and performance, and it can be easily measured by readily available network monitoring and application performance tools. On a perfect network, light travels about 600 km in 2 ms. Allowing for 70% efficiency, this practical distance shrinks to around 420 km.
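The arithmetic behind the 2-ms rule can be sketched in a few lines. The function below is an illustrative calculation, not from the article: it derates the vacuum speed of light by an assumed network efficiency factor to estimate the practical radius an edge compute node can serve within a given latency budget.

```python
# Speed of light in a vacuum, expressed per millisecond (~300,000 km/s).
SPEED_OF_LIGHT_KM_PER_MS = 300.0


def edge_radius_km(latency_budget_ms: float, efficiency: float = 0.7) -> float:
    """Estimate the practical distance between an IoT endpoint and its
    nearest compute node for a one-way network latency budget.

    `efficiency` is the assumed fraction of light speed achieved on a
    real network (the article uses 70%).
    """
    return SPEED_OF_LIGHT_KM_PER_MS * latency_budget_ms * efficiency


# A 2 ms budget at 70% efficiency yields roughly 420 km.
print(f"{edge_radius_km(2.0):.0f} km")
```

With a perfect network (`efficiency=1.0`) the same 2 ms budget gives the theoretical 600 km figure quoted above.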
Combining all these definitions and proposals provides an approach to edge computing for the modern day: apply open source platforms with rich APIs, provide apps and analytics, design and deploy devices and gateways to the 2-ms network latency rule, and carefully choose which workloads belong in the core and which at the edge.
Written by Hugh Ujhazy, associate vice president of IoT & Telecoms at IDC | Originally published on LinkedIn