The fourth industrial revolution is underway. What is it? How do we know it’s happening? What will it mean? And how do we prepare for it?
If the period from the 1950s onwards – Industry 3.0 – was the age of automation, the age we’re entering is the era of hyper-automation. It’s the age of the self-driving car and the self-driving network, a development that promises to be more revolutionary still.
Since the first industrial revolution in the eighteenth century, when steam and water power drove machines that began to supplant hand production, such as the textile loom, we have been finding new ways to replace human labour. The late nineteenth and early twentieth centuries saw electrification drive a second wave of change in the manufacturing industry.
But the second revolution was more than a new source of power for factories; it also saw the development of railway networks that revolutionised distribution and telegraph networks that enabled the spread of information, foreshadowing modern communications networks.
Industry 3.0 introduced electronics and computers, enabling calculations to be performed at speed and on a scale previously unimagined, to do everything from processing records and controlling machine tools to forecasting the weather.
If Industry 3.0 saw humanity begin to grapple with automation problems, Industry 4.0 is when we’ll start to solve them. Although we have been able to grasp concepts such as artificial intelligence (AI) and machine learning (ML) for most of the last revolution, we are only now beginning to apply them in the real world. In Industry 3.0, we relied on human intervention to program, configure, correct, fix and improve the systems we developed. In Industry 4.0, they won’t need our help.
If we want to put a date on the start of the fourth revolution – or at least the first use of the term – it was 2011 at the Hannover Fair in Germany, where a new phase of industrial automation was described in which machines and the processes associated with them would be self-diagnosing, self-configuring and self-optimising.
They would have the intelligence to understand, at some level, what they were doing and how they could do it better, and they would be equipped to make improvements without asking permission and without human intervention. They would also, through the ability to analyse vast amounts of data, anticipate and compensate for problems – anything from component failure to switching sources of supply in a manufacturing system.
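The self-diagnosing, self-configuring behaviour described above can be sketched as a simple control loop: monitor, detect a problem, compensate without human intervention. This is an illustrative sketch only – the component names, threshold and spare-swapping policy are invented for the example, not taken from any real system.

```python
# A minimal sketch of a self-diagnosing, self-compensating loop.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Reading:
    component: str
    error_rate: float  # errors per thousand operations


def diagnose(reading: Reading, threshold: float = 5.0) -> bool:
    """Self-diagnosis: flag a component whose error rate exceeds the threshold."""
    return reading.error_rate > threshold


def reconfigure(spares: list[str]) -> str:
    """Self-configuration: switch to a spare source of supply, no human in the loop."""
    return spares.pop(0)


def control_loop(readings: list[Reading], spares: list[str]) -> dict[str, str]:
    """One pass of the loop: anticipate failures and compensate automatically."""
    substitutions = {}
    for r in readings:
        if diagnose(r):
            substitutions[r.component] = reconfigure(spares)
    return substitutions


readings = [Reading("press-1", 1.2), Reading("conveyor-3", 8.7)]
print(control_loop(readings, spares=["conveyor-spare-a"]))
# {'conveyor-3': 'conveyor-spare-a'}
```

The point of the sketch is the shape of the loop, not the policy: in a real Industry 4.0 system the `diagnose` step would be a learned model over vast amounts of sensor data rather than a fixed threshold.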
Recently, SoftBank announced the disposal of Boston Dynamics, a world-leading robotics company. Some analysts were confused by the news. Robots, at least for those of us who were kids in the last century, are the future, right?
Perhaps the message SoftBank was sending is more subtle: the value of robots is in the intelligence that controls them, not the shiny, moving parts that do the heavy lifting and the intricate work. The Boston Dynamics disposal doesn’t mean that robots are no longer the future but that they’re firmly in the present. Time to move on.
Think about the factors driving uptake of advanced technology in industry. They include:
- Pressure on efficiency, particularly the need to reduce costs
- Desire to improve customer service
- Shortening product cycles
- Fear of tech-savvy rivals
- Difficulty of finding people with the right skills
- Need for earlier identification of issues
- Need for more timely decision-making
- Desire for better analytics.
Technology promises limitless everything. The Cloud and 5G networks will meet our insatiable desire for network and compute capacity, but history shows that we will use whatever capacity is available – and continue to ask for more. New applications and services will arise to absorb every increase in available bandwidth.
At the root of this problem is complexity. The Internet of Things (IoT) is expected to connect 75 billion devices by the middle of this decade, roughly ten times the human population of the planet. It will be the job of networks and data centres to manage this sprawling population of sensors, meters, monitors, controllers and wearable devices.
Making sense of everything that’s happening in the future network will add a huge overhead of design, management and oversight. The job of sifting the nuggets of critical information from the raw data, orchestrating systems and subsystems, and maintaining security throughout the network will be impossible if we apply Industry 3.0 methods and organisational practices to the challenge. And as the complexity of the management task increases, so the risk of performance degradation intensifies.
The laws of physics, the one thing technology can’t change, are changing the shape of the network. Monolithic data centres, whether they’re on-premises or in the Cloud, won’t cope with IoT. The further away a user or device is from the data centre, the more network performance degrades. So, the data centre itself is changing, becoming highly distributed. Now we talk about edge or micro-edge to take processing power to the endpoints of the network and shorten the distance between the data centre and its consumers – humans and things. We’ve solved the latency problem and added another layer of complexity.
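The physics argument can be made concrete with a back-of-the-envelope calculation: propagation delay alone grows linearly with distance. The distances and site labels below are illustrative assumptions, and real paths add routing, queueing and protocol overhead on top of this floor.

```python
# Why distance to the data centre matters: the speed of light in fibre
# sets a hard lower bound on round-trip latency. Figures are illustrative.

SPEED_IN_FIBRE_KM_PER_MS = 200  # light travels roughly 200 km per ms in fibre


def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay for a given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS


for label, km in [("regional cloud", 1000), ("metro edge", 50), ("micro-edge", 5)]:
    print(f"{label}: >= {round_trip_ms(km):.2f} ms")
# regional cloud: >= 10.00 ms
# metro edge: >= 0.50 ms
# micro-edge: >= 0.05 ms
```

No amount of clever engineering removes that floor, which is why processing has to move towards the endpoints rather than the other way around.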
The fact is that the human interventions we’ve always relied on to manage the network no longer work. Just as autonomous vehicles are starting to solve the problems of overcrowded highways full of demonstrably fallible human drivers, intelligent, autonomous networks are the critical enabling technology for the evolution of a safe and orderly Industry 4.0.
The future network will not be a traditional WAN but an increasingly complex multi-cloud. Think of the Cloud as exploding into little pieces. Customers will care about services and applications and proximity to the data centre. For distributed global networks, that inevitably means multi-cloud networks. The downside of this heterogeneity is more complexity.
The self-driving network automates the network’s design and operation, abstracting the complexity of the network and hiding it from the customer. Network architects shouldn’t have to spend their time troubleshooting and solving routine, repetitive problems. AI can do that much better.
The architect’s job is not to build the house or fix the plumbing when it goes wrong, but to make the plan. Similarly, the network architect should decide how the organisation’s business needs – the intent – are expressed in the overall design. The design detail, the translation of intent to an ideal, high-performance network, can be automated. So too can network operations, allowing real-time diagnostics and problem-solving, reconfiguration and performance optimisation.
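The separation described above – the architect states intent, the system translates it into configuration – can be sketched in a few lines. The intent format, placement policy and queue names here are all hypothetical, invented for illustration rather than drawn from any vendor’s intent-based networking product.

```python
# A minimal sketch of intent-to-configuration translation, assuming a
# hypothetical intent format. The architect declares business intent;
# the detailed design decision is automated.

INTENTS = [
    {"service": "factory-telemetry", "max_latency_ms": 10, "priority": "high"},
    {"service": "office-backup", "max_latency_ms": 500, "priority": "low"},
]


def translate(intent: dict) -> dict:
    """Turn one statement of intent into a concrete placement and QoS choice.
    Latency-sensitive services are pinned to the nearest edge site; the rest
    go to the cheaper central cloud."""
    site = "edge" if intent["max_latency_ms"] <= 50 else "central-cloud"
    queue = {"high": "expedited", "low": "best-effort"}[intent["priority"]]
    return {"service": intent["service"], "site": site, "queue": queue}


for intent in INTENTS:
    print(translate(intent))
# {'service': 'factory-telemetry', 'site': 'edge', 'queue': 'expedited'}
# {'service': 'office-backup', 'site': 'central-cloud', 'queue': 'best-effort'}
```

The design choice worth noting is that the architect never writes the `site` or `queue` values directly: those are derived, which is what allows the network to re-derive them automatically when conditions change.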
Organisations that survive and thrive in the era of Industry 4.0 will be those that embrace digital transformation. Yet this remains an elusive goal. According to McKinsey, only 30% of such initiatives succeed, and the success rate is much lower for large organisations where the complexity of the challenge is greater.
The key, as confirmed by multiple studies, is the transformation of IT operations themselves. Not only is IT operational expenditure (OPEX) a major drain on the bottom line, but 20th-century practices will prevent today’s businesses from exploiting the opportunities of Industry 4.0.
There were plenty of people who resisted the move from the handloom in the first industrial revolution. Their fate is well-documented.