A Casual Discussion on the Internet of Things, Artificial Intelligence, and the Thinking Behind the Fourth Industrial Revolution

Over the past year or two, everyone has been talking about the industrial internet, the Internet of Things, big data, cloud computing, and artificial intelligence. Today I want to briefly walk through how these pieces are connected and, based on my own understanding, address several questions. First, what exactly is the Internet of Things, and why do so many people believe the IoT (and the industrial internet) has such promising prospects? Second, what is artificial intelligence, and why has the concept of AI gained a new round of support—and even hype—today? Third, how are the Internet of Things, big data, cloud computing, and artificial intelligence connected and mutually influential, and what impact might this have on industry? Could this lead to the arrival of a fourth industrial revolution?
The concept of the Internet of Things (IoT) may sound like something new today, but Bill Gates had already mentioned it in his 1995 book The Road Ahead. As an aside, in that book Gates made a number of bold yet well-grounded predictions beyond just the IoT. Quite a few of them have already become reality today, while others are still in the process of being realized. For example, the book mentioned that a stolen camera could automatically send its location back to its owner—something that has long since been implemented on the iPhone. It also mentioned being able to replace a movie protagonist’s face with your own, a feature that recent AI technology has already made “partially” possible. As for the Internet of Things, an important component of the broader idea of “everything connected,” the reason it did not develop fully in the more than twenty years since the concept was proposed comes down to several factors. This development process also reveals a broader pattern and trend in how industries evolve.
First, when the idea of everything being interconnected was originally proposed, the internet itself had not yet been widely adopted, which meant there were not enough connected entities to form a true network. Second, computer hardware, sensor technologies, and related components were not sufficiently mature, and the cost of hardware hindered the widespread adoption of universal connectivity. Third, and this is a factor that is often overlooked, at the time we did not have enough motivation to collect more data through ubiquitous connectivity; using such data would not become a hard requirement until the later rise of artificial intelligence. These were the main obstacles preventing large-scale adoption of universal interconnection. However, in the industrial field, the need to monitor and control key parameters led industrial automation systems to use PLCs, DCSs, and similar approaches to achieve distributed control and interconnection among devices. As hardware and software technologies developed and iterated, these systems expanded from standalone control systems to field-level, workshop-level, and even factory-level systems, with MES (Manufacturing Execution Systems) enabling more refined monitoring of the production process. These forms of interconnection can be seen as a small-scale model and demonstration of universal connectivity. But fundamentally, they are still different from today’s concepts of the IoT and everything connected: the former is driven by monitoring and automated decision-making under fixed rules, while the latter aims to connect far more devices and put the resulting data to far broader use.
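To make the contrast concrete, here is a minimal sketch of the kind of fixed-rule monitoring and control loop that classic PLC/DCS-style automation performs. All function names and thresholds are illustrative assumptions, not references to any real product or fieldbus protocol.

```python
# Minimal sketch of fixed-rule monitoring and control, in the spirit of a
# PLC/DCS loop: read a process value, compare it against hard-coded limits,
# and act. All names and thresholds here are illustrative assumptions.

def read_temperature_sensor() -> float:
    """Placeholder for a field-level sensor read (e.g. over a fieldbus)."""
    return 78.5  # dummy value standing in for real hardware I/O

def set_cooling_valve(open_fraction: float) -> None:
    """Placeholder for an actuator command."""
    print(f"cooling valve set to {open_fraction:.0%}")

HIGH_LIMIT = 80.0   # fixed rule: above this, open the valve fully
LOW_LIMIT = 60.0    # fixed rule: below this, close the valve

def control_step() -> None:
    temperature = read_temperature_sensor()
    if temperature > HIGH_LIMIT:
        set_cooling_valve(1.0)
    elif temperature < LOW_LIMIT:
        set_cooling_valve(0.0)
    # Readings outside the rule are typically not kept, which is the point
    # made above: classic automation monitors and acts, but rarely
    # accumulates data for later use.

if __name__ == "__main__":
    control_step()
```

The logic is entirely local and rule-bound; nothing in it calls for a wider network or for retaining the data it sees, which is exactly where it diverges from the IoT idea.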
As mentioned earlier, in the past we did not have much demand for data. A lot of process data might be monitored and then discarded, with only the portion directly related to production being retained. Under such circumstances, existing sensors and control systems were already sufficient to meet the needs of manufacturing systems, so connecting more devices and collecting more data seemed unnecessary. So why is there suddenly a need for more data now? To answer that, we need to talk about the successive waves of boom and bust that artificial intelligence has gone through.
The concept of artificial intelligence was proposed back in the 1950s, and debate surrounding AI has never ceased since then. Up to now, AI has gone through three major boom periods (the present being the third) and two downturns. As for the current boom, many people—including myself—believe that it will not sink into another trough like the previous two. Even if there is some degree of bubble, it is still likely to continue developing at a very rapid pace. That is because this new surge in AI is built on a combination of factors involving algorithms, hardware, communications, and systems. In other words, the timing, conditions, and momentum are all in place. What remains to be seen is just how far artificial intelligence can lead this era.
A major reason behind the rise and fall of the first two AI booms was the limitation of computer hardware performance. Today, on the one hand, the cost of computer hardware and storage has fallen dramatically; on the other hand, large-scale parallel computing has replaced the traditional model of relying on a single supercomputer, greatly increasing available computing power. In addition, the potential of GPUs has been fully exploited, allowing them to replace CPUs in handling large volumes of floating-point calculations. These three factors, from the perspective of hardware architecture, ensure sufficient computing capability. As an extension of this, low-cost, high-performance embedded systems have made distributed intelligent terminals possible, while various communication technologies—especially the imminent arrival of 5G—have paved an information superhighway for connectivity. These are the enabling conditions at the hardware level. The essence of cloud computing and big data technologies is likewise the large-scale parallel computing mentioned above: using more computers to provide stronger computing and storage capabilities. All of these are necessary prerequisites for AI applications.
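As a rough illustration of the “more machines, more compute” idea behind large-scale parallel computing, the toy sketch below splits one workload across several worker processes on a single machine. Real cloud and big-data platforms do the same thing across many machines with frameworks of their own; none of that is implied here, and the workload is deliberately trivial.

```python
# Toy illustration of large-scale parallel computing: the same workload is
# split across worker processes instead of running in one sequential pass.
# Purely a sketch; the principle, not the scale, is the point.

from multiprocessing import Pool

def partial_sum(chunk):
    """Each worker handles one slice of the data (here: sum of squares)."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_workers = 4
    chunk_size = len(data) // n_workers
    chunks = [data[i * chunk_size:(i + 1) * chunk_size]
              for i in range(n_workers)]

    with Pool(n_workers) as pool:
        total = sum(pool.map(partial_sum, chunks))

    print(total)
```

Replacing the four local processes with thousands of cloud machines, or with the thousands of cores on a GPU, changes the scale but not the idea: throughput grows by adding workers rather than by waiting for a single faster processor.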
An even more important factor, however, lies in the revolution in AI’s own algorithms—more specifically, the development and application of deep learning. In previous AI booms, the intrinsic limitations of AI technology itself were the fundamental reason behind the eventual decline. Traditional machine learning algorithms were clearly limited in the scenarios they could be applied to and in the functions they could achieve, and feeding them more data did not produce especially significant gains. At the same time, approaches centered on semantics and logical reasoning led AI down a long detour, because the existing technologies and theories could not enable AI to make the leap to genuinely understanding human logic and semantics. The rise of deep learning, by contrast, fits today’s technological conditions remarkably well: it uses more data for model training and stronger computing power to build deeper, more complex neural networks, which in turn deliver higher accuracy. This approach has achieved tremendous success in areas such as machine translation, visual recognition, and speech recognition. These successes are enough to give us greater hope for the development of artificial intelligence.
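The recipe just described, more data plus more compute feeding a deeper network, can be sketched in a few lines. The example below uses PyTorch with random stand-in data and a small feed-forward network; it is only an illustration of the training loop, not a claim about any particular application or result.

```python
# A minimal sketch of the deep-learning recipe: a small feed-forward network
# trained by gradient descent. The data is random and stands in for the
# large datasets that deep models actually need.

import torch
import torch.nn as nn

# Toy data: 256 samples, 20 features, binary labels.
X = torch.randn(256, 20)
y = torch.randint(0, 2, (256,)).float()

# "Deep" refers to stacking more layers/units; deeper and wider networks
# demand more data and more compute, typically on GPUs.
model = nn.Sequential(
    nn.Linear(20, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):
    optimizer.zero_grad()
    logits = model(X).squeeze(1)   # forward pass
    loss = loss_fn(logits, y)      # measure the error
    loss.backward()                # backpropagate gradients
    optimizer.step()               # update the weights
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

The same loop, scaled up to millions of samples and billions of parameters on GPU clusters, is essentially what sits behind the translation, vision, and speech successes mentioned above.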
Taking all of the above together, we can see that technological progress does not happen in isolation; technologies are interconnected and mutually influential. At the present stage, low-cost, high-performance embedded devices provide more convenient and efficient terminal technologies for data acquisition; the development of communications technologies, especially 5G, provides high-speed and reliable channels for information transmission; cloud computing and big data provide artificial intelligence with massive amounts of data and enormous computing power; and AI itself, through advances in deep learning algorithms, has become better equipped to make use of all of the above devices and infrastructure. These factors reinforce and promote one another. Even without a qualitative breakthrough in fundamental knowledge and theory, they have brought a new glimmer of hope for technological development and innovation. They are highly likely to promote the real arrival of the fourth industrial revolution, bringing about a new revolutionary leap in human production and daily life.
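To show how these layers line up, the schematic sketch below imagines the chain just described: an edge device acquires a reading, a communication layer carries it, a cloud-side store accumulates it, and a simple stand-in “model” makes a prediction from the accumulated data. Every function and name here is a hypothetical placeholder, not any real platform’s API.

```python
# Schematic sketch of the edge -> network -> cloud -> AI chain.
# Everything below is a stand-in: real systems would use actual sensors,
# 5G or industrial networks, cloud storage, and trained models.

import random
from statistics import mean

def acquire_reading() -> float:
    """Edge/embedded device: cheap sensor sampling."""
    return 50.0 + random.gauss(0, 5)

def transmit(reading: float) -> float:
    """Stand-in for the communication layer carrying the data upstream."""
    return reading

cloud_storage: list[float] = []   # stand-in for cloud/big-data storage

def predict_next(history: list[float]) -> float:
    """Stand-in for an AI model; here just a moving average."""
    return mean(history[-10:])

if __name__ == "__main__":
    for _ in range(50):
        cloud_storage.append(transmit(acquire_reading()))
    print(f"predicted next value: {predict_next(cloud_storage):.2f}")
```

Each layer in the sketch maps to one of the factors above: cheap embedded terminals, fast communication channels, cloud-scale storage and compute, and an algorithm that only becomes useful once the other three are in place.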
Finally, in this wave of change driven by artificial intelligence, where do the opportunities lie for small and medium-sized enterprises? This is also one of the questions we care about most. Among the layers mentioned earlier, large-scale infrastructure construction—including data centers and storage centers—requires massive, concentrated capital investment and is usually an opportunity for large enterprises. Likewise, the development of computing frameworks and algorithm implementations requires heavy investment in mathematicians, computer scientists, and related talent, which is also typically an arena for large companies; Google, IBM, and others are already working in these broad directions. At the same time, more convenient IoT platforms and frameworks will also become battlegrounds for major enterprises. For SMEs, in my view, the viable directions and opportunities lie in AI applications in niche fields, the R&D of IoT devices and products, and more agile, focused technological breakthroughs in specific areas such as monitoring, diagnosis, and prediction. Thank you.
(This article was adapted and edited from the author’s remarks at an internal meeting.)


