Advances in artificial intelligence (AI) are increasingly driving the growth of Internet of Things (IoT) applications, but market fragmentation, employee skill sets, and the limitations of deep learning systems are just a few of the barriers to leveraging AI for IoT.
In part one of this Q&A, Telco Transformation spoke with Steve Bell, IoT senior analyst at Heavy Reading, about the evolution of AI and its impact on IoT applications. In part two, Bell expands on the impact of AI, focusing on industrial IoT applications and how AI can be used to optimize network operations and reduce costs. (See Heavy Reading's Steve Bell on the Intersection of AI & IoT.)
Telco Transformation: How do AI and IoT data impact verticals that have historically been early adopters, such as supply-chain management and industrial IoT?
Steve Bell: Data collected at the edge is particularly important because it can be analyzed in context. In industrial applications, 90% of data loses value if it's not utilized within 60 seconds. Analyzing data at the edge not only takes proximate contextual variables into account; it also avoids the delays caused by centralizing data, which degrade the data's utility for decision-making. Analytics and AI at the edge can help find timely patterns in machine and product performance.
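The time sensitivity Bell describes can be illustrated with a small sketch: hypothetical sensor readings are analyzed locally, with stale readings discarded once they exceed a freshness window. The field names, thresholds, and the 60-second window's use here are assumptions for illustration, not drawn from any real edge platform.

```python
FRESHNESS_WINDOW_S = 60  # per Bell: industrial data loses value after ~60 seconds
VIBRATION_LIMIT = 7.0    # hypothetical alarm threshold

def analyze_at_edge(readings, now):
    """Keep only fresh readings and flag anomalies locally,
    instead of shipping everything to a central cloud first."""
    fresh = [r for r in readings if now - r["ts"] <= FRESHNESS_WINDOW_S]
    alerts = [r for r in fresh if r["vibration"] > VIBRATION_LIMIT]
    return fresh, alerts

readings = [
    {"ts": 100, "vibration": 3.2},   # stale by the time we analyze
    {"ts": 955, "vibration": 8.1},   # fresh and anomalous
    {"ts": 990, "vibration": 4.0},   # fresh and normal
]
fresh, alerts = analyze_at_edge(readings, now=1000)
print(len(fresh), len(alerts))  # 2 fresh readings, 1 alert
```

The point of the sketch is the filter's placement: the stale reading never leaves the edge device, and the alert is raised within the window in which it is still actionable.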
TT: In an interview with MIT Technology Review this March, Peter Lee, corporate vice president of AI & research at Microsoft, said that high-end machine-learning systems of greatest value to tech giants remain too expensive and inflexible for Microsoft to offer to customers. The most viable AI solutions available are for image, face, speech and video recognition. Is AI widely viable for IoT applications?
SB: There are companies like IBM Corp. (NYSE: IBM) and Microsoft Corp. (Nasdaq: MSFT) that seek to offer cognitive AI as a service. At this point, the market is too fragmented to project what is going to happen in the near term. However, there is a huge amount of startup activity.
The larger companies are focused on platforms to encourage the growth of ecosystems that develop AI as a service. Their platforms can be used by startups or by developers to address specific problems in individual verticals. It is difficult to anticipate exactly how the market will evolve, but the potential for progress is palpable.
TT: How are networks coping with the demands made by AI-driven applications in terms of the volume, variability and variety of data processed?
SB: Telcos have a huge potential opportunity to provide microservices in a virtualized and software-driven network. Between the large technology companies and network providers, most of the talent in data science has been recruited. Most other companies, by the time they realize the importance of machine learning, will struggle to find the talent in-house. This is an opportunity for telcos to provide AI as a service linked to those microservices, helping customers tailor and optimize their operations based on the intelligence they generate.
These telco microservices can be enabled by an emerging capability -- network slicing -- which is already possible in the network core thanks to virtualization. These network slices can be configured to meet the performance requirements of individual types of services. Network slicing is now possible for the radio network as well, with the advent of narrowband IoT, which is designed for low-powered IoT devices and sensors.
End-to-end network management for network slices will become increasingly possible for the specific performance needs of services. These network services can be intelligently segmented to address the needs of individual companies, locations, and service types. AI can be used to optimize network operations, reducing costs in proportion to the service quality and value that each need actually requires.
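The slice-matching idea above can be sketched as a hypothetical slice catalog: each slice guarantees a latency ceiling and a bandwidth floor at a cost reflecting those guarantees, and a service is assigned the cheapest slice that still meets its needs. The slice names and figures below are invented for illustration, not drawn from any 3GPP specification.

```python
# Hypothetical slice catalog: latency ceilings (ms), bandwidth floors (Mbit/s),
# and relative cost units reflecting the strength of the guarantee.
SLICES = {
    "massive-iot": {"max_latency_ms": 1000, "min_mbps": 0.1, "cost": 1},
    "best-effort": {"max_latency_ms": 100,  "min_mbps": 10,  "cost": 3},
    "low-latency": {"max_latency_ms": 10,   "min_mbps": 50,  "cost": 10},
}

def cheapest_slice(latency_ms, mbps):
    """Pick the lowest-cost slice that still meets a service's needs."""
    candidates = [
        (spec["cost"], name) for name, spec in SLICES.items()
        if spec["max_latency_ms"] <= latency_ms and spec["min_mbps"] >= mbps
    ]
    if not candidates:
        raise ValueError("no slice meets these requirements")
    return min(candidates)[1]

print(cheapest_slice(latency_ms=2000, mbps=0.05))  # a sensor fits massive-iot
print(cheapest_slice(latency_ms=20, mbps=25))      # only low-latency qualifies
```

This captures the cost-proportionality Bell describes: a tolerant sensor workload is never billed for a low-latency guarantee it does not need.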
Some telcos, such as AT&T Inc. (NYSE: T) and Deutsche Telekom AG (NYSE: DT), are beginning to put together this capability, but mass adoption will only happen with widespread deployment of narrowband IoT, which will provide valuable lessons for 5G, where slicing will be used extensively.
Some lead users, such as Maersk, are already working with IBM and Ericsson AB (Nasdaq: ERIC) to create global IoT solutions for tracking the movement of containers as they travel by different modes of transport, from trucks to railways to ships. This tracking reduces the risk of misplacement or theft as the containers move through ports from one transport to another. Data aggregated from multiple wireless networks -- satellite, cellular and WiFi -- is used to automatically monitor and track the containers and to send alerts when items go missing or when anomalous patterns appear in the state of the container and cargo.
Each industry will have its own global supply chain use cases -- the food industry, for example, would like to track temperature and humidity in real time to predict and avoid spoilage, or to discard items that have spoiled partway through transit.
TT: What are the shifts in the analytics techniques that contribute to the step function changes that we discussed earlier in this interview?
SB: Deep learning, a neural-network approach to analysis that loosely mimics the brain, is certainly one of the techniques that has gained the most momentum -- particularly for image classification and language translation. The problem with deep learning systems is that they are trained for one purpose, and the learning gained in one context is not easily reproduced across multiple tasks in other industries or application data sets.
Deep learning parses the input data step-wise through tens -- or possibly hundreds -- of interconnected layers of simulated neurons until an output is produced, but you can't retrace the steps to see how the results were achieved. Deep learning uses an inductive process of reasoning rather than a deductive one -- new layers of knowledge are gained rather than the same process being applied to another data set. Each time the network is shown new data, it tweaks the calculations of individual neurons to improve the outcome on the next set of data.
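The step-wise pass Bell describes can be sketched in a few lines of NumPy: an input vector flows through stacked layers of simulated neurons until an output emerges, and the intermediate activations resist interpretation, which is the opacity he refers to. The layer sizes and random weights are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three stacked layers of simulated neurons. The weights here are random;
# in training, each pass over new data tweaks them to improve the output.
layer_sizes = [4, 8, 8, 2]   # input -> two hidden layers -> output
weights = [rng.standard_normal((m, n))
           for m, n in zip(layer_sizes, layer_sizes[1:])]

def forward(x):
    """Parse the input step-wise through the layers until an output is produced."""
    for w in weights:
        x = np.tanh(x @ w)   # each neuron applies a nonlinearity to a weighted sum
    return x

output = forward(np.array([0.5, -1.0, 0.3, 0.9]))
print(output.shape)  # the final layer emits a 2-element output vector
```

Nothing in the loop records *why* a particular output was produced -- only the final activations survive, which is exactly the retraceability problem Bell raises.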
Within the last month, progress has been made that could allow the transfer of knowledge between tasks. This would overcome the issue of "catastrophic forgetting," allowing the system to retain the learning from its step-wise analysis rather than overwriting it when presented with new data. Google's DeepMind has created an algorithm that determines the value of what has been learned and slows the rate at which it is altered, which allows the process to be retained for another problem.
TT: How urgent is the need for self-organizing networks (SONs)? Is AI mature enough to enable SONs?
SB: Over the next five years, as SD-WANs grow, enterprises will want to optimize their networks from the perspective of cost, flexibility and accessibility. They can reduce their costs by phasing out MPLS in favor of virtualized private networks, public Internet capabilities and fiber links, choosing the lowest-cost option for the service delivered. AI-augmented SONs would be useful for making the optimal decision, in real time, about the type of traffic and the least-cost means to serve it. From the carriers' perspective, it is a way to manage fluctuations in bandwidth needs, responding to spikes in traffic without incurring large costs.
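A toy version of that least-cost decision: given several available links, pick the cheapest one whose latency still suits the traffic class. The link names, costs, latencies and tolerance figures are all invented for illustration; a real SD-WAN controller would also weigh jitter, loss and live congestion.

```python
# Hypothetical link inventory for an SD-WAN edge (relative cost, latency in ms).
LINKS = [
    {"name": "mpls",           "cost": 9.0, "latency_ms": 15},
    {"name": "broadband-vpn",  "cost": 1.0, "latency_ms": 45},
    {"name": "fiber-internet", "cost": 2.0, "latency_ms": 20},
]

# Assumed per-traffic-class latency tolerances.
TOLERANCE_MS = {"voice": 25, "video": 50, "bulk-backup": 500}

def pick_link(traffic_class):
    """Choose the least-cost link that meets the traffic class's latency need."""
    limit = TOLERANCE_MS[traffic_class]
    eligible = [link for link in LINKS if link["latency_ms"] <= limit]
    return min(eligible, key=lambda link: link["cost"])["name"]

print(pick_link("voice"))        # fiber-internet beats MPLS on cost
print(pick_link("bulk-backup"))  # broadband-vpn is cheapest overall
```

The AI augmentation Bell describes would replace the static tolerance table with learned, per-flow predictions, re-evaluated in real time as link conditions change -- which is what lets carriers absorb traffic spikes without over-provisioning expensive MPLS capacity.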
— Kishore Jethanandani, Contributing Writer, Telco Transformation