Two trends that have been discussed up and down the telco industry for several years are certainly Edge and AI. It is no surprise that at some point attempts are made to bring the two together, but question marks remain over profitable use cases. Can two such early-stage technologies create a sufficient return on investment?
There is currently no telco conference that does not try to combine the two hypes. At first glance this sounds sensible; after all, many telco and cloud manufacturers and providers have jumped on the bandwagon, stressing the importance of “EdgeAI” for the telco business.
To understand why research is still needed, it is worth first taking a look at what is called the “Journey to the Cloud”.
Enterprise businesses are either in the midst of their cloud transformation or have just completed it. Cloud vendors, including telco operators, gave good reasons why companies should move IT workloads to the cloud: economies of scale, flexibility, security, redundancy, etc. The prospect of no longer being responsible for error-prone equipment is tempting. One major concern, the data storage location, remains a risk, but the advantages outweigh the disadvantages.
Everyone agrees that there is no way back to classic server installations in enterprise data centers, but there is a new candidate for on-prem equipment: the Edge Cloud. The community has identified new use cases whose primary demand on the cloud is not secure data storage but the proximity of compute resources.
The most obvious examples of edge cloud installations are low-latency applications for smart factories and Industry 4.0. Serving these campus networks from the cloud doesn’t seem right. There are technical and economic reasons to locate compute resources where the data is created; here are four of them:
- Low Latency: ultra-reliable applications require not only low latency but also an upper bound on the maximum latency, which is very difficult to guarantee end to end because each service provider (for connectivity or cloud) only guarantees its own part.
- Time-Sensitive Networking (TSN): low-latency applications also require predictability of the latency, i.e. a maximum allowed jitter, to accurately synchronize machines, robots, etc.
- Network Offloading: sending massive amounts of video content over the network to process it in the cloud might not leave enough margin for other types of communication.
- Protection & Control: beyond data security, smart factory and other operators are reluctant to move mission-critical protection and control functions to the cloud if the whole factory relies on them.
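The latency argument above can be made concrete with a back-of-the-envelope budget. Because each provider only guarantees its own segment, the only defensible end-to-end bound is the sum of the per-segment worst cases. The sketch below uses purely illustrative figures (not vendor SLAs) to show why an on-prem edge path wins:

```python
# Hypothetical worst-case latency budgets (milliseconds). Each provider
# guarantees only its own segment, so the end-to-end bound is the sum
# of the individual worst cases. All figures are illustrative.

def end_to_end_bound(segments_ms):
    """Sum per-segment worst-case latencies into an end-to-end bound."""
    return sum(segments_ms.values())

cloud_path = {
    "radio_access": 10.0,
    "transport_to_cloud": 25.0,
    "cloud_processing": 15.0,
    "return_path": 25.0,
}

edge_path = {
    "radio_access": 10.0,
    "edge_processing": 5.0,
    "return_path": 1.0,
}

print(end_to_end_bound(cloud_path))  # 75.0
print(end_to_end_bound(edge_path))   # 16.0
```

Even with generous assumptions for the cloud leg, the guaranteed bound is dominated by the transport segments that an on-prem edge installation removes entirely.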
Reducing latency is not enough
To make sure that more industry segments benefit from processing power at the edge, the group of beneficiaries needs to be expanded. So why not combine the edge cloud trend with the latest megatrend, Artificial Intelligence? Edge cloud equipment by itself doesn’t do anything smart without applications running on top of it. Adding machine-learning-capable processing units at the edge can make the crucial difference.
The first AI use case that comes to mind is inference at the edge. High-definition video analytics and real-time training of AI models are a perfect example of something that can only be solved by edge and AI together. The same applies to AI-powered image matching in factories and to healthcare applications with “real-time-like” requirements, such as colonoscopy.
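A minimal sketch of this pattern, combining edge inference with the network-offloading argument from the list above: instead of streaming raw HD frames to the cloud, a model runs locally and only compact detection metadata travels upstream. The model here is a stub standing in for an accelerator-backed detector; all names are illustrative, not a real API.

```python
import json

def detect_objects(frame_bytes):
    """Stub for an edge-deployed detection model; a real deployment would
    run a GPU/NPU-backed model here. The result format is illustrative."""
    return [{"label": "defect", "confidence": 0.97, "bbox": [40, 60, 12, 12]}]

def process_at_edge(frame_bytes):
    """Run inference locally; forward only compact metadata upstream."""
    detections = detect_objects(frame_bytes)
    return json.dumps({"detections": detections}).encode("utf-8")

# One uncompressed 1080p RGB frame is ~6 MB; the metadata is tens of bytes.
raw_frame = bytes(1920 * 1080 * 3)
metadata = process_at_edge(raw_frame)
reduction = len(raw_frame) / len(metadata)
print(f"upstream payload shrinks by ~{reduction:,.0f}x")
```

The point is not the stubbed model but the traffic shape: the edge absorbs the bandwidth-heavy inference workload, and the network only carries results.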
In the consumer business, major use cases have yet to be found. Gaming, as well as augmented and virtual reality, can certainly benefit from the GPUs that are a key component of most AI-at-the-edge installations.
Introducing AI at the edge is a marathon, not a sprint. That means in the early days (i.e. today) it’s all about exploration. The biggest opportunity for cloud and communication service providers lies in building ecosystems, partnerships, and alliances to understand the value of AI at the edge.
It is unlikely that current enterprise premise equipment has the resources to process major AI workloads; after all, we have just moved all processing capabilities to the cloud. This is even more true for battery-powered mobile devices, where any form of offloading is welcome.
But the communication market has clearly identified AI use cases with low-latency requirements that can drive incremental revenue in the coming years. In that sense, AI and Edge will be a match.
If you’re interested in this or any other topics around telecommunication, network and infrastructure, feel free to reach out to me: Cornelius Heckrott, VP, Technology @ Swisscom Outpost, Palo Alto, California, US, email@example.com
December 7, 2020