AI and edge computing is a match made in IT heaven
As both technologies continue to mature, they are increasingly being considered together in IT leaders' decision-making. The two go hand in hand - AI is now a key use case for edge computing, and edge computing is a significant enabler for AI.
A new breed of AI
Traditionally, AI has lived inside data centres powered by cloud computing. Over time, AI made its way into software and has since become embedded in Internet of Things (IoT) endpoints and other end-user devices.
As consumers began to spend more time on their smartphones, major technology companies realised they needed to bring the compute power of the data centre closer to the end-user to deliver a good user experience.
Big data will always be processed via the cloud. However, the instant data generated by users can be processed and acted on at the edge. The likes of Google and Amazon are already exploring the potential benefits of edge computing.
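This cloud/edge split can be pictured as a simple routing decision: latency-sensitive work stays near the user, while bulk analytics goes to the cloud. The sketch below is purely illustrative - the `Task` structure and the two-way `route` policy are assumptions for this example, and real platforms use far richer placement policies.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    realtime: bool  # does the user need an immediate, local answer?

def route(task: Task) -> str:
    """Send latency-sensitive work to the edge, bulk analytics to the cloud.

    A two-tier split like this is a simplification; it only captures the
    basic idea of keeping instant data near the user.
    """
    return "edge" if task.realtime else "cloud"

# A voice command needs an instant response; a monthly report does not.
print(route(Task("voice-command", realtime=True)))
print(route(Task("monthly-report", realtime=False)))
```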
To keep pace with these major technology companies, more businesses involved in AI have started to realise the benefits of edge computing. In fact, Deloitte now predicts that more than 750 million edge AI chips - designed to enable on-device machine learning - will be sold this year.
Overcoming AI challenges with edge
So, why does edge fit so well with AI? AI is undoubtedly a data-heavy and computing-intensive technology. As a result, bandwidth, latency, security and cost pose significant hurdles for the majority of businesses. Edge benefits AI by helping overcome these technological challenges.
AI has, quite literally, got a big data problem. With edge computing, rather than sending this data to the cloud or a distant data centre, it can be processed nearer to the end-user.
This not only reduces the bandwidth required but also cuts backhaul costs. What's more, bringing processing to the point of capture unlocks immediate value from the data through instant insights.
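As a rough sketch of the bandwidth saving, an edge node might reduce a raw sensor stream to the summary statistics the cloud actually needs. The `summarise_at_edge` function and the payload figures here are hypothetical, but the shape of the saving is real: a handful of numbers travels upstream instead of thousands of readings.

```python
import json
import statistics

def summarise_at_edge(readings: list[float]) -> dict:
    """Reduce a raw sensor stream to the insight the cloud actually needs."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
    }

# e.g. an hour of temperature readings captured at the edge (synthetic data)
raw = [20.1 + 0.01 * i for i in range(10_000)]
summary = summarise_at_edge(raw)

# Compare what would cross the network in each case.
raw_bytes = len(json.dumps(raw))
summary_bytes = len(json.dumps(summary))
print(f"raw payload: {raw_bytes} B, summary: {summary_bytes} B")
```

The summary is orders of magnitude smaller than the raw stream, which is where the reduced bandwidth and backhaul costs come from.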
With this also comes reduced latency, perhaps one of the more obvious reasons for adopting edge computing. As technologies and services become more distributed across a business' network, latency naturally creeps in.
Especially where real-time decisions and actions are required at the device level, latency must be kept to a minimum. By locating key processing tasks closer to end-users, edge computing can deliver faster, more responsive AI-based services.
The security conundrum
Privacy remains an unsolved challenge in the AI industry, especially with the increasing variety of AI-enabled devices across a business’ network. Edge computing offers a solution to this security conundrum. With edge-based AI, sensitive information is stored and processed locally on a device, rather than being sent to the cloud.
Only the less time-sensitive data sets need to be pushed to the cloud; the rest remains local. Less transfer of sensitive data between devices and the cloud means better security for businesses and their customers.
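One minimal way to sketch this split is a filter that keeps designated sensitive fields on the device and forwards only the rest. The field names and the record below are invented for illustration; a real deployment would drive this from a proper data-classification policy.

```python
# Hypothetical labels for fields that must never leave the device.
SENSITIVE_FIELDS = {"face_image", "location", "voice_clip"}

def split_record(record: dict) -> tuple[dict, dict]:
    """Keep sensitive fields on the device; return only the rest for upload."""
    local = {k: v for k, v in record.items() if k in SENSITIVE_FIELDS}
    cloud = {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}
    return local, cloud

# Example record produced by an AI-enabled camera (synthetic data).
record = {"face_image": b"...", "location": (51.5, -0.1), "model_score": 0.93}
local, cloud = split_record(record)
print(sorted(cloud))  # only non-sensitive keys ever leave the device
```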
According to Gartner, in two years' time 74% of all data will need analysis and action at the edge. Data from AI-enabled devices is no exception.
In fact, edge AI is the next wave of AI and even many major technology companies are recognising this. Earlier this year, Intel and Udacity announced a joint edge AI nanodegree programme to help train developers in the field.
Ultimately, as data continues to grow exponentially, there's a real need for data storage and computation to be located on the device - not forgetting other factors such as speed, privacy and security. For those wondering where AI is heading next, now they know: it's heading to the edge.