Arm on Tuesday announced its new direction in “the industry’s most scalable, versatile ML compute platform,” a platform it is calling Project Trillium.
The project involves a new Machine Learning (ML) processor and an Object Detection (OD) processor.
The Arm ML processor delivers (1) more than 4.6 trillion operations per second and (2) an efficiency of over 3 trillion operations per second per watt (TOPs/W), with “unmatched” performance in “thermal and cost-constrained environments.”
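Taken together, those two figures imply a rough power budget for the processor. As a back-of-the-envelope check (assuming, which Arm does not state, that both figures refer to the same operating point):

```python
# Back-of-the-envelope check of Arm's quoted ML processor figures.
# Assumption (not stated by Arm): the >4.6 TOPs throughput and the
# >3 TOPs/W efficiency refer to the same operating point.
throughput_tops = 4.6            # trillion operations per second
efficiency_tops_per_watt = 3.0   # TOPs per watt

implied_power_watts = throughput_tops / efficiency_tops_per_watt
print(f"Implied power draw: ~{implied_power_watts:.2f} W")  # ~1.53 W
```

A draw on the order of 1.5 W is consistent with the “thermal and cost-constrained environments” (i.e., phones) that Arm is targeting.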
Project Trillium is a codename, not a commercial brand name, for Arm’s machine learning technology; the codename will eventually be replaced by a commercial brand name.
Jem Davies, vice president, fellow and general manager, Machine Learning, Arm, said the project is “to kickstart a new wave of invention in the world of artificial intelligence (AI), of which machine learning is a key part.”
MIT Technology Review said Arm’s “latest mobile processors are tuned to crunch machine-learning algorithms as efficiently as possible.”
So, what does that all mean for consumers buying mobile products? Are they talking about AI for phones?
As MIT Technology Review framed it, this is hardware that will allow our phones to run “artificial-intelligence algorithms” themselves.
Arm’s new processors were designed to deliver enhanced machine learning and neural network functionality.
Arm’s Jem Davies remarked in an Arm blog, “Indeed, my answer to the question: ‘Why would you introduce more intelligence into your device?’ is ‘Why wouldn’t you?’”
The processors are focused on mobile devices.
“Users will enjoy high-resolution, real-time, detailed face recognition on their smart devices delivered in a battery-friendly way,” said Arm.
The Arm OD processor was designed to identify people and other objects, handling “virtually unlimited objects per frame” with “real-time detection with Full HD processing at 60 frames per second.”
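The Full HD at 60 fps figure translates into a concrete pixel throughput. A quick calculation (assuming “Full HD” means the standard 1920×1080 resolution):

```python
# Pixel throughput implied by "Full HD processing at 60 frames per second".
# Assumption: Full HD = 1920x1080, as the article does not spell it out.
width, height, fps = 1920, 1080, 60

pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e6:.1f} Mpixels/s")  # 124.4 Mpixels/s
```

So the OD processor would need to scan on the order of 124 million pixels every second while picking out people and objects.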
While the initial launch focuses on mobile processors, Arm said future Arm ML products will have the ability “to move up or down the performance curve – from sensors and smart speakers, to mobile, home entertainment, and beyond.”
Arm’s Davies clarified what the suite’s capabilities might serve up in a real-world scenario. (Davies is a qualified scuba diver.)
“Imagine you’re 30 meters down, diving above a reef surrounded by amazing-looking creatures and wondering what species the little yellow fish with the silver stripes is.
You could fumble around for a fish chart, if you have one, but what you really want is an easier and faster solution.
Fast forward to 2019, and technology has provided. Now your waterproof smartphone is enabled by Arm Machine Learning (ML) and Object Detection processors.
Your experience is very different.”
The dive mask, said Davies, would give you information via a heads-up display. “An Arm-based chip inside your smartphone is now equipped with an advanced Object Detection processor that is filtering out the most important scene data while an operating system tasks a powerful Machine Learning processor with detailed identification of fish, other areas of interest and hazards.”
Jamie Condliffe in MIT Technology Review assessed Arm’s news.
“Currently, most small or portable devices that use machine learning lack the horsepower to run AI algorithms, so they enlist the help of big servers in the cloud.”
Arm’s solution has the advantage of speed, with a mobile device running its own AI software “cutting the lag inherent in sending information back and forth.”
Also, he said, “It pleases privacy advocates, who are comforted by the idea of data remaining on the device.”
Gary Sims discussed the same plus points in Android Authority, including the security advantage of not having to send personal data up to the cloud.
“The argument for supporting inference (recognition) on a device, rather than in the cloud, is compelling.
First of all it saves bandwidth.
As these technologies become more ubiquitous then there would be a sharp spike in data being sent back and forth to the cloud for recognition.
Second it saves power, both on the phone and in the server room, since the phone is no longer using its mobile radios (Wi-Fi or LTE) to send/receive data and a server isn’t being used to do the detection.”
As for latency, Sims also noted results will be delivered quicker if the inference is done locally.
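The bandwidth argument above is easy to make concrete. The sketch below compares the data uploaded by cloud inference (which must ship each frame to a server) against on-device inference (which uploads nothing); the frame and label sizes are illustrative assumptions, not figures from the article:

```python
# Illustrative bandwidth comparison: cloud vs on-device inference.
# All sizes below are assumptions for illustration only.
jpeg_frame_bytes = 200 * 1024   # assumed compressed Full HD frame (~200 KB)
label_bytes = 32                # assumed size of a returned text label

cloud_upload = jpeg_frame_bytes  # bytes uploaded per recognition request
on_device_upload = 0             # nothing leaves the phone

frames = 1000  # e.g., a few minutes of periodic recognitions
print(f"Cloud:     {cloud_upload * frames / 1e6:.1f} MB uploaded")
print(f"On-device: {on_device_upload * frames / 1e6:.1f} MB uploaded")
```

Under these assumptions, a thousand recognitions cost roughly 205 MB of upload in the cloud model and zero on-device, which is the bandwidth (and radio power) saving Sims describes.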
At the same time, Condliffe pointed out that Arm is not the only player exploring mobile AI chips. He noted (1) the neural engine in the iPhone X, part of its main chipset; (2) Huawei’s Mate 10 smartphone, with a chip it calls a neural processing unit; and (3) the Pixel 2 handset, with a chipset “to help it crunch imaging and machine-learning problems.”
Sims said, “we should start to see SoCs with it built-in sometime during 2019.”
“Machine learning is indeed the hot new topic in the semiconductor business and has particularly seen a large focus in the mobile world over the last couple of months,” said Andrei Frumusanu in AnandTech, pointing to announcements from multiple companies.