
Steep growth of AI chip market will produce new winners

Market news
By Peter Clarke



Revenue from the sale of chips for AI processing was US$1.3 billion in 2018 and will grow to $23 billion in 2023, according to market analysis firm ABI Research. This equates to a compound annual growth rate (CAGR) of 78 percent over the five-year period.
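The 78 percent figure can be checked directly from the reported endpoints. A minimal sketch of the compound-growth arithmetic (the function name is ours; figures are as reported, with 2018 to 2023 treated as five compounding periods):

```python
def cagr(start: float, end: float, periods: int) -> float:
    """Compound annual growth rate as a fraction, e.g. 0.78 for 78%."""
    return (end / start) ** (1 / periods) - 1

# ABI Research's reported figures: $1.3bn (2018) -> $23bn (2023)
growth = cagr(1.3, 23.0, 5)
print(f"{growth:.0%}")  # roughly 78%
```

The same formula, run against suitable start and end revenues, would reproduce the 65 and 137 percent segment figures quoted below.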

Breaking the sector down further, ABI reckons revenue from the sale of AI chips and chipsets for edge inferencing will grow at a CAGR of 65 percent over the same period, while the smaller class of more powerful, higher-priced chips for AI training will grow at a CAGR of 137 percent.

However, these market increases do not necessarily favour the incumbent market leaders Intel and Nvidia, ABI said.

The market is essentially a new one and will see intense competition between established chip companies and numerous startups, the market research firm said.

“Companies are looking to the edge because it allows them to perform AI inference without transferring their data. The act of transferring data is inherently costly and in business-critical use cases where latency and accuracy are key, and constant connectivity is lacking, applications can’t be fulfilled,” said Jack Vernon, an analyst at ABI Research, in a statement. “Locating AI inference processing at the edge also means that companies don’t have to share private or sensitive data with cloud providers, something that is problematic in the healthcare and consumer sectors,” he added.

AI at the edge will have a significant impact on the semiconductor industry and the biggest winners are likely to come from among vendors with intellectual property for AI-related ASICs.

Tensor processing


So-called tensor-based processing architectures are more efficient at performing the maths needed for deep learning (DL) tasks and more scalable than traditional scalar CPU-style architectures. As a result, by 2023 such tensor ASICs will overtake GPUs as the architecture supporting AI inference at the edge, both in terms of annual shipments and revenues, ABI said.

Such ASICs or their IP core equivalents are in use by smartphone manufacturers Apple and Huawei for image recognition processing in their devices. Other ASICs such as those produced by Intel’s Movidius division are used for image recognition inferencing at the edge.

Drone vendor DJI uses Movidius chips to help support flight and the tracking of objects and people. Security camera vendor Hikvision is also using Movidius’s AI chips in its security cameras to support facial recognition and tracking, said ABI. ASICs are also being adopted by companies developing autonomous driving systems, industrial automation, and robotics.

For AI inferencing, Intel will face competition from startups such as Cambricon Technology, Horizon Robotics, Hailo Technologies, Habana Labs and scores of other fabless chip companies.

Nvidia's GPU-based AGX platform has also been gaining momentum in industrial automation and robotics, while FPGA leader Xilinx can also expect an uptick in revenue as companies use FPGAs to perform inference at the edge.

“Cloud vendors are deploying GPUs for AI training in the cloud due to their high performance. However, Nvidia will see its market share chipped away by AI training focused ASIC vendors such as Graphcore, who are building high-performance and use-case specific chipsets,” said Vernon.

Related links and articles:

www.abiresearch.com

News articles:

China startup releases AI processors then raises $100 million

Cambricon licenses Moortec’s 16nm in-chip monitor

Startup offers inference processor for data centers

Intel buys Movidius to ramp computer vision

Is that Graphcore’s Colossus IPU in package?

Intel to acquire deep learning Nervana
