Edge AI Chip Market to Hit $60B by 2028 as Small Models, PCs Boost Demand
New research from Omdia shows AI processors are entering the automotive, robotics and local PC markets
At a Glance
- An Omdia report states edge AI processor revenue will rise from $31 billion in 2022 to $60 billion in 2028.
- Among the growth drivers are AI PCs and increased demand for hardware to run small AI models locally.
New research by Omdia predicts the edge AI processor market will generate $60.2 billion in revenue by 2028, up from $31 billion in 2022, a compound annual growth rate of roughly 11%.
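As a back-of-envelope check (my own illustration, not taken from the report), the implied growth rate from the $31 billion 2022 figure to the $60.2 billion 2028 forecast can be computed with the standard CAGR formula:

```python
# Back-of-envelope CAGR check for the Omdia figures (assuming a 2022 base year).
start_revenue = 31.0   # $ billions, 2022
end_revenue = 60.2     # $ billions, 2028 forecast
years = 2028 - 2022    # six-year span

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")
```

This lands in the 11-12% range, consistent with the roughly 11% figure cited.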
Omdia's latest Processors at the Edge Forecast states that the revenue rise is driven by increased demand for hardware as various industries and devices adopt AI.
Among the areas fueling the market growth is the PC space, with a rise in product availability from major providers including Intel, AMD and Apple.
According to the report, PC vendors are trying to market the inclusion of AI processors in their devices as “a unique selling point.”
Alongside the PC space, the report highlights rapid AI processor adoption in areas like automotive, drones, security cameras and robotics.
Omdia’s report suggests that while GPUs currently dominate the market, AI accelerator chips such as application-specific standard products (ASSPs) from the likes of Qualcomm could disrupt Nvidia’s grip on the market.
“AI ASSPs will push from 19% to 28% of the market, largely at the expense of GPUs,” said Alexander Harrowell, Omdia’s principal analyst for advanced computing.
“PCs are beginning to look more and more like smartphones or tablets as they adopt the CPU-GPU-NPU architecture familiar from nearly all modern smartphones, while in-CPU acceleration has been unexpectedly slow to take off.”
Developer demands and small models
Omdia’s report predicts an “architecture split” between the data center and the edge.
AI training traditionally occurs in the data center, but as more developers build AI applications, Omdia’s report predicts increased demand for inference power at the edge.
“The developer experience is crucial, and there is a need for software tools that bridge the gap between cloud training and edge inference,” the report reads.
Another trend the report notes is the rise of small language models. Increasingly, firms are publishing capable models with far smaller footprints, which require far less computing power to run than larger systems.
Instead of hefty models with hundreds of billions of parameters, Omdia’s report notes a rise in smaller, often domain-specific models. For example, just last week Microsoft showcased Orca-Math, which stands at seven billion parameters but solves grade-school math problems better than GPT-3.5, Gemini Pro and Llama 2 70B.
Omdia’s report states that the rise of small models is driving demand for more edge processors capable of running these models locally for inference and fine-tuning.
“Applications that might have needed a gigantic model can be served with something small enough to run on a PC or a smartphone,” the report says.
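A rough memory estimate (my own illustration, not from the report) shows why a seven-billion-parameter model is plausible on consumer hardware, especially once quantized:

```python
# Rough memory footprint of a 7B-parameter model's weights at different
# numeric precisions. Figures are decimal gigabytes, weights only; the
# KV cache, activations and runtime overhead add more in practice.
params = 7e9

bytes_per_param = {
    "fp16": 2.0,   # half precision, common for GPU inference
    "int8": 1.0,   # 8-bit quantization
    "int4": 0.5,   # 4-bit quantization, typical for local/edge use
}

for precision, nbytes in bytes_per_param.items():
    gb = params * nbytes / 1e9
    print(f"{precision}: ~{gb:.1f} GB of weights")
```

At 4-bit precision the weights fit in roughly 3.5 GB, comfortably within the memory of a modern laptop or high-end smartphone.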
Omdia found that vendors are exploring small models ranging from 1 billion to 10 billion parameters for spatial and multi-modal use cases.
“As small LLMs push to the edge, they will bring more versatility through multi-modal AI and plain language prompt development.”