AI Business is part of the Informa Tech Division of Informa PLC

This site is operated by a business or businesses owned by Informa PLC and all copyright resides with them. Informa PLC's registered office is 5 Howick Place, London SW1P 1WG. Registered in England and Wales. Number 3099067.

AI Research Reports

Content posted in June 2020
Deep Learning Chipsets

Deep learning (DL) is slowly moving past its hype cycle as proof-of-concept (PoC) AI applications developed in the past two years go into production. AI chipset customers have become more sophisticated about their acceleration needs and are asking vendors for specific benchmarks. These customer requirements are coming to the forefront, forcing chipset companies to rethink the applicability of their technology. All prominent chip companies, such as Intel, NVIDIA, and Qualcomm, have invested heavily in AI. Cloud providers have started rolling out graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs), giving developers a choice of hardware for AI acceleration. Omdia forecasts that global revenue for DL chipsets will increase from $11.4bn in 2019 to $71.2bn by 2025.

This Omdia Market Report assesses the industry dynamics, technology issues, and market opportunity surrounding DL chipsets, including CPUs, GPUs, FPGAs, ASICs, and SoC accelerators. As an update to Omdia’s 2019 Deep Learning Chipsets report, it captures the state of this fast-moving chipset market. Global market forecasts, segmented by chipset type, compute capacity, power consumption, market sector, and training versus inference, extend through 2025. Omdia also provides profiles of 23 key industry players.

Artificial Intelligence for Edge Devices

Edge inference has emerged as a key workload in 2019–20, and many companies have introduced chipsets targeting it. Several factors are driving AI processing to the edge device: privacy, security, cost, latency, and bandwidth must all be weighed when evaluating data center versus edge processing. Applications like autonomous driving and navigation have sub-millisecond latency requirements that make edge processing mandatory. Others, such as speech recognition on smart speakers, raise privacy concerns. Keeping AI processing on the edge device circumvents those privacy concerns while avoiding the bandwidth, latency, and cost of cloud computing. Omdia forecasts that global AI edge chipset revenue will grow from $7.7bn in 2019 to $51.9bn by 2025.

This Omdia report provides a quantitative and qualitative assessment of the opportunity for AI edge processing across several consumer and enterprise device markets. The device categories include automotive, consumer and enterprise robots, drones, head-mounted displays (HMDs), mobile phones, PCs/tablets, security cameras, smart speakers, machine vision, and edge servers. Global revenue and shipment forecasts, segmented by chipset architecture, power consumption, compute capacity, training versus inference, and application attach rate for each device category, extend through 2025.
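The two Omdia forecasts above imply steep compound annual growth over the six-year window. A quick back-of-envelope sketch (the dollar figures are from the report summaries above; the CAGR values are our own derivation, not numbers stated by Omdia):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate as a fraction (0.35 == 35%)."""
    return (end / start) ** (1 / years) - 1

# DL chipsets: $11.4bn (2019) -> $71.2bn (2025)
dl_cagr = cagr(11.4, 71.2, 6)

# AI edge chipsets: $7.7bn (2019) -> $51.9bn (2025)
edge_cagr = cagr(7.7, 51.9, 6)

print(f"DL chipsets implied CAGR:   {dl_cagr:.1%}")
print(f"Edge chipsets implied CAGR: {edge_cagr:.1%}")
```

Both forecasts work out to an implied CAGR in the mid-to-high 30s percent, i.e. edge chipset revenue is projected to grow slightly faster than the DL chipset market overall.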



Practitioner Portal - for AI practitioners

Story

InterDigital releases AI-based video compression codec tool

9/24/2020

Available on GitHub under Apache 2.0 license

Story

MLOps startup Verta gets $10m in funding, launches first product

9/1/2020

The company plans to commercialize the open source ModelDB project, developed by CEO Manasi Vartak
