The Future of AI: Bigger, Faster, Smarter
Everybody knows artificial intelligence and machine learning are huge growth areas for hardware and software developers alike, but just how huge? A report from ResearchAndMarkets.com suggests the North American AI chip market will be worth over US$30 billion by 2027, up from $2.5 billion in 2018, which works out to a compound annual growth rate (CAGR) of roughly 32% over the forecast period.
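As a quick sanity check on those headline figures, a simple compounding calculation shows how roughly 32% annual growth turns $2.5 billion into about $30 billion over the nine years from 2018 to 2027. The sketch below uses only the numbers cited above; it is an illustration of the arithmetic, not taken from the report itself.

```python
# Sanity check on the report's headline numbers: $2.5B (2018) to ~$30B (2027).
start, end, years = 2.5, 30.0, 9  # billions USD, 2018-2027 horizon

# Implied CAGR: (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~31.8%, consistent with the reported ~32%

# Forward projection at a flat 32% per year
projected = start * (1 + 0.32) ** years
print(f"Projection at 32% CAGR: ${projected:.1f}B")  # ~$30.4B
```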
With ever-expanding digital services and the data they generate driving the market, there’s every reason to think AI will continue to proliferate into a wide range of use cases across industries. AI chips are already being used in cloud computing, smart-home and smart-office interfaces, and autonomous vehicles. The potential ubiquity of AI contributes to the fragmented nature of the AI chip market, as stalwarts like Intel and IBM find themselves competing with consumer electronics giants like Samsung and even internet outfits like Alphabet (Google’s parent company).
Speaking of Intel, the company recently announced the availability of two new processors with built-in artificial intelligence capabilities. The first products to debut from Intel’s Nervana Neural Network Processor line, the chips are intended for large data centers. The Nervana NNP-T, an SoC designed for training AI systems, features 24 Tensor Processing Clusters to power neural networks, while the Nervana NNP-I is an inference SoC built on Intel’s 10-nanometer process with Ice Lake cores, aimed at deploying those trained models. Intel acknowledges that its new AI-centric SoCs (codenamed Spring Crest and Spring Hill, respectively) are competing with Amazon’s AWS Inferentia chips, Google’s Tensor Processing Unit, and NVDLA-based tech from Nvidia for a bigger slice of the AI-chip pie.
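To see why training and inference get separate silicon, consider the two workloads in code. The toy PyTorch sketch below is purely illustrative (the model and sizes are arbitrary, and this is not code for the Nervana chips themselves): training loops over forward and backward passes with weight updates, the sustained heavy compute an NNP-T targets, while inference is a single gradient-free forward pass, the latency- and power-sensitive work an NNP-I targets.

```python
import torch
import torch.nn as nn

# Arbitrary toy model standing in for a real neural network.
model = nn.Linear(16, 4)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Training: repeated forward + backward passes and weight updates --
# sustained compute and heavy data movement (the training-chip workload).
x, y = torch.randn(32, 16), torch.randn(32, 4)
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# Inference: a single forward pass with no gradients --
# the lighter, latency-sensitive workload inference SoCs are built for.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 16))
```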
While we’re on the topic of size, last month Cerebras Systems, a relatively new player in the AI market, unveiled what the company calls the largest semiconductor chip ever made. The Cerebras Wafer Scale Engine (WSE), designed specifically for AI applications, packs 1.2 trillion transistors, easily topping the 32 billion in AMD’s Epyc 2 processors. Samsung’s eUFS flash memory chip features 2 trillion transistors, but those are memory cells; the WSE is still believed to be the largest processor on the market, some 56.7 times larger than Nvidia’s largest graphics processing unit. In the world of AI, size really does matter: a bigger chip can hold more cores and more on-chip memory, keeping data movement on the die, so it processes data more quickly and efficiently than its smaller counterparts. Greater size also helps reduce “time to insight,” the amount of time it takes to train AI systems to perform their intended functions. Experts agree that reducing the time it takes to train AI models is key to unlocking future growth in the field.
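For the curious, the headline ratios are easy to reproduce. The transistor counts below come from the figures above; the die areas are Cerebras’s publicly reported numbers for the WSE and Nvidia’s GV100 GPU die, included here as assumptions (they do not appear in this article) to show where the 56.7x comparison comes from, since it refers to chip area rather than transistor count.

```python
# Transistor counts cited above.
wse_transistors = 1.2e12   # Cerebras WSE
epyc2_transistors = 32e9   # AMD Epyc 2
print(f"WSE vs Epyc 2 transistors: {wse_transistors / epyc2_transistors:.1f}x")  # ~37.5x

# Die areas -- assumed from publicly reported figures, not from this article.
wse_area_mm2 = 46225       # Cerebras WSE
gv100_area_mm2 = 815       # Nvidia GV100, the largest GPU die at the time
print(f"WSE vs largest GPU area: {wse_area_mm2 / gv100_area_mm2:.1f}x")  # ~56.7x
```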
What does the future hold for the AI chip market? With so many companies across so many industries vying for a foothold in the market, it’s hard to predict where the next big developments will come from. AI is already a bigger part of our daily lives than most people realize, and its reach is sure to expand as the technology evolves. How quickly will AI proliferate? Only time will tell, but it will definitely be something to keep a watchful eye on. Growth can be measured not only in sales numbers but in the rate at which the ecosystem of software libraries and development kits evolves and expands to support the technology, something we plan to keep track of in future blogs.
The full ResearchAndMarkets report can be found here: https://www.researchandmarkets.com/reports/4801379/north-america-artificial-intelligence-chip-market