
Amazon (NASDAQ: AMZN) recently joined other tech giants such as Apple (NASDAQ: AAPL) and Alphabet (NASDAQ: GOOG) (NASDAQ: GOOGL) in designing its own custom chips. According to a report by The Information, Amazon has begun designing custom artificial intelligence (AI) chips for its Alexa-enabled home speakers. The report also said that Amazon may be looking to extend the chipmaking venture into products for its Amazon Web Services (AWS) data centers. If that happens, it could potentially spell trouble for high-flying AI chipmakers NVIDIA (NASDAQ: NVDA) and Intel (NASDAQ: INTC). Here's how Amazon's ambitions could affect the AI race.

Image source: Getty Images.

First to Alexa

Alexa has been Amazon's main focus this year, as Alexa adoption has surprised even optimistic company executives with its growth. With Apple's recent entry into home speakers with the HomePod, however, Amazon likely feels the need to up Alexa's game in both sound and speed. A custom chip could boost Alexa's inference capabilities: the ability to process external inputs such as speech and make sense of them at the device level. Doing more processing on the device means less information needs to be sent back to the cloud, which means faster response times.

Amazon has developed these chipmaking capabilities over the past few years through both internal hires and acquisitions, buying Annapurna Labs, an Israel-based chipmaker, for $350 million in 2015, and Blink, a maker of low-power security cameras, for $90 million late last year. Many believe Amazon was really after Blink's power-management chipmaking capabilities rather than just its camera products, as Amazon had already rolled out its own Cloud Cam product last year.

Then to the data center?

The report was mostly centered on Alexa, but an AWS data center chip was also mentioned as a possibility.
While Amazon has a big lead in cloud computing, its rivals, especially Alphabet, are investing huge sums to catch up in this all-important race. Alphabet has been investing in custom hardware for years, and recently unveiled its second-generation tensor processing unit (TPU), designed specifically for AI neural networks. According to The New York Times, the TPU effort not only puts Alphabet in a race for technological superiority with makers of graphics processing units (GPUs) like NVIDIA, but may also give Alphabet negotiating leverage over GPU producers. (Google still buys NVIDIA chips, according to The Times. GPUs, originally designed for graphics and gaming, are the leading machine-learning chips today.) In addition, Alphabet has started renting out its TPU computing power to developers using Google's Cloud Platform. That combination of technological innovation and the ability to lower costs is likely what prompted Amazon to start developing its own chips, as it looks to maintain its cloud computing lead against these well-heeled competitors.
