Malaysia to put former Goldman Sachs banker on trial before extradition to US

US-China trade talks weigh down European stocks

CapitaLand says Singapore property not set for ‘big bump’

South Korean capital to invest over $1 billion in fintech and blockchain

Singapore condo resale prices down 0.3% in January

 
TECHNOLOGY TOP STORIES

Amazon’s cloud unit enters the AI chip market

Amazon Web Services said at its AWS re:Invent user conference in Las Vegas that its new Inferentia chips will provide AI researchers “high performance at low cost.”

Amazon’s cloud business is developing its own computer chips for artificial intelligence projects, the latest example of a giant cloud services provider building next-generation processors.

Among public cloud providers, Amazon is following Google into the chip market: Google announced its first Tensor Processing Unit (TPU) in 2016, and China’s Alibaba has also announced an AI chip.

AWS is by far the leader in public cloud infrastructure, which companies rely on to run software and store data remotely. Microsoft, Google, and IBM compete with AWS for business as companies move their workloads from traditional data centers to the cloud. The Inferentia chips will become available in late 2019, and, as with other AWS services, customers will pay based on how much they use.

AI work commonly has two phases: training, in which models are fed large amounts of data, and inference, in which the trained models are shown new data and asked to make predictions. Since 2016, Google has introduced new TPU chips that compete with Nvidia’s processors for training AI models; Inferentia is focused only on inference for now.
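To make the distinction concrete, here is a minimal sketch in PyTorch (one of the frameworks AWS says Inferentia will support). The toy model, data, and settings are purely illustrative and are not tied to any AWS product:

```python
import torch
import torch.nn as nn

# Hypothetical toy model and synthetic data, for illustration only.
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Training phase: feed the model batches of (input, target) data and update its weights.
for _ in range(100):
    x = torch.randn(32, 4)          # batch of 32 examples, 4 features each
    y = x.sum(dim=1, keepdim=True)  # synthetic targets
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Inference phase: show the trained model new data and read off its predictions.
model.eval()
with torch.no_grad():
    new_x = torch.randn(1, 4)
    print(model(new_x))
```

Training is compute-heavy and typically runs on GPUs or TPUs; the inference loop at the end is the kind of workload a dedicated chip like Inferentia targets.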

Amazon said that some inference workloads require an entire graphics processing unit (GPU), which is expensive. “Solving this challenge at low cost requires a dedicated inference chip,” the company said.

Earlier this week, AWS announced ARM-based chips that offer an alternative to traditional computing processors from chipmakers like Intel. Those chips are aimed at low-cost, energy-efficient computing workloads, while the new Inferentia silicon is specialized for AI.

AWS said customers will be able to use Inferentia with Google’s TensorFlow AI software, as well as other AI frameworks such as PyTorch and the ONNX format for converting models.
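ONNX (Open Neural Network Exchange) is an open format for moving trained models between frameworks. As a rough illustration of what “converting models” typically involves, independent of any AWS-specific tooling, a PyTorch model can be exported to an ONNX file along these lines; the model and file name are hypothetical:

```python
import torch
import torch.nn as nn

# Hypothetical trained model; in practice this would be a real, trained network.
model = nn.Linear(4, 1)
model.eval()

# Export to the ONNX format, using an example input to trace the model's graph.
dummy_input = torch.randn(1, 4)
torch.onnx.export(model, dummy_input, "model.onnx")
```

The resulting file can then be loaded by ONNX-compatible runtimes rather than being tied to the framework the model was trained in.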
