
Amazon finds more efficient cloud computing solution

Amazon is the world’s biggest player in cloud computing. Its offering is delivered via Amazon Web Services (AWS), the company’s $27-billion cloud business. AWS provides on-demand cloud computing platforms to individuals, companies and governments on a paid subscription basis.

Amazon is changing the technology behind its cloud services in order to deliver faster performance and to cut costs. The new system is expected to give Amazon a performance-per-dollar advantage.

The ARM-based Graviton processor contains 64-bit Neoverse cores. Speaking with EE News Europe, ARM senior vice president Drew Henry indicated that the Graviton system is based on the Cosmos 16nm processor platform.

In addition, the Israeli-designed Graviton uses the Cortex-A72 64-bit core, which runs at clock frequencies of up to 2.3GHz. Until now, AWS servers have run on Intel and AMD processors. The new system will assist Amazon with scale-out workloads, where users of the service share the load across a group of smaller instances, such as containerized microservices, web servers, development environments, and caching fleets.

There are other advantages for Amazon in the new technology, centered on greater independence from technology providers. The Register notes that Amazon now has the ability to license Arm blueprints via Annapurna, to customize and tweak those designs, and to go to contract manufacturers such as TSMC and GlobalFoundries to have competitive chips made.

Commenting on the new technology, AWS engineer James Hamilton said: “I’ve been interested in ARM server processors for more than a decade, so it’s super exciting to see the AWS Graviton finally public. It’s going to be exciting to see what customers do with the new A1 instances, and I’m already looking forward to follow-on offerings as we continue to listen to customers and enhance the world’s broadest cloud computing instance selection.”

In related news, AWS is building a custom ASIC for AI inference, called Inferentia. This could be capable of scaling from hundreds to thousands of trillions of operations per second and could further reduce the cost of cloud-based inference. This will allow Amazon to compete with its rivals in the cloud computing space. Forbes reports that Google already has its own Tensor Processing Unit (TPU). In addition, Microsoft is using Intel and Xilinx FPGAs to accelerate inference processing.

Written By

Dr. Tim Sandle is Digital Journal's Editor-at-Large for science news. Tim specializes in science, technology, environmental, business, and health journalism. He is additionally a practising microbiologist and an author. He is also interested in history, politics and current affairs.
