
Digital transformation is about “reducing the penalties for failure,” says AWS CEO

Adam Selipsky talks AI and cloud transformation for Amazon Web Services.


Since 2006, AWS has provided companies, startups, and governments with cloud computing platforms. It is known for its high uptime, strong security measures, and “customer obsession.”

But like many other digital transformation shifts of the last 20 years, the rise of AI, including generative AI, is yet another trend that will impact AWS and its services. CEO Adam Selipsky was a recent guest on The Verge’s Decoder podcast, chatting with host Nilay Patel about AI’s inextricable link with the cloud and how Amazon is adapting to it.

Here are some highlights from the interview.

On the efficiency of cloud transformation and “reducing the penalties for failure”

“Pharmaceutical after pharmaceutical will tell you that they have improved and shortened their time to market with using AWS in the cloud model… If you have to buy a lot of CapEx for a big project, spend a lot of money, you’ve spent it. You’re not getting it back. It’s sitting on your premises. You feel like you have to succeed. The penalties for failure become huge…And things take a long time because nobody wants to admit failure.”

“So with the cloud model, you just turn stuff on and you turn stuff off. So what happens is you get rapid experimentation. So when I talk about transformation, it’s not a buzzword. It is about, for example, a specific concept of reducing the penalties for failure.”

“…you increase the ability to innovate, and you actually get more great new breakthrough ideas per person per month than you used to get before. And that’s a cultural change inside of our customers, which they find to be incredibly powerful.”


On AI’s omniscience, Amazon’s expertise, and the AWS approach

“We’ve been doing AI since 1998. Personalization on the Amazon website is AI. We launched in 2017 SageMaker, which is the largest machine learning platform in the world. We have over 100,000 customers doing machine learning on SageMaker. Then, if you want to talk specifically about generative AI and foundation models, Amazon has foundation models that have been running in production for a couple years now. Parts of retail websites search are powered by large language models. And if you look at Alexa, a lot of Alexa’s voice responses are powered by LLM. We’ve got a lot of expertise in this area and kind of pivoting it specifically to generative AI.”

“AI is not this separate thing. It is intrinsically bound up with the cloud. Now, why do I say that? Well, for one thing, you need a data strategy for AI to work for you at all. Whether you’re talking about serving education better, serving financial services clients better, whether you’re talking about drug discovery, whether you’re talking about media, asset creation, you have to know what data you have. You’ve got to know what data you want to take and have that as inputs into your generative AI.”

“We also innovate and design our own silicon, our own chips. We’ve got general-purpose chips, which are already in their third generation, but we also have specific chips for AI and machine learning: Trainium for training models and then Inferentia for running models in production. Those are doing really well, growing quickly. I’m highly confident that they’re going to have the best price performance of any chip technology for doing AI. And that’s going to be incredibly important for startups like Cohere, Anthropic, Stability AI, and Hugging Face, who are building models.”

On balancing sustainability with customer demands

“We have a lot of customers who are consuming GPUs, and tomorrow, we’re going to have a lot more customers who want to consume GPUs… In addition, there are going to be a ton of customers who are going to want the innovation and the energy efficiency and the price performance for their use cases that we’ll have on our Trainium and Inferentia chips. It’s not an ‘or’; it’s an ‘and.’ And we’re committed to providing the choice, and those will both be huge sets of demand, huge sets of use cases for it.”

Listen to the whole podcast episode here.

Written By

Veronica Ott is a freelance writer and digital marketer with a specialization in finance and business. As a CPA with experience in the industry, she's able to provide unique insight into various monetary, financial and economic topics. When Veronica isn't writing, you can find her watching the latest films!
