Sustainable AI + Muppets Underwater

Dr. Ryan Ries here. Right now, while I'm drafting this week's Mission Matrix, temperatures in the Los Angeles area are hitting 100°F — it's a scorcher.

So, as I sit here trying to cool off, I can't help but think about the massive cooling needs of our ever-expanding AI infrastructure.

This week, I want to dive into the hot (pun intended) topic of data center sustainability and the creative solutions being explored.

AI's Cooling Crisis

Unfortunately, pushing the boundaries of AI means we're also pushing the limits of our planet's resources.

Case in point: Arizona.

The state is literally running out of water, and big tech data centers are partly to blame. Google's planned facility in Mesa was set to use up to 4 million gallons of water a day for cooling.

That's a lot of H2O in a desert!
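To put that daily figure in perspective, here's a rough back-of-the-envelope conversion. The ~660,000-gallon Olympic pool volume is a commonly cited approximation, used here purely for scale:

```python
# Rough scale check on the 4-million-gallon/day cooling figure.
DAILY_GALLONS = 4_000_000
OLYMPIC_POOL_GALLONS = 660_000  # ~2,500 m^3, a commonly cited approximation

pools_per_day = DAILY_GALLONS / OLYMPIC_POOL_GALLONS
gallons_per_year = DAILY_GALLONS * 365

print(f"~{pools_per_day:.1f} Olympic pools per day")        # ~6.1 pools/day
print(f"~{gallons_per_year / 1e9:.2f} billion gallons/yr")  # ~1.46 billion gallons/yr
```

Roughly six Olympic pools of water every single day — in a desert.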

This water crisis has led to construction limits around Phoenix, potentially putting the brakes on future data center expansions.

It's a stark reminder that our digital ambitions have very real physical consequences.

Diving Deep for Solutions

But tech companies aren't taking this lying down. Microsoft decided to take the plunge — literally.

Their Project Natick experiment sank data centers to the bottom of the ocean, using the natural cooling properties of seawater.

These underwater data centers had one-eighth the failure rate of their land-based counterparts.

While Microsoft has since surfaced these underwater data centers, the experiment provided valuable insights into alternative cooling methods and energy efficiency.

They're now exploring how to apply these learnings to improve data centers on dry land.

Small Models, Big Impact

Another approach gaining traction is the development of smaller, more efficient language models.

Instead of racks of power-hungry GPUs, these small language models (SLMs) can run on lower-power hardware.

It's like switching from a gas-guzzling SUV to an electric compact car — you might sacrifice a bit of power, but the efficiency gains are substantial.

This shift towards SLMs isn't just about energy savings, though. 

It's about making AI more accessible and deployable in a wider range of scenarios. 

Plus, it aligns perfectly with the growing demand for edge computing and on-device AI.
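The efficiency gap is easy to see with some napkin math on model weights alone. The model sizes and precisions below are illustrative assumptions, not tied to any specific product, and the estimate ignores activations, KV cache, and runtime overhead:

```python
# Back-of-the-envelope memory needed just to hold model weights.
def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A hypothetical 70B-parameter LLM in fp16 (2 bytes/param)
# vs. a 3B-parameter SLM quantized to int4 (0.5 bytes/param).
llm_gb = weight_memory_gb(70, 2)
slm_gb = weight_memory_gb(3, 0.5)

print(f"70B fp16 LLM: ~{llm_gb:.0f} GB")  # ~130 GB: multi-GPU territory
print(f"3B int4 SLM: ~{slm_gb:.1f} GB")   # ~1.4 GB: fits on a laptop or phone
```

That two-orders-of-magnitude difference in memory is a big part of why SLMs open the door to edge and on-device deployment.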

The Road Ahead

The innovations around AI have been super exciting. 

But we need to make sure sustainability isn't an afterthought — it needs to be baked into every decision we make. 

Whether it's exploring different cooling solutions, optimizing our models for efficiency, or rethinking where we place our data centers, the tech industry has a responsibility to lead the charge in sustainable innovation.

What do you think? Are underwater data centers the way of the future? Should we be focusing more on SLMs? Or do you have an out-of-the-box idea up your sleeve? 

Drop me a line — I'm always curious to hear everyone’s thoughts.

Until next time, stay cool (literally and figuratively).

Ryan

PS. Na Yu and Jonathan LaCour are hosting our next generative AI Ask Me Anything event on September 14th. Will I see you there? Bring all your questions!

Now, time for this week’s AI-generated image and the prompt I used to generate it.


"Generate an image of a group of muppets creating an underwater data center. The muppets are confused. There is a shark watching in the distance."
