Google Cloud is leaning on all its strengths to support enterprise AI

As the sun sets on Google Cloud Next 2025, customers will be walking away with a promise ringing in their ears: you tell us how, and we’ll deliver.

Throughout this year’s event, held at the Mandalay Bay hotel in Las Vegas, Google Cloud made repeated promises to offer customers choice when it comes to cloud AI adoption.

In my pre-conference analysis, I argued that Google Cloud could call this year’s event a success if it presented a cohesive message and a convincing reason why GCP should be any business’ number one choice for AI in the cloud.

On both counts, it has very much accomplished its mission.

The company’s big sales pitch at Google Cloud Next 2025 has been simple: it provides the infrastructure and you choose the models, an arrangement underlined by everything it says is special about its own cloud offerings.

This arrangement also has the added benefit of slowly, dexterously removing Google Cloud from the competitive mudslinging of AI model development. True, Sundar Pichai spent some time onstage reeling off praise for Gemini 2.5 Pro and Gemini 2.5 Flash, but the firm’s focus has been on how its infrastructure and networking make any of the models it offers competitive, not just its own.

Is this moving the goalposts? If anything, it’s closer to changing the game altogether.

That’s not to say Google Cloud went small with its announcements this year – the company unveiled a slew of new products and services, as covered in my live blog. It’s simply that instead of taking shots at the likes of AWS and Azure, Google Cloud has gestured to everything that helps it deliver powerful, low-latency AI for customers.

Crucially, it’s repeatedly affirmed its close work with partners to make its tools as interoperable as possible and to give its AI agents access to valuable data from across different platforms. The concept of cloud choice permeated this year’s event, with a consistent story of multi-cloud collaboration underpinning every announcement.

The new way to cloud

In an era of U-turns and hasty reactions to changes in the market, Google Cloud is clearly unafraid to stick with its core approach.

“Our entire focus with cloud computing, including AI, has been simplifying technology, to make it easier for people to use,” Thomas Kurian, CEO at Google Cloud, told assembled press at the conference.

He explained that while in the past businesses had to build data centers to adopt new technologies, from procuring power and buying equipment and storage to hiring people to operate them, cloud and AI have simplified the whole process.

“AI is really about simplifying technology, it’s about changing the interface that humans have with compute systems, so that you don’t need, for example, to be an expert in programming. You can ask in English and it will generate code for you.”

Retaining last year’s slogan, ‘the new way to cloud’, helped to show how far the company has progressed in the past 12 months.

Though Google Cloud Next 2024 was a success for the firm, the ‘new’ in its way to cloud referred mainly to all the AI goodies it was packing into its existing offerings, and less to how it was changing the fundamentals of its infrastructure and network to build a better cloud for the future.

This year, ‘new way to cloud’ felt like more of a concrete offering to customers: a promise that if you’re looking to leverage AI, Google Cloud will help you deploy it at the scale and detail you need.

Not only has Google Cloud worked to deliver on these promises – it’s also made a convincing case for its unique capacity to meet these needs.

Take Cloud WAN, which opens its backbone network – all two million miles of fiber and 33 subsea cables that have been used to keep services such as Gmail and YouTube running – for enterprise use. This is Google infrastructure, built over many years, and on a much larger scale than that of its direct rivals.

There’s also the firm’s new Ironwood TPU, which it has tailored to meet customer training and inference needs on an enormous scale. It shows sizable performance and efficiency improvements over its predecessors.

Finally, Google Cloud shows a willingness to converge all the data and software at its disposal to improve the effectiveness of its products and reduce cloud complexity, as seen in Google Unified Security. Central to this new approach is Google’s more open attitude toward third-party apps and AI models.

Google Cloud CEO Thomas Kurian speaks to journalists at Google Cloud Next 2025

(Image credit: Future)

While the hyperscaler will naturally claim its Gemini models are the best in the world, it also recognizes that when it comes to practices such as vibe coding, this can be a highly subjective matter. Developers may well favor one model over another for a range of reasons, and Google Cloud has affirmed its commitment to making a wide selection available in the Model Garden on Vertex AI.

To date, Google Cloud has teed up customer choice within its range of internally developed LLMs, from Gemma 3, the firm’s most lightweight, open models, through to the variously sized Gemini models.

“I think this is a matter of finding the right tool for the right purpose and it’s not enough to have a hammer,” said Wayne Kurtzman, Research VP, Collaboration and Communities at IDC.

“It’s nice to have a set of hammers, a set of screwdrivers, and a set of power tools. And I think we’re moving into that direction so we can utilize the right technology for the right purpose and that results in a better experience.”

In the developer keynote, attendees were shown how someone using the new Agent Development Kit (ADK) could build an agent with either Gemini or third-party models from within Vertex AI, such as Meta’s Llama 4 or Anthropic’s Claude 3.7 Sonnet.
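
In practice, the pitch translates to something like the Python sketch below, modeled loosely on the ADK quickstarts shown at launch. Treat it as illustrative only: the module paths, parameter names, model identifiers, and the toy get_weather tool are assumptions rather than confirmed API details.

```python
# Illustrative sketch only: assumes the google-adk package and its LiteLlm
# wrapper; module paths and model identifiers may differ in practice.
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm


def get_weather(city: str) -> str:
    """Toy tool the agent can call; stands in for a real data source."""
    return f"It is sunny in {city}."


# The same agent definition, backed by a Google-hosted Gemini model...
gemini_agent = Agent(
    name="weather_agent_gemini",
    model="gemini-2.5-flash",  # assumed model ID on Vertex AI
    instruction="Answer weather questions using the get_weather tool.",
    tools=[get_weather],
)

# ...or by a third-party model routed through LiteLLM, with no other changes.
claude_agent = Agent(
    name="weather_agent_claude",
    model=LiteLlm(model="anthropic/claude-3-7-sonnet-20250219"),  # assumed ID
    instruction="Answer weather questions using the get_weather tool.",
    tools=[get_weather],
)
```

The point of the demo was less the code itself than the design choice it reflects: the agent definition stays the same, and only the model reference changes.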

It’s clear Google Cloud is keen to show it’s working hand in glove with partners to make AI easier to use and adopt. Agentspace, Google’s AI-powered knowledge engine for enterprise, can now connect directly to third-party data and AI tools while the new Agent2Agent protocol will allow Gemini AI agents to communicate with third-party agents a firm may already use.
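
For a rough sense of what that agent-to-agent communication could look like on the wire, the sketch below follows the protocol’s early public description: a remote agent publishes a JSON “agent card” at a well-known URL and accepts JSON-RPC task requests over HTTP. The partner-agent URL, the tasks/send method name, and the payload fields are illustrative assumptions, not confirmed details of the final specification.

```python
# Rough illustration of an A2A-style exchange; endpoint path, method name,
# and payload fields are assumptions based on early descriptions of the protocol.
import uuid
import requests

REMOTE_AGENT = "https://partner-agent.example.com"  # hypothetical third-party agent

# 1. Discover the remote agent's capabilities via its published agent card.
card = requests.get(f"{REMOTE_AGENT}/.well-known/agent.json", timeout=10).json()
print("Remote agent:", card.get("name"), "-", card.get("description"))

# 2. Hand it a task as a JSON-RPC request.
task_request = {
    "jsonrpc": "2.0",
    "id": str(uuid.uuid4()),
    "method": "tasks/send",  # assumed method name
    "params": {
        "id": str(uuid.uuid4()),  # task ID
        "message": {
            "role": "user",
            "parts": [{"type": "text", "text": "Summarise open support tickets."}],
        },
    },
}
response = requests.post(REMOTE_AGENT, json=task_request, timeout=30).json()
print(response.get("result"))
```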

Alongside ease of adoption, Google Cloud has repeatedly stressed its commitment to ease of use for AI features in Google Workspace. Now available on every business subscription, the Gemini AI tools found in apps such as Docs, Sheets, and Slides are intended to be intuitive and easy to use.

“A design principle we have is, how do we make AI super accessible, really easy to discover, and bring it into the flow of work?” says Kristina Behr, VP for Product Management, Collaboration Apps, Google Workspace at Google Cloud.

“So you don’t need to switch tabs or go to AI to get help with creativity and productivity, but how do we bring AI to you, wherever you are?”

Lingering questions remain

There are still some areas where Google Cloud is light on detail, however – namely sustainability.

Throughout the event, executives stated that AI and similar technologies will help combat the climate crisis and are therefore worth pursuing, while avoiding specific answers on the impact that liquid-cooled data center hardware has on water-stressed regions.

Mark Lohmeyer, VP/GM, AI & Computing Infrastructure at Google Cloud, told ITPro that Google has been an “innovator” in the liquid-cooled infrastructure space, having used the approach since its fourth-generation TPUs. He stated that this is fortunate as it has unlocked new opportunities in the GPU and TPU space, thereby reducing overall energy consumption.

For now, Google Cloud can continue to give its line on efficiency and gesture to its record renewable energy investments. But within this decade, it and its hyperscale rivals will face increasingly intense questioning over fresh water consumption – and will have to prepare more convincing long-term arguments to garner trust.

Other big questions hang over the firm. It’s unclear just how much compute Google Cloud’s customers will actually end up needing in the long run, for example.

Speak to anyone on the ground and it’s clear that with the likes of its new Ironwood TPU and network upgrades, the hyperscaler is readying itself for eye-watering demand for AI inference in the coming months and years.

Whether this materializes is largely dependent on whether AI agents become as widely used as their developers hope. Google Cloud has helped itself along in this regard by beginning to embed AI tools in all of its offerings by default, hoping that by making them easier to connect with one another, businesses will find genuine convenience and value in them.

Economic uncertainty

Tariffs imposed by the Trump administration, which IDC has said will impact tech spending, loomed large over Google Cloud Next 2025. Though Sundar Pichai took to the keynote stage on day one to confirm $75 billion in capital expenditure for 2025, much of it directed at AI, questions still remain over exactly how Google will be affected by global trade disruptions.

Responding to a question on the tariffs and how potential European service tariffs could harm Google, Kurian described the situation as “extremely dynamic”. He drew a direct comparison between this trade cycle, the 2008 financial crisis, and the economic impacts of the pandemic.

Regardless of Google Cloud’s strategy, it’s clear it will have to navigate changing market sentiment amid these enormous economic shifts.

“We are very well positioned, both because of the breadth of the portfolio but also because of differentiation in what we’re doing with AI,” Kurian told assembled media.

It’s in the breadth, flexibility, and openness it is already showing that Google Cloud has placed its hopes for the coming year. Customers and partners will have every reason to trust in the direction the company is headed.

Google Cloud is making a convincing argument, for buying in is less a leap of faith and more an investment in a tested, unified platform and its bid for AI dominance. Success in 2025 and beyond will require it to stick to its commitments, maintain transparency, and keep its eye firmly on the challenges of the decade to come.

