Google chief scientist Jeff Dean: AI needs ‘algorithmic breakthroughs,’ and AI isn’t to blame for most of the data center emissions increase


Google sent a jolt of unease into the climate change debate this month when it disclosed that emissions from its data centers rose 13% in 2023, citing the “AI transition” in its annual environmental report. But according to Jeff Dean, Google’s chief scientist, the report doesn’t tell the full story and gives AI more than its fair share of blame.

Dean, who is chief scientist at both Google DeepMind and Google Research, said that Google is not backing off its commitment to be powered by 100% clean energy by the end of 2030. But, he said, that progress is “not necessarily a linear thing” because some of Google’s work with clean energy providers will not come online for several years.

“Those things will provide significant jumps in the percentage of our energy that is carbon-free energy, but we also want to focus on making our systems as efficient as possible,” Dean said at Fortune’s Brainstorm Tech conference on Tuesday, in an onstage interview with Fortune’s AI editor Jeremy Kahn.

Dean went on to make the larger point that AI is not as responsible for increasing data center usage, and thus carbon emissions, as critics make it out to be.

“There’s been a lot of focus on the increasing energy usage of AI, and from a very small base that usage is definitely increasing,” Dean said. “But I think people often conflate that with overall data center usage — of which AI is a very small portion right now but growing fast — and then attribute the growth rate of AI-based computing to the overall data center usage.”
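Dean’s point is essentially arithmetic: a segment growing quickly from a small base moves the total far less than its own growth rate suggests. As a rough sketch with purely hypothetical numbers (not figures from Google or its environmental report), suppose AI accounts for 10% of data center energy and grows 50% in a year while everything else grows 5%:

```python
# Hypothetical illustration of Dean's point: none of these numbers come
# from Google; they only show how a small, fast-growing slice affects
# the total.
ai_share = 0.10       # assumed share of data center energy used by AI
ai_growth = 0.50      # assumed year-over-year growth of AI energy use
other_growth = 0.05   # assumed growth of all other data center energy use

# Total growth is the share-weighted average of the two growth rates.
total_growth = ai_share * ai_growth + (1 - ai_share) * other_growth
print(f"Total data center energy growth: {total_growth:.1%}")  # 9.5%
```

Under those assumptions, total usage grows about 9.5%, nowhere near AI’s own 50% rate: attributing AI’s growth rate to the whole data center fleet, as Dean suggests people do, overstates its impact.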

Dean said that it’s important to examine “all the data” and the “true trends that underlie this,” though he did not elaborate on what those trends were.

One of Google’s earliest employees, Dean joined the company in 1999 and is credited with being one of the key people who transformed its early internet search engine into a powerful system capable of indexing the internet and reliably serving billions of users. Dean cofounded the Google Brain project in 2011, spearheading the company’s efforts to become a leader in AI. Last year, Alphabet merged Google Brain with DeepMind, the AI company Google acquired in 2014, and made Dean chief scientist reporting directly to CEO Sundar Pichai.

By combining the two teams, Dean said that the company has “a better set of ideas to build on,” and can “pool the compute so that we focus on training one large-scale effort like Gemini rather than multiple fragmented efforts.”

Algorithmic breakthroughs needed

Dean also responded to a question about the status of Google’s Project Astra, a research project that DeepMind leader Demis Hassabis unveiled in May at Google I/O, the company’s annual developer conference. Hassabis described Astra as a “universal AI agent” that can understand the context of a user’s environment; a video demonstration showed how users could point their phone camera at nearby objects and ask the AI agent relevant questions such as “What neighborhood am I in?” or “Did you see where I left my glasses?”

At the time, the company said the Astra technology would come to the Gemini app later this year. But Dean put it more conservatively: “We’re hoping to have something out into the hands of test users by the end of the year,” he said.

“The ability to combine Gemini models with models that actually have agency and can perceive the world around you in a multimodal way is going to be quite powerful,” Dean said. “We’re obviously approaching this responsibly, so we want to make sure that the technology is ready and that it doesn’t have unforeseen consequences, which is why we’ll roll it out first to a smaller set of initial test users.” 

As for the continued evolution of AI models, Dean noted that additional data and computing power alone will not suffice. A couple more generations of scaling will get us considerably farther, Dean said, but eventually there will be a need for “some additional algorithmic breakthroughs.”

Dean said his team has long focused on ways to combine scaling with algorithmic approaches in order to improve factuality and reasoning capabilities, so that “the model can imagine plausible outputs and reason its way through which one makes the most sense.”

Those kinds of advances, Dean said, will be important “to really make these models robust and more reliable than they already are.”

Read more coverage from Brainstorm Tech 2024:

Wiz CEO says ‘consolidation in the security market is truly a necessity’ as reports swirl of $23 billion Google acquisition

Why Grindr’s CEO believes ‘synthetic employees’ are about to unleash a brutal talent war for tech startups

Experts worry that a U.S.-China cold war could turn hot: ‘Everyone’s waiting for the shoe to drop in Asia’


