AI Agents Will Work with AI Agents for Chip Design in 2025

By Nitin Dahad
From EE Times

Last week, Synopsys' lead on AI technology strategy, Stelios Diamantidis, stated that AI will start collaborating with AI in 2025, ushering in the next phase of AI deployment. He said that AI agents, which started out as simple bots performing rudimentary tasks using predefined rules and decision trees, have evolved into sophisticated systems that can understand human language, generate content, and continuously learn and adapt their behavior accordingly.

These agents may be built for specific use cases and isolated within certain applications, but that will soon change, as one AI agent could conceivably collaborate with another. In a blog post, Diamantidis added that AI agents are being trained for greater integration and collaboration, including in chip design.

To highlight their own internal use of this, Synopsys told EE Times, "Based on results from pilot programs, Synopsys internal GenAI applications are expected to yield 250,000 hours of employee capacity this coming year, freeing our teams to focus more of their time on high-value activities for our customers."

In his blog, Diamantidis added, "Highly specialized AI agents could combine and analyze incalculable amounts of information spanning software workloads, architecture, data flow, timing, power, parasitics, manufacturing rules and other parameters. This AI-to-AI collaboration would help identify unseen patterns and correlations, develop new solutions for longstanding problems, and provide detailed recommendations for optimizing chip design and performance."

To get more perspective on this, and on the background to Synopsys' work in AI for chip design, we sat down with Diamantidis at the company's headquarters in Mountain View, Calif. You can watch the video interview with him below.

AI is not just about compute: it is also about power efficiency

While we all get excited about AI and generative AI, we also constantly hear about AI's huge energy demands. At several conferences over the last two years, I have heard Mark Papermaster, CTO of AMD, warn that the explosive growth of AI could soon outstrip the energy available to power it, and there is constant talk of getting more compute from less energy to address the issue.

There are, of course, many companies innovating here, ranging from improved AI compute, interconnect and memory architectures to entirely new approaches such as compute-in-memory (for example, the recently emerged startup Sagence AI, whose analog in-memory compute targets more efficient AI inferencing).

William Ruby, a senior director at Synopsys focused on power analysis, told EE Times, "We need to make power and energy efficiency one of the primary considerations when you start looking at the architecture [in chip design]." Ruby has extensive experience in low-power IC design and design methodology.

We sat down with Ruby at Synopsys' headquarters in Mountain View, Calif., earlier this month to chat about the impact of AI on the need for power efficiency and how Synopsys is helping to address that requirement.
