SOC Design

Wednesday, March 23, 2005

EDA's Malaise

EETimes' managing editor for design automation, Richard Goering, has just written an opinion piece decrying the problems in the EDA industry (see "Addressing EDA's malaise"). Richard does a good job of summarizing some of the big problems our industry faces:

... when you hear the word EDA, what comes to mind? Failed companies. Flat revenues. Endless lawsuits. Marketing hype. Depressed market values. Vision and excitement seem in short supply, and EDA's reputation among investors and customers is lukewarm at best.

and,

...the EDA industry has limited itself by focusing its efforts on digital ASIC and system-on-chip design — a discipline that is just one small part of the challenging job of getting products into customers' hands. At 90 nanometers and below, fewer and fewer companies are even going to try chip design. There aren't all that many custom-chip designers in the world. But there are legions of FPGA, pc-board and embedded-software developers out there, and they have requirements that remain unmet.

Richard then outlines what he sees as one way to address the problem:

The EDA industry needs to encompass more of the design flow and get out in front of a much larger group of users.


My esteemed colleague is absolutely correct. I am appalled at the conventional EDA industry's focus on synthesis and post-synthesis tools at the expense of system-level design tool development. Last year, at the first International SOC conference in Newport Beach, I asked a panel of representatives from EDA tool vendors if it bothered them that all of the SOC designs shown in presentations at the conference had block diagrams that resembled the computer systems I'd designed 20 years ago. Didn't they think that system design had advanced more than that in two decades?

I got blank stares and the reply, "Why are you asking us that question?" Sort of highlights one of the EDA industry's problems, doesn't it? (By the way, EETimes' editor Ron Wilson was there and he clearly heard me ask that question. Just look at the intro paragraphs in the article he wrote this week on on-chip interconnects.)

Here's Richard Goering's succinct summary of this issue from his opinion piece:


There hasn't really been a major methodology shift since the move to RTL in the early 1990s.

Yes, that's it in a nutshell. We've been designing the same systems at the same abstraction level for 15 years. In my opinion, based on the pace of development in the electronics industry over the past 60 years, that's about 5 years too long. We need a good revolution in design techniques every decade or so to keep the industry vital, and we've been lucky enough to get revolutions at that pace. Until now.

I believe that what's needed, and what has always happened in our industry before, is for designers to move up a level of abstraction. In the late 1960s, engineers stopped designing digital systems transistor by transistor and started using SSI and MSI integrated circuits. In the 1970s, custom ICs started to replace standard parts in high-volume products (such as calculators), and those chips were designed polygon by polygon on Calma systems. By the 1980s, digital-IC designers had moved up to gate-level design on Daisy, Mentor, and Valid CAD systems. In the 1990s, Verilog and VHDL, logic synthesis, and Synopsys became ascendant.

Today, designers are still developing RTL blocks by hand using Verilog and VHDL, but the number of gates on the chips they're designing has increased by a couple of orders of magnitude. We're drowning in gates and we're smothering in verification! The low-level details of logic design are suffocating the industry.

What's needed is a new approach to system design that leaves lower-level block design to automated tools that create pre-verified, correct-by-construction RTL blocks. There just aren't enough engineers in the world (even with the legions in India and China) to manually design and verify all of the RTL that today's nanometer SOCs can absorb. With the right synthesis tools, these same block generators can crank out logic just as suitable for the big FPGAs with their acres of logic cells.
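To make the idea concrete, here's a minimal sketch in Python of what such a block generator might look like. This is purely illustrative; the function names, module names, and parameters are my own invention, not any shipping tool's API. The point is that one RTL template gets verified once, and a tool then stamps it out at whatever parameters the system architect asks for, so no engineer writes or re-verifies that Verilog by hand:

```python
#!/usr/bin/env python3
# A toy, illustrative "block generator": it emits parameterized Verilog RTL
# from a template that (in a real tool) would be pre-verified once, so every
# generated instance is correct by construction. All names are hypothetical.

def generate_counter(name: str, width: int) -> str:
    """Emit a synchronous up-counter of the requested bit width."""
    return f"""\
module {name} #(parameter WIDTH = {width}) (
    input  wire              clk,
    input  wire              rst_n,    // active-low async reset
    input  wire              enable,
    output reg  [WIDTH-1:0]  count
);
    always @(posedge clk or negedge rst_n) begin
        if (!rst_n)
            count <= {{WIDTH{{1'b0}}}};  // replication: WIDTH zero bits
        else if (enable)
            count <= count + 1'b1;
    end
endmodule
"""

if __name__ == "__main__":
    # Stamp out the same verified template at whatever widths the system needs.
    for width in (8, 16, 32):
        rtl = generate_counter(f"counter_{width}", width)
        with open(f"counter_{width}.v", "w") as fh:
            fh.write(rtl)
```

The toy counter isn't the point; the leverage is. The template is verified once, every generated instance inherits that verification, and the same generated logic can target an ASIC flow or an FPGA's logic cells, which is exactly what hand-coded RTL can't scale to deliver.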

It's time to leave manual logic design behind and concentrate on new and interesting system architectures with truly high-performance capabilities. As Richard Goering concludes:

Rather than just another tool, what's needed are solutions to broad-ranging problems.

I concur. What's needed is for the industry to break out of its narrow focus on RTL design and the straitjacket of the simple, antiquated system architectures of the 1980s, and to develop 21st-century architectures that address today's application problems.
