SOC Design

Tuesday, March 29, 2005

Evil Tech

Designers don't always put their talents to good use. Case in point: Gauri Nanda, a research associate at the MIT Media Lab, and her invention, Clocky, an autonomous mobile alarm clock. When it's time to wake up, Clocky's buzzer goes off like any self-respecting alarm clock. However, to prevent you from hitting the snooze button, rolling over, and going back to sleep, Clocky jumps off your night table and hides, while still beeping incessantly.

Clocky is upholstered in shag carpeting so it looks a bit like a Duraflame firelog transformed into a brown Chia pet that's sprouted a pair of toy plastic wheels. The idea behind this evil invention is that by the time you locate and silence Clocky, you're no longer sleepy enough to fall back into bed for more shuteye. Clocky has enough microprocessor-based intelligence to find a different hidey-hole each day so your morning wakeup ritual doesn't become too routine.

Here's to the MIT Media Lab, working on systems to make your life better in the 21st century.

Monday, March 28, 2005

Scrambled Flash

I killed my Sony DVD R/RW drive last Friday. I had just downloaded and was applying a factory-supplied firmware upgrade that would turn my double-speed drive into a quad-speed demon. However, the Flash updater, running under Windows XP, died in the middle of the operation. (Windows crashed! What a surprise!) The result: scrambled Flash and a dead drive. A little Googling established that the drive was most likely dead. Permanently. A help email to Sony is as yet unanswered, so today I replaced the drive with a Plextor PX-716A from Surplus Computers, which is conveniently located just down the street. Works like a champ, although the external drive box now looks funny, with Sony silkscreened on the top and Plextor silkscreened on the front.

The developers of the Sony drive who created a design that can be permanently disabled by an aborted Flash update need some remedial system-design training. An official download from the Sony support Web site applied by a reasonably competent operator (my engineering skills aren't all that decrepit) should not be able to destroy the drive. Yet that's what happened.

This episode holds an important lesson for all system designers: design for contingencies and expect Murphy to visit. The lesson is no less true for SOC designers than for board-level designers. Expect the unexpected and don't let the fates turn your design into junk.

Thursday, March 24, 2005

Wizards of Greater Oz

To my mind, Ray Bolger's role as the scarecrow in The Wizard of Oz stands out more than any other performance (even Judy Garland's). His song and dance rendition of "If I only had a brain" is a real high point in the movie for me. More to the point, the scarecrow, Dorothy, the Tin Man, and the Cowardly Lion all journey to Oz's Emerald City to ask The Wizard for things they think they need to complete their lives (brain, way home, heart, and courage).

All of us in the high-tech industry are wizards of greater Oz. We labor to produce artifacts that we think will help complete the lives of the people on this planet (light, heat, food, transportation, entertainment, etc.). To do this, we've increasingly relied on processors, particularly microprocessors, to provide the "brains" of the electronic gadgets we develop. As the head of The Microprocessor Report, my colleagues and I often wrote about new and more "powerful" microprocessors that various companies developed to provide ever-increasing abilities to the new products of the tech industry.

The word "powerful" applied to even today's microprocessors is laughable. My cat's brain has more processing power than Intel's finest von Neumann machine and it doesn't need upwards of 100W to operate either. As an algorithmic-processing, image-recognizing, rule-generating device, the brains of most animals far outclass and outdistance today's simple silicon wonders.

What leads me to this topic today is Dean Takahashi's article in the March 24 San Jose Mercury News about a small company in Menlo Park run by former PDA gurus Jeff Hawkins and Donna Dubinsky ("Numenta works to develop brain-like computing," requires free subscription). Hawkins has been funding brain research at the Redwood Neuroscience Institute in Menlo Park for several years and wrote a book last year titled On Intelligence detailing his theories on how the human brain works. Essentially, Hawkins believes that the brain works like a huge, deeply hierarchical memory. Each stage of the hierarchy is responsible for part of the decision making.

Hawkins' company Numenta intends to move these theories into practice. To that end, there's a software simulation already available. The products are in the future, to be harnessed by us Wizards.

Wednesday, March 23, 2005

EDA's Malaise

EETimes' managing editor of design automation Richard Goering has just written an opinion piece decrying the problems in the EDA industry (see "Addressing EDA's malaise"). Richard does a good job of summarizing some big problems our industry faces:

... when you hear the word EDA, what comes to mind? Failed companies. Flat revenues. Endless lawsuits. Marketing hype. Depressed market values. Vision and excitement seem in short supply, and EDA's reputation among investors and customers is lukewarm at best.

and,

...the EDA industry has limited itself by focusing its efforts on digital ASIC and system-on-chip design — a discipline that is just one small part of the challenging job of getting products into customers' hands. At 90 nanometers and below, fewer and fewer companies are even going to try chip design. There aren't all that many custom-chip designers in the world. But there are legions of FPGA, pc-board and embedded-software developers out there, and they have requirements that remain unmet.

Richard then outlines what he sees as one way to address the problem:

The EDA industry needs to encompass more of the design flow and get out in front of a much larger group of users.


My esteemed colleague is absolutely correct. I am appalled at the conventional EDA industry's focus on synthesis and post-synthesis tools at the expense of system-level design tool development. Last year at the first International SOC conference in Newport Beach, I asked a panel of representatives from EDA tool vendors if it bothered them that all of the SOC designs shown in the conference presentations had block diagrams that resembled the computer systems I'd designed 20 years ago. Didn't they think that system design had advanced more than that in two decades?

I got blank stares and the reply, "Why are you asking us that question?" Sort of highlights one of the EDA industry's problems, doesn't it? (By the way, EETimes' editor Ron Wilson was there and he clearly heard me ask that question. Just look at the intro paragraphs in the article he wrote this week on on-chip interconnects.)

Here's Richard Goering's succinct summary of this issue from his opinion piece:


There hasn't really been a major methodology shift since the move to RTL in the early 1990s.

Yes, that's it in a nutshell. We've been designing the same systems at the same abstraction level for 15 years. In my opinion, based on the pace of development in the electronics industry over the past 60 years, that's about 5 years too long. We need a good revolution in design techniques every decade or so to keep the industry vital, and we've been lucky enough to get that sort of revolutionary pace. Until now.

I believe what's needed, what's always happened to our industry in the past, is for designers to move up a level of abstraction. In the late 1960s, engineers stopped designing digital systems transistor by transistor and started using SSI and MSI integrated circuits. In the 1970s, custom ICs started to replace standard parts in high-volume products (such as calculators) and these chips were designed polygon by polygon on Calma systems. By the 1980s, digital-IC designers had moved up to gate-level design on Daisy, Mentor, and Valid CAD systems. In the 1990s, Verilog and VHDL, logic synthesis, and Synopsys became ascendant.

Today, designers are still developing RTL blocks by hand using Verilog and VHDL but the number of gates on the chips they're designing has increased by a couple of orders of magnitude. We're drowning in gates and we're smothering in verification! The low-level details of logic design are suffocating the industry.

What's needed is a new approach to system design that leaves lower-level block design to automated tools that create pre-verified, correct-by-construction RTL blocks. There just aren't enough engineers in the world (even with the legions in India and China) to manually design and verify all of the RTL today's nanometer SOCs can absorb. With the right synthesis tools, these same block-generating tools can crank out logic just as suitable for the big FPGAs with their acres of logic cells.

It's time to leave manual logic design behind and concentrate on new and interesting system architectures with truly high-performance abilities. As Richard Goering concludes:

Rather than just another tool, what's needed are solutions to broad-ranging problems.

I concur. What's needed is for the industry to break out of its narrow focus on RTL design and the straitjacket of the simple, antiquated system architectures of the 1980s and to develop 21st-century architectures that address today's application problems.

Red Herring Profiles Tensilica, Inc.

Here at Tensilica, we’re thrilled that a publication as prestigious as The Red Herring chose to profile our company and CEO Chris Rowen in its March 21 issue. The 1-page article (plus a second page with a really big photo of Chris) does what it can to convey the arcane world of configurable, extensible processors and SOC design with limited space and much to say about the business of chip development in the 21st century. In the name of journalistic integrity and balance, the article quotes people who work at a couple of Tensilica’s competitors.

Unfortunately, our competition understandably doesn’t share the Herring’s journalistic sense of fair play and they used the opportunity to take some cheap shots. Imagine! Although Tensilica would never officially address the shortcomings of the competitors quoted by The Red Herring, through the wonder of the blogosphere and my personal blog it’s now a lot easier for me to return those underhanded volleys.

The Red Herring article quotes MIPS’ director of product strategy Tom Peterson. I’ve been on conference panels with Tom in the past and he’s a wily (as in cartoon character Wile E. Coyote) marketer. I really respect Tom’s ability to sling FUD; I think he’s one of the best. In the article, Tom is quoted as saying that “configurable cores are not as good a fit for products that run lots of software.” Now that’s just silly. The MIPS processors that Tom’s trying to flog are 32-bit RISC processors that can’t be configured.

Tensilica’s Xtensa processor cores are also 32-bit RISC processors. They can do everything a MIPS processor can do except run crusty old code compiled specifically for MIPS processors that can’t be updated because the original programmers have left or died off. Xtensa processors can also be configured so that they run specific target code faster…a lot faster. Configuring and extending an Xtensa processor core doesn’t impair its ability to run any sort of software, though that’s precisely the impression Mr. Peterson would like to leave while he valiantly tries to distract the reader from realizing that MIPS cores are just too slow for most of the heavy lifting on today’s SOCs. Tom’s truly a wily guy. It’s too bad his processors are so big and slow.

Then the Herring article weighs in with a quote from Carl Schlachte, ARC International’s fifth CEO in four years. However, Mr. Schlachte’s approach to discussing the world of configurable processor cores closely resembles his many predecessors’—he wraps a lie inside a truth, sort of the way a Tootsie Pop hides a soft, chewy center inside a hard candy shell. Schlachte’s quote is: “We have been at this the longest. We can do it better than those guys.” It’s true that ARC has had a configurable core longer than anyone. The company developed a simple configurable processor core back when it was called Argonaut Software, a video-game developer. That core was useful for developing new video games, but the configuration tools were clearly never meant to be used outside of the company. Except for a few configuration options, any application-specific extensions made to an ARC core must be built by hand and manually verified.

Being first doesn’t mean being best. Tensilica has elevated automatic processor core extension to a fine art, especially with last year’s introduction of the XPRES Compiler, which automatically analyzes C or C++ code and then generates optimized processor extensions to accelerate the execution of that target code.

If you look at where Tensilica’s processors are being used in products today, you’ll see the target applications all require significant data processing: image and video compression/decompression, audio processing, and network packet processing. The Xtensa processor also runs operating systems and has displaced MIPS processors in that role. SOC developers see the Tensilica approach as a very fast way to develop the high-performance processing blocks they need for their SOC designs. On average, Tensilica’s customers put six processor cores on each of their SOC designs and some put a lot more (like a couple hundred). Tensilica isn’t so much competing with ARM, MIPS, or ARC as it is fundamentally changing the way SOCs are designed in the 21st century.

Monday, March 21, 2005

Clive "Bebop" Maxfield rides again

As the Editor in Chief of EDN in the mid 1990s, I had the privilege of publishing some of Clive Maxfield's earlier writing under the bombastic title of Designus Maximus. It was an appropriate name for Clive's expansive personality. Clive has published several books over the years, which you can find on Amazon, but he started a new role today as a biweekly columnist for John Miklosz's SOC Central Web site. In this first column, Clive does a very respectable job of discussing how much SOC design has changed in just the last six years. We've gone from dipping our toes in deep-submicron design issues to neck deep, and the water's rising. All of these issues portend great changes in the way we'll be designing SOCs in the coming years.

Saturday, March 19, 2005

Shoe Biz

Adidas, the German shoe manufacturer, has just rolled out a $250 pair of athletic shoes with active cushioning, controlled by a microprocessor embedded inside the arch of each shoe. A magnetic sensor measures the impact in each step and the microprocessor adjusts a cable-tensioning system to add or remove cushioning as needed. The integral, replaceable battery lasts 100 hours, which is how long a top-line pair of running shoes like these Adidas-1 shoes is supposed to last anyway.

Compared to what?

I think Jack Ganssle is one of the smartest guys in the world of embedded development. He’s certainly one of the best writers. Jack’s latest column at www.Embedded.com, titled “Software is Cheap,” reverses something he’s been saying for a long time. Previously, Jack’s often been known to say, “Software is the most expensive thing in the universe.” In his latest column, Jack cites software-development expert Tom DeMarco’s retort to that statement: “Compared with what?”

In this latest column, Jack backpedals on his long-held view. He notes that software seems expensive because of its incredible complexity, but that complexity is a bargain compared with the alternative: a standardized microprocessor core running a simple 7-line C program implements a bubble-sort algorithm that would take a substantial amount of hardware to replace. Jack, who tried to design the bubble sorter using only hardware and no CPU, writes:

“In an hour I managed a rough block diagram, one above the chip level (blocks have names like ‘adder’ and ‘16-bit latch’). But the sequencing logic is clearly pretty messy so I've just tossed in a PLD, assuming at some point it wouldn't be too hard to write the appropriate equations. And, yes, perhaps that breaks the no-programmable-logic rule, but to design and debug all that logic using gates in any reasonable amount of time is as unlikely as buck-a-gallon gas.

Translating the rough block diagram to a schematic might take a day. Then there's the time to design and produce a PCB, order and load parts (and change the design to deal with the unexpected but inevitable end-of-life issues), and then of course make the circuit work. We could be talking weeks of effort and a lot of money for the board, parts, and appropriate test equipment.

All this to replace seven little lines of code.”

I thoroughly agree with Jack. You don’t need software to replace simple functions (no one would develop a software-based UART any more, though we once did). However, using code running on processors is the right way to handle complex tasks, on a board or on a chip. Most algorithms start as programs written in C or C++ and then are either compiled for a target embedded processor or hand-translated into Verilog or VHDL. Manual translation of complex algorithms increasingly seems like a fool's errand to me.

Here comes the IP rep

EETimes reports that former Synopsys executive John Atwood is starting a company named The LogicWorks to serve as a "specialized sales channel partner" for smaller IP companies that do not have direct sales forces. I see this as another step in the normalization of IP for SOC design. Atwood's business looks a lot like the manufacturers’ reps (and these days, the big distributors) who sell ICs for the smaller IC companies that likewise can’t afford direct, worldwide sales teams.

Frankly, the controversy over IP use has always baffled me. Using IP blocks to simplify and accelerate SOC design seems no different to me than buying LSI chips to use in board-level designs. Someone else went to the trouble to design that IP. If the IP design is a good one (a thoroughly tested design with the proper documentation and a verification test bench), you’ll save time in designing your SOC. Poor quality IP designs are just like bad chips. They waste your time and money.

These days, time is precious. (Honestly, when was it not?) A missed market window incurs a huge expense. So do chip respins. Missed opportunities literally kill companies. Yet there are still pundits out there who decry IP as some sort of unproven concept. Sorry, I don’t get it.

Friday, March 18, 2005

And we're on the air...

Hello and welcome to the blog dedicated to the art, science, and business of SOC design. In this blog, you'll find my take on the events and products that directly affect IC designers working on megagate digital and mixed-signal ICs.