
Design Virtualization Technology: VMWare for SoCs

by Paul McLellan

It was way back in 2001 that Pat Gelsinger, then CTO of Intel, pointed out that if we kept increasing clock rates, chips would have the power density of rocket nozzles and nuclear reactor cores. Ever since then, power has been public enemy #1 in chip design. In 2007 Apple announced the iPhone and the application processor inside it, and smartphones became one of the most intense battlegrounds for power. After all, how long a battery lasts is much more visible to the consumer than, say, the power dissipated by the chips in their wireless router. But routers are not immune to power concerns either, at least at the datacenter level.



There is a sense in which all chips today are low power. A chip for a hearing aid might have a power budget measured in milliwatts, whereas a chip for a datacenter might have a budget of 150W. But both chips face the challenge of meeting their performance targets, very different ones of course, within their power envelopes.

There are many techniques for power reduction, far more than can be covered in a single blog. But some of the most confusing choices are the selection of libraries, process technology, signoff corners, trading yield for power, multiple voltage rails and so on: basically, what underlying fabric should be used to construct the design.

eSilicon has a huge amount of data on this sort of thing, based on the large number of designs they have run and on a lot of additional characterization they have done on top of that. As a result, they can easily estimate the effect of, say, reducing the upper temperature for characterization to 100°C (nobody's cellphone can run that hot or your pockets would catch fire), raising the lower bound to 0°C (when did you last see a datacenter with icicles?), or lowering the margin of error on the voltage regulator from 5% to 3%. They call this design virtualization technology. Today it is provided as a service, but under the hood is a lot of software and a huge amount of characterization data of all types, far more than is practical to process by hand or even in Excel.
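
To make the idea concrete, here is a minimal sketch (mine, not eSilicon's tooling) of how corner and margin choices feed a first-order power estimate. The scaling relations are the textbook ones: dynamic power goes as CV²f, and subthreshold leakage roughly doubles every 10°C. Every constant in the snippet is an illustrative assumption, not foundry data.

```python
def estimate_power(v_nom, freq_hz, cap_eff, i_leak_ref, t_max_c,
                   t_ref_c=25.0, reg_margin=0.05, leak_doubling_c=10.0):
    """First-order power estimate at a signoff corner (illustrative only)."""
    v_worst = v_nom * (1.0 + reg_margin)      # worst-case regulator overshoot
    p_dyn = cap_eff * v_worst ** 2 * freq_hz  # dynamic power: C * V^2 * f
    # Subthreshold leakage modeled as doubling every leak_doubling_c degrees C.
    p_leak = i_leak_ref * v_worst * 2.0 ** ((t_max_c - t_ref_c) / leak_doubling_c)
    return p_dyn + p_leak

# Hypothetical 0.9V/500MHz design: compare a 125C/5% signoff corner
# against the tightened 100C/3% corner discussed above.
relaxed = estimate_power(0.9, 500e6, 2.5e-7, 0.01, t_max_c=125, reg_margin=0.05)
tight = estimate_power(0.9, 500e6, 2.5e-7, 0.01, t_max_c=100, reg_margin=0.03)
print(f"relaxed corner: {relaxed:.1f} W, tightened corner: {tight:.1f} W")
```

Even at this crude level you can see why tightening the temperature ceiling and the regulator margin is attractive: both knobs cut signoff power without touching the RTL.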


Rather than give the marketing pitch, I think it is better to show a real-world example of the technology in action on a real design. eSilicon worked with a customer on a chip to go into a networking product. When the design was essentially complete, the first power estimate came in at 130W, which was far over the power budget of 75W.

Design virtualization technology to the rescue:

  1. Where is the power coming from? 95% of the power turned out to be coming from a single 450MB memory.
  2. Can we customize the memory? Yes. So eSilicon did exactly that, starting from an off-the-shelf memory in the extensive portfolio of memory IP they develop internally. They then removed all the peripheral logic supporting options not required for this particular design, and swapped devices in the periphery using libraries with multiple thresholds.
  3. This got the power down to 90W, but the customer target was 75W. eSilicon’s design virtualization technology analyzed the design, and by applying low power techniques such as lowering the core voltage slightly, using more multi-Vt libraries and so on, they achieved 75W. The chip was 3 days away from its scheduled tape-out, and at this stage life looked good.
  4. Then marketing came back and said the power budget had to be 35W. eSilicon fired up their design virtualization technology again. If they got aggressive on voltage, temperature and process corners, could they get there? What about frequency? Was there any flexibility to reduce it and still meet the performance requirements?
  5. It turned out they needed all four: voltage down 2%; a tighter process window, eating a tiny potential yield loss; a temperature maximum of 105°C; and frequency from 500MHz down to 400MHz. Power: 35W. (A back-of-the-envelope check of the voltage and frequency contribution follows this list.)
  6. Tapeout on schedule 3 days later. Success.
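
As a quick sanity check on step 5: dynamic power scales as V²f, so the voltage and frequency cuts alone only get part of the way from 75W to 35W; the tighter process window and lower temperature ceiling (largely leakage savings) had to do the rest. The arithmetic, assuming for illustration that the 75W figure was dominated by dynamic power (the article does not break this down):

```python
# Dynamic power scales as V^2 * f. Voltage down 2%, frequency 500 -> 400 MHz.
v_scale = 0.98 ** 2            # ~0.960
f_scale = 400 / 500            # 0.8
combined = v_scale * f_scale   # ~0.768

print(f"V/f cuts scale dynamic power by {combined:.3f}")
print(f"75 W would drop to roughly {75 * combined:.1f} W from V/f alone")
# ~57.6 W -- still well short of 35 W, which is why the tighter process
# window and the 105C temperature ceiling (mostly leakage savings)
# were also needed to hit the target.
```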


One thing that I learned from covering this design is that 3 sigma really is very conservative. If you reduce that to 2 sigma, the maximum yield loss is under 5%, but the potential gains in power (and performance) can be significant. But you need a lot of data to make that kind of change with confidence, and design virtualization technology is your "ring of confidence."
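
Those sigma numbers are easy to check against the normal distribution: for a one-sided spec limit, the fraction of parts falling outside k sigma is 1 − Φ(k). A quick verification (my arithmetic, not eSilicon's production data):

```python
import math

def one_sided_yield_loss(k_sigma):
    """Fraction of a normal population beyond a one-sided k-sigma limit."""
    phi = 0.5 * (1.0 + math.erf(k_sigma / math.sqrt(2.0)))  # normal CDF
    return 1.0 - phi

for k in (3.0, 2.0):
    print(f"{k:.0f} sigma: {one_sided_yield_loss(k) * 100:.2f}% worst-case yield loss")
# 3 sigma: 0.13%, 2 sigma: 2.28% -- comfortably under the 5% bound
# quoted above (the real loss depends on the actual parameter spread).
```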

The eSilicon white paper on power reduction using design virtualization technology is here.

From: https://www.semiwiki.com/forum/content/4614-design-virtualization-technology-vmware-socs.html

Everything You Need to Know About Smart Home Networking

Learn the difference between Wi-Fi, Bluetooth, Zigbee and Z-Wave

Right now, as you kick back on your couch and daydream about your next smart home upgrade, you may not realize it, but you’re awash in data. From Wi-Fi-enabled thermostats to Bluetooth-accessible door locks to Z-Wave-connected alarm sensors to Zigbee-networked lightbulbs, there could be an array of wireless signals criss-crossing your house.

Why do we need so many different technologies that essentially do the same thing?

On the face of it, that’s a reasonable question, but it’s also analogous to asking the difference between a ball-peen hammer and a sledgehammer — both are used to bang on things, but you wouldn’t drive a fencepost with the smaller one. Likewise, the various wireless networks that make up a smart home each have their own use.

And just as those networks are invisible in the real world, solutions like Logitech’s Harmony Home Hub, which can talk to a database of more than 270,000 connected devices, are trying to make them invisible to smart home owners by controlling all these separately-networked smart home products — everything from connected coffee makers to smart window shades — through one interface. But until you get a home hub solution like the Harmony, it’s worth knowing why smart home product designers choose the networks that they do:

Wi-Fi: “Wi-Fi is a whole-home network,” says Chris Coley, principal engineer and architect with Logitech. Primarily used for media streaming, browsing the web, and other data-heavy activities, it’s a high-bandwidth network that’s power-intensive — just watch how fast your laptop battery dies when you’re watching a video on Netflix.

Many smart home products eschew Wi-Fi-connectivity because it would require their devices to have a dedicated power source or a long-lasting battery. This is why most Wi-Fi webcams aren’t actually wireless — they need to be plugged in to an electrical outlet. It’s also a reason why the Nest Learning Thermostat is such an ingenious device: its developers created a method of power-sipping from the low-voltage electrical cable that has historically powered home temperature controllers.

Until recently, says Coley, Wi-Fi chips have also been relatively expensive, another reason why tech companies seeking cheaper alternatives have turned to other wireless technologies for their products.

Bluetooth: In linking the smartphone in your pocket with the computer on your desktop and the headphones on your ears, Bluetooth makes secure connections between nearby devices. “It was originally developed to be what people call a personal area network,” says Coley.

But in the smart home, Bluetooth appears on products that require a person to be in close physical proximity to the device, like the Kwikset Kevo smart door lock. And because Bluetooth uses frequency hopping and government-grade encryption to help ensure no one can intercept or unscramble your interaction with your smart home gear, it is also very secure. It also has higher data bandwidth than Zigbee and Z-Wave (though lower than Wi-Fi), allowing Bluetooth-enabled products to do more than simply flip a switch or report movement.

The newest version of the protocol, Bluetooth LE, which stands for “low energy,” uses very little power in comparison to Wi-Fi. The developers behind it also recently announced it will be able to form “mesh networks,” a capability that puts it in further competition with Zigbee and Z-Wave. In a mesh network, a device can both receive a networked signal and retransmit it, extending the range of that network. For these reasons, Bluetooth is not only increasingly popular in smart homes, but many smartphone accessory makers are tapping it for their products.
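
To see why relaying matters, here is a toy flooding mesh, a deliberately simplified sketch rather than the real Bluetooth mesh, Zigbee, or Z-Wave protocols (which add routing tables, TTLs and duplicate suppression): every node that hears a message rebroadcasts it once, so delivery range grows hop by hop.

```python
# Toy flooding mesh: nodes rebroadcast each message once, so delivery
# range grows hop by hop. Purely illustrative; positions are 1-D and
# the radio range is a made-up number.

RADIO_RANGE = 12.0  # meters; hypothetical

def reachable(nodes, source):
    """Return the set of node positions a flooded message reaches."""
    delivered = {source}
    frontier = {source}
    while frontier:
        nxt = set()
        for relay in frontier:
            for node in nodes:
                if node not in delivered and abs(node - relay) <= RADIO_RANGE:
                    nxt.add(node)          # heard the rebroadcast
        delivered |= nxt
        frontier = nxt                     # these nodes relay next round
    return delivered

# Sensors strung down a hallway at 10 m spacing, hub at position 0.
sensors = [0, 10, 20, 30, 40, 50]
print(sorted(reachable(sensors, 0)))   # all reached via relays
print(sorted(reachable([0, 50], 0)))   # [0] -- no relay, out of range
```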

Zigbee and Z-Wave: “A lot of home control devices that are primarily Zigbee/Z-Wave are primarily driven from a range and power consumption perspective by using a mesh network,” says Coley. Both are very low-powered wireless networks (their devices can run for years on a little watch battery), and Zigbee’s and Z-Wave’s ability to mesh network has made them great for reaching far-flung sensors in the smart home — and they’ve been doing it for years already, making them the incumbent technology against Bluetooth’s inroads.

But before you ask “Can’t Wi-Fi be extended?” consider the difference between mesh networking and extender devices. Wi-Fi extenders pull in a signal but end up kicking only about half the data rate back out, meaning networks stretch farther but at the cost of performance. Mesh networks like Zigbee and Z-Wave don’t experience this kind of signal loss — partly because they are very low-bandwidth to begin with. And that low bandwidth makes these two standards great for simple devices like window and door motion sensors, or smart lightbulbs that only need data connections to turn on or off.
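
A back-of-the-envelope comparison makes the trade concrete. A single-radio Wi-Fi repeater has to receive a packet and then retransmit it on the same channel, so usable throughput roughly halves at each hop, while a Zigbee-style mesh holds its (already tiny) native rate across hops. The Wi-Fi figure below is an illustrative assumption; 250 kbps is Zigbee’s nominal over-the-air rate.

```python
# Back-of-the-envelope throughput after n relay hops. A single-radio
# Wi-Fi repeater halves usable throughput per hop; a Zigbee-style mesh
# stays at its low native rate. Wi-Fi rate here is illustrative only.

WIFI_MBPS = 100.0     # hypothetical usable Wi-Fi rate at the source
ZIGBEE_KBPS = 250.0   # Zigbee's nominal 250 kbps over-the-air rate

for hops in range(4):
    wifi = WIFI_MBPS / (2 ** hops)
    print(f"{hops} hops: Wi-Fi ~{wifi:6.1f} Mbps, Zigbee ~{ZIGBEE_KBPS:.0f} kbps")
# Wi-Fi still wins on raw rate, but it degrades per hop; the mesh rate
# is tiny yet constant -- plenty for an on/off command or sensor ping.
```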

One problem with Zigbee and Z-Wave, however, is that their signals aren’t directly compatible with any mainstream computing device, like a smartphone, tablet, or laptop. So the bulbs and motion sensors need to communicate with a hub that is connected to your home network via Wi-Fi or through an ethernet cable plugged into your Internet router. This is precisely how Philips Hue smart lighting products work — all the various bulbs and lights use Zigbee to talk to the hub, which connects to an Internet router. Then, when you use the app to turn on the light or change its colors, the command runs from your phone through your router, to the hub, and ultimately to the bulb.
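
That command path is easy to picture as code. The Hue bridge exposes a local HTTP API, so the phone-to-router-to-hub leg is an ordinary REST call and the bridge handles the final Zigbee hop. A minimal sketch against the classic local Hue API using Python’s requests library (the bridge IP, API username, and light ID below are placeholders you would obtain from your own bridge):

```python
import requests

# Placeholders: substitute your bridge's IP, an authorized API username
# (created by pressing the bridge's link button and registering), and
# the numeric ID of the target light.
BRIDGE_IP = "192.168.1.2"
USERNAME = "your-api-username"
LIGHT_ID = 1

# One REST call over Wi-Fi/Ethernet; the bridge relays it over Zigbee.
url = f"http://{BRIDGE_IP}/api/{USERNAME}/lights/{LIGHT_ID}/state"
resp = requests.put(url, json={"on": True, "bri": 200})
print(resp.json())   # e.g. [{"success": {"/lights/1/state/on": true}}]
```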

For purposes of this explainer, Zigbee and Z-Wave have been lumped together, but they are not the same thing. While they’re generally similar in power, range, and low price (making them attractive for product manufacturers), they’re incompatible with each other, and there are some technical differences that attract product developers to one over the other.

In the long run, which of these four wireless network standards will emerge as the clear victor? The answer to that is unclear, and it’s quite possible that none ever will. As their technologies continue to evolve, the variety of variables — from price drops to improved power usage to faster data rates — make product developers continually re-evaluate which wireless chip is the best for their needs. “It’s a horse race that’s changing all the time,” says Coley.

John Patrick Pullen from http://time.com/