Friday, April 29, 2005

Nanotechnology – What’s Getting Attention and Investment

In a survey by R&D Magazine about areas of current nanotechnology research, nanomaterials and nanostructures top the list with a 56% response rate among readers. In second place is nanobiotechnology (25%), followed by nanoelectronics (22%), nanosensors (22%), and nanoscale instrumentation (11%). Nanotechnology is the science of designing and building at the nanometer scale, the point at which the physical properties of a material (magnetism, conductance, optics, and others) change. For example, nano-level materials can be engineered to provide greater strength than steel at a lighter weight. Wiring can be developed to provide greater conductivity and less resistance. Changes in optics, magnetism, and other properties can make materials stronger, lighter, cheaper, and so on.

The R&D Magazine survey goes on to discuss instrumentation needs. Microscopy/imaging systems top the list, with 48.9% of readers responding, followed by:

Analytical instruments – 40.1%
Thin-film characterization – 30.5%
Electronic measurements – 28.7%
Physical characterization systems – 26.1%
Optical characterization systems – 25.4%
Topographical measurement systems – 15.8%
Mechanical characterization systems – 14.7%
Magnetic measurement systems – 12.9%
Failure analysis systems – 9.9%

Working at the nano level will require new instrumentation with much greater sensitivity, given the low-level signals involved. The Atomic Force Microscope (AFM) is one example of this type of instrumentation.
Pacific Nanotech uses LabVIEW to control the positioning of the machine and the movement of the cantilever that makes the measurements. Another example is SSI Robotics, which uses LabVIEW for nanotech applications; in that case, they wrote LabVIEW drivers to control the positioning of the robotic system.
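For those curious what that software control layer looks like in text-based code, here is a minimal sketch in Python using PyVISA. The instrument address and command strings are hypothetical placeholders, since every AFM or stage controller defines its own command set:

import pyvisa

# Hypothetical VISA-addressable piezo stage controller; the address and
# the POS/MEAS command strings are placeholders, not a real instrument's API.
rm = pyvisa.ResourceManager()
stage = rm.open_resource("GPIB0::5::INSTR")

def move_to(x_nm: float, y_nm: float):
    """Command the stage to an (x, y) position in nanometers and wait."""
    stage.write(f"POS:X {x_nm};POS:Y {y_nm}")
    stage.query("*OPC?")  # standard SCPI query that blocks until the move completes

# Raster a 100 nm x 100 nm area in 10 nm steps, reading cantilever
# deflection at each point to build up an image.
for y in range(0, 101, 10):
    for x in range(0, 101, 10):
        move_to(x, y)
        deflection = float(stage.query("MEAS:DEFL?"))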

The instrumentation industry has responded to the growing research and development in nanotechnology. This article describes the need for greater collaboration between instrument vendors and researchers.

A great deal of FUD (fear, uncertainty, and doubt) has been cast over the nanotech industry. In a recent discussion, a customer who provides aerosol systems capable of producing droplets at the nano level told me his company keeps quiet about that capability for fear of negative publicity. There’s an interesting blog on these topics from the Center for Responsible Nanotechnology.

Nanotechnology is a well-funded sector, with over $1B per year invested by the US government through the National Nanotechnology Initiative. At that level of funding, there will be more innovations and technologies coming out of this sector.

If you are working in Nanotechnology, I would like to hear from you. Please contact me at hall.martin@ni.com.

Best regards,
Hall T. Martin

Friday, April 22, 2005

USB Continues to Innovate – Look for Wireless USB and USB On-the-Go

You know a technology has promise when it makes your personal life better. I recently switched from an analog phone line to VoIP service through my cable modem. One day, while sitting there watching TV and thinking about how much money I was saving, I noticed my TiVo was no longer able to make its service call to update the show guide. After several calls to the TiVo service center, it became clear that TiVo couldn’t download the service guide over a VoIP line. The support engineer on the phone wasn’t sure why this was so, but he was sure it would not work. He recommended I buy a wireless USB device for the TiVo, which would establish a connection through the wireless hub on my cable modem. It worked like a charm. I was impressed with how easy it was.

The Universal Serial Bus (USB) continues to innovate, bringing new variations to the industry. In the promotion stage is Wireless USB (WUSB). It is a star-topology network that can connect up to 127 devices to create a “cluster”. To achieve the high data rates (pegged at 480 Mbps in the initial phase, but reaching above 1 Gbps in future implementations), the units in the cluster must be in close proximity to one another. Also, multiple clusters can be created within a group of units. It is meant to be a wireline replacement supporting digital multimedia, audio, and video formats, as well as data transfers. WUSB could handle multiple HDTV streams, each running between 19 and 24 Mbps. A more detailed definition can be found here.
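As a quick sanity check on those figures, the arithmetic below uses only the raw bit rates quoted above; protocol overhead and radio conditions would cut real-world throughput considerably:

# Back-of-the-envelope WUSB bandwidth budget, raw bit rates only.
wusb_rate_mbps = 480       # initial-phase WUSB data rate
hdtv_stream_mbps = 24      # upper end of the quoted HDTV stream rate

streams = wusb_rate_mbps // hdtv_stream_mbps
print(f"Raw capacity: roughly {streams} HDTV streams per cluster")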

Existing USB 2.0 devices could be upgraded to Wireless USB with the addition of an adapter. An Israeli company called Wisair showed a demo of its device connecting a digital camera, inkjet printer, and hard drive to a USB hub wirelessly connected to a host PC. You can read more about it here.

The industry is about to approve the specification for WUSB 1.0, which should happen in May of this year. This article has more about it.

It’s still in the “promoter” phase, but given the traction behind USB, it could overtake Bluetooth, which has been around for a while now but still has a difficult time gaining traction beyond cell phones.

The other technology to look for is USB On-the-Go (OTG). USB OTG was developed to enable portable devices such as handhelds, cell phones, and digital cameras to communicate with one another directly. USB OTG is USB with the following additions (a sketch of the role-switching idea follows the list):

1. Ability to be host or peripheral and switch between the two – to enable point-to-point communications
2. Lower power consumption mode – for battery-powered devices
3. Smaller form factor – for mobile device connectivity
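Here is a toy model of that first point in Python. It is purely illustrative: real OTG role selection happens at the electrical and protocol level, via the ID pin of the mini-AB connector and the Host Negotiation Protocol (HNP), not through an API like this.

class OtgPort:
    """Toy model of an OTG dual-role port."""

    def __init__(self, id_pin_grounded: bool):
        # ID pin grounded (A-device) -> default host; floating -> peripheral
        self.role = "host" if id_pin_grounded else "peripheral"

    def request_role_swap(self) -> str:
        # HNP lets the two connected devices trade roles without re-cabling
        self.role = "peripheral" if self.role == "host" else "host"
        return self.role

camera = OtgPort(id_pin_grounded=False)  # plugged in as the B-device
print(camera.role)                       # "peripheral"
print(camera.request_role_swap())        # "host"

The point of the role swap is that the same port can act as a peripheral one moment (a camera attached to a PC) and as a host the next (the same camera driving a printer directly).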

There’s a nice Q&A for USB OTG here.

As always, if you are working with USB, I would like to talk with you further. Please contact me at hall.martin@ni.com.

Best regards,
Hall T. Martin

Friday, April 15, 2005

Software Defined Radio – What’s it all about?

Software Defined Radio began in the late 70s in the military and continues to evolve today, primarily in the defense industry. The Joint Tactical Radio System (JTRS) is one of several programs continuing the development. JTRS hopes to deploy new radios that use software to change frequency and modulation and to support both narrowband voice and broadband data requirements. The Joint Tactical Radio will include software application waveforms such as the Wideband Networking Waveform (WNW), network services, and the programmable radio set. You can find a more detailed architectural overview here.

Joe Mitola, now with Mitre, coined the term Software Defined Radio in the 90s and began by building them for the military. His definition, used throughout the industry, states:

“Software Defined Radio is a radio that is flexible (programmable) to accommodate various physical layer formats and protocols.”

His book, Software Radio Architecture: Object-Oriented Approaches to Wireless Systems Engineering, talks about the roots of the technology and its early days.
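That flexibility is easy to picture in code: the same hardware front end can transmit different physical-layer formats just by swapping the software that generates the baseband samples. Here is a minimal sketch in Python with NumPy, illustrative only; a real SDR would add proper pulse shaping and push these samples through a DAC and upconverter:

import numpy as np

def modulate(bits, scheme, samples_per_symbol=8):
    """Map a bit stream to complex baseband samples for the chosen scheme."""
    bits = np.asarray(bits)
    if scheme == "bpsk":
        symbols = 2 * bits - 1                    # 0/1 -> -1/+1
    elif scheme == "qpsk":
        pairs = bits.reshape(-1, 2)               # two bits per symbol
        symbols = ((2 * pairs[:, 0] - 1) + 1j * (2 * pairs[:, 1] - 1)) / np.sqrt(2)
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    return np.repeat(symbols, samples_per_symbol)  # rectangular pulse shaping

bits = [1, 0, 1, 1, 0, 0, 1, 0]
tx = modulate(bits, "qpsk")  # switching to "bpsk" is a one-argument change

No hardware is touched when the format changes, which is the whole point of putting the physical layer in software.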

One of the challenges in implementing SDR is the need for a fast and flexible architecture. Researchers at Rice propose combining DSPs and FPGAs into a stream-based architecture to achieve performance that DSPs and FPGAs can’t reach alone. You can see how they implemented the Imagine media processor from Stanford to test their theory by clicking here.

The next step beyond Software Defined Radio is the Cognitive Radio. Cognitive radios go a step further by learning the waveforms and protocols needed to adapt to the local spectral activity and the needs of the user. For example, a cognitive radio could identify empty spectrum and use it to communicate more efficiently. This EETimes article provides more info.
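The “identify empty spectrum” step is commonly prototyped with simple energy detection. Here is a minimal sketch with NumPy; this is my illustration rather than any fielded algorithm, and real cognitive radios use far more robust detectors and must respect regulatory constraints:

import numpy as np

def find_idle_freqs(samples, fs, threshold_db=-30.0):
    """Energy detection: flag FFT bins below a power threshold as candidate idle spectrum."""
    n = len(samples)
    spectrum = np.fft.fftshift(np.fft.fft(samples)) / n
    power_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / fs))
    return freqs[power_db < threshold_db]  # frequencies that look unoccupied

# Example: a 10 kHz tone in low-level noise, sampled at 48 kHz; most bins
# come back as candidate idle spectrum.
fs = 48_000
t = np.arange(4096) / fs
samples = np.cos(2 * np.pi * 10_000 * t) + 0.001 * np.random.randn(4096)
idle = find_idle_freqs(samples, fs)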

If you are interested in learning more, a rich set of resources for Software Radio can be found here.

As always, if you are working with Software Defined Radio, I would like to talk with you further. Please contact me at hall.martin@ni.com.

Best regards,
Hall T. Martin

Friday, April 08, 2005

Experimentation Matters – A Book Review

I recently read the book Experimentation Matters: Unlocking the Potential of New Technologies for Innovation by Stefan Thomke.

In summary, the book argues that experimentation is the key to innovation and that successful companies put processes in place to encourage it. Aside from the writing style, which is quite pedantic, the author makes several key points:

1. Anticipate and Exploit Early Information through Front-loaded Innovation Processes – by performing experiments early in the development process (front-loading), one can build a better product faster than by saving all the testing for the end.

2. Experiment Frequently but do not overload your organization – more frequent testing early in the development cycle can save money, but there is a limit to how many experiments an organization can absorb.

3. Integrate new and traditional technologies to unlock performance – new experimental technologies enhance existing processes but rarely replace traditional processes altogether.

4. Organize for rapid experimentation – experiments must be done with rapid feedback or the benefit of learning is lost.

5. Fail early and often but avoid “mistakes” – early failures are desirable because they generate learning. Mistakes are defined as a misuse of resources that generate little or no learning.

6. Use projects as experiments – the book discusses several examples in which a company put out a tool that lets customers experiment on their own. This generates tremendous amounts of learning and innovation because now the company has empowered customers to create their own innovations based on the company’s tool.

I particularly like #5 because so often people judge the result of an experiment by whether or not it becomes a sellable product, when the real value is in the learning. Sometimes we learn more from the failures than from the successes. The goal of experimentation is to generate learning, and if the project accomplishes that, then it’s a success.

The author goes on to outline factors that affect learning by experimentation:

1. Fidelity of experiments – the degree to which a model and its testing conditions represent a final product, process, or service under actual use conditions.

2. Cost of experiments – the total cost of designing, building, running, and analyzing an experiment, including expenses for prototypes, laboratory use, and so on.

3. Iteration Time – the time from planning experiments to when the analyzed results are available and used for planning another iteration.

4. Capacity – the number of same-fidelity experiments that can be carried out per unit time.

5. Strategy – the extent to which experiments are run in parallel or in series.

6. Signal-to-noise ratio – the extent to which the variable of interest is obscured by experimental noise.

7. Type of experiment – the degree of variable manipulation (incremental versus radical changes); no manipulation results in observations only.

Finally, the author reminds us of the realities of new technologies:

1. Technologies are limited by the processes and people that use them.

2. Organizational interfaces can get in the way of experimentation.

3. Technologies change faster than behavior.

The only drawback in reading the book is the length. The author has a knack for stretching out a good story. What could be told in three or four pages ends up as forty pages or more. Nevertheless, the points are important for those working in Emerging Technologies. I recommend skimming it.

If you are working with Emerging Technologies, I would like to talk with you further. Please contact me at hall.martin@ni.com.

Best regards,
Hall T. Martin

Friday, April 01, 2005

What will replace WIMP?

WIMP (Windows, Icons, Menus, and Pointer) is still pretty much the standard interface technology in today’s computer world. It has been around for twenty years, and it seems something should be coming along to take its place. I reviewed a number of emerging technologies for graphical user interfaces and found there are some interesting efforts out there. The most common approach is to take the two-dimensional screen of the computer and convert it into a three-dimensional world in which the user participates rather than simply views. Other techniques take the data out of the computer and imbue it into the user’s environment – Ambient, as an example, has a lamp that glows a different color based on the current status of the stock market.

I find 3DNA’s tool interesting: it turns your files and folders into a 3D view you can interact with as if you were walking through them in a three-dimensional world. They have free downloads if you want to try it out.

Microsoft has a research team working along a similar path. The Microsoft Task Gallery creates a 3-D room in which documents line the walls and can be retrieved for review. I could see users programming their graphical code by moving the icons around in 3-D. Also, data files could be manipulated this way. Is this a step forward for us? I don’t know, but it may be interesting to experiment with some of these techniques.

ZUI puts an interesting twist on the 3-D perspective by replacing the “windows” paradigm with a zooming technique that allows one to change the perspective from distant to up close and thus keep track of files and folders without having to open, close, or manage a series of windows.
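A zooming interface boils down to one idea: items keep fixed positions in a large 2-D workspace, and navigation changes only the camera. Here is a toy sketch in Python, purely illustrative and not how any particular ZUI product is implemented:

class ZoomableView:
    """Toy ZUI camera: zoom and pan replace opening and closing windows."""

    def __init__(self):
        self.zoom = 1.0            # 1.0 = whole workspace visible
        self.center = (0.0, 0.0)   # world point shown at the middle of the screen

    def world_to_screen(self, x, y, width=1024, height=768):
        cx, cy = self.center
        return ((x - cx) * self.zoom + width / 2,
                (y - cy) * self.zoom + height / 2)

    def zoom_in_on(self, x, y, factor=2.0):
        self.center = (x, y)       # fly toward the item of interest
        self.zoom *= factor

view = ZoomableView()
view.zoom_in_on(320.0, 240.0)        # inspect a document up close
view.zoom_in_on(320.0, 240.0, 0.25)  # pull back to see the whole workspace again

Because every file stays at the same world coordinates, pulling back to the wide view is how you keep track of things, rather than hunting through a stack of windows.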

With the emergence of ubiquitous computing through handheld devices and mobile phones comes the need for a different GUI paradigm. Here WIMP won’t cut it. One approach is the use of Software Lenses, which make more efficient use of pixels (there are fewer of them on your phone than on your desktop). Also, the WIMP interface is designed for personal use, while the Software Lens concept is designed for collaborative use.

Animation is another technique used to take GUIs to the next level. In this paper, the author uses animation within an icon to show its status and also describes the use of auditory signals to augment the visual information.

I also looked at Project Looking Glass by Sun. It has some interesting interactivity features, though I'm not sure how I would apply this to virtual instrumentation.

If you are working with graphical user interfaces, I would be interested in talking with you about this topic. You can reach me at hall.martin@ni.com.

Best regards,
Hall T. Martin