
Measure of Success

by David Vye, Director of Technical Marketing, AWR Group, NI

Early in my engineering career during the mid-1980s, I was tasked with developing test and measurement systems to support the emerging gallium arsenide (GaAs) metal-semiconductor field-effect transistor (MESFET) technology at the heart of the defense monolithic microwave integrated circuit (MMIC)/microwave IC (MIC) market. The goal was to better understand the impact of processing technology on yields by conducting a set of on-wafer DC measurements and correlating this data with the wafer location of each device and with subsequent RF measurements (gain, 1 dB compression point, and more).

Over time, as measurement methods improved and the amount of collected data grew, I (and my employer) became more sophisticated, focusing the measurements on metrics that provided the most reliable prediction of RF performance. Reams of data were distilled into a real-time quality control indicator that guided process and device development efforts.

With the emergence of electronic design automation (EDA) software, including harmonic balance simulation, the focus for some RF measurements shifted toward device characterization and modeling. With few commercial tools on the market, many companies developed their own systems based on DC, pulsed-IV, and vector network analyzer (VNA) measurements, providing the data for home-grown parameter extraction and optimization software.

Some ten years later, in the mid-1990s, Krylov subspace methods vastly improved the speed and capacity of harmonic balance engines for solving nonlinear problems, making EDA software the tool of choice for developing MMICs and MICs. Test and measurement took on the role of verifying which design on the “pizza mask” was the winning candidate for production.

Then came the digital wireless revolution, with numerous performance metrics necessitating new and more complex measurement technologies. For nearly twenty years, the demand for greater data capacity pushed the performance requirements for communication ICs, impacting both measurements in the test lab and those derived from simulation software. 

Today, it is reasonable to expect that device requirements will only continue to become more stringent, requiring faster measurement systems and simulation technology to capture a broader range of metrics in a reasonable time. Hand in hand with the ability to measure an increasing number of performance metrics faster, engineers need to analyze the resulting data efficiently and insightfully. Advanced data management and graphical representation must evolve to keep up with ever more complex performance specifications.

In my current position at National Instruments, I see this challenge through the EDA software solutions we develop to help our customers get high-performing products to market faster. The latest release of NI AWR Design Environment software enables users to readily construct data display dashboards of linked simulation results, providing a powerful visualization of how design decisions impact different performance metrics.

This is just one example of how software automation is playing a critical role in supporting evolving market needs and the associated design challenges. With the release of V14, the NI AWR software development team continues to enhance the speed, accuracy, and levels of automation that engineers need to meet their next design goals. After all, to quote John Foster Dulles, “The measure of success is not whether you have a tough problem to deal with, but whether it is the same problem you had last year.”

For more information about NI AWR Design Environment software and the new features within the latest V14 release, visit ni.com/awr as well as awrcorp.com/whatsnew.