The spectral region above about 30 GHz has never been densely populated, or at some frequencies populated at all. There are well-known reasons for this, from inhospitable propagation conditions to the lack of useful semiconductor technologies, and the fact that it is extremely difficult to build components and systems when wavelengths are measured in thousandths of an inch. However, there is more interest in the millimeter-wave region today as the fifth generation of cellular moves to 38 GHz and beyond and the Department of Defense addresses new threats from adversaries. Although millimeter wavelengths offer new opportunities for manufacturers of microwave hardware, they also present significant challenges, and connector companies are not immune to them.
It might seem logical to assume that because everything from components to systems is smaller at millimeter-wave frequencies, there will be significantly fewer opportunities for cables and connectors. After all, at these frequencies the functions performed by discrete components are typically integrated in SoCs that include all radio functions, from baseband through RF. However, even with this high level of integration, interconnects will still be required at various points in the network, and millimeter-wave components will still need to be tested, in greater numbers than in the past.
It All Began When….
Connectors designed for millimeter-wave applications are widely associated with measurement systems, vector network analyzers in particular, as well as with connecting integrated microwave assemblies in defense systems and other applications. In the test and measurement industry, the goal is to develop instruments based on what customers will need as early as possible, ideally before those needs arise. So as operating frequencies increase, network analyzers with measurement frequencies higher than a device's operating frequency must be available to test it, which in turn requires a way to connect them. In short, connector development has been in lockstep with the needs of test equipment manufacturers, and many of the most impressive connector developments over the years have come from instrument manufacturers themselves.
So it's not surprising that in addition to efforts by manufacturers to improve lower-frequency connectors, much of the leading-edge work on precision higher-frequency connectors is conducted for the measurement community. In particular, the C83.2 and IEEE P287 connector committees, working in the 1960s and then later from the 1980s to today, established a forward-looking approach to precision connector development. Their initial goals were, among others, to revise IEEE Standard 287 to make it more representative of the current state of the art, to standardize laboratory and general-purpose precision connectors to reduce the growing number of variants, and to ensure that advances would make their way into production. It's interesting to note that even in the early days, their efforts extended to 110 GHz, which at the time was virtually vacant territory. This work produced many of the most substantial advances in millimeter-wave connectors and paved the way for others used today. Various millimeter-wave connectors are shown in Table 1.
For measurement applications, coaxial cables and connectors offer big benefits over waveguide, as they cover broad bandwidths rather than designated waveguide bands, are smaller and weigh less, and are flexible and easier to work with. The latter advantage makes bench measurements considerably less cumbersome than when waveguide or even semi-rigid cable is used.
The SMA, the world's most widely used microwave connector, was the basis for many millimeter-wave connector types. Even though today's very high frequency connectors differ in many ways from their SMA ancestor, most have at least something in common with it. The earliest millimeter-wave connectors were the BRM/OSM/SMA types that appeared in 1958, followed by the BRMM/OSSM/SSMA family introduced in 1960. They extended mode-free operating frequencies to 18 to 24 GHz and 26 to 34 GHz, respectively. The next advance was the MPC2 connector developed by Maury Microwave, whose founder, Mario Maury, was a leading light in microwave measurements.
It could operate to 40 GHz but could not mate with other connector types, so the company developed a 2.92 mm air-interface connector (the MPC3) that was compatible with the SMA and mode-free to 40 GHz. Unfortunately, there was little or no commercial instrumentation available at the time, so until Wiltron (later acquired by Anritsu) "relaunched" it as the K connector to support its new instruments, it wasn't terribly successful.
After this, Kevlin Microwave released an SSMA-compatible air-interface connector, the KMC-SM, that operated mode-free to 40 GHz, followed by the 3.5 mm connector developed at Hewlett-Packard in the 1970s and later marketed by Amphenol as the APC 3.5, which is mode-free to 26.5 GHz. As is usually the case, the APC 3.5 became successful because it was supported by a leading test equipment manufacturer.
The 2.92 mm connector, best known as the K connector, was developed by Wiltron under the leadership of Bill Oldfield to satisfy the connector needs of Wiltron's latest scalar network analyzer, whose measurement range extended to 40 GHz. The K connector is SMA compatible, has an air interface, and is mode-free to 46 GHz. After this, HP, Amphenol, and Omni-Spectra (acquired by MACOM) introduced the 2.4 mm connector that reaches 50 GHz, an effort led by Julius Botka and Paul Watson at HP.
The 2.4 mm connector was a significant contribution in many ways. It was designed to satisfy the needs of three different applications—production, instrumentation, and metrology—and introduced a new interface type that eliminated the need to mate with an existing connector, and so could be improved unimpeded over time. Botka and his team expanded this concept to create the 1.85 mm connector (Figure 1), which is mode-free to 65 GHz. From a measurement perspective, it was actually Wiltron that first introduced it, to meet the needs of a 60 GHz network analyzer, calling it the V connector. The 1.85 mm connectors are compatible with 2.4 mm connectors.
Millimeter-wave connectors exceeded the 100 GHz benchmark in 1989 with the 1 mm connector credited to Watson at HP. It is mode-free to 110 GHz and was a mechanical tour de force, pushing the tolerance limits of fabrication technology, as it is incredibly small. It has been a major benefit to probe stations for the evaluation of MMICs, reducing several sets of waveguide-based measurements to a single step.
Another major development, credited to Omni-Spectra, is the blind-mate connector, which makes it possible to connect an entire multi-connector subsystem to another, something that would otherwise be difficult and sometimes impossible. Blind-mate connectors are self-aligning and guide themselves into position (Figure 2). They are now commonly used in rack-and-panel and module-to-module situations, allowing multiple RF connectors to be connected simultaneously and with precision. Blind-mate connectors include the OSP (22 GHz), OSSP (28 GHz), and OS-50P (40 GHz), as well as the SMP (aka GPO) and mini-SMP (aka SMPM). Their mode-free operating range has increased to 65 GHz over the years.
Small Waves, Big Challenges
Connectors designed for high millimeter-wave frequencies, such as the 2.92, 1.85, and 1 mm types, are precision-fabricated devices and, like all such works of art, must be treated with care. Consider that a full wavelength at 10 GHz is about 30 mm but at 100 GHz is about 3 mm, and the center conductor diameter of a 70 GHz connector is a minuscule 0.5 mm.
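The arithmetic behind these numbers is straightforward: free-space wavelength is c/f, and for a coaxial line the first higher-order (TE11) mode cutoff is commonly approximated as fc ≈ c / (π(a + b)), where a and b are the inner- and outer-conductor radii. A minimal sketch, assuming an ideal 50 ohm air line whose bore equals the connector's millimeter designation (real connectors are rated below this ideal cutoff to allow for beads and tolerances):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_mm(f_ghz):
    """Free-space wavelength in millimeters at frequency f_ghz."""
    return C / (f_ghz * 1e9) * 1e3

def te11_cutoff_ghz(bore_mm, z0=50.0):
    """Approximate TE11 cutoff of an air-dielectric coaxial line,
    fc ~ c / (pi * (a + b)), with the inner radius derived from the
    air-line impedance relation Z0 = 60 * ln(b / a)."""
    b = bore_mm / 2.0            # outer-conductor radius, mm
    a = b / math.exp(z0 / 60.0)  # inner-conductor radius for 50 ohms
    return C / (math.pi * (a + b) * 1e-3) / 1e9

print(f"wavelength at 10 GHz:  {wavelength_mm(10):.0f} mm")   # ~30 mm
print(f"wavelength at 100 GHz: {wavelength_mm(100):.0f} mm")  # ~3 mm
for bore in (2.92, 2.4, 1.85, 1.0):
    print(f"{bore} mm line: TE11 cutoff ~ {te11_cutoff_ghz(bore):.0f} GHz")
```

The estimate lands close to the published ratings for the larger sizes (about 46 GHz for a 2.92 mm line, matching the K connector) and deliberately overshoots for the smallest (about 133 GHz ideal for a 1 mm line versus its 110 GHz rating), illustrating how much margin practical bead supports and tolerances consume.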
Millimeter-wave connectors cannot withstand even a fraction of the punishment to which many lower-frequency connectors are routinely subjected. Flight-line test systems are a classic example: test cables are notoriously used for pulling equipment carts along, and connectors aren't carefully mated and unmated. It could be said that for $5,000 a set, VNA cables should be able to withstand this, and they can; try it with millimeter-wave cables and connectors and it's game over.
Even dust particles and tiny, almost imperceptible scratches on a connector interface can degrade performance at millimeter wavelengths, and it’s wise to inspect connector pairs under a microscope, preferably every time they’re inserted. Tightening with a torque wrench is the recommended procedure at lower frequencies, although it’s not always used. For millimeter-wave connectors, it’s mandatory.
For fabrication, a variety of factors collectively determine the materials from which the connector is made, including smoothness, thermal conductivity, metal characteristics, and the ability to withstand high temperatures without distortion or failure. Common choices for the connector body include brass and stainless steel, and for spring contacts, bronze alloys are increasingly replacing copper beryllium.
An alloy of copper, tin, and zinc (white bronze) is often used for plating today, as it can provide better performance in areas such as wear and intermodulation, although some manufacturers still use nickel and silver. To meet the rigors of defense applications, the MIL-STD-348B standard, and other hostile operating conditions, stainless steel is typically used, as it is more rugged and can better ensure that a 50 ohm impedance is maintained over a broad operating bandwidth.
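The impedance stakes are easy to quantify: for an air-dielectric coaxial line, Z0 = 60 ln(D/d), so any dimensional drift maps directly into impedance error. A minimal sketch with hypothetical dimensions (a 1 mm bore, pin sized for exactly 50 ohms), showing how a mere 10-micron machining or wear error on the pin pulls the line noticeably off 50 ohms:

```python
import math

def z0_air_coax(D_mm, d_mm):
    """Characteristic impedance (ohms) of an air-dielectric coaxial line
    with outer-conductor bore D_mm and center-conductor diameter d_mm."""
    return 60.0 * math.log(D_mm / d_mm)

# Hypothetical 1 mm air line: pin diameter chosen for exactly 50 ohms
D = 1.0
d = D / math.exp(50.0 / 60.0)  # ~0.435 mm
print(f"nominal pin d = {d:.3f} mm -> {z0_air_coax(D, d):.2f} ohm")

# A 10-micron (0.010 mm) oversize on the pin alone shifts the impedance
print(f"pin +10 um -> {z0_air_coax(D, d + 0.010):.2f} ohm")
```

The roughly 1.4 ohm shift from a 10-micron error illustrates why dimensionally stable materials, and fabrication at the limits of tolerance, matter so much in the smallest connector families.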
Keyed connectors are gaining momentum for certain applications, although currently not in the millimeter-wave region. They reduce the possibility of incorrect mating and can be more resistant to shock and vibration. This approach could be valuable for comparatively fragile millimeter-wave connectors, as improper mating can ruin the center conductor and possibly other parts of the connector as well.
Stainless steel connectors are also unplated, which eliminates the possibility of plated material flaking off. In measurement applications, the connector industry has performed the remarkable feat of delivering products that, even with the thin structures and air spacings required to ensure low insertion loss, maintain their stringent phase and amplitude requirements through hundreds of mating cycles.
Interest in millimeter wavelengths extends far beyond 110 GHz to hundreds of gigahertz, so is it humanly (or mechanically) possible to create a connector that reaches at least somewhat higher frequencies? The answer appears to be yes, as illustrated by the 0.8 mm connector developed by Anritsu, which is mode-free to 145 GHz. The 0.8 mm connector has an air-dielectric front-side interface like the K and V connectors, with the center conductor supported by a proprietary, low-loss support bead on one end and a PTFE bead on the other. As the support bead is made of a high-temperature material, it can survive exposure to 200 °C for short periods.
Even smaller connectors are in development that, incredibly, measure 0.6 and even 0.4 mm, and they may be usable at frequencies up to hundreds of gigahertz. It's almost inconceivable that such connectors could be fabricated, let alone handled by humans, and at these frequencies it's likely that systems, whatever they may be, will need to be fully integrated from baseband to antenna.
Measurement systems may need to once again resort to waveguide, for which there are designations into the terahertz range (WR-051), but even a WR-3 waveguide, which covers 220 to 330 GHz, has internal dimensions of only 0.8 x 0.4 mm. It's also safe to assume that both connectors and waveguides at such frequencies would be extremely expensive, and coaxial cable is simply out of the question, as loss would be measured in decibels per inch rather than per foot. That said, where there's an application, smart minds will find ways to accommodate it, as millimeter-wave connector manufacturers have demonstrated for decades.