
RF Network Testing: Techniques and Procedures

by Anritsu Company

Standardized testing has been part of RF communications since the earliest days of the field, and to ensure reliable communication, the health of the components must be tested and verified. Until recently, this applied to the communication hardware but not the RF network, the latter being the hardware between the transmitter, receiver, and antenna. The RF network transports the modulated RF energy to the antenna or from the antenna to the receiver, which allows the antenna to radiate or receive energy effectively.

Proper evaluation of an RF network involves consistently performing standardized tests. If performed in the same manner, the results will be consistent and reflect real-world performance. If not performed properly, they will be of no use to anyone trying to evaluate the system. While the capabilities for testing the integrity of transmission lines, connectors, and antennas are readily available in the field, the results and conclusions of this testing are not consistent. Consequently, cable and antenna manufacturers are often blamed for failed tests even though the hardware has been proven to be compliant and functional. In an effort to improve these valuable tests and establish consistency in the results, the industry has established some basic guidelines for the tests, the result of which is the Method of Procedure (MOP).

Figure 1: Anritsu’s Site Master S331L handheld cable and antenna analyzer

The MOP is needed to ensure a standardized approach to testing. The MOP is a written document of procedures and testing methods that outlines the data and how it is collected. It’s similar to the checklist used by airline pilots to ensure everything is done completely in a specific order. When an engineer or technician performs a deployment verification, hardware validation, or relative testing of an existing system, understanding what tests were performed and how is critical in obtaining acceptance of the results by reviewers who were not on site. The MOP establishes a consistent foundation for the testing methodology and removes uncertainties that could cause invalidation of the data.

A MOP is not intended to replace proper training in the field of antenna system concepts, nor is it intended to replace training in the use and operation of the test equipment. The MOP is designed as a guideline for trained and experienced technologists and engineers. The MOP must contain several important components to consider as line sweeping is performed. The results may be questionable if these components are not included in the process.

The Test Evolution

Historically, the RF network was taken for granted and expected to be a drop-in component, so very little testing and evaluation was performed. However, while components may be reliably designed and manufactured, they can be damaged during shipping or installation. The simple testing performed on-site was inadequate to evaluate the effectiveness of the hardware.

With the evolution of new test equipment, the ability to test the RF network components and antennas to the degree that it resembles the manufacturer’s testing has led to new procedures and expectations. Field personnel now expect to be able to reproduce the same data that the manufacturers declare on the components. While this expectation is achievable, discipline is required.

Return loss versus frequency is the primary means of testing RF networks because the effectiveness of an RF network depends so greatly on impedance matching. Physics dictates that maximum power is transferred from the point of origin to the destination when the origin, transmission network, and destination impedances are perfectly matched. Return loss (RL) and voltage standing wave ratio (VSWR) measure deviation from a theoretically ideal impedance match.

The higher the absolute value of the return loss, the better the match, resulting in better power transfer. Impedance irregularities anywhere in the RF network will result in power being reflected back to the source. This reflected power reduces the amount of power transferred from the source to its load, so to maximize radiated power, return loss of all components must be established and verified.
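Since return loss, the reflection coefficient, and VSWR are interchangeable descriptions of the same mismatch, the standard conversions are worth having at hand. The sketch below is illustrative Python; the function names are my own, not part of any instrument’s API:

```python
import math

def gamma_from_rl(rl_db):
    """Reflection coefficient magnitude from return loss (dB, sign ignored)."""
    return 10 ** (-abs(rl_db) / 20)

def vswr_from_rl(rl_db):
    """VSWR from return loss: VSWR = (1 + |Gamma|) / (1 - |Gamma|)."""
    g = gamma_from_rl(rl_db)
    return (1 + g) / (1 - g)

def rl_from_vswr(vswr):
    """Return loss magnitude (dB) from VSWR."""
    g = (vswr - 1) / (vswr + 1)
    return -20 * math.log10(g)

# A 20 dB return loss corresponds to a VSWR of roughly 1.22
```

For example, a component specified at 1.5:1 VSWR corresponds to roughly 14 dB return loss.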

The impedance can vary due to manufacturing variances or faults at certain points in the network. Regardless of how or why these occur, the result is a reduction in transferred power. Transmission line testing is critical in determining whether irregularities exist and locating where they occur.

Testing is also important to establish a benchmark for future measurements to find changes in or deterioration of components. More important than performing these tests is to ensure testing is done consistently and competently. Without standardized testing procedures, the opportunity for inconsistencies exists. Without the ability to test the installed equipment, the system coverage is a leap of faith.

Test Equipment

System engineers aim to provide site designs that will perform as the customer requires, and test equipment allows the components to be optimized and verified to specific standards used in the design. When the performance of a system equals the designed performance, energy transfer from the source to the load will meet or exceed design criteria, and maximum coverage can be achieved. This is how systems were designed and installed in the 20th century, but not today, when best practices mandate empirical validation against the design goals.

RF network test equipment migrated from the laboratory into the field beginning in the late 80s and early 90s as integrated circuits capable of performing the required analysis and computation became available. Test equipment that only laboratory engineers could maintain and use was now available to technicians and RF installation field personnel. Tests can now be performed in the field that were not achievable even with large, expensive laboratory-grade equipment a decade ago. The next section describes the various test procedures, the instruments that make them possible, and the importance of the MOP.

The Wattmeter

Wattmeters have been used for decades to evaluate the quality of the RF network using transmit power. While providing a very crude VSWR measurement, the wattmeter did not allow testing of the receive network or assist in understanding where problems occurred in it. The wattmeter simply measures the power in the forward and reverse directions. Comparing these two measurements allows the calculation of VSWR, which can be converted mathematically to return loss.
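The wattmeter arithmetic described above can be sketched as follows (illustrative Python, not tied to any particular meter):

```python
import math

def vswr_from_power(p_fwd_w, p_rev_w):
    """VSWR from forward and reflected wattmeter readings (watts)."""
    rho = math.sqrt(p_rev_w / p_fwd_w)   # reflection coefficient magnitude
    return (1 + rho) / (1 - rho)

def rl_from_power(p_fwd_w, p_rev_w):
    """Return loss magnitude (dB) from the same two readings."""
    return 10 * math.log10(p_fwd_w / p_rev_w)

# 100 W forward with 1 W reflected: rho = 0.1, VSWR ~ 1.22, RL = 20 dB
```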

While the measured VSWR may be accurate for the position of the wattmeter in the system, it does not allow for an understanding of where a mismatch is located or how bad it may be. If the mismatch occurs far down the transmission line, the full effect of the mismatch may be masked by the loss of the transmission line in between. That is, a mismatch occurring in the antenna cannot be distinguished from cable/connector problems.

Time Domain Reflectometer (TDR)

A TDR inserts a DC pulse into a system; the pulse travels from the insertion point toward the antenna and is reflected by any irregularities, shorts, or opens within the system. The pulse travels at the speed of electromagnetic radiation, i.e., the speed of light, slowed by a known amount determined by the cable’s “velocity factor,” which is included in the calculations.

The device calculates the distance to any faults in the system using the return time and the level of the returning signal. However, a TDR does not take the frequency-specific characteristics into account. A TDR’s pulsed DC stimulus reflects little energy at RF faults or impedance mismatches. Further, the antenna or any other in-line, frequency-selective device (e.g., frequency combiners, filters, or quarter-wave lightning arrestors) reflects almost all the TDR’s source energy.
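The distance calculation itself is straightforward. A minimal sketch, assuming the instrument reports the round-trip time of the reflection directly:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def fault_distance_m(round_trip_s, velocity_factor):
    """One-way distance to a reflection from its round-trip time and the
    cable's velocity factor (fraction of c)."""
    return C * velocity_factor * round_trip_s / 2

# A 100 ns round trip in foam-dielectric cable (Vf ~ 0.88) -> ~13.2 m
```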

Due to the square-wave nature of the DC pulse, the TDR’s spectral content is spread across a wide frequency range, but the amplitude is not consistent with frequency, and the spectral magnitude of the output pulses tends to roll off rapidly at high frequencies. Typically, less than 2% of a TDR’s pulsed energy falls within the RF frequency ranges. For these reasons and others, a TDR is considered marginal for evaluating RF networks.

Frequency Domain Reflectometer (FDR)

An FDR generates an RF sweep that includes only the frequency range selected by the operator, allowing frequency-selective characteristics to be displayed. The FDR injects RF energy of constant amplitude across the frequency band of interest and analyzes the returned signal to look at each part of the RF system across the band. Measurements include RL or VSWR versus frequency and RL versus distance. The FDR evolved as embedded processors became available to handle the higher data rates and complex mathematics required to perform this type of test.

The FDR works much like a TDR as they both inject energy into a system and compare it to the returned energy, but an FDR uses a constant amplitude sweep of frequencies. As a result, an FDR can detect the reactance of components instead of DC resistance or the presence of a short or open. Doing this can quickly give the operator a “snapshot” of how the entire system reacts to the RF bands of interest. By applying mathematics to convert the frequency domain into the time domain, fault location is possible.
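That frequency-to-distance conversion can be illustrated with a deliberately naive inverse DFT over a synthetic single-fault sweep. This is a Python sketch only; real instruments add windowing, calibration, and much faster FFT implementations:

```python
import cmath, math

C = 299_792_458.0  # speed of light, m/s

def dtf_bins(s11, f_step, vf):
    """Naive inverse DFT of a frequency sweep; returns (distance_m, magnitude)
    per time bin. Round-trip delay is halved to give one-way distance."""
    n = len(s11)
    bins = []
    for k in range(n):
        acc = sum(s11[m] * cmath.exp(2j * math.pi * m * k / n) for m in range(n))
        t = k / (n * f_step)                 # time of this bin, seconds
        bins.append((C * vf * t / 2, abs(acc) / n))
    return bins

# Synthesize one fault 10 m down a Vf = 0.88 cable with |Gamma| = 0.1
vf, d_fault = 0.88, 10.0
tau = 2 * d_fault / (C * vf)                 # round-trip delay of the fault
f0, df, n = 1.0e9, 2.0e6, 256                # 1 GHz start, 2 MHz steps
sweep = [0.1 * cmath.exp(-2j * math.pi * (f0 + m * df) * tau) for m in range(n)]
peak_d = max(dtf_bins(sweep, df, vf), key=lambda b: b[1])[0]
# peak_d lands close to the 10 m fault, limited by bin resolution
```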

The Anritsu Site Master® (Figure 1) is a good example of the FDR devices available. All of these products display insertion or cable loss relative to frequency, VSWR relative to frequency, RL relative to frequency, and distance-to-fault in RL or VSWR relative to distance. Each of these measurements is helpful in evaluating and maintaining a system, and FDR capabilities are being integrated into multipurpose test equipment.

All test equipment, including frequency domain reflectometers, has accuracy specifications published by the manufacturer. These should be understood by the end user and factored into any determination of the condition of feedline and other components in transmit or receive systems.

Absolute Versus Relative Testing

An RF network can be tested in absolute and relative configurations, and the accuracy of the information obtained is the primary difference between them. Absolute testing is performed with a 50 ohm precision load as the termination of the product or system being tested. Relative testing uses a non-precision load, such as the antenna, to terminate the system.

Absolute testing relates to using precision terminations and controlled testing techniques and is precise within a controlled environment that emulates the tests performed by the manufacturer and can be used to validate manufacturer specifications.

The RF network is tested in a closed manner that factors out external uncertainties. Absolute testing is never conducted with the final antenna attached. Depending on the measurement, a known good and calibrated open, short, or load termination must be inserted at the end of the network under test. A calibration standard (sometimes called a cal kit) has three different terminations: a calibrated OPEN, a calibrated SHORT, and a calibrated 50 ohm LOAD.

A calibrated load is different from other 50 ohm loads used for line termination because it is an extremely pure load that has not only been made from a precision resistor but is also designed to have known consistent frequency, amplitude, and phase characteristics. Likewise, the calibrated open and short are designed to respond to RF energy in a specific and repeatable way. The quality of measurements is only as good as the quality of the calibration standard.

The calibration standard must be verified before any absolute testing can be performed. A calibration standard that has been stored in a toolbox or never calibrated could be damaged and should not be used for absolute testing. The calibration standard must be treated with the same respect and care as any piece of precision test equipment. Understanding the accuracy and repeatability of the calibration standard’s RF response is crucial in absolute testing. It is critical that the calibration standard be returned along with the test equipment during the regular calibration cycle. This allows the calibration lab to validate and verify each of the calibration standards.

A cross-verification should be performed regularly to ensure an accurate calibration standard. Cross-verification refers to using a second network analyzer or FDR to verify the calibration standard in question. Remember, the calibration standard is a piece of test equipment itself and should be treated as such. Since absolute testing is based on known matching characteristics, the results can be used to compare with a manufacturer’s specifications.

Performing absolute testing on an existing system requires taking the system off the air and inserting the appropriate termination at the top of the tower. Because of its expense and difficulty, absolute testing is normally reserved for initial commissioning or critical troubleshooting. Feedline manufacturers publish product specifications in various formats that must be properly applied. Some present impedance characteristics such as “50 ohms +/-1 ohm.” Others may supply VSWR instead of RL, or impedance may be given in the frequency domain only rather than as a distance-to-fault specification. Regardless of the format, the specification provided by the manufacturer is what should be applied in determining feedline health.

Relative testing relates to performing tests outside of a controlled environment and without controlled test terminations. This type of testing is employed when an installed system is tested without the benefit of calibration standards. In a relative testing environment, the RF system’s matching network or “load” is the antenna itself. Antennas have significantly varying impedances and matching characteristics depending on the frequency, design, quality, antenna type and installation.

They can also be affected by movement, by proximity to other objects (including people), and by RF signals from other systems. Because of the uncertainty of the antenna as a precision load, the results cannot be referenced back to manufacturer specifications other than the performance specifications of the antenna itself. However, this type of testing is valuable when compared against a benchmark portfolio of tests and sweeps performed during the initial installation.

As its name implies, relative testing must be compared with something. Absolute testing is compared with the manufacturer’s testing and specifications, but relative testing must be compared with the initial installation test results. When initial test results and sweeps are available, a comparison with the current sweeps will show changes that may affect operation. Because the same cables and matching network are measured in the initial tests, those results establish a benchmark to which later tests can be reliably compared. As long as the comparison is equal or close, it can be assumed that the RF network has not changed significantly.
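A minimal sketch of that benchmark comparison, assuming the benchmark and current sweeps were taken at the same frequency points; the 3 dB tolerance is illustrative only and should be set per system requirements:

```python
def changed_points(benchmark_db, current_db, tol_db=3.0):
    """Indices of frequency points where the current relative sweep deviates
    from the commissioning benchmark by more than tol_db."""
    return [i for i, (b, c) in enumerate(zip(benchmark_db, current_db))
            if abs(b - c) > tol_db]

# An unchanged system returns an empty list; flagged indices warrant a look
```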

Return Loss and VSWR

The RL and VSWR measurements are essential for anyone making cable and antenna measurements in the field. These measurements show the user the impedance match of the system and if it conforms to system engineering specifications. If problems show up during this test, it is likely that the system will have problems. A poorly matched antenna will reflect RF energy that will not be available for use at the load, and this extra energy returned to the source will affect the efficiency of the transferred power and the corresponding coverage area.

An absolute return loss or VSWR test is taken with a known calibrated load at the end of the RF network to ensure a near-perfect match at the termination. With the calibrated load in place of the antenna, most reflections that occur will be the result of impedance mismatches in the network itself, making the network the limiting factor. This test allows the network to be compared to manufacturer specifications that were taken in a similar manner.

The return loss measurement should also be taken with the antenna connected and installed in the final location, as this shows the system’s delivered return loss and considers installation distortions. This relative test will uncover irregularities not caused by the hardware, such as mounting too close to other metal objects. Figure 2 shows a typical return loss sweep over the frequency that can be used to validate manufacturer specifications when a calibrated termination is used.

Figure 2: A typical return loss (RL) sweep over frequency

Some energy will dissipate in the cable and the components as the RF signal travels through the RF network. A cable loss measurement is usually made during the installation phase to ensure that the cable loss is within the manufacturer’s specifications. A cable loss measurement is not isolated to the transmission line but includes all components in the network. When performing cable loss measurements, be aware of components that may have frequency characteristics that could affect the results.

There are two types of cable loss measurement. Two-Port Insertion Loss (2PIL) uses a test instrument in which the test signal is generated on one RF port and received by a second RF port on the same instrument. This method directly measures the loss of a system with high accuracy, but it is not always possible to connect to both ends of a cable. A second method, One-Port Cable Loss (1PCL, or just CL), is therefore needed; it generates and receives the RF energy on a single port.

In effect, 1PCL compares the generated signal against the reflected signal and divides the difference by two, so it suffers from the same uncertainties as return loss. 1PCL is only an absolute test when done with a calibrated open or short connected to the end of the RF network because only a calibrated open or short provides a consistent and total reflection of the test signal.

The 1PCL data is normally the average of the maximum and minimum values of the measured ripple. The 1PCL measurement can only be accurate if the reflection is total, i.e., a calibrated short or open must be used. Relative CL tests cannot be performed because the energy used to measure system loss will be radiated by the antenna rather than reflected. Figure 3 shows a typical CL sweep taken with a calibrated open or short at the end of the cable. The results of this sweep can be compared with the specified insertion loss of the cable, connectors, and any other devices present in the network.

Figure 3: A CL sweep taken with a calibrated open or short at the end of the cable
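The halving described above can be sketched like this (illustrative Python; assumes an RL trace taken into a calibrated open or short):

```python
def one_port_cable_loss(rl_trace_db):
    """One-port cable loss (1PCL): average the ripple extremes of the measured
    return loss, then halve because the test signal made a round trip."""
    avg_rl = (max(rl_trace_db) + min(rl_trace_db)) / 2  # midpoint of ripple
    return abs(avg_rl) / 2                              # round trip -> one way

# A trace rippling between -5.2 dB and -6.8 dB gives ~3 dB one-way loss
```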

Distance-to-fault (DTF) is the most controversial test. It maps the return loss or VSWR over the length of the entire RF network; the return loss version is called DTF-RL. While the DTF sweep is a great troubleshooting tool, it is also a great quality analysis tool. There are times when the CL and RL sweeps meet the manufacturer specifications, yet irregularities along the cable cause failure in DTF expectations. DTF sweeps can only be performed reliably and effectively in the absolute testing mode. DTF is not reliable in the relative mode for determining failure unless a previous relative sweep is available for comparison. Figure 4 shows a typical DTF-RL sweep.

Figure 4: A DTF-RL sweep maps the return loss or VSWR over the length of the network

The most questionable item is the DTF-RL threshold to be used for pass/fail. Cable manufacturers are just now beginning to perform DTF on their products and have not published specifications for DTF-RL. While there is no true standard for the acceptable level of DTF-RL, it is a reasonable expectation that it should be between -40 dB and -50 dB when sweeping primary feedlines.

A network with a DTF return loss of -50 dB has fewer imperfections and irregularities than one measuring -40 dB. Risk and system requirements are the two determining components in selecting a tolerance threshold. A mission-critical public safety system may require a DTF-RL better than -45 dB, while a cellular system may require only -40 dB. The expectation must be identified before testing begins.
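Once the expectation is identified, the pass/fail check itself is simple. A sketch using the negative-dB convention of the figures here, with a hypothetical -45 dB threshold:

```python
def dtf_pass(trace, threshold_db):
    """Pass/fail a DTF-RL trace. trace is a list of (distance_m, rl_db);
    any point with RL above (less negative than) the threshold fails."""
    failures = [(d, rl) for d, rl in trace if rl > threshold_db]
    return len(failures) == 0, failures

ok, faults = dtf_pass([(1.2, -55.0), (18.4, -38.0)], threshold_db=-45.0)
# ok is False; faults lists the 18.4 m point for investigation
```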

Figure 5: A dented cable can cause the failure of a DTF-RL measurement

Failure of DTF-RL is associated with impedance changes along the cable or at the transition of a connector. These changes can be caused by cable bends that kink, improperly installed connectors, stretched cable, dents that change the dielectric spacing, or water intrusion (Figure 5). Because one or two irregularities can have minimal effect on the overall return loss characteristics of the cable, the absolute sweep may not be affected. Nevertheless, if dents, kinks, or other impairments exist, the cable could be considered bad. DTF helps identify where irregularities occur on the cable and their severity.

The Essential Ingredients of MOP

Below are components that should be considered in any MOP. While the expected results may vary depending on the system type and the deployment team’s requirements, the components and the processes should vary very little.

Before starting antenna and line commissioning or testing, it is necessary to have the electrical specifications for all the RF network components. The system designer should supply this information or make it available before the MOP is begun. This site-specific information establishes the expectation and allows rapid comparison of the collected data. This data is also needed to program the test equipment to ensure it knows the cable type and characteristics. Drawings also allow the person performing the test to fully understand the components included in the network, which will assist in understanding and interpreting the results.

The electrical specifications needed are:

  • Antenna frequency range and return loss specifications
  • Jumper cable type, velocity factor, insertion loss, and return loss
  • Transmission line type, velocity factor, insertion loss, and return loss
  • Lightning suppressor frequency range, insertion loss, and return loss
  • RF connector type, insertion loss, and return loss
  • Expected transmission line system insertion loss
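These specifications are also what gets programmed into the analyzer before a sweep. A hypothetical record of the minimum fields is sketched below; the names and example values are illustrative, not an instrument API or a datasheet:

```python
from dataclasses import dataclass

@dataclass
class FeedlineSpec:
    """Electrical data needed before programming an FDR for a sweep."""
    cable_type: str
    velocity_factor: float      # fraction of c, from the datasheet
    loss_db_per_100m: float     # at the test frequency
    spec_return_loss_db: float  # manufacturer's RL spec, magnitude in dB

# Example values are representative of 1/2-inch foam coax, not a datasheet
half_inch = FeedlineSpec("1/2-in foam coax", 0.88, 6.5, 23.0)
```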

Analyzer and Test Requirements

The analyzer used in the field is considered laboratory-grade equipment and must be treated and used accordingly. The precision load is a delicate piece of equipment and must be treated carefully. If the precision load is dropped from any height, it should not be used again on projects until its proper operation is verified by the manufacturer. Never use the primary calibration standard on the tower or at the remote end for absolute testing. Keep the calibration standard in a controlled environment to ensure integrity.

The analyzer should be loaded with the most current firmware and serviced at the factory as recommended by the manufacturer, and the precision load must be in known good working order. The analyzer must be calibrated at the ambient temperature in which it will be operated and must be re-calibrated whenever its temperature changes significantly or when the analyzer display indicates that the calibration is no longer valid due to temperature change.

Recalibration is also required when the setup frequency changes; when a test port extension cable is added, removed, or replaced; and when it has been turned off for any length of time. Re-calibration should also be performed when “noise” or “picket fencing” appears at the bottom of the display during a distance-to-fault measurement (down around -50 dB). The calibration results and precision load should be tested by performing a return loss test on the load after calibration. A return loss of -42 dB or better should be obtained from the precision load.

Analyzer resolution should be set to maximum for the highest quality and most accurate printouts, and only a precision device is acceptable when a load is used. Adapters should be avoided whenever possible; if they are needed, only precision adapters should be used. Finally, if an extension cable is needed, it must be phase-stable.

Test Documentation

When antenna system commissioning is performed, it is necessary that all tests are properly documented for use in the system manual and for future antenna system testing and/or troubleshooting. All relative testing accuracy will depend on the quality and the attention to detail used in the commissioning documentation.

The results of the MOP tests should be readily available in the system manual as a viewable file, such as *.pdf or *.wmf formats. The raw data files should also be available to allow side-by-side comparison of relative data. All traces should contain common information and be in similar formats.

Standardized Tests to Perform

A list of all tests that should be performed is critical in ensuring complete and thorough testing. These tests will involve both relative testing and absolute testing. While the procedure for performing these tests is very important, it is beyond the scope of this paper. Each MOP should outline special considerations to be used in performing the tests. Standardized testing must be preceded by a visual inspection of the cable and reel to identify any shipping or other damage that may have occurred.

Tests that should be considered a part of the MOP are:

  • Jumper insertion loss and return loss: Jumpers should be tested and verified before installation.
  • Antenna return loss: Each antenna should be tested on the ground before installation. The test location should not be near metallic materials and should be as far above the ground as possible. If using a directional antenna, the antenna should be pointed vertically or away from possible sources of RF energy.
  • End of antenna system verification: This test involves performing a DTF using precision terminations (open or short) and is the absolute verification of the cable that can be compared with the manufacturer’s specifications. It verifies that the cable supplied is the specified length and that the cable’s end is visible on the sweep. It may involve installing test connectors on a reel of cable with connectors.
  • Antenna system insertion loss: This test measures and validates the insertion loss of the entire antenna system (i.e., main transmission line, jumpers, and lightning suppressor) with a calibrated open or short. The results can be compared with the design engineer’s expectations.
  • Transmission line distance-to-fault while terminated with antenna jumper and precision load: This test is similar to the “end of the antenna system” test except that it is performed after installation and will show installation errors.
  • Transmission line return loss while terminated with antenna jumper and precision load: This is the final return loss test after installation; it validates the RF network match and will be used as the foundation for operational validation.
  • Complete antenna system return loss: This test is similar to the absolute test of return loss, except that it is performed with the antenna connected. It will be used to validate changes in the antenna system over time without the need for a tower climb.
  • Complete antenna system distance-to-fault: Similar to the end of the antenna system test, except the final antenna is used for termination. This test is used to validate changes in the antenna system over time without the need for a tower climb.

Summary

Transmission line and antenna testing to understand the integrity of RF networks after installation becomes more important every year, and advances in test equipment make this easier than ever. However, it has also become important to have standardized testing to ensure consistency and integrity of the results, regardless of the cable, connector, test equipment, or antenna being evaluated. The guidelines within the Methods of Procedure described in this article go a long way toward achieving this goal.
