Conformance Testing is the Key to Base Station Performance
by Paris Akhshi, PhD, Product Marketing Manager, Keysight Technologies
In light of ever-evolving standards, test solutions need to support higher frequencies, wider bandwidths, and new physical layer capabilities. We enjoy a variety of innovative and comprehensive mobile wireless communication applications daily thanks to the much faster, more reliable, and nearly instantaneous connections that come with 5G. Rigorous testing ensures that base stations deliver on their promises and support 5G connections.
Base stations must pass new conformance tests to ensure they meet specific standards. Conformance testing is an important part of the base station lifecycle, and it requires a thorough understanding of 3rd Generation Partnership Project (3GPP) specifications. 3GPP Release 16 brings several significant enhancements and extensions to new radio (NR), along with additional Long-Term Evolution (LTE) extensions and enhancements.
Release 16 introduces two specific sets of features. The first set includes new verticals such as multiple radio access technology (multi-RAT), dual connectivity and carrier aggregation (CA) enhancement, the industrial Internet of Things (IIoT), ultra-reliable low-latency communications (URLLC), and vehicle-to-everything (V2X). The second set of features addresses enhanced capacity and operational efficiencies, such as improvements in multiple-input, multiple-output (MIMO); integrated access and backhaul (IAB); cross-link interference/remote interference management; user equipment (UE) power savings; and mobility.
Every 5G NR base station or user equipment manufacturer must pass all the required tests before releasing products to the market. Without 3GPP compliance, products are not usable for network deployment.
Overview of 3GPP Base Station Conformance Testing
3GPP defines the RF conformance test methods and requirements for NR base stations in technical specification TS 38.141. TS 38.141-1 (Part 1) covers conducted conformance testing, and TS 38.141-2 (Part 2) covers radiated conformance testing, in both frequency range 1 (FR1) and frequency range 2 (FR2) depending on the base station type. Understanding the main conducted and radiated transmitter tests for FR1 and FR2 base stations requires robust knowledge of transmitter test items and requirements. Table 1 summarizes base station conformance tests for conducted and radiated situations.
Base Station Transmitter Conformance Test Requirements
To fully cover transmitter tests, the 5G NR measurement application running on a signal analyzer should be able to perform the tests the standards specify. The main tests include channel power and occupied bandwidth, adjacent channel leakage ratio (ACLR), operating band unwanted emissions (OBUE), spurious emissions, transmit on/off power, error vector magnitude (EVM), frequency error, and time alignment error (TAE). To ensure base station transmitters comply with 3GPP standards, it is necessary to evaluate transmitter characteristics and perform measurements on output power, output power dynamics, transmit on/off power, transmit signal quality, unwanted emissions, and transmitter intermodulation.
Power Characteristic Measurements
Power characteristic measurements include output power dynamics measurements and the transmit on/off power measurement. The purpose of the output power test is to measure the power accuracy relative to the base station's declared value when transmitting at the maximum power level. Figure 1 (left) is an example of the measurement result on a 100 MHz bandwidth time division duplex (TDD) signal with a measurement mode of channel power. The gate start and stop lines indicate the part of the frame used for the power measurement. Output power dynamics refers to the difference in levels when the base station is transmitting at maximum and minimum power levels.
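As a rough illustration of the gated channel-power idea, the sketch below averages the magnitude-squared IQ samples that fall between the gate lines and converts the result to dBm. The sample rate, gate indices, tone, and 50 Ω reference impedance are all illustrative assumptions, not values from the standard:

```python
import numpy as np

# Hypothetical gated channel-power sketch (values are illustrative).
fs = 122.88e6                        # sample rate in Hz (assumed)
t = np.arange(0, 1e-3, 1 / fs)       # 1 ms of baseband samples
iq = 0.1 * np.exp(2j * np.pi * 1e6 * t)  # placeholder signal, 0.1 V amplitude

gate = iq[1000:100000]               # samples between the gate start/stop lines
p_watts = np.mean(np.abs(gate) ** 2) / 50.0  # mean power into 50 ohms
p_dbm = 10 * np.log10(p_watts) + 30  # convert W to dBm
```

A real analyzer integrates the spectrum over the channel bandwidth instead of working on raw time samples, but the gating principle is the same.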
It’s necessary to measure on the orthogonal frequency division multiplexing (OFDM) symbols that carry only physical downlink shared channel (PDSCH) data, with no synchronization signal block (SSB) or demodulation reference signal (DMRS) within the symbol. The OFDM symbol transmit power (OSTP) is the transmit power measured over these data-only symbols, which is a requirement in dynamic power measurement. Figure 1 (right) shows an example of the output power dynamics measurement using a signal analyzer with a 5G NR measurement application.
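The data-symbol-only averaging can be sketched as below, where the resource grid, slot layout, and the choice of which symbol indices carry only PDSCH are illustrative assumptions rather than 3GPP-defined values:

```python
import numpy as np

# Hypothetical OSTP sketch: average per-subcarrier power over OFDM symbols
# that carry only PDSCH data (symbol indices are illustrative).
n_sym, n_sc = 14, 3276               # one slot, ~100 MHz at 30 kHz SCS
rng = np.random.default_rng(0)
grid = (rng.standard_normal((n_sym, n_sc))
        + 1j * rng.standard_normal((n_sym, n_sc))) / np.sqrt(2)

data_syms = [3, 4, 5, 6, 7, 8, 9, 10]        # assumed PDSCH-only symbols
ostp_lin = np.mean(np.abs(grid[data_syms]) ** 2)  # mean power, linear scale
ostp_db = 10 * np.log10(ostp_lin)
```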
The transmit on/off power measurement verifies that the transmit power is within the limits the standard defines. Two aspects require verification for this test. The first is to measure the power level when the transmitter is off and check it against the pass/fail requirement. The second is to measure the transient time: the ramp-up and ramp-down times of a burst in the TDD signal.
Transient time is measured during the period in which the transmitter changes from the off power level to on, and vice versa. A pass/fail indicator on a signal analyzer displays the results and allows them to be compared with the defined standard limits. The limits default to the values from the 3GPP specifications, but there is flexibility to modify them for specific test purposes.
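One way to picture the transient-time measurement is to find when the power envelope leaves the off level and when it reaches the on level. The waveform, sample rate, and 1%/99% thresholds below are illustrative assumptions, not the limits the standard defines:

```python
import numpy as np

# Hypothetical transient-time sketch on a synthetic TDD ramp-up.
fs = 10e6                                          # sample rate (assumed)
ramp = np.concatenate([np.zeros(100),              # transmitter off
                       np.linspace(0, 1, 100),     # ramp-up transient
                       np.ones(300)])              # transmitter on
on_level = 1.0
start = np.argmax(ramp > 0.01 * on_level)          # leaves the off level
stop = np.argmax(ramp > 0.99 * on_level)           # reaches the on level
transient_us = (stop - start) / fs * 1e6           # transient time in µs
```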
Transmit Signal Quality Measurements
Transmit signal quality is an important metric to demonstrate the quality of the transmitted signal. It includes three measurements: frequency error, modulation quality (EVM), and time alignment error for MIMO or carrier aggregation cases. Frequency error measures the difference between the actual base station transmit frequency and the assigned frequency. The purpose of frequency error measurement is to verify that frequency error falls within the limit specified by the standard minimum requirement.
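A frequency-error estimate can be sketched from the per-sample phase increment of the received carrier; the baseband setup, sample rate, and 150 Hz offset below are illustrative assumptions:

```python
import numpy as np

# Hypothetical frequency-error sketch: recover a tone's offset from the
# assigned frequency via the mean phase step between samples.
fs = 1e6                                   # sample rate (assumed)
f_assigned = 0.0                           # assigned frequency at baseband DC
f_actual = 150.0                           # actual offset to be recovered (Hz)
n = 100000
t = np.arange(n) / fs
iq = np.exp(2j * np.pi * f_actual * t)     # placeholder measured carrier

dphi = np.angle(iq[1:] * np.conj(iq[:-1])) # phase step per sample
f_error_hz = np.mean(dphi) * fs / (2 * np.pi) - f_assigned
```

Practical analyzers estimate frequency error from the demodulated reference signals rather than a bare tone, but the phase-slope principle carries over.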
Modulation quality or EVM is the difference between the measured carrier signal and an ideal signal. The error vector is defined as the vector difference between the reference, or the ideal signal, and the measured signal. In Figure 2, the EVM is the root mean square (RMS) result averaged over all the allocated subcarriers and all OFDM symbols. The purpose of this test is to verify that modulation quality is within the limit specified by the minimum requirements defined as standards.
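The RMS-over-all-symbols definition above can be sketched directly: take the vector difference between measured and ideal constellation points and normalize by the reference power. The QPSK constellation and noise level here are illustrative assumptions:

```python
import numpy as np

# Minimal EVM sketch: RMS error vector over a block of QPSK symbols.
rng = np.random.default_rng(1)
n = 1000
ideal = (np.sign(rng.standard_normal(n))
         + 1j * np.sign(rng.standard_normal(n))) / np.sqrt(2)  # reference
noise = 0.02 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
measured = ideal + noise                   # placeholder measured signal

err = measured - ideal                     # error vector per symbol
evm_rms = np.sqrt(np.mean(np.abs(err) ** 2) / np.mean(np.abs(ideal) ** 2))
evm_pct = 100 * evm_rms                    # EVM as a percentage
```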
Time alignment error (TAE) measurement finds the delay between signals from two transmit antennas and verifies that the error is within the limit specified by the minimum requirement. TAE refers to the timing gap between different DMRS ports in the transmit signal, which must stay below a certain level. This requirement applies to frame timing in MIMO transmission, carrier aggregation, and their combinations. The TAE result is the greatest timing difference between any two different NR signals.
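The timing difference between two antenna branches can be sketched with a cross-correlation peak search; the reference waveform, sample rate, and 3-sample delay below are illustrative assumptions:

```python
import numpy as np

# Hypothetical TAE sketch: find the lag between two antenna signals.
fs = 122.88e6                                   # sample rate (assumed)
rng = np.random.default_rng(2)
ref = rng.standard_normal(1024)                 # branch 1 (reference)
delayed = np.concatenate([np.zeros(3), ref])[:1024]  # branch 2, lags 3 samples

xcorr = np.correlate(delayed, ref, mode="full") # cross-correlation
lag = np.argmax(xcorr) - (len(ref) - 1)         # peak position -> lag in samples
tae_ns = lag / fs * 1e9                         # timing difference in ns
```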
Unwanted Emission Measurements
Unwanted emissions comprise out-of-band emissions and spurious emissions. Out-of-band emissions are the unwanted emissions immediately outside the channel bandwidth resulting from the modulation process and non-linearity in the transmitter, excluding spurious emissions. Spurious emissions occur due to unwanted transmitter effects such as harmonic emissions, parasitic emissions, intermodulation products, and frequency conversion products; they exclude out-of-band emissions.
The out-of-band emission requirements specified for the base station transmitter are ACLR and OBUE. These requirements target the emission impact at different frequency offsets. ACLR focuses only on the power leakage into the adjacent channels, while OBUE covers the entire operating band as well as an offset on each side.
ACLR is the ratio of the filtered mean power centered on the assigned channel frequency to the filtered mean power centered on an adjacent channel frequency. The purpose of ACLR is to control the power leakage to adjacent channels under a certain level to reduce interference. The transmitted signal could be single or multicarrier, and the requirements we discuss here apply to both.
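The ratio definition above can be sketched from a power spectrum: sum the bins of the assigned channel, sum the bins of the adjacent channel, and take the ratio in dB. The bin layout and power levels below are illustrative assumptions:

```python
import numpy as np

# Minimal ACLR sketch computed from a synthetic power spectrum.
n = 4096
psd = np.full(n, 1e-9)            # noise floor (illustrative)
psd[1536:2560] = 1e-3             # wanted channel occupies the middle bins
psd[2560:3584] += 1e-7            # leakage into the upper adjacent channel

p_main = psd[1536:2560].sum()     # filtered mean power, assigned channel
p_adj = psd[2560:3584].sum()      # filtered mean power, adjacent channel
aclr_db = 10 * np.log10(p_main / p_adj)
```

A conformance-grade measurement applies the specified measurement filter on each channel; summing equal-width bin ranges approximates that here.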
OBUE measures the emissions close to the assigned channel bandwidth of the wanted signal while the transmitter is running. The requirement covers out-of-channel emissions over the entire base station transmitter operating band, plus an offset of ΔfOBUE (the maximum offset outside the band) on each side. These emissions result from the modulation process and non-linearity in the transmitter, excluding spurious emissions.
The green trace in Figure 3 represents the limit mask for OBUE, which covers the frequency range from the channel edge to ΔfOBUE outside the operating band. The entire mask consists of a few segments on each side; the offset from the channel edge to the center of the measurement filter defines each segment’s start and stop points. Each segment defines two key properties: the basic limit, which unwanted emissions within the segment must stay under, and the bandwidth of the measurement filter. The basic limit and measurement bandwidth change with the test conditions, resulting in a series of masks for different frequency bands, base station types, classes, and categories; this corresponds to more than 10 limit masks for FR1 alone. However, an advanced preset enables the user to apply the desired mask.
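A segment-based mask of this kind can be represented as a small table of offset ranges, each with its basic limit and measurement bandwidth. The offsets, limits, and bandwidths below are illustrative placeholders, not values from any 3GPP mask:

```python
# Hypothetical OBUE mask sketch: each segment pairs an offset range with a
# basic limit and a measurement bandwidth (all numbers are illustrative).
segments = [
    {"start_mhz": 0.05, "stop_mhz": 5.0,  "limit_dbm": -7.0,  "mbw_khz": 100},
    {"start_mhz": 5.0,  "stop_mhz": 10.0, "limit_dbm": -14.0, "mbw_khz": 100},
    {"start_mhz": 10.0, "stop_mhz": 40.0, "limit_dbm": -15.0, "mbw_khz": 1000},
]

def check_emission(offset_mhz, level_dbm):
    """Return True if the measured level passes the mask at this offset."""
    for seg in segments:
        if seg["start_mhz"] <= offset_mhz < seg["stop_mhz"]:
            return level_dbm <= seg["limit_dbm"]
    return True  # offsets outside the mask's coverage are not constrained here
```

For example, a -10 dBm emission at a 2 MHz offset passes the first segment's -7 dBm basic limit, while a -5 dBm emission at the same offset fails.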
A spurious emission test verifies that spurious emissions are within the specified standard minimum requirements: under the limit from 9 kHz to 12.75 GHz, excluding the operating band plus ΔfOBUE at each edge. The test measures spurious emissions while the transmitter is running, across a much wider frequency range, to ensure the emission level stays under the limit.
The measurement region includes two parts. The first part is from 9 kHz to ΔfOBUE below the operating band, and the second part is from ΔfOBUE above the operating band to 12.75 GHz. For some operating bands, if the fifth harmonic of the upper-frequency edge is higher than 12.75 GHz, the emission limit should also cover the fifth harmonic point.
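The two-part measurement region, including the fifth-harmonic extension, can be sketched as below. The band edges and ΔfOBUE value in the example call are illustrative (loosely n78-like), not taken from the specification:

```python
# Hypothetical sketch of the spurious-emission measurement region:
# 9 kHz to 12.75 GHz, excluding the operating band widened by
# delta_f_obue on each side, extended to the 5th harmonic if needed.
def spurious_regions(f_low_hz, f_high_hz, delta_f_obue_hz):
    """Return the two (start, stop) regions swept by the spurious test."""
    upper_stop = 12.75e9
    # If the 5th harmonic of the upper band edge exceeds 12.75 GHz,
    # the coverage extends to that harmonic point.
    if 5 * f_high_hz > upper_stop:
        upper_stop = 5 * f_high_hz
    return [(9e3, f_low_hz - delta_f_obue_hz),
            (f_high_hz + delta_f_obue_hz, upper_stop)]

regions = spurious_regions(3.3e9, 3.8e9, 40e6)  # illustrative band edges
```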
The transmitter intermodulation requirement measures the transmitter’s ability to inhibit the generation of signals in its non-linear elements caused by the presence of the wanted signal and an interfering signal reaching the transmitter unit via the antenna, radio distribution network (RDN), and antenna array. The requirement applies during the transmit-on and transient periods.
5G NR Release 16 Conformance Test Challenges
As standards continue to evolve to include higher frequency ranges and wider channel bandwidths, users face new testing challenges and limitations. According to conformance requirements, all FR2 device and base station tests, and some FR1 base station tests, are radiated. This results in over-the-air (OTA) testing, which introduces additional test challenges.
The excessive path loss at higher frequencies, such as millimeter wave (mmWave), between instruments and devices under test (DUTs) results in a lower signal-to-noise ratio (SNR) for signal analysis, making transmitter measurements such as EVM, adjacent channel power (ACP), and spurious emissions challenging.
Additional challenges such as wideband noise and frequency errors add to the complexity of test setups and significantly reduce accuracy. Therefore, it is important that signal analyzer hardware and software are flexible enough to enable the right solution.
5G NR Advanced Testing Solutions
All wireless standards specify transmitter measurements at the maximum output power. Attenuating the input power level at the signal analyzer’s first mixer protects the analyzer from distortion caused by high-power input signals. An integrated preamplifier can provide a lower noise figure, but at the cost of a worse intermodulation-distortion-to-noise-floor dynamic range. Higher input mixer levels improve SNR, while lower input mixer levels help distortion performance; choosing the right input mixer level is therefore a trade-off between distortion performance and noise sensitivity.
Use the measurement hardware, the input signal’s characteristics, and the test specifications to determine the best mixer level setting. From there, optimize the input level by using an external low-noise amplifier (LNA) at the mixer input.
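The benefit of an external LNA ahead of the mixer follows from the Friis cascade formula: a low-noise first stage with gain suppresses the noise contribution of everything behind it. The gain and noise-figure values below are illustrative assumptions, not the specifications of any particular instrument:

```python
import math

# Friis cascade sketch: overall noise figure of chained RF stages.
def cascade_nf_db(stages):
    """stages: list of (gain_db, nf_db) tuples; returns cascaded NF in dB."""
    f_total, g_prod = 0.0, 1.0
    for i, (g_db, nf_db) in enumerate(stages):
        f = 10 ** (nf_db / 10)           # noise factor, linear
        if i == 0:
            f_total = f
        else:
            f_total += (f - 1) / g_prod  # later stages divided by prior gain
        g_prod *= 10 ** (g_db / 10)
    return 10 * math.log10(f_total)

without_lna = cascade_nf_db([(0.0, 15.0)])             # analyzer alone
with_lna = cascade_nf_db([(20.0, 2.0), (0.0, 15.0)])   # external LNA first
```

With these assumed numbers, a 20 dB gain, 2 dB NF front end pulls the cascaded noise figure from 15 dB down to under 3 dB, which is the mechanism behind the sensitivity improvement discussed here.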
Choose a signal analyzer that provides a built-in LNA and preamplifier for the various FR1 and FR2 test scenarios. Figure 4 shows how two-stage gain balances noise and distortion to optimize low-input-level measurement performance, improving EVM sensitivity by up to 5 dB compared with previous signal analysis solutions on the market.
Cables, connectors, mixers, and fixtures between the analyzer and the DUT can significantly affect path loss and SNR. Using an external frequency extender with an integrated preselector and RF switch extends frequency coverage up to 110 GHz without the need to manage band breaks and images. Correcting for magnitude and phase errors in the measurement setup reveals the real performance of the device.
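In its simplest magnitude-only form, such a correction just refers measured power back to the DUT plane using a fixture characterization. The frequencies, loss values, and measured levels below are illustrative placeholders:

```python
import numpy as np

# Hypothetical path-loss correction sketch: de-embed fixture loss so the
# reported power refers to the DUT output plane (all values illustrative).
freqs_ghz = np.array([24.0, 28.0, 32.0])        # characterization points
loss_db = np.array([3.1, 3.8, 4.6])             # measured fixture/cable loss
measured_dbm = np.array([-12.0, -12.9, -13.5])  # power at the analyzer input

dut_dbm = measured_dbm + loss_db                # power at the DUT plane
```

A full correction also compensates phase (e.g., from an S-parameter characterization), which matters for EVM; this sketch shows only the magnitude part.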
As standards continue to evolve, it is important to have access to appropriate equipment that meets new test and specification requirements. It is essential to consider how quickly equipment can integrate software releases to cover the latest test cases. Pairing measurement solutions with robust software applications enables a complete solution that stays ahead of ever-changing regulations. To learn more, visit: