The Data Center Industry: Growth Opportunities for the Test & Measurement Industry
by Rohan Thomas, Industry Analyst — Measurement and Instrumentation, Frost & Sullivan
In 2018, total investment in the global data center infrastructure industry exceeded $250 billion, spanning IT infrastructure, facility infrastructure, and outsourcing. Several factors affect the industry; however, two of the most significant trends are the Internet of Things (IoT) and the deployment of 5G.
While the IoT continues to place a substantial number of connected devices on the grid, 5G deployment will enhance this level of automation with dedicated network slices for individual enterprise use cases. Although that kind of 5G automation is a long way off, the data generated by IoT proliferation, the limited commercial 5G deployments already rolled out around the world, and other trends such as augmented and virtual reality have forced enterprise and service providers to rethink their data center architecture strategies. The data center must be re-engineered to become more scalable and versatile and to offer higher throughput, while also containing costs and enhancing energy efficiency.
Key Technology Trends in the Data Center Infrastructure Industry
Proliferation of the Cloud
The adoption of the cloud has helped enterprise and service providers mitigate operating expenditures by providing end users with a higher degree of scalability. Security concerns pertaining to the adoption of a multi-tenant cloud environment are a key challenge that often gets in the way of enterprise adoption of the cloud. This challenge is compounded by interoperability issues that can arise when operating on data center infrastructure that is simultaneously multi-vendor and multi-geographic.
Data Center Decentralization
Another trend is the shift toward a highly decentralized form of data center infrastructure. In a decentralized architecture, functions specific to a branch of an enterprise are broken off from the central server and placed closer to that branch in the form of a localized data center, which could consist of bare metal servers or even be deployed on the cloud. This type of infrastructure will further increase the versatility of an enterprise's or a service provider's data center, while at the same time reducing the overall redundancy of the data center network.
Edge Computing
Decentralized data centers are closely tied to edge computing, wherein core functions are placed nearer to the end-user device. Functions critical to end users are computed closer to them rather than being sent back to the cloud. Such an arrangement reduces latency and enables real-time insights on the end-user device. Edge computing will also gain importance for the many technologies and ecosystems aiming at complete automation. The autonomous vehicle is a good illustration of where edge computing will prove crucial: data from the vehicle's many sensors are processed on the vehicle itself, because a delay in transferring data to the cloud could jeopardize the instant action the vehicle must take.
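The latency argument above can be made concrete with a rough budget calculation. All of the figures below (the sensor-to-actuation deadline, the round-trip times, the compute time) are illustrative assumptions, not measured values:

```python
# Rough latency-budget sketch: can a processing site meet a control deadline?
# Every number here is an illustrative assumption, not a measurement.

def meets_deadline(network_rtt_ms: float, compute_ms: float, budget_ms: float) -> bool:
    """True if round-trip transport plus compute time fits within the budget."""
    return network_rtt_ms + compute_ms <= budget_ms

BUDGET_MS = 10.0     # assumed sensor-to-actuation deadline for an evasive maneuver
EDGE_RTT_MS = 1.0    # assumed round trip to on-vehicle / roadside edge compute
CLOUD_RTT_MS = 50.0  # assumed round trip to a distant cloud region
COMPUTE_MS = 5.0     # assumed inference time at either site

print(meets_deadline(EDGE_RTT_MS, COMPUTE_MS, BUDGET_MS))   # edge fits the budget
print(meets_deadline(CLOUD_RTT_MS, COMPUTE_MS, BUDGET_MS))  # cloud does not
```

Under these assumed numbers, only the edge site satisfies the deadline; the point is the structure of the argument, not the specific figures.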

In summary, cloud computing, data center decentralization, and edge computing are important trends in the industry. However, the shift to more decentralized, hyper-scale, and cloud-based data centers with edge compute capabilities is a major challenge for enterprise and service providers. To accomplish this task, end users require innovative testing equipment and solutions.
Growth Opportunities in the Data Center Infrastructure Industry
A plethora of equipment and solutions are used to validate data centers. These can be grouped according to the three phases of the data center lifecycle, namely, the production phase, the pre-deployment phase, and the post-deployment phase.
Production Phase
With the rising trends of 5G and IoT, data centers will have to transfer increasing amounts of data, driving the jump from 10G to 100G and, eventually, 400G (most likely skipping 200G). High-density data centers that offer higher throughput will require powerful chipsets that can operate at high frequencies. These chipsets must also be energy and cost efficient. However, the interconnects embedded within the chipsets impede this objective, as they create a substantial bottleneck to data flow. Replacing traditional interconnect materials with silicon photonics will be the most cost-efficient workaround, as the technology can transfer data at higher throughput and lower latency without raising costs.
To validate such powerful silicon photonics-based chipsets that are embedded into high-density data centers, chipset manufacturers require a specific set of electronic testing equipment. This includes source measurement units that are used for the characterization and the validation of semiconductor components that comprise a high-performance transceiver. In addition, as the network bandwidth moves to higher frequencies, design engineers will need high-resolution oscilloscopes as well as bit error rate testers that test chipsets for higher layers of network loads with tighter design margins.
Pre-deployment Phase
This phase covers the portion of the data center lifecycle after the fabrication of essential data center components, such as chipsets and transceivers, but before the data center goes live. It comprises many installation and maintenance activities.
As enterprise and service providers continue to migrate to hyper-scale cloud-based data centers, adopt IoT, and begin to accommodate 5G, data traffic will surge. This will fuel the transition to Ethernet interfaces of up to 400GbE. To boost bandwidth, carrier aggregation technologies such as FlexE and FlexO (OTUCn) will gain importance. The advent of 5G will also lead to a shift in the type of modulation used, from non-return-to-zero (NRZ) modulation to four-level pulse-amplitude modulation (PAM4). These changes will drive demand for Ethernet testing solutions that remain compatible with slower, legacy interfaces, can test interfaces with capacities of up to 400GbE, and support carrier aggregation technologies.
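The NRZ-to-PAM4 shift matters because PAM4 carries two bits per symbol where NRZ carries one, doubling the lane rate at the same symbol (baud) rate. The sketch below uses the commonly cited 26.5625 GBd lane rate and 8-lane arrangement associated with 400GbE; treat the exact figures as illustrative:

```python
# Why PAM4 matters for 400GbE: bits per symbol double relative to NRZ.
# Lane figures follow the commonly cited 8 x 53.125 Gb/s PAM4 arrangement.

def lane_rate_gbps(baud_gbd: float, bits_per_symbol: int) -> float:
    """Lane data rate = symbol rate x bits carried per symbol."""
    return baud_gbd * bits_per_symbol

BAUD_GBD = 26.5625
nrz_lane = lane_rate_gbps(BAUD_GBD, 1)   # NRZ: 1 bit/symbol  -> 26.5625 Gb/s
pam4_lane = lane_rate_gbps(BAUD_GBD, 2)  # PAM4: 2 bits/symbol -> 53.125 Gb/s

print(pam4_lane)       # 53.125
print(8 * pam4_lane)   # 425.0 aggregate signaling rate; ~400 Gb/s of payload
                       # remains after FEC and encoding overhead
```

The same doubling is why test equipment must handle tighter design margins: PAM4's four voltage levels shrink the eye openings relative to NRZ's two.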
The high-speed and voluminous data transfer across data center infrastructure will have a substantial impact on physical infrastructure. To handle all that data, architects must incorporate cables with high-density fiber. Further, the reconfiguration of offices into central offices re-architected as data centers (CORD) and head-end re-architected as data centers (HERD) will drive multi-fiber adoption and the shift from MPO 8 to MPO 12 and, eventually, MPO 24.
Therefore, fiber optic testing equipment such as optical time-domain reflectometers (OTDRs), fiber inspection probes (FIPs), and optical loss test sets (OLTSs), which can test high-density fiber cables and also check the integrity of new MPO connectors, will gain prominence.
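An OTDR locates faults by timing the light reflected back from an event: distance = (c / n) x t / 2, where n is the fiber's group index and the factor of two accounts for the round trip. A minimal sketch, assuming a typical group index of about 1.468 for standard single-mode fiber:

```python
# OTDR event location: distance to a reflective event from round-trip time.
# Group index ~1.468 is a typical value for single-mode fiber (assumption).

C_M_PER_S = 299_792_458.0  # speed of light in vacuum
GROUP_INDEX = 1.468

def event_distance_m(round_trip_s: float, n: float = GROUP_INDEX) -> float:
    """Distance to the event; divide by 2 because light travels out and back."""
    return (C_M_PER_S / n) * round_trip_s / 2.0

# A reflection arriving 10 microseconds after the pulse was launched:
print(round(event_distance_m(10e-6)))  # ~1021 m
```

High-density MPO plants multiply this measurement across many fibers per connector, which is why automated multi-fiber OTDR and inspection workflows gain prominence.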
Post-deployment Phase
This phase covers the portion of the lifecycle in which the data center is fully functional. It mainly comprises network and application performance monitoring solutions that help ensure high quality of service. Of the three phases, post-deployment is the most significant in terms of revenue; the solutions used in the other phases also serve use cases outside the data center.
Network infrastructure modernization has substantial implications on data center infrastructure and the solutions that are deployed live on the network to measure performance. The virtualization of network functions, the incorporation of small cells, the adoption of edge computing, and the inclusion of higher degrees of automation are key trends that will impact the solutions used in this phase of the lifecycle. Moreover, with the migration to a multi-tenant cloud environment, security will be brought to the forefront.

Such an environment requires a solution that can simultaneously monitor the several workloads deployed on the network while ensuring privacy without compromising on security.
Across the testing equipment and solutions used in the production, pre-deployment, and post-deployment phases, the common trend is manufacturers' emphasis on software and the incorporation of artificial intelligence (AI) and data analytics capabilities into testing solutions.
Challenges
Numerous growth opportunities exist for the testing equipment and solutions used in data centers. According to Frost & Sullivan, the industry was worth an estimated $5.59 billion in revenue in 2018, and it is expected to register a healthy compound annual growth rate (CAGR) of 13.1% between 2019 and 2024. The industry is poised for growth; however, certain factors can act as impediments, and vendors must be cognizant of them.
High Market Entry Barriers
The test and measurement market is characterized by high barriers to entry. This situation is exacerbated by the intense consolidation prevalent in the communications test and measurement space, driven by prominent vendors acquiring other participants. While acquisitions aim to create end-to-end testing solutions for customers, they can stifle competition from newer and smaller companies whose solutions may address end-user problems just as well. These smaller companies are also restrained by the brand loyalty enjoyed by well-established larger companies and by their expansive product portfolios. All these factors are detrimental to the competitive environment of the communications test and measurement market.
Testing Equipment Cost
The testing equipment used in the production phase, which includes oscilloscopes, bit error rate testers, and source measurement units, generally carries high procurement costs. Many of the functionalities associated with today's hyper-scale data centers require testing equipment with enhanced features capable of testing higher frequencies. However, end users tend to shy away from new equipment purchases due to the high costs involved.
Leading testing equipment companies try to address the cost issue by enhancing the return on investment through different asset optimization services. Such services can include a type of leasing service in which the owner can lease the equipment and increase its degree of utilization. In addition, the emphasis on software that contains key testing capabilities is higher than ever before. As a result, end users need not purchase new equipment whenever there is a new use-case that needs testing. Instead, the software with the essential testing capabilities can be downloaded and integrated with existing equipment.
Rise of Hyper-converged Data Centers
Another trend is the shift to faster, smaller, more integrated, hyper-converged data centers. While these are more powerful, scalable, and energy efficient, the amount of optical fiber they require is far less than in other types of data center installations, thereby reducing the need for the tools used to test that medium.
So What Should Communications Test and Measurement Vendors Do to Stay Relevant in Today's Data Center Infrastructure Industry?
Vendors can focus on building a comprehensive end-to-end testing platform that can be used in the device as well as in the network infrastructure. Such a platform is extremely useful in the context of IoT, which connects a host of devices that have different power requirements, operate on different latencies, and are deployed on different types of networks.
As IoT proliferation continues, the migration to cloud, which provides a more scalable environment, will also increase. However, security concerns and service-level agreement (SLA) requirements restrain migration to the cloud. This challenge will be overcome when testing vendors are able to engineer a Software-as-a-Service (SaaS)-based solution that can be deployed on the cloud. Vendors can offer such services in the cloud through collaborations with leading cloud computing service providers.
In the cloud environment, enterprise and service providers require enhanced visibility into the various third-party applications in use. Testing solutions should therefore be engineered to be API-first, so that they can interface with any third-party application the end user runs. They should also have enhanced AI and machine learning (ML) capabilities so that they can be deployed across environments that are multi-tenant, heterogeneous, and automated.
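An "API-first" design means every capability is exposed programmatically before any GUI is built, so third-party tools integrate directly. A minimal sketch using only the Python standard library; the endpoint, URL scheme, and resource names below are hypothetical, not any real product's API:

```python
# Sketch of an API-first testing solution: capabilities exposed over HTTP so
# any third-party tool can integrate. The endpoint, URL layout, and fields
# here are hypothetical illustrations, not a real vendor's API.
import json
from urllib import request

BASE_URL = "https://test-platform.example.com/api/v1"  # hypothetical endpoint

def metrics_url(tenant_id: str, link_id: str) -> str:
    """Build the per-tenant metrics URL a dashboard or AIOps tool would call."""
    return f"{BASE_URL}/tenants/{tenant_id}/links/{link_id}/metrics"

def get_link_metrics(tenant_id: str, link_id: str) -> dict:
    """Fetch link metrics (e.g., latency, loss) as JSON over plain HTTP."""
    with request.urlopen(metrics_url(tenant_id, link_id)) as resp:
        return json.load(resp)
```

Because the interface is an API rather than a GUI, the same endpoint can feed a monitoring dashboard, an AIOps pipeline, or a command-line script, which is what makes multi-tenant, automated environments tractable.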
Conclusion
While a substantial amount of investment is going into the data center infrastructure industry, end users and service providers are challenged by a slew of issues that need to be addressed. These include engineering transceivers that are more powerful yet more energy efficient, installing high-density fiber optics that can handle substantially larger loads, incorporating fast Ethernet technologies with carrier aggregation capabilities, and enhancing the visibility and security of networks with a substantially higher degree of virtualization, automation, and multi-tenancy.
These challenges are important opportunities that the testing community can capitalize on, provided they offer the right kind of solution. The solutions must be economically viable, have AI/ML capabilities, and be able to support the entire lifecycle of the data center.
Article based on “Growth Opportunities in the Data Center Test and Measurement Market, Forecast to 2024.”