
ThinkSystem Mellanox ConnectX-6 HDR/200GbE VPI Adapters

Product Guide

Updated: 1 Aug 2024
Form Number: LP1195
PDF size: 26 pages, 591 KB

Abstract

The ThinkSystem Mellanox ConnectX-6 HDR/200GbE VPI Adapters offer 200 Gb/s Ethernet and InfiniBand connectivity for high-performance HPC, cloud, storage and machine learning applications.

This product guide provides essential presales information to understand the adapter and its key features, specifications, and compatibility. This guide is intended for technical specialists, sales specialists, sales engineers, IT architects, and other IT professionals who want to learn more about the ConnectX-6 HDR VPI adapters and consider their use in IT solutions.

Change History

Changes in the August 1, 2024 update:

Introduction

The ThinkSystem Mellanox ConnectX-6 HDR/200GbE VPI Adapters offer 200 Gb/s Ethernet and InfiniBand connectivity for high-performance HPC, cloud, storage and machine learning applications.

The following figure shows the ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter connected to the ThinkSystem Mellanox HDR/200GbE Aux Adapter (the standard heat sink has been removed in this photo).

Figure 1. ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter (right) and ThinkSystem Mellanox HDR/200GbE Aux Adapter (left)

Did you know?

Mellanox ConnectX-6 brings new acceleration engines for maximizing performance in High Performance Computing, Machine Learning, Storage, Web 2.0, Cloud, Data Analytics and Telecommunications platforms. ConnectX-6 HDR adapters support up to 200 Gb/s total bandwidth at sub-600 nanosecond latency, plus NVMe over Fabrics offloads, providing the highest performance and most flexible solution for the most demanding applications and markets. ThinkSystem servers with Mellanox adapters and switches deliver the most intelligent fabrics for High Performance Computing clusters.

Part number information

For servers with support for PCIe 4.0 host interfaces, the ConnectX-6 HDR adapter can be used by itself in a single PCIe 4.0 x16 slot to provide 200 Gb/s connectivity. For servers with PCIe 3.0 interfaces, the ConnectX-6 HDR adapter is used in conjunction with the Aux adapter. The HDR adapter and the Aux adapter are connected together via a cable (included with the Aux adapter), and their combined PCIe 3.0 x32 host interface provides enough bandwidth for 200 Gb/s connectivity.
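
The bandwidth requirement can be checked with a rough calculation: PCIe 3.0 runs at 8 GT/s per lane and PCIe 4.0 at 16 GT/s per lane, both with 128b/130b encoding, so a PCIe 3.0 x16 link tops out at roughly 126 Gb/s, while PCIe 3.0 x32 or PCIe 4.0 x16 provides roughly 252 Gb/s. The following Python sketch is illustrative only (it is not part of this guide and ignores transaction-layer overhead, which further reduces usable throughput):

    # Back-of-the-envelope PCIe bandwidth check (illustrative only; ignores
    # protocol overhead beyond the 128b/130b line encoding).
    ENCODING = 128 / 130  # line encoding used by PCIe 3.0 and PCIe 4.0

    def usable_gbps(gt_per_s: float, lanes: int) -> float:
        """Approximate usable link bandwidth in Gb/s."""
        return gt_per_s * lanes * ENCODING

    PORT_GBPS = 200  # HDR InfiniBand / 200GbE port speed

    for label, gt, lanes in [("PCIe 3.0 x16", 8, 16),
                             ("PCIe 3.0 x32 (adapter + Aux adapter)", 8, 32),
                             ("PCIe 4.0 x16", 16, 16)]:
        bw = usable_gbps(gt, lanes)
        verdict = "sufficient" if bw >= PORT_GBPS else "insufficient"
        print(f"{label}: ~{bw:.0f} Gb/s -> {verdict} for a {PORT_GBPS} Gb/s port")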

The following table shows the part numbers for the adapters.

CTO orders: For configure-to-order builds, these adapters are available only when you select one of the HPC & AI modes in the DCSC configurator. They are not available in the General Purpose mode of DCSC.

Table 1. Ordering information
Part number Feature code Mellanox equivalent Description
Primary adapter
4C57A15326 B4RC / BN38 MCX653105A-HDAT ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter
CTO only B4RG None ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter (SharedIO) DWC
CTO only B951 None ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC
CTO only B952 None ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter DWC
4XC7A86672 BKSK None ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC
Shared I/O auxiliary adapter/cable
4C57A14179 B4RB MTMK0012 ThinkSystem Mellanox HDR/200GbE 2x PCIe Aux Kit
CTO only BPZG None ThinkSystem SD665 V3 SharedIO Cable
CTO only BRL2 None ThinkSystem SD650, SD650-I V3 Auxiliary Cable

Part number 4C57A15326 includes the following:

  • One Mellanox adapter with full-height (3U) adapter bracket attached
  • Low-profile (2U) adapter bracket
  • Documentation

Part number 4C57A14179 includes the following:

  • One Mellanox adapter with full-height (3U) adapter bracket attached
  • Low-profile (2U) adapter bracket
  • 350mm cable
  • Documentation

Note: 4C57A15326 was previously named ThinkSystem Mellanox ConnectX-6 HDR QSFP56 1-port PCIe 4 InfiniBand Adapter

Supported transceivers and cables

The adapter has an empty QSFP56 cage for connectivity.

The following table lists the supported transceivers.

Table 2. Transceivers
Part number Feature code Description
100Gb Transceivers
4M27A67042 BFH1 Lenovo 100Gb SR4 QSFP28 Ethernet Transceiver
7G17A03539 AV1D Lenovo 100GBase-SR4 QSFP28 Transceiver
4TC7A86257 BVA4 Lenovo 100GBase-SR4 QSFP28 Transceiver
Converters/Adapters
4G17A10853 B306 Mellanox 100G QSFP28 to 25G SFP28 Cable Adapter

Configuration notes:

  • Transceiver AV1D also supports 40Gb when installed in a Mellanox adapter.
  • For the transceiver and cable support for the Mellanox QSA 100G to 25G Cable Adapter (4G17A10853), see the 25G Cable Adapter transceiver and cable support section.

The following table lists the supported fiber optic cables and Active Optical Cables.

Table 3. Optical cables
Part number Feature code Description
QSFP28 EDR InfiniBand Optical Cables
00MP563 ASRN 3m Mellanox EDR IB Optical QSFP28 Cable
00MP540 ASQZ 5m Mellanox EDR IB Optical QSFP28 Cable
00MP544 ASR0 10m Mellanox EDR IB Optical QSFP28 Cable
00MP548 ASR1 15m Mellanox EDR IB Optical QSFP28 Cable
00MP552 ASR2 20m Mellanox EDR IB Optical QSFP28 Cable
00MP556 ASR3 30m Mellanox EDR IB Optical QSFP28 Cable
00MP566 ASRP 50m Mellanox EDR IB Optical QSFP28 Cable
QSFP 40Gb Active Optical Cables
7Z57A04256 AX42 Lenovo 1m 40G QSFP+ Active Optical Cable
00YL652 ATZ3 Lenovo 3m 40G QSFP+ to QSFP+ Active Optical Cable
00YL655 ATZ4 Lenovo 5m 40G QSFP+ to QSFP+ Active Optical Cable
00YL658 ATZ5 Lenovo 7m 40G QSFP+ to QSFP+ Active Optical Cable
00YL661 ATZ6 Lenovo 15m 40G QSFP+ to QSFP+ Active Optical Cable
00YL664 ATZ7 Lenovo 20m 40G QSFP+ to QSFP+ Active Optical Cable
QSFP OM3 Optical Cables (these cables require a transceiver)
00VX003 AT2U Lenovo 10m QSFP+ MPO-MPO OM3 MMF Cable
00VX005 AT2V Lenovo 30m QSFP+ MPO-MPO OM3 MMF Cable
QSFP28 100Gb Ethernet Active Optical Cables
4X97A94703 B2UZ Lenovo 1m 100G QSFP28 Active Optical Cable
4X97A94014 AV1L Lenovo 3m 100G QSFP28 Active Optical Cable
4X97A94015 AV1M Lenovo 5m 100G QSFP28 Active Optical Cable
4X97A94016 AV1N Lenovo 10m 100G QSFP28 Active Optical Cable
4X97A94704 AV1P Lenovo 15m 100G QSFP28 Active Optical Cable
4X97A94705 AV1Q Lenovo 20m 100G QSFP28 Active Optical Cable
4Z57A10844 C1MC Lenovo 1m 100G QSFP28 Active Optical Cable
7Z57A03546 C10P Lenovo 3m 100G QSFP28 Active Optical Cable
7Z57A03547 C10Q Lenovo 5m 100G QSFP28 Active Optical Cable
7Z57A03548 C10M Lenovo 10m 100G QSFP28 Active Optical Cable
7Z57A03549 C1MD Lenovo 15m 100G QSFP28 Active Optical Cable
7Z57A03550 C1ME Lenovo 20m 100G QSFP28 Active Optical Cable
100G MPO OM4 MMF Cables (these cables require a transceiver)
7Z57A03567 AV25 Lenovo 5m MPO-MPO OM4 MMF Cable
7Z57A03568 AV26 Lenovo 7m MPO-MPO OM4 MMF Cable
7Z57A03569 AV27 Lenovo 10m MPO-MPO OM4 MMF Cable
7Z57A03570 AV28 Lenovo 15m MPO-MPO OM4 MMF Cable
7Z57A03571 AV29 Lenovo 20m MPO-MPO OM4 MMF Cable
7Z57A03572 AV2A Lenovo 30m MPO-MPO OM4 MMF Cable
QSFP56 HDR IB Optical Cables
4Z57A14188 B4QW 3m Mellanox HDR IB Active Optical QSFP56 Cable
4Z57A14189 B4QX 5m Mellanox HDR IB Active Optical QSFP56 Cable
4Z57A14190 B4QY 10m Mellanox HDR IB Active Optical QSFP56 Cable
4Z57A14191 B4QZ 15m Mellanox HDR IB Active Optical QSFP56 Cable
4Z57A14192 B4R0 20m Mellanox HDR IB Active Optical QSFP56 Cable
4Z57A16016 B68P 30m Mellanox HDR IB Optical QSFP56 Cable
4Z57A16017 B68N 50m Mellanox HDR IB Optical QSFP56 Cable
4Z57A16018 B68M 100m Mellanox HDR IB Optical QSFP56 Cable
4Z57A72553 BFXR 3m Mellanox HDR IB Optical QSFP56 Low Latency Cable
4Z57A72554 BFXS 5m Mellanox HDR IB Optical QSFP56 Low Latency Cable
4Z57A72555 BFXT 10m Mellanox HDR IB Optical QSFP56 Low Latency Cable
4Z57A72556 BFXU 15m Mellanox HDR IB Optical QSFP56 Low Latency Cable
4Z57A72557 BFXV 20m Mellanox HDR IB Optical QSFP56 Low Latency Cable
4Z57A72558 BFXW 30m Mellanox HDR IB Optical QSFP56 Low Latency Cable
4Z57A72559 BFXX 50m Mellanox HDR IB Optical QSFP56 Low Latency Cable
4Z57A72560 BFXY 100m Mellanox HDR IB Optical QSFP56 Low Latency Cable
QSFP56 HDR IB to 2x HDR100 Optical Splitter Cables
4Z57A14196 B4R4 3m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Cable
4Z57A14197 B4R5 5m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Cable
4Z57A14198 B4R6 10m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Cable
4Z57A14199 B4R7 15m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Cable
4Z57A14214 B4R8 20m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Cable
4Z57A11490 B68K 30m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Cable
4Z57A72561 BFXZ 3m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Low Latency Cable
4Z57A72562 BFY0 5m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Low Latency Cable
4Z57A72563 BFY1 10m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Low Latency Cable
4Z57A72564 BFY2 15m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Low Latency Cable
4Z57A72565 BFY3 20m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Low Latency Cable
4Z57A72566 BFY4 30m Mellanox HDR IB to 2x HDR100 Splitter Optical QSFP56 Low Latency Cable
Mellanox NDR NDRx2 OSFP800 to 2x HDR QSFP56 Optical Splitter Cables
4X97A81844 BQKC Lenovo 5m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Active Optical Splitter Cable
4X97A81845 BQKD Lenovo 10m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Active Optical Splitter Cable

The following table lists the supported direct-attach copper (DAC) cables.

Table 4. Copper cables
Part number Feature code Description
QSFP28 EDR InfiniBand Passive Copper Cables
00MP516 ASQT 0.5m Mellanox EDR IB Passive Copper QSFP28 Cable
00MP520 ASQU 0.75m Mellanox EDR IB Passive Copper QSFP28 Cable
00MP524 ASQV 1m Mellanox EDR IB Passive Copper QSFP28 Cable
00MP528 ASQW 1.25m Mellanox EDR IB Passive Copper QSFP28 Cable
00MP532 ASQX 1.5m Mellanox EDR IB Passive Copper QSFP28 Cable
00MP536 ASQY 2m Mellanox EDR IB Passive Copper QSFP28 Cable
00MP560 ASRM 3m Mellanox EDR IB Passive Copper QSFP28 Cable
QSFP28 100Gb Ethernet Passive DAC Cables
7Z57A03561 AV1Z Lenovo 1m Passive 100G QSFP28 DAC Cable
7Z57A03562 AV20 Lenovo 3m Passive 100G QSFP28 DAC Cable
7Z57A03563 AV21 Lenovo 5m Passive 100G QSFP28 DAC Cable
QSFP56 HDR InfiniBand Passive DAC Cables
4Z57A14182 B4QQ 0.5m Mellanox HDR IB Passive Copper QSFP56 Cable
4Z57A14183 B4QR 1m Mellanox HDR IB Passive Copper QSFP56 Cable
4Z57A14184 B4QS 1.5m Mellanox HDR IB Passive Copper QSFP56 Cable
4Z57A14185 B4QT 2m Mellanox HDR IB Passive Copper QSFP56 Cable
QSFP56 HDR InfiniBand to 2x HDR100 Passive DAC Splitter Cables
4Z57A14193 B4R1 1m Mellanox HDR IB to 2x HDR100 Splitter Passive Copper QSFP56 Cable
4Z57A14194 B4R2 1.5m Mellanox HDR IB to 2x HDR100 Splitter Passive Copper QSFP56 Cable
4Z57A11477 B68L 2m Mellanox HDR IB to 2x HDR100 Splitter Passive Copper QSFP56 Cable
QSFP56 200Gb Ethernet Passive DAC Cables
4X97A11113 BF6W Lenovo 1m Passive 200G QSFP56 Ethernet DAC Cable
4X97A12613 BF92 Lenovo 3m Passive 200G QSFP56 Ethernet DAC Cable
QSFP56 HDR InfiniBand Active DAC Cables
4X97A12610 BCQW 3m Mellanox HDR IB Active Copper QSFP56 Cable
4X97A12611 BCQX 4m Mellanox HDR IB Active Copper QSFP56 Cable
Mellanox NDR NDRx2 OSFP800 to 2x HDR QSFP56 Copper Splitter Cables
4X97A81841 BQK9 Lenovo 1m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Passive Copper Splitter Cable
4X97A81842 BQKA Lenovo 1.5m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Passive Copper Splitter Cable
4X97A81843 BQKB Lenovo 2m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Passive Copper Splitter Cable
4X97A81844 BQKC Lenovo 5m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Active Optical Splitter Cable
4X97A81845 BQKD Lenovo 10m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Active Optical Splitter Cable
4X97A87753 BVB8 Lenovo 20m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Active Optical Splitter Cable
4X97A87754 BVB9 Lenovo 30m NVIDIA NDRx2 OSFP800 to 2x HDR QSFP56 Active Optical Splitter Cable
Mellanox NDR NDRx2 OSFP800 to 4x HDR QSFP56 Copper Splitter Cables
4X97A81846 BQKE Lenovo 1m NVIDIA NDRx2 OSFP800 to 4x HDR100 QSFP56 Passive Copper Splitter Cable
4X97A81847 BQKF Lenovo 1.5m NVIDIA NDRx2 OSFP800 to 4x HDR100 QSFP56 Passive Copper Splitter Cable
4X97A81848 BQKG Lenovo 2m NVIDIA NDRx2 OSFP800 to 4x HDR100 QSFP56 Passive Copper Splitter Cable

25G Cable Adapter transceiver and cable support

The Mellanox QSA 100G to 25G Cable Adapter (4G17A10853) supports the transceivers listed in the following table.

Table 5. Transceivers for Mellanox QSA 100G to 25G Cable Adapter (4G17A10853)
Part number Feature code Description
1Gb transceivers
00FE333 A5DL SFP 1000Base-T (RJ-45) Transceiver
10Gb transceivers
46C3447 5053 SFP+ SR Transceiver
7G17A03130 AVV1 Lenovo 10GBaseT SFP+ Transceiver
4TC7A78615 BNDR ThinkSystem Accelink 10G SR SFP+ Ethernet transceiver
25Gb transceivers
7G17A03537 AV1B Lenovo Dual Rate 10G/25G SR SFP28 Transceiver
4M27A67041 BFH2 Lenovo 25Gb SR SFP28 Ethernet Transceiver

The Mellanox QSA 100G to 25G Cable Adapter (4G17A10853) supports the fiber optic cables and Active Optical Cables listed in the following table.

Table 6. Optical cables for Mellanox QSA 100G to 25G Cable Adapter (4G17A10853)
Part number Feature code Description
LC-LC OM3 Fiber Optic Cables (these cables require a 10 GbE SFP+ SR or 25 GbE SFP28 SR transceiver)
00MN499 ASR5 Lenovo 0.5m LC-LC OM3 MMF Cable
00MN502 ASR6 Lenovo 1m LC-LC OM3 MMF Cable
00MN505 ASR7 Lenovo 3m LC-LC OM3 MMF Cable
00MN508 ASR8 Lenovo 5m LC-LC OM3 MMF Cable
00MN511 ASR9 Lenovo 10m LC-LC OM3 MMF Cable
00MN514 ASRA Lenovo 15m LC-LC OM3 MMF Cable
00MN517 ASRB Lenovo 25m LC-LC OM3 MMF Cable
00MN520 ASRC Lenovo 30m LC-LC OM3 MMF Cable
MTP-4xLC OM3 MMF Breakout Cables (these cables require a transceiver)
00FM412 A5UA Lenovo 1m MPO-4xLC OM3 MMF Breakout Cable
00FM413 A5UB Lenovo 3m MPO-4xLC OM3 MMF Breakout Cable
00FM414 A5UC Lenovo 5m MPO-4xLC OM3 MMF Breakout Cable
SFP+ 10Gb Active Optical Cables
00YL634 ATYX Lenovo 1m SFP+ to SFP+ Active Optical Cable
00YL637 ATYY Lenovo 3m SFP+ to SFP+ Active Optical Cable
00YL640 ATYZ Lenovo 5m SFP+ to SFP+ Active Optical Cable
00YL643 ATZ0 Lenovo 7m SFP+ to SFP+ Active Optical Cable
00YL646 ATZ1 Lenovo 15M SFP+ to SFP+ Active Optical Cable
00YL649 ATZ2 Lenovo 20m SFP+ to SFP+ Active Optical Cable
SFP28 25Gb Active Optical Cables
7Z57A03541 AV1F Lenovo 3m 25G SFP28 Active Optical Cable
7Z57A03542 AV1G Lenovo 5m 25G SFP28 Active Optical Cable
7Z57A03543 AV1H Lenovo 10m 25G SFP28 Active Optical Cable
7Z57A03544 AV1J Lenovo 15m 25G SFP28 Active Optical Cable
7Z57A03545 AV1K Lenovo 20m 25G SFP28 Active Optical Cable
QSFP28 100Gb Ethernet Breakout Active Optical Cables
7Z57A03551 AV1R Lenovo 3m 100G to 4x25G Breakout Active Optical Cable
7Z57A03552 AV1S Lenovo 5m 100G to 4x25G Breakout Active Optical Cable
7Z57A03553 AV1T Lenovo 10m 100G to 4x25G Breakout Active Optical Cable
7Z57A03554 AV1U Lenovo 15m 100G to 4x25G Breakout Active Optical Cable
7Z57A03555 AV1V Lenovo 20m 100G to 4x25G Breakout Active Optical Cable
OM4 LC to LC Cables (these cables require a transceiver)
4Z57A10845 B2P9 Lenovo 0.5m LC-LC OM4 MMF Cable
4Z57A10846 B2PA Lenovo 1m LC-LC OM4 MMF Cable
4Z57A10847 B2PB Lenovo 3m LC-LC OM4 MMF Cable
4Z57A10848 B2PC Lenovo 5m LC-LC OM4 MMF Cable
4Z57A10849 B2PD Lenovo 10m LC-LC OM4 MMF Cable
4Z57A10850 B2PE Lenovo 15m LC-LC OM4 MMF Cable
4Z57A10851 B2PF Lenovo 25m LC-LC OM4 MMF Cable
4Z57A10852 B2PG Lenovo 30m LC-LC OM4 MMF Cable

The Mellanox QSA 100G to 25G Cable Adapter (4G17A10853) supports the direct-attach copper (DAC) cables listed in the following table.

Table 7. Copper cables for Mellanox QSA 100G to 25G Cable Adapter (4G17A10853)
Part number Feature code Description
SFP+ 10Gb Passive DAC Cables
00D6288 A3RG 0.5m Passive DAC SFP+ Cable
90Y9427 A1PH 1m Passive DAC SFP+ Cable
00AY764 A51N 1.5m Passive DAC SFP+ Cable
00AY765 A51P 2m Passive DAC SFP+ Cable
90Y9430 A1PJ 3m Passive DAC SFP+ Cable
90Y9433 A1PK 5m Passive DAC SFP+ Cable
00D6151 A3RH 7m Passive DAC SFP+ Cable
90Y9436 A1PL 8.5m Passive DAC SFP+ Cable
SFP+ 10Gb Active DAC Cables
00VX111 AT2R Lenovo 1m Active DAC SFP+ Cables
00VX114 AT2S Lenovo 3m Active DAC SFP+ Cables
00VX117 AT2T Lenovo 5m Active DAC SFP+ Cables
SFP28 25Gb Passive DAC Cables
7Z57A03557 AV1W Lenovo 1m Passive 25G SFP28 DAC Cable
7Z57A03558 AV1X Lenovo 3m Passive 25G SFP28 DAC Cable
7Z57A03559 AV1Y Lenovo 5m Passive 25G SFP28 DAC Cable
QSFP28 100G-to-4x25G Ethernet Breakout Cables
7Z57A03564 AV22 Lenovo 1m 100G QSFP28 to 4x25G SFP28 Breakout DAC Cable
4Z57A85043 BS32 Lenovo 1.5m 100G to 4x25G Breakout SFP28 Breakout DAC Cable
4Z57A85044 BS33 Lenovo 2m 100G to 4x25G Breakout SFP28 Breakout DAC Cable
7Z57A03565 AV23 Lenovo 3m 100G QSFP28 to 4x25G SFP28 Breakout DAC Cable
7Z57A03566 AV24 Lenovo 5m 100G QSFP28 to 4x25G SFP28 Breakout DAC Cable

Features

Machine learning and big data environments

Data analytics has become an essential function within many enterprise data centers, clouds and hyperscale platforms. Machine learning relies on especially high throughput and low latency to train deep neural networks and to improve recognition and classification accuracy. ConnectX-6 offers an excellent solution to provide machine learning applications with the levels of performance and scalability that they require.

ConnectX-6 utilizes RDMA technology to deliver low latency and high performance. ConnectX-6 enhances RDMA network capabilities even further by delivering end-to-end packet-level flow control.

Security

ConnectX-6 block-level encryption offers a critical innovation to network security. As data in transit is stored or retrieved, it undergoes encryption and decryption. The ConnectX-6 hardware offloads IEEE AES-XTS encryption/decryption from the CPU, reducing latency and CPU utilization. It also guarantees protection for users sharing the same resources through the use of dedicated encryption keys.

By performing block-storage encryption in the adapter, ConnectX-6 eliminates the need for self-encrypting drives. This gives customers the freedom to choose their preferred storage devices, including those that traditionally do not provide encryption. ConnectX-6 can support Federal Information Processing Standards (FIPS) compliance.

ConnectX-6 also includes a hardware Root-of-Trust (RoT), which uses HMAC relying on a device-unique key. This provides both secure boot and cloning protection. Delivering best-in-class device and firmware protection, ConnectX-6 also provides secured debugging capabilities without the need for physical access.

Storage environments

NVMe storage devices offer very fast access to storage media. The evolving NVMe over Fabrics (NVMe-oF) protocol leverages RDMA connectivity to remotely access NVMe storage devices efficiently, while keeping the end-to-end NVMe model at the lowest latency. With its NVMe-oF target and initiator offloads, ConnectX-6 brings further optimization to NVMe-oF, enhancing CPU utilization and scalability.

Cloud and Web 2.0 environments

Telco, Cloud and Web 2.0 customers developing their platforms on software-defined network (SDN) environments are leveraging the Virtual Switching capabilities of server operating systems to enable maximum flexibility in the management and routing protocols of their networks.

Open vSwitch (OVS) is an example of a virtual switch that allows virtual machines to communicate among themselves and with the outside world. Software-based virtual switches, traditionally residing in the hypervisor, are CPU intensive, affecting system performance and preventing full utilization of available CPU for compute functions.

To address such performance issues, ConnectX-6 offers Mellanox Accelerated Switching and Packet Processing (ASAP2) Direct technology. ASAP2 offloads the vSwitch/vRouter by handling the data plane in the NIC hardware while leaving the control plane unmodified. As a result, significantly higher vSwitch/vRouter performance is achieved without the associated CPU load.

The vSwitch/vRouter offload functions supported by ConnectX-5 and ConnectX-6 include encapsulation and de-capsulation of overlay network headers, as well as stateless offloads of inner packets, packet header re-writes (enabling NAT functionality), hairpin, and more.

In addition, ConnectX-6 offers intelligent flexible pipeline capabilities, including programmable flexible parser and flexible match-action tables, which enable hardware offloads for future protocols.
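
As an illustration of how the ASAP2-style OVS offload described above is typically enabled on a Linux host, the following sketch shows commonly used steps: switching the NIC's embedded switch to switchdev mode, enabling TC hardware offload on the physical function, and turning on hardware offload in Open vSwitch. This is a hypothetical minimal example (wrapped in Python only for consistency with the other sketches in this guide); the PCI address, interface name, and service name are placeholders, and the exact procedure depends on the driver, MLNX_OFED release, and operating system.

    # Hypothetical sketch: enable OVS hardware offload on a ConnectX NIC under
    # Linux. The PCI address, interface name, and service name are placeholders;
    # consult the NVIDIA/Mellanox documentation for your driver and OS.
    import subprocess

    PF_PCI = "0000:3b:00.0"  # placeholder PCI address of the physical function
    PF_NETDEV = "ens1f0"     # placeholder network interface name

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Put the NIC's embedded switch into switchdev mode (creates VF representors).
    run(["devlink", "dev", "eswitch", "set", f"pci/{PF_PCI}", "mode", "switchdev"])

    # 2. Enable TC flower hardware offload on the physical function.
    run(["ethtool", "-K", PF_NETDEV, "hw-tc-offload", "on"])

    # 3. Ask Open vSwitch to push flows down to the NIC, then restart it
    #    (the service name varies by distribution).
    run(["ovs-vsctl", "set", "Open_vSwitch", ".", "other_config:hw-offload=true"])
    run(["systemctl", "restart", "openvswitch"])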

Socket Direct

Mellanox’s Socket Direct technology improves the performance of dual-socket servers in numerous ways, such as by enabling each of their CPUs to access the network through a dedicated PCIe interface. As the connection from each CPU to the network bypasses the QPI (UPI) and the second CPU, Socket Direct reduces latency and CPU utilization. Moreover, each CPU handles only its own traffic (and not that of the second CPU), thus optimizing CPU utilization even further.

Socket Direct also enables GPUDirect RDMA for all CPU/GPU pairs by ensuring that GPUs are linked to the CPUs closest to the adapter card. Socket Direct enables Intel DDIO optimization on both sockets by creating a direct connection between the sockets and the adapter card.

Socket Direct technology is enabled by a main card housing the ConnectX-6 and an auxiliary PCIe card bringing in the remaining PCIe lanes. The ConnectX-6 Socket Direct card is installed into two PCIe x16 slots and connected using the supplied 350mm cable.

The two PCIe x16 slots may also be connected to the same CPU. In this case, the main advantage of the technology lies in delivering 200Gb/s to servers with PCIe Gen3-only support.
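
One simple way to observe the locality that Socket Direct is designed to provide is to check which NUMA node each network interface's PCIe function reports on a Linux host; with Socket Direct, each CPU has a locally attached PCIe function rather than reaching the adapter across the inter-processor link. The following generic sketch (not specific to this adapter or to any Lenovo tool) reads the standard sysfs attribute:

    # Generic Linux sketch: report the NUMA node that each network interface's
    # PCIe device is attached to (-1 means the platform did not report a node).
    import glob

    for path in sorted(glob.glob("/sys/class/net/*/device/numa_node")):
        iface = path.split("/")[4]  # interface name taken from the sysfs path
        with open(path) as f:
            node = f.read().strip()
        print(f"{iface}: NUMA node {node}")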

SharedIO

An implementation of Socket Direct is SharedIO (Shared I/O), where a Mellanox VPI adapter is installed in one slot in one server and an auxiliary adapter is installed in a slot in a second server in the same enclosure.

The result is that the two servers or processors share the network connection of the VPI adapter, with significant savings both in the cost of the adapters and in the cost of switch ports.

The following figure shows the Mellanox SharedIO Adapter and Auxiliary Card installed in two ThinkSystem SD650 V2 servers in the same tray.

Figure 2. SharedIO adapters installed in the two SD650 V2 servers on a tray

Technical specifications

The adapters have the following technical specifications.

Form factor

  • Single-slot low-profile main adapter (6.6 in. x 2.71 in.)
  • Single-slot low-profile auxiliary adapter

PCI Express Interface

  • Supports PCIe 4.0 or PCIe 3.0
    • In PCIe 4.0 servers, the ConnectX-6 adapter is used by itself to connect 16 PCIe lanes. For two-socket servers, the adapter can be used with the Aux adapter to enable the Socket Direct feature.
    • In PCIe 3.0 servers, the ConnectX-6 adapter is used with the Aux adapter to connect 32 PCIe lanes. For two-socket servers, the adapter can be used with the Aux adapter to enable the Socket Direct feature.
  • PCIe Atomic
  • TLP (Transaction Layer Packet) Processing Hints (TPH)
  • PCIe switch Downstream Port Containment (DPC) enablement for PCIe hot-plug
  • Advanced Error Reporting (AER)
  • Access Control Service (ACS) for peer-to-peer secure communication
  • Process Address Space ID (PASID) Address Translation Services (ATS)
  • IBM CAPIv2 (Coherent Accelerator Processor Interface)
  • Support for MSI/MSI-X mechanisms

Connectivity

  • One QSFP56 port
  • Supports passive copper cables with ESD protection
  • Powered connectors for optical and active cable support

InfiniBand

  • Supports interoperability with InfiniBand switches (up to HDR, as 4 lanes of 50Gb/s data rate)
  • Total connectivity is 200 Gb/s:
    • One port adapter supports a single 200 Gb/s link
  • HDR / HDR100 / EDR / FDR / QDR / DDR / SDR
  • IBTA Specification 1.3 compliant
  • RDMA, Send/Receive semantics
  • Hardware-based congestion control
  • Atomic operations
  • 16 million I/O channels
  • 256 to 4Kbyte MTU, 2Gbyte messages
  • 8 virtual lanes + VL15

Ethernet (requires firmware 20.28.1002 or later)

  • Supports interoperability with Ethernet switches (up to 200GbE, as 4 lanes of 50Gb/s data rate)
  • Total connectivity is 200 Gb/s:
    • One port adapter supports a single 200 Gb/s link
  • Supports 200 GbE / 100GbE / 50GbE / 40GbE / 25GbE / 10GbE / 1GbE
  • IEEE 802.3bj, 802.3bm 100 Gigabit Ethernet
  • IEEE 802.3by, Ethernet Consortium 25, 50 Gigabit Ethernet, supporting all FEC modes
  • IEEE 802.3ba 40 Gigabit Ethernet
  • IEEE 802.3ae 10 Gigabit Ethernet
  • IEEE 802.3az Energy Efficient Ethernet
  • IEEE 802.3ap based auto-negotiation and KR startup
  • IEEE 802.3ad, 802.1AX Link Aggregation
  • IEEE 802.1Q, 802.1P VLAN tags and priority
  • IEEE 802.1Qau (QCN) – Congestion Notification
  • IEEE 802.1Qaz (ETS)
  • IEEE 802.1Qbb (PFC)
  • IEEE 802.1Qbg
  • IEEE 1588v2
  • Jumbo frame support (9.6KB)
  • IPv4 (RFC 791)
  • IPv6 (RFC 2460)

Enhanced Features

  • Hardware-based reliable transport
  • Collective operations offloads
  • Vector collective operations offloads
  • PeerDirect RDMA (GPUDirect) communication acceleration
  • 64b/66b encoding
  • Enhanced Atomic operations
  • Advanced memory mapping support, allowing user mode registration and remapping of memory (UMR)
  • Extended Reliable Connected transport (XRC)
  • Dynamically Connected transport (DCT)
  • On demand paging (ODP)
  • MPI Tag Matching
  • Rendezvous protocol offload
  • Out-of-order RDMA supporting Adaptive Routing
  • Burst buffer offload
  • In-Network Memory registration-free RDMA memory access

CPU Offloads

  • RDMA over Converged Ethernet (RoCE)
  • TCP/UDP/IP stateless offload
  • LSO, LRO, checksum offload
  • RSS (also on encapsulated packet), TSS, HDS, VLAN and MPLS tag insertion/stripping, Receive flow steering
  • Data Plane Development Kit (DPDK) for kernel bypass applications
  • Open VSwitch (OVS) offload using ASAP2
    • Flexible match-action flow tables
    • Tunneling encapsulation / de-capsulation
  • Intelligent interrupt coalescence
  • Header rewrite supporting hardware offload of NAT router

Storage Offloads

  • Block-level encryption: XTS-AES 256/512 bit key
  • NVMe over Fabric offloads for target machine
  • Erasure Coding offload - offloading Reed-Solomon calculations
  • T10 DIF - signature handover operation at wire speed, for ingress and egress traffic
  • Storage Protocols: SRP, iSER, NFS RDMA, SMB Direct, NVMe-oF

Overlay Networks

  • RoCE over overlay networks
  • Stateless offloads for overlay network tunneling protocols
  • Hardware offload of encapsulation and decapsulation of VXLAN, NVGRE, and GENEVE overlay networks

Hardware-Based I/O Virtualization

  • Single Root IOV
  • Address translation and protection
  • VMware NetQueue support
  • SR-IOV: Up to 512 Virtual Functions
  • SR-IOV: Up to 16 Physical Functions per host
  • Virtualization hierarchies (network partitioning, NPAR)
    • Virtualizing Physical Functions on a physical port
    • SR-IOV on every Physical Function
  • Configurable and user-programmable QoS
  • Guaranteed QoS for VMs

HPC Software Libraries

  • HPC-X, OpenMPI, MVAPICH, MPICH, OpenSHMEM, PGAS and varied commercial packages

Management and Control

  • NC-SI, MCTP over SMBus and MCTP over PCIe - BMC interface
  • PLDM for Monitor and Control DSP0248
  • PLDM for Firmware Update DSP0267
  • SDN management interface for managing the eSwitch
  • I2C interface for device control and configuration
  • General Purpose I/O pins
  • SPI interface to Flash
  • JTAG IEEE 1149.1 and IEEE 1149.6

Remote Boot

  • Remote boot over InfiniBand
  • Remote boot over Ethernet
  • Remote boot over iSCSI
  • Unified Extensible Firmware Interface (UEFI)
  • Preboot Execution Environment (PXE)

NVIDIA Unified Fabric Manager

NVIDIA Unified Fabric Manager (UFM) is InfiniBand networking management software that combines enhanced, real-time network telemetry with fabric visibility and control to support scale-out InfiniBand data centers.

The two offerings available from Lenovo are as follows:

  • UFM Telemetry for Real-Time Monitoring

    The UFM Telemetry platform provides network validation tools to monitor network performance and conditions, capturing and streaming rich real-time network telemetry information, application workload usage, and system configuration to an on-premises or cloud-based database for further analysis.

  • UFM Enterprise for Fabric Visibility and Control

    The UFM Enterprise platform combines the benefits of UFM Telemetry with enhanced network monitoring and management. It performs automated network discovery and provisioning, traffic monitoring, and congestion discovery. It also enables job schedule provisioning and integrates with industry-leading job schedulers and cloud and cluster managers, including Slurm and Platform Load Sharing Facility (LSF).

The following table lists the subscription licenses available from Lenovo.

Table 8. NVIDIA Unified Fabric Manager subscriptions
Part number Feature code (7S09CTO6WW) Description
UFM Telemetry
7S090011WW S921 NVIDIA UFM Telemetry 1-year License and 24/7 Support for Lenovo clusters
7S090012WW S922 NVIDIA UFM Telemetry 3-year License and 24/7 Support for Lenovo clusters
7S090013WW S923 NVIDIA UFM Telemetry 5-year License and 24/7 Support for Lenovo clusters
UFM Enterprise
7S09000XWW S91Y NVIDIA UFM Enterprise 1-year License and 24/7 Support for Lenovo clusters
7S09000YWW S91Z NVIDIA UFM Enterprise 3-year License and 24/7 Support for Lenovo clusters
7S09000ZWW S920 NVIDIA UFM Enterprise 5-year License and 24/7 Support for Lenovo clusters

For more information, see the following web page:
https://www.nvidia.com/en-us/networking/infiniband/ufm/

Server support

The following servers offer a PCIe 4.0 host interface. All other supported servers have a PCIe 3.0 host interface.

  • ThinkSystem SR635
  • ThinkSystem SR655
  • ThinkSystem SR645
  • ThinkSystem SR665

The following tables list the ThinkSystem servers that are compatible.

Table 9. Server support (Part 1 of 4)
Part Number Description AMD V3 2S Intel V3/V4 4S 8S Intel V3 Multi Node V3/V4 GPU Rich
SR635 V3 (7D9H / 7D9G)
SR655 V3 (7D9F / 7D9E)
SR645 V3 (7D9D / 7D9C)
SR665 V3 (7D9B / 7D9A)
ST650 V3 (7D7B / 7D7A)
SR630 V3 (7D72 / 7D73)
SR650 V3 (7D75 / 7D76)
SR630 V4 (7DG8 / 7DG9)
SR850 V3 (7D97 / 7D96)
SR860 V3 (7D94 / 7D93)
SR950 V3 (7DC5 / 7DC4)
SD535 V3 (7DD8 / 7DD1)
SD530 V3 (7DDA / 7DD3)
SD550 V3 (7DD9 / 7DD2)
SD520 V4 (7DFZ / 7DFY)
SR670 V2 (7Z22 / 7Z23)
SR675 V3 (7D9Q / 7D9R)
SR680a V3 (7DHE)
SR685a V3 (7DHC)
SR780a V3 (7DJ5)
Mellanox adapters
4C57A15326 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter Y Y Y Y N Y Y N Y Y N Y Y Y N Y Y N N N
B4RG ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N N N N N N N N N
B951 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N N N N N N N N N
B952 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter DWC N N N N N N N N N N N N N N N N N N N N
BKSK ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N N N N N N N N N
Auxiliary adapters and cables
4C57A14179 ThinkSystem Mellanox HDR/200GbE 2x PCIe Aux Kit N N N N N N N N N N N N N N N Y N N N N
BPZG ThinkSystem SD665 V3 SharedIO Cable N N N N N N N N N N N N N N N N N N N N
BRL2 ThinkSystem SD650 V3 NDR Auxiliary Cable N N N N N N N N N N N N N N N N N N N N
Table 10. Server support (Part 2 of 4)
Part Number Description 1S V3 Edge Super Computing 1S Intel V2 2S Intel V2
ST50 V3 (7DF4 / 7DF3)
ST250 V3 (7DCF / 7DCE)
SR250 V3 (7DCM / 7DCL)
SE350 (7Z46 / 7D1X)
SE350 V2 (7DA9)
SE360 V2 (7DAM)
SE450 (7D8T)
SE455 V3 (7DBY)
SC750 V4 (7DDJ)
SD665 V3 (7D9P)
SD665-N V3 (7DAZ)
SD650 V3 (7D7M)
SD650-I V3 (7D7L)
SD650-N V3 (7D7N)
ST50 V2 (7D8K / 7D8J)
ST250 V2 (7D8G / 7D8F)
SR250 V2 (7D7R / 7D7Q)
ST650 V2 (7Z75 / 7Z74)
SR630 V2 (7Z70 / 7Z71)
SR650 V2 (7Z72 / 7Z73)
Mellanox adapters
4C57A15326 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter N N N N N N N N N N N N N N N N N N Y Y
B4RG ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N N N N N N N N N
B951 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N N N N N N N N N
B952 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter DWC N N N N N N N N N N N N N N N N N N N N
BKSK ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N Y N Y Y N N N N N N N
Auxiliary adapters and cables
4C57A14179 ThinkSystem Mellanox HDR/200GbE 2x PCIe Aux Kit N N N N N N N N N N N Y N N N N N N Y Y
BPZG ThinkSystem SD665 V3 SharedIO Cable N N N N N N N N N Y N N N N N N N N N N
BRL2 ThinkSystem SD650 V3 NDR Auxiliary Cable N N N N N N N N N N N Y Y N N N N N N N
Table 11. Server support (Part 3 of 4)
Part Number Description AMD V1 Dense V2 4S V2 8S 4S V1 1S Intel V1
SR635 (7Y98 / 7Y99)
SR655 (7Y00 / 7Z01)
SR655 Client OS
SR645 (7D2Y / 7D2X)
SR665 (7D2W / 7D2V)
SD630 V2 (7D1K)
SD650 V2 (7D1M)
SD650-N V2 (7D1N)
SN550 V2 (7Z69)
SR850 V2 (7D31 / 7D32)
SR860 V2 (7Z59 / 7Z60)
SR950 (7X11 / 7X12)
SR850 (7X18 / 7X19)
SR850P (7D2F / 7D2G)
SR860 (7X69 / 7X70)
ST50 (7Y48 / 7Y50)
ST250 (7Y45 / 7Y46)
SR150 (7Y54)
SR250 (7Y52 / 7Y51)
Mellanox adapters
4C57A15326 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter Y Y N Y Y Y N N N Y Y Y N N N N N N N
B4RG ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N N N N N N N N
B951 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N Y N N N N N N N N N N N N
B952 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter DWC N N N N N N Y Y N N N N N N N N N N N
BKSK ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N N N N N N N N
Auxiliary adapters and cables
4C57A14179 ThinkSystem Mellanox HDR/200GbE 2x PCIe Aux Kit N N N Y Y N Y N N Y Y Y N N N N N N N
BPZG ThinkSystem SD665 V3 SharedIO Cable N N N N N N N N N N N N N N N N N N N
BRL2 ThinkSystem SD650 V3 NDR Auxiliary Cable N N N N N N N N N N N N N N N N N N N
Table 12. Server support (Part 4 of 4)
Part Number Description 2S Intel V1 Dense V1
ST550 (7X09 / 7X10)
SR530 (7X07 / 7X08)
SR550 (7X03 / 7X04)
SR570 (7Y02 / 7Y03)
SR590 (7X98 / 7X99)
SR630 (7X01 / 7X02)
SR650 (7X05 / 7X06)
SR670 (7Y36 / 7Y37)
SD530 (7X21)
SD650 (7X58)
SN550 (7X16)
SN850 (7X15)
Mellanox adapters
4C57A15326 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter N N N N N Y Y Y N N N N
B4RG ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N Y N N
B951 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N
B952 ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter DWC N N N N N N N N N N N N
BKSK ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-Port PCIe 4 VPI Adapter (SharedIO) DWC N N N N N N N N N N N N
Auxiliary adapters and cables
4C57A14179 ThinkSystem Mellanox HDR/200GbE 2x PCIe Aux Kit N N N N N Y Y Y N Y N N
BPZG ThinkSystem SD665 V3 SharedIO Cable N N N N N N N N N N N N
BRL2 ThinkSystem SD650 V3 NDR Auxiliary Cable N N N N N N N N N N N N

Operating system support

The following table indicates which operating systems can be preloaded in the Lenovo factory for CTO server orders where this adapter is included in the server configuration.

Table 13. OS preload support
Operating system Preload support
Windows Server 2016 Supported as a preload for factory orders
Windows Server 2019 Supported as a preload for factory orders
Windows Server 2022 No support as a preload
Windows Server 2025 No support as a preload
VMware ESXi 6.5 U3 Supported as a preload for factory orders
VMware ESXi 6.7 U3 Supported as a preload for factory orders
VMware ESXi 7.0 Supported as a preload for factory orders
VMware ESXi 7.0 U1 No support as a preload
VMware ESXi 7.0 U2 No support as a preload
VMware ESXi 7.0 U3 No support as a preload
VMware ESXi 8.0 No support as a preload
VMware ESXi 8.0 U1 No support as a preload
VMware ESXi 8.0 U2 No support as a preload
VMware ESXi 8.0 U3 No support as a preload

Tip: If an OS is listed as "No support" above, but it is listed in one of the support tables below, that means the OS is supported by the adapter, just not available to be preloaded in the Lenovo factory in CTO orders.

The adapters support the operating systems listed in the following tables.

Tip: These tables are automatically generated based on data from Lenovo ServerProven.

Table 14. Operating system support for ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter, 4C57A15326 (Part 1 of 2)
Operating systems
SD530 V3
SD535 V3
SD550 V3
SR630 V3 (4th Gen Xeon)
SR630 V3 (5th Gen Xeon)
SR635 V3
SR645 V3
SR650 V3 (4th Gen Xeon)
SR650 V3 (5th Gen Xeon)
SR655 V3
SR665 V3
SR675 V3
SR850 V3
SR860 V3
SD630 V2
SR630 V2
SR650 V2
SR670 V2
SR850 V2
SR860 V2
Microsoft Windows 10 N N N N N N N N N N N N N N N N N N N N
Microsoft Windows 11 N N N N N N N N N N N N N N N N N N N N
Microsoft Windows Server 2016 N N N N N N N N N N N N N N Y Y Y Y Y Y
Microsoft Windows Server 2019 N N N Y Y Y Y Y Y 1 Y Y Y Y Y Y Y Y Y Y Y
Microsoft Windows Server 2022 Y Y Y Y Y Y Y Y Y 1 Y Y Y Y Y Y Y Y Y Y Y
Microsoft Windows Server 2025 Y Y Y Y Y Y Y Y Y Y Y Y Y Y N Y Y Y Y Y
Microsoft Windows Server version 1709 N N N N N N N N N N N N N N N N N N N N
Microsoft Windows Server version 1803 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 6.10 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 6.9 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 7.3 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 7.4 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 7.5 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 7.6 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 7.7 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 7.8 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 7.9 N N N N N N N N N N N N N N Y Y Y Y Y Y
Red Hat Enterprise Linux 8.0 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 8.1 N N N N N N N N N N N N N N N N N N N N
Red Hat Enterprise Linux 8.10 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.2 N N N N N N N N N N N N N N Y Y Y Y Y Y
Red Hat Enterprise Linux 8.3 N N N N N N N N N N N N N N Y Y Y Y Y Y
Red Hat Enterprise Linux 8.4 N N N N N N N N N N N N N N Y Y Y Y Y Y
Red Hat Enterprise Linux 8.5 N N N N N N N N N N N N N N Y Y Y Y Y Y
Red Hat Enterprise Linux 8.6 N N N Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.7 N N N Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.8 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.9 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.0 N N N Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.1 N N N Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.2 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.3 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.4 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.5 Y Y Y Y Y Y Y Y Y Y Y N Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 12 SP3 N N N N N N N N N N N N N N N N N N N N
SUSE Linux Enterprise Server 12 SP4 N N N N N N N N N N N N N N N N N N N N
SUSE Linux Enterprise Server 12 SP5 N N N N N N N N N N N N N N Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 N N N N N N N N N N N N N N N N N N N N
SUSE Linux Enterprise Server 15 SP1 N N N N N N N N N N N N N N N N N N N N
SUSE Linux Enterprise Server 15 SP2 N N N N N N N N N N N N N N Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP3 N N N N N N N N N N N N N N Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP4 N N N Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP5 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP6 Y Y Y Y Y Y Y Y Y Y Y Y Y Y N Y Y Y Y Y
Ubuntu 18.04.5 LTS N N N N N N N N N N N N N N Y Y Y Y N N
Ubuntu 20.04 LTS N N N Y Y N N Y Y N N N N N N Y Y N N N
Ubuntu 20.04.5 LTS N Y N N N Y Y N N Y Y Y Y Y N N N N N N
Ubuntu 22.04 LTS N N N Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
Ubuntu 22.04.3 LTS Y Y Y N Y N N N Y N N N N N N N N N N N
Ubuntu 24.04 LTS Y Y Y Y Y Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 6.7 N N N N N N N N N N N N N N N N N N N N
VMware vSphere Hypervisor (ESXi) 6.7 U1 N N N N N N N N N N N N N N N N N N N N
VMware vSphere Hypervisor (ESXi) 6.7 U2 N N N N N N N N N N N N N N N N N N N N
VMware vSphere Hypervisor (ESXi) 6.7 U3 N N N N N N N N N N N N N N Y Y Y Y N N
VMware vSphere Hypervisor (ESXi) 7.0 N N N N N N N N N N N N N N N N N N N N
VMware vSphere Hypervisor (ESXi) 7.0 U1 N N N N N N N N N N N N N N N N N N Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U2 N N N N N N N N N N N N N N Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U3 Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 N N N Y N Y Y Y N Y Y N N N Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U1 N N N Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U2 Y Y Y N Y Y Y N Y Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U3 Y Y Y Y Y Y Y Y Y Y Y Y Y Y N Y Y Y Y Y

1 IONG-11838 tips #TT1781

Table 15. Operating system support for ThinkSystem Mellanox ConnectX-6 HDR/200GbE QSFP56 1-port PCIe 4 VPI Adapter, 4C57A15326 (Part 2 of 2)
Operating systems
SR635
SR645
SR655
SR665
SR630 (Xeon Gen 2)
SR650 (Xeon Gen 2)
SR670 (Xeon Gen 2)
SR950 (Xeon Gen 2)
SR630 (Xeon Gen 1)
SR650 (Xeon Gen 1)
SR950 (Xeon Gen 1)
Microsoft Windows 10 N N Y 2 N N N N N N N N
Microsoft Windows 11 N N Y N N N N N N N N
Microsoft Windows Server 2016 Y Y N Y Y Y N Y Y Y Y
Microsoft Windows Server 2019 Y Y Y Y Y Y N Y Y Y Y
Microsoft Windows Server 2022 Y Y Y Y Y Y Y Y Y Y Y
Microsoft Windows Server 2025 N Y N Y N N N N N N N
Microsoft Windows Server version 1709 N N N N N N N N Y N Y
Microsoft Windows Server version 1803 N N N N N N N N Y N Y
Red Hat Enterprise Linux 6.10 N N N N N N N N Y Y Y
Red Hat Enterprise Linux 6.9 N N N N N N N N Y Y Y
Red Hat Enterprise Linux 7.3 N N N N N N N N Y Y Y
Red Hat Enterprise Linux 7.4 N N N N N N N N Y Y Y
Red Hat Enterprise Linux 7.5 N N N N N N Y N Y Y Y
Red Hat Enterprise Linux 7.6 Y 1 Y 1 N Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 7.7 Y 1 Y 1 Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 7.8 Y 1 Y 1 Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 7.9 Y 1 Y 1 Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.0 Y 1 N Y 1 N Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.1 Y 1 Y 1 Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.10 Y Y Y Y N N N Y N N N
Red Hat Enterprise Linux 8.2 Y 1 Y 1 Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.3 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.4 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.5 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.6 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.7 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.8 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.9 Y Y Y Y Y Y Y Y N N N
Red Hat Enterprise Linux 9.0 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.1 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.2 Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.3 Y Y Y Y Y Y Y Y N N N
Red Hat Enterprise Linux 9.4 Y Y Y Y N N N Y N N N
Red Hat Enterprise Linux 9.5 Y Y Y Y N N N Y N N N
SUSE Linux Enterprise Server 12 SP3 N N N N N N N N Y Y Y
SUSE Linux Enterprise Server 12 SP4 Y 1 N Y 1 N Y Y N Y Y Y Y
SUSE Linux Enterprise Server 12 SP5 Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 N N N N Y Y N Y Y Y Y
SUSE Linux Enterprise Server 15 SP1 Y 1 Y 1 Y 1 Y 1 Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP2 Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP3 Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP4 Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP5 Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP6 Y Y Y Y N N N N N N N
Ubuntu 18.04.5 LTS N N N N N N N N N N N
Ubuntu 20.04 LTS N N N N N N N N N N N
Ubuntu 20.04.5 LTS N N N N N N N N N N N
Ubuntu 22.04 LTS Y Y Y Y Y Y Y Y Y Y Y
Ubuntu 22.04.3 LTS N N N N N N N N N N N
Ubuntu 24.04 LTS Y Y Y Y N N N N N N N
VMware vSphere Hypervisor (ESXi) 6.7 N N N N N N N N Y Y Y
VMware vSphere Hypervisor (ESXi) 6.7 U1 N N N N Y Y N Y Y Y Y
VMware vSphere Hypervisor (ESXi) 6.7 U2 N N N N Y Y N Y Y Y Y
VMware vSphere Hypervisor (ESXi) 6.7 U3 Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 Y 1 Y 1 Y 1 Y 1 Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U1 Y 1 Y Y 1 Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U2 Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U3 Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U1 Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U2 Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U3 Y Y Y Y N N N N N N N

1 The OS is not supported with EPYC 7003 processors.

2 ISG will not sell or preload this OS; support is for compatibility and certification only.

Table 16. Operating system support for ThinkSystem Mellanox HDR/200GbE 2x PCIe Aux Kit, 4C57A14179
Operating systems
SR630 V2
SR650 V2
SR670 V2
SR850 V2
SR860 V2
SR645
SR665
SR630 (Xeon Gen 2)
SR650 (Xeon Gen 2)
SR670 (Xeon Gen 2)
SR950 (Xeon Gen 2)
SR630 (Xeon Gen 1)
SR650 (Xeon Gen 1)
SR950 (Xeon Gen 1)
Microsoft Windows Server 2016 Y Y Y Y Y Y Y Y Y N Y Y Y Y
Microsoft Windows Server 2019 Y Y Y Y Y Y Y Y Y N Y Y Y Y
Microsoft Windows Server 2022 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Microsoft Windows Server 2025 Y Y Y Y Y Y Y N N N N N N N
Microsoft Windows Server version 1709 N N N N N N N N N N N Y N Y
Microsoft Windows Server version 1803 N N N N N N N N N N N Y N Y
Red Hat Enterprise Linux 6.10 N N N N N N N N N N N Y Y Y
Red Hat Enterprise Linux 6.9 N N N N N N N N N N N Y Y Y
Red Hat Enterprise Linux 7.3 N N N N N N N N N N N Y Y Y
Red Hat Enterprise Linux 7.4 N N N N N N N N N N N Y Y Y
Red Hat Enterprise Linux 7.5 N N N N N N N N N Y N Y Y Y
Red Hat Enterprise Linux 7.6 N N N N N Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 7.7 N N N N N Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 7.8 N N N N N Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 7.9 Y Y Y Y Y Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.0 N N N N N N N Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.1 N N N N N Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.10 Y Y Y Y Y Y Y N N N Y N N N
Red Hat Enterprise Linux 8.2 Y Y Y Y Y Y 1 Y 1 Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.3 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.4 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.5 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.6 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.7 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.8 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 8.9 Y Y Y Y Y Y Y Y Y Y Y N N N
Red Hat Enterprise Linux 9.0 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.1 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.2 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Red Hat Enterprise Linux 9.3 Y Y Y Y Y Y Y Y Y Y Y N N N
Red Hat Enterprise Linux 9.4 Y Y Y Y Y Y Y N N N Y N N N
Red Hat Enterprise Linux 9.5 Y Y Y Y Y Y Y N N N Y N N N
SUSE Linux Enterprise Server 12 SP3 N N N N N N N N N N N Y Y Y
SUSE Linux Enterprise Server 12 SP4 N N N N N N N Y Y N Y Y Y Y
SUSE Linux Enterprise Server 12 SP5 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 N N N N N N N Y Y N Y Y Y Y
SUSE Linux Enterprise Server 15 SP1 N N N N N Y 1 Y 1 Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP2 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP3 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP4 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP5 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
SUSE Linux Enterprise Server 15 SP6 Y Y Y Y Y Y Y N N N N N N N
Ubuntu 18.04.5 LTS Y Y Y N N N N N N N N N N N
Ubuntu 20.04 LTS Y Y N N N N N N N N N N N N
Ubuntu 22.04 LTS Y Y Y Y Y Y Y Y Y Y Y Y Y Y
Ubuntu 24.04 LTS Y Y Y Y Y Y Y N N N N N N N
VMware vSphere Hypervisor (ESXi) 6.7 N N N N N N N N N N N Y Y Y
VMware vSphere Hypervisor (ESXi) 6.7 U1 N N N N N N N Y Y N Y Y Y Y
VMware vSphere Hypervisor (ESXi) 6.7 U2 N N N N N N N Y Y N Y Y Y Y
VMware vSphere Hypervisor (ESXi) 6.7 U3 Y Y Y N N Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 N N N N N Y 1 Y 1 Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U1 N N N Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U2 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 7.0 U3 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U1 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U2 Y Y Y Y Y Y Y Y Y Y Y Y Y Y
VMware vSphere Hypervisor (ESXi) 8.0 U3 Y Y Y Y Y Y Y N N N N N N N

1 The OS is not supported with EPYC 7003 processors.

Regulatory approvals

The adapters have the following regulatory approvals:

  • Safety: CB / cTUVus / CE
  • EMC: CE / FCC / VCCI / ICES / RCM / KC
  • RoHS: RoHS Compliant

Operating environment

The adapters have the following operating characteristics:

  • Typical power consumption (passive cables): 19.3W
  • Maximum power available through QSFP56 port: 5W
  • Temperature
    • Operational: 0°C to 55°C
    • Non-operational: -40°C to 70°C
  • Humidity: 90% relative humidity

Warranty

One year limited warranty. When installed in a Lenovo server, the adapter assumes the server’s base warranty and any warranty upgrades.

Related product families

Product families related to this document are the following:

Trademarks

Lenovo and the Lenovo logo are trademarks or registered trademarks of Lenovo in the United States, other countries, or both. A current list of Lenovo trademarks is available on the Web at https://www.lenovo.com/us/en/legal/copytrade/.

The following terms are trademarks of Lenovo in the United States, other countries, or both:
Lenovo®
ServerProven®
ThinkSystem®

The following terms are trademarks of other companies:

AMD is a trademark of Advanced Micro Devices, Inc.

Intel® and Xeon® are trademarks of Intel Corporation or its subsidiaries.

Linux® is the trademark of Linus Torvalds in the U.S. and other countries.

Microsoft®, Windows Server®, and Windows® are trademarks of Microsoft Corporation in the United States, other countries, or both.

Other company, product, or service names may be trademarks or service marks of others.