Administrator Guide

Copyright © 2020 Dell Inc. or its subsidiaries. All Rights Reserved. Dell, EMC and other trademarks are trademarks of Dell Inc. or its subsidiaries.
Direct from Development: Server and Infrastructure Engineering
Validation of SNAP I/O Performance on Dual-Socket PowerEdge Rack Servers
SNAP I/O Value Proposition
Dual-socket servers offer ample compute power to meet the needs of a wide range of workloads. However, if the network adapters in the system are unbalanced, users risk creating a bottleneck that reduces bandwidth and increases latency. SNAP I/O is a solution that leverages Mellanox Socket Direct technology to balance I/O performance without increasing TCO. Because both CPUs share one adapter, data avoids traversing the UPI inter-processor link when accessing remote memory.
As seen in Figure 2, the unbalanced configuration has CPU 0 in direct communication with the NIC through a PCIe x16 slot, while CPU 1 must first traverse the UPI channel to CPU 0 before it can communicate with the NIC. This path adds latency overhead and can limit total bandwidth at high speeds. One solution is to connect an additional NIC directly to CPU 1, but that roughly doubles the cost: a second NIC, cable, and switch port. Rather than doubling NIC and switch costs, Dell SNAP I/O bridges the two sockets by splitting the PCIe x16 bus into two x8 connectors, so the OS sees the single adapter as two NICs.
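On Linux, the NUMA balance described above can be inspected directly: the kernel exposes each NIC's PCIe NUMA affinity through the sysfs attribute `/sys/class/net/<iface>/device/numa_node` (a value of -1 means no affinity is reported). The sketch below is an illustrative check, not part of the SNAP I/O product, for confirming whether a server's interfaces span both sockets; the helper names `nic_numa_node` and `balanced` are our own.

```python
# Sketch: report which NUMA node each network interface is attached to,
# using the standard Linux sysfs attribute device/numa_node.
import os

def nic_numa_node(iface, sysfs_root="/sys/class/net"):
    """Return the NUMA node of a NIC's PCIe device, or None if not exposed."""
    path = os.path.join(sysfs_root, iface, "device", "numa_node")
    try:
        with open(path) as f:
            return int(f.read().strip())
    except (OSError, ValueError):
        return None

def balanced(nodes):
    """True if the interfaces span more than one NUMA node, as with a
    Socket Direct adapter presenting one x8 port per socket."""
    seen = {n for n in nodes if n is not None and n >= 0}
    return len(seen) > 1

if __name__ == "__main__" and os.path.isdir("/sys/class/net"):
    nodes = {i: nic_numa_node(i) for i in os.listdir("/sys/class/net")}
    for iface, node in sorted(nodes.items()):
        print(f"{iface}: NUMA node {node}")
    print("balanced across sockets:", balanced(nodes.values()))
```

With a SNAP I/O adapter installed, the two x8 interfaces would be expected to report different NUMA nodes, one per socket.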
Tech Note by Matt Ogle, Mike Darby, Rich Hernandez
Summary

Using non-SNAP I/O communication paths in one-NIC dual-socket servers increases UPI overhead, which reduces bandwidth and increases latency for CPU applications. Resolving this by adding another NIC increases solution TCO.

Adopting SNAP I/O allows a dual-socket server to bypass the UPI lanes in one-NIC configurations, improving both performance and TCO for one-NIC dual-socket solutions.

This DfD measures the performance of SNAP I/O against two non-SNAP I/O configurations to demonstrate how SNAP I/O can increase bandwidth, reduce latency, and optimize user TCO.
Figure 1: SNAP I/O Card
Figure 2: Comparing an unbalanced one-NIC solution and a balanced two-NIC solution to a SNAP I/O one-NIC solution. The SNAP I/O solution on the right allows CPU 0 and CPU 1 to communicate with their corresponding NIC ports without traversing the UPI channels, reducing latency and TCO and freeing UPI bandwidth for applications.
