Founded in 1993, NVIDIA began as a company focused on bringing 3D graphics to digital technologies and introduced the world to the Graphics Processing Unit (GPU). By the early 2000s, they had solidified their place as the leading GPU developer for video game companies, securing the contract to supply Microsoft with graphics processors for the first Xbox console. Over the years that followed, NVIDIA would branch out into more of the digital landscape, expanding their dominance in the graphics market through strategic acquisitions and partnerships while exploring other technologies such as AI, CPUs, mobile and automotive electronics.
Fast forward to 2019, when NVIDIA announced plans to buy Mellanox for $6.9 billion, a deal that completed in April 2020. It was by far the biggest acquisition in NVIDIA’s history; they had never previously spent more than $400 million on a single purchase. Mellanox, founded in 1999, specialised in networking products with a focus on InfiniBand and Ethernet technologies. As a key player in data centres and high performance computing (HPC), Mellanox was the ideal acquisition for NVIDIA, who were looking to strengthen their footprint in these areas – something they have now certainly succeeded in doing.
NVIDIA and Mellanox had a longstanding collaborative relationship prior to the acquisition, with complementary contributions to global computing. According to NVIDIA, their combined technologies were already used in over 250 of the world’s TOP500 supercomputers. This strong presence is largely due to Mellanox’s significance in InfiniBand technology, a market previously dominated mainly by Mellanox and Intel. Intel was reportedly also in the running to acquire Mellanox, perhaps in an attempt to safeguard its influence in the InfiniBand market, but NVIDIA’s winning bid ultimately prevented Intel from monopolising it. As a result, NVIDIA has emerged as the leading supplier of InfiniBand products.
InfiniBand technology is becoming a key component in data centres and high performance computing. Mellanox’s high-end switches and network cards, known for their InfiniBand capabilities, are already widely used in these environments. These network cards support Remote Direct Memory Access (RDMA), which lets the network adapter move data directly to and from application memory. This ‘offloading’ takes data-transfer work away from the CPU and speeds up overall data processing.
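To make the idea more concrete, the sketch below shows roughly how an application registers a buffer for RDMA using the standard libibverbs API available on Linux systems with Mellanox/NVIDIA adapters. It is a minimal illustration only: the buffer size, access flags and choice of the first device are arbitrary, error handling is abbreviated, and a real program would also create queue pairs and exchange connection details with its remote peer.

#include <stdio.h>
#include <stdlib.h>
#include <infiniband/verbs.h>

int main(void)
{
    /* Find an RDMA-capable device (e.g. a ConnectX NIC). */
    int num_devices = 0;
    struct ibv_device **devs = ibv_get_device_list(&num_devices);
    if (!devs || num_devices == 0) {
        fprintf(stderr, "No RDMA-capable devices found\n");
        return 1;
    }

    struct ibv_context *ctx = ibv_open_device(devs[0]);
    struct ibv_pd *pd = ibv_alloc_pd(ctx);

    /* Register a buffer so the NIC can read and write it directly,
     * bypassing the CPU for the actual data movement. */
    size_t len = 4096;
    void *buf = malloc(len);
    struct ibv_mr *mr = ibv_reg_mr(pd, buf, len,
                                   IBV_ACCESS_LOCAL_WRITE |
                                   IBV_ACCESS_REMOTE_READ |
                                   IBV_ACCESS_REMOTE_WRITE);

    /* The rkey is what a remote peer would use to access this memory. */
    printf("Registered %zu bytes, rkey=0x%x\n", len, mr->rkey);

    /* Clean up. */
    ibv_dereg_mr(mr);
    free(buf);
    ibv_dealloc_pd(pd);
    ibv_close_device(ctx);
    ibv_free_device_list(devs);
    return 0;
}

Building this requires the libibverbs development package (rdma-core on most distributions) and linking with -libverbs.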
NVIDIA’s co-founder and CEO, Jensen Huang, has emphasised the importance of data centre computing to the future of digital technology. Discussing the company’s past and its plans for the future, Huang stated, “We were a GPU company and then we became a GPU systems company. We became a computing company which started from the chip up, now we are extending ourselves into a datacentre computing company” (HPCWire, 2019), highlighting NVIDIA’s ambition to stay ahead of the curve in AI and next-generation computing.
Although GPUs have become integral to high-end computing, better GPU hardware alone cannot meet the requirements of AI evolution and ever-increasing computing demands. High-performance storage and network interconnects are also needed to keep data flowing at a rate that makes full use of the GPUs. The merger of NVIDIA and Mellanox breaks down barriers between these technologies, enabling “higher performance, greater utilisation of computing resources and lower operating costs” (NVIDIA, 2020). More innovative networking solutions are key to keeping up with computing demands, especially at a time when the pace of CPU advancements is slowing and Moore’s Law appears to be coming to an end.
Moore’s Law is an observation made by Intel co-founder Gordon Moore in 1965. It states that the number of transistors on a microchip doubles approximately every two years, driving exponential growth in processing power with minimal impact on cost. The trend has broadly held ever since, but it is now reaching its natural end due to the physical limits of silicon-based technology.
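As a rough illustration of the compounding the law describes, the short sketch below projects transistor counts from a hypothetical starting point of two billion transistors; the starting figure is purely illustrative and not a real chip specification.

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Hypothetical starting point: a chip with 2 billion transistors. */
    double start_transistors = 2.0e9;

    /* Moore's Law: count doubles roughly every two years,
     * i.e. N(t) = N0 * 2^(t / 2) with t in years. */
    for (int years = 0; years <= 10; years += 2) {
        double projected = start_transistors * pow(2.0, years / 2.0);
        printf("After %2d years: ~%.0f billion transistors\n",
               years, projected / 1e9);
    }
    return 0;
}

Over just a decade, the projected count grows from 2 billion to 64 billion, which is the scale of improvement the industry has come to rely on and which now has to be found elsewhere, including in the network.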
By acquiring Mellanox, NVIDIA positions itself at the forefront of the data centre and HPC industries, where innovative networking solutions will play a crucial role in sustaining the rapid pace of technological development.
Explore ATGBICS NVIDIA Mellanox® Compatible Ethernet & InfiniBand Product Range
Whether you’re supporting large data centres, cloud environments or advanced AI workloads, our NVIDIA Mellanox® compatibles deliver the reliability and performance your clients expect.
Our extensive range includes over 680 NVIDIA Mellanox® compatible Network Transceivers, Direct Attach & Active Optical Cables with data rates from 1Gbps to 800Gbps. The range supports both Ethernet and InfiniBand networks, including InfiniBand FDR, EDR, HDR and NDR port speeds.
Our clients have the assurance that they will receive a 100% compatible, MSA-compliant version of the OEM product, with cost savings of up to 90%.
View our full NVIDIA Mellanox® compatible range or contact us with your live requirement.
References:
Why NVIDIA bought Mellanox: ‘Future Datacenters will be like High Performance Computers’ – HPCWire, March 14, 2019
NVIDIA Completes Acquisition of Mellanox, Creating Major Force Driving Next-Gen Data Centers – NVIDIA, April 27, 2020