News
Mellanox's InfiniBand roadmap goes back to early 2001, when the company delivered its first 10 Gb/sec SDR InfiniBand silicon and boards for switches and network interface cards. This ...
InfiniBand has also been used as a clustered storage backbone, and it is now the preferred inter-node network for AI clusters doing machine learning training. If you were building a database cluster, you ...
InfiniBand aims to be many things to many users, but it is probably best summed up as a high-performance, highly scalable, low-latency, and highly reliable network for the data center.
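As a rough illustration of how software typically reaches an InfiniBand fabric, the sketch below uses the standard libibverbs API to enumerate the host's HCAs and report the state of each device's first port. It is a minimal example that assumes libibverbs and at least one HCA are installed; it is not a recipe drawn from any of the articles excerpted here.

/* Minimal sketch: list InfiniBand devices via libibverbs and print
 * the state of port 1 on each.
 * Build with: gcc list_ib.c -o list_ib -libverbs */
#include <stdio.h>
#include <infiniband/verbs.h>

int main(void)
{
    int num_devices = 0;
    struct ibv_device **devs = ibv_get_device_list(&num_devices);
    if (!devs) {
        perror("ibv_get_device_list");
        return 1;
    }

    for (int i = 0; i < num_devices; i++) {
        struct ibv_context *ctx = ibv_open_device(devs[i]);
        if (!ctx)
            continue;

        struct ibv_port_attr attr;
        /* Verbs ports are numbered starting at 1. */
        if (ibv_query_port(ctx, 1, &attr) == 0)
            printf("%s: port 1 is %s\n",
                   ibv_get_device_name(devs[i]),
                   attr.state == IBV_PORT_ACTIVE ? "ACTIVE" : "not active");

        ibv_close_device(ctx);
    }

    ibv_free_device_list(devs);
    return 0;
}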
Google Network Architects Present Efficient Enterprise Data Center Using InfiniBand
InfiniBand Technology Provides Performance and Efficient Use of Power
June 29, 2010, 12:01 PM Eastern Daylight Time ...
Networking company Mellanox Technologies, along with Hewlett-Packard and Dell, is demonstrating a next-generation FDR InfiniBand network running at 56 Gb/sec at the International ...
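For context, FDR's headline figure is straightforward lane arithmetic: a standard 4x FDR link carries four lanes at 14.0625 Gb/sec each, roughly 56 Gb/sec in aggregate, and because FDR uses 64b/66b encoding, the usable data rate works out to about 54.5 Gb/sec (4 × 14.0625 × 64/66 ≈ 54.5).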
InfiniBand's promise lies not merely in its ability to deliver high performance, but in its ability to do so using low-cost servers. Right now InfiniBand supports x86, x64 (AMD64 and Intel EM64T), and ...
NASA Scales SGI Pleiades InfiniBand Cluster to 25,000 Intel® Xeon® Processor Cores
SGI® ICE™ System Continues to Push Boundaries of Scientific Discovery
February 27, ...
Network Appliance and JNI. The FAS900 series of NAS appliances from Network Appliance connects to InfiniBand networks via JNI's InfiniStar IBP-1x02 HCA module, which is based on a Mellanox ...
To give AI developers and scientific researchers the fastest networking performance available on their workstations, Nvidia has introduced the next generation of its Nvidia Mellanox 400G ...