- Built for scale-out applications
- Consolidates compute, storage, networking, power, and cooling into a rack-scale solution
- Supports up to 16 NVIDIA Tesla GPU cards, providing greater expansion capacity
- Can be cascaded with up to 4 GPU Boxes via the PCI-E Switch, yielding up to 64 GPUs in one daisy chain and a massive pooled computing resource
The SR-AI rack is the world’s first rack server to adopt a PCI-E fabric interconnect architecture. It breaks the traditional coupled GPU/CPU server architecture by connecting the upstream CPU computing/scheduling node to the downstream GPU Box through a PCI-E Switch.
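The cascade arithmetic above can be sketched as follows. This is an illustrative model only, not vendor software; the constants and function name are assumptions based on the figures stated in this section (16 GPUs per box, up to 4 cascaded boxes).

```python
# Illustrative sketch of the SR-AI cascade arithmetic described above.
# GPUS_PER_BOX and MAX_BOXES come from the figures in this section;
# the function name is hypothetical.

GPUS_PER_BOX = 16   # each GPU Box holds up to 16 NVIDIA Tesla GPU cards
MAX_BOXES = 4       # up to 4 GPU Boxes cascaded via the PCI-E Switch

def pooled_gpus(boxes: int) -> int:
    """Total GPUs pooled behind one CPU node for a given number of cascaded boxes."""
    if not 1 <= boxes <= MAX_BOXES:
        raise ValueError(f"boxes must be between 1 and {MAX_BOXES}")
    return boxes * GPUS_PER_BOX

print(pooled_gpus(4))  # full daisy chain: 64 GPUs
```

A single box exposes 16 GPUs; a fully cascaded chain of 4 boxes pools 64 GPUs in one fabric.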