Inspur displays AI prowess at GTC China 2018

GTC China 2018, the largest GPU, AI, and deep learning technology conference in the East, took place on November 20. As a diamond-level sponsor, Inspur attended with the industry’s most robust and complete GPU server series.

Keynote: the speed and vigor of AI

NVIDIA founder and CEO Jensen Huang kicked things off with a keynote on the theme of speed, a quality that was on impressive display throughout the conference.

NVIDIA’s HGX-2 server platform is nearly 550 times faster than a CPU-only server: on it, an AI deep learning workload runs nearly 300 times faster, and a high-performance computing workload nearly 160 times faster.

Jensen Huang also noted that with the release of NVIDIA’s new Turing-based T4 GPU, Inspur will bring the ultimate AI acceleration experience to AI users worldwide. Inspur’s T4-based systems – the NF5280M4, NF5280M5, AGX-2, and NF5468M5 – are expected to be available before the end of the year.

As artificial intelligence rapidly spreads across industries, deep learning inference has become the most promising application at the intersection of AI and cloud computing. Building on the ease of use and energy efficiency of NVIDIA GPUs, Inspur and NVIDIA have jointly helped AI companies around the world deploy high-performance, low-power, high-bandwidth, low-latency AI inference platforms.
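
To make the inference story concrete, here is a minimal sketch of low-latency FP16 inference in PyTorch on a CUDA GPU such as the T4. The model, input shape, and batch size are illustrative placeholders rather than an Inspur or NVIDIA reference configuration.

```python
# Minimal sketch of FP16 GPU inference in PyTorch, assuming a CUDA-capable
# GPU such as the Tesla T4 is available. The tiny model below is a
# placeholder; a real deployment would load a trained network.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
).to(device).eval()

# FP16 is where the T4's Tensor Cores pay off; stay in FP32 on CPU.
if device.type == "cuda":
    model = model.half()

batch = torch.randn(8, 3, 224, 224, device=device)
if device.type == "cuda":
    batch = batch.half()

with torch.no_grad():
    logits = model(batch)               # forward pass only, no gradients
    predictions = logits.argmax(dim=1)  # class index per image

print(predictions.cpu().tolist())
```

In practice a deployment would also batch incoming requests and reuse the loaded model across calls, which is what keeps latency low and GPU utilization high.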

It is no wonder that Ian Buck, NVIDIA VP and GM of the Accelerated Computing business unit, once commented: “It is amazing the speed at which Inspur can deliver our latest technology, GPU platforms and software stacks to our customers.”

Inspur product showcase

At GTC China, Inspur demonstrated a full line of AI computing products and solutions based on NVIDIA’s leading GPU technology. The exhibits – including the AGX-5 AI super server for deep learning and high-performance computing, the AGX-2 AI server boasting the highest computational density, the leading AI cloud computing platform NF5468M5, and the compact, liquid-cooled mobile supercomputing platform TS4220LC – garnered much attention.

According to Inspur product manager Allen Huang, the Inspur AI super server AGX-5, based on the NVIDIA HGX-2 server platform, delivers up to 2 PFLOPS of computing performance. A single machine packs 16 of NVIDIA’s most powerful Tesla V100 Tensor Core 32GB GPUs into an 8U space. Under this architecture, customers can achieve stellar training performance and rapid iteration of AI models and applications even when their frameworks and models differ widely.
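
For a sense of how such a multi-GPU node is driven in practice, below is a minimal sketch of single-node, data-parallel training in PyTorch; the model, data, and hyperparameters are illustrative placeholders. (The headline number follows from the parts list: 16 GPUs at roughly 125 TFLOPS of V100 Tensor Core throughput each is about 2 PFLOPS.)

```python
# Minimal sketch of single-node, multi-GPU training in PyTorch, as one might
# run it on a many-GPU system such as the AGX-5. Model, data, and
# hyperparameters are placeholders, not an Inspur reference workload.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
if torch.cuda.device_count() > 1:
    # Replicate the model and split each batch across all visible GPUs.
    model = nn.DataParallel(model)
model = model.to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Synthetic data stands in for a real training set.
inputs = torch.randn(256, 512, device=device)
targets = torch.randint(0, 10, (256,), device=device)

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.4f}")
```

Larger jobs typically move from DataParallel to DistributedDataParallel, but the pattern – replicate the model, shard the batch, synchronize gradients – is the same one that lets very different frameworks and models scale on a node like this.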

With a 57% share of the Chinese AI market (according to IDC’s 2017 China AI Infrastructure Market Research Report), Inspur leads the charge in AI computing solutions development in both East Asia and the United States. The advanced portfolio shown at GTC China 2018, as well as at the recent SC18, demonstrates Inspur’s continued commitment to developing the most powerful platforms to serve the growing AI enterprise market and its customers.
