
[#]: collector: (lujun9972)
[#]: translator: ( )
[#]: reviewer: ( )
[#]: publisher: ( )
[#]: url: ( )
[#]: subject: (USPS invests in GPU-driven servers to speed package processing)
[#]: via: (https://www.networkworld.com/article/3452521/usps-invests-in-gpu-driven-servers-to-speed-package-processing.html)
[#]: author: (Andy Patrizio https://www.networkworld.com/author/Andy-Patrizio/)
USPS invests in GPU-driven servers to speed package processing
======
U.S. Postal Service plans to use servers powered by Nvidia GPUs and deep learning software to train multiple AI algorithms for image recognition, yielding a tenfold increase in package-processing speed.
The U.S. Postal Service is set to purchase GPU-accelerated servers from Hewlett Packard Enterprise that it expects will help accelerate package data processing up to 10 times over previous methods.
The plan is for a spring 2020 deployment, using HPE's Apollo 6500 servers, which come with up to eight Nvidia V100 Tensor Core GPUs. The Postal Service also will use Nvidia's EGX edge computing servers at nearly 200 of its processing locations in the U.S.
**READ MORE:** [How AI can improve network capacity planning][1]
Nvidia announced the USPS's plans at its GPU Technology Conference in Washington, D.C. Ian Buck, the former Stanford professor who created the CUDA language for programming Nvidia GPUs before joining the company to head AI initiatives, made the announcement in an opening keynote focused on AI.
Buck said half of the world's enterprises today rely on AI for network protection and security, and 80% of telcos will rely on it to protect their networks. “AI is a wonderful tool for looking at massive amounts of data and finding anomalies, pulling needles out of a haystack,” he told the audience.
The USPS — which processes 485 million pieces of mail per day, or 146 billion pieces of mail per year — plans to use servers powered by Nvidia's GPUs and deep learning software to train multiple AI algorithms for image recognition, according to Buck. Those algorithms would then be deployed to the EGX systems at the Postal Service's package processing sites.
The aim is to improve the speed and accuracy of recognizing package labels, which would improve the speed of package delivery and reduce the need for manual involvement.
### Nvidia AI deployments and market initiatives
AI is being embraced by a number of industries, with varying degrees of success. Nvidia uses itself as a guinea pig:
“At Nvidia we have a fleet of self-driving vehicles, which we use for both collecting data and testing our self-driving capabilities. We ingest and create literally petabytes of data every week that has to be processed by our own team of labelers and processed by AIs,” Buck told the crowd. “We have literally thousands of GPUs doing training every day, which are supporting hundreds of data scientists, which are defining the self-driving car capabilities.”
The module in Nvidia's self-driving car is called Pegasus and consists of two Volta GPUs and two Tegra SoCs. “It's basically an AI supercomputer inside every car processing hundreds of petabytes of data,” Buck said.
The challenge now is to actually apply AI, he said. To do so, Nvidia has a number of AI projects for the automotive, healthcare, robotics and 5G industries. For healthcare, for example, Nvidia has its Clara software development kit with pretrained models to tackle tasks such as looking for a particular kind of cancer in minutes or hours.
For IoT, Nvidia has the Metropolis Internet of Things application framework as cities build out sensors to detect unsafe driving conditions, such as a vehicle driving the wrong way onto a freeway. Nvidia also has the DRIVE autonomous vehicle platform, which spans everything from cars to trucks to robotaxis to industrial vehicles. Nvidia's Omniverse kit targets design and media, and its Aerial products are for telcos moving to 5G, along with the EGX server.
To train new developers to build AI apps on GPUs, Nvidia announced that its Deep Learning Institute just added 12 new courses focused on AI training. So far, DLI has trained more than 180,000 AI workers.
Join the Network World communities on [Facebook][2] and [LinkedIn][3] to comment on topics that are top of mind.
--------------------------------------------------------------------------------
via: https://www.networkworld.com/article/3452521/usps-invests-in-gpu-driven-servers-to-speed-package-processing.html
Author: [Andy Patrizio][a]
Topic selection: [lujun9972][b]
Translator: [译者ID](https://github.com/译者ID)
Proofreader: [校对者ID](https://github.com/校对者ID)
This article was originally compiled by [LCTT](https://github.com/LCTT/TranslateProject) and is proudly presented by [Linux中国](https://linux.cn/)
[a]: https://www.networkworld.com/author/Andy-Patrizio/
[b]: https://github.com/lujun9972
[1]: https://www.networkworld.com/article/3338100/using-ai-to-improve-network-capacity-planning-what-you-need-to-know.html
[2]: https://www.facebook.com/NetworkWorld/
[3]: https://www.linkedin.com/company/network-world