Google’s billion-connection deep learning neural net that cost $1 million in hardware last year can be built with $20,000 in GPUs this year

Researchers can now build a 1-billion-connection model with $20,000 worth of GPU hardware. That opens the field to far more groups working to improve the performance of speech recognition and computer vision. Down the line, this research on souped-up, GPU-trained neural networks could give rise to more powerful, and financially lucrative, GPU-based applications at large tech companies.

Andrew Ng’s team also built a super-sized, 11-billion-connection version of the cat detector for roughly $100,000. He wants to build a high-performance computer that will let researchers who lack the deep pockets of big companies and well-funded universities do research on deep learning. It is a bit like what Apple and Microsoft did for personal computing, or what cheaper sequencing hardware did for genomics: both democratized technologies that had been inaccessible to many.

By comparison, the original Google cat experiment ran on 1,000 computers with 16,000 CPU cores.

From the abstract of the Stanford paper:

Scaling up deep learning algorithms has been shown to lead to increased performance in benchmark tasks and to enable discovery of complex high-level features. Recent efforts to train extremely large networks (with over 1 billion parameters) have relied on cloud-like computing infrastructure and thousands of CPU cores. In this paper, we present technical details and results from our own system based on Commodity Off-The-Shelf High Performance Computing (COTS HPC) technology: a cluster of GPU servers with Infiniband interconnects and MPI. Our system is able to train 1 billion parameter networks on just 3 machines in a couple of days, and we show that it can scale to networks with over 11 billion parameters using just 16 machines. As this infrastructure is much more easily marshaled by others, the approach enables much wider-spread research with extremely large neural networks.
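The paper itself covers the GPU kernels and model-parallel details; as a rough illustration of how a small MPI cluster keeps workers in sync during training, the sketch below averages gradients across ranks. It is a minimal sketch, not the paper's implementation: mpi4py and NumPy stand in for the CUDA/MPI stack, and the network size, learning rate, and compute_local_gradient placeholder are hypothetical.

# Minimal sketch of synchronous gradient averaging over MPI, in the
# spirit of the COTS HPC cluster described above. Assumptions (not from
# the paper): mpi4py + NumPy stand in for the real CUDA/MPI stack, and
# compute_local_gradient is a hypothetical placeholder for a worker's
# forward/backward pass on its shard of the data.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
world = comm.Get_size()

n_params = 1_000_000            # toy size; the paper scales to 11 billion
params = np.zeros(n_params, dtype=np.float32)
lr = 0.01                       # hypothetical learning rate

def compute_local_gradient(params, rank):
    """Placeholder: each worker computes a gradient from its own data shard."""
    rng = np.random.default_rng(seed=rank)
    return rng.standard_normal(params.shape).astype(np.float32)

for _ in range(10):
    local_grad = compute_local_gradient(params, rank)
    global_grad = np.empty_like(local_grad)
    # Sum gradients from every machine, then divide by the worker count,
    # so all ranks apply the same averaged update and stay in sync.
    comm.Allreduce(local_grad, global_grad, op=MPI.SUM)
    params -= lr * (global_grad / world)

if rank == 0:
    print(f"trained on {world} ranks, |params| = {np.linalg.norm(params):.3f}")

Run with, for example, mpirun -np 4 python sketch.py; in a real COTS HPC setup each rank would own one GPU and the gradient computation would run as CUDA kernels rather than NumPy.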

SOURCES – Stanford University, Wired Magazine
