Amazon claims breakthrough in distributed deep learning

Posted on Wednesday, December 11, 2019 @ 11:13 CET by Thomas De Maesschalck
Computer scientists from Amazon and Rice University say they've made a breakthrough that can significantly speed up the training of deep learning systems. The work will be presented at this week's 2019 Conference on Neural Information Processing Systems (NeurIPS 2019) in Vancouver. The researchers say that in tests on an Amazon search dataset with 70 million queries and over 49 million products, the new "merged-average classifiers via hashing" (MACH) system was 7-10 times faster than existing large-scale, distributed deep-learning systems. Furthermore, it achieved this feat with a 2-4 times smaller memory footprint.
"Our training times are about 7-10 times faster, and our memory footprints are 2-4 times smaller than the best baseline performances of previously reported large-scale, distributed deep-learning systems," said Anshumali Shrivastava, an assistant professor of computer science at Rice.

Tharun Medini, a Ph.D. student at Rice, said product search is challenging in part because of the sheer number of products. "There are about 1 million English words, for example, but there are easily more than 100 million products online."
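As the name suggests, MACH sidesteps training one giant classifier over 100 million product classes by hashing the classes into a much smaller number of buckets, training several small bucket-level classifiers, and merging (averaging) their scores at lookup time. The toy sketch below illustrates only that hash-and-merge decoding step; the class name, parameters, and random hash tables are illustrative assumptions, not the researchers' actual implementation.

```python
import random

class MACHSketch:
    """Toy illustration of the merged-average-classifiers-via-hashing idea:
    instead of one classifier with num_classes outputs, use R small models
    over B buckets each (B << num_classes), where each of R independent
    hash functions maps every class to one bucket. All names/parameters
    here are hypothetical, for illustration only."""

    def __init__(self, num_classes, num_buckets, num_repetitions, seed=0):
        rng = random.Random(seed)
        self.B = num_buckets
        self.R = num_repetitions
        # One random hash table per repetition: class id -> bucket id.
        self.hashes = [
            [rng.randrange(num_buckets) for _ in range(num_classes)]
            for _ in range(num_repetitions)
        ]

    def decode(self, bucket_scores):
        """Merge step: score each class by averaging, across the R
        repetitions, the score of the bucket that class hashed into.
        bucket_scores is a list of R lists of B per-bucket scores."""
        num_classes = len(self.hashes[0])
        merged = []
        for c in range(num_classes):
            total = sum(bucket_scores[r][self.hashes[r][c]]
                        for r in range(self.R))
            merged.append(total / self.R)
        return merged
```

The memory saving in this picture comes from each small model needing only B outputs rather than 100 million; the R independent hashes let the true class be disambiguated from others that happen to share a bucket in any single repetition.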
More at Tech Xplore


About the Author

Thomas De Maesschalck

Thomas has been messing with computers since early childhood and firmly believes the Internet is the best thing since sliced bread. Enjoys playing with new tech, is fascinated by science, and passionate about financial markets. When not behind a computer, he can be found with running shoes on or lifting heavy weights in the weight room.
