Deep Learning for Computer Architects

By Brandon Reagen, Robert Adolf, Paul Whatmough, Gu-Yeon Wei, and David Brooks

List Price: $54.95

Book Information

Publisher: Morgan & Claypool
Publish Date: 08/22/2017
Pages: 123
ISBN-13: 9781627057288
ISBN-10: 1627057285
Language: English

Full Description

This is a primer written for computer architects in the new and rapidly evolving field of deep learning. It reviews how machine learning has evolved since its inception in the 1960s and tracks the key developments that led to the powerful deep learning techniques that emerged in the last decade.

Machine learning, and specifically deep learning, has been hugely disruptive in many fields of computer science. The success of deep learning techniques in solving notoriously difficult classification and regression problems has resulted in their rapid adoption for real-world applications. The emergence of deep learning is widely attributed to a virtuous cycle whereby fundamental advances in training deeper models were enabled by the availability of massive datasets and high-performance computer hardware.

The book also reviews representative workloads, including the most commonly used datasets and seminal networks across a variety of domains. In addition to discussing the workloads themselves, it details the most popular deep learning tools and shows how aspiring practitioners can use those tools with these workloads to characterize and optimize DNNs.

The remainder of the book is dedicated to the design and optimization of hardware and architectures for machine learning. Because high-performance hardware was instrumental in making machine learning a practical solution, this part of the book recounts a variety of recently proposed optimizations to further improve future designs. Finally, it presents a review of recent research published in the area, along with a taxonomy to help readers place the various contributions in context.

About the Authors

Brandon Reagen is a Ph.D. candidate at Harvard University. He received his B.S. degree in Computer Systems Engineering and Applied Mathematics from the University of Massachusetts Amherst in 2012 and his M.

Robert Adolf is a Ph.D. candidate in computer architecture at Harvard University. After earning a B.S. in Computer Science from Northwestern University in 2005, he spent four years doing benchmarking and performance analysis of supercomputers at the Department of Defense.

Paul Whatmough leads research on computer architecture for machine learning at ARM Research, Boston, MA. He is also an Associate in the School of Engineering and Applied Sciences at Harvard University.

Gu-Yeon Wei is the Gordon McKay Professor of Electrical Engineering and Computer Science in the School of Engineering and Applied Sciences (SEAS) at Harvard University.

David Brooks is the Haley Family Professor of Computer Science in the School of Engineering and Applied Sciences at Harvard University.
