Google’s tensor processing units (TPUs) are interesting, but Nvidia is essential

At I/O 2016, Google revealed that it has designed and manufactured its own chip, the Tensor Processing Unit (TPU), custom built to work with its open source machine learning framework, TensorFlow. TPUs powered DeepMind's AlphaGo, which defeated international Go champion Lee Sedol, and they are already in use for Street View, RankBrain and Inbox Smart Reply; Google is expected to deploy them for Google Home and Google Assistant as well. Neural networks involve a very different kind of processing from conventional workloads, and they benefit from a different hardware approach: a custom architecture is more practical, uses fewer resources and runs faster. Google, for example, optimised the TPU's architecture around performance per watt. The result was computation power an order of magnitude better than…
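As a rough illustration (not from the article) of how TensorFlow abstracts the hardware underneath a model, the sketch below builds a tiny graph and asks for it to be placed on an accelerator, falling back to the CPU if none is available. The '/gpu:0' device string is an assumption for illustration; at the time, TPU access was handled by Google's internal runtime rather than a public device string.

    # Minimal TensorFlow (1.x-era) sketch: the same graph can target
    # different backends via device placement.
    import tensorflow as tf

    # Build a tiny graph: a single matrix multiplication.
    with tf.device('/gpu:0'):  # illustrative; swap for '/cpu:0' if no accelerator
        a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
        b = tf.constant([[5.0, 6.0], [7.0, 8.0]])
        product = tf.matmul(a, b)

    # allow_soft_placement falls back to CPU when the requested device is missing.
    with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
        print(sess.run(product))

The point of this separation is that a framework-level graph does not care which silicon executes it, which is what lets Google slot a custom chip like the TPU in behind TensorFlow.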


